Osayidan

joined 2 years ago
[–] Osayidan@social.vmdk.ca 39 points 2 years ago

I'm sure the lawyers would love it too.

[–] Osayidan@social.vmdk.ca 17 points 2 years ago

Because ChatGPT can do the task for you in a couple of seconds; that's pretty much it. If the tool is there and you can use it, why not?

There are obviously going to be some funny scenarios like this thread, but if these kinds of interactions were the majority, the company and the technology wouldn't be positioned the way they are right now.

[–] Osayidan@social.vmdk.ca 1 points 2 years ago

You have to think of it more like how quantum computers are right now. You aren't going to run Minecraft or a web browser on one, but it'll probably be very good at certain things. Those things can either live in their own silo, never interacting directly with a traditional computer, or information will be passed between the two in some way (such as sending off a calculation job, then receiving the answers). That send/receive can afford to be slow, even if some translation is needed, as long as the performance gains on the actual task are worth it. It's not like a GPU, where you expect your frames to be rendered in real time to play a game.

Eventually that may change, but until then it's no more than that. Articles like these put a lot of hype on things that, while very interesting, can end up misleading people.

[–] Osayidan@social.vmdk.ca 4 points 2 years ago (1 children)

Possible, yes. Cost effective / a valid business case, probably not. Every extra 9 is diminishing returns: it'll cost you exponentially more than the previous 9, while the money saved from avoided downtime shrinks. Like you said, 32 seconds of downtime; how much money is that actually worth to the business?
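To put rough numbers on it, each extra 9 cuts the yearly downtime budget by 10x. A quick back-of-the-envelope sketch (plain Python, nothing here beyond the arithmetic):

```python
# Yearly downtime budget for each "nine" of availability.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

for nines in range(2, 7):
    downtime = SECONDS_PER_YEAR * 10 ** -nines   # fraction of the year spent down
    availability = 100 * (1 - 10 ** -nines)
    print(f"{availability:.4f}% -> {downtime:,.0f} s/year (~{downtime / 60:.1f} min)")
```

Six nines is that ~32 seconds a year; every row you go down costs roughly an order of magnitude more to build and saves you one less order of magnitude of downtime.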

You're pretty much looking at multiple geographically diverse Tier 4 datacenters with N+2 or even N+3 redundancy all the way up and down the stack, while also implementing diversity wherever possible so no single vendor of anything can take you offline.

Even with all that though, you'll eventually get wrecked by DNS somewhere somehow, because it's always DNS.

[–] Osayidan@social.vmdk.ca 1 points 2 years ago (1 children)

I suspect it'll just lead to AI-generated headlines in lieu of headlines taken from news publishers, then. You can't make any claims on the facts of the news: if there's a big earthquake in Japan, then there's a big earthquake in Japan. An AI can generate a headline saying as much, and the news publishers are cut out of the equation. At most they'd pay whatever subscription other news outlets pay for things like AP and other sources to get a stream of facts, then let the AI go to town on that data.

And honestly, as long as the info is accurate, I'd much rather use something like that than have to deal with going to multiple websites. I'll go to multiple sources manually if I need to research something, not for my morning news. This is where RSS feeds come in handy, but the same publishers who whine about this issue often don't publish a feed, for the same reason. I'd personally rather not consume the content at all if the only method were going to individual websites and checking for updates.

[–] Osayidan@social.vmdk.ca 2 points 2 years ago

Welcome to the federation. The cookies that were promised don't actually exist, but at least we're not Reddit, so we've got that going for us.

[–] Osayidan@social.vmdk.ca 3 points 2 years ago

I run Linux for everything. The nice thing is that everything is a file, so I use rsync to back up all my configs for physical servers. I can do a clean install, run my setup script, then rsync the config files back over, reboot, and everyone's happy.

For the actual data I also rsync from my main server to the others. Each server has a schedule for when it gets rsynced to, so I keep about 3 weeks of history.
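The per-server job looks roughly like this; a minimal sketch with made-up paths and retention, using rsync's --link-dest so unchanged files are hard-linked against the previous snapshot instead of copied again:

```python
#!/usr/bin/env python3
"""Minimal sketch of a nightly pull: dated rsync snapshots with hard-links.

Source, destination, and retention below are placeholders.
"""
import datetime
import subprocess
from pathlib import Path

SRC = "mainserver:/tank/data/"     # trailing slash: copy contents, not the dir
DEST_ROOT = Path("/backup/data")
KEEP_DAYS = 21                     # roughly 3 weeks of history

snapshots = sorted(p for p in DEST_ROOT.iterdir() if p.is_dir())
latest = snapshots[-1] if snapshots else None

cmd = ["rsync", "-a", "--delete"]
if latest is not None:
    # Hard-link unchanged files against the newest snapshot, so each
    # day's copy only costs the space of what actually changed.
    cmd += ["--link-dest", str(latest)]
cmd += [SRC, str(DEST_ROOT / datetime.date.today().isoformat())]
subprocess.run(cmd, check=True)

# Drop snapshots older than the retention window (ISO dates sort correctly).
cutoff = (datetime.date.today() - datetime.timedelta(days=KEEP_DAYS)).isoformat()
for snap in snapshots:
    if snap.name < cutoff:
        subprocess.run(["rm", "-rf", str(snap)], check=True)
```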

For virtual servers I just use the Proxmox built-in backup system, which works great.

Very important files get encrypted and sent to the cloud as well, but out of dozens of TB this only accounts for a few gigs.
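The shape of that last step is just encrypt-then-upload. A toy sketch with the `cryptography` package's Fernet (filenames are made up; for multi-gig archives you'd stream through something like gpg or age rather than reading the whole file into memory):

```python
# Toy sketch: symmetric encryption before anything leaves the house.
# Reads the whole file into memory, so only sensible for small archives.
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # in practice: load a stored key
Path("backup.key").write_bytes(key)       # keep this OUT of the cloud

cipher = Fernet(key)
data = Path("important-files.tar").read_bytes()
Path("important-files.tar.enc").write_bytes(cipher.encrypt(data))
```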

I've also never thrown out a disk or USB stick in my life; I use them for archiving. Even if the drive is half dead, as long as it'll accept data I shove a copy of something on it, then label and document it. There are so many copies of everything that it can all be rebuilt if needed, even if half these drives end up not working. I keep most of them off-site. At some point I'll have to physically destroy the oldest ones, like the few 13 GB IDE disks that just make no sense to bother with.

[–] Osayidan@social.vmdk.ca 3 points 2 years ago (3 children)

I get it if those doing the linking also host enough of the content (or all of it) that users don't have any need to go to the site. For example, if a post here on Lemmy or on Reddit links to the article with the headline, but then the top comment is a copy/paste of the entire thing.

What I don't get is why news is any different from any other search result in this context. On my phone, for example, if I look at the default Android feed that shows news and other content based on my interests, it's just headlines and thumbnail images. If I want to actually see the content I have to click through, which takes me to the news publisher's web page. Is this what they're mad about?

It just sounds to me like they want to make it harder for individuals to find them. Since it's so accessible (I don't even need to open an app), this is often where I'll see important headlines first thing in the morning; only later in the day will I bother to open apps, go to websites, check my RSS, etc.

[–] Osayidan@social.vmdk.ca 2 points 2 years ago

Especially now that I can't even click on a Twitter link without being asked to create an account and log in. It'll be chaos if some big situation is going on and people without Twitter accounts are scrambling to find critical information from other sources.

[–] Osayidan@social.vmdk.ca 17 points 2 years ago

If you're using memory for storage operations, especially for something like the ZFS cache, then as a best practice you ideally want ECC so errors are caught and corrected before they can corrupt your data.

In the real world, unless you're buying old servers off eBay that already have it installed, the economics don't make sense for self-hosting. The issues are so rare, and you should have good backups anyway. I've never run into a problem from not using ECC; I've been self-hosting since 2010 and have some ZFS pools nearly that old. I run exclusively on consumer hardware, with the exception of HBAs and networking gear, and have never had ECC.

[–] Osayidan@social.vmdk.ca 9 points 2 years ago (1 children)

Monitoring for my systems, like a Zabbix + Grafana combo: I want to do it, but I never do. Mostly because of the resources it would use, the time it would take, and the impact it would have on my storage (the constant database writes would probably kill my SSDs faster). Right now I already get emails from my UPS for power issues, and from my Proxmox hosts for backup status and ZFS status.

I'll probably cave and do it once I add a new server to my cluster.

[–] Osayidan@social.vmdk.ca 0 points 2 years ago (1 children)

I would say the entire experience of using YouTube is having your feed with subscriptions and suggestions. Juggling being logged in in one window to browse around and decide what to watch, grabbing the links, then pasting them into another window to watch them while logged out doesn't sound like a good time.

Ads are also a bad time. So I'm probably going to just drop the platform and stop consuming content from all those creators I've been following, in some cases for nearly a decade.
