1801
 
 
The original post: /r/datahoarder by /u/MrMargaretScratcher on 2025-04-23 07:37:41.

This can't be right, right?

https://www.aa2zsupply.co.uk/ProductDetails.asp?ProductCode=51700&gad_source=1

1802
 
 
The original post: /r/datahoarder by /u/ueommm on 2025-04-23 06:32:20.

Has this happened to any of you??

I can't actually believe this is happening. Last time, I bought a 16TB drive thinking I would expand my storage, but almost as soon as I bought it, the old 16TB drive failed, so instead of 32TB I ended up with the same 16TB after paying extra money. And now, once again, because that last 16TB failed and I'm running out of space, I bought a new drive literally 2 days ago, and today an old drive failed out of nowhere, after working normally for the last few years. WTF???? I wouldn't be THIS ANGRY if it had failed at some random point in the last year, but it failed AS SOON AS I bought a new drive??? Is god playing a joke on me or what?? Sure, maybe I've worked it hard these few days copying and backing up all the data onto the new drive, but still, WTF?? It had ONE job, which is to store data, and it has COMPLETELY failed!!! FUCK YOU WESTERN DIGITAL MY PASSPORT!

Western Digital is the WORST brand EVER. DO NOT buy it. EVERY SINGLE DRIVE I have ever bought from them has failed. Now I have to spend money again on a replacement. I'm not saying Seagate is absolutely better, because that last 16TB that failed was a Seagate external, but at least I have one Seagate drive that is still working after many years; I don't have a single WD drive that has lasted that long.

1803
 
 
The original post: /r/datahoarder by /u/SultanGreat on 2025-04-23 05:45:23.

So I had a Maxtor Blue Portable hard disk that held some very important data.

A couple of years ago I wanted to install a "Hackintosh" (specifically a High Sierra / Mojave-ish version), so I took the disk and ERASED it, which reformatted it as APFS.

Then I re-erased it as NTFS.

And downloaded/transferred some files onto it.

Is there any chance of recovering the original data from this disk?

1804
 
 
The original post: /r/datahoarder by /u/ufokid on 2025-04-23 04:27:43.

I'm looking to record an ongoing radio audio stream, but it will have a lot of dead air.

Is there an existing way to achieve this?

Broadcastify.com has a way of doing this with an uploaded audio stream.
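One hedged option outside of Broadcastify: ffmpeg's `silenceremove` audio filter can strip long stretches of dead air while recording. The stream URL below is a placeholder (not a real stream), and the 30-second/-50dB thresholds are assumptions to tune, so the sketch prints the command rather than executing it.

```shell
# Hedged sketch: ffmpeg's silenceremove filter can drop long stretches of
# dead air while recording. STREAM_URL is a placeholder, not a real stream,
# so the command is printed rather than executed.
STREAM_URL="http://example.com/stream.mp3"
FFMPEG_ARGS=(-i "$STREAM_URL"
  # stop_periods=-1 removes every mid-stream silence; stop_duration=30
  # means only gaps longer than 30 s count as dead air; stop_threshold=-50dB
  # defines how quiet "silence" is.
  -af "silenceremove=stop_periods=-1:stop_duration=30:stop_threshold=-50dB"
  recording.mp3)
printf 'would run: ffmpeg %s\n' "${FFMPEG_ARGS[*]}"
```

The trade-off of filtering at record time is that the dead air is gone for good; recording raw and filtering afterwards with the same filter keeps the original as a fallback.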

1805
 
 
The original post: /r/datahoarder by /u/martellus on 2025-04-23 03:37:31.

Hello,

I recently got a DLT7000 drive with a tape that I need to pull data off of. I already had a SCSI card for an LTO drive in an old computer, so I rebuilt it, got an extra cable, hooked it up, and got some fresh DLT IV tapes to test with. The SCSI card seems to see the drive, and the drive seems to at least cycle correctly.

The OS is Windows 10 on an i7 desktop; what are my best options for software? Since DLT is dead, I would really like to find a free program to pull data off the tape after confirming drive function with the test tapes.

From my limited knowledge: Z-DATdump - I don't think it supports DLT..? Bacula - mostly Linux and doesn't really support tapes in the free edition..? Veeam - mixed info, but supposedly it could do it; I tried installing the Community Edition and got errors that the computer does not match the system requirements. (??)

I've seen other options, but they're all big enterprise solutions I'm not going to budget for.

I've always found tapes and old hardware fascinating (that Craigslist drive post I see on the front page is incredibly cool), but this is far beyond my usual.

While the tape seems to possibly be fine (going off the drive indicator lights), there's a chance it's trash, so I would rather not spend anything if I don't have to.

1806
 
 
The original post: /r/datahoarder by /u/Prog47 on 2025-04-23 02:53:05.

So after the whole Synology fiasco I have decided to build my own NAS. I can't seem to find a case that fits my needs (maybe it doesn't exist). Anyway, here are my requirements:

  1. At least 12 hot-swappable bays (or at least drive caddies; if I have to shut down to replace drives, that's fine).
  2. Fits ATX motherboards.
  3. Uses a standard PSU.
  4. Doesn't sound like a jet engine taking off when you turn it on.

I've had a Norco case in the past, and from what I remember of them they were junk. They probably haven't improved much, I would guess. Mine broke and the company was impossible to get hold of; I ended up giving it away.

I've researched the Supermicro 4U cases, and from what I've seen they are built great, but they are VERY loud (and wouldn't meet the wife approval factor). I see people doing all types of hacks to make them quieter, like making a foam backplane, 3D printing stuff, etc., and honestly that's not something I want to mess with.

There is the Jonsbo N5. As most of you are aware, it doesn't really have drive caddies. I'm not impressed with the rubber-band-like mechanisms it uses to pull the drives out, and the overall build quality just seems to be meh.

There is also this style of case on aliexpress:

https://www.aliexpress.us/item/3256808684423329.html?algo_exp_id=7d699873-ee40-4924-882e-678e8de4d96a-14&pdp_ext_f=%7B%22order%22%3A%22-1%22%2C%22eval%22%3A%221%22%7D&utparam-url=scene%3Asearch%7Cquery_from%3A

This looks nice and the price is good, but the shipping is not: shipping generally costs as much as, if not more than, the case itself. The *few* reviews I've seen are pretty good, but since it ships from China it could be a pain if there were any issues and I had to return it.

Then there is the HL15 (45Drives). It seems like it's built like a rock and is just what I need, "but" dang it's expensive. Paying $1k for a case is a hard pill to swallow.

I'm thinking my only true options are the HL15 or the Jonsbo at this point. Is there anything I'm missing or any options I should look into?

1807
 
 
The original post: /r/datahoarder by /u/No_Independence8747 on 2025-04-23 02:07:01.

Guide me o wise ones!

Thanks to this sub I shelled out some cash and got a 12TB hard drive last year. No problems, other than the transfer speed seeming kind of slow. It's a little more than half full.

I think I'd like to back up everything now. I won't be at a loss if it all disappears; this is just for peace of mind and to prevent any inconvenience. The drive I bought last year for $90 is now $180. I bought from GoHardDrive on eBay and just found out they don't have a good reputation here. Back to the drawing board.

I found the hard drive below but I don't know what to make of it. I think it may be a "white label" drive, but info is sparse. It's pretty much the cheapest (I am poor), but I'll only spin it up occasionally.

What am I looking at and what should I expect from it? Is it OK to buy? I can get by on 10TB if anyone has any recommendations.

Thank you!

Suspect in question:

https://www.ebay.com/itm/127014254745

1808
 
 
The original post: /r/datahoarder by /u/KalistoZenda1992 on 2025-04-22 22:38:05.

I had some discussions with folks about how access to Libby & Hoopla is tied to whichever local library hosts them. In certain areas there are more options because of what patrons ask for, which means that if a library association is defunded or impacted by DOGE in the worst way, patrons would lose these services. Is there a way of archiving something that could be lost?

1809
 
 
The original post: /r/datahoarder by /u/RadiantQuests on 2025-04-22 21:20:38.

I'm kind of new to archiving, and I'm trying to help a writer upload his audio content to archive.org.

Here are my specific questions:

  1. What is the best approach if I want to upload files that may often be updated or replaced in the future?
     1.1 Do you advise creating one page (while uploading files) and later uploading the new audio files there?
     1.2 Or do you advise uploading each file separately in its own page/item? And why?
  2. Is there a way to delete all the XML, spectrogram PNG, and generated torrent files from an item/page, leaving only the audio files? With each upload there's a file ending in meta.xml that exposes the uploader's personal email.

Thank you.
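On question 1, one hedged data point: the Internet Archive's official `ia` CLI (from the `internetarchive` pip package) lets you re-upload a file under the same item identifier, replacing it in place, which favors one item per work for files that get updated. The identifier and filename below are hypothetical, so the command is only printed.

```shell
# Sketch using the Internet Archive's official "ia" CLI (from the
# "internetarchive" pip package). "writer-audio-demo" and episode01.mp3
# are hypothetical; the command is printed rather than run. Re-running
# "ia upload" with the same identifier and filename replaces the file
# in place (relevant to question 1.1).
ITEM="writer-audio-demo"
UPLOAD_CMD=(ia upload "$ITEM" episode01.mp3
  --metadata="mediatype:audio"
  --metadata="title:Episode 01")
printf 'would run: %s\n' "${UPLOAD_CMD[*]}"
```

On question 2, as far as I know the `_meta.xml` and other derived files are generated by the Archive itself and can't be fully removed, so it may be safer to register the uploading account under an email you don't mind being associated with the items.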

1810
 
 
The original post: /r/datahoarder by /u/DrudgeForScience on 2025-04-22 20:39:07.

I am looking for the Quality Action plan, it was on the HHS Website but that doesn’t exist any longer. It was issued in 2022. CMS was the lead agency for a number of actions. Any help would be appreciated

1811
 
 
The original post: /r/datahoarder by /u/IroesStrongarm on 2025-04-22 19:57:34.

I recently acquired an LTO-5 drive and tapes and am about to go down the LTO archive rabbit hole. This is just for me, my data, and my home lab. I'm trying to come up with best practices and procedures and have the start of an automated script going to facilitate backups. Here's my current thought process:

  1. On the archiving PC, set up a locally stored staging area holding about 1.2-1.25TB of data.
  2. Use find to create a file list of all files in the backup directory.
  3. Use sha256deep to create checksums for the entire directory.
  4. Create a tar file of the entire directory.
  5. Use sha256 on the tar to create a checksum file.
  6. Create a set of par2 files at 10% redundancy.
  7. Verify the final checksum and par2 files.

My first question: any fault in the logic of my plan here? I intend to keep the checksums and file list in a separate location from the tape. Should I also store them directly on the tape itself?

The second question, and more why I'm here: should I create the tar directly on the tape drive, with the second checksum and the par2 files created by reading the data back from the tape? Or should I create the tar on a local staging drive and then transfer all the files over to the tape?

Thoughts? Criticisms? Suggestions?
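The staging steps above can be sketched as a runnable script. This is a minimal demo against a throwaway directory (the `mydata` paths are made up); it uses `sha256sum` in place of sha256deep, which behaves the same for this purpose, and only attempts the par2 step if par2 is installed.

```shell
# Minimal runnable sketch of steps 1-5 and 7 against a throwaway staging
# directory; "mydata" and its contents are made up for the demo.
WORK=$(mktemp -d)
STAGE="$WORK/staging"
mkdir -p "$STAGE/mydata"
echo "hello" > "$STAGE/mydata/a.txt"
echo "world" > "$STAGE/mydata/b.txt"
cd "$STAGE"

# Step 2: file list of everything going to tape.
find mydata -type f | sort > filelist.txt
# Step 3: per-file checksums (sha256sum stands in for sha256deep here).
find mydata -type f -print0 | xargs -0 sha256sum > files.sha256
# Step 4: one tar of the whole directory.
tar -cf mydata.tar mydata
# Step 5: checksum of the tar itself.
sha256sum mydata.tar > mydata.tar.sha256
# Step 6: 10% parity data, only if par2 happens to be installed.
if command -v par2 >/dev/null 2>&1; then
  par2 create -r10 mydata.tar.par2 mydata.tar
fi
# Step 7: verify everything before it goes to tape.
sha256sum -c files.sha256 && sha256sum -c mydata.tar.sha256
```

Keeping a copy of `filelist.txt` and the two `.sha256` files on the tape as well as off-tape costs almost nothing and means a tape found years later is self-describing.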

1812
 
 
The original post: /r/datahoarder by /u/markth_wi on 2025-04-22 18:27:21.

Given that in many circumstances a change in regime can also be a change in data policy - the ongoing situation in the US is a good example, where basically every federal program, data repository, or dataset, often collected over decades, is in danger of being purged.

Does there exist a non-denominational data-warehousing group that allows custodians of data to put such depots of data into a repository? These could be TBs or PBs of data, sometimes moving on short notice but then not again for some time.

Is there a non-profit built around the idea of creating such an archive, or does one exist that's not as ad hoc as things seem to be?

1813
 
 
The original post: /r/datahoarder by /u/LibrarianGreedy8034 on 2025-04-22 17:26:18.

Hi,

I'm currently using the SingleFile web extension to save my grades as an HTML file. The problem I want to solve is that when I click the comments button to view feedback, it does nothing - I'm assuming because the JavaScript isn't saved. Is there a workaround? I would like to save my grades page offline.

1814
 
 
The original post: /r/datahoarder by /u/bhoffman20 on 2025-04-22 16:58:29.
1815
 
 
The original post: /r/datahoarder by /u/fakeacc001002003 on 2025-04-22 12:52:39.
1816
 
 
The original post: /r/datahoarder by /u/LAN_Mind on 2025-04-22 12:50:59.

Good morning. This is probably a super basic question, but I haven't been able to figure out how to pull a video from YT. It's definitely related to cookies. For better or worse, I have two Google profiles on this machine. I figured it wouldn't work, but here is the command I first tried:

yt-dlp -f bestvideo+bestaudio https://youtu.be/JVywqFx0GdE

Which gives me "Sign in to confirm you’re not a bot." as expected. So I tried this:

yt-dlp -f bestvideo+bestaudio --cookies-from-browser chrome  https://youtu.be/JVywqFx0GdE

That gave me the error "Could not copy Chrome cookie database.", so I tried telling it my profile:

yt-dlp -f bestvideo+bestaudio --cookies-from-browser chrome:<GProfileName> https://youtu.be/JVywqFx0GdE

Which gives me this error: could not find chrome cookies database in "C:\Users\<WindowsUserName>\AppData\Local\Google\Chrome\User Data\<GProfileName>"

Can anyone spot what I'm doing wrong? Thanks in advance.
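Two things commonly trip this up, offered as hedged guesses: Chrome locks its cookie database while any Chrome process is running, so it must be fully closed before yt-dlp can copy it; and the profile key is Chrome's on-disk folder name ("Default", "Profile 1", "Profile 2", ...), not the display name you gave the profile (the exact "Profile Path" is shown on chrome://version). The profile below is a placeholder, so the sketch prints the command rather than running it.

```shell
# Hedged sketch: the profile key is Chrome's folder name ("Default",
# "Profile 1", ...), not the profile's display name, and it needs quoting
# because of the space. Chrome must be fully closed first, since it locks
# the cookie database while running. "Profile 1" is an assumption - check
# chrome://version for your actual Profile Path.
URL="https://youtu.be/JVywqFx0GdE"
PROFILE="Profile 1"
CMD=(yt-dlp -f bestvideo+bestaudio --cookies-from-browser "chrome:$PROFILE" "$URL")
printf 'would run: %s\n' "${CMD[*]}"
```

If Chrome's cookie encryption still blocks the copy, exporting cookies to a file with a cookies.txt extension and passing `--cookies cookies.txt` is a common fallback.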

1817
 
 
The original post: /r/datahoarder by /u/shady101852 on 2025-04-22 15:44:28.

I'm backing up the things I want to save before I fully reset my PC, and I was wondering if I can just drag-and-drop and replace all the files in C:\Users when I'm done, or if I have to do it differently?

1818
 
 
The original post: /r/datahoarder by /u/deepblue02101996 on 2025-04-22 15:24:20.

hello!

I am having some trouble trying to save my iPhone photos to my QNAP. I'm currently trying to free up space on the phone and was hoping I could use my QNAP for that. Does anyone have a good link (YouTube/website) I could reference? Ideally I would like to save just JPEGs (like I do with my DSLR) if possible. Thanks in advance!

1819
 
 
The original post: /r/datahoarder by /u/MysteriousArrival8 on 2025-04-22 14:19:07.

I know there are OSes and hardware that make more sense for home servers, but I wanted to experiment with using an M1 Mac mini (16GB/1TB SSD).

I have a few external SSDs laying around - what's the best way to set up storage with these?

  • 1TB Samsung 970 EVO SSD in a TB3 enclosure
  • 2TB Samsung T7 SSD
  • 2TB Samsung T7 Touch SSD
  • 2TB External 2.5" HDD - WD My Passport Ultra
  • 128GB 14-year-old Crucial m4 SSD
  • 64GB 2230 SSD pulled from a Steam Deck

I was considering partitioning off either 500GB or 750GB of the internal SSD and then doing a JBOD concatenation of that with the 1TB 970 EVO, to have a larger combined volume of 1.5TB or 1.75TB for storage outside of the OS volume. Then I'd leave the 2TB T7 and 2TB T7 Touch as separate volumes and use the 2TB WD HDD as a backup for important files. Are the Crucial and 2230 SSDs worth keeping for anything, or should I just trash them?

Any better suggestions? Would it be OK to JBOD the 500GB or 750GB internal partition + the 1TB 970 EVO + the 2TB Samsung T7 so that I don't have to manage jumping between volumes?
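For reference, macOS's built-in way to concatenate drives into one volume is diskutil's AppleRAID "concat" mode. The device identifiers below are placeholders (check `diskutil list` for the real ones), and since creating the set erases the member drives, the sketch prints the command rather than executing it.

```shell
# Hedged sketch: macOS can concatenate drives into one JBOD-style volume
# with diskutil's AppleRAID "concat" mode. disk4/disk5 are placeholder
# device identifiers (see "diskutil list"), and THIS ERASES THE DRIVES,
# so the command is printed rather than executed.
CONCAT_CMD=(diskutil appleRAID create concat Storage JHFS+ disk4 disk5)
printf 'would run: %s\n' "${CONCAT_CMD[*]}"
```

One caveat worth weighing before concatenating the internal partition with an external drive: if any one member of a concat set fails or is unplugged, the whole volume goes offline, so the combined volume is only as reliable as its least reliable member.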

1820
 
 
The original post: /r/datahoarder by /u/Pasta-hobo on 2025-04-22 14:17:14.

The United States, current 'politics' aside, was never hospitable to free information. Its copyright terms take a lifetime-plus before works enter the public domain, and its courts consistently side with corporations.

The IA needs to acknowledge both of these and move house. The only way I think they could be worse off for their purposes is if they were somewhere like Japan.

Sweden has historically been a good choice for freedom of information.

1821
 
 
The original post: /r/datahoarder by /u/EchoGecko795 on 2025-04-22 14:15:47.
1822
 
 
The original post: /r/datahoarder by /u/DendriteCocktail on 2025-04-22 14:15:00.

I've a number of older books that I want to digitize, ideally without cutting off the binding.

NAPS2 with an Epson V600 works well, but with each scan I have to manually rotate the image and then split the two-page scan into two separate pages. That's a lot of extra time and clicks.

Is there a way to have it do this automatically?

In this post, u/32contrabombarde talked about using NAPS2, then ScanTailor, then NAPS2 again, which seems like a much more laborious process than what I'm doing now, but perhaps I'm missing something.

Thanks all,
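A hedged alternative to the NAPS2 → ScanTailor round-trip: ImageMagick can batch the rotate-and-split step over a folder of scans. The sketch below generates a tiny blank stand-in "scan" so it is runnable as-is (swap in your real scan files), and skips cleanly if ImageMagick isn't installed.

```shell
# NAPS2 may not automate this, but ImageMagick can batch the rotate+split
# as a post-processing step. A tiny blank "scan" is generated so the
# sketch is runnable; swap in your real scan files.
WORK=$(mktemp -d) && cd "$WORK"
MAGICK=$(command -v magick || command -v convert || true)
SPLIT_OK=skipped
if [ -n "$MAGICK" ]; then
  "$MAGICK" -size 400x200 xc:white scan.png          # stand-in two-page scan
  # Rotate 90 degrees, then cut into two equal pages (the halves are
  # top/bottom after rotation, hence 100% width x 50% height tiles).
  "$MAGICK" scan.png -rotate 90 -crop 100%x50% +repage page_%d.png
  [ -f page_0.png ] && [ -f page_1.png ] && SPLIT_OK=1
fi
printf 'split result: %s\n' "$SPLIT_OK"
```

Looping that over `*.png` from a NAPS2 batch export would replace the per-scan clicking, though page gutters that aren't exactly centered would still need ScanTailor-style content detection.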

1823
 
 
The original post: /r/datahoarder by /u/scorcher24 on 2025-04-22 10:49:33.

Hey guys. A while ago I upgraded my home network to 2.5 Gbps. The only thing missing at the moment is my NAS, and of course I want to upgrade that as well.

With the latest Synology controversy, I'm unwilling to buy a new device from them, so I'm looking into the options I have.

One of my ideas is to buy a JBOD enclosure and connect it to an ITX computer with an SFF-8088 card. I have no experience doing this, so I'm hoping somebody can tell me more about this method.

My main concern is what happens if the card fails at some point. With RAID cards, you always buy spares so you can rescue your data should one card fail. But that shouldn't be needed here, because only SATA channels are passed through to the other computer, right? The RAID itself would be software.

I'm also curious how stable this runs in general. Are there lots of issues, or is this really something you put in and it just works? How is Linux support for this type of setup?

I'm looking to add something like the TL-D800S, so I would have to use the cards provided by QNAP.

Thanks for any input.

1824
 
 
The original post: /r/datahoarder by /u/aygross on 2025-04-22 10:24:37.

I run a bunch of things off a Raspberry Pi at my house, but I'm looking to do this remotely. I assume Hetzner would be the cheapest way to do it. I want to download all of Louis Rossmann's YouTube channel for archival purposes. What would be a simple way to get this going? Preferably for a one-month period.

Should I just be spinning up a Vultr instance, or something else?

What would be a pretty plug-and-play way to do this? I would then download everything to my home storage once it's finished, so I can avoid YT hardware fingerprinting, etc.
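On the rented box itself, the plug-and-play piece is usually just yt-dlp with a download archive, so the job can be killed and resumed at will during the month. The channel handle and output layout below are assumptions to adjust; the sketch prints the command rather than running it, since it needs yt-dlp installed and a lot of bandwidth.

```shell
# Hedged sketch for mirroring a channel with yt-dlp. The @rossmanngroup
# handle and the output filename layout are assumptions - adjust to taste.
# Printed rather than run, since it needs yt-dlp and lots of bandwidth.
CHANNEL="https://www.youtube.com/@rossmanngroup/videos"
ARCHIVE_CMD=(yt-dlp
  --download-archive downloaded.txt    # record IDs so a rerun resumes
  -f "bestvideo+bestaudio" --merge-output-format mkv
  --write-info-json --write-thumbnail  # keep metadata alongside the video
  -o "%(upload_date)s - %(title)s [%(id)s].%(ext)s"
  "$CHANNEL")
printf 'would run: %s\n' "${ARCHIVE_CMD[*]}"
```

Run it inside tmux or as a systemd service on the server, then rsync the finished directory home before the instance is torn down.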

1825
 
 
The original post: /r/datahoarder by /u/StarBirds007 on 2025-04-22 08:41:48.

A user from r/PrinceOfPersia had the manual for this game, which, I found out, has not been archived yet. So I asked him to scan it and send me a copy. Now where and how am I supposed to archive it?

EDIT:

My language was a bit confusing... I'm only asking how I should preserve this PDF manual for future players of the game.
