It's A Digital Disease!

This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

776
 
 
The original post: /r/datahoarder by /u/ExcitementClean7872 on 2025-06-19 21:06:17.

Hey everyone. I'm trying to access full chat history data and need a matching version of WhatsApp for iOS — ideally an .ipa from before mid-2024. If anyone happens to archive older .ipas or can point me in the right direction, I'd really appreciate it.

Thanks in advance.

777
 
 
The original post: /r/datahoarder by /u/zujaloM on 2025-06-19 18:24:19.

Mainly looking to expand storage.

Right now I have 12 external SSDs/HDDs, plus 5 drives inside my PC.

I want to consolidate everything onto something compact and SSD-stick-like.

Which option will be easiest on the pocket?

778
 
 
The original post: /r/datahoarder by /u/SLJ7 on 2025-06-19 17:13:47.

Bought a 22 TB Seagate external recently when it was on sale for roughly $250 USD from Amazon; I discussed this here. That post was more than 2 months ago, and the drive is working well. I see those same drives are on sale directly from Seagate for CAD $350 (just over $250 US).

My old tower recently crapped out on me, and I had to shuffle a bunch of data from a Windows Storage Space onto spare drives. I realized the three 8 TB Storage Space drives I had in there are at least five years old. They're probably fine for non-critical data, but it seems like it might be smart to replace them. Should I just get another of these? Is there a better deal elsewhere?

I also notice 24 TB drives are $40 CAD more (about $30 US), which brings the price/TB up from $15.90 CAD to $16.25, but still seems like a fairly good deal. No idea if the drives inside the 24 TB models are different.

I'm thinking of getting more than one, so thoughts are welcome.

I don't know if either of these links will work properly for non-Canadians. I think they'll just show local prices until you change countries.

779
 
 
The original post: /r/datahoarder by /u/Swimming-Ad5188 on 2025-06-19 17:13:16.

Hello

I want to archive android programs in case I need to reinstall them and they have been removed from the store (or just in case an update removes a feature I rely on).

I know I can copy the APK from my device with adb, but some programs have native code that depends on the platform, and the APK on my device (downloaded from the Google Play store) only has the relevant code for my current architecture.

From the Android documentation I found out there is an AAB (Android App Bundle) format, which is uploaded to the store and then used to generate, on the fly, the APK to install on my device.

Is there any way to download the aab from the store, or somehow get a multiplatform apk?
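For what it's worth, a minimal sketch of archiving the complete installed set (base APK plus config splits) with adb; the package name and paths are hypothetical, and this still only captures the splits Play delivered for this particular device:

# List every APK slice (base + splits) installed for a package
adb shell pm path com.example.app
# Pull each path the command prints, for example:
adb pull /data/app/<random-id>/com.example.app-<suffix>/base.apk
adb pull /data/app/<random-id>/com.example.app-<suffix>/split_config.arm64_v8a.apk

As far as I know there is no supported way to download the AAB itself; it stays on Google's servers. Third-party mirrors sometimes distribute "universal" APKs, but verifying their integrity is on you.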

780
 
 
The original post: /r/datahoarder by /u/terminal-overflow on 2025-06-19 15:27:18.

TL;DR

Based on some testing I did today, I need to warn you about the backup feature in the Mega desktop app, and urge you to consider other methods for backing up important data. The backup feature in the desktop app is unfortunately broken in its current state (as of the 19th of June, 2025), and calling it a backup is possibly even misleading. It is more like a one-way sync, where locally deleted files are deleted from the cloud backup.


Background

I've been a paying Mega Pro II user for about a year now, and I have been content with the experience overall. I set up my devices to back up to Mega automatically via the desktop app, and I let the program do its thing. Besides an occasional hiccup here and there all went well. So far so good, right? My files were safe and I could always choose to go back to an earlier backup if something went wrong, such as me accidentally deleting a folder (happens more often than I’d like to admit) or a complete hard drive failure.

Well, I decided to do some testing to be on the safe side. I wanted to see how quickly I could get back up to speed after files had been accidentally deleted or modified on my computer. So I tried just that, but after deleting some files on the PC I noticed that I couldn't find them in the Mega backup folder! So here's the shocker after testing Mega's "backup" feature:

Deleting backed up files on your device deletes them from the Mega backup

If you delete a file on your computer that is being backed up, for example Pictures/Family/2023/Vacation/001.jpg, it's moved to: Rubbish bin > SyncDebris > (Date) > 001.jpg on Mega.

The original folder path is completely lost, and you have to guess where this file should be when restoring it. As you can imagine, this is not a comforting thought if dozens, hundreds, or thousands of files are involved. You are pretty much on your own in trying to figure the whole thing out.

Once the file is moved from the backup folder to the rubbish bin on Mega, you also cannot reverse it. So it is technically deleted from the backup folder permanently. If you want to restore deleted files you need to do it before the rubbish bin is automatically cleared, which varies from 30 days to 180 days (or longer if you contact Mega’s support). This leads to my second discovery, which almost shocked me more than the first one:

Backups are not easily restored

There is no folder separation for backups made at different times. There is file versioning, but only for single files, meaning you have to select one file at a time and restore to an earlier version that way. If things go wrong and you need to restore many files as quickly as possible, how would you go about that? Here’s what you’re stuck with:

  • Open Mega desktop app > Press “…” (settings) > Files > Backups, select your device and download the files/folders from Mega. If your files and folders have been deleted on your PC you'll need to search the Mega rubbish bin to find them.
  • Download the files directly from the Mega account centre (the drive) in your web browser. Same thing goes for files that have been deleted.
  • Right-click individual files, select “File history”, and download a previous version of the file via the web browser after logging in to Mega and waiting for the decryption of your data to complete, which might take a while.

You currently cannot roll a whole folder or backup back to a specific point in time; it only works on individual files. A backup should preserve your data exactly as it was at a certain point in time and not be modified afterwards, allowing full restoration from that point if something goes wrong. Mega's desktop app "backup" is not doing that; it is really just a one-way sync of a folder of your choosing, from your device to the cloud.

My recommendation

If you're using the Mega desktop app to back up anything important, please consider switching to a different solution until this is fixed. Since I haven’t extensively tested other backup service providers, I cannot really give any alternatives. However, I am sure others can give recommendations of solutions they are satisfied with.

End notes

I hope this can save someone from a potential backup disaster and loss of data, I would also love to hear if anyone else has run into these issues with the Mega desktop app, and what backup solutions have worked well for you! Hopefully Mega will address these issues quickly in upcoming versions of the app. I really like their idea of putting privacy first and their pricing for storage is good, so it’s not all bad at the end of the day!

Let me know your thoughts!

781
 
 
The original post: /r/datahoarder by /u/maolzine on 2025-06-19 08:00:06.

Hi,

I'm looking for an HDD enclosure for my 10TB WD Ultrastar HC330.

I have noticed two enclosures with UASP, which supposedly boosts performance by 10-30%, or is that just a gimmick?

It's hard to find a good one with UASP and USB-C 3.1.

Thanks

782
 
 
The original post: /r/datahoarder by /u/rolliepollie929 on 2025-06-19 04:40:54.

Hi friends! Preparing for first-time homeowner life and came across this TikTok with free, life-changing advice for home maintenance. I've been trying to export the comments into a spreadsheet but have had no luck. Any genius able to help? Thank you in advance!!!

783
 
 
The original post: /r/datahoarder by /u/PulsedMedia on 2025-06-19 20:56:08.

These just passed a CPU stress test and are fully functional. This is the platform we have been developing over at PulsedMedia.com for a few years, but now we have been working on 12x 3.5" HDD + 2x mITX nodes instead of 8x mITX/1L MiniPCs per rack unit.

https://reddit.com/link/1lfltnf/video/5lkyfzs34y7f1/player

We share a lot of this process in other forums and in our discord.

https://preview.redd.it/9ihd38554y7f1.jpg?width=1868&format=pjpg&auto=webp&s=e7f68388a4864229ee4b38199d9855badb252fc6

I think we can also stuff 2x N100 with 4x M.2 NVMe into the same 1RU, but that's still untested; it's up next:

https://preview.redd.it/m9fikwhc4y7f1.jpg?width=4000&format=pjpg&auto=webp&s=f8f87370b559a454383497dee8af0503d622a235

https://preview.redd.it/xjswfnsh4y7f1.jpg?width=4000&format=pjpg&auto=webp&s=a01c931be2f06cbdf6cd41e1363e0861922c6ab7

Stress Test Passed Today!

Temps remained slightly over 60°C at ~20°C ambient.

mPlate NAS power consumption from the wall:

Idle consumption is ~102W

Under load: 130-137W

Config: 2x N100 + 12x 3.5" 8TB 7200rpm HDDs + 16GB DDR5 on each + 2x 500GB NVMe + 2x 2.5GbE network connected + 2x USB sticks (for rescue boot).

Comparison: i5-6500T HP ProDesk Mini G3, from the wall:

Idle consumption: ~15W

Under load: 43W

Note: double conversion, so efficiency on this power delivery is an estimated 10% lower.

We can probably even put an 8C/16T Ryzen on these for some added compute! The i3-N305 variant is otherwise more or less identical.

https://preview.redd.it/t941u62a5y7f1.jpg?width=1868&format=pjpg&auto=webp&s=3875525657a9f63362a034aa38a293760d987013

Hope you enjoy the engineering, we are going to start sales soon(tm) with these units. These are part of our mini dedicated server series.

In our discord we (or ... I, the founder of Pulsed Media, Aleksi U) post development photos from the lab constantly and try to keep up with the background info too.

Personally I'm a long-time datahoarding aficionado... well, more like enabling people to datahoard rather than hoarding much myself, but I absolutely love making data hoarding solutions and think in €/TB terms constantly! Check our Storage Box offers for example.

Hope you enjoy the mad engineering from a Finnish garage (literally ...)! These are actual functional servers-to-be; the 8x mITX version has been running really well for years, and with these tests passed we don't expect surprises from the 12x HDD version either.

Got 5 of these plates prepped for early sales already, and we expect to produce a few each month.

Any questions? Or just want to enjoy the mad engineering from a cold Nordic madlab? Ask away below; I'll try to answer ... well, within a week or so ... it's Midsummer in Finland right now.

(I so wanted to tag this 18+ ...)

784
 
 
The original post: /r/datahoarder by /u/wewewawa on 2025-06-19 20:18:24.
785
 
 
The original post: /r/datahoarder by /u/Naernoo on 2025-06-19 19:16:20.

Hi!

I have a large number of images I want to deduplicate. I tried Anti-Twin because it worked out of the box.

However, the performance is really bad. I ran a deduplication scan between two folders and it found about 10 GB of duplicates, which I deleted. Then I ran a second scan, and it found another 2 GB. A third scan found 1 GB, and then another found around 500 MB, and so on.

It seems like it never catches all duplicates in one go. Why is that? I set all limits really high.

Are there better alternatives that don’t have these issues?

I tried using Czkawka a few years ago, but ran into permission errors, missing dependencies, and other problems.
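In case it helps, content-hash based dedupers compare full file contents in a single pass, so one scan should catch everything at once. A minimal sketch with fdupes (one free option among several; the paths are placeholders):

# fdupes hashes full file contents, so a single recursive scan
# finds every byte-identical duplicate across both folders
fdupes -r /path/to/folderA /path/to/folderB
# Add -d to delete interactively, or -dN to auto-keep the first copy in each set
fdupes -rdN /path/to/folderA /path/to/folderB

Note this only matches exact duplicates; visually similar but re-encoded images need a perceptual-hash tool instead.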

786
 
 
The original post: /r/datahoarder by /u/lionsrawrr on 2025-06-19 18:24:05.

I've been having trouble finding a bulk post downloader that works on a Mac. I tried JDownloader, but it did not include the audio portion of videos even though the original posts do have sound. I checked the settings and then read up on it; apparently that's just something it doesn't always do. Suggestions?
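One possibility, assuming the site in question is among yt-dlp's supported extractors (untested here): yt-dlp downloads the separate video and audio streams and merges them with ffmpeg, so the sound comes along.

# Requires ffmpeg for the merge step (e.g. brew install ffmpeg on macOS);
# "bv*+ba/b" means best video plus best audio, falling back to best combined
yt-dlp -f "bv*+ba/b" "https://example.com/some-post"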

787
 
 
The original post: /r/datahoarder by /u/Wonde_Alice_rland on 2025-06-19 18:13:07.

Original Title: Looking to store $100,000 worth of data (300GB) with $250. Was looking at Verbatim M Disc BDXL with 100GB, but read that a lot of these are fakes. I'm in over my head and don't know enough to parse what to trust and what not on the internet.


Making music, SFX, saving images, saving code, etc. for a videogame. Likely worth $100,000, but until release I'm very poor. I saved up $250 to protect the data, as over half of my SD cards/external SSDs have already failed and corrupted most of their data (luckily I have about 7 storage methods holding copies).

Need some advice from people with more knowledge than I have. A lot of people were complaining that the Verbatim 25GB discs are fake, but I'm unsure whether that means the Verbatim 100GB BDXLs are fake too. What should I do?

788
 
 
The original post: /r/datahoarder by /u/Melodic-Network4374 on 2025-06-19 18:12:59.

Yeah, I know about the wiki, it has links to a bunch of stuff but I'm interested in hearing your workflow.

I have in the past used wget to mirror sites, which is fine for just getting the files. But ideally I'd like something that can make WARCs, SingleFile dumps from headless Chrome, and the like. My dream would be something that can handle (mostly) everything, including website-specific handlers like yt-dlp: just a web interface where I can put in a link, set whether to do recursive grabbing, and whether it should follow outside links.

I was looking at ArchiveBox yesterday and was quite excited about it. I set it up and it's soooo close to what I want but there is no way to do recursive mirroring (wget -m style). So I can't really grab a whole site with it, which really limits its usefulness to me.

So, yeah. What's your workflow and do you have any tools to recommend that would check these boxes?
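On the WARC front, plain wget can already record one while mirroring, which covers part of this wish list. A minimal sketch:

# Mirror recursively while also recording every request/response into site.warc.gz
wget -r -l inf --page-requisites --wait=1 -e robots=off \
     --warc-file=site --warc-cdx https://example.com/

It won't run headless Chrome or yt-dlp for you, but it covers the plain-HTML part of the workflow.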

789
 
 
The original post: /r/datahoarder by /u/tyrandemain on 2025-06-19 17:59:04.

USB sticks can be lost, drives can break, and some cloud storage providers are eager to terminate your account if you don't log in for a certain period of time. I'm looking for an option for storing a handful of tiny files (mostly text documents with notes, configs, etc., but some media as well: screenshots and short clips), somewhere I can upload those files and be sure that if I only need them in 5-10 years' time, they will still be there.

790
 
 
The original post: /r/datahoarder by /u/nothing-counts on 2025-06-19 17:57:57.

I built this as a fast, cross-device, and completely free peer-to-peer file-sharing tool. It works right in your browser: no logins, no uploads to the cloud.

I'm constantly working on it. This is an early version, so any feedback would be awesome.

791
 
 
The original post: /r/datahoarder by /u/grn_frog on 2025-06-19 17:38:48.

Trying to dig through the mountains of different options and suggestions hasn't provided a straightforward answer.

I do hobby photography and am looking for recommendations for photo backup.

I have a MacBook Air running Lightroom, and I import all my photos to an external drive that I work off of. I have a separate NVMe M.2 external drive that I want to use purely as a backup device for my photos. Ideally, I'd like it to automatically back up my external drive containing my photos and Lightroom catalog once a month. From doing some reading, people have recommended ChronoSync and Carbon Copy Cloner for this use case. I've read and gotten into the weeds on NAS setups and the 3-2-1 strategy, but for now I just want to get a simple, reliable local backup going so I can feel comfortable deleting the photos off my camera memory card (which is currently my second copy of most photos).

Should I buy CCC or ChronoSync, or is there a free alternative? Or does anyone have any other recommendations?
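If a free route appeals, macOS ships with rsync, which handles exactly this kind of drive-to-drive mirror. A minimal sketch with hypothetical volume names:

# Mirror the working photo drive onto the backup drive (names are placeholders);
# --delete makes the backup an exact mirror, so deletions propagate too.
# Omit it if you want the backup to keep files you've removed from the source.
rsync -av --delete "/Volumes/PhotoDrive/" "/Volumes/PhotoBackup/"

Running that once a month by hand (or from a scheduled launchd job) would cover the use case described above.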

792
 
 
The original post: /r/datahoarder by /u/bufferOverflowCanuck on 2025-06-19 17:27:58.

Hey all,

I am moving cross country and packing up the apartment into a moving truck and driving 10.5 hours to the new place.

What is the best way to transport the 3.5" HDDs in my NAS so they don't get damaged?

Moving trucks don't have much suspension, and everything feels road vibrations more than in your average car. Is there a case, trunk, or box I can purchase off Amazon or something that will help keep my data safe in transit?

I'll try to put the most important stuff in the cloud, BUT realistically I don't have close to enough cloud storage for everything.

Tips, tricks, and products all welcome!

EDIT: for reference, I have a QNAP TS-453D, not an enterprise rack. I'm a baby data hoarder: 28TB in the NAS, with an additional 24TB in my tower.

793
 
 
The original post: /r/datahoarder by /u/scampy008 on 2025-06-19 16:18:59.

Hi Guys,

Looking for a simple, cheap 6G SAS RAID controller that can load / be seen / be configured in UEFI mode.

I don't mind JBOD or RAID (better).

It seems hard to tell which cards can be recognised in UEFI. I tried a couple of LSI cards, but they only seem to be picked up in Legacy mode; when I disable SATA drives and enable Legacy ROMs in UEFI mode, the cards come up with driver health "Failed" and configuration required.

Any ideas or help would be appreciated.

794
 
 
The original post: /r/datahoarder by /u/T0biasCZE on 2025-06-19 15:51:10.

On my NAS/server I had a small 128GB NVMe SSD that just held some VMs and Docker images... I accidentally overfilled the SSD, and after a server restart the XFS file system got corrupted and no longer mounts (I'm getting a kernel error in syslog :|).

Is there some free software that could manually scan the drive and try to recover the files? I found ReclaiMe, and it's finding the files, but it costs €120 for the licence, which is a lot...

Alternatively, is there some software that could repair the XFS metadata? (The xfs_repair command doesn't work.)
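For reference, a minimal sketch of the usual escalation path, assuming the partition shows up as /dev/nvme0n1p1 (a hypothetical name); ideally run it against a dd/ddrescue image of the drive rather than the drive itself:

# Dry run first: -n reports problems without modifying anything
sudo xfs_repair -n /dev/nvme0n1p1
# If xfs_repair refuses because of a dirty or corrupt log, -L zeroes the log
# (this can discard the last in-flight writes, hence the image-first advice)
sudo xfs_repair -L /dev/nvme0n1p1
# If the filesystem is beyond repair, photorec (free, part of the testdisk
# package) can carve recoverable files straight off the raw partition
sudo photorec /dev/nvme0n1p1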

795
 
 
The original post: /r/datahoarder by /u/BadWi-Fi on 2025-06-19 14:00:10.

I bought a bunch of Ritek DVDs (not Blu-ray). How do I know these are genuine? Also, can someone tell me how to see the MID (media ID) of a disc?
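On Linux, one way to read the MID is dvd+rw-mediainfo from the dvd+rw-tools package (a sketch; your drive's device node may differ):

# Prints media details, including the manufacturer/media ID string
dvd+rw-mediainfo /dev/sr0

A non-RITEK media code on discs sold as Ritek is a strong hint they're rebadged, though a matching MID alone doesn't prove authenticity.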

796
 
 
The original post: /r/datahoarder by /u/n3IVI0 on 2025-06-19 13:30:05.

I have for years been downloading comics from GoComics.com via wget. Recently, they have made changes to the website that have killed my handy bash script. They seem to be hiding the main comic of the day behind a javascript loader. I'll use Sherman's Lagoon as an example.

wget -E -H -k -K -p -nd -R html,svg,gif,css,jpg,jpeg,png,js,json,ico -P -T 5 -t 1 -e robots=off --http-user=USER -U "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36 Edge/12.0" --referer="https://gocomics.com/" https://www.gocomics.com/shermanslagoon/$(date +%Y)/$(date +%m)/$(date +%d)

This will download the old comics down below, but not the latest comic being displayed by the Viewer up top. Can anybody figure out how to get wget to access the DAILY comic?
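One workaround to try, assuming the daily strip's image URL still appears in the page's og:image meta tag (an assumption; inspect the served HTML to confirm): fetch the page once, extract that URL, then download the image directly.

url="https://www.gocomics.com/shermanslagoon/$(date +%Y)/$(date +%m)/$(date +%d)"
# Pull the og:image URL out of the page source (GNU grep; -P enables \K)
img=$(curl -sL -A "Mozilla/5.0" "$url" | grep -oP '<meta property="og:image" content="\K[^"]+')
curl -sL -A "Mozilla/5.0" -o "shermanslagoon-$(date +%F).png" "$img"

If that tag turns out to be injected client-side as well, a headless browser step would be needed instead of wget/curl.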

Thank you.

797
 
 
The original post: /r/datahoarder by /u/Any_Bandicoot9863 on 2025-06-19 13:14:42.

Hey everyone,

I’ve been trying to get the Internet Archive’s attention about a serious and time-sensitive issue I reported over a week ago. I’ve done everything I can think of:

  • Emailed them at info@archive.org
  • Submitted the official contact form
  • Posted a polite GitHub issue
  • Even tweeted at @internetarchive

Still no reply.

I know they’re a nonprofit and probably flooded with requests, but this isn't just a normal takedown or technical bug — it's something that really needs human eyes ASAP. The silence has been honestly overwhelming — I’ve gone from stressed to anxious to just plain frustrated.

Has anyone here had luck getting a faster response from them?

Maybe a backchannel, an active team member, or even a time of day they’re more likely to reply?

Any advice would mean a lot. 🙏

Thanks for reading — and for any help you can share.

798
 
 
The original post: /r/datahoarder by /u/eymo-1 on 2025-06-19 10:18:35.
799
 
 
The original post: /r/datahoarder by /u/Reasonable_Brief578 on 2025-06-19 08:59:37.

Hey everyone! 👋

I’d like to share a small open-source project I built called LocalAlbum — a simple desktop app that lets you easily browse local photo and video collections using your default browser.

https://reddit.com/link/1lf62ip/video/os1sriphmu7f1/player


🔧 What it does:

  • 📂 Select a local folder with images/videos
  • 🌐 Launches a lightweight, zero-dependency local web server
  • 🖥️ Opens in your default browser as a clean, navigable photo album
  • ⚡ Super lightweight — no database, no indexing, no cloud, no tracking

✅ Why I made this:

I wanted a quick and private way to view photos from external drives, archives, and backups without uploading them anywhere or installing complex media servers.

LocalAlbum runs entirely locally — perfect for minimalists, tinkerers, or data hoarders who want control over their media browsing.

💻 Tech stack:

  • Python (just 1 script)
  • HTML/CSS frontend
  • Works on Linux, macOS, and Windows (tested on all)

🚀 Get started:

git clone https://github.com/Laszlobeer/localalbum
cd localalbum
python3 app.py

No install, no nonsense — just point it to a folder, and browse.

🙌 Looking for:

  • Feedback / feature ideas
  • Contributors welcome!
  • If you use it, I’d love to know how you’re organizing your local media

📎 GitHub: https://github.com/Laszlobeer/localalbum

more info in the repo!

Thanks for reading, and happy browsing! 😊


800
 
 
The original post: /r/datahoarder by /u/DRONE_SIC on 2025-06-19 08:47:48.