It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

8976
 
 
The original post: /r/datahoarder by /u/dokha on 2024-06-19 11:47:01.

It's important to note that the kind of assessment pages I'm talking about are the casual ones, such as the fun quizzes you take on BuzzFeed and the ones you see on astrology sites.

I have no idea which tools are right for this or how to use them. I did try browser add-ons such as SingleFile, but the offline copies never capture the results page after taking the quiz.

8977
 
 
The original post: /r/datahoarder by /u/PuzzleHeadPistion on 2024-06-19 10:45:43.

Hey,

I've got two NAS units and a bunch of drives: one Exos 16TB, two Barracuda 8TB, one IronWolf 6TB, one WD "white" 6TB, and one WD Red 3TB. My main NAS is a desktop with 6+ slots; my second NAS is a 4-bay Asustor.

Currently I don't use RAID/parity of any kind. All drives are single volumes, and I just sync each drive from my main NAS to the second NAS, so I try to keep the drive setup the same on both. However, the recent addition of the Exos 16TB kills that.

Since my main NAS is moving to FreeBSD/TrueNAS with a ZFS pool, what's the best setup?

I was thinking maybe 8TB + 8TB + 6TB + 6TB in the ZFS pool, then 16TB + 3TB as single volumes on my second NAS. That way the drives in the pool are as similar in size as possible, and if parity costs one drive, the pool will end up with about the same total capacity as the second NAS.

Does this make sense?
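
For what it's worth, a minimal sketch of how that pool could be created on the TrueNAS/FreeBSD side, assuming the four drives show up as ada0 through ada3 (device names here are placeholders; in practice you'd use GPT labels or /dev/diskid paths):

# raidz1 across all four drives: each member only contributes as much as the
# smallest disk, so 8+8+6+6 behaves like 4 x 6TB with one disk of parity (~18TB raw)
zpool create tank raidz1 ada0 ada1 ada2 ada3

Two mirror vdevs (8+8 and 6+6) would be the other obvious layout, trading some capacity for faster resilvers and easier per-vdev upgrades.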

8978
 
 
The original post: /r/datahoarder by /u/DevanteWeary on 2024-06-19 16:42:25.

Hey guys. I'm using an mITX motherboard, I'm out of my four SATA ports, and I need four more ports to connect four more drives.

My PCIe slot is free and both m.2 slots are taken.

What's the best way to get more free SATA ports?

It's for my low power streaming server/NAS that is running Unraid.

Thank you for any advice!

8979
 
 
The original post: /r/datahoarder by /u/Thehobbyist916 on 2024-06-19 16:24:33.

yt-dlp is only able to download one link at a time

Anyone have any suggestions or advice?

Also, I’d like to be able to download YouTube RED content

Thanks
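
In case it helps, yt-dlp can take a whole list of links in one run via its batch-file option; this is a minimal sketch, with urls.txt and downloaded.txt as placeholder file names:

# one URL per line in urls.txt; --download-archive records finished IDs so re-runs skip them
yt-dlp -a urls.txt --download-archive downloaded.txt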

8980
 
 
The original post: /r/datahoarder by /u/dany_897 on 2024-06-19 15:31:09.

Just received an email saying that my 3GB plan will be downgraded to 1GB in 10 days.

So for everyone contemplating subscribing to the service, reconsider: if the company is not able to sustain 3GB free accounts, how will it be able to honor lifetime plans in the near future?

8981
 
 
The original post: /r/datahoarder by /u/diradi on 2024-06-19 15:06:12.

I subscribed to an online course service, and the provider uploads the class recordings to the platform through a private Vimeo server. I can watch the classes, but it's practically impossible to download them using traditional methods. I was able to download some videos on the platform using IDM (Internet Download Manager), but lately, whenever I try to download a video, a message appears saying "Unknown error, please try again."

Can someone help me with a solution? Either a method to download private Vimeo videos or a way to fix the IDM error.

Thank you.
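
One thing that sometimes works for embedded/private Vimeo players, offered as a hedged sketch rather than a sure fix: yt-dlp with the embedding course page passed as the referer (both URLs below are placeholders):

# the player URL usually shows up in the page source or the browser dev tools network tab
yt-dlp --referer "https://courseplatform.example/lesson-1" "https://player.vimeo.com/video/123456789"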

8982
 
 
The original post: /r/datahoarder by /u/AlvTellez on 2024-06-19 15:04:48.

I have several larger hard drives:

  • 5TB portable drive connected to my HTPC for films and series via Plex
  • Two unused 8TB drives
  • 4TB drive containing important media (that doesn't fit in my laptop and/or is more important)

Initially, I considered getting a Synology NAS, but with less than 10TB of actual data, it seems like overkill, especially since I rarely access this data and usually keep the drives unplugged, except for the 5TB drive that's always connected to my HTPC (I also don't really need a NAS for my Plex needs, since I already have the HTPC as a server for that).

After reading some posts, I thought about purchasing a license for Bvckup 2, which is more cost-effective and would allow me to use my other drives for backup.

My plan is to transfer data from the 5TB drive to one of the 8TB drives and periodically back up that data to the other 8TB drive. If I run out of space, I could use the 4TB drive similarly and back up its data to the now-freed 5TB drive.

While this might sound inefficient to experienced data hoarders, how bad/good is this idea? Are there any other software options that could simplify this process, compared to manually copying and pasting data between drives?
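
For reference, a minimal sketch of the periodic backup step using only what ships with Windows, assuming the source and backup drives mount as D: and E: (placeholder letters):

robocopy D:\ E:\ /MIR /R:1 /W:1 /LOG:C:\backup-log.txt

/MIR mirrors the source exactly, so anything deleted on D: also disappears from E: on the next run; Bvckup 2 or another sync tool with versioning would be safer if accidental deletions are a concern.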

8983
 
 
The original post: /r/datahoarder by /u/justreddit2024 on 2024-06-19 14:08:01.
8984
 
 
The original post: /r/datahoarder by /u/auridas330 on 2024-06-19 13:27:11.

I bought two WD Elements drives, both the same size, manufacture date, and drive number, but one needed me to play with the 3.3V pin before it would show up.

Never knew that WD plays Russian roulette with their drives lol

8985
 
 
The original post: /r/datahoarder by /u/didyousayboop on 2024-06-19 11:48:06.

Important information from the Canadian Conservation Institute, an agency of the federal government of Canada.

Table 2: the relative stability of optical disc formats

| Optical disc format | Average longevity |
| CD-R (phthalocyanine dye, gold metal layer) | >100 years |
| CD-R (phthalocyanine dye, silver alloy metal layer) | 50 to 100 years |
| DVD-R (gold metal layer) | 50 to 100 years |
| CD (read-only, such as an audio CD) | 50 to 100 years |
| CD-RW (erasable CD) | 20 to 50 years |
| BD-RE (erasable Blu-ray) | 20 to 50 years |
| DVD+R (silver alloy metal layer) | 20 to 50 years |
| CD-R (cyanine or azo dye, silver alloy metal layer) | 20 to 50 years |
| DVD+RW (erasable DVD) | 20 to 50 years |
| BD-R (non-dye, gold metal layer) | 10 to 20 years |
| DVD-R (silver alloy metal layer) | 10 to 20 years |
| DVD and BD (read-only, such as a DVD or Blu-ray movie) | 10 to 20 years |
| BD-R (dye or non-dye, single layer or dual layer) | 5 to 10 years |
| DVD-RW (erasable DVD) | 5 to 10 years |
| DVD+R DL (dual layer) | 5 to 10 years |

8986
 
 
The original post: /r/datahoarder by /u/syswraith on 2024-06-19 11:30:43.

The website seems to have been down for a while now. Also, my password's *******. What's yours?

8987
 
 
The original post: /r/datahoarder by /u/Nerds_r_us45 on 2024-06-19 10:42:20.

I like downloading some channels in bulk and idk if this will break my ability to hoard music easily or not.

8988
 
 
The original post: /r/datahoarder by /u/500xp1 on 2024-06-19 09:06:01.

Looking for a method that makes it impossible to recover the wiped data.
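
A hedged sketch of the common approaches on Linux, with /dev/sdX as a placeholder for the target drive (triple-check the device name before running either):

# overwrite the whole drive; a single random pass is generally considered unrecoverable on modern drives
sudo shred -v -n 1 /dev/sdX

# or let the drive firmware do it via ATA Secure Erase (SATA drives; the drive must not be frozen)
sudo hdparm --user-master u --security-set-pass p /dev/sdX
sudo hdparm --user-master u --security-erase p /dev/sdX

For SSDs, the firmware erase (or blkdiscard) is preferable to overwriting.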

8989
 
 
The original post: /r/datahoarder by /u/Long_Instruction_391 on 2024-06-19 00:48:29.
8990
 
 
The original post: /r/datahoarder by /u/sir_fish_cakes on 2024-06-19 08:28:35.

I currently use an external 2TB hard drive to store my games for my Xbox, and I want to offload some of these games onto my discs and another, larger hard drive. But when I connect the drive to my Mac, I'm not able to view or transfer its contents without wiping it completely. Does anyone know of software I can use to store this digital data?

8991
 
 
The original post: /r/datahoarder by /u/Comfortable_Ad_6823 on 2024-06-19 07:36:20.

I am currently ripping my SpongeBob DVDs. For anyone unaware, (almost) all SpongeBob episodes are split into A and B parts. For example, "Pizza Delivery" is only episode 5a, with "Home Sweet Pineapple" being episode 5b. Two "segments" make up one episode.

Normally this isn't a problem, as each segment in the SpongeBob DVDs has its own .mkv file. That is, until season 9, where there are no .mkv files for individual segments, only the combined episode. This is rather annoying as I don't want to scrub to the half-way mark of the videos just to watch the episode I actually want to see. I've thought of splitting the files in half, but it would be a tedious process as seasons are quite long.

Are there any programs or tools that would make this easier?
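
One option, sketched here with ffmpeg and placeholder file names, assuming each segment runs roughly 11 minutes (you'd adjust the timestamp per episode):

# stream-copy without re-encoding; the cut lands on the nearest keyframe
ffmpeg -i "S09E01.mkv" -to 00:11:30 -c copy "S09E01a.mkv"
ffmpeg -ss 00:11:30 -i "S09E01.mkv" -c copy "S09E01b.mkv"

mkvmerge's split option can do the same if you prefer staying inside MKVToolNix.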

8992
 
 
The original post: /r/datahoarder by /u/EasyMoney322 on 2024-06-19 07:21:49.

Hello, I'm looking for self-hosted software (that also wouldn't upload photos anywhere) that can do image recognition on a file share with an acceptable success rate. I was able to find posts on this sub about NSFW body-part recognition, but that's not what I'm looking for.

What level of recognition? It must be able to tell apart photos of mass events, people, pets, documents, roads, buildings, etc. Having them organized by location metadata would be a bonus, as would finding similar (almost duplicate, but with a different hash) images.

It would be great if I could select all the tagged images afterwards, re-check them for false positives, and delete them.

The file share is hosted on an openSUSE VM, but I can also deploy and mount it on any other OS on the same server. I have plenty of processing power, but I'd like to avoid training a model myself.

8993
 
 
The original post: /r/datahoarder by /u/another_lease on 2024-06-19 04:50:15.

I need to do Boolean search on my unconnected disks.

E.g., I need to find files whose names contain both the word "confidence" and the word "interval". If I enter confidence interval in the search bar,

  • Cathy will only find files that contain the exact phrase "confidence interval" in their file name.
  • Virtual Volumes View will find files that contain both words anywhere in the file name, but it will also return files that contain only one of the two words.

I know that Cathy and VVV have come up before on this subreddit. I was wondering if anyone's figured out some portable freeware that can do Boolean search.

Thanks in advance.
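
Not a packaged freeware tool, but the same effect can be had with plain text catalogs if a Unix-ish shell is available (WSL, Git Bash, etc.); a hedged sketch with placeholder paths, assuming each disk gets indexed once while it's connected:

# while the disk is plugged in, dump every file path into a catalog
find /mnt/offline_disk_01 > disk01.txt

# later, with the disk unplugged: AND-search the catalog (both words, any order, case-insensitive)
grep -i "confidence" disk01.txt | grep -i "interval"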

8994
 
 
The original post: /r/datahoarder by /u/VanillaKirby on 2024-06-19 04:22:10.

Hello, I need some assistance with archiving a book that contains some of the only documentation of one of the few shorthand systems for the Norwegian language, Wang-Krogdahl.

The book has been scanned and is available from the National Library of Norway here: https://www.nb.no/items/URN:NBN:no-nb_digibok_2016011905022

You will need a Norwegian VPN to access the book (I am using TunnelBear with a free license).

I do not know of a way to extract these scans from the website; help appreciated.

8995
 
 
The original post: /r/datahoarder by /u/ejpman on 2024-06-19 00:38:24.

Has anyone seen this before? Four identical new disks with a huge difference in completion time for long SMART tests. They are 14TB HC530s.

admin@NAS[~]$ sudo smartctl -x /dev/sdc | grep 'test remaining.'
admin@UgreenNAS[~]$ sudo smartctl -x /dev/sda | grep 'test remaining.'
10% of test remaining.
admin@UgreenNAS[~]$ sudo smartctl -x /dev/sdb | grep 'test remaining.'
60% of test remaining.
admin@UgreenNAS[~]$ sudo smartctl -x /dev/sdc | grep 'test remaining.'
10% of test remaining.
admin@UgreenNAS[~]$ sudo smartctl -x /dev/sdd | grep 'test remaining.'
60% of test remaining.

8996
 
 
The original post: /r/datahoarder by /u/TruthsTrueTruant on 2024-06-18 23:58:13.

I have a little home server, built around a cheapo mini-PC - Intel N100, 16GB RAM, 500GB internal SSD - that I’m finally upgrading the storage on. Storage workload is mainly Plex library, Seafile document/photo storage, and backups of other computers on the network.

I have 4 12TB HDDs that I was planning on setting up as a RAID1+0, and am debating the pros and cons of going for a full ZFS setup instead (as I understand it, the equivalent would be 1 zpool -> 2 vdevs -> 2 disks each).

I know that ZFS has the advantage of better integrity protection against bitrot etc., and, depending on who you ask, is simpler to administer. It sounds like ZFS can also have better I/O performance depending on tuning, but I don't think I really care about that for my current workload. However, I know it can also have pretty significant memory and CPU overhead. I haven't found much info on how much that actually is, though, beyond the 1GB/TB rule of thumb for dedup. On a server with these specs, how much performance impact can I expect ZFS to actually have? Is it enough to be worth sticking with RAID, or is it overblown?
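
For reference, a minimal sketch of the layout described above, with /dev/sd* as placeholder device names (by-id paths are the safer choice in practice):

# one pool, two mirror vdevs of two disks each -- ZFS's equivalent of RAID 1+0
zpool create tank mirror /dev/sda /dev/sdb mirror /dev/sdc /dev/sdd

Without dedup, the memory cost is mostly the ARC read cache, which grows opportunistically and can be capped (on Linux, e.g. via the zfs_arc_max module parameter), so 16GB of RAM is generally considered workable for this kind of home workload.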

8997
 
 
The original post: /r/datahoarder by /u/Topangers on 2024-06-18 21:22:19.

I have iCloud, but the pop-ups that sometimes appear confuse me, and it doesn't seem like the images are actually being backed up. I need to make space on my phone, but I hoard the images and videos; same with the general data and applications on my device. Any advice would be greatly appreciated! :)

8998
 
 
The original post: /r/datahoarder by /u/88rcg on 2024-06-18 21:00:44.

I currently have a custom script that I run manually from time to time to poll my drives for their SMART data and record it in date-stamped text files.

However, I don't know when I should really attempt to RMA my drives, and I'd also like to set up some form of automated notification so I can get an email that effectively tells me a drive needs to be replaced.

I am using a bare-metal system with the latest Ubuntu Server 24.04 LTS and use SnapRAID as a cheap backup solution that provides parity.

Drives:

NAME          MAJ:MIN RM  SIZE RO TYPE MOUNTPOINTS
sda             8:0    0 16.4T  0 disk /mnt/disk9
sdb             8:16   0  9.1T  0 disk /mnt/disk3
sdc             8:32   0 12.7T  0 disk /mnt/disk4
sdd             8:48   0  9.1T  0 disk /mnt/disk2
sde             8:64   0 14.6T  0 disk /mnt/disk7
sdf             8:80   0 12.7T  0 disk /mnt/disk6
sdg             8:96   0 12.7T  0 disk /mnt/disk1
sdh             8:112  0 16.4T  0 disk /mnt/disk12
sdi             8:128  1 16.4T  0 disk /mnt/disk11
sdj             8:144  1 16.4T  0 disk /mnt/disk8
sdk             8:160  1 14.6T  0 disk /mnt/disk5
sdl             8:176  1 14.6T  0 disk /mnt/disk10
sdm             8:192  1 16.4T  0 disk /mnt/parity2
sdn             8:208  1 16.4T  0 disk /mnt/parity3
sdo             8:224  1  2.3T  0 disk /mnt/temp2
sdp             8:240  1 16.4T  0 disk /mnt/parity
nvme1n1       259:0    0  1.8T  0 disk
├─nvme1n1p1   259:2    0    1G  0 part /boot/efi
├─nvme1n1p2   259:3    0    1G  0 part
├─nvme1n1p3   259:4    0    1G  0 part /boot
└─nvme1n1p4   259:5    0  1.8T  0 part
  └─vg0-lv--0 252:0    0  1.8T  0 lvm  /
nvme0n1       259:1    0  1.8T  0 disk /mnt/temp

I've linked my latest report below, which provides a lot of info, but I'd rather get emailed notifications that tell me when to replace a drive than have to look up which drive attributes and thresholds really matter.

latest report: https://pastebin.com/dwbF2F8u
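
A hedged sketch of what that usually looks like with smartmontools' smartd daemon, assuming outbound mail already works on the box (the address and test schedule below are placeholders to adjust):

# /etc/smartd.conf
# monitor all drives (-a), run a short self-test daily at 02:00 and a long one
# on Saturdays at 03:00 (-s), and email on any SMART failure or new errors (-m)
DEVICESCAN -a -s (S/../.././02|L/../../6/03) -m you@example.com

# then restart the daemon (the service may be named smartd or smartmontools depending on the distro)
sudo systemctl restart smartmontools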

8999
 
 
The original post: /r/datahoarder by /u/humanbandwidth on 2024-06-18 16:00:34.

I know how data centers work when setting up backups and backups of backups, regularly transferring data back and forth between the copies.

With that said, I have a question about TV shows that are non-essential but important to me.

I have a substantial amount of TV hoarded, upwards of 16TB.

Is it enough to transfer this data from HD A to another identical hard drive HD B every 2-3 years; quick format HD A; then transfer it back to HD A?

Will this theoretically prevent data rot? HD A is a doomsday drive that isn't used regularly; I plug it in a couple of times a year.

Will using it this way provide 10, 15, 20 years of storage time and will CrystalDiskInfo be the best way to monitor it?

Side note: can CrystalDiskInfo detect bitrot?
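
For what it's worth, CrystalDiskInfo reads drive SMART health rather than file contents, so catching bitrot usually comes down to checksums; a hedged sketch with placeholder paths:

# once, while the data is known good: record a checksum for every file on the drive
find /mnt/HD_A -type f -exec sha256sum {} + > HD_A.sha256

# on each yearly plug-in: re-verify and report only files that changed or went missing
sha256sum -c HD_A.sha256 --quiet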

9000
 
 
The original post: /r/datahoarder by /u/Ninj_Pizz_ha on 2024-06-19 00:49:53.

I expect I'll get some flak from people super immersed in this subculture, but why do people still recommend opening up random files in the backup to make sure the backup actually worked? Why isn't rsync -c or the equivalent sufficient? Personally I only open my backups once in a blue moon. Maybe there's some edge case where the rsync checksum itself is faulty or something, I guess, but that's not on my list of likely concerns tbh.
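
For reference, a hedged sketch of the kind of check being discussed, with placeholder paths, assuming both the source and the backup are mounted:

# -a recurse/preserve, -n dry-run (change nothing), -c compare file contents by checksum,
# -i itemize each difference; no output means the contents match
rsync -anci /mnt/source/ /mnt/backup/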
