It's A Digital Disease!

This is a sub that aims at bringing data hoarders together to share their passion with like minded people.

9476
 
 
The original post: /r/datahoarder by /u/X2ytUniverse on 2024-06-01 11:41:42.

Over the last few years I've done quite a bit of wedding photography and videography, and have quite a lot of footage. As a rule of thumb, I keep footage for 5 years in case people need some additional photos or videos later (it's happened only about 3 times ever, but still).

For quite some time I've been using an OM-D E-M5 Mark III, which as far as I know can only record in H.264 (at least that's what we've always recorded in), and I only switched to an H.265/HEVC-capable camera quite recently. The problem is that I've got terabytes of old H.264 files left over, and space is becoming an issue; there are only so many drives I can store safely and/or keep connected to the computer.

What I'd like is to convert the H.264 files to H.265, which would save me terabytes of space, but all the solutions I've found so far only handle a small number of files at a time, and even then conversion takes quite a while.

What I've got is ~3520 video files in H.264, totaling around 9 terabytes.

What would be the best way to convert all of that to H.265?
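
A minimal batch-conversion sketch in Python, assuming ffmpeg with the libx265 encoder is installed and the originals sit under one folder; the paths, CRF value and preset below are placeholders to tune for your own quality/size trade-off, and the *.mp4 glob would need adjusting if the camera wrote .mov files:

  import subprocess
  from pathlib import Path

  SRC = Path("/mnt/footage/h264")   # hypothetical folder holding the originals
  DST = Path("/mnt/footage/h265")   # hypothetical output folder

  for clip in sorted(SRC.rglob("*.mp4")):
      out = (DST / clip.relative_to(SRC)).with_suffix(".mp4")
      out.parent.mkdir(parents=True, exist_ok=True)
      if out.exists():
          continue  # resume-friendly: skip clips converted on a previous run
      subprocess.run(
          ["ffmpeg", "-n", "-i", str(clip),
           "-c:v", "libx265", "-crf", "22", "-preset", "medium",
           "-c:a", "copy",        # keep the original audio stream untouched
           "-tag:v", "hvc1",      # helps some players recognise HEVC inside MP4
           str(out)],
          check=True,
      )

Spot-check a handful of converted clips against the originals before deleting anything; re-encoding to H.265 is lossy relative to the H.264 masters, so the CRF and preset choice is where the space saving is decided.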

9477
 
 
The original post: /r/datahoarder by /u/fatguybike on 2024-06-01 11:25:49.

I'm running out of space on my iCloud, MacBooks and phones. I was leaning towards a NAS, and then everyone here says you should have one copy offsite, which got me thinking. I already have one 18TB external HDD. If I purchased two more 18TB drives and kept two at home and one at work, could I upload to one and have it automatically copy the files to the others? Would that be possible? I'm currently doing the juggling thing where you upload to one drive and then try to manually copy to another, and it's a major PITA.
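
One low-tech way to kill the manual juggling, assuming the drives mount at predictable paths on a Mac: a small script that mirrors the primary external onto the second one with rsync, run (or scheduled) whenever both are plugged in. The paths below are placeholders; the offsite drive at work would be refreshed the same way whenever it comes home, or a tool like Syncthing could cover the over-the-network case.

  import subprocess

  PRIMARY = "/Volumes/Archive-A/"   # hypothetical mount point of the main drive
  MIRROR = "/Volumes/Archive-B/"    # hypothetical mount point of the second drive

  # -a preserves timestamps/permissions, --delete makes MIRROR an exact copy,
  # and the trailing slash on PRIMARY copies its contents rather than the folder itself.
  subprocess.run(
      ["rsync", "-a", "--delete", "--progress", PRIMARY, MIRROR],
      check=True,
  )

Note that --delete makes the mirror an exact copy, so an accidental deletion on the primary propagates on the next run; this is sync, not versioned backup.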

9478
 
 
The original post: /r/datahoarder by /u/rubbishfairy on 2024-06-01 08:34:36.

Hey guys, I know there is a lot of info about this out there already, but it seems to go out of date quickly and a lot of it just seems to be wrong. So I'm hoping there are some proper experts in this sub.

Firstly, I know there are various browser plugins. I got one called "Video DownloadHelper" especially for the task, but it just doesn't work with FB. I'm also a fan of JDownloader, but again, that doesn't work with FB.

Secondly, I know you can switch "www" for "mbasic" and find the actual URLs of videos. You can then download them, but only, it seems, in the lowest possible quality. I suspect the URL just needs to be changed slightly to give me the high-quality version, but I don't know how.

What do you guys suggest?
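
yt-dlp (the actively maintained youtube-dl fork) has a Facebook extractor and generally grabs the best quality it can see; whether it works on a given post depends on the post's privacy settings. A minimal sketch using its Python API, with a placeholder URL:

  from yt_dlp import YoutubeDL   # pip install yt-dlp

  opts = {
      "format": "bestvideo+bestaudio/best",      # take the highest quality on offer
      "outtmpl": "%(title)s [%(id)s].%(ext)s",   # readable file names
      # "cookiesfrombrowser": ("firefox",),      # uncomment for videos that need a login
  }

  with YoutubeDL(opts) as ydl:
      ydl.download(["https://www.facebook.com/watch/?v=1234567890"])   # placeholder URL

The same tool works from the command line (yt-dlp <url>); passing browser cookies is usually needed for anything that isn't fully public.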

9479
 
 
The original post: /r/datahoarder by /u/ChroodlesG on 2024-06-01 06:49:41.

Hello!

I'm currently hosting a giveaway on TikTok, and I've had FAR more entries than I expected. A bit over 115 thousand of them, to be precise. I'm supposed to be livestreaming the winner tomorrow afternoon, but I'm at a loss for how to choose somebody. All of the websites I've found want to charge me 100+ dollars to pull all of the comments, which I simply don't have. That would be close to a third of the giveaway's total value, which I really don't want to spend.

Does anybody have any ideas on how I can get the comments into a spreadsheet, or any other ideas on how I can pick a winner? I've tried manually scrolling, but my browser freezes up and starts to load painfully slowly after the thousand-comment mark. Because of this, the solution that I *thought* would work (a browser console script that scrapes the comments; I've used it before on MUCH smaller videos) isn't working... which is why this request is so last minute.

I don't think I'm allowed to post a link here, but if anybody wants the link I'd be happy to provide it! Thank you so much!
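
For the draw itself, once the comments are in a spreadsheet (exported as CSV) the hard part is done; a sketch assuming a column named "username", which you'd adjust to whatever your export actually contains:

  import csv
  import secrets   # cryptographically strong randomness, nicer for a public draw

  with open("comments.csv", newline="", encoding="utf-8") as f:
      entrants = [row["username"] for row in csv.DictReader(f)]   # assumed column name

  # One entry per person, even if someone commented multiple times.
  unique_entrants = sorted(set(entrants))

  print(len(unique_entrants), "unique entrants")
  print("Winner:", secrets.choice(unique_entrants))

Doing the pick live on stream from the printed entrant count keeps it transparent, and deduplicating first means multiple comments don't buy extra chances.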

9480
 
 
The original post: /r/datahoarder by /u/Fairytaleautumnfox on 2024-06-01 02:34:05.

I haven’t gotten an external hard drive yet, but it’s something I’m planning on buying.

Is it better to save up downloads on my laptop and just do one big transfer from that to the external every so often, or should I do smaller, more frequent transfers?

9481
 
 
The original post: /r/datahoarder by /u/cruciblemedialabs on 2024-06-01 02:27:37.

Original Title: Absolutely at the end of my rope with looping backups. 3 weeks of back-and-forth with support has gotten me nowhere, despite them claiming it's a resolved bug and implementing every "fix" they told me to implement.


Pretty much title.

I'm a photographer and cinematographer and I have terabytes and terabytes of stuff that I need backed up across multiple drives.

Until about a month ago, Backblaze worked flawlessly. No issues whatsoever.

Now, every single time a backup finishes, it starts right back up again from the beginning. Every time I reboot my machine, it starts up again, from the beginning. Every time I pause a backup and resume it, it starts up again, from the beginning. Every time I change my scheduling to a different option, it starts up again, from the beginning. Always the same amount of data, ~2.4TB.

Support has told me that it was an issue with temporary files because they were listed as issues in the control panel, so I needed to exclude them. So I did. Nope, still happening.

Then they told me it was a resolved bug, and to download the newest beta client. So I did. Nope, still happening.

Then they said to change to the current full release rather than beta. So I did. Nope, still happening.

Then they said it was an issue with Windows Defender, and to add Backblaze to the exceptions list. So I did. Every single folder and process in any way associated with Backblaze, one by one. Nope, still happening.

I've sent them logs no less than 3 times. I've sent in a process report. And every single time, it's something different.

I did have one shining glimmer of hope today, when it finished and seemed to actually be finished. That is, until I moved some data from an ingest drive to one of my drives marked for backup, which was about 1.2TB in total, and it started fucking backing up that same 2.4TB of files. Again. Before the data had even been fully moved over.

I've checked the logs. There were errors in the earlier ones, but they seem to have been resolved by one of the troubleshooting steps. And yet the client is still broken.

I swear to god I'm about to lose it. Every time one of these backups starts it grinds my network upload speed to a halt, and I can't throttle it because the backups would go from several days long to several weeks, which simply isn't practical. I straight-up don't understand what's wrong. Any help would be appreciated.

9482
 
 
The original post: /r/datahoarder by /u/tellmewhy24 on 2024-05-31 20:19:10.

Do you guys accept your losses and move on? Keep backups of backups of everything? Or try to recreate and re-save whatever you lost?

9483
 
 
The original post: /r/datahoarder by /u/SausageGrenade on 2024-05-31 20:00:52.

I am a video editor with 90 TB of data. I have multiple RAID devices at home, and I back them up to a QNAP NAS, also at home. For remote backup, I have 5x 18TB WD external drives that I back up to and keep at my parents' house a few miles away; I bring them home once in a while to update the backups. True cloud backup is too expensive for me. I am wondering if this is the best setup, or whether I should switch to a NAS at my parents'. The advantage I see is that I could back up to their NAS remotely, without driving the drives home, but I don't know if that is possible without a very expensive enclosure. Am I getting too fancy? I am also worried about them accidentally unplugging it or something. I can't afford an expensive enclosure, so I wonder if the safest solution is the one I already have, or if a NAS at my parents' house is better. Any input is helpful.
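
For what it's worth, if the parents' end ever does become a NAS (or even a small single-board computer with one of the 18TB drives attached), the "without driving the drives over" part is typically just rsync over SSH; a sketch with the host name, user and paths as placeholders, plus a bandwidth cap so their connection stays usable:

  import subprocess

  SRC = "/Volumes/ProjectArchive/"                   # hypothetical local source
  DEST = "backup@parents-nas:/volume1/edit-backup/"  # hypothetical remote target

  subprocess.run(
      ["rsync", "-a", "--partial", "--progress",
       "--bwlimit=5000",   # roughly 5 MB/s so the remote connection stays usable
       "-e", "ssh",
       SRC, DEST],
      check=True,
  )

The current rotate-the-externals setup is already a legitimate offsite copy; a remote NAS mainly buys more frequent updates at the cost of another box to maintain.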

9484
 
 
The original post: /r/datahoarder by /u/MachineThatGoesP1ng on 2024-05-31 19:06:00.

I'm assuming some of you use FreeFileSync, and I was wondering if anyone can help me with automatically syncing files to a specific SD card. The problem I'm having is that when I set the SD card as the destination folder, FFS only recognizes the D: drive and doesn't treat the SD card as unique, so if I put in a different SD card it still tries to sync onto that drive. I'm sure the SD card has a unique ID somewhere; is there a way for FFS to recognize that and use it as the destination, and how do I obtain the unique ID?
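
I believe FreeFileSync can address a destination by volume label rather than drive letter (its documentation on variable drive letters describes a [VolumeName]\folder path form; worth verifying for your version), which would solve the wrong-card-in-the-slot problem. Reading a card's label and serial number on Windows is easy either way; a sketch that parses the output of the built-in vol command, assuming English-locale output:

  import re
  import subprocess

  def volume_identity(drive: str = "D:") -> tuple[str, str]:
      """Return (volume label, volume serial) for a mounted drive on Windows,
      parsed from the built-in 'vol' command (English-locale output)."""
      out = subprocess.run(
          ["cmd", "/c", "vol", drive], capture_output=True, text=True, check=True
      ).stdout
      label_match = re.search(r"Volume in drive \w is (.+)", out)
      serial_match = re.search(r"Serial Number is ([0-9A-Fa-f-]+)", out)
      label = label_match.group(1).strip() if label_match else ""
      serial = serial_match.group(1) if serial_match else ""
      return label, serial

  if __name__ == "__main__":
      label, serial = volume_identity("D:")
      print("Label:", label, " Serial:", serial)

A small wrapper could check the serial before launching the sync and refuse to run if the wrong card is inserted.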

9485
 
 
The original post: /r/datahoarder by /u/Distinct-Yoghurt5665 on 2024-05-31 17:37:44.

I really hope it's ok to ask this. From the rules and the wiki it does seem ok, but I'm new to this sub.

So currently I have two external SSDs for my home server. I'm using SSD_A to save all my stuff to it and SSD_B as a backup.

Now, in the near future my SSD_A will be full. So I want to do the following: buy a very large HDD, use that as the backup, and then use the two SSDs as the standard mounts the home server saves to.

Ideas

  1. I could use mergerfs to combine both external SSDs into one mount point. I need to do this for some of my applications.
  2. I could then use rsync to create my backups from the SSD-mergerfs-mount to the HDD.
  3. Hopefully there is an rsync flag to indicate that I do not want to sync already existing data again (even if it changed). This ensures that broken files will not overwrite healthy files on the HDD. Yes, I know that this means that changes to files won't be backed up but this is ok for my use case.
  4. I could use ext4 for all drives, cause there is no reason not to and I do not know anything else.

What do the experts think? Does that sound like a good approach? Could anything go wrong? I do not care about write speed to the SSDs; they are fast enough, so I do not need striping. I also do not care if some data gets lost between backup cycles, so I do not need RAID.

Is mergerfs the best there is and is it easy to set up?

Should I use anything other than ext4? ext4 has always been my go-to and I do not really see anything wrong with it. File hashes seem unnecessary because I can just avoid overwriting existing files; that seems to do the trick for my use case.

Very much appreciate any feedback on my plan.
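
On point 3: rsync's --ignore-existing flag does exactly that; it skips anything already present on the receiving side, even if the source copy has since changed. A sketch of the backup step, with the mount points as placeholders:

  import subprocess

  MERGED = "/mnt/storage/"      # hypothetical mergerfs mount of SSD_A + SSD_B
  BACKUP = "/mnt/backup-hdd/"   # hypothetical ext4 backup HDD

  subprocess.run(
      ["rsync", "-a",
       "--ignore-existing",    # never touch files already present on the backup
       "--itemize-changes",    # log exactly what was copied on this run
       MERGED, BACKUP],
      check=True,
  )

This also means deletions and corruption on the SSDs never propagate to the HDD, but genuinely edited files never get re-backed-up either, which matches the trade-off you described.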

9486
 
 
The original post: /r/datahoarder by /u/EngineeringGlum5318 on 2024-05-31 15:52:00.

Hello everyone, as my title suggests I have a bunch of old compact VHS (VHS-C) tapes that I'm looking to digitize.

Just trying to figure out how to best go about digitizing with the right equipment for decent quality.

Any suggestions / help is welcome!

9487
 
 
The original post: /r/datahoarder by /u/RDRulez on 2024-05-31 15:03:04.

The Fractal Node 804 case is stated to fit up to 10 HDDs. Finding a setup that achieves this is proving challenging. I found a motherboard that accepts 6x SATA connections, so the remaining questions I'd like help with are:

  • Can I utilise one or more PCIe adapters to add 4 more to achieve 10 HDDs? If yes, can you recommend what exactly I need to purchase to make it happen?
  • If not, what is the maximum HDD setup you think I can achieve with this build?
  • Any other considerations for the build? The current intention is an OS like TrueNAS Scale for containers plus a Jellyfin media server

Build proposal: https://uk.pcpartpicker.com/list/Z9CzVW

9488
 
 
The original post: /r/datahoarder by /u/moonronic on 2024-05-31 14:29:33.

Hi there, I am not massively tech literate, nor am I sure where to go or who to ask about this. There's a website I've been using for 12 years called Quotev. Recently they removed the social feed where you could see other people's posts, so now it's hard to find your friends and posts from years past. It says they are deleting all messages on July 1st, and I was hoping somebody knows an easy method to archive all my posts on the website. The activity page is the main part I'd want to save; it's https://www.quotev.com/(username)/activity
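
I don't know Quotev's page structure, so treat this as a generic sketch rather than something site-specific: log in in your browser, copy the session cookie, fetch the activity page with requests, and keep the raw HTML. The cookie name and everything else below are assumptions to check against the real site; if the feed loads more entries via JavaScript as you scroll, a browser "save page" after scrolling to the bottom, or submitting the URL to the Wayback Machine, may be the easier route.

  import requests   # pip install requests

  USERNAME = "your-username-here"                          # placeholder
  COOKIES = {"session": "paste-your-session-cookie-here"}  # placeholder cookie name/value

  url = f"https://www.quotev.com/{USERNAME}/activity"
  resp = requests.get(url, cookies=COOKIES,
                      headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
  resp.raise_for_status()

  with open("quotev_activity.html", "w", encoding="utf-8") as f:
      f.write(resp.text)   # keep the raw HTML; it can be parsed or re-rendered later
  print("saved", len(resp.text), "characters of HTML")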

9489
 
 
The original post: /r/datahoarder by /u/thegameksk on 2024-05-31 12:03:43.

Looking to upgrade my NAS storage. The drives will be used mainly for Plex, Transmission, Calibre and comics reading. Also, how do I tell when serverpartdeals has a sale? Is it advertised somewhere?

9490
 
 
The original post: /r/datahoarder by /u/LAR1998 on 2024-05-31 09:13:58.
9491
 
 
The original post: /r/datahoarder by /u/ElonTastical on 2024-05-31 07:29:16.
9492
 
 
The original post: /r/datahoarder by /u/rogueSleipnir on 2024-05-31 05:28:35.

As a game dev and a fan of a lot of artists, I still heavily use Twitter to mainly collect stuff for inspiration. The problem is the site has gotten worse at searching and even looking back at my own timeline for posts.

Like today, I wanted to go back and archive some items that I remember reposting maybe two weeks ago, but I found out I can't even scroll that far back on my own page. I hit posts from 10 days ago and it stopped loading anything. I don't think I'm spamming likes or reposts too much, maybe around 30 a day. It just won't load more results, even through advanced search.

I don't even need to download the content; saving links to a text file is enough. I tried the Twitter API with Python a few years ago, but the rate limits are so small now (1,500 per month on the free tier) that I might hit the cap just while testing the program I write.

I do have a note-taking app that I save links in (Logseq). I copy links to it occasionally, but doing this manually takes time and is limited by how far back Twitter will scroll. Something that interacts with this automatically would be cool.

So any advice at what I can use for this is appreciated. The general case is saving links of Retweets, Likes, and/or Bookmarks from my own account. Something I can run every few days or a week. I am a software dev and open to learning anything I could for this project.
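
One angle that sidesteps both the API limits and the scroll-back wall: Twitter/X will generate a full archive of your own account (Settings, then "Download an archive of your data"), and your likes ship in it as data/like.js. A sketch that turns that file into a plain text file of links, assuming the archive still uses the window.YTD.like.part0 = [...] format it has used for the past few years:

  import json
  from pathlib import Path

  # The file is JavaScript, not JSON: "window.YTD.like.part0 = [ ... ]".
  raw = Path("twitter-archive/data/like.js").read_text(encoding="utf-8")
  payload = json.loads(raw[raw.index("["):raw.rindex("]") + 1])

  count = 0
  with open("liked_tweets.txt", "w", encoding="utf-8") as out:
      for item in payload:
          tweet_id = item.get("like", {}).get("tweetId")
          if tweet_id:
              out.write(f"https://twitter.com/i/web/status/{tweet_id}\n")
              count += 1

  print("wrote", count, "links")

Retweets show up as ordinary tweets in tweets.js (tweet.js in older exports) and can be filtered the same way; re-running a script like this after each fresh archive download would keep the Logseq file topped up.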

9493
 
 
The original post: /r/datahoarder by /u/Snowblind45 on 2024-05-31 04:05:54.

Hello, I'm new to this stuff. I'm trying to improve my data hoarding habits.

Currently I have 2 identical external HDDs (5 TB filled). I plan on also storing a copy on Backblaze cloud storage. I've heard that rclone can encrypt the files as it uploads them to the cloud, amazing! But what about encrypting the drive itself? Do I spend time using rclone to encrypt and upload, then encrypt the drive using maybe VeraCrypt? Is it possible to encrypt first using VeraCrypt and then simply rclone-upload those contents to the cloud?

I’m feeling a bit confused about veracrypt since a tutorial makes it seem like I need to have free space to allocate a virtual partition to encrypt but I don’t have the extra space…is there a way to encrypt in place?
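
On the layering question: rclone's crypt remote encrypts on the way up, so whatever lands on Backblaze is ciphertext regardless of how the local drive is stored, while VeraCrypt (or OS-level full-disk encryption) only protects the local copy. The two combine fine: once a VeraCrypt volume is mounted, rclone just sees a normal folder of plaintext and can upload from it. A sketch of the upload step, assuming a crypt remote named "b2crypt" has already been created with rclone config (the name and paths are placeholders):

  import subprocess

  SOURCE = "/Volumes/Hoard"          # hypothetical local drive (or a mounted VeraCrypt volume)
  REMOTE = "b2crypt:backups/hoard"   # hypothetical crypt remote wrapping the B2 bucket

  subprocess.run(
      ["rclone", "sync", SOURCE, REMOTE,
       "--transfers", "4",   # parallel uploads; tune to the upstream bandwidth
       "--progress"],
      check=True,
  )

Keep the crypt passwords somewhere outside the drives they protect; without them the cloud copy is unreadable.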

9494
 
 
The original post: /r/datahoarder by /u/TrisMcC on 2024-05-31 04:01:50.
9495
 
 
The original post: /r/datahoarder by /u/ngs428 on 2024-05-31 02:44:14.

I have a W11 desktop PC with about 3TB of data consisting of family videos and pictures. I currently have it on an 8TB internal HDD and once every 3 months do a copy to my external HDD and delete the old backup.

Moving forward I am looking to have the external HDD connected to the PC 24/7 and monitor for changes in the source folders on the internal HDD and mirror those changes to the external HDD. So if I delete something on the internal, it deletes from the external. Add to the internal, it adds to the external.

After some research it looks like SyncBack 11 will do this and is free. I see Veeam mentioned quite a bit. What is the best free or low-cost option out there?

Edit: I see freefilesync may be an option. https://freefilesync.org/download.php

Thank you!
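
Since this is a Windows 11 box, the built-in robocopy already does one-way mirroring, deletions included, and can be run from Task Scheduler every few minutes; FreeFileSync's bundled RealTimeSync adds the react-on-change part if you go that route. A sketch of the mirror step, with the folders as placeholders:

  import subprocess

  SOURCE = r"D:\FamilyMedia"         # hypothetical folder on the internal HDD
  MIRROR = r"E:\FamilyMediaBackup"   # hypothetical folder on the external HDD

  # /MIR copies new and changed files AND deletes files removed from SOURCE;
  # /R:1 /W:5 keeps one locked file from stalling the whole run.
  subprocess.run(["robocopy", SOURCE, MIRROR, "/MIR", "/R:1", "/W:5"])

  # Note: robocopy exit codes 0-7 mean success or partial success and 8+ mean failure,
  # so don't pass check=True without mapping the codes first.

Because a mirror propagates deletions, it's sync rather than true backup; keeping the existing quarterly offline copy alongside it is still worthwhile.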

9496
 
 
The original post: /r/datahoarder by /u/RelaxRelapse on 2024-05-30 22:46:42.

I often buy used VHS-C and Video8 tapes from Japan, both because they tend to be cheap and because people in Japan often used those formats to make personal copies of TV shows. Plus, it's cheaper than shipping full-sized VHS tapes.

Unsurprisingly I’ll find home videos in the lots I buy. A part of me feels obligated to make digital copies since they’re someone’s memories, and also a time capsule of that period of time. Another part of me feels I shouldn’t because they are someone’s personal recording that I assume they had no plans to show outside of their homes. And then again, what do I even do with the digital copies if I do make them besides hoard them? I legally couldn’t share them most likely. I wouldn’t mind digitizing a copy and sending it to the original owner, but since it’s just a random lot online from a resale shop, there’s nothing really to go off of.

I’ve found NSFW content as well, but the choice with that stuff is obvious and I just erase them. The normal, everyday stuff though I’m unsure about. I think I just want to get a second opinion before I proceed.

9497
 
 
The original post: /r/datahoarder by /u/Quirky_Ad_69 on 2024-05-30 20:15:25.
9498
 
 
The original post: /r/datahoarder by /u/Bringback-T_D on 2024-05-30 19:13:51.

A shootproof.com gallery was sent out to attendees of an event I just went to... However, I can't figure out how to download everything... It's all watermarked, so I wouldn't consider it 'stealing' (I also plan to purchase some of the pictures), but I just want to have a good-enough archive, before it's taken down....

I attempted to use https://github.com/ShootProof/shootproof-cli; however, it's broken and out of date. What other tools/commands should I try?
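
I don't know of a currently working ShootProof-specific tool either, so a generic fallback: open the gallery, collect the image URLs via the browser dev tools (Network tab, filtered to images) or a link-grabbing extension into a text file, then fetch them with a small script. The file names and one-second delay below are just a sketch:

  import time
  from pathlib import Path
  from urllib.parse import urlparse

  import requests   # pip install requests

  out_dir = Path("shootproof-archive")
  out_dir.mkdir(exist_ok=True)

  urls = [u.strip() for u in Path("image_urls.txt").read_text().splitlines() if u.strip()]

  for i, url in enumerate(urls, 1):
      name = Path(urlparse(url).path).name or f"photo_{i}.jpg"
      target = out_dir / name
      if target.exists():
          continue   # resume-friendly
      resp = requests.get(url, timeout=60)
      resp.raise_for_status()
      target.write_bytes(resp.content)
      print(f"[{i}/{len(urls)}] {name}")
      time.sleep(1)   # be gentle with the gallery host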

9499
 
 
The original post: /r/datahoarder by /u/redexposure on 2024-05-30 18:16:30.

I'm wondering if anyone can recommend a kind of "diff checker" for files and folders, to see whether the contents of a backup drive/folder match the source folder (and to highlight files which haven't been duplicated).

Essentially, the inverse of a Duplicate Finder.

Does anyone know of any tools like this?
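
Dedicated tools exist (FreeFileSync's compare view, or rsync with --dry-run --itemize-changes), but the Python standard library can already do a basic version of this with filecmp.dircmp, which compares two directory trees and reports what exists only on one side. A sketch that prints everything present in the source but missing from, or differing in, the backup; the paths are placeholders:

  import filecmp
  from pathlib import Path

  SOURCE = Path("/data/photos")     # hypothetical source folder
  BACKUP = Path("/backup/photos")   # hypothetical backup folder

  def report_missing(cmp: filecmp.dircmp, prefix: Path) -> None:
      for name in cmp.left_only:               # exists in source, not in backup
          print("MISSING:", prefix / name)
      for name in cmp.diff_files:              # exists on both sides but differs
          print("DIFFERS:", prefix / name)
      for name, sub in cmp.subdirs.items():    # recurse into common subfolders
          report_missing(sub, prefix / name)

  report_missing(filecmp.dircmp(SOURCE, BACKUP), Path("."))

dircmp's file comparison is shallow (size and modification time), so it's fast but won't catch silent corruption; hashing both sides would be the thorough version.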

9500
 
 
The original post: /r/datahoarder by /u/alex_5506 on 2024-05-30 17:16:21.

Hi there, I have an older laptop with a broken screen so I can't just boot it up (I also can't locate the cords, so I can't use an external monitor). Is there any kind of external HDD reader that will work without another laptop/desktop? These days I'm strictly iPhone/iPad and don't own a Windows-based machine. Any suggestions for easily accessing my old laptop's HDD would be greatly appreciated.
