It's A Digital Disease!


This is a sub that aims at bringing data hoarders together to share their passion with like minded people.

4201
 
 
The original post: /r/datahoarder by /u/Impossible_Ad_8475 on 2025-01-26 21:23:12.
4202
 
 
The original post: /r/datahoarder by /u/Terrible_Ex-Joviot on 2025-01-26 20:55:29.

Well, this question might sound stupid, but I really struggle to download new music now.

I use both Deezer and Spotify for discovering a lot of music, and I put everything I want to get into huge playlists. I always downloaded these playlists via Deemix (and similar previous tools) over the past 7 years or so. Deemix was software that could download directly from Deezer, even in FLAC. I absolutely loved it! But unfortunately the dev gave up and it's not working anymore, at least not for me. It was only a question of time. Firehawk has no more Deezer ARLs, but the software itself also doesn't seem to work anymore. RIP Deemix, the GOAT!

OK, so now I am looking for an alternative that is convenient, or at least works.

I found doubledouble and Lucida, two websites that do the same as Deemix, but neither works for me. Both are extremely slow and show me hundreds of errors. It is a pain in the ass to download even one single song, and I have playlists with 200+ songs that I want to download. Lucida can download from Spotify, but that doesn't work for me either. Errors everywhere!

OK, so there is spotifydown, which I think converts Spotify playlists to YouTube and downloads from there. The quality is not as good, but OK. The problem is, this site has become extremely slow too! It is unbearable that it takes 30 minutes to download 20 MP3s! And sometimes it breaks in the middle and doesn't even download all the songs. It used to work some months ago, but now it just sucks!

Of course I know Soulseek. But it's only good for downloading single songs and complete albums/discographies. I am not an album listener; I want to get the songs I actually like and download my playlists. Soulseek cannot download playlists. I found some Spotify plugin, but of course it didn't work. Besides that, there is a lot of music that Soulseek just doesn't offer. When you want music outside the mainstream, good luck finding it there. At least for me it is not a good option.

So, how the hell can I download my playlists in acceptable quality now that all the great tools have died???

I remember there was a Spotify downloader that downloaded directly from Spotify, but only at 128 kbps. I've got to test whether it is still around and working, but the quality would be a huge compromise. I also know about these Telegram bots, but they were always gone after some time and were also extremely slow, so I don't have Telegram anymore. The only remaining method that comes to mind is to manually convert all playlists to YouTube and then download from there via Foobar2000 or some YTDL... Damn, this can't be it! It's 2025 and it has gotten so ridiculously hard to download fucking music! When YT fixes their shit too, I think we are completely fucked and will have to give in to streaming. But I don't want to stream only; I want to take care of my offline collection.
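For reference, a minimal sketch of the YouTube route using yt-dlp (the playlist URL is a placeholder, and quality tops out at whatever YouTube serves; spotDL is a separate tool that automates the Spotify-to-YouTube matching):

# Download a whole YouTube playlist as audio, best quality available.
yt-dlp --extract-audio --audio-format mp3 --audio-quality 0 \
  --output "%(playlist_index)s - %(title)s.%(ext)s" \
  "https://www.youtube.com/playlist?list=PLACEHOLDER"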

Are there any good options left for me??? How can I download my Deezer playlists (in FLAC)???

What would also be nice is some auto-download feature for liked songs. What is left?

4203
 
 
The original post: /r/datahoarder by /u/mergesortissalvation on 2025-01-26 20:08:01.
4204
 
 
The original post: /r/datahoarder by /u/andreas0069 on 2025-01-26 20:02:40.

https://preview.redd.it/244b82xf8efe1.png?width=1280&format=png&auto=webp&s=cabacebc795564fd66b8e639d88ea2e9a94c9a50

I'm really interested in low-power NAS setups and had some spare parts lying around, so I decided to put them to good use and build my own! Speed wasn't my main priority since I mainly use it for smaller files and backups.

If you're curious, I made a video about it – check it out here:

https://www.youtube.com/watch?v=2kwnsDc7_fs&ab_channel=HGSoftware

4205
 
 
The original post: /r/datahoarder by /u/ZoeEatsToes on 2025-01-26 19:27:38.

I'll start with what I currently do and what I'd like.

Currently I download all my media (movies and series) onto my PC, and I use mpv with some upscalers to play my series. This setup works fine for watching media on my PC screen.

The current issues with this are:

  1. My setup is for more than just media, and media is starting to take up a lot of space on my drives.
  2. I'm currently moving into my first home, so I would like to watch my downloaded media in my living room, office, and bedroom.
  3. I've started travelling more for work, so I would like a way to access my media from anywhere; it would need to handle 2-3 people streaming from it at the same time.

So what I'd like is a NAS with the potential to take a GPU for hardware upscaling (mainly anime via mpv) that can be streamed across my house and over the internet. I'm a complete beginner in this space, but I have experience building PCs and with Linux, and I am a software engineer, so I'd like to do this myself rather than buy a prebuilt system. I'm hoping for advice on what specs I'd need, what software would make this possible (if it's even possible to stream upscaled media to a TV), and honestly just what to do. Any info would be greatly appreciated!
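A hedged sketch of one common software choice for this use case: Jellyfin in Docker. Paths below are placeholders, and note that mpv-style upscaling shaders run on the playback client, not the server; a server GPU is normally used for transcoding instead.

# Minimal Jellyfin container; 8096 is its default HTTP port.
# For hardware transcoding you'd add device flags (e.g. --device /dev/dri
# for Intel/AMD VAAPI), which vary by GPU.
docker run -d --name jellyfin \
  -p 8096:8096 \
  -v /srv/jellyfin/config:/config \
  -v /srv/media:/media:ro \
  jellyfin/jellyfin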

Thanks, Owen

4206
 
 
The original post: /r/datahoarder by /u/traecoto on 2025-01-26 18:12:41.

Hello, I am looking for a way to download this Twitter stream: https://x.com/i/broadcasts/1ypJdpZebboJW. The usual methods I use are not working; I am open to any and all ideas.
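One possibility, assuming the usual methods didn't include it: yt-dlp has an extractor for Twitter/X broadcasts, and borrowing a logged-in browser session sometimes helps when the stream is gated.

# Sketch only; --cookies-from-browser pulls your existing X login from Firefox.
yt-dlp --cookies-from-browser firefox \
  "https://x.com/i/broadcasts/1ypJdpZebboJW"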

4207
 
 
The original post: /r/datahoarder by /u/kuro68k on 2025-01-26 17:44:44.

I already have a Plustek OpticBook 3800 scanner, but it's not big enough for some things like Laserdisc covers and larger magazines.

I've looked at camera-based scanners, but they aren't great: limited DPI, and the CZUR ones are complete crap because of their software. Are the Fujitsu ones any good?

Ideally I'd like to scan at 600 DPI, and most of the camera ones can't do that.

I see Epson make some large ones but they are very expensive. Any other options?

4208
 
 
The original post: /r/datahoarder by /u/manzurfahim on 2025-01-26 17:08:10.

https://preview.redd.it/q7hkvxxcedfe1.png?width=2019&format=png&auto=webp&s=095ccc812ff2891951b9e546d7765fed8707681f

I use a G-RAID USB 3.0 enclosure for backup. I have it configured as a RAID 0 (2 x 10TB), and I do one of the monthly backups to this. It currently has 2 x WD Ultrastar 10TB drives inside.

I recently upgraded my RAID array and ended up with some spare drives, so I thought I would replace these with two Seagate Exos X18 18TB drives. I replaced the drives, but the enclosure is not working: the drives power up, and after a short while they power down and the LEDs go constant red.

It was surprising, because a few weeks ago I checked whether the enclosure could take large drives; I tried a 20TB drive and it worked. So I removed the 18TB drives and tried a 20TB WD drive again, and it worked fine. So it works with 10TB and 20TB WD drives, but not with Seagate 18TB drives.

Just wondering if anybody has had this same issue. It is unlikely for an enclosure to reject drives of one brand and accept another, unless WD actually designed it to work with WD drives only.

4209
 
 
The original post: /r/datahoarder by /u/PricePerGig on 2025-01-26 16:57:21.

Hagglezone is a website that shows you the price of the same "thing" across Europe (say, hard drives for all our data-hoarding needs), including non-Euro countries (UK, Poland, Switzerland). You search for something, it shows you all the prices for the same item, and because shipping is about the same, you can choose the cheapest one.

Question is, does anybody in this audience (storage media, huge storage media, MASSIVE storage media :) ) actually use it?

Why do I ask? I make pricepergig.com, a website to find the best price per GB/TB of storage, organised in the best, most concise way on the planet, and mobile friendly.

I've been asked just twice to "copy Hagglezone" features by people in this sub (here: https://www.reddit.com/r/DataHoarder/comments/1i8yvso/i_updated_pricepergigcom_to_add_germany_amazonde/ and here: https://www.reddit.com/r/DataHoarder/comments/1hsrbyd/i_updated_pricepergigcom_to_add_spanish_amazon_as/ ), but looking into it, it would require quite a lot of extra data storage, and I'm not sure it's worth it.

For US residents: is there something similar? Do you buy from Canada, for example? (Sorry if that sounds ridiculous; it seems similar from this side of the Atlantic.)

Now, back to storing all this data, and how to store it cheaper :)

4210
 
 
The original post: /r/datahoarder by /u/HiOscillation on 2025-01-26 16:10:07.

I've lurked & learned a LONG time in this sub, and TBH, I thought a lot of you were a little....over the top (and I say that with kindness).

I'm good at maintaining a data pile; it's all fairly routine for me. I've never lost a personal file since a disaster in 2003 that eradicated, to a degree I didn't think possible, photos of the birth of one of my kids. That's what got me into data hoarding. Since then, my data hoarding has been more about safely managing and maintaining genuinely irreplaceable digital media, the stuff we have created over the years, as the underlying physical formats change.

I was less concerned with commercial media; I have subscriptions to various news sites with archives, and have always enjoyed funny/sarcastic content. Way, way back in 2001, The Onion ran a moderately funny article about Starbucks, and the thing I remembered most was the absolutely perfect redesign of the Starbucks logo, with the mermaid now having a cyclops eye and looking pretty mean. You can just barely see the redesigned logo in this image. The redesigned logo featured prominently in the original article, and I liked it so much I printed it out. Well, I lost that printout years ago, and at some point, who knows how many years ago, the article was scrubbed of the redesigned logo. Archive.org does not have it either.

And that's when I started collecting all of the articles I read online in my own collection. Because the past is erasable now.
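For anyone wanting to do the same, a minimal sketch with plain wget (the URL is a placeholder; tools like SingleFile or monolith bundle everything into one HTML file instead):

# Save one article with its images/CSS and rewrite links to the local copies.
# -p = page requisites, -k = convert links, -E = add .html extensions.
wget -p -k -E "https://example.com/some-article"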

4211
 
 
The original post: /r/datahoarder by /u/IanProton123 on 2025-01-26 16:07:07.

Accidentally purchased a couple of SAS drives from SPD for a PC build. I thought I'd be better off buying an HBA card to use the HDDs rather than paying for return shipping and restocking fees (almost $100 total). I finally got the card and disks connected, but received a "This Device is Not Ready" error in Device Manager while trying to format the new drives.

A bit of googling led to a possible issue with the sector sizes that are common in refurbed enterprise disks. I downloaded the SeaChest utilities and messed around with cmd for a couple of hours (I have zero experience with command lines). The info tool was unable to tell me what the current sector size was. I almost proceeded with the "--setSectorSize 512" command, but a thorough warning scared me out of it.
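A hedged sketch of the SeaChest steps, from the docs as I recall them (flag names vary between SeaChest and openSeaChest builds, so verify with --help; the PD1 handle is a placeholder):

# List attached drives, then dump one drive's info (sector sizes included).
SeaChest_Info --scan
SeaChest_Info -d PD1 -i
# Reformatting to 512-byte sectors ERASES the drive; recent builds also
# demand an explicit confirmation string before running.
SeaChest_Format -d PD1 --setSectorSize 512 --confirm this-will-erase-data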

Questions:

Have other people had sector size issues with refurb SAS drives from SPD?

Any other recommendations of things to check?

Should I just take the drives to a shop that actually knows what they are doing?

Should I just quit and return the drives regardless of cost?

4212
 
 
The original post: /r/datahoarder by /u/gargravarr2112 on 2025-01-26 14:51:42.

So I've got a tape setup and it's generally working okay. I'm using Bacula to store encrypted backups.

However, I seem to have a box of LTO-6 tapes that won't write to their full capacity (2.5TB). I've tried several methods, but they never seem to go past about 37GB when written by Bacula. It's 4 or 5 tapes, and I think they're from the same manufacturer, possibly the same batch, so I'm willing to conclude that the tapes are physically faulty. However, as they're fairly expensive for a home user, I wonder if there's any way to fix them. They were bought new, but I don't have a warranty on them.

# mt -f /dev/nst1 status
SCSI 2 tape drive:
File number=0, block number=0, partition=0.
Tape block size 0 bytes. Density code 0x5a (LTO-6).
Soft error count since last status=0
General status bits on (41010000):
 BOT ONLINE IM_REP_EN

Things I've tried:

Bacula's btape program with a rawfill command:

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Write failed at block 576603. stat=-1 ERR=No space left on device
25-Jan-2025 23:20:15 btape: btape.c:408-0 Volume bytes=37.19 GB. Write rate = 44.17 MB/s
25-Jan-2025 23:20:18 btape: btape.c:611-0 Wrote 1 EOF to "LTO6" (/dev/tape/by-id/scsi-DRIVE-nst)

dd:

# dd if=/dev/zero bs=1M | pv -paerts 2500G | dd of=/dev/nst1 bs=1M
dd: error writing '/dev/nst1': No space left on device
7:35:34 [52.2MiB/s] 55%
0+22815195 records in
0+22815194 records out
1495216553984 bytes (1.5 TB, 1.4 TiB) copied, 27336.1 s, 54.7 MB/s

I think I've also tried tar and LTFS, as well as using mt to retension the tape. As much as I could continue experimenting, I also know that each cycle is adding mechanical wear to the tape.

It's not consistent where the tape stops writing when I use other tools. Trying to seek to EOM on these tapes hits the same limitation: the tape returns EOM far too soon. Is there any way to force the drive to seek past this?
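One more diagnostic worth trying before writing the tapes off: the drive keeps its own TapeAlert flags, which distinguish bad media from a dirty or failing drive.

# Read the TapeAlert log page (0x2e) via sg3_utils:
sg_logs --page=0x2e /dev/nst1
# tapeinfo (from the mtx package) decodes much of the same data:
tapeinfo -f /dev/nst1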

Anyone have any advice?

4213
 
 
The original post: /r/datahoarder by /u/WaitForItTheMongols on 2025-01-26 13:35:02.

I'm a Linux user and have been ripping optical media for a long time. Last I looked into it, Blu-ray ripping was not workable because of DRM. I know there's MakeMKV, but I don't want to rely on paid, closed-source software for ripping. It really seems like there should be a ready-made open-source solution, and it's weird that one dude making MakeMKV could manage it but no team has been able to get it working.

When I tried in the past, I would get recognizable but garbled images. Is there a trick to it, or is this still not achievable? It's kind of sad that I spent a fair bit of money on a Blu-ray drive and so far haven't been able to put it to use.
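The usual open-source route, sketched under the assumption that the garbled images were failed AACS decryption: libbluray plus libaacs with a key database (KEYDB.cfg files circulate from the FindVUK project). Discs whose keys aren't in your KEYDB.cfg will still come out garbled.

mkdir -p ~/.config/aacs
cp KEYDB.cfg ~/.config/aacs/   # libaacs looks for the key database here
# With a matching key, anything built against libbluray can read the disc:
mpv bluray:// --bluray-device=/dev/sr0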

4214
 
 
The original post: /r/datahoarder by /u/RedditNoobie777 on 2025-01-26 12:29:31.

The Elements seems the same as the My Passport, but the My Passport looks more rugged; the Ultra is the same thing but uses USB-C and costs $12 more.

4215
 
 
The original post: /r/datahoarder by /u/El_B1tch_Cazador on 2025-01-26 08:14:43.

I saw that there are other people asking how to download from this site, but whenever I look at the comments I can't understand anything. I tried to download the video through Seal, but it didn't work.

4216
 
 
The original post: /r/datahoarder by /u/Heiopei_42 on 2025-01-26 10:55:03.

Is there any way to bulk-download images from a specific channel on a Discord server? There are some Patreon content creators I'm subscribed to that have content exclusive to their Discord servers, but I don't want to download every image other users have uploaded to that server, only the specific channels where the creator shared their content.
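One hedged option: DiscordChatExporter has a CLI that exports a single channel and can download the media it references (the token and channel ID below are placeholders; note that automating a user account is against Discord's ToS, so weigh the risk):

DiscordChatExporter.Cli export \
  -t "YOUR_TOKEN" \
  -c 123456789012345678 \
  --media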

4217
 
 
The original post: /r/datahoarder by /u/extraaa1 on 2025-01-26 09:55:41.

I am not sure if this sub is the right place to post this question. If not, maybe you could let me know which would be.

I am planning on building a simple NAS setup based on a CM3588 and some old HDDs. This requires both a DC barrel plug for the board and SATA power for the drives. I have an old but good-quality PSU and was originally thinking of using it to power everything. However, after some research, this doesn't seem to be straightforward, or at least not recommended (e.g., using Molex-to-barrel-plug adapters). An alternative approach I read about was to use a DC power brick with barrel-plug Y-splitters and barrel-plug-to-SATA power adapters, which sounded a bit crazy to me...

What would be your recommended solution, ideally preventing any fires? 😂

4218
 
 
The original post: /r/datahoarder by /u/Antique__throwaway on 2025-01-26 09:54:41.

So I entered an account I knew was deleted or unavailable into twitterwrapped.exa.ai, and it still worked: the top three tweets for 2024 were shown despite being unavailable on Twitter, and the Wayback Machine fails to load anything but the profile. How is this possible? Is there some way I can use this to get all of the account's activity?

That would be fantastic because there are a lot of important and unique discussions that account has posted, possibly of genuine historical importance.
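One avenue to check, sketched with the Wayback Machine's CDX API (the handle is a placeholder): page-level lookups can fail while snapshots of individual tweet URLs still exist, and exa.ai presumably serves results from its own crawl.

curl "https://web.archive.org/cdx/search/cdx?url=twitter.com/HANDLE/status/*&output=json&limit=200"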

4219
 
 
The original post: /r/datahoarder by /u/staleferrari on 2025-01-26 09:51:06.

I travel occasionally and I like to take videos. However, I'm not big on video editing; I rarely color grade my videos, only simple cuts and trims.

But I might in the future, because I also like watching my old videos. I'm just thinking: what if, 10 or 15 years from now, I decide to edit my old videos?

I'm asking your opinion on ProRes. Is it worth recording in this format compared to HEVC if I don't plan on editing anytime soon? (Cloud backup is not a problem.)

It's just that I'll need to buy an expensive SSD, because the iPhone 16 Pro cannot record high-resolution ProRes directly to the phone storage.
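One point that may settle it: you can always transcode HEVC to ProRes later, at edit time. That won't recover anything HEVC discarded at recording, but it gives the editor the same easy-scrubbing benefits. A sketch with ffmpeg (filenames are placeholders; profile 3 is ProRes HQ):

ffmpeg -i clip_hevc.mov -c:v prores_ks -profile:v 3 -c:a pcm_s16le clip_prores.mov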

4220
 
 
The original post: /r/datahoarder by /u/Mun333r on 2025-01-26 09:17:58.

Hi guys, I hope I'm not duplicating the question; I have been googling for a couple of days now but haven't been able to gain a clear insight. I want to download all the files from a Telegram group I'm part of. It contains a lot of ZIP files and PDFs; when you click on the group summary it shows 13941 files, to be precise. I wanted to export the files to create a local copy of the data on my laptop, for which I used Telegram Portable.

  • When exporting only the files and nothing else, it shows a total of 39447 files instead of 13942. I thought it might be an error, but the download kept going even after hitting the 14000 mark.

  • The download via export is horrendously slow. About 90% of the files are less than 10 MB and the other 10% are less than 500 MB, since it's all safety codes and such. It took me about 3 days to download 14000 files via Telegram export, whereas downloading directly by scrolling manages around 2000 files in 15-20 minutes.

  • Is there a way to speed up the Telegram export download speed? I don't mind getting Premium, but I haven't found a clear answer on whether Premium increases export speeds as well or just the download speed.

  • Alternatively, I have tried the Plus Messenger third-party Android app, but it does not show all the files. Is there any other method I can follow?

Sorry about the long post, but I'm out of options, and I'm travelling to the UK for my studies in a week and want to export all my data before I leave the country and lose access to Wi-Fi. Thanks, guys.
4221
 
 
The original post: /r/datahoarder by /u/cip43r on 2025-01-26 06:49:30.

I work for an engineering firm. We generate a lot of documentation and have everything on our internal server. The data is on an Unraid server with parity, with offsite backups to two separate servers with RAID.

However, we have designs, code, and documentation which we sign off and flash to systems. These systems may never be seen again, but they also have a lifetime of 30 to 50 years, during which we must be able to provide support or build more.

Currently, we burn the data to a set of Blu-rays, depending on the size, with redundancy and checksums, often allowing us to lose 1 of 3 discs to damage, theft, or whatever and still be able to resilver and recover all the data from the remaining 2 discs.
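For readers unfamiliar with that kind of scheme, a hedged sketch of the parity side using par2 (filenames are placeholders; roughly 50% redundancy is what tolerates losing one disc of three):

# Pack the release, then create ~50% recovery data to burn alongside it:
par2 create -r50 archive.par2 archive.tar
# Decades later, verify and rebuild from whatever discs survived:
par2 repair archive.par2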

I have recently seen that Blu-ray production is stopping.

What are other alternatives for us? We cannot store air-gapped SSDs, as not touching them for 30 years may result in data loss. HDDs are better, but I have heard that running an HDD for a very long time, then stopping it, storing it for many years, and spinning it up again may also result in loss.

What medium can we use to solve this problem? This information may be confidential and protected by arms control, so it cannot be backed up to cloud services.

4222
 
 
The original post: /r/datahoarder by /u/clickbaitishate on 2025-01-26 06:37:47.

Ideally something from serverpartsdeals, though I'm open to new with a decent price/TB. Looking for 8-12TB for a Plex server, but nothing terribly loud, as it's in the same room as the theater.

4223
 
 
The original post: /r/datahoarder by /u/Haunting-Ad4860 on 2025-01-26 06:23:09.

I want 1 terabyte of completely unmanaged storage holding a long time-lapse video that grows by about 70 GB/year. I'm assuming SSD because they won't break as often, but I want your advice; I'm new to this.

4224
 
 
The original post: /r/datahoarder by /u/Playful-Ease2278 on 2025-01-26 04:09:20.

I have a drive that my PC notified me was failing. It did this for a few days but then stopped, though I noticed it was running slowly. Even though the warnings stopped, I decided to replace the drive. Is there any good use for the old one? I would hate to add to e-waste if there is some use for it. Thanks!
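Before deciding, it may be worth seeing what the drive itself reports; a drive with growing reallocated or pending sectors is scratch-data material at best. A sketch with smartmontools (sdX is a placeholder):

smartctl -a /dev/sdX        # full SMART attribute dump
smartctl -t long /dev/sdX   # start an extended self-test; read results later with -a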

4225
 
 
The original post: /r/datahoarder by /u/DubbaThony on 2025-01-26 03:35:00.

Hi, I've tried a few things for the RAID 5 array in my workstation. Windows Storage Spaces (including the PowerShell commands to make it faster) was horribly slow. Btrfs for Windows worked great until I learned the hard way that btrfs + RAID 5 is a no-no.

Long story short, I ended up with this setup:

Host: windows 10 enterprise iot ltsc 2021

VMware Workstation VM running Alpine Linux (VM flavor), 8 GB RAM, 8 cores, set up with a host-only network

Passthrough of my disks into VM

ZFS set up in RAIDZ1, exposing the volume /dev/zd0

tgtd with the following config:

tgtadm --lld iscsi --op new --mode target --tid 1 --targetname iqn.2025-01.local.raidvm:r5store

tgtadm --lld iscsi --op bind --mode target --tid 1 -I ALL

tgtadm --lld iscsi --op new --mode logicalunit --tid 1 --lun 1 -b /dev/zd0

I also tried replacing the last line with:

tgtadm --lld iscsi --op new --mode logicalunit --tid 1 --lun 1 --backing-store /dev/zd0 --bsoflags direct

Then I connected with the Windows iSCSI initiator. It connected and let me format the partition with no issue.

CrystalDiskMark shows numbers that look more or less legitimate and align with performance tests I did inside the VM.

When I try to copy files onto the partition with Windows Explorer, the speed skyrockets to ~4GB/s for a few seconds, after which it gets stuck: the VM freezes and SSH sessions to it stop responding. After waiting some time, the VM un-freezes and the SSH sessions become responsive again. Smaller transfers (like 10 GB) don't freeze the VM (the VM has 8 GB of RAM and 2 GB of persistent storage for the system), and running iftop I can see that long after Windows says it finished the transfer, it is still pushing data (!). For transfers over 30 GB, Windows times out during the VM freeze, the operation gets canceled, and it drops the iSCSI connection and shortly after reconnects.

When I try to use WSL for copying, it also shows stupidly high speeds, but it doesn't even produce network traffic on the VM, and the files never appear on the target file system.

As a sanity check, in case it was ZFS or something else I hadn't considered, I set up a temporary Samba share. Samba underperforms, as I expected (while still drastically outperforming Storage Spaces), but it works. Nothing to write home about, but I noticed that while writing over Samba, the speed drops from 220MB/s to 50MB/s, recovers back to the original 220MB/s, and then drops once again. But it does finalize transfers correctly.
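A hedged guess at the mechanism, given the ~4GB/s burst and the Samba sawtooth: write-back buffers are filling RAM much faster than RAIDZ1 can drain them, and everything stalls while they flush. Two ZFS-side knobs inside the VM that may trade peak speed for steady throughput (the dataset name is a placeholder):

# Cap ZFS's dirty-data buffer, here to 512 MiB:
echo $((512*1024*1024)) > /sys/module/zfs/parameters/zfs_dirty_data_max
# Make the zvol honor synchronous semantics so backpressure is immediate:
zfs set sync=always pool/r5store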

Other things:

  • I disabled write cache in Windows as one of the first troubleshooting steps (Disk Management -> disk properties -> Policies)

  • I played around with the ZFS ARC, but it's not the cause of the issue

  • I tried to find a target server other than tgt, but SCST is deprecated, and the LIO setup requires a kernel module that isn't available in this flavor of Linux. And since CrystalDiskMark can get true results, I think the blame falls squarely on the Windows iSCSI client.

  • dmesg from lockup: https://pastebin.com/mGUSEKzv https://pastebin.com/xbxq0pQ1

  • After a few stalls and reconnects, it seems like Windows figures out that it's pushing too hard, and the transfer speed becomes normal for that specific file-transfer operation.

I'm at a loss here. I would like to use iSCSI, but this setup is unusable (or barely usable).

I don't know how to proceed with this. Please advise.
