The original post: /r/earthporn by /u/PhotoBoyWonder on 2025-07-20 08:31:58.
 
The original post: /r/mullvadvpn by /u/CT_CT_CT_CT_CT on 2025-07-19 09:37:01.

Edit: K nevermind US works, I just wasn't patient enough, lol.

~~I'm in China and Mullvad won't connect to any server other than Manila, Philippines. It doesn't particularly bother me since it's working (as you can see xD), but I was just wondering why that is the case. Selecting literally any other country results in a seemingly endless connecting screen, while the Philippines connects near instantly. Am I doing something wrong? I'm using the WG option.~~

 
The original post: /r/datahoarder by /u/ThePirer on 2025-07-18 16:33:21.

Hi guys,

In the past, I just used VLC as a player for watching movies and series. However, since last year I've been running an Emby server on my laptop, since it's always on, and it's been amazing. Because of that, I want to buy a NAS in like 2-3 years, since right now it's not possible for different reasons.

When looking at NAS units, I found them very limiting. What if I needed more disks, more RAM, a more powerful CPU or whatever in the future? If I do something, I optimize the shit out of it. In the end, I thought that a custom NAS would be the best option. But the cases are very expensive, or too big, or too small, or too loud, or too ugly... So, I have an old PC tower with a ton of 5.25" and 3.5" bays. I removed those racks and 3D printed a 12-bay rack in TPU with an attachment for 4 fans on the side, as well as a hexagonal front mesh in PETG for airflow. A bit of walnut vinyl and now it looks like something made by Fractal Design, has a lot of storage, and can fit any motherboard and PSU while being smaller than a standard ATX case.

With that out of the way, my 7-8 year old 5TB external HDD with movies and series is finally full, so I need to buy a new disk in the following months. But I thought that, instead of buying just another 5TB disk, the most cost-effective option would be to just go ahead and buy the disk that I would use in the NAS.

  1. Which capacity should I go for? 14 TB? 16? 20? It took me like 7 years to fill 5TB, so maybe 14 would be enough to last me for years, taking into account the number of bays at my disposal. Maybe 20TB is better because of the increased file sizes nowadays. Maybe the 18TB disk is of higher quality because of the specific model. Also, on Server Part Deals there are mainly Seagate Exos and Ultrastars. Which model do you recommend? I would like to buy 2 disks to have a RAID 1, since the more data I have, the more I worry about losing it, and then go for a RAID 5, 6, or 10 or whatever when I eventually have to add more disks.
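On the capacity question, a back-of-the-envelope estimate using the post's own numbers (~5 TB filled over ~7 years) gives a rough time-to-fill for each candidate size. Libraries tend to grow faster over time, so treat these as upper bounds:

```shell
# Rough years-to-fill at the historical rate of ~5 TB per 7 years (~0.71 TB/yr).
awk 'BEGIN {
  rate = 5.0 / 7.0                      # TB per year, from the post
  split("14 16 20", caps)
  for (i = 1; i <= 3; i++)
    printf "%d TB lasts ~%.0f years at that rate\n", caps[i], caps[i] / rate
}'
```

Even the smallest option covers well over a decade at the old rate, so the decision mostly comes down to price per TB and how much the rate is expected to grow.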

Now, once I have the disks, I have to connect them to the laptop to keep the Emby server running. I've seen that there are docking stations for around 30€; I liked one from Orico. Now, the problem lies in the formats, since TrueNAS doesn't recognize NTFS and Windows doesn't recognize ZFS. Two solutions come to mind:

  1. Since I'd have two disks with the same data, when I set up the NAS I can connect one of them, create a pool on it, transfer the files from the other, and then attach that second disk to complete the RAID 1. There's a risk of losing the data here, but I don't think the probability is high.
  2. I can use OpenZFS on Windows, but it doesn't seem easy or reliable.

Which one would you choose? Is it possible? Are there more options? I'd like to hear your thoughts.
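For option 1, the sequence on the TrueNAS side would look roughly like the following. The pool name, device names, and mount point are hypothetical, and mounting NTFS read-only may need ntfs-3g or the ntfs3 driver depending on the platform; this is a sketch of the approach, not a tested procedure:

```shell
# Hypothetical devices: /dev/sdb and /dev/sdc are the two new disks.
# 1) Create a single-disk pool on the first drive:
zpool create tank /dev/sdb

# 2) Mount the second (NTFS) drive read-only and copy everything over:
mount -t ntfs -o ro /dev/sdc1 /mnt/old
rsync -avh /mnt/old/ /tank/media/

# 3) After verifying the copy, unmount and attach the second drive;
#    ZFS resilvers it into a RAID 1 mirror of the first:
umount /mnt/old
zpool attach tank /dev/sdb /dev/sdc

# 4) Watch the resilver progress:
zpool status tank
```

The data is only single-copy between steps 2 and 3, which is the risk window the post mentions; keeping the old 5TB external untouched until the mirror finishes resilvering closes it.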

 
The original post: /r/datahoarder by /u/elsbeth-salander on 2025-07-18 20:58:25.

People may differ in their viewpoints on the quality or perspective of PBS programming in recent years, but there’s no denying that it has produced a lot of memorable series that many viewers enjoyed and which did have an intent to inform and/or educate the populace, including children.

Some of these shows ran for decades and therefore might not be on DVD box sets. For instance, NOVA has aired since 1974. I've already noticed that some of the children's series, like The Puzzle Place, are considered partially lost media due to being "copyright abandonware" (the original IP holder temporarily licensed it to public broadcasting but then went bankrupt, leaving the rights essentially in limbo).

With Paramount having obliterated all of its Daily Show archive from the website, it’s probably only a matter of time before something similar happens to those PBS series that are viewable in streaming format. Is there an effort under way to 1) download whatever can be saved to disk from their streaming video site, and/or 2) dispatch whatever else (reels, tapes, etc) is collecting dust in the vaults distributed among the various public broadcasters, to some kind of preservation service / museum (maybe outside the US?) before it gets sold off or thrown away?

 
The original post: /r/datahoarder by /u/PusheenHater on 2025-07-18 20:15:46.

I've got a bunch of external/internal hard drives, SSDs, flash drives, etc.

I'm using a cardboard box but I have so many hard drives that it's sagging. Not very sturdy.

I know plastic is static-y which is really bad for the hard drives.

So I'm asking if there's a container that is:

  • Big, that can hold many hard drives
  • Anti-static
  • Not plastic or cardboard
  • Sturdy
  • Preferably allows you to lock it up with a lock
 
The original post: /r/datahoarder by /u/Alphabethur on 2025-07-18 19:54:22.
 
The original post: /r/datahoarder by /u/Repulsive_Market_728 on 2025-07-18 18:21:05.

Just in case there's anyone who may be interested and who might have the space/resources to use something like this, I saw this up for auction. It closes at around 9 p.m. Eastern today (Friday the 18th).

https://www.allsurplus.com/en/asset/1021/13971

I also found this article which provides a pretty good overview of the system.

https://www.itpro.com/155268/quantum-scalar-i2000-tape-library

 
The original post: /r/datahoarder by /u/AshleyAshes1984 on 2025-07-18 17:35:28.
 
The original post: /r/datahoarder by /u/palepatriot76 on 2025-07-18 17:23:00.

So, I have used DVDFab for well over 40 DVD box sets with no issues, but now I have a problem with my Benny Hill Megaset.

I am creating the ISO files fine, but when I try to watch them I can hear audio but not see video, and when I can see anything it's very messed up: pixelated with a green screen.

When I use those ISO files with MakeMKV, same thing: just a mess.

Is this a DVD protection thing? If so what is my next step?

 
The original post: /r/datahoarder by /u/aJakalope on 2025-07-18 17:18:15.

I'm mostly making this post because I googled the differences between these a lot before purchasing and wish I had seen a post like this before I had.

I currently use a Beelink Mini S12 as a Plex server and although I had been using external drives, I was running out of USB ports on the Beelink. So I was looking into a DAS to use and found very similar reviews for both products named in the title. The Terramaster was a little cheaper so I went with it, especially since I was not looking for proper RAID functionality since I use the drives for easily replaceable media files.

I used WD Red Pro 18TB drives for this.

The first drive I put in it seemed to function alright, but when I attached a second drive, there seemed to be issues. Drives randomly disconnecting, errors while transferring large files, qBitTorrent error messages I had never seen before, etc. I read that it was likely a cord issue, so I bought a nicer data cable. The issues persisted. I continued to check the drives using CrystalDiskInfo and it showed no problems on any of the drives.

I finally decided to order a QNAP to see if it was a drive issue, and once I put the drives in the QNAP they were immediately recognized, transfer speeds were faster, and I have not had any issues whatsoever.

I'd say I'm no expert at all in these fields, so it's possible that there was a small issue I was overlooking with the Terramaster. I've also only had the QNAP a few days, so it's possible I'll encounter issues down the road. But if anyone in the future is reading this and considering saving a few bucks and buying a Terramaster, go with the QNAP.

 
The original post: /r/datahoarder by /u/Gunfighter1776 on 2025-07-18 17:15:33.

I have never had a NAS. I know what one is, and I have used them in work environments, but never from a home-network point of view.

Question and Comment:

I have a PC with several HDDs, with data duplicated across the drives for redundancy in case one of them fails. In total it's about 30 TB, including all drives and the duplicated data. So my conundrum is: do I use this number to calculate how much actual drive space I need in my NAS setup?

Or do I just take ONE copy of everything and dump it onto my NAS? I ask because I don't know how the NAS, in what will most likely be a RAID 5 configuration, will treat the data if I also keep several copies of it on the NAS, or whether the duplicated data will simply be spanned across all drives like any other data on a NAS.

I guess I am asking: what is best practice, and which is the better strategy? ONE copy of everything on my NAS, or several copies in different folders?

I have a UGREEN 4800 Plus, and I am trying to buy drives big enough to grow into, but I don't want to spend more than I have to. I initially was going to go for a RAID 5 three-disk array and keep an extra drive to drop in, in the event I need to save the data or my data needs grow.

Advice?
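For the sizing part of the question, the usual arithmetic is: RAID 5 gives (n-1) disks of usable space, RAID 6 gives (n-2), and RAID 10 gives half. A quick sketch with example values (4 x 8 TB is an assumption for illustration, not the poster's actual setup):

```shell
# Usable capacity for common RAID levels with example drives (4 x 8 TB).
disks=4; size=8
echo "RAID5 usable:  $(( (disks - 1) * size )) TB"  # one disk's worth of parity
echo "RAID6 usable:  $(( (disks - 2) * size )) TB"  # two disks' worth of parity
echo "RAID10 usable: $(( disks * size / 2 )) TB"    # everything mirrored
```

Note that deduplicated size, not the ~30 TB raw total, is what needs to fit in the usable figure; RAID redundancy already covers single-drive failure, so keeping manual duplicate copies on the array mostly wastes space (though RAID is still not a backup).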

 
The original post: /r/datahoarder by /u/itsbentheboy on 2025-07-18 15:51:11.

I have created a set of .bashrc aliases for use with yt-dlp.

These make some longer commands more easily accessible without needing to call specific scripts.

These should also be translatable to Windows, since the commands are all in the yt-dlp binary, but I have not tested that.

Usage is simple: just use the alias that correlates with what you want to do and paste the URL of the video. For example:

yt-dlp-archive https://my-video.url.com/video uses the basic archive alias.

You may use these in your shell by placing them in a file located at ~/.bashrc.d/yt-dlp_alias.bashrc or similar bashrc directories. Simply copy and paste the code block below into an alias file and reload your shell to use them.

These preferences are opinionated for my own use cases, but should be broadly acceptable. However, if you wish to change them, I have attempted to order the command flags for easy searching and readability. Note: some of these aliases make use of cookies. Please read the notes and commands; don't blindly run things you see on the internet.

##############
# Aliases to use common advanced YT-DLP commands
##############
# Unless specified, usage is as follows:
# Example: yt-dlp-get-metadata <URL_OF_VIDEO>
#
# All download options embed chapters, thumbnails, and metadata when available.
# Metadata files such as Thumbnail, a URL link, and Subtitles (Including Automated subtitles) are written next to the media file in the same folder for Media Server compatibility.
#
# All options also trim filenames to a maximum of 248 characters
# The character limit is set slightly below most filesystem maximum filenames
# to allow for FilePath data on systems that count paths in their length.
##############

# Basic Archive command.
# Writes files: description, thumbnail, URL link, and subtitles into a named folder:
# Output Example: ./Title - Creator (Year)/Title - Year - [id].ext
alias yt-dlp-archive='yt-dlp \
--embed-thumbnail \
--embed-metadata \
--embed-chapters \
--write-thumbnail \
--write-description \
--write-url-link \
--write-subs \
--write-auto-subs \
--sub-format srt \
--trim-filenames 248 \
--sponsorblock-mark all \
--output "%(title)s - %(channel,uploader)s (%(release_year,upload_date>%Y)s)/%(title)s - %(release_year,upload_date>%Y)s - [%(id)s].%(ext)s"'

# Archiver in Playlist mode.
# Writes files: description, thumbnail, URL link, subtitles, auto-subtitles
#
# NOTE: The output will be a single folder per playlist, not one per video.
# This is different from the above, to avoid a large number of folders.
# The assumption is you want only the playlist as it appears online.
# Output Example: ./Playlist-name/Title - Creator - Year - [id].ext
alias yt-dlp-archive-playlist='yt-dlp \
--embed-thumbnail \
--embed-metadata \
--embed-chapters \
--write-thumbnail \
--write-description \
--write-url-link \
--write-subs \
--write-auto-subs \
--sub-format srt \
--trim-filenames 248 \
--sponsorblock-mark all \
--output "%(playlist)s/%(title)s - %(creators,creator,channel,uploader)s - %(release_year,upload_date>%Y)s - [%(id)s].%(ext)s"'

# Audio Extractor
# Writes: <ARTIST> / <ALBUM> / <TRACK> with fallback values
# Embeds available metadata
alias yt-dlp-audio-only='yt-dlp \
--embed-thumbnail \
--embed-metadata \
--embed-chapters \
--extract-audio \
--audio-quality 320K \
--trim-filenames 248 \
--output "%(artist,channel,album_artist,uploader)s/%(album)s/%(track,title,track_id)s - [%(id)s].%(ext)s"'

# Batch mode for downloading multiple videos from a list of URLs in a file.
# Must provide a file containing URL's as your argument.
# Writes files: description, thumbnail, URL link, subtitles, auto-subtitles
#
# Example usage: yt-dlp-batch ~/urls.txt
alias yt-dlp-batch='yt-dlp \
--embed-thumbnail \
--embed-metadata \
--embed-chapters \
--write-thumbnail \
--write-description \
--write-url-link \
--write-subs \
--write-auto-subs \
--sub-format srt \
--trim-filenames 248 \
--sponsorblock-mark all \
--output "%(title)s - %(channel,uploader)s (%(release_year,upload_date>%Y)s)/%(title)s - %(release_year,upload_date>%Y)s - [%(id)s].%(ext)s" \
--batch-file'

# Livestream recording.
# Writes files: thumbnail, url link, subs and auto-subs (if available).
# Also writes files: Info.json and Live Chat if available.
alias yt-dlp-livestream='yt-dlp \
--live-from-start \
--write-thumbnail \
--write-url-link \
--write-subs \
--write-auto-subs \
--write-info-json \
--sub-format srt \
--trim-filenames 248 \
--output "%(title)s - %(channel,uploader)s (%(upload_date)s)/%(title)s - (%(upload_date)s) - [%(id)s].%(ext)s"'

##############
# UTILITIES:
# Yt-dlp based tools that provide uncommon outputs.
##############

# Only download metadata, no downloading of video or audio files
# Writes files: Description, Info.json, Thumbnail, URL Link, Subtitles
# The use case for this tool is grabbing extras for videos you already have downloaded, or to grab only metadata about a video.
alias yt-dlp-get-metadata='yt-dlp \
--skip-download \
--write-description \
--write-info-json \
--write-thumbnail \
--write-url-link \
--write-subs \
--write-auto-subs \
--sub-format srt \
--trim-filenames 248'

# Takes in a playlist URL, and generates a CSV of the data.
# Writes a CSV using a pipe { | } as a delimiter, allowing common delimiters in titles.
# Titles that contain invalid file characters are replaced.
#
# !!! IMPORTANT NOTE - THIS OPTION USES COOKIES !!!
# !!! MAKE SURE TO SPECIFY THE CORRECT BROWSER !!!
# This is required if you want to grab information from your private or unlisted playlists
# 
#
# Documents columns:
# Webpage URL, Playlist Index Number, Title, Channel/Uploader, Creators,
# Channel/Uploader URL, Release Year, Duration, Video Availability, Description, Tags
alias yt-dlp-export-playlist-info='yt-dlp \
--skip-download \
--cookies-from-browser firefox \
--ignore-errors \
--ignore-no-formats-error \
--flat-playlist \
--trim-filenames 248 \
--print-to-file "%(webpage_url)s#|%(playlist_index)05d|%(title)s|%(channel,uploader,creator)s|%(creators)s|%(channel_url,uploader_url)s|%(release_year,upload_date)s|%(duration>%H:%M:%S)s|%(availability)s|%(description)s|%(tags)s" "%(playlist_title,playlist_id)s.csv" \
--replace-in-metadata title "[\|]+" "-"'
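Since the export uses a pipe delimiter, the resulting file slices cleanly with standard tools. A sketch with a made-up sample row laid out in the alias's field order (the data is invented for illustration):

```shell
# Hypothetical row in the export's pipe-delimited format.
row='https://example.com/watch#|00001|Some Title|SomeChannel|SomeCreator|https://example.com/c|2023|00:12:34|public|A description|tag1,tag2'

# Splitting on the pipe: title is field 3, duration is field 8.
echo "$row" | awk -F'|' '{ printf "%s (%s)\n", $3, $8 }'
```

The same `-F'|'` pattern works for sorting by duration, filtering on availability, and so on.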

##############
# SHORTCUTS 
# shorter forms of the above commands
# (Uncomment to activate)
##############
#alias yt-dlpgm=yt-dlp-get-metadata
#alias yt-dlpa=yt-dlp-archive
#alias yt-dlpls=yt-dlp-livestream

##############
# Additional Usage Notes
##############
# You may pass additional arguments when using the Shortcuts or Aliases above.
# Example: You need to use Cookies for a restricted video:
#
# (Alias) + (Additional Arguments) + (Video-URL)
# yt-dlp-archive --cookies-from-browser firefox <URL>
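
Installing comes down to writing the aliases to a file and making sure the shell sources it. A minimal sketch: the ~/.bashrc.d path follows the suggestion above, the single alias inside the heredoc is just a placeholder for the full block, and the loader line is only needed on distros that don't source ~/.bashrc.d automatically:

```shell
# Create the alias directory and file (path as suggested in the post).
mkdir -p ~/.bashrc.d
cat > ~/.bashrc.d/yt-dlp_alias.bashrc <<'EOF'
# Paste the full alias block here; one short example as a placeholder:
alias yt-dlp-get-metadata='yt-dlp --skip-download --write-info-json --write-thumbnail'
EOF

# Add a loader to ~/.bashrc if nothing there references bashrc.d yet.
grep -qs 'bashrc.d' ~/.bashrc || \
  printf 'for f in ~/.bashrc.d/*.bashrc; do . "$f"; done\n' >> ~/.bashrc
```

Reload with `source ~/.bashrc` (or open a new shell) and the aliases become available.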
