The original post: /r/datahoarder by /u/FalsePhilosopher on 2024-10-08 06:33:39.

https://docs.github.com/en/enterprise-cloud@latest/repositories/working-with-files/managing-large-files/about-large-files-on-github#distributing-large-binaries

"We recommend repositories remain small, ideally less than 1 GB, and less than 5 GB is strongly recommended. Smaller repositories are faster to clone and easier to work with and maintain. If your repository excessively impacts our infrastructure, you might receive an email from GitHub Support asking you to take corrective action.

If you need to distribute large files within your repository, you can create releases on GitHub.com. Releases allow you to package software, release notes, and links to binary files, for other people to use. For more information, see "About releases."

We don't limit the total size of the binary files in the release or the bandwidth used to deliver them. However, each individual file must be smaller than 2 GiB."

So what I just read is: make a repo with just a README, then tar the data, compress it with zstd, split it into chunks under 2 GiB, and attach the chunks to a release with gh, yes? In the week I've been hosting close to 10 GB of release files I haven't gotten an email, so do with that info what you will.
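
If anyone wants to try the same thing, here's a rough sketch of that pipeline. The repo name (you/big-archive), the ./hoard directory, and the v1.0 tag are placeholders I made up, and it assumes the gh CLI is already authenticated; the 1996M chunk size just keeps every part comfortably under the 2 GiB per-file limit from the docs above.

```
# Pack, compress, and split into <2 GiB chunks (all names are placeholders)
tar -cf - ./hoard | zstd -T0 | split -b 1996M - hoard.tar.zst.part

# Create a release on the otherwise-empty repo and attach every chunk
gh release create v1.0 --repo you/big-archive \
  --notes "Split zstd tarball; reassemble with cat before extracting" \
  hoard.tar.zst.part*

# To get the data back later:
#   gh release download v1.0 --repo you/big-archive
#   cat hoard.tar.zst.part* | zstd -d | tar -xf -
```

No promises this keeps you off GitHub Support's radar; the quoted docs only say there's no hard cap on total release size or bandwidth.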
