this post was submitted on 24 Feb 2024
199 points (95.0% liked)
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
you are viewing a single comment's thread
view the rest of the comments
Why is GPU re-encoding bad for quality? Any source for this?
Yeah, that caught my eye too; it seems odd. Most compression/encoding schemes benefit from a large dictionary, but I don't think encoding would be constrained by a GPU sometimes having less total RAM than the main system: in most cases even the GPU's RAM would allow a dictionary larger than the video file. I'm curious.
The way it was explained to me once is that the ASIC in the GPU makes assumptions that are baked into the chip. That made sense to me, since they can't reasonably "hardcode" for every possible variation of input the chip will get.
The great thing, though, is that if you're transcoding you can use the GPU for the decoding half, which works fine, and free up more CPU for the encoding half. See the sketch below.
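For example, with ffmpeg you can ask for hardware-accelerated decode while keeping the encode in software. A minimal sketch, assuming an ffmpeg build with CUDA/NVDEC support; the filenames and quality settings are just placeholders:

```python
# Hybrid transcode sketch: GPU fixed-function decoder handles decoding,
# CPU software encoder (libx265) handles the quality-critical encoding.
# Assumes ffmpeg is on PATH and was built with CUDA/NVDEC support.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "cuda",   # decode on the GPU
    "-i", "input.mkv",    # placeholder input file
    "-c:v", "libx265",    # encode in software on the CPU
    "-preset", "slow",
    "-crf", "20",
    "-c:a", "copy",       # pass audio through untouched
    "output.mkv",         # placeholder output file
]
subprocess.run(cmd, check=True)
```

Decoding is bit-exact by spec, so offloading it to the GPU costs nothing in quality; only the encoder you pick affects the output.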