An overly compressed 4K stream will look far worse than a good-quality 1080p one. We keep upping the resolution without adopting newer codecs or adjusting the bitrate to match.
This is true. That said, if you can't tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you're sitting too far away, in which case there's no point in going with 4K.
At the right seating distance, there is a benefit to be had even by going with an 8K TV. However, very few people sit close enough/have a large enough screen to benefit from going any higher than 4K:
Source: https://www.rtings.com/tv/learn/what-is-the-resolution
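If you want rough numbers on that viewing-distance point, here's a back-of-the-envelope sketch. It assumes roughly 1 arcminute of visual acuity, which is a common rule of thumb; the rtings chart is based on more detailed data, so the distances won't match it exactly:

```python
import math

def max_useful_distance_m(diagonal_in: float, horiz_px: int, aspect: float = 16 / 9) -> float:
    """Rough distance (metres) beyond which individual pixels can no longer be resolved."""
    # Screen width from the diagonal and the aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_in = width_in / horiz_px               # width of a single pixel
    one_arcmin = math.radians(1 / 60)            # ~1 arcminute of visual acuity
    distance_in = pixel_in / one_arcmin          # small-angle approximation
    return distance_in * 0.0254                  # inches -> metres

for label, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'65" {label}: pixels blend together beyond ~{max_useful_distance_m(65, px):.1f} m')
```

For a 65" screen this lands at roughly 2.6 m for 1080p, 1.3 m for 4K and 0.6 m for 8K, which is why so few living rooms actually benefit from anything past 4K.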
I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I came away more confused.
I'll add another explanation for bitrate that I find understandable: you can think of resolution as basically the maximum quality of a display; no matter the bitrate, you can't display more information/pixels than the screen possesses. Bitrate, on the other hand, represents how much information you are receiving from e.g. Netflix. If you didn't use any compression, in HDR each pixel would require 30 bits, or 3.75 bytes, of data. A 4K screen has about 8.3 million pixels, so an HDR stream running at 60 fps would require roughly 1.9 GB/s of download without any compression. Bitrate is basically the measure of that: how much we've managed to compress that data flow. There are many ways you can achieve this compression, and a lot of it relates to how individual codecs work, but put simply, one of the many methods effectively involves grouping pixels into larger blocks (e.g. 32x32 pixels) and saying they all have the same colour. As a result, at low bitrates you'll start to see blocking and other visual artifacts that significantly degrade the viewing experience.
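For anyone who wants to check that arithmetic, a quick sketch (the ~15-25 Mbit/s figure for an actual 4K stream is an illustrative ballpark, not a quote from any particular service):

```python
# Uncompressed data rate for 3840x2160, 10 bits per colour channel (HDR), 60 fps.
width, height = 3840, 2160          # ~8.3 million pixels
bits_per_pixel = 30                 # 10 bits each for R, G and B, no chroma subsampling
fps = 60

bits_per_second = width * height * bits_per_pixel * fps
print(f"Uncompressed: {bits_per_second / 1e9:.1f} Gbit/s "
      f"({bits_per_second / 8 / 1e9:.2f} GB/s)")
# -> about 14.9 Gbit/s, i.e. roughly 1.9 GB/s, versus the ~15-25 Mbit/s a real 4K stream uses.
```

That roughly thousand-fold gap is what the codec has to close, which is why bitrate matters so much.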
As a side note, one cool thing that codecs do (not sure if literally all of them do it, but I think the vast majority) is that not every frame is encoded in its entirety. You have I, P and B frames. I-frames (also known as keyframes) are full frames; they're fully defined and are basically like a picture. P-frames don't define every pixel; instead they define the difference between their frame and the previous frame, e.g. that the pixel at x: 210, y: 925 changed from red to orange. B-frames do the same, but they use both previous and future frames for reference. That's why you might sometimes notice that in a stream, even when the quality isn't changing, every couple of seconds the picture will become really clear, before gradually degrading in quality, and then suddenly jumping up in quality again.
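To make the P-frame idea concrete, here's a toy sketch of delta encoding. It's massively simplified (real codecs work on motion-compensated blocks rather than individual pixels), but the principle of "store only what changed" is the same:

```python
# Toy "video": one full keyframe (I-frame), then only the changed pixels (P-frame style).
WIDTH, HEIGHT = 8, 8  # tiny frame so the output stays readable

def diff(prev, curr):
    """P-frame-style delta: (x, y, new_value) for the pixels that changed."""
    return [(x, y, curr[y][x])
            for y in range(HEIGHT) for x in range(WIDTH)
            if curr[y][x] != prev[y][x]]

keyframe = [[(x + y) % 256 for x in range(WIDTH)] for y in range(HEIGHT)]  # stored in full
next_frame = [row[:] for row in keyframe]
next_frame[3][4] = 42   # only two pixels change between frames...
next_frame[5][1] = 7

delta = diff(keyframe, next_frame)
print(f"Full frame: {WIDTH * HEIGHT} values; delta: {len(delta)} values -> {delta}")
```

Storing two changed pixels instead of 64 is exactly the kind of saving that lets a stream fit into a few megabits per second.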
On codecs and bitrate? It's basically codec = file type (.avi, .mp4) and bitrate is how much data is sent per second for the video. Videos only track what changed between frames, so a video of a still image can be 4k with a really low bitrate, but if things are moving it'll get really blurry with a low bitrate even in 4k.
"File types" like avi, mp4, etc are container formats. Codecs encode video streams that can be held in different container formats. Some container formats can only hold video streams encoded with specific codecs.
ah yeah I figured it wasn't quite right, I just remember seeing the codec on the details and figured it was tied to it, thanks.
For an ELI5 explanation, this is what happens when you lower the bit rate: https://youtu.be/QEzhxP-pdos
No matter the resolution you have of the video, if the amount of information per frame is so low that it has to lump different coloured pixels together, it will look like crap.
The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.
Suppose you've recorded something in 1080p (a lower resolution). You could convert it to 4K, but the pixels that can't be computed from the data have to be made up (interpolated).
In summary, the TV in my living room might be more capable, but my streaming provider probably isn't sending enough data to really use it.
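To put a rough number on "not sending enough data", here's a quick sketch of how many compressed bits per pixel a stream actually gets to spend. The bitrates are illustrative assumptions, since services vary them per title and per scene:

```python
def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float = 24) -> float:
    """Compressed bits available per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1e6 / (width * height * fps)

# Illustrative bitrates only:
print(f"4K stream     @ ~15 Mbit/s: {bits_per_pixel(15, 3840, 2160):.3f} bits/pixel")
print(f"1080p Blu-ray @ ~30 Mbit/s: {bits_per_pixel(30, 1920, 1080):.3f} bits/pixel")
```

Under those assumptions the 1080p disc gets roughly eight times as many bits per pixel as the 4K stream, which is the gap described above.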
I have a 65" 4K TV that runs in tandem with a Beelink S12 Pro mini PC. I run the mini in FHD mode to ease up on resources and usually just watch streams/online content on it, which is 99% 1080p@60. Unless the compression is bad, I don't feel much difference. In fact, my digitized DVDs look good even at their native resolution.
For me 4K is a nice-to-have but not a necessity when consuming media. 1080p still looks crisp with enough bitrate.
I'd add that maybe this 4K-8K race is sort of like MP3@320kbps vs FLAC/WAV. Both sound good when played on a decent system, but FLAC is only really nicer on specific hardware that a typical consumer wouldn't buy. Almost none of us own studio-grade 7.1 systems at home; a JBL speaker is what we have, and I doubt FLAC sounds noticeably better on it than MP3@192kbps.
Interestingly enough, I was casually window browsing TVs and was surprised to find that LG killed off their OLED 8K TVs a couple years ago!
Until/unless we get to a point where more people want, and can fit, 110-inch-plus TVs in their living rooms, 8K will likely remain a niche for the wealthy to show off, more than anything.
Yeah, when I got my most recent GPU, my plan had been to also get a 4K monitor and step up from 1440p to 4K. But when I was sorting through the options to find the few with decent specs all around, I realized that there was nothing about 1440p that left me disappointed, and the 4K monitor I had used at work already indicated that I'd just be zooming the UI anyway.
Plus, even with the new GPU, 4K numbers weren't as good as 1440p numbers, and stutters/frame drops are still annoying. So I ended up just getting an ultrawide 1440p monitor that was much easier to find good specs for, and I won't bother with 4K for a monitor until maybe one day it becomes the minimum, kind of like how analog displays have become much less available than digital displays, even if some people still prefer the old ones for some purposes. I won't dig my heels in and refuse to move on to 4K, but I don't see any value added over 1440p. Same goes for 8K TVs.
After years of saying I think a good 1080p TV, playing a good quality media file, looks just as good on any 4k TV I have seen, I now feel justified........and ancient.
Same.
Also, for the zoomers who might not get your reference to the mighty KLF:
I don't like large 4K displays because the resolution is so good it breaks the immersion when you watch a movie. You can sometimes see that the actors are on a set, or details of the clothing in medieval movies give away that it was made with modern sewing equipment.
It's a bit of a stupid reason, I guess, but that's why I don't want to go above 1080p for TVs.
I've been looking at screens for 50+ years, and I can confirm, my eyesight is worse now than 50 years ago.
The main advantages of 4K TVs "looking better" are...
- HDR support. Dolby Vision especially gives a noticeably better picture in bright scenes.
- Support for higher framerates. This is only really useful for gaming, at least until they broadcast sports at higher framerates.
- The higher resolution. This is mostly wasted on video content, where for the most part the low shutter speed blurs any moving detail anyway. For gaming it does look better, even if you have to cheat with upscaling and DLSS.
- The motion smoothing. This is a controversial one, because it makes movies look like swirly home movies. But the types of video used in shop demos (splashing slo-mo paints, slow shots of jungles with lots of leaves, dripping honey, etc.) do look nice with motion interpolation switched on. They certainly don't show clips of the latest blockbuster movies like that, because it would rapidly become apparent just how jarring that looks.
The higher resolution is just one part of it, and it's not the most important one. You could have the other features on a lower resolution screen, but there's no real commercial reason to do that, because large 4K panels are already cheaper than the 1080p ones ever were. The only real reason to go higher than 4K would be for things where the picture wraps around you, and you're only supposed to be looking at a part of it. e.g. 180 degree VR videos and special screens like the Las Vegas Sphere.
I just love how all the articles and everything about this study go "Do you need another TV or monitor?" instead of "here's a chart on how to optimize your current setup and make it work without buying shit". 😅
Selling TVs and monitors is an established business with common interest, while optimizing people's setups isn't.
It's a bit like the opposite of building a house: a cubic metre or two of cut wood doesn't cost very much, even combined with the other necessary materials, but to get a usable end result people still hire someone beyond the workers who do the physical labour.
There are those "computer help" people running around helping grannies clean viruses off Windows (I mean the ones who are not scammers); they probably need to incorporate. Except then such corporate entities would likely be sued without end by companies wanting to sell new shit. Balance of power.
I think the real problem is that anything less than 4k looks like shit on a 4k tv
1080p scales cleanly to 4K: it's an exact factor of two in each dimension, so every 1080p pixel maps onto a 2x2 block of 4K pixels (rough sketch below).
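A minimal sketch of what that clean 2x mapping looks like, assuming a simple pixel-doubling scaler (real TVs usually apply fancier filtering on top):

```python
import numpy as np

# Each 1080p pixel maps onto an exact 2x2 block of 4K pixels, so no interpolation
# is strictly required -- every value is simply repeated.
frame_1080p = np.arange(6).reshape(2, 3)   # stand-in for a 1920x1080 frame
frame_4k = np.kron(frame_1080p, np.ones((2, 2), dtype=frame_1080p.dtype))

print(frame_1080p)
print(frame_4k)   # the same image with every pixel duplicated into a 2x2 block
```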
I can confirm 4K and up add nothing for me compared to 1080p and even 720p. As long as I can recognize the images, who cares? Higher resolution just means you see more sweat, pimples, and the like.
edit: wait, correction. 4K does add something to my viewing experience, which is a lot of lag due to the GPU not being able to keep up.
Quality of the system is such a massive dependency here. I can well believe that someone watching old reruns from a shitty streaming service, upscaled to 1080p or 4K by a TV they purchased from the supermarket with coupons collected from their breakfast cereal, is going to struggle to tell the difference.
Likewise, if you fed the TVs with a high-end 4K Blu-ray player and any Blu-ray considered reference material, such as Interstellar, you are still going to struggle to tell the difference, even with a more midrange TV, unless the TV gets comically large for the viewing distance and the 1080p screen starts to look pixelated.
I think very few people who still have the old wired Apple earphones they got free with their iPhone 4 would expect amazing sound from them, yet people seem to ignore the same logic for cheap TVs. I'm not advocating for ultra-high-end audio/videophile nonsense with systems costing tens of thousands, just that quite large and noticeable gains are available much lower down the scale.
Depending on what you watch and how you watch it, good-quality HDR with the right content makes an absolute home-run difference between standard 1080p and 4K HDR, if your TV can do true black. Shit TVs do HDR shitterly; it's just not comparable to a decent TV and source. It's like playing high-res lossless audio on those old Apple wired earphones vs. playing low-bitrate MP3s.
This study was brought to you by every streaming service.
Black-and-white antenna TVs from the 1950s were clearer than a lot of TVs today, but they weighed 600 kilograms. Nowadays I buy cheap, small TVs and let my brain fill in the empty spaces like it's supposed to. /s