Submitted by SalmonellaTizz t3_ya0sn9 in gadgets
vinraven t1_it8qfow wrote
Can anyone see any 8K vs 4K difference with normal vision?
elkarion t1_it8s8cu wrote
Yes, you can, but you'll need a very, very large screen to be able to tell.
Granum22 t1_it8t5s7 wrote
Or be sitting 3 inches away
kickerua t1_it8uh3d wrote
3 feet, not 3 inches.
For some people a 50" monitor makes sense; I'd buy one because I'd find it useful.
GibsonMaestro t1_it8znam wrote
A 50" screen doesn't need more than 1080p (and HDR, which only comes with 4k screens).
kickerua t1_it8zsfe wrote
I'm using 32" 4K at the moment, for gaming and productive work it's not a super high density
GibsonMaestro t1_it90o3f wrote
Yeah, but you're what - a foot away? Most people are going to be watching tv or gaming from a couch across the room.
kickerua t1_it91htd wrote
Yes, but I want to use it as a regular monitor, so it's about 80 cm from me.
I totally agree that for the plain TV use case 8K doesn't make any sense.
GibsonMaestro t1_it91mtl wrote
Yes, in your case, 4K and possibly 8K totally make sense.
PeteThePolarBear t1_it9lx7a wrote
I reckon you just need glasses; it's pretty easy to tell the difference with 4K in a lounge room setting, even with a smallish TV.
GibsonMaestro t1_itakchv wrote
To add to this (rather than edit my response): 4K streaming on most services is roughly the same as (and often less than) the bitrate of a 1080p Blu-ray (max 28 Mbps).
Unless you're getting all your media from Apple TV+, which streams at 20 to 40 Mbps, or from 4K discs, your 4K television is outputting the same quality as a 1080p Blu-ray, but with the addition of HDR.
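A rough way to see why the bitrate matters more than the pixel count (a back-of-envelope sketch: the 16 Mbps figure for a typical 4K stream is an assumption, the 28 Mbps Blu-ray cap is the one above, and this ignores codec differences like HEVC vs. AVC):

```python
# Compare the average compressed data budget per pixel per frame.
# Assumes 24 fps for both; 16 Mbps for a "typical" 4K stream is an assumed figure.
def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

stream = bits_per_pixel(16, 3840, 2160)   # typical 4K stream (assumed bitrate)
bluray = bits_per_pixel(28, 1920, 1080)   # 1080p Blu-ray at its ~28 Mbps max

print(f"4K stream:     {stream:.2f} bits/pixel/frame")
print(f"1080p Blu-ray: {bluray:.2f} bits/pixel/frame")
print(f"The disc gets ~{bluray / stream:.0f}x more data per pixel")
```

Four times the pixels fed by roughly half the data means each 4K pixel is squeezed much harder than each Blu-ray pixel.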
walgman t1_itngjec wrote
Yeah, but the modern pirate streamers playing out at max quality are well over 120 Mbps at 4K. That's Plex on Apple TV.
GibsonMaestro t1_itnlivk wrote
Yes, but pirate streamers are easily the minority of (North American/Western European) consumers. Of course, fringe groups can do better. Hell, you can almost consider blu-ray owners a fringe group, at this point.
GibsonMaestro t1_it9vbdv wrote
It's likely the HDR you're noticing, not the pixel density. Are you comparing to other televisions in your house, or upgrades to ones you've gotten rid of? Were the previous tvs low end?
Winjin t1_it95u4d wrote
Yeah, it's not about resolution, it's all about DPI now, basically. Beyond a certain DPI it all becomes completely useless marketing.
Phil152 t1_it8yjed wrote
Definitely, and I'm on the older side (the Pleistocene Age on the Reddit spectrum) with eyesight that is still functional but nowhere near what it used to be.
Which is why we recently upgraded to a large screen 4k OLED. Compared to our old HD flatscreen, it's night and day. The old tv was perfectly functional. It was a decent midlevel tv when we got it. It served us well for many years. The upgrade was a conscious concession to age.
That's 4k. But when we were in the store (the Magnolia section in our local Best Buy, not some exotic high-end specialty store), there were a couple of TVs on display that were just jaw-droppingly good in every dimension. The biggest of them had a price tag of $25,000, which of course very, very few of us would even consider buying unless we won the lottery. Naturally, however, both the picture quality and the price tag attracted my attention, and I felt compelled to ask about it. It WAS unmistakably better than anything else on display. It was a very large-screen 8k OLED.
Yes, you can tell the difference with normal vision. You can tell the difference with significantly suboptimal vision.
The folks at the store were very quick to emphasize that this tv was on the floor simply to demonstrate the technology and alert people to what they might be considering 10+ years from now. Possibly sooner? Who knows? But it's good to see what's out there over the horizon.
They also emphasized that the pictures we were seeing on the 8k were specially made demo pieces, again simply to demonstrate the technology, that there is basically no 8k content available now, and that no one is anywhere close to streaming it. The good news, however, is that prices are coming down. That particular model will be only $17,000 next year. Not that this would make any difference to us.
How soon will 8k be a player in the Harry Homeowner market? Good question.
BenekCript t1_it96tpw wrote
No one is anywhere close to streaming full fidelity 4k and surround audio. I really hope we ditch discs for owned digital downloads that are equivalent or better…but we’re not there yet as far as I have found.
Phil152 t1_it9emkv wrote
My tech sophistication doesn't go much beyond changing batteries and hitting the power button. So a question:
I know that the streamers' ability to stream 4k is still a bit of a mixed bag. Some older content (especially old tv shows) is still SD, but since I don't watch tv shows, that's not a serious issue for us. A lot is still HD. More and more is 4k, but it's a mix. 4k is becoming the standard, but it will take time.
When we upgraded, we had to upgrade our cable speed and swap to a new 4k ready cable box. (I've thought about switching to fios but that's a separate issue.) My tv tells me the quality of the video I'm getting on any given movie.
My question: can you explain what the gap is between the "4k" listed for a given film and the "full fidelity 4k" to which you refer?
Surround sound is not an issue. We considered that, but we're in an old house and the tv is in a finished basement, but the configuration and wiring issues raised a lot of complications. We settled for a high quality sound bar, which for our room is more than enough.
danielv123 t1_it9sboy wrote
It's about bitrate. All video is compressed, and compression introduces artifacts. You can see this on low-res YouTube videos, for example: rather than large squares of uniform color, you see weird blob-like patterns, especially in areas with gradients.
The bitrate is how much compressed data is transferred per second. More bitrate means fewer artifacts, but it's more expensive for the provider.
A typical 4K Blu-ray runs at about 100 Mbit/s. Apple's high-quality streaming tops out at 40, YouTube typically runs about 15 but can reach as much as 40 in some scenes, and Netflix doesn't go past 20.
This is not an inherent streaming limitation, though, it's just about how much the provider wants to spend. I stream shows from Plex just fine at 120 Mbps.
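To put those numbers in perspective, here's a quick sketch of how much data a 2-hour movie moves at each of the bitrates mentioned above (rough figures only):

```python
# Approximate data transferred for a 2-hour movie at the bitrates above.
MOVIE_SECONDS = 2 * 60 * 60

bitrates_mbps = {
    "Netflix 4K (~20 Mbps)": 20,
    "Apple TV+ 4K (~40 Mbps)": 40,
    "4K Blu-ray (~100 Mbps)": 100,
    "High-bitrate Plex stream (~120 Mbps)": 120,
}

for label, mbps in bitrates_mbps.items():
    gigabytes = mbps * MOVIE_SECONDS / 8 / 1000  # Mbit -> GB (decimal)
    print(f"{label:36} ~{gigabytes:.0f} GB per movie")
```

Which is roughly why providers cap it: serving ~90 GB per title instead of ~18 GB costs real bandwidth money at scale.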
Phil152 t1_it9z2n4 wrote
Thanks. There's nothing I can do about what the streaming networks do at the front end, which for a technophobe like me means it's one more thing I don't have to worry or educate myself about. I suppose the streamers will get nudged along by competitive pressures as 4K TVs become more common. (That assumes the difference is enough for most home viewers to notice or care about.)
Do you know what the current market penetration is for 4K TVs?
TheThiefMaster t1_itb0i81 wrote
There's some good information on that topic in this YouTube video about YouTube experimenting with requiring a subscription for 4K viewing, by someone who runs a major YouTube channel and a small side streaming network (which also supports 4K).
It was about 44% penetration for 4k TVs, but for his content, TVs only made up 11% of viewers in total.
I suspect Netflix has more TV streamers so it would be different for them.
BenekCript t1_it9u2xy wrote
For similar reasons (cost), they also use a lower-fidelity sound mix. This is most noticeable in the surround channels. If you have at least a 5.1 system, watch a show you own on a streaming platform and then on 4K Blu-ray. Even people who don't care about such things will notice the difference. The majority don't, though, or would be unwilling to stomach the increased cost to get there with streaming today. It also doesn't help that the general internet infrastructure probably can't support it en masse.
Phil152 t1_it9zmzf wrote
Thanks. I'll try some side-by-side comparisons between Blu-ray and streaming.
killerboy_belgium t1_itnbjly wrote
The big problem is the bitrate, though. Even if streaming services served 4K/8K, if the bitrate is shit it's getting so compressed that it doesn't really matter.
TheThiefMaster t1_itb08zz wrote
Yeah, at present a 1080p Blu-ray is a better picture than a 4K stream, because streams are compressed to hell.
Deadofnight109 t1_it8s8zn wrote
I think the only advantage to having an 8K screen is when the screen size starts getting massive, like bigger than any home TVs. So it'll be good for, say, theaters or big signage, but totally useless in the home.
13900_lP_wasted t1_ita5s3w wrote
Especially in developing countries where TV isn't even 1080p. I've been saying this for a while: TV can't compete with streaming services/YouTube if it won't even do 1080p, let alone 8K.
BigJigglingMelons t1_itrh4wd wrote
Most media consumption by consumers is now on phones or the social sites you mentioned.
Theaters etc. have been going downhill for a while and are the only true settings where these resolutions matter.
Outside of science and medical applications, of course, where the higher the res the better.
Little_Winge t1_italwp0 wrote
I keep seeing the word "signage", but what does it actually mean?
TheThiefMaster t1_itb0ync wrote
Big advertising billboards. The TV style ones.
Little_Winge t1_itcn9p1 wrote
I am dumb, kinda thought it had to do with signing things like documents...
kickerua t1_it8uo9h wrote
Not bigger, actually. If you use a regular 55" TV as a monitor at a distance of ~80 cm, it's already pretty bad even with 4K.
And it makes more sense if it's curved.
Deadofnight109 t1_it8xypl wrote
You're right, distance is also a factor. I was thinking more along the lines of across the room. Although as someone who uses my 55" 4K TV as a monitor from across the room, I couldn't imagine sitting that close to it lol. I think you have to be closer than like 5 ft to notice the difference on a 60" TV.
FlanOfAttack t1_it9uqhv wrote
For me it's a desktop productivity thing. I've been using 27" 1440p monitors for about a decade, and I'd like to high-PPI the desktop and gain a little more screen real estate. To properly double my resolution I'd have to go up to 5K, and that would still be a 27" monitor.
With 8K at 40", I could have the pixel density of a MacBook display, but the screen real estate of two full desktop monitors.
StayyFrostyy t1_it8tchi wrote
I was comparing the two at the store and honestly I couldn't tell the difference.
NelvisAlfredo t1_it8uju2 wrote
Technically for normal TV viewing distance (not computer monitor distance) you can’t even discern the difference between 1080p and 4K even with perfect vision.
Dark_Clark t1_it9gc0p wrote
That’s not even close to being true. I can absolutely tell a difference.
MoltresRising t1_it8vux9 wrote
What? My wife said she couldn't tell the difference until I showed her the same video in 1080p vs 4K while she sat on our couch. Then she said, "It's like the fish are in our living room!" and now she's on board with 4K.
NelvisAlfredo t1_it8wgos wrote
It is likely other advancements like high dynamic range or local dimming that are making the difference for her. Basically at a pretty standard 10ft TV viewing distance the screen would have to be ungodly huge for her to see the difference in pure resolution.
MoltresRising t1_it8wnqa wrote
Article with a scientific study on this?
auctorel t1_it90uf7 wrote
There's a table in this article with the distance you have to be at to be able to tell the difference:
https://www.forbes.com/sites/kevinmurnane/2017/11/01/when-a-4k-tv-looks-just-like-a-1080p-tv/
It really is pretty ridiculously close
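The math behind those distance tables is just visual acuity. A rough sketch, assuming 20/20 vision resolves about 1 arcminute and a 16:9 panel:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~20/20 visual acuity

def max_resolvable_distance_ft(diagonal_in, horizontal_px):
    """Farthest distance at which adjacent pixels can still be told apart."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 geometry
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(ARCMIN) / 12    # inches -> feet

for diag in (55, 65, 75):
    for label, px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
        d = max_resolvable_distance_ft(diag, px)
        print(f'{diag}" {label}: pixels blend together beyond ~{d:.1f} ft')
```

On a 65" set, for example, the 4K pixels already blend together past roughly 4 ft, which is why a couch at 10 ft makes 8K academic.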
shifty_coder t1_ita0uoz wrote
I’d go with LTT’s input over some tech journalist’s on Forbes’
QueefBuscemi t1_it90jfb wrote
Someone test this man's wife in laboratory conditions!
GibsonMaestro t1_it90gvh wrote
They are plentiful and easy to find. /u/NelvisAlfredo is speaking the truth, and it should be common knowledge by now.
A mid range 1080p plasma will kick the crap out of any non HDR LCD/LED tv in terms of contrast, color accuracy, and refresh rate.
HDR is a game changer, however.
thefinalcutdown t1_it943lp wrote
Very much this. I happen to have both a low-end 4K TV for general viewing and a top-of-the-line (at the time) 1080p plasma for "movie night" viewing. There's absolutely no comparison: the plasma is miles and miles ahead in every way. Contrast and black levels make way more of a difference than resolution, but that's harder to market to the general public. I have above-average vision (close to 20/10), so I can see a slight difference in the resolutions at 10-12 ft viewing distances, but that's minuscule compared to the areas in which the plasma excels.
ETA: the low-end 4K is 50” and the plasma is 60”
I-seddit t1_ita4cbk wrote
Bullshit.
mark_99 t1_it8w7r2 wrote
You really can, although only in parts of the image that are high contrast and/or high frequency.
imakesawdust t1_itaaqsy wrote
IKR? I have a 75" TV but my couch is 11 feet away. I doubt I'd be able to tell the difference between 4K and 8K content. Hell, I'm not convinced I can really tell the difference between 1080p and 4k content at this distance.
pseudobipartisan t1_it9fejt wrote
I can’t tell when it’s a normal screen but I can definitely tell on my oculus.
wolfofremus t1_it9x019 wrote
I kinda wish my 48" OLED is 8K instead of 4K. If you have to read a lot on the screen, a high res monitor will change your life.
WetDehydratedWater t1_itb3j8o wrote
Yes, easy to see, particularly if you use an OLED as a large-format monitor. I can see 4K pixels on a 65-inch screen easily at regular viewing distances, 5 ft back or so. EU rules for TVs are dumb: eco settings ruin TV display quality and are the first thing to turn off on any TV.
rocketwidget t1_itq8odm wrote
Technically yes, especially if you have a very large screen or sit very close. But the effect will always be very subtle. It's a diminishing returns problem.
There is also the content problem. 8k content exists, but it's very rare.
(P.S. Just my opinion, but the thing that makes people go "wow" with existing 4k TVs is mostly not the 4k part; it's improvements to things like High Dynamic Range (HDR) and Wide Color Gamut, the quality of which varies significantly by set. I'm personally a bit cynical about 8k because of the price premium.)
Knuddelbearli t1_it913au wrote
I don't even see a difference from FHD to 4K (I see one when I'm looking for differences, but not when it's just Netflix and chill)...
Winjin t1_it9661u wrote
Yeah, not to mention that most people (like 90%?) are probably watching compressed videos that are barely real full HD.
Like I remember thinking whether I need 4K to watch movies... And then I reminded myself that I have dirt cheap wifi and watch everything in compressed 720p.
danielv123 t1_it9r180 wrote
YouTube has done the testing and it turns out most people are fine with 360p somehow.
ThePu55yDestr0yr t1_ita2bkk wrote
Tbf the majority demographics of YouTube are like:
-
Half are probably botnets inflating view counts, astroturfing, or annoying content creators
-
Children who don't care about HD because they don't know how to press buttons on their iPhone or TV, or are mindlessly auto-playing kid shows or some dumb content creators
-
Everyone else, a fifth or a quarter of whom are probably vision-impaired or older people for whom HD may not matter. Also music-video viewers who don't give a shit about video quality and just replay the same thing over and over.
The HD demographic is gonna be the minority when you need good internet (or patience) and have to not fall into the above categories.
Personally I prefer HD vids, but there's no way I can tolerate video buffering and shit ads on a trash connection.
TheThiefMaster t1_itb14g7 wrote
If you're watching on a phone in portrait mode, you probably don't have many more pixels than that for the video anyway.
I've done it a few times, scrolling comments while the video played.
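A rough sketch of that point, assuming a common 1080x2340 phone panel held in portrait with a 16:9 video scaled to fit the width:

```python
# Effective video window on a phone held in portrait (assumed 1080 x 2340 panel).
PANEL_W, PANEL_H = 1080, 2340          # portrait orientation (assumption)

video_w = PANEL_W                      # a 16:9 video fills the panel width
video_h = round(video_w * 9 / 16)      # -> roughly 1080 x 608 px on screen

for label, src_w in (("360p", 640), ("480p", 854), ("720p", 1280)):
    scale = video_w / src_w
    print(f"{label}: {scale:.2f}x scaling to fill a {video_w}x{video_h} window")
```

In other words, 720p already has more pixels than the window it's drawn into in portrait, so anything above that is largely wasted there.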
TheBigFeIIa t1_it9vf3l wrote
Hardly surprising given decades of fuzzy analog TV
mileswilliams t1_it9qt7y wrote
First thing I ever watched in HD was South Park.
Knuddelbearli t1_itb8i6e wrote
Yes, anime/animation and porn are where I'm most likely to notice a difference on my TV. But still, FHD (on 42 inches; with 65"+ I'd eventually go 4K) is actually enough for me. I prefer better colours and black levels.
Whatifim80lol t1_it8r6cd wrote
Lol no. But that doesn't stop people from feeling like they can. We don't need 144Hz monitors either; the human flicker-fusion threshold is around 60Hz, and anything over 90 makes very little difference for our eyes.
Edit: your eyes aren't seeing those higher refresh rates, they're just seeing a crispy picture and less blur as the frames change. The way frames are drawn by games makes this difference, not the monitor itself per se. Gamers always take this news hard for some reason, I guess because of marketing or something? The difference you see isn't what you think it is. Your eyes physically don't work that way.
Basically, if there is a LOT of change in what needs to be drawn from frame to frame (like spinning around 360°), the change between those frames appears muddier as they are quickly drawn across the monitor. Higher fps/Hz just spreads this muddiness across more frames, so each frame looks slightly crisper than it would otherwise.
Your eyes do not see more frames in a second just because you're playing at a high frame rate.
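One way to put numbers on the "change between frames" point (a rough sketch; the pan speed is an arbitrary example, one full 4K screen width per second):

```python
# Per-frame image displacement during a fast pan. Bigger jumps between
# consecutive frames read as judder/smear; higher frame rates make each step smaller.
PAN_SPEED_PX_PER_S = 3840  # arbitrary: one 4K screen width per second

for fps in (30, 60, 90, 144, 240):
    step = PAN_SPEED_PX_PER_S / fps
    print(f"{fps:3d} fps: the image jumps ~{step:.0f} px between frames")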
tycoon282 t1_it8rpmr wrote
Nah, high refresh is 👌🏼 once you see 165hz, 60 is 🤢
NerdMouse t1_it8rorl wrote
You say that, but doesn't VR say that 90Hz is the bare minimum needed for a smooth experience? Cause I have a VR headset and I can definitely tell when it's just at 90Hz.
rbnhd_f t1_it8s7zm wrote
Yes, this person is just plain wrong and talking out of their ass. There are diminishing returns the higher you get, but 30fps is absolute trash compared to 120 or 144.
that_other_goat t1_it8urke wrote
If I'm remembering correctly, it's to avoid the issues early VR had, like eye strain, headaches, and nausea. I could be misremembering, though.
The Virtual Boy was terrible, and there were often vomit buckets beside the hang-glider sims at the arcade.
Diggsey t1_it8s6ln wrote
This is not true at all. High refresh rates are very easily detectable, and the human eye doesn't operate at any particular "refresh rate". Humans can detect very tiny differences in reaction times (e.g. when you move a mouse and see that movement reflected on the screen). A 30Hz computer monitor is almost unusable with a mouse. 60Hz is fine for normal usage, but for gaming, say, you can get a measurable advantage from 144Hz instead, and it feels smoother. There's probably no reason to ever go above 200Hz.
Regarding resolution: yeah, 8K is pointless for most use cases, but it really depends on how it's going to be viewed. 8K projected on the side of a building could make sense if you expect people to look at it up close. Meanwhile, anything above 1080p is completely pointless on a watch face...
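To put rough numbers on the mouse-latency point (a sketch that only counts the display's refresh interval and ignores game, driver, and panel latency):

```python
# Refresh interval at common rates, i.e. the worst-case extra wait before
# your latest input can even start to appear on screen.
for hz in (30, 60, 120, 144, 240):
    frame_ms = 1000 / hz
    print(f"{hz:3d} Hz: one refresh every {frame_ms:5.1f} ms "
          f"(up to ~{frame_ms:.0f} ms added wait)")
```

Going from 60Hz to 144Hz cuts that worst-case display wait from ~17 ms to ~7 ms, which is the kind of gap people can feel when dragging a mouse.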
thecist t1_it8rtj4 wrote
It’s just you who can’t see a difference
elkarion t1_it8spip wrote
The high refresh rate is not for our eyes; it's to sync more accurately to what's happening. Say the game runs at 300 fps. At 60Hz you only get one frame every 1/60th of a second, but if you go to 120Hz the time since the last rendered frame is reduced, so your display will be more up to date.
120 should be standard. Stop with this blurry shit that Hollywood put on us as 24.
I own a 144Hz monitor, and I notice when it's not smooth and see the dips to sub-60 when a game is stressing the hardware.
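A very rough model of that "how stale is the frame on screen" idea (a sketch; it assumes the display always shows the most recently completed game frame):

```python
# Average age of the frame you're looking at: roughly half a render interval
# (how old the frame was when scanned out) plus half a refresh interval
# (how long you keep seeing it before the next refresh replaces it).
GAME_FPS = 300  # the 300 fps example from above

for display_hz in (60, 120, 144):
    staleness_ms = 1000 / (2 * GAME_FPS) + 1000 / (2 * display_hz)
    print(f"{display_hz:3d} Hz display + {GAME_FPS} fps game: "
          f"frame is ~{staleness_ms:.1f} ms old on average")
```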
TeFD_Difficulthoon t1_it8s500 wrote
AHAHAHAAHHAHAH
nero519 t1_it8tau6 wrote
You either have never used a 144Hz monitor or you have an actual eye problem; it's pretty much impossible not to notice the difference.