Submitted by Sorin61 t3_11c7r3x in technology
Comments
cesium-sandwich t1_ja2c73b wrote
...and a CPU+GPU that can generate a full frame of AAA graphics in under 1 ms. Good luck with that. Same thing with Apple's "Retina" displays: yeah, they're nice to look at, but it's REALLY hard to feed a high-res image at any decent framerate.
Doubling the frame dimensions means quadrupling the pixel count in the framebuffer, which means roughly 4x more horsepower to feed it.
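As a rough back-of-the-envelope illustration (the resolutions and the 4-bytes-per-pixel figure are assumptions for scale, not numbers from the article), here's how raw pixel throughput grows with resolution and refresh rate:

```python
# Crude pixel-throughput arithmetic: ignores compression, variable refresh,
# partial updates, etc. Assumes 4 bytes/pixel (RGBA8) for the raw figure.

def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for hz in (60, 1000):
        gpix = pixels_per_second(w, h, hz) / 1e9
        print(f"{name} @ {hz:4d} Hz: {gpix:5.2f} Gpix/s, ~{gpix * 4:5.1f} GB/s raw")
```

Going from 1080p to 4K quadruples the pixel count, and going from 60 Hz to 1000 Hz multiplies the throughput by roughly another 17x.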
quettil t1_ja2w81y wrote
Is it not possible to do some sort of real-time interpolation between frames?
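For illustration, a deliberately naive version of the idea: a plain cross-fade between two rendered frames. Real frame generation (motion vectors, ML reconstruction) is far more involved, so treat this purely as a sketch:

```python
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Blend two frames: t=0 returns frame_a, t=1 returns frame_b."""
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(frame_a.dtype)

# Two fake 1080p frames, plus a synthetic "in-between" frame at the midpoint.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
mid = interpolate_frames(a, b, 0.5)
```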
theinvolvement t1_ja2lmab wrote
What do you think about fitting some logic between the pixels at the cost of pixel density?
I was thinking it could handle some primitive draw operations, like vector graphics and flood fill.
Instead of trying to drive every pixel, you could send tiles of texture with relatively low resolution, and use vector graphics to handle masking of edges.
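A toy sketch of that idea (everything here is hypothetical: a low-resolution tile upscaled on the display side, with a vector-defined edge used to cut a sharp mask at full pixel density):

```python
import numpy as np

TILE, SCALE = 16, 8                # a 16x16 texture tile shown over a 128x128 patch

tile = np.random.rand(TILE, TILE)                  # low-resolution "texture" data
upscaled = np.kron(tile, np.ones((SCALE, SCALE)))  # nearest-neighbour upscale

# Vector-defined edge: keep pixels on one side of the line a*x + b*y + c = 0.
a_coef, b_coef, c_coef = 1.0, -1.0, 10.0
ys, xs = np.mgrid[0:TILE * SCALE, 0:TILE * SCALE]
mask = (a_coef * xs + b_coef * ys + c_coef) >= 0

patch = upscaled * mask            # soft texture inside, sharply masked edge
```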
asdaaaaaaaa t1_ja2pr26 wrote
I'd imagine every extra step between "generate graphics" and "display" adds a considerable amount of latency. From my understanding we're already at the point where having the CPU physically close to related chips (memory is one, IIRC) makes a difference. Could be wrong, but from my understanding the last thing you want is to throw a bunch of intermediate hardware/steps into the process if you can avoid it.
cesium-sandwich t1_ja2ps0i wrote
There are some economies of scale involved, especially for high-density displays.
The GPU does a lot of the heavy lifting.
But even simple-ish games often take multiple milliseconds of CPU time to simulate one frame, and that work can't be offloaded to the GPU, so doubling the framerate means halving the time available for physics, gameplay, and other CPU calculations.
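To put rough numbers on that (simple arithmetic, not figures from the article):

```python
# Per-frame time budget at various framerates. Milliseconds spent on
# physics/gameplay/AI come out of this budget regardless of GPU speed.
for fps in (30, 60, 120, 240, 1000):
    print(f"{fps:4d} fps -> {1000.0 / fps:6.2f} ms per frame")

# At 1000 fps the whole budget is 1 ms; a game that needs 4 ms of CPU
# simulation per frame can't get there no matter how fast the GPU is.
```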
rumbletummy t1_ja35lae wrote
You mean like CAD?
theinvolvement t1_ja4bihn wrote
I'm not sure. What I'm thinking of is a GPU that can output tiles of image data, plus an outline that trims each image to a sharply defined shape.
So the monitor would receive an array of images tiled together, along with instructions to trim the edges before displaying them on screen.
It's kind of a pipe dream I've had since hearing about vector-graphics video codecs last decade, and microLEDs a few years ago.
ElasticFluffyMagnet t1_ja2pjo1 wrote
It's mostly because of Denuvo. I can manage 100-110 fps at 1440p, and I'm running it on a 2080 Ti (all settings on high).
cyniclawl t1_ja3ymk3 wrote
How? I'm running high on a 3080 and none of my hardware is being taxed at all
ElasticFluffyMagnet t1_ja473ma wrote
Did you mean me or the guy above me?
cyniclawl t1_ja47h2a wrote
I meant you, curious what I'm doing wrong lol. I'm getting 60 in cinematics but as low as 10fps in some parts
ElasticFluffyMagnet t1_ja49s9f wrote
Well, I think it's important to have your drivers up to date and to check that you meet all the other requirements. I've seen people with 1660s run it very well at 1080p, and people with 4090s running it insanely well at 4K with 60+ fps. But then there are people with 1080s, 4070s, and everything in between having problems.
The thing is, I'm running the game and it's using 20 GB of RAM, not the 16 they say should be enough. Don't know how much influence that has on performance, though.
If you have low fps, check links like this to see if any of them help. In this case, though, I believe Denuvo really doesn't help either. There's a video of Hogwarts with and without Denuvo (sort of without), and the difference is quite noticeable.
If none of the things in those videos help, the only thing you can do is contact the makers of the game, or file a ticket. Seems to me it's mostly luck of the draw whether your system runs it smoothly, provided you meet all the requirements. But seeing that my RAM usage is way higher, I wonder if the other requirements are off too.
Edit: added some stuff and fixed typos
458_Wicked_Pyre t1_ja4be38 wrote
Unreal Engine; it's a single-core CPU bottleneck.
Exci_ t1_ja2q6g0 wrote
A 4090 at 1000fps is basically free tinnitus.
alice_damespiel t1_ja2hvdv wrote
1 kHz would be beneficial in literally every use case except modern 3D games.
CatalyticDragon t1_ja2m91r wrote
A 7900xtx already pushes 500-600 fps in Valorant (and other similar games). People are getting 300+ on a 3060ti.
One more generation of GPUs and some popular eSports games will hit 1,000 fps.
So it makes sense to work on displays now.
evicous t1_ja3r9ri wrote
I don't know why you're getting downvoted. We all joke about 1 kHz being unacceptably stuttery for esports, but… we're well on our way to doing that on the high end. Frankly, given the CPU bottleneck on a 4090/7900 XTX at 1080p, we might actually already be there with GPUs, or we're very close.
A 1 kHz 1080p display will already be very usable with adaptive refresh, honestly.
CryptographerOdd299 t1_ja3vq0f wrote
Aren't CRTs capable of insanely high refresh rates?
deep_anal t1_ja3k85q wrote
Classic r/technology take. Cool new technology being discussed, "yea but what about all this other bullshit that makes this trash and we shouldn't be excited or talk about it in any way shape or form."
ElementNumber6 t1_ja5mljj wrote
This is nothing unique to any particular subreddit. The entire world is salty about the state of GPUs right now, for one reason or another.
sameguyontheweb t1_ja34z4d wrote
*with RTX enabled
rumbletummy t1_ja35dtc wrote
Maybe eye tracking and foveated rendering could take advantage?
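Purely as a toy illustration of the idea (the radii and shading rates below are made up, not taken from any real headset or API):

```python
import math

def shading_rate(px: float, py: float, gaze_x: float, gaze_y: float) -> float:
    """Fraction of full shading resolution to use at a pixel, given the gaze point."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist < 200:      # fovea: full detail
        return 1.0
    if dist < 600:      # near periphery: a quarter of the samples
        return 0.25
    return 0.0625       # far periphery: very coarse

print(shading_rate(960, 540, 960, 540))    # at the gaze point -> 1.0
print(shading_rate(1900, 540, 960, 540))   # far periphery -> 0.0625
```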
Fickle_Ball_1553 t1_ja3s0t1 wrote
Blame Denuvo.
IHadTacosYesterday t1_ja3upav wrote
In other words... "Calls on NVDA!"
azibuga t1_ja40u79 wrote
Right? Exactly my thoughts when I saw this
king0pa1n t1_ja4eb07 wrote
This would be incredible for old FPS games like Quake
[deleted] t1_ja4mx8q wrote
[deleted]
gliffy t1_ja4x5o7 wrote
Idk, I have a 6900 XT and I'm getting a solid 75 fps on ultra.
TDYDave2 t1_ja2aflk wrote
Does that mean Hogwarts Legacy is the new Crysis?
tnnrk t1_ja2f08t wrote
It’s just poorly optimized
TDYDave2 t1_ja2f6sq wrote
Then I will be optimistic about it being better optimized in the future.
Certain_Push_2347 t1_ja381rv wrote
Just get it now. There's nothing wrong with it unless your computer doesn't meet the minimum requirements, which are high compared to the majority of previous games, but it runs well.
chaivpush t1_ja40sma wrote
I have well beyond the min requirements and my frames still tank into the 40s and 50s while moving in Hogwarts and Hogsmeade. Definitely wait until they drop a performance patch.
[deleted] t1_ja56ok8 wrote
[removed]
Certain_Push_2347 t1_ja41ux6 wrote
Are you running on the minimum settings, lol? So many people don't even have their PC set up properly, and that's why they experience issues in games: Windows fighting Nvidia software fighting some other third-party software fighting the game, a quad-core processor with multithreading disabled, etc.
chaivpush t1_ja421jr wrote
Trust me when I tell you that I know how to optimize my PC, dipshit.
Certain_Push_2347 t1_ja428ee wrote
Lmao, so not minimum. Got it. Makes sense.
BobbyBorn2L8 t1_ja5ukc8 wrote
It's well known these days that 'AAA' devs aren't optimising properly for PC. Don't defend the practice when people are clearly having issues.
Certain_Push_2347 t1_ja6msd0 wrote
Lol that's just not true.
BobbyBorn2L8 t1_ja6z2t1 wrote
How so? There are many games today that should not be performing as badly as they are on PC; they aren't nearly good-looking enough to justify it.
Name me any decent, good-looking AAA game and I guarantee it was plagued with poor performance on launch and for many months after. Hogwarts Legacy is the most recent to come to mind.
Dead Space Remaster had stuttering and framerate-drop issues. Elden Ring performed awfully on release and still has noticeable issues today.
Certain_Push_2347 t1_ja7ohno wrote
I've already explained this. It's not like some of us have an exclusive patch that makes the game run at 60fps no problem. It's a properly working PC with the required hardware.
BobbyBorn2L8 t1_ja806iz wrote
Yeah, some hardware just isn't properly optimised for by the devs, so while you may get 60 fps, someone with different hardware (not necessarily better or worse) will have a different experience because the devs haven't optimised properly for their setup.
How do you not get this?
Certain_Push_2347 t1_ja80v1y wrote
I get it. The problem is you understanding.
BobbyBorn2L8 t1_ja8hr3o wrote
Hahahahaha, brilliant. You have no clue, hence your clueless statement:
>Just get it now. There's nothing wrong with it unless your computer doesn't meet the minimum requirements, which are high compared to the majority of previous games, but it runs well.
You clearly have no clue what you are talking about
https://www.pcgamer.com/hogwarts-legacy-february-patch-notes/
peace out
Certain_Push_2347 t1_ja9u0m3 wrote
I'm guessing you didn't read your own article? People who can't run dx12 properly are having shading issues lmao.
BobbyBorn2L8 t1_jad7zi1 wrote
>Since launch, Hogwarts Legacy PC players have reported stuttering and crashing while playing the game, and it seems to be caused by how it loads shaders. Unreal Engine games using DirectX12 have a tendency to chug when shaders load in for the first time, and it doesn't matter how good of a gaming rig you have. Final Fantasy 7 Remake was a particularly egregious example of shader sin, and for some, Hogwarts Legacy is just as bad
What dream are you living in? This is the definition of piss-poor optimisation. Stop simping for companies; they 100% should be criticized for this.
Certain_Push_2347 t1_jad9myj wrote
Lmao I don't think you understand how computers work. The game is fine.
BobbyBorn2L8 t1_jada3j1 wrote
Clearly not, when there are widespread performance issues caused by shoddy loading of shaders. You don't understand how software works; your own source admits they messed up the shader loading.
You are truly a delusional fanboy. This is why companies can get away with releasing buggy games nowadays.
Certain_Push_2347 t1_jadap3c wrote
I'm not sure what you're even talking about now. I didn't give a source for anything and you're literally repeating what I've said. Perhaps do some reading on computer performance and maybe you'll fix your problem.
BobbyBorn2L8 t1_jadbtty wrote
Sorry, I forgot I linked the source; my brain is fried, and I still understand how computers work better than you do.
And I don't have the problem because I don't own the game, but plenty of people who do own it are getting performance hits where they shouldn't because of poor optimisation. The article I provided confirms this, yet you still sit here and argue that it's somehow a consumer problem.
The game shouldn't be performing this badly for people.
Certain_Push_2347 t1_jae5thp wrote
Lmaooo never even played it but understand it's not hardware issues somehow.
BobbyBorn2L8 t1_jae6zur wrote
Are you dense? You do realise other people's experiences are literally out there, right? Are they all just lying? Do performance issues never happen because of developer/management incompetence or tight deadlines? You've got an article that is literally based on people's experiences and tells you why the software optimisation is having issues; hell, the article is about a fix for performance issues that didn't even fix them.
If there wasn't an issue, why did the developers have to release a fix? Explain that one. Are the developers lying too?
Certain_Push_2347 t1_jaeuagv wrote
Hopefully you learn something from this. It's okay to not understand but you shouldn't be so aggressive. Makes you look bad. No one judged you before.
atchijov t1_ja2dsqi wrote
10k Hz refresh rate is not the same as 10k fps. Analog movies were shot at 32 fps… and no one complained about “smoothness”.
So anything above 32 is mostly to fool our brain for some “beneficial” purpose.
Ordinary_Fun_8379 t1_ja2gwkn wrote
Movies are shot at 24fps and are noticeably stuttery during action and fast pans. The “feel” of 24fps is so intertwined with what audiences expect a movie to look like that high frame rate films like The Hobbit look wrong to most people.
asdaaaaaaaa t1_ja2pz6k wrote
Agreed. Reminds me of the "soap opera effect": soap operas used to be shot on higher-framerate video (I forget the exact amount), which led people to perceive a smoother image as "low quality" because that's the kind of show that used it. Don't know the technical specifications, just that even I had to adjust what I viewed as "high quality" when more studios started doing the same.
PmMeYourBestComment t1_ja2h03a wrote
Ever seen a fast pan at 30 fps? You'll see stuttering. The human eye can easily see a difference above 60 fps.
MadDog00312 t1_ja26c0q wrote
Holy crap, being compatible with LCD production, with crazy PPI and up to 1000 Hz capability, would be amazing by itself (when it comes to new materials science).
The fact it's also silicon based, power efficient, and could realistically be 5 years away is so cool!
There's actually a chance my next TV could have this!
tiktaktok_65 t1_ja2dxdu wrote
What about the bandwidth requirements, texture resolutions, and the sheer scale of hardware needed to power/leverage all that? AAA development times are already 6-10 years for fundamentally new projects that push the technical edge rather than iterating on established franchises and tech (simply because quality standards are so high). The kind of hyper-realism enabled by that display tech, if it ever arrives, will probably take generations to be fully exploited.
PmMeYourBestComment t1_ja2gwsw wrote
You don't need 1000 Hz content to leverage a 1000 Hz display. It's much easier on the eyes.
Even if the content is only 60 fps, it will be so much smoother.
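Rough arithmetic to illustrate (back-of-the-envelope assumptions, not figures from the article): at 1000 Hz each refresh lasts only 1 ms, so even 60 fps content gets a much finer timing grid to land on, with room left over for tricks like black-frame insertion or interpolation.

```python
panel_hz, content_fps = 1000, 60

refresh_ms = 1000 / panel_hz        # 1.0 ms per refresh cycle
frame_ms = 1000 / content_fps       # ~16.67 ms per content frame
print(f"each content frame spans ~{frame_ms / refresh_ms:.1f} refresh cycles")

# On a 60 Hz panel a late frame slips by a whole 16.7 ms; on a 1000 Hz
# panel it slips by as little as 1 ms, so judder and added latency shrink.
```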
MadDog00312 t1_ja3wxqv wrote
You're right, of course, if they launch with everything at once, which they might for the super rich. We'll likely see lower resolutions and/or display scaling for a while, since the content isn't there yet and, quite frankly, the processing power needed to deal with potentially 100 million pixels is going to be huge.
The whole point of the article, and of my comment, is that this is not just some research paper. It uses tools and industrial processes similar to ones that already exist.
This is materials science and engineering at its finest. They've already proven that it works and how to do it. Now they just need money to push it across the finish line.
Professor of materials science and engineering, for whatever it's worth.
MadDog00312 t1_ja5kat8 wrote
I just want bright, accurate color and pitch blacks. I don't need some insane-resolution gaming monitor, mainly because of GPU prices for true 4K gaming; I want an amazing 100-inch TV that isn't $60,000. This technology has the potential to make that a reality.
ElementNumber6 t1_ja5n3dx wrote
MicroLED displays have been commercially possible for 5 years, and they scale extremely well: a 100" TV panel could be produced with (compared to other such technologies) not much more work than a 1.5" watch display. And they're better in just about every possible way.
And yet they remain future tech for the most part, likely because existing technologies still have roadmaps to be milked for many years to come.
MadDog00312 t1_ja5v5c8 wrote
A microLED will likely never be as cheap to manufacture as these will be (simply due to complexity), nor as long-lasting. It's impossible due to both physics and thermodynamics.
I will gladly bore you with the actual science if you want (I teach materials science engineering), but I'm not going to type out a long response if you're not interested 😀.
I'll add one caveat, because I can't access the actual research yet (yay peer review!): if the research team did what they claim to have done, this is a multibillion-dollar patent in the works!
escobarshideout t1_ja77vvl wrote
My TV does 120Hz but almost nothing supports it apart from certain games with certain expensive hardware.
MadDog00312 t1_ja7ppq7 wrote
Unfortunately that's not likely to change nearly as quickly as the picture quality. However, if the research is legit, you could have a TV with better-than-OLED picture quality without the OLED pricing. The 1000 Hz refresh rate is more an indication of how fast the pixels can change than of any current need for it.
Allaun t1_ja2qsvk wrote
The use case I could see is virtual reality. The screen-door effect is a difficult thing to overcome.
PM-ME-YOUR-TECH-TIPS t1_ja6sse9 wrote
Screen door effect is already nearly unnoticeable on current headsets.
glacialthinker t1_ja3oq5p wrote
So, relying on a lit backplane, can this particular metasurface fully block the light, or will there be light-bleed or some minimal transmission? This is one of the limitations of LCD (controlling transmission of filtered light) versus various LED options (emitting light).
opaz t1_ja3wkzk wrote
"Affordable" for the manufacturer, I bet they'll decide to keep the higher margin for themselves
TastyTeeth t1_ja3yacs wrote
Never click on articles with "could" in the headline.
Spareo t1_ja4l1jz wrote
I’m still waiting to get my 5k2k 38”
ComfortableSock2044 t1_ja33k31 wrote
Thank god -- more screens!
[deleted] t1_ja3spnk wrote
[removed]
Value-Gamer t1_ja4o09n wrote
There really is no need for this; they just need to develop a screen tech that isn't sample-and-hold.
mazeking t1_ja5lv15 wrote
And then try this in 8K… They should stick to improving refresh rate instead of aiming for whatever comes next, 16K or 12K.
You'd probably need a $10,000 computer to drive a very high refresh rate on such a screen.
cute_viruz t1_ja5touo wrote
This will never make it to the public.
AustinJG t1_ja66tcz wrote
But will retro games look like shit on this one, too?
I swear I need to get me a little CRT TV for my SNES stuff.
WMHat t1_ja6z4nr wrote
Not sure why, when even a 120 Hz refresh rate is enough, but okay.
jeffyoulose t1_ja3w8q8 wrote
All the better to see the nuclear Armageddon that is coming.
Alternative_Log3012 t1_ja2td9a wrote
Losers will say that we only see in 30fps.
quettil t1_ja2wdxe wrote
What will this be used for? TV runs at 24 fps, and games struggle to hit 60 reliably. CS 1.6, maybe.
jesman0 t1_ja3ha7v wrote
Why? Nobody can tell a difference over ~70 fps anyway.
EyeLikeTheStonk t1_ja29tfm wrote
Now all you need is a graphics card that can push 1000 fps in games without costing $10,000.
People are playing Hogwarts Legacy with top of the line graphics cards and registering below 60 fps.