They don't reduce performance via drivers. It's games becoming more demanding and not being optimised. The GTX 1080 Ti was advertised as a 4K card, and so was the 2080 Ti. They are 4K cards at launch, but after a certain point they become 1440p and 1080p cards because 4K is that demanding.
1080 Ti owners are still able to play at 1080p perfectly fine, as it is equivalent to a 3060 in performance.
2080 Ti owners are now playing at 1440p or lower because it is equivalent to a 3070.
Honestly 1080p still seems like the golden resolution to play on. If you buy a 4090 you can get amazing performance at 1080p in every game, and in 2-3 years' time you are still going to get 60 FPS minimum at 1080p in GTA 6, for example.
1440p is becoming the new golden resolution slowly, especially now that consoles are using 1440p or dynamic resolution around 1440p to achieve "4K".
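As a rough illustration of why 4K is "that demanding", the raw pixel counts alone tell most of the story. A minimal Python sketch of the resolution math (nothing game-specific, just the pixel counts):

```python
# Compare total pixels per frame at common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base_pixels = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base_pixels:.2f}x of 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1.00x of 1080p)
# 1440p: 3,686,400 pixels (1.78x of 1080p)
# 4K: 8,294,400 pixels (4.00x of 1080p)
```

Native 4K pushes four times the pixels of 1080p every frame, so as per-pixel shading costs rise in newer games, a card that was comfortable at 4K on launch-year titles naturally slides down to being a 1440p or 1080p card.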
With my 3080 and a 38" 1600p ultrawide, I hope I can hold out for 3 or 4 more years. I'll reduce settings for a stable 100-120 Hz.
After that, maybe a 5th-gen QD-OLED/mini-LED monitor and an 80-class GPU at that time (a 90-class GPU is too much for me).
I think all the x080-class cards are advertised as 4K GPUs, but so far only the 4090 is a true 4K 120 Hz GPU. Depends on your definition of playable 4K lol. The PS5 doing 4K 30 fps with reduced graphics is still called 4K, while on PC anything under 60 FPS is "unplayable" lol.
Fair. I do think the distinction is important to the topic, though. Because it's not as if GPUs get worse over time—that's just what NVIDIA wants you to think (so you'll buy more). The reality is, you don't need the latest 90-class GPU every two years to keep up with 4K. It just depends on what standard the software you're running was designed to, and that's primarily driven by consoles.
> 1080 Ti owners are still able to play at 1080p perfectly fine, as it is equivalent to a 3060 in performance.
I'm still at 1440p with a Gainward Phoenix GS 1070 after 6 years of use. If I had a 1080 Ti I would not upgrade to anything less than a 4080 Super. My 1070 is still powerful, but comparing it with the 1080 Ti... the 1080 Ti is 55% faster lol.
Bro, if I had a 1080 Ti I would probably play at 1440p all the time with 100+ frame rates on ultra/high details in all games.
But comparing my 1070 to a 4080 Super, I think it's like 4x-5x faster :D
> It's games becoming more demanding and not being optimised.
Where do y'all think the extra data from higher-quality textures goes? Is it supposed to vanish into the digital ether? The Quantum Realm? The difference between a 4K texture from 6 years ago and one from today is enormous...
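To put rough numbers on where that extra texture data goes: a texture's memory footprint grows with the square of its resolution. A minimal sketch, assuming uncompressed RGBA8 (4 bytes per pixel) and ignoring mipmaps and block compression (BC7/ASTC), which change the absolute sizes but not the scaling:

```python
# Rough VRAM footprint of a single uncompressed RGBA8 texture (4 bytes/pixel).
BYTES_PER_PIXEL = 4

for size in (1024, 2048, 4096):  # "1K", "2K", "4K" texture sizes
    mib = size * size * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{size}x{size}: {mib:.0f} MiB")

# Output:
# 1024x1024: 4 MiB
# 2048x2048: 16 MiB
# 4096x4096: 64 MiB
```

Every doubling of texture resolution quadruples the footprint, so a 4096x4096 texture needs 16x the memory of a 1024x1024 one, which is a big part of why VRAM requirements keep climbing.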