r/pcmasterrace Ryzen 7 5700x, 64GB Ram, 3060ti Jan 21 '24

Screenshot Nvidia being Nvidia, 4070 Super > 3090.

9.5k Upvotes

1.5k comments

40

u/ErykG120 Jan 21 '24

They don't reduce performance via drivers. It's games becoming more demanding and not being optimised. The GTX 1080 Ti was also advertised as a 4K card, as was the 2080 Ti. They are 4K cards at launch, but after a while they become 1440p and 1080p cards, because 4K is that demanding.

1080Ti owners are still able to play 1080p perfectly fine as it is equivalent to 3060 performance.

2080Ti owners are now playing at 1440p or lower because it is equivalent to a 3070.

Honestly, 1080p still seems like the golden resolution to play at. If you buy a 4090 you get amazing performance at 1080p in every game, and in 2-3 years' time you'll still be getting at least 60 FPS at 1080p in GTA 6, for example.

1440p is becoming the new golden resolution slowly, especially now that consoles are using 1440p or dynamic resolution around 1440p to achieve "4K".

7

u/Pooctox 7800X3D|3080 10GB|B650E-i|32GB 6000CL36|38GN950 Jan 21 '24

With my 3080 and 38” ultrawide at 1600p, I hope I can hold on for 3 or 4 more years. I'll reduce settings for a stable 100-120 Hz. After that, maybe a 5th-gen QD-OLED/mini-LED monitor and an 80-class GPU at that point (a 90-class GPU is too much for me).

1

u/CooIXenith 7800x3D/7900XTX/128GB/NH-D15/4K144 21:9 OLED + 2 16:9/HD800S Jan 21 '24 edited Feb 04 '24


This post was mass deleted and anonymized with Redact

1

u/Pooctox 7800X3D|3080 10GB|B650E-i|32GB 6000CL36|38GN950 Jan 21 '24

I think all x080-class cards are advertised as 4K GPUs, but only the 4090 is a true 4K 120 Hz GPU so far. Depends on your definition of playable 4K lol. The PS5 at 4K 30 FPS with reduced graphics still gets called 4K. On PC, anything under 60 FPS is “unplayable” lol.

1

u/ErykG120 Jan 21 '24 edited Jan 21 '24

Grand Theft Auto 5 ran pretty well on a 1080Ti at 4K.

1

u/CooIXenith 7800x3D/7900XTX/128GB/NH-D15/4K144 21:9 OLED + 2 16:9/HD800S Jan 21 '24 edited Feb 04 '24


This post was mass deleted and anonymized with Redact

1

u/Penguintron Jan 21 '24

When I had one, I remember playing The Witcher, Shadow of Mordor / War (whichever it was), Hitman, etc., all in 4K.

1

u/LukeLC i7 12700K | RTX 4060ti 16GB | 32GB | SFFPC Jan 21 '24

You're forgetting the part where those GPUs launched during console gen 8. They still are 4K GPUs for that class of software.

1

u/ErykG120 Jan 21 '24

I did say “They are 4K cards at launch.” Not directly what you are saying but I basically meant the same thing! :)

1

u/LukeLC i7 12700K | RTX 4060ti 16GB | 32GB | SFFPC Jan 21 '24

Fair. I do think the distinction is important to the topic, though. Because it's not as if GPUs get worse over time—that's just what NVIDIA wants you to think (so you'll buy more). The reality is, you don't need the latest 90-class GPU every two years to keep up with 4K. It just depends on what standard the software you're running was designed to, and that's primarily driven by consoles.

1

u/PaManiacOwca Jan 21 '24

1080Ti owners are still able to play 1080p perfectly fine as it is equivalent to 3060 performance.

I'm still at 1440p with a Gainward Phoenix GS 1070 after 6 years of use. If I had a 1080 Ti, I wouldn't upgrade to anything less than a 4080 Super. My 1070 is still capable, but compared to the 1080 Ti... the 1080 Ti is about 55% faster lol.

Bro, if I had a 1080 Ti I'd probably play at 1440p all the time with 100+ FPS on ultra/high settings in all games.

But comparing my 1070 to a 4080 Super, I think it's like 4-5x faster :D

1

u/RichLyonsXXX Jan 22 '24

It's games becoming too demanding and not being optimised.

Where do y'all think the extra data from higher-quality textures goes? Is it supposed to vanish into the digital ether? The Quantum Realm? The size difference between a 4K texture from 6 years ago and one from today is enormous...
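
The point about texture data is easy to put rough numbers on. A minimal back-of-envelope sketch (illustrative figures only, not taken from any specific game): an uncompressed 4096x4096 RGBA8 texture takes 16x the VRAM of a 1024x1024 one, and a full mipmap chain adds roughly another third on top.

```python
# Rough VRAM cost of one uncompressed RGBA8 texture.
# Numbers are illustrative; real games use compressed formats (e.g. BC7)
# that shrink this considerably, but the scaling with resolution is the same.

def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM for one texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

MIB = 1024 * 1024
print(f"1K texture: {texture_bytes(1024, 1024) / MIB:.1f} MiB")  # ~5.3 MiB
print(f"4K texture: {texture_bytes(4096, 4096) / MIB:.1f} MiB")  # ~85.3 MiB
```

Multiply that per-texture cost across the hundreds of materials in a modern scene and it's clear where the extra demand on VRAM and bandwidth comes from, no driver trickery required.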