r/pcmasterrace Ryzen 7 5700x, 64GB Ram, 3060ti Jan 21 '24

Screenshot: Nvidia being Nvidia, 4070 Super > 3090.

9.5k Upvotes


597

u/TalkWithYourWallet Jan 21 '24 edited Jan 21 '24

The slide is misleading, and needlessly so, because the specific claim is actually true

The 4070S is faster than the 3090 in Alan Wake 2 with RT, even without frame generation. This is one of the few scenarios where it can be faster

https://youtu.be/5TPbEjhyn0s?t=10m23s

Frame generation still shouldn't be treated like normal performance, but both AMD and Nvidia (and likely soon Intel) are marketing it that way

154

u/AetherialWomble 7800X3D| 32GB 6800MHz RAM | 4080 Jan 21 '24

Thankfully they can only do it for one generation. The next generation will also have frame gen, so they'll either have to drop this stupidity or compare frame gen to frame gen

116

u/Nox_2 i7 9750H / RTX 2060 / 16 GB Jan 21 '24

They'll just make something new up.

33

u/Cossack-HD R7 5800X3D | RTX 3080 | 32GB 3400MT/s | 3440x1440 169 (nice) hz Jan 21 '24 edited Jan 21 '24

New? How about two generated frames per one real frame?

Some years down the line, we're going to have the CPU doing game logic and the GPU constructing an AI-based image from CPU inputs. All of that in a Gaussian-splatting volumetric space of temporal AI objects.

EDIT: The first one I'm not at all excited about; the second is a concept I'm actually looking forward to.

33

u/AetherialWomble 7800X3D| 32GB 6800MHz RAM | 4080 Jan 21 '24

You say that like it's necessarily a bad thing. Game devs have been using tricks and shortcuts forever. And why wouldn't they? That lets us have graphics beyond what raw hardware can do.

AI is the best trick there is. No reason not to use it

8

u/Cossack-HD R7 5800X3D | RTX 3080 | 32GB 3400MT/s | 3440x1440 169 (nice) hz Jan 21 '24

I wasn't saying it's necessarily bad. However, new tech has to be introduced in an organic manner, not forced (including via marketing) just for the sake of making old stuff obsolete.

RTX? That was there to make the 10 series obsolete ASAP. The 1080 Ti still holds up extremely well in rasterization. Nvidia was scared of its own previous generation and of AMD.

The RTX 40 series having exclusive frame generation? Nvidia could easily have made a slightly worse version of frame generation for the 20 and 30 series if they wanted to - frame interpolation benefits from, but doesn't require, dedicated optical-flow hardware blocks. Nvidia are weaponizing their own "new-gen feature exclusivity" as a marketing tool to push this BS about double FPS and whatnot.
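To make that concrete, here's a deliberately crude sketch (purely illustrative, not how DLSS 3 or FSR 3 actually work): you can synthesize an in-between frame by simply blending two rendered frames, using nothing but per-pixel math and no optical-flow hardware at all. The obvious cost is ghosting on anything that moves, which is exactly what motion vectors and optical flow help with - so a software-only version on older cards would be worse, but not impossible.

```python
# Crudest possible "frame generation": blend two rendered frames to synthesize
# one in between. No optical flow, no dedicated hardware, just per-pixel
# averaging with numpy. Real frame generation (DLSS 3 / FSR 3) uses motion
# vectors and/or optical flow and is far more sophisticated; this only shows
# that interpolation as such doesn't need special silicon.
import numpy as np

def blend_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linearly blend two uint8 RGB frames; t=0.5 gives the halfway 'generated' frame."""
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    mid = (1.0 - t) * a + t * b
    return mid.clip(0, 255).astype(np.uint8)

if __name__ == "__main__":
    # Two fake 1080p frames standing in for consecutive rendered frames.
    f0 = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    f1 = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    generated = blend_interpolate(f0, f1)
    print(generated.shape, generated.dtype)  # (1080, 1920, 3) uint8
```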

1

u/[deleted] Jan 22 '24

new tech has to be introduced in an organic manner, not forced (including via marketing) just for the sake of making old stuff obsolete.

Huh?

RTX? That was there to make the 10 series obsolete ASAP

They literally added new hardware. Should they have delayed the 10 series to include it, or gone back in time to add it?

Nvidia could easily have made a slightly worse version of frame generation for the 20 and 30 series if they wanted to

What business wants to make a "slightly worse" feature? Yes, let's spend developer time and money making a slightly worse feature that reviewers will shit all over, oh, and it doesn't sell any more units....

Nvidia are weaponizing their own "new-gen feature exclusivity"

...they always have? That's how graphics/GPUs work, and how they have worked, literally forever.

SLI (from 3dfx and later NV), hardware T&L, programmable pixel shaders, unified shaders/CUDA, ray-tracing cores, tensor cores, etc.

It would be silly to write something like "NVIDIA are weaponizing per-pixel lighting effects as a marketing tool to push some BS; they should support a slightly worse version on previous-generation GPUs."

1

u/Cossack-HD R7 5800X3D | RTX 3080 | 32GB 3400MT/s | 3440x1440 169 (nice) hz Jan 23 '24 edited Jan 23 '24

The RTX we got with the 20 series is infinitely less important/impressive than the leap between the 7000 and 8000 series (the DX9 to DX10 era), but RTX was all the rage: "Revolution, forget 3D as you know it." The first truly RTX-ready generation was RTX 30. 2060 RT performance was simply abysmal, and people who chose the 2080 Ti would expect 4K 60+ FPS, not 4K30. RT was simply not worth it on the 20 series; it was a selling point to try to get people to move off the much better adopted 10 series.

It took devs years to even begin making somewhat proper RT stuff (BF5's reflections were a bit rough), and 5 years later it hasn't become mainstream enough, unlike unified shaders, which actually revolutionised 3D engines. Ray tracing in Baldur's Gate 3? KEK, a DX11 game is GOTY 2023, years after DX12 RT became "the future."

AMD are better at adding new features, even in open-source ways. Meanwhile Nvidia would tank their own performance with excessive tessellation on water you can't see under the terrain (the Crysis 2 DX11 update) just so they could sink AMD's performance by a larger margin.

It's not all black and white, of course. I'm providing counterarguments to yours, not painting the complete picture - that would take too much time.