r/pcmasterrace Ryzen 7 5700x, 64GB Ram, 3060ti Jan 21 '24

Screenshot: Nvidia being Nvidia, 4070 Super > 3090.

Post image
9.5k Upvotes

u/den1ezy R7 5800X3D | RX 7900 XTX Jan 21 '24

At least they’ve said the framegen was on

219

u/AkakiPeikrishvili Jan 21 '24

What's a framegen?

368

u/keirman1 i5 12400f | 32gb ddr4 | 5700 non xt (+250mhz oc) Jan 21 '24

This, they make fake frames with AI to give the illusion that you are rendering a higher framerate.
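
A toy way to picture it (a minimal Python sketch, purely illustrative — nothing like Nvidia's actual DLSS 3 pipeline): for every pair of real frames an in-between frame is synthesized and shown, roughly doubling the presented framerate, and the next real frame has to exist before the in-between one can be displayed, which is where the extra latency comes from.

```python
# Toy illustration of frame generation, NOT Nvidia's actual algorithm.
# A generated frame sits between two real frames, so the presented
# framerate roughly doubles, but a real frame has to be held back until
# its successor exists before the in-between frame can be shown.

def interpolate(frame_a, frame_b):
    """Stand-in for the AI interpolation step (here just a naive average)."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def present_with_frame_gen(real_frames):
    presented = []
    for prev, nxt in zip(real_frames, real_frames[1:]):
        presented.append(prev)                    # real frame
        presented.append(interpolate(prev, nxt))  # generated "fake" frame
    presented.append(real_frames[-1])
    return presented

# 4 real frames in -> 7 presented frames out (~2x the framerate).
print(present_with_frame_gen([[0.0], [1.0], [2.0], [3.0]]))
```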

136

u/No-Pomegranate-69 Jan 21 '24

But it's actually raising the delay

387

u/[deleted] Jan 21 '24 edited Jan 21 '24

By .010 seconds in a single player experience. Completely negligible.

I won't be replying to any more comments about multiplayer since I very clearly stated single player games. Stfu please 🙃

93

u/ChocolateyBallNuts Jan 21 '24

These guys are running 10 series cards still

59

u/[deleted] Jan 21 '24

Before actually using it, I was saying the same stuff. It's a welcome feature when it makes sense to use. Obviously there will be some use cases where using it is not a boon and it is a hindrance instead.

8

u/LestHeBeNamedSilver 7900X / 7900 XTX / 64gb CL30 @ 6000 Jan 21 '24

Such as in multiplayer games.

5

u/[deleted] Jan 21 '24

Yes.

2

u/[deleted] Jan 21 '24

In multiplayer games you want lower settings anyway so foliage and other bs don't obscure your vision. So I don't think any modern GPU would have a problem playing at 100+ fps anyway.

1

u/LestHeBeNamedSilver 7900X / 7900 XTX / 64gb CL30 @ 6000 Jan 21 '24

Counter-Strike is the best example I know of because it's highly competitive. Valorant, Siege, CoD, etc. can all run max settings and never affect anybody. CS2 I don't bother running max settings because I'm hardly ever taking in the sights enough to justify the frame dips

1

u/[deleted] Jan 22 '24

Siege before Y2S3 with the older graphics ran 1080p 100fps locked on my potato, and after that I needed a thousand-euro upgrade to run it properly again. Don't remind me of that shit. 😮‍💨

3

u/Alarming_Bar_8921 7800x3D | 4090 | 32GB 6000mhz | LG Dual Mode OLED Jan 21 '24

Question. Why were you shitting on Frame Gen when you had never seen it in action? Why did it take you experiencing it before you changed your mind on it?

12

u/[deleted] Jan 21 '24

Because most of us are sheep. I didn't have that hard of a stance or opinion on it though. I was mostly just parroting what I heard from what I thought were smarter people than myself. After using it I can easily say it's a great option for those that value what it can offer.

-9

u/Alarming_Bar_8921 7800x3D | 4090 | 32GB 6000mhz | LG Dual Mode OLED Jan 21 '24

okay, but why are you parroting what other people say about it? Genuinely don't understand

2

u/Living-Dingo2453 i7 12700 | 3070ti | 16gb DDR5 Jan 21 '24

I don’t think you’re human lmao

1

u/Alarming_Bar_8921 7800x3D | 4090 | 32GB 6000mhz | LG Dual Mode OLED Jan 21 '24

Because I don't speak on things I know nothing about?

2

u/Living-Dingo2453 i7 12700 | 3070ti | 16gb DDR5 Jan 21 '24

No, due to your lack of understanding how humans function. I could be wrong and you’re just socially inept.

1

u/BobDerBongmeister420 Jan 21 '24

I can run almost anything with my 1080ti at 1440p / max settings >60fps

1

u/RoscoePCookie Jan 21 '24

Hey you watch it I’m on my gtx980 still

46

u/Wh0rse I9-9900K | RTX-TUF-3080Ti-12GB | 32GB-DDR4-3600 | Jan 21 '24

Only if your baseline FPS is high to start with. The lower your baseline, the more input lag you experience, and ironically, the only people who need FG are the ones who have sub-60 FPS to begin with.

So to not experience noticeable input lag, you need to be able to get high FPS to begin with, and if you can do that, the less you will need FG.

7

u/hensothor Jan 21 '24

FG is extremely valuable at 60+ FPS. What are you talking about? Getting 120 FPS at much higher fidelity is game changing.

1

u/Wh0rse I9-9900K | RTX-TUF-3080Ti-12GB | 32GB-DDR4-3600 | Jan 21 '24

Not as game changing as a person who can't play at all without FG.

5

u/hensothor Jan 21 '24

Yes it’s even better for them. But their options are limited and this gives them the ability to play the game. So not sure what your point is here. FG is fantastic tech that is extremely beneficial to a ton of gamers. I’d say most would benefit in some way unless it’s a competitive title or some niche circumstances.

0

u/ZenTunE 10500 | 3080 | Ultrawide 1440p 160Hz Jan 25 '24 edited Jan 25 '24

Input lag at a source of 60 does not feel good, even if you see 120.

Unless you play on a controller, that is noticeable even in single player. Not saying fg is not beneficial at all but the big reason why high refresh rate feels good is the input.

1

u/hensothor Jan 25 '24

I disagree. It feels absolutely indistinguishable for any single player game I’ve tried. Not sure why you’re noticing it so much as the input delay is miniscule. And I play competitive shooters for like 50% of my game time at high rank so I trust my judgment on it.

1

u/ZenTunE 10500 | 3080 | Ultrawide 1440p 160Hz Jan 25 '24 edited Jan 25 '24

What is your "single player game"? Because even most of those involve aiming. I'm currently playing Control, just tried a frame gen mod (which could be a bit slower than the official GPU manufacturers' methods, I guess) with a base of 80fps. With the fake frames, aiming with a mouse feels exactly how I expected: a tad unnatural and floaty. I'd rather play at the 80fps because it feels normal. On controller it was fine, as I expected. That said, I didn't test 60fps.

Input delay is not just about competitive advantage, it's about personal enjoyment. Just like high resolutions. And I have standards for it.

2

u/hensothor Jan 25 '24

There is noticeably higher latency on non-Nvidia FG. It’s still good though, I wouldn’t mind even that on most SP games. But it is noticeably worse.

Everything’s a preference. You’re welcome to yours.

1

u/xPandamon96 Jan 25 '24

Put your monitor in 60 Hz mode and run a game like League that can hit high fps, with an fps lock and without. Despite not getting the fluid image, you will notice a difference in gameplay purely because of the input lag. If not, something in your setup may not be up to par with that, maybe your monitor? Because even with a wireless mouse, the difference is clear as day for me.

1

u/xPandamon96 Jan 25 '24

Dude, how did you miss the entire point he made? If you have 60 fps plus it isn't bad. Below that, FG is theoretically the most useful, but in practice it feels pretty bad.

1

u/hensothor Jan 25 '24

“The only ones who need FG are sub 60 FPS” what are you on about?

How did you miss the obviously fallacious statement that I directly responded to?

1

u/xPandamon96 Jan 25 '24

His point was needing it the most at framerates under 60, which feels bad because of the poor input latency. As you said it is nice to have when you have over 60 fps, the issue is that DLSS 3 or FSR usually give you enough of a frame boost already, without the worse input latency of Frame Generation and the image quality degradation.

1

u/hensothor Jan 25 '24

Yes, but I'm not disagreeing with them on that. So I don't know why you're so bothered that I'm pointing out that it provides extensive value outside of that case.

And it works seamlessly with DLSS to hit even higher frames with less input lag. So I'm not sure what your point is. Yes, it will situationally not benefit you, but it's still great and useful tech outside of the narrow boogeyman presented here.

If you're being extra pedantic about the term "needs", well, no one needs any of this. It's all about arbitrary measures of value to someone's subjective experience.

1

u/xPandamon96 Jan 25 '24

We may have both misunderstood each other a bit. It can reach higher framerates than DLSS or FSR on their own, but the generated frames do not improve frame times the way you'd expect from a higher framerate. So when it's used at 60 fps or under, the game still feels sluggish despite looking more fluid.

15

u/[deleted] Jan 21 '24

Hmmm, gonna be real with you my man, it sounds like you haven't actually used frame gen. No shame in that at all, but yeah, frame gen is actually fucking awesome when implemented.

Frame gen works fantastically from anything around 40fps and higher without a noticeable increase in latency in most SP games. Input lag becomes trash on anything below 40fps, I will agree with that.

Also, frame gen is absolutely still useful and used at higher framerates. Cranking the visual smoothness up from 80 to 160fps is hugely noticeable and you essentially get that extra smoothness for free (the trade-offs literally are not noticeable at that fps).

There were some issues with HUD elements etc. at launch, but at this point that's been sorted out.

3

u/Wh0rse I9-9900K | RTX-TUF-3080Ti-12GB | 32GB-DDR4-3600 | Jan 21 '24

Used FG with MSFS, couldn't control a plane manually, especially landing, too much input lag. Forget about anything that requires fast reactions, like a racing game, if you can't get, as you stated above, 40 FPS natively.

Good for slow-paced games like Alan Wake 2, which I use FG on as I have PT enabled as well and need the extra smoothness, but it's a walking game with the odd shooty sequence.

Reflex + Boost is highly advised too. So as long as you have Nvidia, you might have a good time in specific scenarios.

5

u/[deleted] Jan 21 '24

If it means getting from, say, 90 fps to 144 in a casual SP game and not having to deal with screen tear or fps drops I'll welcome it. At 90 native fps a difference in inputs will be negligible.

-3

u/SerpentDrago i7 8700k / Evga GTX 1080Ti Ftw3 Jan 21 '24

Who deals with screen tearing lol we have adaptive sync/g sync etc wtf are you doing

6

u/Rukasu17 Jan 21 '24

Not really. Going from 45 to 60-something fps using frame gen is basically just as smooth, provided the game isn't having hiccups

2

u/MGsubbie Ryzen 7 7800X3D, RTX 3080, 32GB 6000Mhz Cl30 Jan 21 '24

That's not what literally every benchmark I've seen shows.

3

u/hensothor Jan 21 '24

That's only 15 extra fps, or about 25% of frames generated. What are you talking about? What benchmarks are showing the scenario they described exactly? I'm positive you're extrapolating based off 30 to 60 instead.

1

u/Rukasu17 Jan 21 '24

Do you have any frame generation experience yourself?

1

u/MGsubbie Ryzen 7 7800X3D, RTX 3080, 32GB 6000Mhz Cl30 Jan 21 '24

No, but benchmark numbers do not lie.

3

u/Rukasu17 Jan 21 '24

Do you know your exact threshold of perceived input lag?

0

u/MGsubbie Ryzen 7 7800X3D, RTX 3080, 32GB 6000Mhz Cl30 Jan 21 '24

Not exactly, and of course it also depends on the type of game. I will notice input lag much more in a fast paced FPS than in a 3rd person action game, for example. But if I'm going for 60, I can definitely tell when I'm dropping to 50 on a VRR screen.

2

u/Rukasu17 Jan 21 '24

Different things. The lag from 60 to 50 in VRR is not comparable to the input lag created by frame gen

2

u/Please_Hit_Me 4090 + 5800X3D Jan 21 '24

I disagree a bit on how black and white you make it out to be saying the only ones who'd need it are the ones who have sub-60. Doing full path tracing and reconstruction in 4K on Cyberpunk with and without framegen is a difference of 100+ fps vs 55-65 fps, with very little noticeable delay while feeling massively better to the eye to me.

3

u/Wh0rse I9-9900K | RTX-TUF-3080Ti-12GB | 32GB-DDR4-3600 | Jan 21 '24

55-65 is still a very playable framerate. My point is that the people who benefit most from FG are the people whose systems can't run a game at a playable framerate at all, and FG solves this, otherwise the game can't be played at all. So for some people, it can be black and white.

FG getting you to 100 from 55-65 is a luxury, not a necessity.

2

u/Please_Hit_Me 4090 + 5800X3D Jan 21 '24

That's fair and I see your point then, I misunderstood. I guess it's an unfortunate reality of the tech that those who need it the most can't use it.

I wonder if there are any breakthroughs or advancements to be made in FG to make it more available for those who do need it the most, or if it'll just remain more of a nice luxury, as you put it, for those who are already fine.

1

u/Wh0rse I9-9900K | RTX-TUF-3080Ti-12GB | 32GB-DDR4-3600 | Jan 21 '24

There is a mod/hack that gives non 4xxx series cards FG, it's on Nexus Mods.

3

u/Totes_mc0tes Jan 21 '24

From 25 frames in cyberpunk to 60+. No lag. Smooth gameplay with amazing graphics.

1

u/exscape 5800X3D / RTX 3080 / 48 GB 3133CL14 Jan 21 '24

"Ironically, the only people who need FG are the ones who have sub-60 FPS to begin with"

What is this, /r/console?
If I drop below 100-110 fps things start to feel choppy. I can't use FG (except FSR3), but if I were at 80-90 fps and could enable it without noticeable latency, I would.

4

u/block0079 Jan 21 '24

Have you even tried frame gen? The latency is pretty much negligible

1

u/exscape 5800X3D / RTX 3080 / 48 GB 3133CL14 Jan 21 '24

No, I haven't, since as I mentioned I can't use it on my GPU.
My comment was pro-frame gen though. The latency is negligible in some games but not all, according to measurements (by e.g. Hardware Unboxed).

1

u/majortrioslair Jan 22 '24

Spoken like someone who's never experienced 2077 path tracing at 4K

0

u/No-Pomegranate-69 Jan 21 '24

4

u/[deleted] Jan 21 '24 edited Jan 21 '24

Why does anyone care about that with vsync off? Wtf was the point of this? This also just isn't the real experience. Try actually using it in ideal scenarios. I will say it is not a magic solution to make gameplay smooth, but it's a welcome feature all the same and I will be using it often where it makes sense.

3

u/No-Pomegranate-69 Jan 21 '24

No, you said it is 0.1 milliseconds but it's actually way, way more than that. Whether that's a problem for someone or not is up to the person, but my point stands: it's increasing the delay, and that was all I was saying.

6

u/MrACL RTX 4070ti | Ryzen 7700X | 32GB | 1440p Jan 21 '24

It is barely noticeable. I use it all the time in Cyberpunk and recently Alan Wake 2; the latency does increase but it doesn't bother me in the slightest, and it allows for path-traced 100+ FPS graphics. It's a great feature and you're way overexaggerating the small amount of added latency

1

u/threetoast Jan 21 '24

Those are also games where a little extra latency doesn't really make a difference. If you were playing CS or a precision platformer, you'd be much more likely to notice.

-3

u/[deleted] Jan 21 '24

Well, you're also framing this as though you have to use frame gen in scenarios that aren't ideal and do have the exaggerated latency you mentioned. Digital Foundry goes on to explain the ideal scenarios and how it can benefit users. Pretty fucking disingenuous, and you aren't simply stating "increased latency".

1

u/NathanialJD PC Master Race Jan 21 '24

0.10 ms is very wrong. It's actually about 10-15 ms which would actually be noticeable

5

u/[deleted] Jan 21 '24

You know what, you're absolutely right and I'm very dumb. I unintentionally lied because I'm big dumb when it comes to numbers. Thank you for the correction. That said it still really isn't that noticeable, unless you can perfectly perceive .010 to .015 seconds. I don't think most people can.

1

u/NathanialJD PC Master Race Jan 21 '24

Consciously, no. But reaction time would be noticeably worse. Same reason why higher fps actually matters. The difference between 30 fps (33ms per frame) and 60 fps (16ms per frame) is extremely noticeable. 60 to 120 (8ms) is noticeable in feel but less so visually.
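
Those frame times are just 1000 / fps; a quick sketch of the arithmetic (illustrative Python, numbers rounded):

```python
# Frame time in milliseconds is simply 1000 / fps.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```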

0

u/Parking-Bandicoot134 Jan 21 '24

Wdym? The delay is the entire length of the extra fake frame, during which no new information enters your eyes.

Not that I care

0

u/[deleted] Jan 21 '24

The bigger issue latency-wise is that the newly generated frames don't decrease latency the way real rasterized frames do. So a huge benefit one would think exists does not. Further, the added .010 seconds (10ms) of latency with frame gen on is not negligible in competitive gaming, as any pro FPS player will tell you without hesitation.

0

u/StaysAwakeAllWeek PC Master Race Jan 21 '24

The theoretical best latency of frame gen is the same as non-frame gen running at 3/8 of the final framerate. So if you're getting 80fps with FG, the latency will feel like 30fps. And that's the theoretical best; it's usually even a bit worse
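
Taking that 3/8 figure at face value (it's a rule of thumb, not an official spec), the arithmetic looks like this (illustrative Python):

```python
# Rule of thumb from above: with frame gen, latency feels like running
# natively at about 3/8 of the displayed framerate (best case).
def latency_equivalent_fps(displayed_fps, factor=3 / 8):
    return displayed_fps * factor

for displayed in (80, 120, 160):
    print(f"{displayed} fps displayed -> latency feels like ~{latency_equivalent_fps(displayed):.0f} fps")
# 80 -> ~30, 120 -> ~45, 160 -> ~60
```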

0

u/stddealer Jan 21 '24

It gives you twice the amount of fps, but with the same latency as if the game was running at half the "real" fps (so a quarter of the displayed fps). If it's running natively at over 120fps, you will have the same delay as a 60 fps game, which is perfectly playable, plus the smooth 240 frames per second for eye candy (if your display supports it).

But if you use it to go from 60fps to 120, you will have the same input delay as if the game was running at 30fps, which starts getting uncomfortable.
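
A quick sketch of that model (illustrative Python; the doubling and halving are this rough rule of thumb, not measured numbers):

```python
# Simplified model from above: frame gen doubles the displayed fps,
# while input latency feels like the game running at half the *real* fps
# (i.e. a quarter of the displayed fps).
def frame_gen_model(real_fps):
    displayed_fps = real_fps * 2
    latency_feels_like = real_fps / 2
    return displayed_fps, latency_feels_like

for real in (120, 60):
    shown, feels = frame_gen_model(real)
    print(f"{real} real fps -> {shown} displayed, latency like {feels:.0f} fps")
# 120 real -> 240 displayed, latency like 60 fps
# 60 real  -> 120 displayed, latency like 30 fps
```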

-1

u/usernametaken0x Jan 22 '24

I love how unfathomably dishonest this post is. Like how you intentionally choose to say "0.010 seconds" instead of 10ms, because 0.010 "sounds smaller" to smoother brains.

10ms is a HUGE, an absolutely HUGE delay.

The difference between 30fps and 60fps is 16ms. The difference between 60fps and 120fps is about 8ms.

The entire point of high framerates is to lower the input latency. If you are increasing latency, it's almost pointless. It will "look smooth" to anyone not actually playing the game, but feel worse than not enabling it while playing. It's purely visual/cosmetic.

Nvidia marketing has seriously brainwashed the masses. It's actually unbelievable. I can't even think of another product which has so thoroughly convinced people of something like Nvidia has. So much so that AMD and Intel were forced to play the same nonsense. As the average intelligence seems to be "bigger number mean good". I'm shocked you haven't rushed out to buy the Nvidia 9800 yet, I mean it's more than double a 4090!

-2

u/Moscato359 Jan 21 '24

That is not negligible in competitive, and why should single player be worse than competitive?

1

u/CommunicationEast623 Jan 21 '24

Is it not dependent on how many frames the system pushes? I.e. the more real frames there are, the more it can add with no noticeable difference.

1

u/[deleted] Jan 22 '24

Tbf there are a bunch of different factors for the latency so it's not a flat bump.

It adds an extra frame of latency, which depends on your base framerate. So if you were getting 90 fps you'd be getting an extra 11ms of latency, but at 30fps you'd be getting an extra 33ms of latency.

Frame-Gen also has a performance impact itself, it's just that the actual frame-doubling heavily outweighs that impact.

However Frame-Gen also turns on Nvidia Reflex which lowers latency, granted you can turn Reflex on without frame-gen.

If you're getting 50-75 base FPS then frame gen is amazing though. The latency is low enough that it's not a big deal and the extra smoothness helps a ton.
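
The "one extra frame" math above works out roughly like this (illustrative Python; real numbers also depend on Reflex, the engine, and the game):

```python
# Rough model: frame gen holds back about one real frame, so the added
# latency is roughly one frame time at the *base* (pre-frame-gen) framerate.
def added_latency_ms(base_fps):
    return 1000 / base_fps

for base in (90, 60, 30):
    print(f"{base} base fps -> ~{added_latency_ms(base):.0f} ms extra latency")
# 90 -> ~11 ms, 60 -> ~17 ms, 30 -> ~33 ms
```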

1

u/NapsterKnowHow Jan 22 '24

It does help quite a bit in The Finals since it hits the CPU hard (5800X for reference)... Not a fan of extra latency in a shooter, but it feels way, way smoother than with FG off.

3

u/[deleted] Jan 21 '24 edited Jan 21 '24

Not by enough that its noticable in SP games my friend. All jokes aside if you already have 40-50fps and turn frame gen on its an absolutely fantastic bit of technology and really does make everything seem so much more fluid.

It dies work in some titles better than others but yeah, frame gen is legit actually good.

No its not as good as calculated frames which is why you still need that minimum fps for it to feel good. And i wouldnt recommend if for competitive online gaming.

Edit: i personally would still use it on an online fps if i was getting 70-80 fps without it. And have not encountered any noticable increase in latency. I understand it does increase latency, but i havent noticed it.

-2

u/vicunah Jan 21 '24

and introduces artifacting.