Reported by:

Tech Jesus reveals NVIDIA has peaked: the new $2k 5090 performs 30% better in 4K games and shows negligible improvement in 1080p and 1440p games :marseyxd:

https://youtube.com/watch?v=VWSlOC_jiLQ

!g*mers laugh at Nvidia-cels: a $2000 GPU still can't play Cyberpunk at 4K 60 fps with gay tracing. Have fun paying $2000 for a DLSS software update.

Oh yeah remember this?

https://i.rdrama.net/images/1737649358wTWxZCIMAri7rw.webp

Total fricking lie. By "4090 performance" they mean the 5070 can use 4x frame generation to match the framerate, but frame gen also increases latency to around 60 ms, meaning the game will play like utter dog shit despite looking smooth.
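Rough napkin math on why the counter goes up but the feel doesn't (illustrative numbers only; the 60 ms figure is the claim above, not a measurement, and real added latency depends on the game and settings):

```python
# Sketch: 4x multi-frame generation multiplies the FPS counter, but input
# is only sampled on real frames, and interpolation holds back a rendered
# frame, adding roughly one real-frame interval of latency. Illustrative.

base_fps = 30                       # frames the GPU actually renders
gen_factor = 4                      # 1 real frame + 3 generated frames

displayed_fps = base_fps * gen_factor
base_frame_time_ms = 1000 / base_fps
added_latency_ms = base_frame_time_ms   # ~one held-back real frame

print(displayed_fps)                            # 120 on the FPS counter
print(base_frame_time_ms + added_latency_ms)    # ~66.7 ms render-side delay
```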

https://i.rdrama.net/images/1737649358ME3qnmE93-98Qg.webp

It's kinda funny how the cope went from "well 60 fps doesn't matter" to "well it's 400 fps, who cares if the game has 60 ms latency"

!fosstards !linuxchads Nvidia continues to suffer the curse of proprietary BS

Jump in the discussion.

No email address required.

>nearly a tenth of a second latency

:marseyconfused: Do AAAAA sloppatards really?


60 ms of latency is awful. That's on top of the in-game lag too. Having 8ms of latency is considered nearly unplayable in a game like CS2. If you're a total casualtard it doesn't matter, I guess, the same way 30 fps doesn't matter, but it feels like shit to play lol. Like 5 frames of input lag can be unplayable for more skill-based genres like shoot em ups or rhythm games or fast-paced action games (not souls slop). It's funny to me watching professional tech reviewers like LTT be like "yeah frame gen adds 60 ms of latency and has horrible artifacting but the average consumer won't notice so who cares"
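"Frames of input lag" only means something at a given refresh rate; a trivial sketch of the conversion:

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert frames of input lag to milliseconds at a given framerate."""
    return frames * 1000 / fps

# The "5 frames" from the comment above, at common refresh rates:
print(frames_to_ms(5, 60))    # ~83.3 ms at 60 Hz
print(frames_to_ms(5, 144))   # ~34.7 ms at 144 Hz
```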


!oldstrags

Having 8ms of latency is considered nearly unplayable

Come laugh at this


I 'member playing Counter Strike with 600 ms on dial-up :#boomermonster:

!oldstrags , what's the worst ping you've ever played with?


I killed Sarth + 3 Drakes in WoW while it was current with 1 full second of lag. I had shitty apartment cable and had to basically be predicting everything 1 second earlier. I'm a bad butt. :marseyskater:


That's server lag not fps lag


:marseykneel:


300ms. Beyond that it's actually unplayable


60 ms go fricking choke on a peepee you dialup cute twink

Low Ping Bastards for Life


CS 1.6 was like the only game with Australian servers; everything else I played at 300 ms to US west coast on dial-up, then when we got ADSL that went down to 230. I used to play tournaments in Unreal Tournament at 230.


I am so confused, the full latency from mouse/kb to screen response is usually 40 ms in CSGO?


I'm a latent homosexual, just give me 8ms.


He's right but in a way more derogatory to the average consumer than he intends.


That's on top of the in-game lag too

5 frames of input lag can be unplayable for more skill-based genres like shoot em ups or rhythm games or fast-paced action games

Make your bait less r-slurred in the future please


I use DLSS all the time and don't notice :marseytransattentionseeker: any input lag. Maybe :marseymight: I'm just r-slurred :marseycrayoneater: idk.


This is frame gen, not DLSS. DLSS still requires a decent base resolution and framerate to look good, and frame gen doesn't, so Nvidia is pushing it despite it being awful


:marseysmug2:


if my ping is above 100 ms I get mad, I expect my monitor to be way faster than the connection between my PC and the server


:marseyindignant: The AI Demon in your computer needs time to think about what you've demanded of it.


Still keeping my 3090 I guess lol


A 4090 would prob be worth it if it goes down in price due to the 5090 launch. Unless you are doing dedicated AI models a 5090 is fricking pointless.


>thinking 4090s will not further appreciate in value

:marseydarkxd:


RIP, I'll prob fully switch to AMD now. I was interested in NVIDIA's Reflex technology, which can reduce latency to like 2 ms when not using frame gen, but man, the pricing is insane.


Just buy a 7900 XTX for less than half the price. I think I paid $900 for the Sapphire one


AMD gpus are a meme, dont bother


Yeah I snagged one at MSRP near release and it's been a great purchase lol. I did NOT expect for it to hold its value like it has. At this rate I'll get another 2-3 years out of it before I even need to consider upgrading.


:marseydepressed: I could have done the same but didn't. Feels like I missed out on the second coming of the 1080 Ti


I do use the 24GB VRAM for LLMs and other docker nonsense but last I checked there wasn't much performance gains in 1440p between the two.

The 4090 will have to really drop in price.
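For the LLM use case, a back-of-envelope VRAM check is just parameter count times bytes per parameter (weights only; KV cache and runtime overhead come on top, so treat this as a lower bound):

```python
def vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Rough VRAM needed for model weights alone, in GB (ignores KV cache)."""
    return params_billions * bytes_per_param

# What fits on a 24 GB card, weights only:
print(vram_gb(13, 2))   # 26 GB at fp16 -- doesn't fit
print(vram_gb(13, 1))   # 13 GB at 8-bit -- fits with room for cache
```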


In a hypothetical world where you can get both for MSRP, it would be silly to pay $1,600 for the 4090 rather than $2,000 for the 5090.


1070 sisters we won


Anyone who actually knows about GPUs already knew this

The 5000 series has nearly the exact same specs as the 4000 series, be it core count, clocks, or RAM

They're betting it all on AI boosting DLSS performance

We're now looking at more than 5 years with no perf improvement from NVDA


We're now looking at more than 5 years with no perf improvement from NVDA

???? 30 series to 40 series was a massive jump

https://i.rdrama.net/images/1737680253W8NyidxR2x3jnA.webp


Yeah, now look at the 4060/4070/4080

They improved energy efficiency so they could boost clocks a little, which is nice, but fundamentally the cards are nearly the same. The 5070 has only 250 more cores than the 3070, not even 5%. The 4070 has the same core count as the 3070 too.
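Sanity-checking that core-count claim against publicly listed CUDA core counts (5888 for the 3070, 6144 for the 5070; worth verifying against Nvidia's current spec pages before relying on them):

```python
# Core-count delta between the 3070 and 5070, using publicly listed figures.
cores_3070 = 5888
cores_5070 = 6144

delta = cores_5070 - cores_3070
pct = 100 * delta / cores_3070
print(delta)           # 256 more cores
print(round(pct, 1))   # 4.3 -- "not even 5%", as the comment says
```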


the 4070's perf is the same as the 3090's with added power efficiency, that's a pretty good upgrade imo


Yeah but as I said it's exclusively because of the boosted clocks

Now we're getting nearly no clock or core increase, and no architecture improvement either

Nvidia is doing what Intel did 10 years ago, and it cost them big in the end


? This was still a jump even if it was pretty linear.


Stuff that runs flawlessly on existing tech still runs the same when more power is thrown at it :marseygigaretard:


Man I love the new era of shit-optimized AAA games that run like garbage until you turn on magic fake AI frames.


Which also run worse due to higher latency


And have stupid artifacting and pop-in


NMS needs this so fricking bad.


Still gonna replace my 1070 with literally any of the fricking 5000 series anyways or the fricking Intel one at its actual MSRP :marseygiggle:


don't fall for the [[[scamtel]]]


I've been on the fence between waiting for a 5000 series or just grabbing the 4070 TI Super before :marseytrump: :#rape: us all with tariffs. :marseyworried:

The 5000's are going to disappear instantly too. :#marseyitneverbegan:


Get a used 3080 or 6800xt tbh, those cards frick and r-slurs are dropping them for the new thing


@Ninjjer @lfyca have you seen any reports of defects in Intel's GPU lineup? They seem to actually be competing.


No, but:

  • I have a 13900K in my server and I'm really fricking mad about the hardware defect.

  • I worked with a bunch of ex-Intel guys and we had a support relationship with one of their product teams. The company is just a dumpster fire right now.

So yeah I'll pass on Intel for my next build. I'm going AMD.


I had an A750 for about a year, and it wasn't great. Certainly a good value, but it would spaz out any time the video mode changed.


Buying anything made by Intel is so fricking rslurred. They are the Boeing of the hardware world.


Boeing is 1/2 of the world's airplanes cute twink LOL


negligible improvements to 1080p and 1440p games

That's on Intel and AMD since it's CPU limited at those resolutions.


Yeah, is OP just r-slurred? Lower resolution is almost always CPU-bound; that's why CPU gaming tests use 1080p for benchmarks.
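The bottleneck logic behind that: frame rate is set by whichever of the CPU or GPU takes longer per frame, so a faster GPU only shows up once the GPU is the slow side. Purely illustrative frame times, not benchmarks:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is limited by the slower of the CPU and GPU per-frame work."""
    return 1000 / max(cpu_ms, gpu_ms)

# 1080p: the GPU finishes fast, so the CPU sets the ceiling
print(round(fps(cpu_ms=6.0, gpu_ms=3.0)))        # 167 fps
# A 30% faster GPU changes nothing while CPU-bound
print(round(fps(cpu_ms=6.0, gpu_ms=3.0 / 1.3)))  # still 167 fps
# 4K: the GPU dominates, so the 30% uplift actually appears
print(round(fps(cpu_ms=6.0, gpu_ms=12.0)))       # 83 fps
print(round(fps(cpu_ms=6.0, gpu_ms=12.0 / 1.3))) # 108 fps
```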


I'm keeping my GTX1080


Me, who primarily plays arcade style games: "This 4ms latency is unacceptable!!" :marseyraging:


@pizzashill is the stock :marseywallst: doing well?

Only thing that matters about nvidia


it sure is.


:#carpletsfuckinggo:


Remember when someone here told me I was catching a falling knife at 100 a share?


!chuds remember pizza is always right :marseyhesfluffyyouknow:


pizza will only bother to remember one vague moment in the past to comfort himself.

"someone here told me" lmao


:#marseycopetalking:


https://media.tenor.com/kMCF_76cAs4AAAAx/speech-buble.webp


:#marseyfemboytalking:


he has 2 shares bro


Two more than you

:#marseysmughips:


:marseypajeetitsover:


still waiting for those positions :#marseywait:


The amount of FOMO and seethe that non Nvidia stock holders have is reaching critical levels. They're left holding AMD bags in rage as they try to deny that Nvidia's competitors are being left in the dust.


I am 100% buying a 5090 day one if I can get my hands on one. Skyrim VR with maxed out settings, fancy shaders, and a shitload of mods is going to be so fricking based.


Are there any efforts to improve the engine like OpenMW did for Morrowind?


Not exactly, but with the script extenders and engine fixes it's quite a bit different from vanilla. A heavily modded Skyrim is like 10 years ahead of any other RPG right now. It does still retain a lot of its quirks of course.


I'd assume some mad lads would eventually try it, no?


Maybe? I'm not sure the engine sucks enough to make it worth it. Also mods are so tightly integrated with the script extenders and stuff it would probably break everything


I don't know why I had the impression the engine had fundamental flaws, maybe that was me conflating it with Morrowind's.


I mean it does, but not as bad as morrowind. Installing a bunch of engine fixes and extenders is pretty much de rigueur.


do you use pre-arranged modlists or do it yourself?


I used to do it from scratch but I was spending all my time modding and not playing. Now I just grab a Wabbajack list and tweak it.


:#marseyplugged:



I think it's an acceptable upgrade for what it is.

20%-30% across the board for 20% more power and 25% more price :marseyshrug:

The 30-to-40-series jump was pretty big already, and they didn't bother to change the node from TSMC N4. I assume N2 in 2027 will be another 30-to-40-sized jump.
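Taking that comment's own numbers at face value (call it 25% more performance for 25% more price; these are the claims above, not measured figures), performance per dollar is basically flat:

```python
# Perf-per-dollar with the rough numbers from the comment above.
perf_gain = 0.25    # midpoint of the claimed 20%-30% uplift
price_gain = 0.25   # $1,600 MSRP -> $2,000 MSRP

value_ratio = (1 + perf_gain) / (1 + price_gain)
print(round(value_ratio, 2))   # 1.0 -- no change in perf per dollar
```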


Meanwhile AMD's Strix Halo is promising 4070 level performance in an integrated form factor. Nvidia really needs to stop coasting.


@Salvadore_Ally_Chud was right once again. Nvidia guy has gone from tech guy to marketing guy.

@Salvadore_Ally_Chud will believe the future is here once @Salvadore_Ally_Chud finally see robot assistants walking around helping with grocery shopping.

@Salvadore_Ally_Chud love sucking peepee except for mutilated circumcised peepee.


People that rabidly pay attention to this bullshit don't actually wind up playing the frickin' games.



