Tech Jesus reveals NVIDIA has peaked: the new $2k 5090 performs 30% better in 4K games and shows negligible improvement in 1080p and 1440p games :marseyxd:

https://youtube.com/watch?v=VWSlOC_jiLQ

!g*mers laugh at Nvidia-cels: a $2000 GPU still can't play Cyberpunk at 4K 60 fps with gay tracing. Have fun paying $2000 for a DLSS software update.

Oh yeah, remember this?

https://i.rdrama.net/images/1737649358wTWxZCIMAri7rw.webp

Total fricking lie. By "4090 performance" they mean the 5070 can use 4x frame generation to hit an equal framerate, but frame gen also pushes latency up to something like 60 ms, meaning the game will play like utter dog shit despite looking smooth.
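Rough back-of-the-envelope of how frame gen inflates the fps counter without fixing latency. This is a sketch with made-up illustrative numbers (the ~33 native fps and one-frame interpolation buffer are assumptions, not benchmark figures):

```python
# Illustrative sketch: frame generation multiplies *displayed* frames,
# but input latency is still tied to the natively rendered frames
# plus whatever buffering the interpolation needs.
# All numbers below are assumptions for illustration, not measurements.

def framegen_estimate(native_fps, gen_factor, buffer_frames=1):
    """Estimate displayed fps and a crude latency floor with frame generation.

    native_fps    -- frames the GPU actually renders per second
    gen_factor    -- e.g. 4 for "4x frame generation"
    buffer_frames -- native frames held back so intermediate frames
                     can be interpolated between them
    """
    displayed_fps = native_fps * gen_factor
    native_frame_time_ms = 1000.0 / native_fps
    # Latency floor: you still wait for native frames, plus the buffered one(s).
    latency_ms = native_frame_time_ms * (1 + buffer_frames)
    return displayed_fps, latency_ms

# Hypothetical heavy-4K scenario: ~33 native fps, 4x frame generation
fps, lat = framegen_estimate(native_fps=33, gen_factor=4)
print(f"displayed: {fps:.0f} fps, latency floor: ~{lat:.0f} ms")
# -> displayed: 132 fps, latency floor: ~61 ms
```

So the counter says 100+ fps while the input feel is still that of a ~30 fps game, which is the whole complaint.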

https://i.rdrama.net/images/1737649358ME3qnmE93-98Qg.webp

It's kinda funny how the cope went from "well, 60 fps doesn't matter" to "well, it's 400 fps, who cares if the game has 60 ms latency"

!fosstards !linuxchads Nvidia continues to suffer the curse of proprietary BS

Anyone who actually knows about GPUs already knew this

The 5000 series has nearly the exact same specs as the 4000 series, be it core count, clocks, or RAM

They're betting it all on AI boosting DLSS performance

We're now looking at more than 5 years with no perf improvement from NVDA

We're now looking at more than 5 years with no perf improvement from NVDA

???? 30 series to 40 series was a massive jump

https://i.rdrama.net/images/1737680253W8NyidxR2x3jnA.webp

Yeah, now look at the 4060/4070/4080

They improved energy efficiency so they could boost clocks a little, which is nice, but fundamentally the cards are nearly the same. The 5070 has only about 250 more cores than the 3070, not even a 5% increase, and the 4070 has the same core count as the 3070. Quick check below.
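A quick sanity check of that percentage math, using the CUDA core counts as commonly listed for these cards (treat them as the thread's figures rather than verified spec sheets):

```python
# Core-count comparison sketch; counts below are the commonly cited figures.
cores = {"RTX 3070": 5888, "RTX 4070": 5888, "RTX 5070": 6144}

base = cores["RTX 3070"]
for name, n in cores.items():
    delta = n - base
    print(f"{name}: {n} cores ({delta:+d}, {100 * delta / base:+.1f}% vs 3070)")
# RTX 3070: 5888 cores (+0, +0.0% vs 3070)
# RTX 4070: 5888 cores (+0, +0.0% vs 3070)
# RTX 5070: 6144 cores (+256, +4.3% vs 3070)
```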

The 4070's perf is the same as the 3090's with added power efficiency; that's a pretty good upgrade imo

Yeah, but as I said, that's exclusively because of the boosted clocks

Now we're getting nearly no clock or core increase, and no architecture improvement either

Nvidia is doing what Intel did 10 years ago, and it cost them big in the end

? This was still a jump even if it was pretty linear.
