!g*mers laugh at Nvidia-cels, a $2000 GPU still can't play Cyberpunk at 4K 60 fps with gay tracing. Have fun paying $2000 for a DLSS software update.
Oh yeah remember this?
Total fricking lie. By "4090 performance" they mean the 5070 can use 4x frame generation to match the framerate, but frame gen also increases latency to like 60 ms, meaning the game will play like utter dog shit despite looking smooth.
It's kinda funny how the cope went from "well 60 fps doesn't matter" to "well it's 400 fps, who cares if the game has 60 ms latency"
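Napkin math for the claim above — a rough sketch with assumed numbers (the 25 ms generation overhead is a guess for illustration, not a measured figure): frame gen multiplies the *displayed* framerate, but input can only land on a *rendered* frame, so the latency floor stays tied to the base rate.

```python
def frame_gen_stats(base_fps: float, multiplier: int, gen_overhead_ms: float) -> dict:
    """Displayed fps vs. the latency floor set by the rendered framerate.

    base_fps        -- frames the GPU actually renders per second
    multiplier      -- frame generation factor (e.g. 4x)
    gen_overhead_ms -- extra delay from interpolation (assumed number)
    """
    displayed_fps = base_fps * multiplier
    # Input can only affect a rendered frame, so latency is at least
    # one base frame time plus the generation overhead.
    base_frame_time_ms = 1000.0 / base_fps
    min_latency_ms = base_frame_time_ms + gen_overhead_ms
    return {"displayed_fps": displayed_fps, "min_latency_ms": min_latency_ms}

# 30 fps rendered, 4x generation: 120 fps on screen, ~58 ms latency floor
print(frame_gen_stats(base_fps=30, multiplier=4, gen_overhead_ms=25))
```

So the counter goes up 4x while the game still responds like a 30 fps game, which is the whole complaint.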
!fosstards !linuxchads Nvidia continues to suffer the curse of proprietary BS
60 ms of latency is awful, and that's on top of the in-game lag too. Having 8 ms of latency is considered nearly unplayable in a game like CS2. If you're a total casualtard it doesn't matter, I guess, the same way 30 fps doesn't matter, but it feels like shit to play lol. Like 5 frames of input lag can be unplayable in more skill-based genres like shoot 'em ups or rhythm games or fast-paced action games (not souls slop). It's funny to me watching professional tech reviewers like LTT be like "yeah, frame gen adds 60 ms of latency and has horrible artifacting, but the average consumer won't notice so who cares"
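For scale, "frames of input lag" converts to milliseconds by the display's frame time — a trivial sketch (the 60 and 144 fps figures are just example rates):

```python
def frames_to_ms(frames: int, fps: float) -> float:
    """Convert input lag measured in display frames to milliseconds."""
    return frames * 1000.0 / fps

print(frames_to_ms(5, 60))   # ~83.3 ms at 60 fps
print(frames_to_ms(5, 144))  # ~34.7 ms at 144 Hz
```

Five frames at 60 fps is already well past the 60 ms figure being argued about.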
!oldstrags
Come laugh at this
I 'member playing Counter-Strike with 600 ms on dial-up!![:#boomermonster: :#boomermonster:](https://i.rdrama.net/e/boomermonster.webp)
!oldstrags , what's the worst ping you've ever played with?
I killed Sarth + 3 Drakes in WoW while it was current with 1 full second of lag. I had shitty apartment cable and had to basically be predicting everything 1 second earlier. I'm a bad butt.![:marseyskater: :marseyskater:](https://i.rdrama.net/e/marseyskater.webp)
That's server lag, not fps lag
300ms. Beyond that it's actually unplayable
60 ms go fricking choke on a peepee you dialup cute twink
Low Ping Bastards for Life
CS 1.6 was like the only game with Australian servers; everything else I played at 300 ms to US west coast on dial-up, then when we got ADSL that went down to 230. I used to play tournaments in Unreal Tournament at 230
I am so confused, the full latency from mouse/kb to screen response is usually 40 ms on CS:GO?
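That ~40 ms end-to-end figure is plausible as a sum of stages. A back-of-envelope budget — every number below is assumed for illustration, not measured:

```python
# Rough end-to-end latency budget, mouse click to photons (all ms, assumed):
latency_budget_ms = {
    "peripheral polling + USB": 2,
    "game simulation + input sampling": 10,
    "render queue + GPU frame time": 16,
    "display processing + scan-out": 12,
}

total = sum(latency_budget_ms.values())
print(total)  # 40
```

Frame generation adds its delay on top of a chain like this, which is why an extra 60 ms more than doubles the total.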
I'm a latent homosexual, just give me 8ms.
He's right but in a way more derogatory to the average consumer than he intends.
Make your bait less r-slurred in the future please
I use DLSS all the time and don't notice any input lag. Maybe I'm just r-slurred, idk.
This is frame gen, not DLSS. DLSS still requires a decent base resolution and framerate to look good, and frame gen doesn't, so Nvidia is pushing it despite it being awful
if my ping is above 100 ms I get mad, I expect my monitor to be way faster than the connection between my PC and the server
Still keeping my 3090 I guess lol
A 4090 would prob be worth it if it goes down in price due to the 5090 launch. Unless you are doing dedicated AI models a 5090 is fricking pointless.
RIP, I'll prob fully switch to AMD now. I was interested in Nvidia's Reflex technology, which can reduce latency to like 2 ms when not using frame gen, but man, the pricing is insane.
Just buy a 7900 XTX for less than half the price. I think I paid $900 for the Sapphire one
AMD GPUs are a meme, don't bother
Yeah I snagged one at MSRP near release and it's been a great purchase lol. I did NOT expect for it to hold its value like it has. At this rate I'll get another 2-3 years out of it before I even need to consider upgrading.
I do use the 24 GB of VRAM for LLMs and other Docker nonsense, but last I checked there weren't many performance gains at 1440p between the two.
The 4090 will have to really drop in price.
In a hypothetical world where you can get both at MSRP, it would be silly to pay $1600 for the 4090 rather than $2000 for the 5090.
1070 sisters we won
Anyone who actually knows about GPUs already knew this
The 5000 series has nearly the exact same specs as the 4000 series, be it core count, clocks, or RAM
They're betting it all on AI boosting DLSS performance
We're now looking at more than 5 years with no perf improvement from NVDA
???? 30 series to 40 series was a massive jump
Yeah, now look at the 4060/4070/4080
They improved energy efficiency so they could boost clocks a little, which is nice, but fundamentally the cards are nearly the same. The 5070 has only about 250 more cores than the 3070, not even a 5% increase. The 4070 has the same core count as the 3070 too.
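To check the "not even 5%" claim — core counts below are from public spec sheets (5888 CUDA cores for the 3070 and 4070, 6144 for the 5070), so the exact gain is 256 rather than a round 250, but the conclusion holds:

```python
# CUDA core counts from public spec sheets (approximate comparison).
cuda_cores = {"RTX 3070": 5888, "RTX 4070": 5888, "RTX 5070": 6144}

gain = cuda_cores["RTX 5070"] - cuda_cores["RTX 3070"]
pct = 100.0 * gain / cuda_cores["RTX 3070"]
print(f"{gain} more cores, a {pct:.1f}% increase")  # 256 more cores, 4.3%
```

Two generations apart and still under a 5% increase in core count.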
the 4070's perf is the same as the 3090's with added power efficiency, that's a pretty good upgrade imo
Yeah, but as I said, that's exclusively because of the boosted clocks
Now we're getting nearly no clock or core increase, and no architecture improvement either
Nvidia is doing what Intel did 10 years ago, and it cost Intel big in the end
? This was still a jump even if it was pretty linear.
Stuff that runs flawlessly on existing tech still runs the same when more power is thrown at it![:marseygigaretard: :marseygigaretard:](https://i.rdrama.net/e/marseygigaretard.webp)
Man I love the new era of shit-optimized AAA games that run like garbage until you turn on magic fake AI frames.
Which also run worse due to higher latency
And has stupid artifacting and pop in
NMS needs this so fricking bad.
Still gonna replace my 1070 with literally any of the fricking 5000 series anyways or the fricking Intel one at its actual MSRP![:marseygiggle: :marseygiggle:](https://i.rdrama.net/e/marseygiggle.webp)
don't fall for the [[[scamtel]]]
I've been on the fence between waiting for the 5000 series or just grabbing the 4070 Ti Super before the tariffs hit us all. ![:marseyworried: :marseyworried:](https://i.rdrama.net/e/marseyworried.webp)
The 5000's are going to disappear instantly too.![:#marseyitneverbegan: :#marseyitneverbegan:](https://i.rdrama.net/e/marseyitneverbegan.webp)
Get a used 3080 or 6800 XT tbh, those cards frick and r-slurs are dropping them for the new thing
No, but:
I have a 13900K in my server and I'm really fricking mad about the hardware defect.
I worked with a bunch of ex-Intel guys and we had a support relationship with one of their product teams. The company is just a dumpster fire right now.
So yeah I'll pass on Intel for my next build. I'm going AMD.
I had an A750 for about a year, and it wasn't great. Certainly a good value, but it would spaz out any time the video mode changed.
Buying anything made by Intel is so fricking r-slurred. They are the Boeing of the hardware world.
Boeing is half of the world's airplanes, cute twink LOL
That's on Intel and AMD since it's CPU limited at those resolutions.
Yeah, is OP just r-slurred? Lower resolutions are almost always CPU-bound; that's why CPU gaming tests use 1080p for benchmarks.
I'm keeping my GTX1080
Me, who primarily plays arcade style games: "This 4ms latency is unacceptable!!"![:marseyraging: :marseyraging:](https://i.rdrama.net/e/marseyraging.webp)
Only thing that matters about nvidia
it sure is.
Remember when someone here told me I was catching a falling knife at $100 a share?
!chuds remember pizza is always right![:marseyhesfluffyyouknow: :marseyhesfluffyyouknow:](https://i.rdrama.net/e/marseyhesfluffyyouknow.webp)
pizza will only bother to remember one vague moment in the past to comfort himself.
"someone here told me" lmao
he has 2 shares bro
Two more than you
still waiting for those positions![:#marseywait: :#marseywait:](https://i.rdrama.net/e/marseywait.webp)
The amount of FOMO and seethe that non-Nvidia stockholders have is reaching critical levels. They're left holding AMD bags in rage as they try to deny that Nvidia's competitors are being left in the dust.
I am 100% buying a 5090 day one if I can get my hands on one. Skyrim VR with maxed out settings, fancy shaders, and a shitload of mods is going to be so fricking based.
Are there any efforts to improve the engine like OpenMW did for Morrowind?
Not exactly, but with the script extenders and engine fixes it's quite a bit different from vanilla. A heavily modded Skyrim is like 10 years ahead of any other RPG right now. It does still retain a lot of its quirks, of course.
I'd assume some mad lads would eventually try it, no?
Maybe? I'm not sure the engine sucks enough to make it worth it. Also, mods are so tightly integrated with the script extenders and stuff that it would probably break everything
I don't know why I had the impression the engine had fundamental flaws; maybe that was me conflating it with Morrowind's.
I mean it does, but not as bad as morrowind. Installing a bunch of engine fixes and extenders is pretty much de rigueur.
do you use pre-arranged modlists or do it yourself?
I used to do it from scratch, but I was spending all my time modding and not playing. Now I just grab a Wabbajack list and tweak it.
I think it's an acceptable upgrade for what it is.
20%-30% across the board for 20% more power and 25% more price![:marseyshrug: :marseyshrug:](https://i.rdrama.net/e/marseyshrug.webp)
The 30-to-40 jump was pretty big already, and they didn't bother to change the node from TSMC N4; I assume N2 in 2027 will be another 30-to-40-sized jump.
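Taking the trade-off above at face value — these are the commenter's rough figures, using the midpoint of the 20-30% range for illustration — the perf-per-watt and perf-per-dollar math comes out nearly flat:

```python
# Generational ratios from the comment above (illustrative, not measured).
perf = 1.25   # midpoint of the claimed 20-30% performance gain
power = 1.20  # claimed 20% more power draw
price = 1.25  # claimed 25% higher price

print(f"perf/watt:   {perf / power:.2f}x")   # ~1.04x
print(f"perf/dollar: {perf / price:.2f}x")   # 1.00x
```

By those numbers you pay proportionally for every frame you gain, which is why it reads as "acceptable" rather than an actual leap.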
Meanwhile AMD's Strix Halo is promising 4070 level performance in an integrated form factor. Nvidia really needs to stop coasting.
@Salvadore_Ally_Chud was right once again. Nvidia guy has gone from tech guy to marketing guy.
@Salvadore_Ally_Chud will believe the future is here once @Salvadore_Ally_Chud finally see robot assistants walking around helping with grocery shopping.
@Salvadore_Ally_Chud love sucking peepee except for mutilated circumcised peepee.
People that rabidly pay attention to this bullshit don't actually wind up playing the frickin' games.