https://old.reddit.com/r/hardware/comments/1iv2x5h/i_bought_a_3050_to_pair_with_my_5090_to_uncripple/
TLDR
If you bought a new 50-series card, older games with GPU PhysX effects will just grind to a halt: the 50 series dropped 32-bit CUDA support, so those titles fall back to CPU PhysX.
25 FPS vs 160 FPS
66 FPS vs 415 FPS
(RTX 5090 on its own vs. with the RTX 3050 handling PhysX.)
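For anyone wanting to sanity-check what's in the box before dedicating a card to PhysX, here's a minimal C++ sketch using the CUDA runtime API to list each GPU and its compute capability. The file name and build line are just illustrative; the point is that the 32-bit CUDA runtime these old PhysX titles depend on doesn't support Blackwell (50-series) devices, so a pre-Blackwell card like the 3050 has to be present to take the work.

    // list_gpus.cu — enumerate CUDA devices and their compute capability.
    // Build (illustrative): nvcc -o list_gpus list_gpus.cu
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("no CUDA devices found\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // 32-bit CUDA (what old GPU-PhysX titles link against) was
            // dropped for Blackwell, so only the older card here can
            // accelerate them.
            std::printf("device %d: %s (compute capability %d.%d)\n",
                        i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }

On a 5090 + 3050 box this should print both cards; the actual PhysX assignment still happens in the NVIDIA Control Panel ("Set PhysX configuration"), not in application code.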
This is absolutely going to happen to older games with raytracing in a decade or so.
We're not moving to a 128-bit architecture, buddy.
ARM and RISC-V are both being considered by hardware manufacturers; it wouldn't surprise me if Valve's ARM project ends up in an actual product within a decade.
Yeah, but you can't run x86 programs on them natively anyway, so the problem's going to be a whole lot worse than just the graphics library not supporting it.
If they just released the games as open source, I'm sure some nerds would fix it for free. But my god, copyright is a really fricking r-slurred economic construct. We have to wait for some idiot investor camping on the IP to take a stab at making a buck off it before we even get a half-assed attempt, instead of just letting nerds do it for free.
!commenters
This is why I love FOSS game remakes like OpenMW and Augustus.
There's definitely R&D going on right now to get ARM to run x86 instructions more efficiently (like Apple did), so I wouldn't rule out a compatibility layer carrying us through.
The real question is: if you can ditch x86, why not just ditch Windows too? If you don't get bug-for-bug compatibility anyway, why would you put up with it?
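As a rough illustration of why such a compatibility layer is expensive without hardware help: x86 guarantees much stronger memory ordering (TSO) than ARM, so a translator has to emit extra barriers around ordinary loads and stores. This is a minimal C++ sketch of that ordering gap, not any emulator's actual code; the memory-order annotations mark what a naive vs. correct translation has to guarantee.

    // tso_demo.cpp — why naive x86-to-ARM translation is costly:
    // x86 gives (roughly) release/acquire behavior on every plain
    // load/store; ARM does not, so a translator must add barriers.
    // Build: g++ -std=c++17 -O2 -pthread tso_demo.cpp
    #include <atomic>
    #include <cstdio>
    #include <thread>

    std::atomic<int> data{0};
    std::atomic<int> flag{0};

    void producer() {
        data.store(42, std::memory_order_relaxed);
        // On x86 a plain store already behaves like a release store.
        // On ARM a translator must emit this as a release store (stlr)
        // or add a barrier, or the consumer may see flag==1 but data==0.
        flag.store(1, std::memory_order_release);
    }

    void consumer() {
        while (flag.load(std::memory_order_acquire) == 0) { /* spin */ }
        std::printf("data = %d\n", data.load(std::memory_order_relaxed));
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
        return 0;
    }

Apple sidestepped the per-access barrier cost by adding a hardware TSO mode to its cores, which is a large part of why Rosetta 2 performs as well as it does.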
Huh? Are you saying that non-raytraced rendering is going to lose support or that early implementations of RT are going to age poorly?
Should've been clearer lol. The former. Nvidia has said they want to remove non-raytracing cores in the future. I think if they actually went ahead with that it would be a shitshow.
Well, my confusion stems from the former sounding ridiculous. I'm by no means a hardware guy, but my understanding is that the ordinary shader cores (neither RT nor tensor) are fundamental to rasterized graphics of any kind. Like, rendering a desktop environment with only raytracing and tensor cores sounds so stupid.
Correct, it's absolutely r-slurred. Nvidia had this idea of eventually moving all of that rendering to the CPU, because CPUs are strong enough (lol), and leaving GPUs to focus on AI and raytracing only. It's delusional, but an Nvidia engineer legitimately suggested it. I thought it'd never happen, but if they removed old PhysX support...