Feeling super smug now that I went with the Ryzen 7 7800X3D :marseysmug:

Ryzen gang, Ryzen up!

:#marseypop2:

AMD + Nvidia combo here :marseythumbsup:

Novideo

:marseycringe2:

AMD cards have a fricked price to performance ratio :marseyfluffy: I'm not paying 90% of Nvidia's price to get 70% of the performance. :marseydisagree:

Neighbor what? Prices: 7900 XT (20GB) $700 vs RTX 4080 Super (16GB) $1000. It's 70% of the price for 90% of the perf, you r-slur

Wrong. :marseyexcited:

Cope, AMD's GPUs are absolute dogshit that use up more power while having less performance and functionality. It's why AMD basically gave up on the entirety of the high end for RDNA4 to Nvidia. Only the 7900GRE and 7800XT are worth buying right now for price/performance for AMD, and their iGPUs.

Jannied for not being gaming related LOL

Those extremely unimportant gaming components, CPUs

I was kicking myself for buying a 12th gen last summer and wished I'd sprung for the 13th. I feel so fricking lucky rn.

!slots111

!slots111

Me asf

What a massive fricking L for Intel.

The lack of communication throughout this was terrible.

They managed to make sure it came out on a Friday though

My team works pretty closely with Intel and we hired a bunch of senior ex-Intel engineers. Seems like kind of a mess to be honest, I'm not really sure what the future holds for them.

Consumer stuff is shifting to ARM, and based on what Apple has done with their silicon, I see zero reason to want to stay on x86. Not sure what will happen with server-grade stuff; that's mostly still Intel from what I've seen, but I'm not really a hardware person.

I think they can coast on x86 for a loooong time. Fact is a lot of economic juggernauts are built off the premise that a line-of-business app from 2007 written in C# that no one has the source to will continue working for decades.

Yeah maybe that's enough to keep them afloat but unless they find a way to either innovate and stay on top of the CPU market or actually make headway on GPU development, they are in for tough times IMO.

No idea what it would take for them to close the gap with nvidia, but that's probably what I'd be wanting to do if I were them. It's probably more likely that nvidia goes and builds some crazy SOC onto an A100 and cuts them out of the GPU server market though.

You can run that shit on ARM with a thin compatibility layer. Faster than it ran in 2007 too, not a high bar. But the slowdown even for current-day apps is minimal; that's really not a big issue unless you're gaming or doing something else that needs 100% of the hardware.

Nah, x86 is fine; there's plenty of reason for Windows and Linux users to stay on x86 right now. Apple's efficiency successes with the M-series are mostly the result of them having complete top-down control over the hardware and software, so it's super well-optimized. x86 not being as efficient is mostly due to the separation between MS/Linux and the various chip and hardware manufacturers. The new Ryzen AI HX (ugh, I know) and Lunar Lake x86 processors should close a lot of that gap with Apple, especially when it comes to more power-demanding tasks.

All they had to do was offer like 100/200/300 off for owners of i3s/i5s/i7s respectively

You think? I mean the CPUs are simply broken, with some reports putting it as high as 25% :marseymindblown:

I think they'll face some serious legal action if they were selling faulty products for two generations.

I FEEL like the USA is generally forgiving to companies when their customers are happy. I know regulatory-wise, and heck, even enterprise-wise, they are likely going to be fricked in the butt, BUT as far as the average Joe goes? They know PCs are super sensitive machines and shit's gonna happen, but if they're given a heavily subsidized new CPU, even if it's still broken, it doesn't really matter. Just look at Nintendo and their Joy-Cons that have been broken by design for almost a decade. They did like 4 years of free repairs and then everyone stopped caring

This is just crazy

I rarely agree with redditors, especially nerd redditors, but this actually is. TWO GENERATIONS of CPUs!! How does this happen? It's insane.

AMD is loving this though. They've made big strides in the past decade and now they're looking quite good compared to Intel, which I couldn't have predicted even 6 years ago

After the 3.5GB + 0.5GB GTX 970, nothing surprises me

AMDynasty confirmed

https://media.tenor.com/s1dR1pGFcwYAAAAx/amd.webp

Is microcode like code written by dudes with small peepees or something? Do you have to be a jeet in order to learn it?

Is microcode like code written by dudes with small peepees or something?

Sometimes

Do you have to be a jeet in order to learn it?

Yes

Basically, each instruction in a CPU gets translated to a set of gates (functions) to apply to registers (data). Since these sets of gates can overlap (you should reuse the AND gate set for the NAND, and so on), the instructions (here the AND and NAND) are distinct from the actual CPU internals.

One particular use case is multiplication: you write down a multiplication instruction and your CPU takes X cycles, depending on which underlying gates it has to call (a 32x32 multiplication fits directly in one register (64 bits on a modern CPU), but a 64x64 will most likely be done over a few cycles to compute the lower and upper bits). :marseyhomofascist:

Some combination of :marseyautism: and :marseytrain:

Man, I'm glad I built my last PC with an AMD chip, but fricking Christ I wish they wouldn't put the pins on the chip. Like, are they just doing that to be different? I have actually wrecked a CPU in the past by inserting it slightly wrong. Skill issue, I understand, but I don't get why.

It wasn't an issue for the first 40 years of computing chips, skill issue

I already admitted it was a skill issue :marseyraging:

Sorry didn't read all that fr :marseyzoomer:

Could be better connection, better cooling, patents, less new tooling needed, etc. Also, what happens if you install an Intel sideways? Dead mobo?

Anyway, you haven't kept up. AMD has done the LGA thing since '22, too.

That's cool. I think my current gaming PC is from 2020 so you're right, I haven't kept up.

Can I get a screaming deal on a defective but high end CPU? Would there be any point?

Yes you could; no, there wouldn't be any point, it seems. Degradation occurs above 65 watts, and when your i7 line is pulling 200 watts, you're probably stuck with underclocked i5s and i3s

:#marseydarkxd:

No but the 12900k isn't affected by these issues (apparently) and can be found at Microcenter for about $280 I think. It's a stretch but if these processors see a further discount due to Intel's recent frickery and the release of Arrow Lake I'd jump on it. With an undervolt + good cooling it'd be a pretty sweet setup.

Hmm, I have a newish 13th-gen i7 in one of my servers. I wonder if I can finesse my way into free shit somehow.

Fake losses of millions of dollars

Yeah it's on my stable diffusion rig so if my loli cat girl generations get messed up I'm going to be mad.
