I genuinely don't understand scalping

Not the people doing the scalping, that makes perfect sense. You have a high-demand good that isn't priced appropriately, so intrepid entrepreneurs with low ping and good webpage-refresh skills step in to fulfill a market need.

No, what I don't understand is why businesses allow this shit to exist in the first place. I get it if it's something like a concert. You want to be able to brag that your show is "sold out" even though only half the seats are full. But for electronics (and especially graphics cards), these companies are just leaving money on the table. If these consoomers are willing to pay 3x MSRP to someone they despise just to play their gaymes in slightly higher definition, imagine what they would pay the actual company making the card.

TL;DR: Nvidia needs a dynamic pricing model that adjusts based on their stock.
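The idea above can be sketched in a few lines. This is a hypothetical illustration, not anything Nvidia actually does; the MSRP, target stock, and 3x cap (the scalper multiple from the post) are made-up numbers.

```python
# Hypothetical sketch of a stock-based dynamic pricing rule.
# All figures (MSRP, target stock, price cap) are illustrative.

def dynamic_price(msrp: float, stock: int, target_stock: int,
                  max_multiplier: float = 3.0) -> float:
    """Scale the price up as inventory falls below a target level.

    At full target stock the card sells at MSRP; at zero stock it
    sells at max_multiplier * MSRP (the scalper price from the post).
    """
    # 0.0 = fully stocked, 1.0 = completely sold out
    shortage = max(0.0, 1.0 - stock / target_stock)
    return msrp * (1.0 + (max_multiplier - 1.0) * shortage)

# Example: a $699 MSRP card with a 10,000-unit stock target
print(dynamic_price(699, 10_000, 10_000))  # fully stocked -> 699.0
print(dynamic_price(699, 0, 10_000))       # sold out -> 2097.0
```

A linear ramp is the simplest choice; a real seller would presumably tune the curve (and the cap) against observed demand rather than hard-coding it.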

Nvidia already charges an outrageous amount of money for most of their shit and has a history of nasty business practices. Allowing them to change prices depending on stock would just make them artificially restrict stock so they could sell at a markup every day of the year.

True, but we already tolerate that for gasoline, and it doesn't look like the inputs for graphics cards will get less volatile in the short-to-medium term.

Gaming graphics cards are small potatoes in the grand scheme of things. Reddit's favorite company, AMD, doesn't even prioritize gaming GPUs (for good reason). The RDNA2 cards were a paper launch.

Most of AMD's TSMC wafer allocation has gone to their CPUs and semi-custom SoCs for Sony and Microsoft.

People will be able to readily get gaming GPUs in two years, and it will maybe take another year after that for prices to return to pre-COVID norms. Nvidia has been able to offer far more GPUs to AIB partners and OEM system integrators' laptops by opting for Samsung 8nm.

Their microarchitectural lead has been far enough ahead that they haven't had to tape out on a bleeding-edge node for years. There's a lot of state-funded foundry expansion going on (it takes years, though, and bleeding-edge nodes aren't where the vast majority of the money is; look at GlobalFoundries, for instance), so we'll see a huge influx of foundry capacity over the next 5-7 years.

It'll be interesting to see whether there end up being too many foundries, say, in a decade, if demand decreases. IBM infamously had to pay GlobalFoundries to take its derelict fabs.

Mommy is soooo proud of you, sweaty. Let's put this sperg-out up on the fridge with all your other failures.

Frick you, snappy, I'm doing God's work by making g*mers cry.

If only you wouldn't sound like a fricking nerd whilst doing it

Ethereum mining is being phased out real soon. I expect a frickton of cheap used mining GPUs in 2022, with new stock to follow.

People sperged out about the Turing cards on Reddit, but Nvidia was likely making less profit on them than on Pascal cards when you consider the insane die size. The 2080 Ti was nearly at the reticle limit for TSMC 12nm.
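The "nearly at the reticle limit" claim checks out with some back-of-the-envelope arithmetic. The figures below are approximate public numbers (a standard lithography field of about 26 mm x 33 mm, and commonly cited die areas for TU102 and GP102), not anything stated in the post itself:

```python
# Rough arithmetic behind the "nearly at the reticle limit" claim.
# Figures are approximate public numbers, not from the post itself.

reticle_field_mm2 = 26 * 33  # standard lithography field: ~858 mm^2
tu102_die_mm2 = 754          # TU102 (RTX 2080 Ti, Turing) die area, approx.
gp102_die_mm2 = 471          # GP102 (GTX 1080 Ti, Pascal), for contrast

print(f"TU102 uses ~{tu102_die_mm2 / reticle_field_mm2:.0%} of the reticle field")
print(f"TU102 is ~{tu102_die_mm2 / gp102_die_mm2:.2f}x the Pascal flagship die")
```

Roughly 88% of the maximum exposable field, and about 1.6x the die area of the Pascal flagship, which is why per-unit margins at the same price would have been worse.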

R&D costs money; GeForce GPUs are subsidized by the Quadro cards and server-grade equipment Nvidia sells. Nvidia seems to have invested heavily in the software stack for the last few generations of product lines, and the dedicated hardware acceleration (tensor cores) further adds to the cost.

When you look at the raw rasterization performance of GeForce vs. AMD cards from Maxwell through Turing, Nvidia really should have charged more than they did. All of the GCN-derived AMD GPUs were amazing for ML workloads, though.

Nothing makes me happier than seeing quasi-communist Redditors unable to get their gaming GPUs.

Yeah, GPU R&D costs an obscene amount of money, and a company like Nvidia that's playing on so many different fields can't afford to stop innovating. It doesn't always pay off, though (lol Tegra).

Still, they've done their best to shaft customers in the past, and they'll do so again at any opportunity; no need to give them the incentive.

Poorcels will always bat for AMD lol, even though they haven't made anything interesting in the GPU area since Pitcairn 🤮
