Saturday, 22 February 2025

Seriously WTF nVidia?

Man, I think I really lucked out with the last set of upgrades I did on my home machine. A few years back I grabbed a 3070 around launch time, before prices jumped through the roof and before the whole LHR stuff started showing up. That card has been solid for me the whole time and I've had no regrets about the purchase.

Generally I like to sit out a couple of generations before I upgrade anything. That's gotten stretched out a bit over the last few years because the gains generation to generation just haven't been there to justify spending money on an upgrade. On the CPU side, my i7 860 lasted until the 7th generation Intel stuff launched before I upgraded. GPUs I don't tend to upgrade as often, because I like to play older games and those generally don't need a higher end card to run.

However, it's been a couple of years and I figured I should keep an eye on what's coming out in the market, since my kids are probably going to be looking for upgrades in the near future, and if my old card works for them I may just get an updated piece for my computer. But seeing what's going on with the 50 series cards from nVidia, I'm just shaking my head.

First off, the whole stock situation is insane. From what I'm hearing, the cards are just unobtainable at retail. Between scalpers and low stock, the odds of getting your hands on one are so slim that if you actually manage it, you might want to pick up some lottery tickets on the way home.

Second, the price of the damn things. The starting price for a 5080 is $1,000 USD. For a single component that you still need to build an entire computer around before you can use it for anything. The 40 series cards weren't any better pricing-wise, so there's really no change there, but I find it crazy that I can purchase an entire gaming console and a large TV for the price of a single component of a high end gaming computer. And yes, consoles are more limited than a PC, but if all I'm doing is playing games, a console and TV would get the job done.

Third, a lot of these cards are using that new 12VHPWR connector, and all the reports of those damned things melting have me thinking I would rather have a lower power card that uses the older style connectors - or that doesn't require supplemental power at all.

And now, with the news that some of the cards seem to be randomly shipping with portions of the GPU die fused off, I have to wonder why the hell nVidia would ship chips out to board partners that don't have the correct bits all enabled. It's not an uncommon practice for CPUs and GPUs with defects to have the problem portions fused off and be sold as a lower end part. I'm sure that some of the Intel CPUs that come without an iGPU actually have the silicon on the die, and that it just didn't pass testing and was disabled. But generally you don't have a vendor quietly cutting out a chunk of the device's functionality and shaving 2-5% off its performance without telling you.

I don't have an issue with nVidia, Intel, or AMD selling off silicon that didn't meet the higher end specification as some lower end part - just tell us that up front so we know what's going on.

It's looking like this generation might be a pass on the nVidia side of things. AMD could be an option, but there's some stuff I'm running that definitely runs better on nVidia hardware.