What about the 5500X3D? There was talk of that, but it's all gone quiet.
Y'all can shave your neck beards off.
The allowed limit is 4,800, so the RTX 4090 is about 10% "too powerful."
...
but Nvidia will likely build in some wiggle room to ensure that overclocking, for example, doesn't become a problem. Assume a clock speed of 2.7 GHz and we get a maximum of 108 SMs.
Like what is preventing them from just shipping 4090 cards downclocked to 2.4 GHz, and then letting China figure out how to flash a 2.7 GHz BIOS onto the cards? I guess Nvidia just doesn't want to get in trouble with the US government?
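For anyone curious where the 108 SM figure comes from, here's a rough back-of-the-envelope sketch. The 4,800 number is the TPP (Total Processing Performance, TOPS times bit length) cap from the export rules; the per-SM throughput constant is an assumption based on Ada's published dense INT8 rate, so treat this as illustrative rather than official.

```python
# Rough sketch of the export-cap math (illustrative only).
# TPP (Total Processing Performance) = TOPS * bit length; the cap is 4,800.
# Assumed per-SM throughput: ~2048 dense INT8 ops per clock, based on Ada's
# published numbers -- an assumption here, not an official figure.

TPP_CAP = 4800
OPS_PER_SM_PER_CLOCK = 2048   # dense INT8 ops per SM per clock (assumed)
BIT_LENGTH = 8                # INT8

def tpp(sm_count: int, clock_ghz: float) -> float:
    """TPP for a hypothetical Ada-like part at a given SM count and clock."""
    tops = sm_count * clock_ghz * OPS_PER_SM_PER_CLOCK / 1000.0
    return tops * BIT_LENGTH

def max_sms(clock_ghz: float) -> int:
    """Largest SM count that stays under the cap at a given clock."""
    sm = 1
    while tpp(sm + 1, clock_ghz) <= TPP_CAP:
        sm += 1
    return sm

print(tpp(128, 2.52))   # ~5285 for a stock-ish 4090: roughly 10% over the cap
print(max_sms(2.7))     # 108 SMs at 2.7 GHz, matching the article's estimate
```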
The codename of Nvidia's post-Blackwell GPU architecture could be Vera Rubin
So not next gen, but "next-next-gen".
RDNA3 entries were added to Linux 6-8 months before the release of the 7900 series. RDNA4 entries were made about a week ago. July would be 7 months from now.
That being said, when the RDNA3 entries were made, they also included the 7800 XT and 7600 XT entries, and those didn't launch until something like 12-18 months after the first entry. But I think 12 months from now at the latest makes sense. Another November release date, like RDNA2.
I'm sure a huge number of people are. Had I bought an 8700K instead of an 8600K, I'd still be on 8th gen. That CPU is still on par with the consoles.
You can replace it with a 10-year-old CPU from eBay. Just don't, unless it's like $20. We're at DDR5 and you're on DDR3 RAM.
So Smooth Sync is just not working in most games? Like what if you tried it in something totally unexpected, like The Witcher 1 or 2? Something old, or something brand new? Is it a whitelist where they select which games to enable it for? Or a blacklist where they disable it for certain games exhibiting problems?
RDNA1 and 2 were pretty successful. Vega was very successful in APUs; it just didn't scale well for gaming, but was still successful for the data center. You can't hit them all, especially when you have a fraction of the budget your competition has.
Also, he ran graphics divisions, not a Walmart. People don't fail upwards in these industries at these levels. When people fail upwards in other industries, they fail to middle management: somewhere you're not in the spotlight and out of the public's eye, but don't get to make final decisions. Somewhere to push you out of the way. Leading one of fewer than a handful of graphics divisions in the world is not where you land.
It's about 10% slower than a Ryzen 5600x in games.
"Why? I am still learning, but my observations so far: the 'purpose' of purpose-built silicon is not stable. AI is not as static as some people imagined and trivialize [like] 'it is just a bunch of matrix multiplies'."
But it is stable in a lot of cases, is it not? I mean, if you're training a system for autonomous driving, or training a system for image generation, it seems pretty stable. But for gaming it certainly needs flexibility. If we want to add half a dozen features to games that rely on ML, it seems you need a flexible system.
That does remind me of how Nvidia abandoned Ampere and Turing when it comes to frame generation, because they claim the optical flow hardware is not strong enough. What exactly is "optical flow"? Is it a separate type of machine learning hardware? Or is it not related to ML at all?
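For context on the term itself: optical flow generally means estimating a per-pixel motion vector field between two consecutive frames. Below is a purely conceptual sketch using OpenCV's software Farneback implementation (the frame filenames are placeholders); it is not Nvidia's hardware path, just an illustration of what an optical flow result looks like.

```python
# Conceptual illustration of dense optical flow: per-pixel motion vectors
# between two consecutive frames. Uses OpenCV's Farneback algorithm as a
# software stand-in; this is NOT Nvidia's hardware implementation.
import cv2
import numpy as np

# Placeholder filenames for two consecutive frames of a video or game capture.
prev = cv2.cvtColor(cv2.imread("frame_0.png"), cv2.COLOR_BGR2GRAY)
curr = cv2.cvtColor(cv2.imread("frame_1.png"), cv2.COLOR_BGR2GRAY)

# Result is an HxWx2 array: an (dx, dy) motion vector for every pixel.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)

magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print("mean motion (pixels):", float(np.mean(magnitude)))
```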
Processors do wear out over time, but usually not this fast. It might be that the undervolt was only just barely stable at one point, and maybe even unstable in some conditions you never tested. Now even the tiniest amount of wear has dropped it below the line.
It could also be defective RAM, but that's usually a factory defect, not something that develops with wear.