The 4070 is comparatively the best value 40 series card, but 3080 performance for $599 three years later is still pretty lackluster. I think the 7800 XT is a much better buy, though I’m not even sure I’d call that card good, and it’s the best one of this generation by a considerable margin.
I wouldn’t get a 4080, the card is still so horrendously overpriced that if you’re in the market for one you might as well just get a 4090 and get actual flagship performance. You can even pair it with, like, a Ryzen 5 7600 or i5-13600K, because gaming really still doesn’t scale noticeably above the i5/R5 tier, and at that point it’s only, like, 300 bucks more overall.
Don’t upgrade the CPU, the gaming performance won’t noticeably change and you’ll be spending way more money than you should on something which barely runs even the lightest games these days. Just get a new build.
Man, this is so lame. AMD knows they can afford to put four full-fat Zen 4 cores in an 8300G, they just don’t want to.
Efficiency? Nope, TSMC 3nm is just magic like that. That being said, modern desktop flagships pretty comfortably outpace anything Apple can make, and can be acquired for a fraction of the price.
Way too much CPU, way too little GPU.
Intel GPUs overperform in Geekbench, so I wouldn’t read into this too much. It seems much more likely to me that it’ll pretty much just match 780M performance.
Just get a Thermalright Peerless Assassin, my guy. It’s the same cooling performance for, like, $30.
It maybe isn’t the most correct decision ever to put the 13900K and 14900K below the 12-core boundary when they’re competitive with AMD offerings of the same time period that have 16 big cores. Personally, I’d count each E-core as half a core instead of zero cores for the purposes of this chart, because that seems to be about how they perform.
There’s nothing worth upgrading to from a 12700K right now if you’re a gamer, especially if you have a 3080. You’d be spending hundreds of dollars to boost your performance by maybe 5% when you could’ve just overclocked instead.
Intel will be fine. Nanometer for nanometer, their nodes are far denser than TSMC’s nodes (because process node naming is basically just marketing now), and they don’t seem to have any trouble getting new nodes up and running either.
That being said, it does depend on your location a little bit (in particular, if you’re in Europe and dealing with high energy prices, the 4070 might be a better choice thanks to its considerably better efficiency).