You would, in some titles. But people underestimate how few cores some games still use, and how well a 4.8 GHz 9th-gen part still holds up. I had an 8600K OC'd to 4.8 GHz all-core with a 5 GHz single-core boost, and when I upgraded to a Ryzen 7 7700X I really didn't see huge gains in a lot of games. That was at 1080p with a 6600 XT, which is likely a similar CPU load to a 3080 at 4K. I wasn't playing many games that utilize many threads. And a 5 GHz 8th-, 9th-, or 10th-gen Intel CPU with no Hyper-Threading is still roughly equal to a Ryzen 5600 with its SMT disabled.
People often way overestimate how CPU-demanding games really are. There are exceptions: Starfield, that Battlefield game from a few years ago that came out in a really broken state, or Star Wars Jedi: Survivor, which also launched broken. Lots of really unoptimized cash grabs.
Some games are broken in other ways but still well optimized on the CPU side. Cyberpunk at release played at 60 fps on a now 12-year-old 2600K. I underclocked my 8600K to 1.8 GHz to see what would happen, and it still ran at 40 fps.
Ray tracing is very CPU-heavy, though, and you'll see large gains there, especially if switching to DDR5.