Short answer: no.
If you're fine then you're fine, but that CPU is definitely holding back a 4090 in some scenarios.
Not so much tbh, I just upgraded from a 10700k to a 14900k + new DDR5 memory.
Cyberpunk benchmark, 4K, all maxed, RT + path tracing on:
10700k + 4090: 71 fps
14900k + 4090: 77 fps
Keep in mind that the 14900k gets crazy hot (95C) compared to the 10700k.
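For what it's worth, that's well under a 10% average-FPS gain. A quick sanity check in Python, using only the numbers quoted above:

```python
# Uplift implied by the Cyberpunk numbers above
# (FPS figures are the ones quoted; nothing else assumed).
old_fps = 71  # 10700k + 4090
new_fps = 77  # 14900k + 4090

uplift_pct = (new_fps - old_fps) / old_fps * 100
print(f"Average-FPS uplift: {uplift_pct:.1f}%")  # -> 8.5%
```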
Could have gone AM5 with a 7800X3D for less money, more performance, and a future upgrade path, but instead went with a hot pretend-generation upgrade on a dead-end platform. Why do people do this to themselves?
What about the 1% lows? The frame consistency?
Going to 13th gen made a big difference for me back when I had a 10700k and a 3080 at 4K.
It definitely depends on the game; I saw a big difference.
A 7800X3D would feel like upgrading a ten-year-old PC.
Bottlenecking a 4090? Yikes, what a waste.
Just because your GPU is hitting 99% utilization does not mean your CPU isn't holding it back.
Upgrading to 13th or 14th gen (or AMD, for that matter) will absolutely give you plenty of gains, especially in the 1% lows.
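This is exactly why 1% lows matter: two setups can post the same average FPS while one stutters far more. A minimal Python sketch of how 1% lows are typically derived from a frame-time capture (the frame times below are made up purely for illustration, and benchmarking tools vary slightly in the exact definition):

```python
# Minimal sketch of how "1% lows" come out of a frame-time capture.
# This uses the 99th-percentile frame time; the sample data is
# invented for illustration, not a real capture.
frame_times_ms = [10.4, 9.8, 11.2, 10.1, 35.0, 10.3, 9.9, 10.6, 10.2, 28.5]

# Average FPS: total frames over total time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# 1% low: FPS implied by the slowest ~1% of frames (the stutters).
worst_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
low_1pct_fps = 1000 / worst_ms

print(f"Average FPS: {avg_fps:.1f}")       # ~68.5
print(f"1% low FPS:  {low_1pct_fps:.1f}")  # ~28.6
```

A CPU limit tends to show up in that second number long before the average moves.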
H U G E.