Super cards only for 2024?
Hardware
A place for quality hardware news, reviews, and intelligent discussion.
It's been reported for quite some time that the next-gen GPUs won't be out until 2025, and that seems more than likely at this point. It really puts people in a tough spot: should I wait for next gen, or buy now while high-end GPU prices are inflated?
I upgraded to a 4070 Ti 3 months ago (mainly in anticipation of Phantom Liberty and Alan Wake 2), and it seems like an even better decision now.
I bought a 4080 at about the same time you bought your card. How I look at it is: I (and you) got a worse deal than if we'd waited 5 months, but at the same time, we both got the use of a better GPU compared to what we used before, for all that time, and that's worth something.
Me, no regrets, not really.
It's not tough at all: don't buy. Where I am, for the price of a GPU I can get a good TV and a PS5, which is exactly what I've done. I don't need a PC for anything other than gaming and these days, I don't need a PC to game.
The next-gen 5090 design is ready, but it's delayed due to the new structures on TSMC's 3nm process, which take a lot of time to redesign and convert for. I've also read media reports saying 2nm will be even more difficult in the future, so there may not be enough time for 2025 even if the engineers stay on schedule. Maybe the limit of the technology has come; some media say 2nm may not arrive until after 2030.
The next-gen 5090 design is ready, but it's delayed due to the new structures on TSMC's 3nm process
Got a cite for this? I haven't heard anyone claim this.
Maybe the limit of the technology has come; some media say 2nm may not arrive until after 2030.
No, I really doubt we are close to the limit. 3d in various forms is the way things are going at that level, and that's only one way to approach the various problems that I'm not educated enough to have a reasonable opinion on. Besides, it seems to me like every time some prognosticator says that we've reached a technological limit when it comes to computing, they've been wrong... when one approach plateaus, we find another that doesn't.
Some media say 2nm may not arrive until after 2030.
I don't think we can make any reasonable predictions about how nodes will progress (or not) that far out.
I read it yesterday, but now I can't find the link. It was from posts by experts; they said it's difficult, it needs more time, and that TSMC has the tools to convert...
Some media say 2nm may not arrive until after 2030.
Uh, Intel's roadmap says they'll have 18A (1.8 nm) in mass production in 2024, and they've said in every subsequent earnings call their nodes are on track for this schedule.
Going to make me a good “Reuben” alright.
I’m sure they’ll be able to process tons of high-level ideas
Pretty unusual to have a 3 year gap
It's every 2 years for consumer GPUs. Always has been.
People are already ripping chips out of 4090s for AI; the next gen will likely be optimized for that and not for playing video games.
Wake me up when there's finally some photonic cores
Blackwell is RTX 5000, not Rubin. And the RTX 5090 is coming in late 2024; do yourself a favor and remember what I'm saying.
Isn't Blackwell in January 2025?
The codename of Nvidia's post-Blackwell GPU architecture could be Vera Rubin
So not next gen, but "next-next-gen".
Kind of amazing how consumer and enterprise chips are starting to rival the performance required to simulate the human brain.
The recent Nvidia H100 is capable of one thousand trillion BFLOAT16 Tensor Core operations and two thousand trillion INT8 Tensor Core operations per second. Upcoming generations of consumer and enterprise Nvidia chips over the forthcoming decade should bring this speed into the mainstream market, and improve on it further.
The human brain has about sixty to one hundred billion neurons, and two hundred to four hundred trillion synapses. Only between 8-16% of the human brain is active at any one time, with a firing rate of between 10-40 Hz. Furthermore, large parts of the human brain are involved in tasks unrelated to higher cognitive processes; for instance, the cerebellum contains ~80% of the brain's neurons in the form of tightly packed granule cells, but has limited involvement in higher cognition.
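Putting the figures above together as a rough back-of-envelope (taking midpoints of each quoted range; these are loose estimates, not measurements), the comparison looks like this:

```python
# Back-of-envelope: estimated human-brain synaptic events/sec vs. H100 INT8
# throughput. Brain figures are midpoints of the rough ranges quoted above.
synapses = 300e12        # midpoint of 200-400 trillion synapses
active_fraction = 0.12   # midpoint of the 8-16% active estimate
firing_rate_hz = 25      # midpoint of the 10-40 Hz firing-rate estimate

# Treat one synaptic event per active synapse per firing cycle as one "op".
brain_ops_per_sec = synapses * active_fraction * firing_rate_hz

h100_int8_ops_per_sec = 2e15  # ~2000 trillion INT8 Tensor Core ops/sec

print(f"Brain estimate: {brain_ops_per_sec:.1e} synaptic events/sec")
print(f"H100 INT8:      {h100_int8_ops_per_sec:.1e} ops/sec")
print(f"Ratio (H100 / brain): {h100_int8_ops_per_sec / brain_ops_per_sec:.1f}x")
```

With those midpoints, the brain estimate comes out around 9e14 events/sec, so a single H100's INT8 throughput is already in the same ballpark, a couple of times higher. Of course, a synaptic event is not equivalent to an INT8 multiply-accumulate, so this is only an order-of-magnitude comparison.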
And I'm already afraid of seeing the absurd prices + abysmal value Nvidia will come up with.
We might all be pleasantly surprised. If Nvidia thought the pricing on the existing 4080, 4070 Ti, and 4070 GPUs was fine, they wouldn't bother bringing out a refresh. They do this instead of just lowering prices because they don't want to create a precedent in buyers' minds that if they just wait on buying existing cards, prices will come down. Yes, I know, it's very transparent to us, but not to most consumers. Plus, a refresh creates some marketing buzz.
Now, how much prices (for certain performance tiers) will go down is the big question. How aggressive does Nvidia want to be? We'll see in a month and a half!
2025 is such a big gap. The 4000 series simply wasn't good, and the Super stopgap coming out likely won't be any better value.
The Supers are just there to push down the value of existing stock. They'll likely be about 5% better at the current MSRP.
It all depends on how aggressive Nvidia chooses to be. It could be quite a bit better price-performance than 5%... it almost has to be, else what's the point of the refresh in the first place?
This time, 1 rendered pixel will become 16 AI-generated ones.