I was recently reading Tracy Kidder's excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don't really remember a big fuss being made about most computers going to 64-bit as the de facto standard. Why is this?

[email protected] 1 points 11 months ago

It depends on what you mean by 64-bit computing, which is not the same thing as x86 becoming a 64-bit architecture.
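As a rough illustration (a minimal C sketch, not from the original comment): "64-bit" in the general sense refers to the native pointer/word width a program is built for, independent of whether the ISA underneath is x86_64, Alpha, SPARC64, etc.

    #include <stdio.h>

    int main(void) {
        /* The "bitness" of this build: native pointer width in bits.
           Prints 64 on any 64-bit platform, regardless of which ISA
           is underneath. */
        printf("pointer width: %zu bits\n", sizeof(void *) * 8);
        printf("long width:    %zu bits\n", sizeof(long) * 8);
        return 0;
    }

On a 32-bit x86 build the same program would report 32 for both.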

FWIW, 64-bit computing had been around in the supercomputer/mainframe space since the 70s (the Cray-1 used 64-bit words), and high-end microprocessors like the MIPS R4000 and DEC Alpha had supported 64-bit since the early 90s.

So by the time AMD introduced x86_64 there had already been about a quarter century of 64-bit computing ;-)

It was a big deal for x86 vendors, though, as that is when x86 took over most of the datacenter and workstation markets.