this post was submitted on 28 Nov 2023

Hardware

A place for quality hardware news, reviews, and intelligent discussion.

I was recently reading Tracy Kidder's excellent book The Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don't really remember a big fuss being made of most computers going to 64-bit as a de facto standard. Why is this?
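For context on what the bit width actually changes: the headline difference is pointer width, and with it the addressable memory (4 GiB at 32 bits versus effectively unlimited at 64). A quick, illustrative check of the running machine's word size (my own sketch, not from the book):

```python
import ctypes
import sys

# Pointer width, in bits, of the running Python build:
ptr_bits = ctypes.sizeof(ctypes.c_void_p) * 8
print(ptr_bits)

# sys.maxsize tracks the platform's signed word size:
# 2**31 - 1 on a 32-bit build, 2**63 - 1 on a 64-bit one.
print(sys.maxsize == 2 ** (ptr_bits - 1) - 1)
```

On almost any desktop sold in the last decade this prints 64, which is part of why the transition felt invisible: it happened underneath unchanged software.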

[–] [email protected] 1 points 11 months ago (1 children)

Gee really? Here's me thinking 32-bit instruction sets were cosmetic. Thank you for ignoring the part where I said we're still in a transition phase.

Also, with a bit of tinkering, you can still run 16-bit applications. It's just recommended to use virtualisation software, because Microsoft doesn't ensure quality updates for 16-bit applications.
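For anyone curious how a 16-bit binary is even identified: Windows executables carry it in their headers. A DOS MZ file can point to a "new" header, which is NE for 16-bit Windows apps and PE for 32/64-bit ones. A minimal sketch (the function name and the synthetic header bytes below are mine, for illustration; real files have more edge cases):

```python
import struct

def exe_bitness(data: bytes) -> str:
    """Classify a DOS/Windows executable image by inspecting its headers."""
    if data[:2] != b"MZ":
        return "not an MZ executable"
    # Offset 0x3C holds the file offset of the 'new' header, if any.
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if e_lfanew + 6 > len(data):
        return "16-bit (plain DOS MZ)"
    if data[e_lfanew:e_lfanew + 2] == b"NE":
        return "16-bit (Windows NE)"
    if data[e_lfanew:e_lfanew + 4] == b"PE\x00\x00":
        # The Machine field follows the PE signature.
        (machine,) = struct.unpack_from("<H", data, e_lfanew + 4)
        return {0x014C: "32-bit (PE, i386)",
                0x8664: "64-bit (PE, x86-64)"}.get(machine, "PE, other machine")
    return "16-bit (plain DOS MZ)"

# Tiny synthetic headers, just enough structure to exercise the function:
ne = b"MZ" + b"\x00" * 58 + struct.pack("<I", 64) + b"NE" + b"\x00" * 8
pe64 = (b"MZ" + b"\x00" * 58 + struct.pack("<I", 64)
        + b"PE\x00\x00" + struct.pack("<H", 0x8664) + b"\x00" * 18)
print(exe_bitness(ne))    # 16-bit (Windows NE)
print(exe_bitness(pe64))  # 64-bit (PE, x86-64)
```

64-bit Windows dropped the NE loader entirely, which is why virtualisation or emulation is the route for those old binaries.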

[–] [email protected] 1 points 11 months ago (1 children)

The point is that seldom-used instructions are microcoded anyway, so they take essentially zero space on the CPU.

[–] [email protected] 1 points 11 months ago

I honestly don't know what you are talking about. CPUs aren't storage devices.