this post was submitted on 28 Nov 2023

Hardware

A place for quality hardware news, reviews, and intelligent discussion.

I was recently reading Tracy Kidder's excellent book Soul of a New Machine.

The author pointed out what a big deal the transition to 32-bit computing was.

However, in the last 20 years, I don't really remember a big fuss being made of most computers going to 64-bit as a de facto standard. Why is this?

[email protected] 1 points 9 months ago

Memory bus width != CPU bitness

Those two numbers describe totally different things.

For GPUs, the bitness number usually refers to the width of the memory bus, that is, how much data can be transferred to or from memory concurrently.

For CPUs, the bitness describes the native word size: the width of the general-purpose registers, and therefore of the pointers and integers the CPU processes in a single operation. With AVX-512, CPUs can additionally handle data vectors up to 512 bits long.

GPUs are in fact 64-bit processing units: the largest data type they are designed to handle is the 64-bit double-precision floating-point number.