this post was submitted on 04 Dec 2023

x86 came out in 1978,

21 years later, x64 came out in 1999.

By that spacing we're three years overdue for another shift, and I don't mean to ARM. Is there just no point to it? 128-bit computing is a thing, and according to Wikipedia it has been discussed since 1976. Why hasn't it been widely adopted by now?

[–] [email protected] 1 points 1 year ago

The 32-bit limit was a real constraint: 32-bit addresses cap you at 4 GiB of memory, which machines actually hit. 64-bit addresses allow 16 exabytes, so there is no comparable pressure. Also, modern architectures already compute on 128-bit data in parallel via SIMD registers (say, as 4x32-bit lanes), so it's really just a matter of how software chooses to represent that data. Any genuine need for 128-bit arithmetic can be emulated with pairs of 64-bit operations, and it's unlikely you need to process such data at the limit of a 2023-tier processor anyway. If anything, machine learning is pulling in the opposite direction, preferring faster hardware at half precision (https://en.wikipedia.org/wiki/Half-precision_floating-point_format)
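
To make the SIMD point concrete, here's a minimal C sketch using the SSE2 intrinsics that every x86-64 CPU supports; x86 has been doing 128-bit-wide operations like this since 2000 (and 256/512-bit with AVX), without anyone calling it "128-bit computing":

```c
// Minimal sketch: SSE2 operates on 128-bit registers treated as packed
// lanes -- here, four 32-bit integer adds in a single instruction.
// Compile with: cc -O2 simd.c   (SSE2 is baseline on x86-64)
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    __m128i a = _mm_set_epi32(4, 3, 2, 1);     // one 128-bit register, four 32-bit lanes
    __m128i b = _mm_set_epi32(40, 30, 20, 10);
    __m128i sum = _mm_add_epi32(a, b);          // four 32-bit adds at once

    int out[4];
    _mm_storeu_si128((__m128i *)out, sum);
    printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);  // 11 22 33 44
    return 0;
}
```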
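And for the emulation point: GCC and Clang already offer an `__int128` type on 64-bit targets, which the compiler lowers to pairs of 64-bit instructions. Here's a hand-rolled sketch of that same add-with-carry pattern (the `u128` struct is illustrative, not a standard type):

```c
// Sketch of 128-bit addition emulated with two 64-bit halves -- the same
// add-with-carry pattern compilers emit for __int128.
#include <stdint.h>
#include <stdio.h>

typedef struct { uint64_t lo, hi; } u128;

u128 u128_add(u128 a, u128 b) {
    u128 r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo);  // carry out of the low 64 bits
    return r;
}

int main(void) {
    u128 x = { UINT64_MAX, 0 };          // 2^64 - 1
    u128 y = { 1, 0 };
    u128 z = u128_add(x, y);             // = 2^64: low word wraps, carry into high
    printf("hi=%llu lo=%llu\n",
           (unsigned long long)z.hi, (unsigned long long)z.lo);  // hi=1 lo=0
    return 0;
}
```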
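Finally, the half-precision trade-off in one example. This assumes a recent GCC or Clang on x86-64, where `_Float16` is available as an extension; the point is that fp16's short significand gives up exactness above 2048 in exchange for halved memory traffic and greater throughput on hardware that supports it:

```c
// fp16 has a 10-bit stored significand (11 effective bits), so integers
// above 2^11 = 2048 can no longer be represented exactly.
#include <stdio.h>

int main(void) {
    _Float16 big = (_Float16)2048.0f;
    _Float16 sum = big + (_Float16)1.0f;  // 2049 isn't representable in fp16
    printf("%g\n", (double)sum);          // prints 2048: the +1 rounds away
    return 0;
}
```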