this post was submitted on 04 Dec 2023
Hardware
A place for quality hardware news, reviews, and intelligent discussion.
I'll answer your question with a question: What are you doing that requires 128-bit computations?
After that, a follow-up question: Is it so important that you're willing to cut your effective RAM in half to do it?
Why would it be cutting your effective RAM in half? I know very little about hardware/software architecture and all that.
Imagine we have an 8-bit (1-byte) architecture, so data is stored and processed in 8-bit chunks.
If our RAM holds 256 bits, we can store 32 pieces of data in that RAM (256/8).
If we change to a 16-bit architecture, that same physical RAM now only has the capacity to hold 16 values (256/16). The values can be significantly bigger, but we get fewer of them.
Bits don't appear out of nothing; they take physical space, and there is a cost to creating them. So we have a tradeoff between the number of values we can store and the size of each value.
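If it helps, here's a quick Python sketch of that arithmetic. The 256-bit RAM and the word sizes are just the toy numbers from this thread, not anything real:

```python
# Toy numbers from this thread: 256 bits of RAM, split into fixed-size words.
# Bigger words mean each one can hold a bigger value, but fewer of them fit.
RAM_BITS = 256

for word_bits in (8, 16, 64, 128):
    slots = RAM_BITS // word_bits
    print(f"{word_bits:>3}-bit words: {slots} fit in {RAM_BITS} bits of RAM")
```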
For reference, per chunk (or "word") of data:
With 8 bits, we can hold 256 values.
With 64 bits, we can hold 18,446,744,073,709,551,616 values.
With 128 bits, we can hold 340,282,366,920,938,463,463,374,607,431,768,211,456 values.
(For X bits, it's 2^X)
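If you want to double-check those numbers yourself, a couple of lines of Python will print them exactly (word sizes picked to match the list above):

```python
# 2 ** X distinct values per X-bit word. Python ints are arbitrary precision,
# so even the 128-bit case prints out exactly.
for word_bits in (8, 64, 128):
    print(f"{word_bits:>3} bits -> {2 ** word_bits:,} distinct values")
```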
Maybe one day we'll get there, but for now, 64 bits seems to be enough for at least consumer-grade computations.
Oh for fuck's sake, I replied to a bot.
To the dev that's spamming Lemmy with this garbage: You aren't making Lemmy better. You're actively making it a worse experience.