this post was submitted on 06 Oct 2023
49 points (86.6% liked)
Asklemmy
Today we have 64-bit computers (e.g. amd64), which descended from 32-bit computers (i386), which descended from 16-bit computers (Intel 8086), which descended from 8-bit computers (Intel 8008). Bit widths in our world naturally follow powers of two.
However, some 1960s computers used word sizes that weren't powers of two. Both IBM and DEC, among others, made 18- and 36-bit systems. What if computing had continued down that multiples-of-nine path instead of the powers-of-two one?
For one thing, hexadecimal is less common. If you're writing 9-, 18-, or 36-bit values, you typically write them in octal, not hex, since each octal digit maps to exactly three bits. (In our world, Unix permission modes are written in octal; Unix originated on the PDP-7, an 18-bit system.)
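A quick illustration in C of why octal is the natural fit (the 18-bit value here is arbitrary): six octal digits cover an 18-bit word exactly, while hex needs an awkward four and a half digits.

```c
#include <stdio.h>

int main(void) {
    unsigned int word = 0456123;      /* an arbitrary 18-bit value, written in octal */

    /* Six octal digits cover 18 bits exactly (3 bits per digit)...               */
    printf("octal: %06o\n", word);    /* octal: 456123 */

    /* ...but hex needs 4.5 digits, so the top digit only holds half a nibble.    */
    printf("hex:   %05X\n", word);    /* hex:   25C53  */

    return 0;
}
```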
IPv4 addresses are 36 bits wide instead of 32, and you write them in octal instead of decimal. `localhost` is `700.0.0.1`, and a typical LAN subnet mask is `777.777.777.0`.
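As a sketch of how that dotted-octal notation might be rendered (the `print_addr36` helper and the 36-bit constants are hypothetical, following the comment's examples):

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical helper: print a 36-bit address as four 9-bit fields,
 * each written in octal, mirroring the comment's dotted notation.   */
static void print_addr36(uint64_t addr) {
    printf("%o.%o.%o.%o\n",
           (unsigned)((addr >> 27) & 0777),
           (unsigned)((addr >> 18) & 0777),
           (unsigned)((addr >>  9) & 0777),
           (unsigned)( addr        & 0777));
}

int main(void) {
    print_addr36(0700000000001ULL);   /* loopback:            700.0.0.1     */
    print_addr36(0777777777000ULL);   /* 27-bit subnet mask:  777.777.777.0 */
    return 0;
}
```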
No hexadecimal means no `0xDEADBEEF` or `0xCAFEBABE` jokes. However, memory or files that get overwritten with junk are said to be "525'd", because binary `101010101...` is octal `525252...`.
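You can check that correspondence directly; a minimal C snippet that builds 18 bits of the alternating pattern and prints it in octal:

```c
#include <stdio.h>

int main(void) {
    unsigned int junk = 0;

    /* Build 18 bits of alternating 1s and 0s: 101010101010101010 */
    for (int i = 0; i < 9; i++)
        junk = (junk << 2) | 2;    /* append the bit pair "10" each pass */

    printf("octal: %o\n", junk);   /* octal: 525252 */
    return 0;
}
```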
`char` would be nine bits wide instead of eight. This affects the development of character sets.

In our world, the early 6-bit character codes (which had no room for lowercase) gave way to 7-bit ASCII. IBM then extended it to 8 bits with code pages for different European languages, creating 8-bit PC extended ASCII. However, no single code page supports all European languages, to say nothing of non-European ones. This led to the invention of multibyte character encodings and ultimately Unicode.
In the 9-bit world, multibyte characters are adopted earlier, using the high bit to indicate an extended character. Code pages don't get invented; mojibake never happens.
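The comment doesn't spell out the scheme, but here is one way the "high bit marks an extended character" idea could look in C; the `encode9` function, the two-byte format, and the 16-bit code space are all invented for illustration:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical 9-bit multibyte encoding:
 *  - code points below 0400 (octal 400 = 256) fit in one byte, high bit clear;
 *  - larger code points (up to 16 bits) take two bytes, each with the high
 *    bit set and carrying 8 payload bits.
 * A 9-bit "byte" is held in a uint16_t because C has no 9-bit type.          */
static int encode9(uint32_t cp, uint16_t out[2]) {
    if (cp < 0400) {                              /* 0xxxxxxxx            */
        out[0] = (uint16_t)cp;
        return 1;
    }
    out[0] = (uint16_t)(0400 | (cp >> 8));        /* 1 + high 8 bits      */
    out[1] = (uint16_t)(0400 | (cp & 0377));      /* 1 + low 8 bits       */
    return 2;
}

int main(void) {
    uint16_t buf[2];
    int n = encode9(0x03B1, buf);            /* GREEK SMALL LETTER ALPHA   */
    for (int i = 0; i < n; i++)
        printf("%03o ", (unsigned)buf[i]);   /* each 9-bit byte, in octal  */
    printf("\n");
    return 0;
}
```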
With a 36-bit `time_t`, the Year 2038 problem doesn't happen; `time_t` values don't wrap around until the year 3058!

A 3½" high-density floppy disk stores one megabyte of 9-bit bytes.
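A quick sanity check of that wraparound arithmetic (assuming the usual 1970 epoch and a signed counter, so half the bit range is usable):

```c
#include <stdio.h>
#include <stdint.h>

/* Seconds in an average Gregorian year (365.2425 days of 86400 seconds). */
#define SECS_PER_YEAR (365.2425 * 86400.0)

static void wrap_year(int bits) {
    /* A signed counter of this width wraps after 2^(bits-1) seconds. */
    double years = (double)((int64_t)1 << (bits - 1)) / SECS_PER_YEAR;
    printf("%2d-bit time_t: wraps %.1f years after 1970, in the year %d\n",
           bits, years, 1970 + (int)years);
}

int main(void) {
    wrap_year(32);   /* 32-bit time_t: wraps in the year 2038 */
    wrap_year(36);   /* 36-bit time_t: wraps in the year 3058 */
    return 0;
}
```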
I always wondered why floppies had an oddball storage capacity. TIL