I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
It is only a mistake from a human point of view. Powers of two are more natural for the chip: addressing 1000 bytes and addressing 1024 bytes both take the same 10 address bits, so the hardware may as well use the full 1024. But humans find anything that isn’t base 10 difficult.
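To make that concrete, here’s a minimal sketch (Python, numbers purely illustrative): each extra address bit doubles the reachable range, so capacities naturally land on powers of two.

```python
# A minimal sketch of why power-of-two sizes fall out of binary
# addressing: n address bits span exactly 2**n distinct byte addresses.
for bits in (10, 20, 30):
    print(f"{bits} address bits -> {2**bits:>13,} addressable bytes")
# 10 address bits ->         1,024 addressable bytes
# 20 address bits ->     1,048,576 addressable bytes
# 30 address bits -> 1,073,741,824 addressable bytes
```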
It’s not really about how much space the numbers need inside the computer; it’s about unit prefixes, where kilo- has meant 1000 all along.
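That prefix mismatch is also where the familiar drive-size confusion comes from. A rough sketch (Python, with a hypothetical drive size):

```python
# A rough sketch of the prefix mismatch: a drive sold as "1 TB" uses the
# SI prefix (10**12 bytes), while an OS that reports binary units divides
# by 2**40 bytes per TiB, so the very same drive shows up smaller.
advertised_bytes = 1_000_000_000_000      # 1 TB, as the label means it
reported_tib = advertised_bytes / 2**40   # what a binary-units OS shows
print(f"{reported_tib:.3f} TiB")          # -> 0.909 TiB
```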
Which are exactly what’s used to quote drive space. What is your point?