Big endian naming: open_file
Little endian naming: lefi_enop

If you understand this joke, you’re a super nerd.


in reply to Maddie, Wizard of Installs 🪄

@Maddie, Wizard of Installs 🪄 I remember being rather confused when I first discovered the concept of endianness. I was teaching myself x86 assembly using MS-DOS's debug program and was baffled by how it pushed 16-bit registers onto the stack.
in reply to Jonathan Lamothe

@me I think that’s probably a pretty universal experience, at least among the subset of people who have written assembly / worked directly with an architecture’s endianness.

It took a lot of 6502 ASM to stop being utterly bewildered every time I looked at machine code or an emulator’s memory monitor.

in reply to Maddie, Wizard of Installs 🪄

@Maddie, Wizard of Installs 🪄 so much confusion stemming from the fact that we adopted a numbering system from a language whose text is meant to be read from right to left.
in reply to Jonathan Lamothe

@me Do you have a source for the idea that big vs little endian has anything to do with the order of digits in Arabic numbers? I think those are two completely independent concepts.
in reply to Jim Rea

@Jim Rea Not exactly, but I'm given to understand it's the reason you right-justify figures when performing a sum, for instance.
in reply to Jonathan Lamothe

@me No, you use right-justified figures when performing sums so that digits with the same order of magnitude are aligned vertically. Trying to do addition on left-justified numbers would be madness! I’m old enough that when I was a kid, all sums were done manually, without a calculator. Maybe future generations will forget the reason for this.
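To see why, here's a quick worked example (my illustration, not from the post). Right-justified, the units, tens, and hundreds columns line up; left-justified, the same digits pair up wrongly:

```
right-justified:
    407
  +  35
  -----
    442

left-justified (the madness):
    407
  + 35
  -----
    757    <- the 35 was effectively treated as 350
```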
in reply to Jonathan Lamothe

@me Some CPU designers felt that low-order bits should be stored at lower addresses. On 8-bit machines, bits 0-7 would be stored in byte 0, bits 8-15 in byte 1, etc. This was more convenient for calculations and slightly reduced the number of transistors, but it's backwards for humans reading numbers in a memory dump. It also meant that the low-order bits of 8-, 16- and 32-bit values would always be stored in the low byte. Some software took advantage of this, for example writing an 8-bit value and later reading it back as a 16-bit value (of course causing endian problems if the code was ever moved to a big-endian processor).

Other CPU designers wanted memory dumps in “human” order (big endian), even at the cost of some extra transistors. None of this had anything to do with Arabic numbers or left-vs-right writing systems.
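To make that 8-bit-write / 16-bit-read trick concrete, here's a minimal C sketch (my illustration, not Jim's code; the "it just works" case assumes a little-endian host):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    uint8_t buf[2] = {0, 0};
    buf[0] = 0x2A;             /* store an 8-bit value at the lower address */

    uint16_t wide;
    memcpy(&wide, buf, 2);     /* later, re-read the same bytes as 16 bits */

    /* Little endian: wide == 0x002A -- the trick works, because the 8-bit
       value lands in the low byte of the 16-bit read.
       Big endian:    wide == 0x2A00 -- the same code silently breaks. */
    printf("wide = 0x%04X\n", wide);
    return 0;
}
```

The code never states its endianness assumption anywhere, which is exactly why porting it to a big-endian CPU fails silently.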
in reply to Jim Rea

@Jim Rea So I decided to look more deeply into this. I wanted to learn about how numbers work in Arabic. This is what I've found after a cursory search:

The number 23 is read essentially as "three and twenty", which at a glance seems to corroborate the idea of reading the least-significant digit first; however, the pattern breaks once you get into numbers larger than 100, so... 🤷‍♂️

Source: storylearning.com/learn/arabic…

I found the following excerpt interesting. Interpret it as you see fit:

One thing to flag here is that even though the Arabic script is written from right to left, it switches to being written from left to right when you’re writing the numbers.

It feels odd to me that they'd "reverse" the order of the digits like that, especially when it's their own counting system.

This is still conjecture on my part though.

in reply to Jonathan Lamothe

@me That’s really interesting, thanks for the link. But I think it makes me even more strongly convinced that big/little endian issues have nothing to do with Arabic, but rather are an entirely new concept from the 1960s/70s.
in reply to Maddie, Wizard of Installs 🪄

Amusingly, I was having exactly this debate recently, and ended up having to write a paragraph about how little-endian does *not* mean that the bits within a byte go from most to least significant while the bytes go from least to most. It can often feel that way, though, because hex editors show each byte as two big-endian hex digits, creating the illusion of a mixed ordering.
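To make the "illusion" concrete, here's a small C sketch (mine, not from the thread) that prints the bytes of a 32-bit value in address order, the way a hex editor would:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t value = 0x12345678;
    const uint8_t *bytes = (const uint8_t *)&value;

    /* On a little-endian machine this prints: 78 56 34 12
       The bytes appear least-significant first, but each byte still
       shows its high nibble first -- that's just how hex digits are
       written, not how the bits are stored. */
    for (int i = 0; i < 4; i++)
        printf("%02X ", (unsigned)bytes[i]);
    printf("\n");
    return 0;
}
```

The byte order and the digit order come from two different conventions, which is why dumps look "mixed" without any bit reversal actually happening.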

Fun related fact I discovered: Ethernet sends least-significant bit first, so IP over Ethernet is mixed 🤪
