
Hexadecimal: Shortening Long Binaries
Why do color codes (#FFFFFF) mix letters and numbers? The greatest gift hexadecimal gives developers is readability.

I was staring at a crash dump when I first encountered 0xDEADBEEF. My brain froze. Is this a real address or some kind of joke? It wasn't a number, wasn't a word—just this bizarre mix of letters and digits. I spent way too long Googling it before I understood. It was hexadecimal, and someone had deliberately planted it as a marker. DEAD BEEF. A developer's subtle sense of humor.
That moment sparked my curiosity. Why hexadecimal? Why mix numbers and letters? The CSS color codes like #FFFFFF, the memory addresses like 0x7ffee4b3c8d0—they all used hex. I wanted to truly understand this system, not just memorize conversion formulas.
Computers run on electrical signals. On (1) or off (0). So they store everything in binary. Logically perfect. But for human eyes? Absolute torture.
Take the decimal number 255. In binary, it's 11111111. Eight digits. Not too bad. But what about 65535?
1111111111111111
Sixteen digits. To read this, you have to chunk it mentally: "1111, 1111, 1111, 1111... ugh, forget it."
What about a 32-bit memory address?
11010101111010101010101010101010
Thirty-two digits of chaos. To anyone trying to read it, this might as well be encrypted. Imagine debugging and comparing addresses like these. Pure misery.
Computers love binary. But we're humans. We needed a summary format. That's where hexadecimal comes in.
Hexadecimal's core trick is brilliantly simple: compress 4 binary digits into exactly 1 character.
Why 4 bits? Because 2^4 = 16. Hexadecimal represents 0 through 15 in a single digit. Writing 15 in binary takes 1111 (4 bits). It fits perfectly.
Looking at this mapping table, I had my "aha" moment:
| Binary | Decimal | Hexadecimal |
|---|---|---|
| 0000 | 0 | 0 |
| 0001 | 1 | 1 |
| 0010 | 2 | 2 |
| 0011 | 3 | 3 |
| 0100 | 4 | 4 |
| 0101 | 5 | 5 |
| 0110 | 6 | 6 |
| 0111 | 7 | 7 |
| 1000 | 8 | 8 |
| 1001 | 9 | 9 |
| 1010 | 10 | A |
| 1011 | 11 | B |
| 1100 | 12 | C |
| 1101 | 13 | D |
| 1110 | 14 | E |
| 1111 | 15 | F |
Decimal only has 10 digits (0-9), so anything from 10 onward needs two digits. Hexadecimal needs to represent up to 15 in a single character, so it borrows letters A through F. This felt weird at first, but now it's second nature.
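If you want to sanity-check the table yourself, a short loop regenerates all sixteen rows (a minimal sketch in JavaScript):
for (let n = 0; n < 16; n++) {
  const bin = n.toString(2).padStart(4, '0'); // 4-bit binary
  const hex = n.toString(16).toUpperCase();   // single hex digit
  console.log(`${bin} -> ${n} -> ${hex}`);
}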
Now let's convert that 32-digit binary nightmare into hex:
Binary: 11010101 11101010 10101010 10101010
Hexadecimal: D5 EA AA AA
Result: 0xD5EAAAAA
From 32 digits down to 8. A 4x compression. Suddenly it's readable. Patterns emerge. Notice the repeating AA? You'd never spot that in binary.
Binary → hexadecimal is the easiest conversion. Chunk from the right in groups of 4 bits, then convert each group using the table.
Example: Convert 11111010 to hex.
1111 1010
F A
= 0xFA
If the number of bits isn't divisible by 4, pad with zeros on the left.
Example: Convert 1010111 to hex.
0101 0111
5 7
= 0x57
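Here's that chunking method as code (a minimal sketch; binToHex is my own name for it):
function binToHex(bin) {
  // Pad on the left so the length is a multiple of 4
  const padded = bin.padStart(Math.ceil(bin.length / 4) * 4, '0');
  let hex = '';
  for (let i = 0; i < padded.length; i += 4) {
    // Convert each 4-bit group using the mapping table
    hex += parseInt(padded.slice(i, i + 4), 2).toString(16).toUpperCase();
  }
  return '0x' + hex;
}
console.log(binToHex('11111010')); // "0xFA"
console.log(binToHex('1010111'));  // "0x57"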
Hexadecimal → binary is the reverse process: each hex digit expands to 4 binary bits.
Example: Convert 0x3C to binary.
3 C
0011 1100
= 00111100
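In code, the expansion is one line per digit (a minimal sketch; hexToBin is my own name):
function hexToBin(hex) {
  // Each hex digit expands to exactly 4 bits
  return [...hex].map(d => parseInt(d, 16).toString(2).padStart(4, '0')).join('');
}
console.log(hexToBin('3C')); // "00111100"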
Decimal → hexadecimal: repeatedly divide by 16 and read the remainders backward.
Example: Convert 255 to hex.
255 ÷ 16 = 15 remainder 15 (F)
15 ÷ 16 = 0 remainder 15 (F)
Read backward: 0xFF
This is tedious by hand. Code does it way faster.
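If you're curious what that division loop looks like spelled out, here's a sketch (decToHex is my own name; Number's built-in toString(16) does the same job):
function decToHex(n) {
  if (n === 0) return '0x0';
  const digits = '0123456789ABCDEF';
  let hex = '';
  while (n > 0) {
    // The remainder is the next hex digit, read right to left
    hex = digits[n % 16] + hex;
    n = Math.floor(n / 16);
  }
  return '0x' + hex;
}
console.log(decToHex(255)); // "0xFF"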
Hexadecimal → decimal: multiply each digit by the appropriate power of 16 and sum the results.
Example: Convert 0x2F to decimal.
2 × 16¹ + F × 16⁰
= 2 × 16 + 15 × 1
= 32 + 15
= 47
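Here's that positional sum as code (a minimal sketch; hexToDec is my own name for it):
function hexToDec(hex) {
  let value = 0;
  for (const d of hex) {
    // Shift previous digits up one place (x16), then add this digit
    value = value * 16 + parseInt(d, 16);
  }
  return value;
}
console.log(hexToDec('2F')); // 47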
Manual calculation is annoying. JavaScript's parseInt() handles it instantly.
I didn't truly get it until I wrote conversion code myself.
// Decimal → Hexadecimal
const decimal = 255;
const hex = decimal.toString(16);
console.log(hex); // "ff"
console.log('0x' + hex.toUpperCase()); // "0xFF"
// Hexadecimal → Decimal
const hexValue = 'FF';
const decimalValue = parseInt(hexValue, 16);
console.log(decimalValue); // 255
// Binary → Hexadecimal (via decimal)
const binary = '11111111';
const decFromBin = parseInt(binary, 2);
const hexFromBin = decFromBin.toString(16);
console.log(hexFromBin); // "ff"
// Dissecting a color code
const color = '#FF5733';
const r = parseInt(color.slice(1, 3), 16); // FF → 255
const g = parseInt(color.slice(3, 5), 16); // 57 → 87
const b = parseInt(color.slice(5, 7), 16); // 33 → 51
console.log(`RGB(${r}, ${g}, ${b})`); // "RGB(255, 87, 51)"
// Using 0x notation directly
const address = 0xDEADBEEF;
console.log(address); // 3735928559 (prints as decimal)
console.log(address.toString(16)); // "deadbeef"
Running this code made it click: "Oh, these are all the same number internally." Hex, binary, decimal—they're just different notations for identical values.
This is the use case I hit most often: CSS color definitions.
.button {
  background-color: #FF5733; /* orangish tone */
  color: #FFFFFF; /* white */
  border: 1px solid #000000; /* black */
}
Breaking down #FF5733:
FF → 255 red, 57 → 87 green, 33 → 51 blue.
Strong red with a touch of green creates an orange tone. Six characters in hex. Way shorter than rgb(255, 87, 51).
CSS even supports shorthand:
.box {
  background: #FFF; /* same as #FFFFFF */
  color: #000; /* same as #000000 */
}
#FFF is shorthand for #FFFFFF: each digit gets doubled. The short form only works when both digits of each channel are identical.
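A tiny helper makes the doubling rule explicit (a sketch; expandHex is a made-up name):
function expandHex(short) {
  // '#FFF' -> '#FFFFFF': double each digit
  return '#' + [...short.slice(1)].map(d => d + d).join('');
}
console.log(expandHex('#FFF')); // "#FFFFFF"
console.log(expandHex('#F53')); // "#FF5533"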
Open any debugger and memory addresses are in hex:
0x7ffee4b3c8d0
0x00007fff5fc01000
0x0000000100003f20
Why hex? Memory aligns in bytes, and 1 byte = 8 bits = exactly 2 hex digits. When addresses read 0x1000, 0x1001, 0x1002, you instantly recognize "these are consecutive memory locations." Binary would obscure that pattern completely.
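You can watch that byte-by-byte stepping in a toy loop (the base address here is made up):
const base = 0x1000;
for (let i = 0; i < 4; i++) {
  // Consecutive byte addresses differ by exactly 1
  console.log('0x' + (base + i).toString(16)); // prints 0x1000 through 0x1003
}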
Network card hardware addresses:
00:1A:2B:3C:4D:5E
Six pairs of hex digits separated by colons. Each pair is 1 byte (8 bits). Total 48 bits.
Why hex? Writing it as 00000000:00011010:00101011:... would be insane.
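Parsing a MAC address into its six bytes is a one-liner (a minimal sketch):
const mac = '00:1A:2B:3C:4D:5E';
const bytes = mac.split(':').map(h => parseInt(h, 16));
console.log(bytes); // [0, 26, 43, 60, 77, 94]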
Emoji and special character values are expressed in hex:
// 💀 (skull emoji)
console.log('💀'.codePointAt(0).toString(16)); // "1f480"
// JavaScript Unicode escape
const skull = '\u{1F480}';
console.log(skull); // 💀
The first few bytes of a file identify its type. These "magic numbers" look like this in a hex editor:
PNG: 89 50 4E 47
JPEG: FF D8 FF
GIF: 47 49 46 38
ZIP: 50 4B 03 04
In ASCII, 50 4E 47 spells "PNG" (the leading 0x89 byte is non-printable, which is why hex editors show the header as .PNG). Programs check these headers to determine file types.
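A file-type check is just a byte comparison against the signature (a minimal sketch; isPng is my own helper name):
function isPng(bytes) {
  // PNG files start with 89 50 4E 47
  const magic = [0x89, 0x50, 0x4E, 0x47];
  return magic.every((b, i) => bytes[i] === b);
}
console.log(isPng(new Uint8Array([0x89, 0x50, 0x4E, 0x47]))); // true
console.log(isPng(new Uint8Array([0xFF, 0xD8, 0xFF])));       // false (that's JPEG)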
When analyzing binary files, you use hex dumps:
00000000: 7f45 4c46 0201 0100 0000 0000 0000 0000 .ELF............
00000010: 0300 3e00 0100 0000 5010 0000 0000 0000 ..>.....P.......
Left column: address (hex). Middle: data bytes (hex). Right: ASCII interpretation. In binary, this would be completely unreadable.
Octal (base-8) exists too. Uses digits 0-7. Unix file permissions use it:
chmod 755 file.sh
# 7 = 111 (rwx)
# 5 = 101 (r-x)
# 5 = 101 (r-x)
Octal compresses 3 binary bits into 1 digit. But here's the problem: 1 byte is 8 bits, which doesn't divide evenly by 3. For memory addresses or color codes, octal is awkward. Hexadecimal wins decisively: 4 bits = 1 hex digit, 8 bits = 2 hex digits. Perfect alignment.
That's why octal barely gets used anymore. Just Unix permissions remain.
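JavaScript even has an 0o prefix for octal literals, which makes the 3-bits-per-digit relationship easy to poke at (a quick sketch):
const perms = 0o755;
console.log(perms.toString(2)); // "111101101" (rwx r-x r-x)
console.log(perms.toString(8)); // "755"
console.log(perms);             // 493 in decimal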
In code, hex numbers get a 0x prefix:
const num = 0xFF; // 255
const addr = 0xDEADBEEF;
Why 0x? It comes from C. Numbers starting with 0 were octal, 0x meant hexadecimal. This convention stuck.
CSS uses # instead:
color: #FF5733;
URL encoding uses %:
https://example.com/search?q=Hello%20World
%20 is the space character's ASCII code (32) written in hex.
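encodeURIComponent does this hex translation for you (a quick sketch):
console.log(encodeURIComponent('Hello World')); // "Hello%20World"
console.log(' '.charCodeAt(0).toString(16));    // "20" (space is 0x20)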
Watch out when using hex literals in JavaScript:
const a = 0xFF; // works: 255
const b = 0XFF; // works: 255 (capital X ok)
const c = 0xff; // works: 255 (lowercase f ok)
// But strings are different
parseInt('FF', 16); // 255
parseInt('0xFF', 16); // 255
parseInt('FF', 10); // NaN (can't parse as decimal)
parseInt() needs the second argument to specify the radix. Without it, parseInt assumes decimal unless the string starts with 0x:
parseInt('FF'); // NaN
parseInt('10'); // 10 (decimal)
parseInt('10', 16); // 16 (hexadecimal)
Developers use meaningful hex values as debug markers:
0xDEADBEEF: "dead beef" - memory deallocation marker
0xCAFEBABE: Java class file magic number
0xFEEDFACE: macOS Mach-O file header
0xBADC0FFE: "bad coffee" - error code
0xDEADC0DE: "dead code" - unused code marker
Seeing these makes me smirk. It's a hex privilege; you can't spell words like this in decimal.
Here's a simple hex dump function I wrote:
function hexDump(buffer) {
  const bytes = new Uint8Array(buffer);
  let output = '';
  for (let i = 0; i < bytes.length; i += 16) {
    // Address (8-digit hex)
    const addr = i.toString(16).padStart(8, '0');
    output += addr + ': ';
    // Print 16 bytes
    for (let j = 0; j < 16; j++) {
      if (i + j < bytes.length) {
        const byte = bytes[i + j].toString(16).padStart(2, '0');
        output += byte + ' ';
      } else {
        output += '   '; // three spaces keep the ASCII column aligned
      }
      if (j === 7) output += ' '; // middle separator
    }
    // ASCII representation
    output += ' |';
    for (let j = 0; j < 16; j++) {
      if (i + j < bytes.length) {
        const byte = bytes[i + j];
        // Show printable ASCII only
        output += (byte >= 32 && byte < 127)
          ? String.fromCharCode(byte)
          : '.';
      }
    }
    output += '|\n';
  }
  return output;
}
// Usage
const data = new TextEncoder().encode('Hello, Hex World!');
console.log(hexDump(data));
// Output:
// 00000000: 48 65 6c 6c 6f 2c 20 48  65 78 20 57 6f 72 6c 64  |Hello, Hex World|
// 00000010: 21                                                 |!|
Building this made me think, "Oh, so that's why binary editors look this way."
Hexadecimal isn't for computers. It's a translator for humans. Computers still only speak 0s and 1s. But someone built this friendly summary so our brains don't melt staring at endless binary strings.
When I see #FFFFFF, I instantly know "white." When I spot 0xDEADBEEF, I recognize "someone planted a joke here." When I read 0xFF, I think "255, maximum value." All these insights come from hex's gift of readability.
I've grown fond of hexadecimal. It's the tool that compresses ugly binary to a quarter of its length, reveals patterns, and lets us embed meaning. Learning it felt like picking up a new language as a developer. With this language I read memory, mix colors, and analyze file headers, things I couldn't imagine doing without hex.