
Logic Gates: AND, OR, NOT, XOR
How do computers make decisions? Even complex AI is just a combination of these 4 gates.


When I first learned programming, I'd write code like if (a && b) and wonder, "How does the computer actually understand this?" The AND and OR operators we use so naturally in software actually exist as physical circuits in hardware. That blew my mind.
I just assumed "the CPU magically figures it out," but it turns out the "magic" is actually logic gates built from transistor combinations. Electrical signals (0 or 1) open and close circuits, and that's how "true/false" gets decided.
It took me a while to wrap my head around this. Saying "computers think" is kind of an exaggeration. Really, they're just rapidly flipping electric switches on and off according to rules.
A Logic Gate is a circuit that takes electrical signals (0 or 1) as input and produces output (0 or 1) based on fixed rules. I came to understand them as the physical embodiment of conditional statements in code.
At first I thought, "Isn't this just an electric switch?" But when you combine these gates, you can do addition, subtraction, comparison, storage—literally everything a computer does. That shocked me.
AND Gate
Rule: Output is 1 only if both inputs are 1.
Truth Table:
| A | B | Output |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 1 | 1 | 1 |
Analogy: A vault that requires two keys turned simultaneously. Missing even one key means it won't open.
In Code:
```javascript
// JavaScript
if (user.isLogin && user.hasTicket) {
  console.log("Entry granted");
}

// Direct bitwise operation
let a = 1, b = 1;
console.log(a & b); // 1 (AND operation)

a = 1; b = 0;
console.log(a & b); // 0
```
How with Transistors?: Connect two transistors in series. Like installing two faucets in a pipe. To get water (1), you must open both faucets.
OR Gate
Rule: Output is 1 if at least one input is 1.
Truth Table:
| A | B | Output |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 1 |
Analogy: A door lock. Enter the password OR swipe a keycard—either one opens the door.
In Code:
```javascript
if (isWeekend || isHoliday) {
  console.log("Day off");
}

let a = 1, b = 0;
console.log(a | b); // 1 (OR operation)

a = 0; b = 0;
console.log(a | b); // 0
```
How with Transistors?: Connect two transistors in parallel. The pipe splits into two paths—opening either faucet lets water through.
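To make the series-versus-parallel picture concrete, here is a toy sketch of my own (not a real circuit simulation) that models each transistor as a switch function: chaining switches in series behaves like AND, and joining them in parallel behaves like OR.

```javascript
// Toy model: a transistor is a switch that passes the signal only when its gate input is 1
const transistor = (gate) => (signal) => (gate === 1 ? signal : 0);

// Series: the signal must pass through BOTH switches -> AND
const series = (a, b) => transistor(b)(transistor(a)(1));

// Parallel: the signal can pass through EITHER switch -> OR
const parallel = (a, b) => transistor(a)(1) | transistor(b)(1);

console.log(series(1, 1), series(1, 0));     // 1 0
console.log(parallel(0, 1), parallel(0, 0)); // 1 0
```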
NOT Gate
Rule: If input is 0, output is 1. If input is 1, output is 0. The contrarian gate.
Truth Table:
| A | Output |
|---|---|
| 0 | 1 |
| 1 | 0 |
Analogy: An emergency stop button. Normally (input 0) the current flows (output 1), but pressing it (input 1) cuts the flow (output 0).
In Code:
```javascript
if (!isPaid) {
  console.log("Not paid");
}

let a = 1;
console.log(~a); // -2 (bitwise NOT, complement)
console.log(!a); // false (logical NOT)
```
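Because `~` in JavaScript operates on the full 32-bit integer (which is why `~1` gives -2), a common trick for flipping a single 0/1 bit is XOR with 1, using the operator covered in the next section. A quick sketch:

```javascript
// For a single 0/1 value, XOR with 1 acts as a one-bit NOT
let bit = 1;
console.log(bit ^ 1); // 0
console.log(0 ^ 1);   // 1
```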
XOR Gate
Rule: Output is 1 only when the inputs are different. Same inputs produce 0.
Truth Table:
| A | B | Output |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |
Analogy: Two people flip coins. You win only if one shows heads and the other tails. Both heads or both tails means no prize.
In Code:
```python
# Python
a, b = 1, 0
print(a ^ b)  # 1 (XOR operation)

a, b = 1, 1
print(a ^ b)  # 0

# Commonly used in encryption
message = 0b1010
key = 0b1100
encrypted = message ^ key    # 0b0110
decrypted = encrypted ^ key  # 0b1010 (original restored)
```
Use Cases: XOR is heavily used in encryption, parity checks, and data integrity verification. XORing with the same key twice restores the original value.
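As a quick illustration of the parity-check use, here is a small sketch (a toy example, not any specific protocol) that folds a list of bits with XOR:

```javascript
// Toy parity check: XOR-fold the bits; the result is 1 if an odd number of bits are 1
function parityBit(bits) {
  return bits.reduce((acc, bit) => acc ^ bit, 0);
}

const data = [1, 0, 1, 1];      // three 1s -> parity bit is 1
const parity = parityBit(data); // 1

// The receiver re-computes parity over data + parity bit; 0 means no single-bit error
console.log(parityBit([...data, parity])); // 0
```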
NAND and NOR Gates
NAND = NOT + AND: it inverts the AND result. NOR = NOT + OR: it inverts the OR result.
NAND Truth Table:
| A | B | Output |
|---|---|---|
| 0 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |
NOR Truth Table:
| A | B | Output |
|---|---|---|
| 0 | 0 | 1 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 1 | 1 | 0 |
Mind-blowing fact: With just NAND gates, you can build every other gate (AND, OR, NOT, XOR). Same with NOR.
That's why CPU designers often use NAND as the fundamental building block—it's cheap and efficient to manufacture. The NAND Flash in your USB drive or SSD is called "NAND" because it uses this structure as its basic unit.
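Here is a small sketch of that universality claim in code (the helper functions are my own illustration, not a standard library): every gate below is built from nothing but a `nand` function.

```javascript
const nand = (a, b) => (a & b) ^ 1;           // 0 only when both inputs are 1

const not = (a) => nand(a, a);                // NOT(a) = NAND(a, a)
const and = (a, b) => not(nand(a, b));        // AND    = NOT(NAND)
const or  = (a, b) => nand(not(a), not(b));   // OR, via De Morgan's law
const xor = (a, b) => {
  const n = nand(a, b);
  return nand(nand(a, n), nand(b, n));        // the classic 4-NAND XOR
};

console.log(and(1, 1), or(0, 1), not(1), xor(1, 1)); // 1 1 0 0
```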
When I first learned about logic gates, I wondered, "What's the point?" Then I discovered you can combine them to do arithmetic operations. That was a revelation.
Half Adder
A circuit that adds two 1-bit numbers.
Example: 1 + 1 = 10 (binary)
Circuit Design:
```javascript
// Half adder in code
function halfAdder(a, b) {
  const sum = a ^ b;   // XOR
  const carry = a & b; // AND
  return { sum, carry };
}

console.log(halfAdder(1, 1)); // { sum: 0, carry: 1 }
console.log(halfAdder(1, 0)); // { sum: 1, carry: 0 }
```
Full Adder
A half adder can't handle the carry coming in from a previous digit. A full adder takes three inputs (A, B, Carry-in) and produces a sum and a carry-out.
Circuit Design:
```javascript
function fullAdder(a, b, carryIn) {
  const firstSum = a ^ b;
  const firstCarry = a & b;
  const sum = firstSum ^ carryIn;
  const secondCarry = firstSum & carryIn;
  const carryOut = firstCarry | secondCarry;
  return { sum, carryOut };
}

console.log(fullAdder(1, 1, 1)); // { sum: 1, carryOut: 1 } (1+1+1 = 11 in binary)
```
Realization: When a CPU adds numbers, it's just chaining these full adders together. A 32-bit addition uses 32 full adders in sequence. That's the heart of the ALU (Arithmetic Logic Unit).
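To make that concrete, here is a small sketch (my own illustration, not how any particular CPU is wired) that chains the `fullAdder` function above into a 4-bit ripple-carry adder:

```javascript
// 4-bit ripple-carry adder: each stage's carry-out feeds the next stage's carry-in
function rippleCarryAdder(aBits, bBits) {
  // aBits and bBits are arrays of bits, least significant bit first
  const sumBits = [];
  let carry = 0;
  for (let i = 0; i < aBits.length; i++) {
    const { sum, carryOut } = fullAdder(aBits[i], bBits[i], carry);
    sumBits.push(sum);
    carry = carryOut;
  }
  return { sumBits, carry };
}

// 6 (0110) + 3 (0011) = 9 (1001), bits listed LSB-first
console.log(rippleCarryAdder([0, 1, 1, 0], [1, 1, 0, 0]));
// { sumBits: [ 1, 0, 0, 1 ], carry: 0 }
```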
When working with logic gates, you often want to simplify circuits. That's where Boolean algebra comes in.
De Morgan's Laws:
NOT(A AND B) = NOT(A) OR NOT(B)
NOT(A OR B) = NOT(A) AND NOT(B)
Verification in Code:
```javascript
let a = true, b = false;

// Law 1
console.log(!(a && b) === (!a || !b)); // true

// Law 2
console.log(!(a || b) === (!a && !b)); // true
```
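The snippet above only checks one input combination; a quick way to convince yourself is to loop over all four (a small sketch):

```javascript
// Check both laws for every combination of a and b
for (const a of [true, false]) {
  for (const b of [true, false]) {
    console.assert(!(a && b) === (!a || !b), "Law 1 failed for", a, b);
    console.assert(!(a || b) === (!a && !b), "Law 2 failed for", a, b);
  }
}
// No output means every case passed
```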
Practical Use: Simplifying conditionals during refactoring.
```javascript
// Original code
if (!(isLoggedIn && hasPermission)) {
  console.log("Access denied");
}

// After De Morgan
if (!isLoggedIn || !hasPermission) {
  console.log("Access denied");
}
```
The second version is much easier to read. After learning these laws, I naturally apply them when writing conditionals.
Modern CPUs pack in billions of transistors. For example, the Apple M1 chip has 16 billion transistors.
Why so many?
Analogy: Think of each transistor as a Lego brick. A CPU is a massive castle built from billions of bricks. Even building one wall (the ALU) requires millions of bricks.
The code we write must eventually pass through logic gates to execute.
```javascript
if (x > 5 && y < 10) {
  console.log("Condition met");
}
```
Hardware perspective:
- x > 5: The CPU's comparator circuit activates. Internally, it performs the subtraction x - 5 and checks whether the result is positive using a combination of logic gates.
- y < 10: Another comparator circuit activates.
- &&: The two comparison results feed into an AND gate.

So that's what it is: Every IF statement we write ultimately becomes electrical signals passing through AND and OR gates in the CPU. Once I understood this, coding felt more like "I'm controlling hardware switches."
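Here is a rough sketch of the "comparison is really a subtraction" idea from the list above, assuming 32-bit integers (real CPUs expose the result through status flags rather than a function like this):

```javascript
// "x > y" decided by subtracting and inspecting the sign bit of the result
function isGreaterThan(x, y) {
  const diff = (x - y) | 0;          // force a 32-bit integer result
  const signBit = (diff >>> 31) & 1; // 1 if the difference is negative
  return signBit === 0 && diff !== 0;
}

console.log(isGreaterThan(7, 5)); // true
console.log(isGreaterThan(3, 5)); // false
console.log(isGreaterThan(5, 5)); // false
```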
In JavaScript or Python, you can use bitwise operators to directly mimic logic gates.
```python
# Python
a = 0b1100  # 12 (binary)
b = 0b1010  # 10

print(bin(a & b))   # 0b1000  (AND)
print(bin(a | b))   # 0b1110  (OR)
print(bin(a ^ b))   # 0b110   (XOR)
print(bin(~a))      # -0b1101 (NOT, complement)
print(bin(a << 1))  # 0b11000 (left shift, *2)
print(bin(a >> 1))  # 0b110   (right shift, /2)
```
Real-world uses:
- Permission checks: `if (userPermissions & PERMISSION_WRITE)`
- Setting a flag: `flags |= FLAG_ACTIVE`
- Clearing a flag: `flags &= ~FLAG_ACTIVE`
- Odd/even check: `if (n & 1)` (the last bit is 1 if the number is odd)

After learning bitwise operators, I realized: "This is hardware-level control."
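A small runnable sketch of the flag pattern from the list above (the flag names are made up for illustration):

```javascript
// Bit flags: each feature gets its own bit
const FLAG_ACTIVE = 0b001;
const FLAG_ADMIN  = 0b010;
const FLAG_BANNED = 0b100;

let flags = 0;
flags |= FLAG_ACTIVE;   // set a flag
flags |= FLAG_ADMIN;    // set another
flags &= ~FLAG_BANNED;  // clear a flag (a no-op here, it was never set)

console.log((flags & FLAG_ADMIN) !== 0);  // true  - the flag is set
console.log((flags & FLAG_BANNED) !== 0); // false - the flag is not set
console.log(5 & 1);                       // 1 - odd number: the last bit is 1
```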
```
[Half Adder Circuit]
A ----\
       XOR ---- Sum
B ----/

A ----\
       AND ---- Carry
B ----/

[Full Adder Circuit]
A -----\
        XOR -----\
B -----/          \
                   XOR ---- Sum
Cin --------------/

A AND B ----------\
                   OR ----- Carry Out
(A XOR B) AND Cin -/
```
Real circuits are much more complex, but conceptually, they're combinations of XOR, AND, and OR gates like this.
Inside the CPU is a component called the ALU. This is the core unit that handles arithmetic operations (addition, subtraction, multiplication) and logical operations (AND, OR, NOT).
ALU Components:
- Adder circuits (chains of full adders) for arithmetic
- Logic circuits (AND, OR, NOT, XOR) for bitwise operations
- Comparator circuits for conditions like x > 5
- Shift circuits for << and >>
How one line of code passes through the ALU:
```javascript
let result = (a + b) & 0xFF;
```
- a + b: The ALU's adder circuit activates (a full adder chain).
- & 0xFF: The ALU's AND gate circuit activates.

After understanding this flow, I started thinking about code optimization in terms of "How many cycles will this operation take in the ALU?"
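For what the `& 0xFF` part actually does, here is a quick sketch showing the mask keeping only the low 8 bits of the sum:

```javascript
// Masking with 0xFF keeps only the low 8 bits
let a = 200, b = 100;
let result = (a + b) & 0xFF;
console.log(a + b);  // 300 -> binary 1 0010 1100
console.log(result); // 44  -> binary   0010 1100 (the 9th bit is masked off)
```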
Logic gates are the most fundamental decision-making units in a computer. Every conditional statement, every operation in our code eventually becomes electrical signals passing through these gates.
Combining these gates creates:
- Adders (half adders, full adders) for arithmetic
- The ALU, the CPU's calculation core
- Comparators that drive every conditional
- Storage elements that remember bits
So that's what it is: A computer's "thinking" isn't really thinking—it's just electrical signals flowing through logic gates made of transistors. Once I accepted this, programming felt much more concrete and physical to me.