Do you know how data types work in the C programming language?
Today we're going to dig a little deeper into data types in C and the reasoning behind them.
How Does a Computer Actually Store Values?
Have you ever wondered how a computer knows if a value is int, char, or another type?
The short answer is: it doesn't. Computers only work with bits.
Everything is Binary
Regardless of the type you use in the code — int, char, float — the value is always stored in binary.
For example:
int x = 42;
In memory, what actually exists is something like:
00101010
The computer does not store decimal, does not store hexadecimal, does not store octal, and does not store "letters". It only stores 0 and 1.
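You can actually watch this happen. Here is a minimal sketch (assuming a typical platform where int is 4 bytes) that walks over the raw bytes of an int through an unsigned char pointer and prints each bit; the byte order you see depends on your machine's endianness:

#include <stdio.h>

int main(void) {
    int x = 42;

    // Reinterpret the int's storage as raw bytes (legal in C)
    unsigned char *bytes = (unsigned char *)&x;

    // Print each byte as 8 bits, from lowest address to highest
    for (size_t i = 0; i < sizeof x; i++) {
        for (int bit = 7; bit >= 0; bit--) {
            putchar((bytes[i] >> bit) & 1 ? '1' : '0');
        }
        putchar(' ');
    }
    putchar('\n');
    return 0;
}

On a little-endian machine this prints 00101010 00000000 00000000 00000000: the same 42, just spelled out bit by bit.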
Code is for Humans
The code we write is just a human representation. Programming languages were created to make our lives easier, not the machine's.
So when you write:
int x = 42;
This is still text.
Only later does the compiler convert everything to binary. The process is, conceptually, like this:
42 (decimal) → 00101010 (binary)
This binary value is what goes into memory.
So why do we see "normal" numbers in the terminal?
Because we asked to see them that way. Functions like printf use format specifiers (%d, %x, %o, %f, etc.) to say:
"Interpret these bits in a specific way."
Example:
printf("%d", x);
Here, %d tells printf:
“Interpret these bits as a decimal number.”
Same value, multiple interpretations
The bits don't change.
What changes is how you ask it to interpret them:
printf("%d", x); // decimal → 42 printf("%x", x); // hexadecimal → 2a printf("%o", x); // octal → 52
Internally, the binary value is exactly the same.
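Here is the same idea as a complete program you can compile and run (nothing new, just the snippets above with the boilerplate filled in):

#include <stdio.h>

int main(void) {
    int x = 42;         // one value, one set of bits in memory

    printf("%d\n", x);  // interpret as decimal: 42
    printf("%x\n", x);  // interpret as hexadecimal: 2a
    printf("%o\n", x);  // interpret as octal: 52
    return 0;
}

All three lines read the same bits; only the conversion changes.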
Type is not a base
This is a common mistake:
int is not a numeric base; it is a type.
char is not a numeric base; it is a type.
A type is just a container of bits.
Typically, an int occupies 32 bits (depending on the architecture). The numerical base only appears in two places:
- In the source code (e.g., 42, 0x2A);
- In the output, when you decide how to display the bits (printf).
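A small sketch makes the point: the three literals below are written in different bases, but the compiler produces exactly the same bits for all of them.

#include <stdio.h>

int main(void) {
    int a = 42;    // decimal literal
    int b = 0x2A;  // hexadecimal literal
    int c = 052;   // octal literal

    // All three hold the same value; the base existed only in the source text
    printf("%d %d %d\n", a, b, c);  // prints: 42 42 42
    printf("%s\n", (a == b && b == c) ? "same bits" : "different bits");
    return 0;
}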
🧠 Final Summary
- The computer only understands binary;
- Decimal, hexadecimal, and octal are human conventions;
- Types (int, char) only define how the bits will be used;
- printf decides how to interpret and display these bits;
- The value in memory never changes; only the interpretation does.
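To close, one last sketch, assuming float and unsigned int are both 4 bytes on your platform (true on most desktop systems): we copy a float's bytes into an integer and ask printf to show the very same bits in two different ways.

#include <stdio.h>
#include <string.h>

int main(void) {
    float f = 1.5f;
    unsigned int bits;

    // Copy the float's bytes into an integer without changing a single bit
    memcpy(&bits, &f, sizeof bits);

    printf("%f\n", f);    // interpreted as a float: 1.500000
    printf("%x\n", bits); // the same bits, shown in hexadecimal: 3fc00000
    return 0;
}

The bytes in memory never changed; we only changed how we asked printf to read them.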