JavaScript has seven types of primitive values: undefined, null, boolean, number, bigint, string, and symbol. Knowing how to distinguish between them is critical to understanding equality in JavaScript.
Undefined
There is only one value of this type: undefined. It is used to represent the concept of an unintentionally missing value.
console.log(typeof undefined); // "undefined"
This type commonly occurs when JavaScript does not know what value to use. For example, when a variable is declared but not assigned a value, it will point to undefined:
let pet;
console.log(pet); // undefined
pet = "cat";
console.log(pet); // "cat"
However, if you try to read a property from undefined, you will get a TypeError:
let pet = undefined;
console.log(pet.name); // TypeError!
Null
There is only one such value: null. It is used for intentionally missing values.
It behaves very similarly to undefined. For example, it also throws an error when you try to access its properties:
let pet = null;
console.log(pet.name); // TypeError!
If null is so similar to undefined, why have both? In JavaScript, it is common to use null to represent the intentional absence of a value, which helps distinguish it from a coding error that accidentally results in undefined.
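For example, reading a property that was never set gives us undefined, while we can assign null ourselves to say "this value is intentionally empty" (the pets object below is only an illustration):
let pets = { cat: "Whiskers" };
console.log(pets.dog); // undefined (we never set it, possibly by mistake)
pets.dog = null; // we explicitly record that there is no dog
console.log(pets.dog); // null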
While null is the only value of its type, due to a bug in JavaScript, it pretends to be an object:
console.log(typeof null); // "object"
Although typeof reports it as an object, null is a primitive value and does not behave like an object in any way.
Booleans
There are only two boolean values: true and false:
console.log(typeof true); // "boolean"
console.log(typeof false); // "boolean"
You can use them to perform logical operations:
console.log(true); // true
console.log(!true); // false (the opposite)
console.log(true || false); // true (at least one is true)
console.log(true && false); // false (both need to be true)
Numbers
Another JavaScript primitive type is number:
console.log(typeof 42); // "number"
console.log(typeof 3.14); // "number"
console.log(typeof -42); // "number"
JavaScript numbers, however, don't behave exactly the same way as regular mathematical numbers do:
console.log(0.1 + 0.2 === 0.3); // false
console.log(0.1 + 0.2 === 0.30000000000000004); // true
This happens because JavaScript uses floating-point math, a way of representing numbers in computers. It is not perfect, but it is sufficient for most cases.
While in real math there is an infinite set of numbers, in floating-point math there are fewer. Therefore, when you use numbers in your code, JavaScript chooses the closest numbers it knows about (just as a scanner does with colors).
In other words, JavaScript uses numbers with limited precision. We can imagine all JavaScript numbers on an axis. The closer we get to 0, the greater the precision of the numbers and the closer they are to each other.
This is because relatively small numbers occur more often in our programs, and we usually want them to be accurate. But when we write 0.1 or 0.2, we don't get exactly 0.1 and 0.2. We get the closest numbers available in JavaScript. They are almost exactly the same, but there may be a small difference. These small differences add up, which is why 0.1 + 0.2 doesn't give us exactly the same number as writing 0.3.
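If you are curious, you can peek at the numbers JavaScript actually picked by printing them with more digits (toFixed does not change the values, it only shows more of their decimal places):
console.log((0.1).toFixed(20)); // "0.10000000000000000555"
console.log((0.3).toFixed(20)); // "0.29999999999999998890"
console.log((0.1 + 0.2).toFixed(20)); // "0.30000000000000004441"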
This is common to several programming languages. In addition, floating-point math includes some special numbers. For instance, when performing operations such as 1 / 0, JavaScript needs to somehow represent the result.
Here are some special numbers that might appear in the code:
let scale = 0;
let a = 1 / scale; // Infinity
let b = 0 / scale; // NaN
let c = -a; // -Infinity
let d = 1 / c; // -0
NaN, which is the result of 0 / 0 and other invalid calculations, stands for "not a number". However, its type is number:
console.log(typeof NaN); // "number"
It is rare to write code using these special numbers. However, they can appear because of a coding error. So it is good to know that they exist.
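One more quirk worth remembering: NaN is the only value that is not equal to itself, so checking for it with === does not work. Number.isNaN is the reliable way to detect it:
console.log(NaN === NaN); // false
console.log(Number.isNaN(NaN)); // true
console.log(Number.isNaN(0 / 0)); // true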
BigInts
Since regular numbers cannot accurately represent large integers, BigInts were introduced to fill this gap. There is an infinite number of BigInts and they are used to represent integers with arbitrary precision.
console.log(typeof 13927199254740991); // "number"
console.log(typeof 13927199254740991n); // "bigint"
BigInts are great for financial applications where precision is important. However, keep in mind that operations with huge numbers require time and resources.
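To see why they exist, compare how regular numbers and BigInts handle the same large integer (9007199254740992 is one above Number.MAX_SAFE_INTEGER, the largest integer that regular numbers can represent safely):
console.log(9007199254740992 + 1); // 9007199254740992 (precision is lost)
console.log(9007199254740992n + 1n); // 9007199254740993n (exact)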
Strings
In JavaScript, text is represented by a string. There are three ways to write strings (single quotes, double quotes, and backticks):
console.log(typeof 'hi'); // "string"
console.log(typeof "hi"); // "string"
console.log(typeof `hi`); // "string"
console.log(typeof ""); // "string"
Strings have built-in properties and methods, but they are not objects:
let pet = "Cat";
console.log(pet.length); // 3
console.log(pet[0]); // "C"
console.log(pet[1]); // "a"
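They also respond to built-in methods. For example, toUpperCase and includes are two of the standard ones:
console.log(pet.toUpperCase()); // "CAT"
console.log(pet.includes("at")); // true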
Symbols
Symbols are a new primitive type. They are used to create unique identifiers for object properties and to control which parts of the code can access them.
let mySecret = Symbol();
console.log(typeof mySecret); // "symbol"
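Every call to Symbol() produces a brand new, unique value, and that value can be used as an object property key (the obj variable below is only for illustration):
let anotherSecret = Symbol();
console.log(mySecret === anotherSecret); // false (every symbol is unique)

let obj = {};
obj[mySecret] = "hidden value";
console.log(obj[mySecret]); // "hidden value"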