When we write
const x = 0.1
in a JavaScript source file and execute it, JavaScript does not interpret the 0.1 as the real number 0.1, because JavaScript numbers cannot represent every single possible real number, and 0.1 is not one of the real numbers which they can represent. Instead, 0.1 is interpreted as the closest JavaScript number to 0.1.
In decimal, 0.1 is exactly:
0.1000000000000000000... = 0.10̅
In binary, that is an infinite recurring expansion:
0.00011001100110011001100... = 0.00̅0̅1̅1̅
However, all JavaScript numbers have finite binary expansions. So the closest available JavaScript number is:
0.0001100110011001100110011001100110011001100110011001101
Note the absence of a "..." or an overline! This expansion is truncated after 55 bits. In decimal, this number is exactly:
0.1000000000000000055511151231257827021181583404541015625
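We don't have to take this expansion on trust. In modern JavaScript engines, toFixed accepts up to 100 fraction digits and rounds correctly, so we can ask for all 55 digits:

console.log((0.1).toFixed(55))
// '0.1000000000000000055511151231257827021181583404541015625'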
Note that there's nothing stopping us from writing all of those decimal digits out in our JavaScript source file if we want to:
const x = 0.1000000000000000055511151231257827021181583404541015625
JavaScript will always interpret whatever decimal number we wrote, no matter how (im)precise, as the closest available JavaScript number. Sometimes this reinterpretation will be absolutely precise. But sometimes this reinterpretation will lose some precision.
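In fact, the long literal and the short one denote exactly the same JavaScript number, which we can verify:

const x = 0.1000000000000000055511151231257827021181583404541015625
console.log(x === 0.1) // true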
For the same reason, when we write
const y = 0.2
JavaScript does not interpret this as the real number 0.2 but as the real number
0.200000000000000011102230246251565404236316680908203125
And if we write
const z = 0.3
JavaScript does not interpret this as the real number 0.3 but as the real number
0.299999999999999988897769753748434595763683319091796875
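Again, nothing stops us from writing these long forms out in full; each one parses to exactly the same double as its short form:

console.log(0.2 === 0.200000000000000011102230246251565404236316680908203125) // true
console.log(0.3 === 0.299999999999999988897769753748434595763683319091796875) // true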
This means that when we write
const sum = 0.1 + 0.2
(or const sum = x + y), what JavaScript actually computes is the precise sum
0.1000000000000000055511151231257827021181583404541015625
+
0.200000000000000011102230246251565404236316680908203125
=
0.3000000000000000166533453693773481063544750213623046875
JavaScript numbers cannot represent this precise result either, so the value returned is the closest available JavaScript number, which is
0.3000000000000000444089209850062616169452667236328125
Again, we have lost a little precision, although for a different reason. At first, we lost some precision in the interpretation of the source code. Now, we have lost some more precision in the calculation.
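We can confirm this with toFixed again. The exact expansion of this particular double happens to have 52 decimal digits, so, assuming an engine whose toFixed is correctly rounded (as modern ones are), asking for 52 digits captures all of them:

console.log((0.1 + 0.2).toFixed(52))
// '0.3000000000000000444089209850062616169452667236328125'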
Notice that this sum value, which we got by writing 0.1 + 0.2, is a different JavaScript number from what we got when we simply typed 0.3.
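This difference is directly observable:

console.log(0.1 + 0.2 === 0.3) // false
console.log(0.1 + 0.2 > 0.3)   // true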
Now, what happens when we try to log any of these values at the console?
JavaScript does not log every last decimal place of a number. Instead, JavaScript logs the minimum number of digits necessary to uniquely distinguish that JavaScript number from the other JavaScript numbers near it.
So, if we try to log the value
0.1000000000000000055511151231257827021181583404541015625
we'll see the much shorter three-character string
> 0.1
at our console, because this is all that is necessary.
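Crucially, those three characters are still enough to round-trip: parsing them back yields the very same JavaScript number.

console.log(String(0.1))           // '0.1'
console.log(Number('0.1') === 0.1) // true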
Note that yet again, we have lost some precision! That's three times now!
Strictly speaking, the only reason console.log(0.1) logs 0.1 is that two different precision-loss events cancel one another out. There is no 0.1 in the JavaScript programming language. One would be forgiven for thinking that there is.
Similarly, if we try to log
0.200000000000000011102230246251565404236316680908203125
we'll get
> 0.2
out. And if we try to log
0.299999999999999988897769753748434595763683319091796875
we'll get
> 0.3
out. And finally, if we try to log the result of 0.1 + 0.2, which we remember is
0.3000000000000000444089209850062616169452667236328125
we'll get [drum roll]...
> 0.30000000000000004
So that's why 0.1 + 0.2 equals 0.30000000000000004, and does not equal 0.3. It's because we lost precision in three different places:

1. when the source code was interpreted, and the literals 0.1 and 0.2 each became the closest available JavaScript number;
2. when the sum was computed, and the precise result was rounded to the closest available JavaScript number; and
3. when the result was logged, and shortened to the minimum number of digits needed to identify it.
This all makes perfect sense now.
Why do JavaScript numbers behave like this? Because JavaScript numbers are IEEE 754 double-precision (i.e. 64-bit) floating-point numbers, or "doubles".
A double cannot represent every single possible real number. It can only represent approximately 2⁶⁴ distinct real numbers, all of them integer multiples of powers of 2. This includes, say, 0.125, but not 0.1. Instead we get the approximation behaviour seen above.
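If you want to see those 64 bits for yourself, here is one way to inspect them, using the standard DataView API:

const view = new DataView(new ArrayBuffer(8))
view.setFloat64(0, 0.1)
// 1 sign bit, then 11 exponent bits, then 52 significand bits
console.log(view.getBigUint64(0).toString(2).padStart(64, '0'))
// '0011111110111001100110011001100110011001100110011001100110011010'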
This behaviour is not unique to JavaScript. It is seen in every programming language where doubles are available, including C, C++, C#, Erlang, Java, Python and Rust.
To explore further, you may find this snippet of code useful.
import { stringify } from './xact.js'

console.log(stringify(0.1))
// '0.1000000000000000055511151231257827021181583404541015625'

console.log(stringify(0.2))
// '0.200000000000000011102230246251565404236316680908203125'

console.log(stringify(0.3))
// '0.299999999999999988897769753748434595763683319091796875'

console.log(stringify(0.1 + 0.2))
// '0.3000000000000000444089209850062616169452667236328125'
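For the curious, here is a minimal sketch of how such an exact stringifier could be implemented. (The name stringifyExact and every detail below are illustrative; this is not necessarily how xact.js actually works.) It relies on the fact that every finite double is an integer multiple of a power of 2, and 1/2ⁿ = 5ⁿ/10ⁿ, so every finite double has a finite, if long, decimal expansion.

// A sketch of an exact decimal stringifier for finite doubles,
// reading the raw IEEE 754 bits and using BigInt arithmetic.
function stringifyExact(x) {
  if (!Number.isFinite(x)) return String(x)
  const view = new DataView(new ArrayBuffer(8))
  view.setFloat64(0, x)
  const bits = view.getBigUint64(0)
  const sign = bits >> 63n ? '-' : ''
  const expBits = (bits >> 52n) & 0x7ffn
  let mantissa = bits & 0xfffffffffffffn
  // Normal numbers have an implicit leading 1 bit; subnormals do not.
  const exp = expBits === 0n ? -1074n : expBits - 1075n
  if (expBits !== 0n) mantissa |= 1n << 52n
  // At this point, |x| === mantissa * 2**exp, exactly.
  if (exp >= 0n) return sign + (mantissa << exp).toString()
  const shift = -exp
  const intPart = mantissa >> shift
  const frac = mantissa & ((1n << shift) - 1n)
  if (frac === 0n) return sign + intPart.toString()
  // frac / 2**shift === frac * 5**shift / 10**shift,
  // so multiplying by 5**shift gives the exact decimal digits.
  const digits = (frac * 5n ** shift)
    .toString()
    .padStart(Number(shift), '0')
    .replace(/0+$/, '')
  return sign + intPart.toString() + '.' + digits
}

console.log(stringifyExact(0.1))
// '0.1000000000000000055511151231257827021181583404541015625'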