This article says that NASA uses 15 digits after the decimal point, which I’m counting as 16 in total, since that’s how we count significant digits in scientific notation. If you round pi to 3, that’s one significant digit, and if you round it to 1, that’s zero digits.
I know that 22/7 is an extremely good approximation for pi, since it’s written with 3 digits, but is accurate to almost 4 digits. Another good one is √10, which is accurate to a little over 2 digits.
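A quick way to sanity-check those accuracy claims, as a sketch in Python — here "digits of accuracy" is measured as the negative log10 of the relative error, which is one common convention (not the only one):

```python
import math

def accurate_digits(approx: float) -> float:
    """Rough count of matching significant digits:
    -log10 of the relative error vs. math.pi."""
    return -math.log10(abs(approx - math.pi) / math.pi)

print(accurate_digits(22 / 7))         # ≈ 3.4 digits
print(accurate_digits(math.sqrt(10)))  # ≈ 2.2 digits
```

So 22/7 really does buy you almost 4 correct digits for the price of 3, and √10 a bit over 2.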
I’ve heard that ‘field engineers’ used to use these approximations to save time when doing math by hand. But what field, exactly? Can anyone give examples of fields that use fewer than 16 digits? In the spirit of something like xkcd: Purity, could you rank different sciences by how many digits of pi they require?
I’m a waitress, and pie is $12.50.
Software Engineering. 16 sigfigs across 64 bits
TIL a 64-bit float is accurate to 16 sigfigs.
Edit: actually, out of curiosity I decided to try to calculate it. I’ve very possibly done the wrong calculation, but what I did was solve log2(10^x) = 64, which works out to x ≈ 19. Which isn’t 16, but is very close, and when you consider the way the float actually works it wouldn’t be too surprising that it would lose some information (the sign bit, for example, is immediately completely lost in this context).
A 64-bit IEEE float has 53 significant bits (the “mantissa” or “significand”), and log10(2^53) ≈ 15.9546.
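You can check this directly in Python, assuming its float is an IEEE 754 double (true on every common platform):

```python
import math
import sys

# 53 significand bits -> equivalent decimal digits
print(53 * math.log10(2))        # ≈ 15.95

# Python exposes the same figures directly:
print(sys.float_info.mant_dig)   # 53 significand bits
print(sys.float_info.dig)        # 15 decimal digits always representable
```

Since 15.95 falls just short of 16, only 15 full decimal digits are guaranteed to survive in every case (you’d actually need 17 digits to round-trip every double through text exactly).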
Isn’t it just 15 significant figures then?
This probably won’t play well with this audience, but I’m a management/strategy consultant. “~5” (technically one decimal place but also rounded to the nearest interval of 5) for any C-level decks ;)
Oof :D