How do people calculate pi to the hundredth+ decimal place?
So I know that pi is the ratio of a circle's circumference to its diameter (and also the ratio of a sphere's volume to 4/3·r³).
Apparently even calculating the circumference of the observable universe needs fewer than 40 decimal places of pi to be more accurate than we would ever need to worry about.
So my question is: how do we determine the decimal places beyond that? If pi is a ratio, and even the largest conceivable circle only calls for 36 or so places, how do we work out what the subsequent digits are?
You're right that using geometry and ratios directly is only good for a few digits of π. Some ancient mathematicians drew polygons on the inside and the outside of a circle, and then used the perimeters of those polygons as lower and upper bounds on π. Archimedes approximated π as lying between 223/71 and 22/7 using 96-sided regular polygons.
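Here's a small sketch of that idea in Python (the function name and structure are my own, not Archimedes' notation). Starting from hexagons, whose perimeters are known exactly, each side-doubling step tightens the bounds using only arithmetic and square roots, so π itself is never needed as an input:

```python
from math import sqrt

def polygon_bounds(doublings):
    """Archimedes-style bounds on pi: perimeters of inscribed and circumscribed
    regular polygons around a circle of diameter 1, doubling the side count."""
    inner = 3.0              # perimeter of the inscribed hexagon
    outer = 2 * sqrt(3)      # perimeter of the circumscribed hexagon
    sides = 6
    for _ in range(doublings):
        # Standard doubling recurrences: harmonic mean, then geometric mean.
        outer = 2 * inner * outer / (inner + outer)   # circumscribed 2n-gon
        inner = sqrt(inner * outer)                   # inscribed 2n-gon
        sides *= 2
    return sides, inner, outer

# Four doublings reach 96-sided polygons, matching Archimedes' bounds:
sides, lo, hi = polygon_bounds(4)
print(sides, lo, hi)   # 96  3.14103...  3.14271...
```

Even after four doublings you only have π pinned down to about two decimal places, which is why this approach runs out of steam quickly.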
The real breakthroughs came when people realized that certain infinite series converge to π: you add and/or subtract a sequence of smaller and smaller terms, so that eventually the only digits of the sum that change with each additional term are already far to the right of the decimal point.
The Leibniz formula, proven to converge to π/4, is 1 - 1/3 + 1/5 - 1/7 + 1/9 - 1/11 . . .
So with just pen and paper, you can add and subtract each term in sequence, and eventually the terms become so small that they leave the first few digits of the running total untouched. At that point you can be confident that the digits which can no longer change are correct.
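As a quick illustration (function name is just for this example), here is the Leibniz partial sum in Python. It also shows why nobody uses this series for serious digit-hunting: the error after n terms is roughly 1/(2n), so you need about 100,000 terms to nail down five digits:

```python
def leibniz_pi(n_terms):
    """Partial sum of the Leibniz series 1 - 1/3 + 1/5 - 1/7 + ..., times 4."""
    total = 0.0
    sign = 1.0
    for k in range(n_terms):
        total += sign / (2 * k + 1)   # add or subtract the next odd reciprocal
        sign = -sign
    return 4 * total

print(leibniz_pi(100_000))   # 3.14158..., only ~5 digits correct
```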
Later breakthroughs produced formulas that converge much faster, so far fewer terms are needed per digit. And computers make these calculations much, much faster. Today's record computations generally use the Chudnovsky algorithm, whose series converges to 1/π; inverting that easily gives the digits of π itself.
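For the curious, here is a minimal, unoptimized sketch of the Chudnovsky series in Python using arbitrary-precision decimals (real record-setting programs use binary splitting and other tricks instead of recomputing factorials). Each term of the series contributes roughly 14 more correct digits:

```python
from decimal import Decimal, getcontext
from math import factorial

def chudnovsky_pi(digits):
    """Approximate pi to about `digits` decimal places via the Chudnovsky series."""
    getcontext().prec = digits + 10                  # extra guard digits
    terms = digits // 14 + 2                         # ~14 digits per term
    total = Decimal(0)
    for k in range(terms):
        num = Decimal(factorial(6 * k)) * (13591409 + 545140134 * k)
        den = (Decimal(factorial(3 * k)) * Decimal(factorial(k)) ** 3
               * Decimal(640320) ** (3 * k))
        total += (-1) ** k * num / den               # the sum converges to a multiple of 1/pi
    pi = (Decimal(426880) * Decimal(10005).sqrt()) / total   # invert to get pi
    getcontext().prec = digits
    return +pi                                       # round to the requested precision

print(chudnovsky_pi(100))
```

Running it with digits=100 already takes a fraction of a second, whereas the Leibniz series would need an astronomically large number of terms to get anywhere near 100 digits.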