What is the pi expansion? Computers have given a very long answer

You probably remember from math lessons that pi can be expanded to 3.14. However, two decimal places is not a very impressive result compared to what Google's computers have achieved.

In 2019, Google Cloud calculated pi to 31.4 trillion decimal places. The record did not last long, however: just two years later, scientists from the University of Applied Sciences of the Grisons pushed the figure to 62.8 trillion decimal places. This year brought yet another remarkable achievement.


Google, or more precisely its computers, played the main role again. Thanks to them, the value of pi was determined to 100 trillion decimal places. Google Cloud is thus once again involved in breaking the record, which is especially impressive when you consider that the figure has roughly tripled in three years.

The pi expansion is now known to 100 trillion decimal places

"This achievement is a testament to how much the Google Cloud infrastructure accelerates year over year. The core technology that made it possible is Compute Engine, Google Cloud's secure and configurable computing service, along with several of its recent additions and improvements: the Compute Engine N2 machine family, 100 Gbps egress bandwidth, Google Virtual NIC, and balanced Persistent Disks," we can read in a statement published by Google.

The program that calculated the 100 trillion digits of pi is y-cruncher v0.7.8, and the algorithm it relies on is the Chudnovsky algorithm. The computation began on October 14, 2021 and ended on March 21, 2022; more precisely, it took 157 days, 23 hours, 31 minutes, and 7.651 seconds.
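To give a feel for the Chudnovsky series itself, here is a minimal sketch in Python using the standard `decimal` module. This is only an illustration of the mathematics: each term of the series contributes roughly 14 more correct digits, which is why the algorithm converges so quickly. It is in no way the y-cruncher implementation, which uses binary splitting and highly optimized arbitrary-precision arithmetic.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Approximate pi to `digits` decimal places via the Chudnovsky series.

    pi = 426880 * sqrt(10005) / sum_k M_k * (13591409 + 545140134k) / (-640320^3)^k
    where M_k = (6k)! / ((3k)! * (k!)^3). Each term adds ~14 digits.
    """
    getcontext().prec = digits + 10   # a few guard digits for rounding
    terms = digits // 14 + 2
    s = Decimal(0)
    M = Decimal(1)                    # running value of (6k)! / ((3k)! (k!)^3)
    L = Decimal(13591409)             # linear term 13591409 + 545140134*k
    X = Decimal(1)                    # running value of (-262537412640768000)^k
    K = 6
    for k in range(terms):
        s += M * L / X
        M = M * (K**3 - 16 * K) / Decimal((k + 1) ** 3)
        L += 545140134
        X *= Decimal(-262537412640768000)
        K += 12
    C = Decimal(426880) * Decimal(10005).sqrt()
    return +(C / s)                   # unary + rounds to the current precision

print(chudnovsky_pi(50))
```

Scaling this toy version to trillions of digits is hopeless; the record attempts replace the term-by-term loop with binary splitting so the bulk of the work becomes huge integer multiplications.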


As you can imagine, an operation of this scale requires considerable computing power and storage. Google Cloud put the storage needed for the calculation at 554 TB. It is worth noting that the company built a cluster consisting of one computing node and 32 storage nodes, which together exposed 64 iSCSI block storage targets.
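A quick back-of-envelope calculation shows why so much storage is needed. The figures below are our own estimate, not Google's: a decimal digit carries log2(10) ≈ 3.32 bits of information, so the 100 trillion result digits alone occupy about 41.5 TB when packed optimally. The 554 TB figure is roughly 13 times that, because the intermediate products of huge multiplications during the computation dwarf the final output.

```python
import math

DIGITS = 100e12                          # 100 trillion decimal digits
bits_per_digit = math.log2(10)           # ~3.3219 bits of information per digit

raw_bytes = DIGITS * bits_per_digit / 8  # optimally packed size of the result
raw_tb = raw_bytes / 1e12                # in decimal terabytes

print(f"result digits alone: ~{raw_tb:.1f} TB")
print(f"reported working storage: 554 TB (~{554 / raw_tb:.0f}x the output)")
```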
