
Google I/O 2022: Google’s fourth-generation TPU accelerators power the world’s largest machine learning hub, run on roughly 90% carbon-free energy, and open to the public through Google Cloud

by admin


After launching the fourth-generation TPU accelerator last year, Google announced at Google I/O 2022 that eight Cloud TPU v4 Pods now form its largest machine learning hub, and emphasized that the facility runs on roughly 90% carbon-free energy.

In its earlier description, Google stated that the fourth-generation TPU accelerator doubles the computing performance of the previous generation, and that a Pod built from 4,096 fourth-generation TPU accelerators delivers more than 1 exaFLOPS of compute (over 10^18 floating-point operations per second). The machine learning hub formed from eight Cloud TPU v4 Pods reaches up to 9 exaFLOPS (9,000,000 teraflops) of peak aggregate performance, making it the world’s largest publicly available machine learning hub, offered to the public through Google Cloud.

Industry teams including Cohere, LG AI Research, Meta AI, and Salesforce Research have already used Google’s machine learning hub. Through the TPU VM architecture, they can set up interactive development environments and flexibly apply machine learning frameworks such as JAX, PyTorch, or TensorFlow, taking advantage of fast interconnects and an optimized software stack for better performance and scalability.
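As a rough illustration of what working on a TPU VM looks like, here is a minimal JAX sketch. It assumes JAX is installed; on a Cloud TPU VM `jax.devices()` lists the TPU cores, while on any other machine it falls back to CPU, so the same code runs in both environments.

```python
import jax
import jax.numpy as jnp

# List the available accelerator devices. On a Cloud TPU VM this reports
# TPU cores; elsewhere JAX falls back to the CPU backend.
devices = jax.devices()
print(f"backend: {devices[0].platform}, device count: {len(devices)}")

# jit-compile a small function. XLA compiles it for whatever backend is
# present, which is how the same JAX code targets TPU without changes.
@jax.jit
def scaled_sum(x):
    return jnp.sum(x * 2.0)

result = scaled_sum(jnp.arange(4.0))  # 2*(0+1+2+3) = 12.0
print(float(result))
```

The same portability applies to PyTorch (via PyTorch/XLA) and TensorFlow, which is the point of the optimized software stack the article mentions.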


The PaLM (Pathways Language Model) language model that Google presented at Google I/O 2022 was trained across two TPU v4 Pods, enabling faster language translation and understanding.
