AI chatbots are evolving faster than computer chips

by admin
The technological progress of large language models can now be quantified. A team of researchers at Cornell University in New York has found that large language models are improving rapidly, gaining performance even faster than computer chips. However, the amount of energy required to run them should not be underestimated.

More than 200 language model evaluations

The researchers found that the computing power needed to reach a given benchmark halves on average every eight months. To establish this, they analyzed more than 200 language model evaluations carried out between 2012 and 2023. The finding suggests an efficiency gain that outpaces many other areas of computing and even exceeds Moore's Law, the observation that the number of transistors on a chip doubles at regular intervals; depending on the source, a period of 12, 18, or 24 months is cited. The transistor count is a rough indicator of a chip's computing power.
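The gap between the two trends can be illustrated with a short calculation. The sketch below is my own illustration, not code from the study; the function names and the 24-month Moore's Law period are assumptions chosen for the example.

```python
def compute_factor(months: float, halving_months: float = 8.0) -> float:
    """Fraction of the original compute needed to reach the same benchmark
    after `months`, assuming requirements halve every `halving_months`."""
    return 0.5 ** (months / halving_months)


def transistor_factor(months: float, doubling_months: float = 24.0) -> float:
    """Growth in transistor count after `months`, assuming one common
    reading of Moore's Law (a doubling every `doubling_months`)."""
    return 2.0 ** (months / doubling_months)


# After two years (24 months):
print(compute_factor(24))     # compute needed falls to 0.125x (three halvings)
print(transistor_factor(24))  # transistor budget grows 2.0x (one doubling)
```

Under these assumptions, algorithmic progress reduces compute requirements roughly eightfold in the time it takes chip density to merely double, which is the sense in which the study's trend "exceeds" Moore's Law.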

Increased performance through better algorithms

That AI models achieve such high performance is primarily due to improvements in the algorithms on which they are based. According to the study, performance can be pushed even further by scaling up the size of LLMs; scaling is cited as the key driver of performance gains. However, scaling also demands more computing power, which in turn depends on the availability of AI chips, currently in short supply. The contribution of further algorithmic improvements could not be fully determined because the code of many LLMs is not publicly accessible. "Overall, our work provides a quantitative assessment of the rapid progress in language modeling," the researchers write in their study.

More efficient AI models may drive up energy use

Dr. Sasha Luccioni, climate and artificial intelligence researcher at Hugging Face, notes in relation to the study that more efficient AI models may end up being used more, as reported by t3n. On balance, AI can therefore lead to higher energy consumption, says Luccioni. In a recent TED talk, she also remarked: "The cloud that the AI models live on is actually made of metal and plastic and runs on huge amounts of energy. And every time you query an AI model, it costs the planet."
