
What really changes with a personal computer with hardware dedicated to AI?

by admin

Opinions on the so-called AI computer fall into three camps. Skeptics are convinced that artificial intelligence in computers will go the way of 5G in mobile phones: we will take it for granted without ever being clear what the real advantage is. The enthusiasts, the majority, instead sketch out science-fiction scenarios of productivity and creativity. The manufacturers, meanwhile, have bet everything on it; some have even added a dedicated key to summon chatbots. What is certain is that already this year, according to Gartner, a fifth of the personal computers and smartphones sold worldwide will be models with integrated (generative) artificial intelligence. In other words, the direction of travel is clear.

But what is an AI computer?

This category includes PCs equipped with dedicated accelerator chips, such as neural processing units (NPUs), accelerated processing units (APUs) or tensor processing units (TPUs), designed to optimize and accelerate artificial-intelligence workloads on the device itself, improving performance and efficiency without having to rely on external servers or cloud services.
To understand what changes, we tried the Medion E15443, the German company's new laptop, which comes with an Intel Core Ultra processor with an integrated NPU. This is the Santa Clara giant's new generation of chips, launched in December, which on paper promises a 25% reduction in energy consumption compared with processors of the same category without an NPU.

With this new generation of processors, Intel wants to close the gap with Apple Silicon and counter the arrival of Snapdragon X, improving GPU and artificial-intelligence performance. Apple's MacBooks have had a dedicated NPU built in since the M1 MacBook Air debuted. As Intel was keen to point out during a presentation, these laptops are not an entirely new product category: powerful and expensive gaming computers already offload most of their AI processing to the GPUs on their graphics cards. Nvidia Chat with RTX, for example, is a demo app that lets you connect a large language model (LLM) to your own content through retrieval-augmented generation (RAG), using the GPUs on its graphics cards locally.
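For the curious, here is a minimal sketch in Python of the retrieval idea behind RAG: local documents are embedded, the question is matched against them, and the best passage is stuffed into the prompt of a locally running model. The library, model name and documents are illustrative assumptions, not how Chat with RTX is actually built.

```python
# Sketch of the retrieval step in retrieval-augmented generation (RAG).
# Everything here runs on the local machine; names are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

documents = [
    "Meeting notes: the Q3 launch moves to October.",
    "Travel policy: economy class for flights under six hours.",
    "IT guide: VPN access requires two-factor authentication.",
]

# Embed the local documents once, entirely on the device.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

query = "When is the product launch?"
query_vector = encoder.encode([query], normalize_embeddings=True)[0]

# Cosine similarity: vectors are normalized, so a dot product suffices.
scores = doc_vectors @ query_vector
best_passage = documents[int(np.argmax(scores))]

# The retrieved passage would then be prepended to the prompt of a local LLM.
prompt = f"Answer using this context:\n{best_passage}\n\nQuestion: {query}"
print(prompt)
```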


What the Intel Core Ultra chips do best is handle the tasks assigned to artificial intelligence locally, and therefore offline, freeing the processor from that workload and thus improving power consumption and overall performance. In perspective, this means running AI applications, whenever possible, on the computer's own hardware, without having to connect to the cloud services of AI providers.
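To give a concrete idea of what "assigning a task to the NPU" looks like, here is a minimal sketch using Intel's OpenVINO toolkit. The model file name is a placeholder, and whether an NPU actually shows up depends on the machine and its drivers; this is an illustration, not Intel's reference code.

```python
# Sketch: ask OpenVINO to run an already-converted model on the NPU if one exists.
import openvino as ov  # pip install openvino

core = ov.Core()
print(core.available_devices)  # on a Core Ultra machine this can include 'NPU'

model = core.read_model("image_classifier.xml")  # hypothetical pre-converted model

# Prefer the NPU when present, otherwise fall back to the CPU.
device = "NPU" if "NPU" in core.available_devices else "CPU"
compiled_model = core.compile_model(model, device_name=device)
print(f"Model compiled for: {device}")
```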

The advantage?

Our data stays inside our computer, so there is less risk in terms of privacy. In practice, this means running relatively heavy large language models (LLMs) locally, such as Meta's LLaMA, as well as other suitably optimized AI software. And it is precisely on software, Intel points out, that the AI-computing game will be decided: the goal, they explained, is to reach at least 100 partners by June of next year. Rival Qualcomm has also recently launched the AI Hub, a sort of library of 75 pre-optimized AI models ready for deployment on devices based on Snapdragon and Qualcomm platforms.
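As an illustration of what running an LLM locally means in practice, here is a minimal sketch with the llama-cpp-python library; the GGUF file name is a placeholder for whichever quantized model weights you have downloaded to your machine.

```python
# Sketch: run a quantized Llama model entirely on-device, with no cloud calls.
from llama_cpp import Llama  # pip install llama-cpp-python

llm = Llama(
    model_path="llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local weights file
    n_ctx=2048,    # context window size
    n_threads=8,   # CPU threads to use
)

output = llm(
    "Summarize the key advantages of running language models locally.",
    max_tokens=128,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```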
