This is, by far, the tech topic of the year. AI this, AI that. Microsoft, Intel, and PC makers have all jumped on the bandwagon. Maybe you’ll want to do the same. But how exactly does one buy artificial intelligence?
The short answer: it takes preparation. We don’t buy a PC in 2024 the way we did five years ago!
That said, maybe in 2024 you won’t be buying a PC at all. An abnormally high volume of personal computers and phones was already sold between 2021 and 2023. Pandemic-era teleworking forced many people to modernize their equipment, including that probably beige, perhaps gray, computer that had until then been gathering dust in the basement.
Despite this, if you are shopping for a computing device, prepare to be disoriented. It promises to be a fascinating exercise, as Charles Tisseyre would say. So much so that it would deserve a Découverte report: an “immersion into the little-known world of PCs, a unique ecosystem.”
If you’re planning to buy a PC this year, get ready to compare features you had never heard of before this spring. In fact, it is not out of the question that you might seriously ask a salesperson something like: “How many TOPS should my PC’s NPU be capable of to handle the tokens of the LLM that helps me illustrate my PowerPoint presentation?”
LLM
LLMs are large language models, developed by tech giants like Google and Meta, but also OpenAI, from the big data they have been collecting left and right for years. These are probabilistic systems trained with incredible precision to write what would be the correct answer to your question, or to guess what is hidden behind the object you want to erase from your photos.
The best-known LLMs are GPT from OpenAI, LLaMa from Meta and PaLM from Google, but there are several others. Most are available on the Internet as free software. You can download an LLM to your own computer at no cost from an online directory such as Hugging Face. You thus obtain your own personal chatbot, or your own tool for generating images or videos from a piece of text.
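For the curious, here is a rough sketch of what “downloading an LLM to your own computer” looks like in practice, using the Hugging Face transformers library in Python. The model name gpt2 is only an illustrative choice of a small, freely downloadable model, far from the state of the art:

```python
# A minimal sketch: run a small open model locally with Hugging Face's
# "transformers" library. The model "gpt2" is an arbitrary, lightweight
# example; any compatible text-generation model from the hub would do.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("The advantage of running a model locally is", max_new_tokens=30)
print(result[0]["generated_text"])
```

The first run downloads the model files to your machine; after that, everything happens locally, without sending your text to a remote server.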
There are also lightweight language models, called SLMs (small language models). These could one day embody the famous personal AI assistant that tech giants dream of.
TOPS
To work well, both LLMs and SLMs need context. For example, 1,500 pages of documents in PDF format. If you ask Copilot to summarize what 1,500 pages of text say about a “cours de bourse,” the AI must first determine whether what interests you is the price movement of a stock listed on the exchange or the quality of a college course on speculative investing (in French, “cours” can mean both).
Establishing this context means taking into account thousands, even millions, of pieces of information. In AI jargon, these units are called tokens. An AI that can take millions of tokens into account will produce an answer better suited to the context of your question.
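To make the notion concrete, here is a small sketch, again with Hugging Face’s transformers library and gpt2’s tokenizer as an arbitrary example, showing how a sentence gets broken into the tokens the model actually counts:

```python
# A rough illustration of tokenization: the tokenizer splits a sentence
# into the "tokens" a model counts against its context window.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokens = tokenizer.tokenize("Summarize 1,500 pages about a stock price.")
print(len(tokens), "tokens:", tokens)
```

A token is often a word, sometimes just a fragment of one; 1,500 pages of PDFs quickly translate into hundreds of thousands of them.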
Obviously, the more tokens there are to consider, the more processing power the AI needs, and the faster it must respond, the more instantaneous that power must be. Hence the recent importance of TOPS, i.e. trillions of operations per second (an English trillion equals a French billion, or a thousand billion). An AI PC can perform dozens of TOPS. To qualify as a Copilot+ PC, it must deliver at least 40 TOPS.
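As a back-of-the-envelope illustration (the figures below are invented for the example, not measured on any real workload), the arithmetic behind TOPS fits in a few lines of Python:

```python
# Illustrative arithmetic only: how long would a 40 TOPS NPU take to run
# a hypothetical AI task requiring 2 trillion operations?
tops = 40                  # trillions of operations per second (Copilot+ minimum)
operations = 2e12          # a made-up 2-trillion-operation inference
seconds = operations / (tops * 1e12)
print(f"{seconds * 1000:.0f} ms")   # -> 50 ms
```

The point is simply that each extra TOPS shaves milliseconds off every response, which matters when the assistant is supposed to answer you instantly.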
NPU
The simplest way to make AI applications faster on a PC is to put the graphics processor, the GPU, to work alongside the CPU (the central processor). Hence the enormous and very recent commercial success of Nvidia, whose specialty is GPUs and which sells them like hot cakes.
Except that GPUs are primarily designed to feed images to your PC’s monitor. Manufacturers have therefore derived from them a new so-called neural processor, the NPU: essentially a simplified GPU that consumes less energy. Windows sometimes even recognizes NPUs as graphics processors without display capability…
These days, the crux of the matter for processor makers such as Intel, AMD and ARM, and even for Apple, Google and Microsoft (which design their own chips without manufacturing them themselves), is to offer the most powerful combination of CPU, GPU and NPU in order to perform AI tasks the fastest.
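For the technically inclined, here is a minimal sketch (assuming PyTorch is installed) of how software typically decides which processor an AI workload lands on. NPUs generally require vendor-specific runtimes, so this basic check only distinguishes the CPU from an Nvidia GPU:

```python
# A simple device check with PyTorch: does this PC have a CUDA-capable GPU
# available for AI workloads, or will everything fall back to the CPU?
# (NPUs are not visible through this check; they need their own drivers.)
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"AI workloads will run on: {device}")
```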
Is this necessary?
So here’s how to shop for your AI: make sure your PC’s NPU and GPU can deliver enough TOPS to handle the tokens of your LLM.
At this point, anyone at the OQLF reading this text would be on the verge of apoplexy.
Are the days gone, then, when a good PC was simply the one with the most RAM or the best battery life? Maybe not. Most consumers won’t really need an AI PC, which is mainly useful if your job involves retouching photos several times a day, editing videos, or digesting mountains of technical documents.
If Minesweeper is your favorite app, or the biggest calculation you do on your PC is filing your taxes, AI is of no use to you.
Which is a shame, basically. An AI capable of doing taxes would be quite a revolution…