Moore’s Law is no more, long live Moore’s Law!

Moore’s Law is not really a law. It is a simple image, used for several decades, that summarizes the evolution of computing. Newer computers tend to deviate from it, but that does not mean technological progress is ending. On the contrary.

The observation, first made in 1965 by Gordon Moore, co-founder of Intel, took about ten years to become known as Moore’s law. His prediction concerned the miniaturization of integrated circuits: he estimated that the number of transistors that could be crammed onto a computer chip doubled every 18 to 24 months.

Over the years, it has been reinterpreted to refer to the more general progress of computer technology, ever more powerful thanks to ever smaller and more efficient processors.

A law running out of steam

However, Moore’s law has been called into question for several years. Somewhat disappointed by Intel’s inability to keep up with its own pace, companies like Apple, Google, and Microsoft began designing their own processors in-house. Graphics card maker Nvidia has seized the opportunity to expand into emerging technology sectors, such as cryptocurrency mining and artificial intelligence.

The result is that not every new computer processor today has twice as many transistors as its predecessor. But they deliver a higher level of performance, so much so that we have reached a stage where the limits of computing are no longer in the hardware. They are human.

Tons of floating points

It is clear these days, with AI apps like ChatGPT and personal computers like Apple’s new Mac Studio and Mac Pro, that their only limit is the imagination of their users. Even if the miniaturization of integrated circuits has reached a certain limit, the performance gains are not about to stop.

Presented by Apple ten days ago in California, the M2 processors of its new Mac Studio and Mac Pro are proof of this. The M2 was released in the summer of 2022 and has 20 billion transistors. Its predecessor, the M1 chip, was released 18 months earlier, in the fall of 2020, with 16 billion transistors.
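As a back-of-the-envelope check, one can compare those transistor counts with what a strict reading of Moore’s law would have predicted over the same interval. This is only a sketch using the figures cited above; the 20-month gap is an approximation of the “18 months later” mentioned in the article.

```python
# Rough check of the M1 -> M2 transistor counts against a literal
# reading of Moore's law (figures as cited in the article).

m1_transistors = 16e9   # M1, fall 2020
m2_transistors = 20e9   # M2, summer 2022
months_elapsed = 20     # roughly a year and a half (assumed)

# A strict doubling every 18 to 24 months would predict:
predicted_low = m1_transistors * 2 ** (months_elapsed / 24)
predicted_high = m1_transistors * 2 ** (months_elapsed / 18)

print(f"Actual M2 count:      {m2_transistors / 1e9:.0f} billion")
print(f"Moore's law predicts: {predicted_low / 1e9:.1f} "
      f"to {predicted_high / 1e9:.1f} billion")
```

The actual 25% increase falls well short of the doubling the law calls for, which is exactly the deviation the article describes.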

Under the hood of the Mac Studio, Apple combines several of these chips to create an extremely powerful multi-core processor. In its beefiest configuration, this Mac can perform up to 27.2 trillion floating-point operations every second, or 27.2 teraflops.

That is enormous. For perspective, the first supercomputer to cross the teraflops mark appeared in 1996. In 2004, the most powerful computer on the planet belonged to the Japanese company NEC and was called the Earth Simulator. It reached a power of nearly 36 teraflops. That supercomputer occupied a space of 65 meters by 50 meters.
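The comparison can be put in numbers with a quick sketch, again using only the figures cited in the article:

```python
# Comparing the Mac Studio's peak throughput to the Earth Simulator's
# (figures as cited in the article; both in teraflops).

mac_studio_tflops = 27.2       # top Mac Studio configuration
earth_simulator_tflops = 36.0  # "nearly 36 teraflops"

# Rough footprint of the Earth Simulator's machine room.
earth_simulator_area_m2 = 65 * 50  # 3,250 square meters

ratio = mac_studio_tflops / earth_simulator_tflops
print(f"The Mac Studio delivers about {ratio:.0%} of the "
      f"Earth Simulator's peak throughput,")
print(f"without needing anything like its "
      f"{earth_simulator_area_m2:,} m2 footprint.")
```

In other words, a desktop box now delivers roughly three quarters of what a 3,250-square-meter machine room delivered two decades ago.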

The Mac Studio is about the size of a dictionary.

Strength in numbers

Apple’s processor is powerful enough to uphold Moore’s Law, if not in letter, at least in spirit. It is also modular. Its M2 Ultra version, for example, simply fuses together two M2 Max processors (its “intermediate” version). It also includes graphics co-processors and a combined bandwidth of 800 gigabytes per second, which gives essentially instantaneous access to unified RAM with a capacity of up to 192 gigabytes.

This configuration means we no longer see the famous little rainbow-colored wheel that regularly appeared on the screen of older Macs when we asked them to perform tasks beyond their capacity.

Nvidia has adopted a model not so different from this to establish itself in AI. Taken individually, Nvidia’s chips are not as powerful as Intel’s. But they can be paired in very large numbers and perform a host of calculations at the same time, exactly what an AI needs to perform optimally.

People surprised to see Nvidia’s stock market value climb ever closer to Apple’s, leaving Intel and the others far behind, probably do not grasp the full potential of this technology. The limit, here again, is that of their imagination rather than that of the technology.

Obviously, these processors are not perfect. Their energy consumption, for example, is enormous. The AI behind ChatGPT consumes as much electricity as a village of around 1,500 households. That is not nothing, especially since these AIs will multiply rapidly over the next few years.

But many are willing to pay this price to access processing power that even Gordon Moore could not imagine in his day.
