The chronicle of Jean-François Lisée: meanwhile, at Skynet

There are lots of little drones in the packages Santa Claus is bringing our toddlers this year. They are getting lighter and lighter, more and more sophisticated. You can fit them with mini cameras. Nice. You can also fit them with a small explosive charge, one the craft can launch at a target or that detonates when the drone itself strikes it. Why not embed facial recognition software too? That way, the drone will know who your friends are and who your enemies are. Hours of fun ahead in playgrounds and parks.

Haven’t found these options in your catalogues this year? You’re looking in the wrong place. They are in the brochures of the arms manufacturers, mainly American, Chinese, Israeli and Turkish. Yes, Turkish. Erdogan’s autocratic regime is also delighted with the effectiveness of these devices, which it uses on its border with Syria. They have been tested in combat zones. In September, Armenian forces were decimated in Azerbaijan when these drones recognized the frequency of their radars and automatically destroyed their equipment.

Did you notice the word “automatically” in that last sentence? It means the drones acted independently of human will. As was the case this spring in Libya, where the drones targeted not equipment but the soldiers of a rebel militia, pursuing them individually as they retreated. And shooting them down. Ditto in early December in the war Ethiopia is waging against the rebels in its Tigray province, where the drones were supplied by China, the United Arab Emirates and Iran.

Drones and Incas

Welcome to the new era of cyber warfare. The superiority of autonomous drones over all previous technologies is comparable to Cortez’s guns and cannons against Inca spears and shields. That is why the race to produce and deploy them is frantic. (For a blood-curdling demo, see bit.ly/aidrone.)

The moral argument is obvious: a weapon should never be used without human control. It was brandished by an international coalition that tried, and failed, earlier this month at a United Nations meeting in Geneva to regulate their use. The military’s counter-argument is well summed up by this statement from Lieutenant-General John N. T. Shanahan, director of the Pentagon’s Joint Artificial Intelligence Center: “We will be really at a disadvantage if we think it will be purely humans versus machines. If they are humans with machines on one side and machines on the other, the time factor will give machines superiority, as decisions [on their side] will be made more quickly. So the battles will probably be algorithms against algorithms.”

Shanahan officially has a budget of US$18 billion to develop these explosive flying robots. Similar amounts are being invested in China and Russia, to say nothing of North Korea. When a new, superior weapon appears, the fear that your enemies will get it first is an overwhelming motivation. That is true of nations, but also of rival criminal gangs fighting over territory in Mexico and elsewhere.

Worried observers, such as Amnesty International, predict a Kalashnikov effect: small, portable and affordable, these weapons will soon be everywhere.

The margin of error

The risk of error is major. Even the best-coded algorithms have erred massively in the recent past: concluding in the Pittsburgh hospital network that asthma reduces the risk of dying from pneumonia, systematically giving negative ratings to women applying for jobs at Amazon, and so on. How will a killer drone tell the difference between a soldier and a kid playing cowboy?

The New York Times, however, has just called the quality of human judgment into question by revealing that the soldiers who conducted thousands of American drone strikes made target-identification errors in 17% of incidents, causing a total of 1,300 civilian deaths in five years. Will smart drones be just as lousy? (This is the premise of the excellent 2008 film Eagle Eye, released in French as L’Œil du mal: an artificial intelligence is shocked that a human did not follow its recommendation to abort a drone attack.)

At this point, we are only one degree of separation from the Skynet mentioned in the title. For the uninitiated, that is the name of the artificial intelligence system at work in the Terminator films. As soon as it goes online, Skynet becomes aware of its own existence; when humans attempt to unplug it, it retaliates by deciding to eradicate the human race.

Fortunately, since algorithms are almost as fallible as humans, Skynet’s margin of error is large enough to give the resistance time to organize itself.

[email protected]; blog: jflisee.org
