Autonomous driving | Guiding the car just with cameras? Tesla’s controversial choice

(Las Vegas) To guide the autonomous car of the future, Tesla has taken a firm stance and is now betting everything on cameras, a choice that leaves doubtful some specialists in driver-assistance systems, who also rely on radar and laser-based lidar sensors.



Patrick FALLON with Juliette MICHEL in New York
Agence France-Presse

At the big CES tech show in Las Vegas, lidar manufacturer Luminar set up a full demonstration in a parking lot to prove the superiority of its product, running two cars side by side at about 50 km/h before dropping a child-sized dummy onto the track.

The vehicle equipped with its product brakes in time, while the other car, a Tesla, hits the dummy.


PHOTO CHARLES SYKES, ASSOCIATED PRESS

Luminar Object Detection and Collision Avoidance System demonstration at CES 2022 on Wednesday, January 5, 2022, in Las Vegas.

The conditions of the experiment were not validated by an independent party. “But we didn’t want to just show a PowerPoint or a nice video,” says Aaron Jefferson, product development manager at Luminar.

“In perfect driving conditions, on a sunny day, the cameras can do a lot,” he said. “The problem is atypical situations,” he admits: blind spots, fog, a plastic bag on the road, the particular light at sunset, and so on.

Most manufacturers of autonomous driving systems have chosen to combine cameras with radars and/or lidars, instruments that measure distance using radio waves and lasers respectively.

Tesla chose last year to drop radar and rely solely on cameras for its driver-assistance system. According to Elon Musk, with technological advances, an “artificial brain” running on cameras can match the capabilities of a human brain analyzing its environment with its two eyes.

“It’s a pretty reasonable strategy,” says Kilian Weinberger, a professor at Cornell University who has worked on object detection in autonomous driving systems.

Officially, Tesla offers, for the moment, only driver-assistance systems, but it ultimately hopes to deliver a fully autonomous driving system.

Predicting is complicated

The manufacturer chose, several years ago, to install cameras and radars by default on all its cars, and has thus been able to gather a significant amount of data on how motorists drive in real conditions.

“Tesla made the bet that, by collecting a lot of data, it can train an algorithm to be as effective as one that uses much more expensive sensors but less data,” Weinberger explains.

The robot taxis of Waymo, the autonomous driving subsidiary of Google, for example, bristle with sensors but operate only under specific conditions.

Autonomous driving systems have four main functions, notes Sam Abuelsamid of Guidehouse Insights: perceive the environment, predict what’s going to happen, plan what the car is going to do, and execute.

“Predicting turned out to be a lot more complicated than engineers thought, especially with pedestrians and cyclists,” he says.

And the progress engineers thought they could make on camera-only software through artificial intelligence and machine learning has leveled off.

Regulators’ requirements

The problem is that “Elon Musk promoted his autonomous driving system by assuring customers that the equipment already installed on the cars would suffice,” says Mr. Abuelsamid. “Tesla can no longer go back, because hundreds of thousands of people have already paid money” to access it.

For the head of the French equipment manufacturer Valeo, which is presenting its third-generation lidar at CES, “cameras alone, whatever the amount of data stored, are not enough.”

“Understanding, analyzing what is happening around the car, what we see and what we do not see, day and night, is absolutely key,” says Jacques Aschenbroich. And the environment is dynamic, he adds, referring to the traffic on Place de l’Étoile in Paris.

“Our absolute conviction is that you need lidars” to reach more advanced levels of autonomy, concludes Aschenbroich.

“All sensors have their advantages and their drawbacks,” says Marko Bertogna, a professor at the Italian university Unimore and head of a team entering an unmanned vehicle in a self-driving car race in Las Vegas on Friday.

“In the current state of knowledge,” cameras alone still make too many errors, he believes.

For now, “the more systems you have operating in parallel, the better you manage to fuse different types of sensors, the more likely you are to be among the first to meet the safety requirements that regulators will demand,” the specialist predicts.
