Monday, November 25, 2024

Tesla Self-Driving Tester: The Ultimate Stressful Job Experience

Tesla’s test drivers, particularly those on the Project Rodeo team, are put through risky driving scenarios to probe the limits of the company’s self-driving software. Their accounts reveal pressure to create near-collision situations, pushing drivers to ignore traffic laws in order to gather data. Although some competing companies claim to use safer testing methods, problematic incidents persist across the industry, raising questions about public safety in this rapidly evolving field.

Would you apply for a position that required you to “remove all sense of legal duty, moral ethics, and humanity”? A recent report suggests that is essentially the reality for Tesla’s test drivers.

The era of self-driving vehicles is on the horizon. Tech companies envision a future where autonomous cars reign supreme. However, many recognize that the current software capabilities are far from achieving that futuristic level of automation. Yet, those with the financial means continue to accelerate this process, often at the expense of public safety, as they beta test their products on the roads we all share.

A report from Business Insider has uncovered information about a specialized group of test drivers at Tesla known as Project Rodeo. This team is tasked with exploring the boundaries of the company’s Full Self-Driving (FSD) and Autopilot systems. Where do those boundaries lie? Crashing. In fact, the goal seems to be to get as close to a collision as possible, and the more extreme the situation, the better.

One former test driver shared, “You’re pretty much running on adrenaline the entire eight-hour shift. There’s this feeling that you’re on the edge of something going seriously wrong.”

Nine current and former Project Rodeo drivers and three Autopilot engineers in California, Florida, and Texas were interviewed; most chose to remain anonymous. The situations they described are startling but not entirely unexpected, as FSD-related crashes are already well documented, although none of those interviewed had personally experienced one.

Project Rodeo consists of smaller, specialized teams. For instance, the “golden manual” team adheres strictly to driving regulations and avoids driver-assistance technologies altogether. Conversely, the “critical intervention” team lets the software handle every driving task, stepping in only to avert a collision.

One reason drivers delay taking manual control until the last possible moment is to let the software respond, whether correctly or not. Gathering more data, especially from real-world situations, helps engineers refine and adjust the software.

“We want to analyze the data to understand what led the car to make that decision,” said a former Autopilot engineer. “If you intervene too early, we miss the crucial moment that reveals what happened.”

This experimental approach can lead to cars running red lights, ignoring stop signs, and exceeding speed limits, all on public streets. If drivers take over because they feel uneasy in a challenging situation, supervisors often assert they intervened too early. Consequently, even test drivers outside the critical-intervention team may feel compelled to maintain, or intentionally create, dangerous driving conditions to test the software effectively and keep their jobs.

John Bernal, a former test driver and data analyst, noted that although he was never directly instructed to violate traffic laws for data collection, this was strongly suggested. “My training was to wait until the wheels touched the white line before I could slam on the brakes,” he recounted.

Moreover, certain test drives were intended solely for teaching the software to recognize vulnerable road users, including pedestrians, cyclists, and people in wheelchairs. One former tester recalled a moment when their vehicle approached within three feet of a cyclist before he hit the brakes.

“I clearly remember that guy jumping off his bike,” he said. “He was terrified. The car lunged at him, and all I could do was stomp on the brakes.” Surprisingly, his trainer was pleased with the near miss, stating that his delayed reaction was “perfect” and exactly what they were aiming for. “It felt like the goal was to simulate a hit-or-miss accident and steer clear at the last possible moment.”

Other companies, such as Cruise and Waymo, are also developing self-driving technology but say their software testing takes place in controlled settings, or assert that their systems differ fundamentally from Tesla’s. Yet these companies, too, have had vehicles fail to respond appropriately to their surroundings. The fate of Uber’s now-defunct self-driving unit, whose test vehicle struck and killed a pedestrian in 2018, is a lethal reminder of the stakes involved.

“If a parent is holding the bike the entire time, it never gets the chance to learn,” remarked a former Tesla engineer. At the end of the day, data is paramount. For these autonomous-technology firms, now beholden to shareholders, the result is a high-stakes environment that the public never agreed to be part of.
