
Tesla The Wonder Car kills 2 in China as 'man vs machine' debates rage on

Shaurya Thapa | Nov 15, 2022, 14:52


Tesla's recent malfunctions might make it a major violator of Isaac Asimov's famous laws of robotics (photo-DailyO)

While Elon Musk is busy with his Twitter theatrics, his Texas-based automotive company Tesla is also in the news for all the wrong reasons. Just yesterday, a speeding Tesla was involved in a crash in China that killed two people.

What: A Tesla Model Y seemingly went out of control and ran over a motorcyclist and a high-school girl in China's Chaozhou City, which lies to the northeast of Hong Kong. Three more people were injured. In no time, Tesla was trending on Weibo, the country's equivalent of Twitter.


CCTV footage of the "rogue" Tesla has been making the rounds on the internet, leading cynics to question whether the world is ready to embrace smart tech like Tesla cars.

How did the accident happen? Contrary to internet rumours, the car wasn't driving itself. The Model Y's owner, a 55-year-old man, was in the driver's seat when the accident happened; he claims the car's brakes went unresponsive even though his foot was on the pedal.

How did Tesla respond to the tragedy? According to a statement Tesla offered to Reuters, "Police are currently seeking a third-party appraisal agency to identify the truth behind this accident and we will actively provide any necessary assistance."

Tesla's own team is looking into the accident. The company points out that the brake lights never appear to come on in the CCTV footage, suggesting to it that the brakes were never engaged in the first place.

Can a Tesla drive itself entirely? Ever since Tesla's electric cars were equipped with Autopilot technology, people have naturally been concerned about its practicality in the real world. At the same time, it must be noted that Tesla has always maintained that active driver supervision is required even when Autopilot is engaged. The mode allows the car to steer, brake, and control its speed on its own.

(meme-DailyO)

Tesla has been pushing its tech further to make its cars fully "autonomous self-driving", but the research is still ongoing. And with incidents like the recent crash in China, doubts are bound to arise about the safety of a car that can drive itself.

Tesla accidents in the past: Customers have been disgruntled by the way Tesla fails to take much accountability for its cars catching fire and for Autopilot-led crashes.

  • Back in 2013, a Model S caught fire after hitting metal debris on a highway in Kent, Washington; the company confirmed that the fire started in the battery pack after the crash.
Model S catches fire (photo-DriveSpark on Twitter)

While the accident was the driver’s fault, Tesla decided to extend its vehicle warranty to cover fire damage. Investigations into battery defects by the US Government’s National Highway Traffic Safety Administration (NHTSA) followed in the subsequent years. 

  • The first case of an autopilot-related Tesla death took place in 2016, when a Model S driver died in a collision with a tractor-trailer while his car was engaged in Autopilot. Maybe humans aren't ready for such smart tech, as some Tesla accidents also result from human folly.
  • For instance, in 2018, the driver of a Model X died with the car in Autopilot mode. The late Apple engineer was allegedly playing games on his phone before the car crashed into a barrier in the middle of a freeway.
Remains of the crashed Tesla Model X from 2018 (photo-Car Scoops)

However, the NHTSA's investigations also concluded that the Tesla malfunctioned, with the car's system getting confused by an exit on the freeway. So, who is at fault: the human or the machine?

An ongoing trial on Tesla's autopilot feature: In 2019, Kevin George Aziz Riad exited a freeway in California in his Tesla Model S, ran a red light, and collided with a Honda Civic, killing a couple who were on their first date.

At the time of the tragedy, the Model S was engaged in Autopilot. Yet again, the question arises: who is to blame?

Kevin George Riad's crashed Model S (photo-The Drive)

"The state will have a hard time proving the guilt of the human driver because some parts of the task are being handled by Tesla," says Edward Walters, a Georgetown University professor who specialises in the laws around self-driving cars.

Riad's trial is ongoing, with his lawyers pointing fingers at Tesla's tech while prosecutors claim that Riad should be held guilty for speeding and failing to apply the brakes.

As Reuters reports, the US Department of Justice has been deciding whether Tesla should also be criminally charged over its self-driving claims. Donald Slavik, the lawyer representing the families of the deceased in the Riad trial, is asking for Riad to be sentenced while also demanding that Tesla be held accountable.

"The Tesla system, autopilot and Tesla spokespeople encourage drivers to be less attentive. Tesla knows people are going to use autopilot and use it in dangerous situations," he reportedly told Reuters.

The trial is yet to conclude, but its media coverage is already changing public perception of Tesla.

Will cars turn against humanity? From Skynet creating Terminators to the Matrix's machines enslaving humanity, the central fear in dystopian science fiction is that one day, Artificial Intelligence will control humans instead of assisting them as it was supposed to.

Back when the concept of robots began to gain currency, they were always seen as slaves. It’s literally in the name. 

The word robot comes from a Slavic root associated with labour, and Czech playwright Karel Čapek was the first to coin the term. Since then, Russian-born sci-fi maestro Isaac Asimov's fiction and non-fiction have both furthered the concept of robots and robotic tech as subservient tools for humankind's betterment.

A self-driving car shown in the 2004 film I, Robot, based on Isaac Asimov's writings (GIF-IMDb)

For robotics aficionados, Asimov's three laws of robotics are basic knowledge. But for the unacquainted, these are his commandments for an ideal robot:

  • First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  • Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

As of now, Tesla's tech has already violated the first law of robotics. If the Model Y driver from China is telling the truth, and he indeed tried to hit the brakes while the car refused to follow his command, then the Tesla may have violated the second law too. Let's just hope a violation of the third law doesn't become a reality next!

Last updated: November 15, 2022 | 14:53