While Elon Musk is busy with his Twitter theatrics, his Texas-based automotive company Tesla is also in the news for all the wrong reasons. Just yesterday, a speeding Tesla was involved in a fatal crash in China!
What: A Tesla Model Y seemed to go out of control as it ran over a motorcyclist and a high-school girl in China’s Chaozhou City, which lies to the northeast of Hong Kong. Three more people were injured. Within no time, Tesla was trending on Weibo, the country’s equivalent of Twitter.
CCTV footage of the “rogue” Tesla has been making the rounds on the internet, leading cynics to question whether the world is ready to embrace smart tech like Tesla cars.
How did the accident happen? Contrary to internet rumours, the car wasn’t driving itself. The Model Y’s 55-year-old owner was in the driver’s seat when the accident happened, and he claims that the car’s brakes went unresponsive even though his foot was on the pedal.
How did Tesla respond to the tragedy? According to a statement Tesla offered to Reuters, "Police are currently seeking a third-party appraisal agency to identify the truth behind this accident and we will actively provide any necessary assistance."
Tesla’s own team is also looking into the accident. The company points out that the brake lights never appear to come on in the CCTV footage, which it believes suggests the brakes were never engaged in the first place.
Can a Tesla drive itself entirely? Ever since Tesla’s electric cars were equipped with Autopilot technology, people have naturally been concerned about its practicality in the real world. At the same time, it must be noted that Tesla has always claimed that active driver supervision is required even when the car is on Autopilot. The mode allows the car to steer, brake, and control its speed on its own.
Tesla has been pushing its tech further toward a fully “autonomous self-driving” mode, but the research is still ongoing. And with incidents like the recent crash in China, doubts are bound to arise about the safety of a car that drives itself.
Tesla accidents in the past: Customers have long been disgruntled with the way Tesla fails to take much accountability for its cars catching fire and for Autopilot-led crashes.
In at least one case where a fire was ruled the driver’s fault, Tesla still decided to extend its vehicle warranty to cover fire damage. Investigations into battery defects by the US government’s National Highway Traffic Safety Administration (NHTSA) followed in subsequent years.
However, the NHTSA’s investigations have also found cases where a Tesla malfunctioned because the car’s system got confused by an exit on the freeway. So, who is at fault? The human or the machine?
An ongoing trial on Tesla’s Autopilot feature: In 2019, Kevin George Aziz Riad exited a freeway in California in his Tesla Model S, ran a red light, and collided with a Honda Civic, killing a couple who were on their first date.
At the time of the tragedy, the Model S had Autopilot engaged. Yet again, the question arises of who is to blame.
“The state will have a hard time proving the guilt of the human driver because some parts of the task are being handled by Tesla,” says Edward Walters, a Georgetown University professor who specialises in the laws around self-driving cars.
Riad’s trial is ongoing, with his lawyers pointing fingers at Tesla’s tech while prosecutors argue that Riad should be held guilty for speeding and failing to brake.
As Reuters reports, the US Department of Justice has been deciding whether Tesla should also be criminally charged over its self-driving claims. Donald Slavik, the lawyer representing the families of the deceased in the Riad trial, is asking for Riad to be sentenced while also asking for Tesla to be held accountable.
“The Tesla system, autopilot and Tesla spokespeople encourage drivers to be less attentive. Tesla knows people are going to use autopilot and use it in dangerous situations,” he reportedly told Reuters.
The trial is yet to reach a conclusion, but its media coverage is already changing public perception of Tesla.
Will cars turn against humanity? From Skynet creating Terminators to The Matrix’s machines enslaving humanity, the central fear of dystopian science fiction is that one day, Artificial Intelligence will control humans instead of assisting them, as it was supposed to.
Back when the concept of robots began to gain currency, they were always seen as slaves. It’s literally in the name: the word robot derives from robota, a term for forced labour in Czech and other Slavic languages, and it was Czech playwright Karel Čapek who first coined it. Since then, Russian-born sci-fi maestro Isaac Asimov’s fiction and non-fiction have both contributed to furthering the concept of robots and robotic tech as subservient tools for humankind’s betterment.
For robotics aficionados, Asimov’s three laws of robotics are basic knowledge. But for the unacquainted, these are his commandments for an ideal robot:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
As of now, Tesla’s tech has arguably already violated the first law of robotics. If the Model Y driver from China is telling the truth, and the car really did ignore his attempt to brake, then it may have violated the second law too. Let’s just hope a violation of the third law doesn’t become a reality!