
Tesla crash brings alive the fear of self-driving cars

Vikram Johri | Jul 11, 2016 | 21:15

A Tesla Model S on Autopilot recently crashed into another vehicle in Florida, and some of the fears around self-driving cars are beginning to be realised.

Autopilot is Tesla’s assisted driving system, the precursor to fully self-driving capabilities that the company, along with other Silicon Valley giants like Google, is developing.

On May 7, as Joshua Brown drove his Model S on a bright morning, a white tractor-trailer turned in front of the car.


According to experts, the white trailer against a brightly lit sky may explain why the Model S did not respond to it, that is, slow down or stop.

Brown, meanwhile, was too distracted to apply the brakes in time. He died in the resulting crash.

Since self-driving cars became the latest obsession of Silicon Valley, countless articles have been written on their potential benefits. That they will ultimately be safer than human-driven vehicles.

That they will free up premium road space. That they will do away with the need to park. That they might even alter the dynamics of the car industry by eliminating the need to purchase cars.

Everyone will be able to zip around by tapping into an always-on grid of driverless vehicles.

A still from a YouTube video of Joshua Brown in the driver's seat of his Tesla Model S. Brown died while the car was in Autopilot mode.

In spite of the Tesla crash, hard data backs the safety claims. The National Highway Traffic Safety Administration, an American body, recorded one fatality for every 100 million miles driven in the US in 2014.

According to Tesla, on the other hand, the accident on May 7 was the first fatality in 130 million miles of driving on Autopilot.
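
For the curious, the comparison works out like this. A back-of-envelope Python sketch, using only the two figures cited above (and bearing in mind that a single fatality is a very small sample):

```python
# Rough arithmetic on the two figures cited above: one fatality per
# 100 million miles overall (NHTSA, 2014) vs one fatality in
# 130 million miles driven on Autopilot (Tesla's figure).
us_fatalities_per_mile = 1 / 100_000_000
autopilot_fatalities_per_mile = 1 / 130_000_000

# Ratio of the two rates: about 1.3, i.e. on these numbers fatalities
# occurred roughly 30 per cent less often on Autopilot.
print(us_fatalities_per_mile / autopilot_fatalities_per_mile)
```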

Fully autonomous cars may indeed turn out to be a revolutionary idea in the same league as the steam engine or the internet. Interestingly, their development has been spearheaded not by traditional automakers but by technology behemoths.


Ride-hailing companies like Lyft and Uber have entered the game too. All this activity has meant that regulation has lagged behind the sector, even as technologies like the Model S's Autopilot have hit the market in beta versions.

It is not just Tesla: other carmakers like BMW and Volvo now sell vehicles with some level of autonomous driving.

Called Level 3 cars in automaker parlance, these vehicles enable drivers to briefly take their eyes off the road.

The trouble, as with all technology, is in the details, especially the ways humans choose to interact with it.

Studies of self-driving cars by Google and others have shown that drivers display a surprisingly blithe attitude to their safety the moment they are told that the car is even partially autonomous.

In videos recorded by Google and Tesla of volunteers driving autonomous cars, drivers have been known to snack, play cards, or even sleep.

This calls for urgently building Level 4 cars, in which the driver is not expected to take control in any situation. Those cars will be fully autonomous, and destination entry will be the only driver input required.

That said, fully autonomous cars will have their own problems. The detection technology self-driving cars currently use is called Lidar (light detection and ranging), which maps the surroundings by bouncing laser pulses off nearby objects. Because rain, fog and snow scatter those pulses, bad weather may hamper the ability of the car to navigate effectively.
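
To make that limitation concrete, here is a minimal Python sketch of how a perception system might have to discard weather-degraded Lidar returns. Every name, number and threshold in it is an assumption for illustration, not code from any actual vehicle:

```python
# Purely illustrative: weak, weather-scattered Lidar returns get filtered
# out, shrinking the car's effective view of the road ahead.

# Each Lidar return: (distance_in_metres, signal_intensity from 0.0 to 1.0)
returns_clear = [(12.0, 0.90), (35.0, 0.75), (80.0, 0.60)]
returns_fog = [(12.0, 0.40), (35.0, 0.18), (80.0, 0.05)]  # fog scatters the laser

MIN_INTENSITY = 0.30  # assumed cut-off: weaker returns are too noisy to trust

def usable_returns(returns):
    """Keep only returns strong enough to register as real obstacles."""
    return [(dist, strength) for dist, strength in returns
            if strength >= MIN_INTENSITY]

print(len(usable_returns(returns_clear)))  # 3 -- the car sees all three obstacles
print(len(usable_returns(returns_fog)))    # 1 -- the two distant obstacles vanish
```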


Google and others have said they will come up with alternatives to Lidar but at the moment, none are on the horizon.

With greater use of technology will come an increased risk of hacking, a problem that conventional cars, for all their issues, have not had to worry about.

Criminal hackers may seize control of a car's systems and demand a ransom, payable in Bitcoin, to relinquish it. And they may do so from nebulous corners of cyberspace, without ever having to show up in person.

Finally, using cars is not just about driving from A to B. It involves making judgements every step of the way, and while self-driving cars can learn to handle various situations based on feedback, some conditions may prove too delicate.

Autonomous cars may have to grapple with dilemmas that are essentially ethical in nature.

If an animal suddenly crosses the road, what should the car do? Should it swerve sharply and thus expose those in the car to danger? Or should it let the animal be sacrificed? What if the intruding party were a child?

There are no easy answers to these questions. In fact, there may be no answers at all, and it might be best for the person in the thick of the action to respond in the moment.

That, however, is not an option for the makers of self-driving cars, who will have to program responses to such circumstances into the system.
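
What programming such a circumstance might look like is easy to sketch and uncomfortable to read. The toy Python below is entirely hypothetical, not any manufacturer's method; it reduces the dilemma to weighted costs that a human must choose in advance:

```python
# Entirely hypothetical sketch of a dilemma reduced to a cost function.
# The weights ARE the moral judgement -- a human picked them beforehand.

HARM_WEIGHTS = {       # assumed values, for illustration only
    "occupants": 1.0,
    "animal": 0.1,
}

def choose_manoeuvre(options):
    """Pick the manoeuvre with the lowest expected weighted harm."""
    def expected_harm(option):
        return sum(prob * HARM_WEIGHTS[party]
                   for party, prob in option["risks"].items())
    return min(options, key=expected_harm)

options = [
    {"name": "swerve", "risks": {"occupants": 0.3}},  # danger to those in the car
    {"name": "brake", "risks": {"animal": 0.9}},      # the animal is likely hit
]
print(choose_manoeuvre(options)["name"])  # "brake": the animal is sacrificed
```

Raise the weight on the intruding party, for a child rather than an animal, say, and the chosen manoeuvre flips. Someone has to pick those weights before the car ever leaves the factory.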

Self-driving cars, in that respect, extend the remit of technology into a disturbing space where humans handling it are called upon to play God.

Last updated: July 12, 2016 | 17:26