Car Crashes That Happen in Autopilot Mode: Who’s To Blame?
Posted Wednesday, September 14, 2016 by Christopher L. Thayer
The first known fatal crash of a Tesla vehicle operating under its "autopilot" feature has sparked controversy and drawn federal investigators. Joshua Brown, a 40-year-old Tesla enthusiast, was killed when his 2015 Tesla Model S collided with a tractor-trailer while the autopilot was engaged. Apparently, neither Mr. Brown nor the autopilot system applied the brakes. This is the first known fatal crash involving a self-driving system, and Tesla has confirmed that the car was in autopilot mode when the collision occurred.
The incident occurred after a tractor-trailer made a left turn in front of Mr. Brown's Tesla at a Florida highway intersection. According to Tesla, the autopilot failed to distinguish the white side of the trailer against a brightly lit sky and therefore did not apply the emergency brake. Mr. Brown was fatally injured when his car continued underneath the tractor-trailer; the car's roof struck the underside of the trailer, shattering the windshield, and the vehicle finally came to rest after hitting a utility pole. The facts are still developing, but some sources indicate that Mr. Brown was watching a movie on a portable device at the time of the crash.
Tesla's autopilot feature, available when driving at cruising speeds, uses adaptive cruise control combined with a system called "Autosteer," a form of lane-keeping assistance. The cruise control system uses radar and forward-facing cameras to track the position of cars on the road ahead and adjust the car's speed as needed. Meanwhile, the Autosteer function uses cameras mounted on the car to track its position in the roadway relative to lane markings and nearby vehicles. Combined, the two systems can steer the vehicle (even changing lanes), speed up, slow down, or stop, depending on the circumstances. According to Tesla, it is not intended to act as a "self-driving" car, but merely as a sophisticated driver-assistance system. Autopilot mode, when activated, instructs the driver to keep their hands on the steering wheel at all times, something Mr. Brown apparently failed to do.
Although fully self-driving vehicles are predicted to be offered to the public in the near future, the current autopilot feature in Tesla vehicles is designed mainly to assist drivers in avoiding accidents. The Tesla autopilot function does not make the cars fully autonomous. The technology can take over steering and engage adaptive cruise control that adjusts the car's speed without driver input. The vehicle can also change lanes when the driver activates a turn signal and, most importantly, can apply the emergency brake automatically if it senses an imminent crash. While Tesla representatives attest to its safety, the autopilot feature is still in beta, and customers are warned that they remain responsible for staying alert and attentive while using it.
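For readers curious about how these pieces fit together conceptually, the sketch below is a purely illustrative, highly simplified control loop, not Tesla's actual software. Every name, sensor value, and threshold in it is hypothetical, invented only to show how adaptive cruise control, lane keeping, and an automatic emergency-brake check might be combined in a generic driver-assist system of the kind described above.

```python
# Hypothetical sketch of a generic driver-assist control cycle.
# Nothing here reflects Tesla's actual software; all names and numbers are invented.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class SensorFrame:
    lead_distance_m: Optional[float]  # estimated gap to a lead vehicle; None if nothing detected
    lane_offset_m: float              # drift from lane center (positive = right of center)
    ego_speed_mps: float              # current vehicle speed


def plan_control(frame: SensorFrame, set_speed_mps: float = 29.0) -> Tuple[float, float]:
    """Return (acceleration command in m/s^2, steering command) for one control cycle."""
    # Emergency braking: if an obstacle is closer than a hypothetical minimum
    # safe gap, brake hard regardless of the cruise set speed.
    if frame.lead_distance_m is not None and frame.lead_distance_m < 10.0:
        return -8.0, 0.0  # maximum braking, hold steering straight

    # Adaptive cruise control: hold the set speed, but ease off when a slower
    # lead vehicle closes the gap below a hypothetical following distance.
    if frame.lead_distance_m is not None and frame.lead_distance_m < 40.0:
        accel = -1.5
    else:
        accel = 0.5 if frame.ego_speed_mps < set_speed_mps else 0.0

    # Lane keeping ("Autosteer"-style): steer gently back toward lane center
    # in proportion to the measured offset.
    steering = -0.1 * frame.lane_offset_m
    return accel, steering


# Example cycle: a lead vehicle 35 m ahead while drifting 0.3 m right of center.
print(plan_control(SensorFrame(lead_distance_m=35.0, lane_offset_m=0.3, ego_speed_mps=27.0)))
```

The point of the sketch is simply that such a system only reacts to what its sensors detect; if an obstacle is never recognized, as Tesla says happened with the white trailer against the bright sky, the emergency-brake branch is never triggered.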
Experts have criticized how aggressively Tesla has been marketing the safety of its autopilot-equipped vehicles. Tesla's co-founder, Elon Musk, has been quoted as saying of the autopilot system, "we knew we had a system that on balance would save lives." Tesla explicitly informs drivers that the autopilot function is still in beta. In fact, the autopilot function is turned off by default and must be activated by the driver. However, these warnings may not be enough for Tesla to escape liability for the May 7 crash.
Many questions arise out of a collision where the "driver" was not a person but the vehicle itself. Can Mr. Brown's family file suit against Tesla, claiming the autopilot failed and/or was unsafe? Did Tesla's system "fail," or was it not "reasonably safe" under the unique circumstances of the case? Tesla presumably would defend such a suit by noting that all drivers who activate autopilot are directed to keep their hands on the steering wheel and to pay attention at all times, and that Mr. Brown purportedly disregarded these warnings. What if Mr. Brown's Tesla had struck another car, causing serious injuries or fatalities in that vehicle? Could those victims make claims against Tesla, or would they be required to sue Mr. Brown, as the actual "driver" of the car, who arguably was negligent? These questions and many others remain to be sorted out by courts and legislatures around the country as we plunge into the world of semi-automated cars and, eventually, fully "self-driving" cars.
Mr. Brown’s family has hired a personal injury lawyer to look into the matter further. It has yet to be determined whether a suit will be filed.
Photo credit: benjie castillo, used under the Creative Commons license