For drivers who love the new and the easy, Tesla's autopilot sounded great. However, a fatal May crash has raised serious questions about the system, the lawsuits it may spawn, and the future of self-driving cars.
Tesla’s autopilot system is made up of:
- A radar system
- A camera
- Ultrasonic sensors affixed to the vehicle
The camera reads speed limit signs and lane markings, helping the car avoid drifting out of its lane. Sensors with a 16-foot range alert the driver when other vehicles are too close for comfort, and digital controls operate the vehicle's steering and brakes.
Appreciative drivers believe the system, which Tesla introduced in October 2015, lets them take their hands off the wheel and their feet off the pedals. Tesla, however, says it expects drivers to keep their eyes on the road so they can regain control of the vehicle when necessary.
The potential to overestimate the system’s capabilities is huge. On May 7, Joshua Brown, 40, died after putting his Tesla Model S into autopilot mode in Williston, Florida. The vehicle’s sensors did not pick up a white tractor-trailer crossing the road against a bright sky.
The impact of the Model S running at full speed under the trailer sheared off the top of the car. Tesla reported that neither the autopilot system nor Brown distinguished the white tractor-trailer against the bright sky, so neither applied the brakes.
Brown was a Tesla lover who created YouTube videos showing his vehicle operating in autopilot mode. They suggested he felt safe even when he released the steering wheel.
Now here’s the kicker: This wasn’t the only crash.
A second driver operating a Tesla in autopilot mode was involved in a serious accident on July 1. Pennsylvania State Police are investigating the crash, which occurred on the Pennsylvania Turnpike roughly 100 miles from Pittsburgh.
The car overturned and came to rest on its roof. Its driver and its passenger were transported to a hospital and subsequently released. Tesla has stated that it can’t confirm whether the driver or the autopilot was in control of the Model X when it crashed.
Tesla's autopilot feature is still in a testing phase. In May, the company issued a statement advising drivers to remain alert enough to hit the brakes or grab the steering wheel if danger arises.
The company maintains that its drivers have logged more than 130 million miles while using autopilot, with a single fatality: roughly 0.77 deaths per 100 million miles, against federal data showing 1.12 deaths for the same mileage. Even so, the two crashes so far have cast serious doubt on the safety of autopilot and the future of self-driving vehicles.
To Sue or Not to Sue
So what's the bottom line? Experts estimate that roughly 30,000 deaths a year are linked to highway accidents. Litigation over alleged malfunctions in Tesla's autopilot feature is probable. Whether plaintiffs prevail in a product liability case will depend on whether the driver was led to believe the autopilot feature had more capabilities than it did, and whether the driver received adequate warning about the system's potential defects.
If the manufacturer can prove a driver ignored the safety features in place, that could be a different ballgame. How damages are apportioned to fault varies considerably from one state to the next. Tesla warns that its system is really "traffic-aware cruise control" and says drivers should keep their hands on the wheel at all times. Add to this the tractor-trailer driver's claim that, after the collision, he heard a Harry Potter movie playing in Brown's car.