Autopilot feature in cars is misleading, and dangerous

Jan 23, 2017 | Car Accidents

Automaker Tesla was recently cleared of allegations that its Autopilot driver-assistance system is defective. A crash that killed a man in May 2016 while he was using Autopilot sparked a federal investigation into the system.

Tesla’s Autopilot feature is supposed to help prevent accidents by warning drivers about other vehicles and obstacles that could be dangerous. Using cameras, radar and other sensors, programs such as Autopilot are designed to identify objects on the road and predict how they may behave. In the fatal Tesla accident, the system failed to distinguish a white semi truck from the bright sky behind it. Despite this accident, federal regulators determined that Autopilot does not warrant a recall.

The future of driving?

Tesla is not the only automaker using self-driving and auto-assist programs. Manufacturers that have or plan to release such programs include Mercedes-Benz, Ford, General Motors, Honda, Chrysler and Volvo. In addition, Google, Uber and Apple are working on driverless vehicles.

Part of the danger of these systems is drivers’ over-reliance on them. Some drivers believe they do not need to pay attention or keep their hands on the wheel when in self-driving or auto-assist mode. Names such as “Autopilot” reinforce the impression that the car is doing all of the work.

Regardless of the advanced systems your vehicle may have, it is critical to keep your eyes and mind on the road and your hands on the steering wheel. Technology should assist drivers, not replace them.
