
The Real Problem with Self-Driving Cars

Written by Ankin Law Office

Recent fatal crashes involving self-driving vehicles in California and Arizona have put a spotlight on the safety of autonomous technology and the risk of human injuries.

Are Self-Driving Cars Really Self-Driving?

Although autonomous technology allows self-driving cars to operate on their own, a human driver is still needed behind the wheel to override self-driving features if something goes wrong. Self-driving cars are not yet truly self-driving. Autonomous technology promises many potential benefits for communities, such as pedestrian-friendly city environments, increased mobility for elderly and disabled drivers, and fewer traffic accidents. Self-driving cars are already on the road in Arizona, California, Michigan, Nevada, Pennsylvania, Texas, and Washington, but they are currently restricted to specific testing areas and driving conditions. Since accident statistics show that 90 percent of car crashes are caused by human error, the National Safety Council and many states have been strong advocates of self-driving cars over the last decade. However, recent fatalities in Arizona and California involving autonomous vehicles have raised safety concerns and halted testing in several states.

On March 18, a 49-year-old woman was killed in Tempe, Arizona when she was struck by an Uber self-driving SUV while crossing the street with her bicycle. Video of the crash shows a human test driver behind the wheel who was looking down, not at the road, when the accident occurred. Although several self-driving car accidents have been recorded, this is the first pedestrian fatality directly related to autonomous technology. After the accident, Arizona’s Governor ordered Uber to stop its testing of autonomous vehicles on Arizona roads.

On March 23, a California driver was killed when his Tesla, operating with the Autopilot feature engaged, crashed into a highway barrier. Tesla safety officials said that the driver did not take control of the steering wheel before the crash, even though he received several audible and visual alerts to take over the steering before the car hit the barrier. Tesla blamed the driver’s death on a damaged barrier that was not strong enough to absorb the impact of the crash.

Who’s Monitoring the Road?

The recent fatal crashes have raised concerns among safety experts about autonomous technology. The National Highway Traffic Safety Administration (NHTSA) has expressed concerns regarding a test driver’s ability to safely monitor self-driving features and take control of the car when something goes wrong. Safety officials fear that the more advanced autonomous technology becomes, the more drivers will be lulled into a false sense of security. In the recent death of the Arizona pedestrian, the test driver behind the wheel failed to override the technology that missed seeing the pedestrian crossing the road.

Safety officials state that this is an urgent reminder that a driver is still a car’s best safety feature, because driving requires full, constant attention to the road and its surroundings. The recent fatalities emphasize that the transition from partial to full automation will be a challenge for auto manufacturers who must ensure driver and pedestrian safety. If safety features fail to engage or operate properly, test drivers must be prepared to quickly disengage autonomous features and take control of the vehicle to prevent auto accidents and injuries.

Currently, autonomous technology uses a variety of features, including GPS systems, internal navigational maps, cameras, sensors, and radar, to navigate streets and detect other vehicles, pedestrians, and objects in or near the self-driving vehicle. Industry standards define levels of automation ranging from 0, where a human driver controls everything, to 5, full automation. However, even the most highly automated vehicles currently on the road require a human driver behind the wheel to ensure safety.
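The automation scale referenced above comes from the SAE J3016 standard. As a minimal illustrative sketch (the level descriptions are paraphrased from that standard, not taken from this article), the scale and the point at which a human fallback driver stops being required can be expressed as:

```python
# Illustrative sketch of the SAE J3016 driving-automation levels.
# Descriptions are paraphrased summaries, not official standard text.
SAE_LEVELS = {
    0: "No automation - the human driver controls everything",
    1: "Driver assistance - e.g., adaptive cruise control",
    2: "Partial automation - combined steering and speed control",
    3: "Conditional automation - driver must be ready to take over",
    4: "High automation - no driver needed within a limited domain",
    5: "Full automation - no human driver needed in any conditions",
}

def requires_human_fallback(level: int) -> bool:
    """At levels 0-3, a human must be ready to take control of the car."""
    return level <= 3

for level, description in sorted(SAE_LEVELS.items()):
    fallback = "yes" if requires_human_fallback(level) else "no"
    print(f"Level {level}: {description} (human fallback required: {fallback})")
```

Note that the article's point still holds: regardless of a vehicle's nominal level, cars in testing today keep a safety driver behind the wheel.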

ADEPT Driver, a research company that develops crash avoidance training programs, cautions drivers about the limitations and potential dangers of autonomous vehicle technology. The company is concerned about drivers becoming complacent while operating vehicles equipped with self-driving features. Researchers at ADEPT Driver are warning safety officials that the recent Arizona and California fatalities emphasize the need for more safety features in self-driving cars. They say that test drivers need advanced crash avoidance training, along with visual and cognitive skills that help them quickly scan the road and its surroundings and identify safety hazards.

Consumer marketing studies show that many American drivers are uncertain about the safety of self-driving cars and about replacing humans behind the wheel. Over 50 percent of drivers say they would not purchase a self-driving vehicle until further safety testing is done. Other drivers say they simply enjoy the physical act of driving and don’t want to give that pleasure up to an autonomous vehicle. The recent deaths involving self-driving cars may further erode consumer confidence.