Last week, a self-driving vehicle operated by Uber struck and killed a pedestrian, Elaine Herzberg, in Arizona around 10 p.m. The car was operating in an autonomous mode, similar to Tesla’s Autopilot feature. Although the car was self-driving, a driver was at the wheel to take over in case the autonomous system failed; the car had no passengers at the time of the incident.
The fatal accident occurred as Ms. Herzberg was walking her bicycle across the street. The street she was crossing did not have a crosswalk, and the speed limit was 45 miles per hour. At the time of impact, Uber’s car was traveling about 40 miles per hour. Initially, the driver of the Uber vehicle stated that Ms. Herzberg “darted” out in front of the vehicle and that, as a result, the driver was unable to stop in time to avoid hitting her. However, video from the accident, which contains both interior and exterior views, shows that Ms. Herzberg was more than halfway across the road at the time of impact and that the driver at the wheel of the Uber was looking down, not at the road, at the time of the accident.
Following the accident, Arizona suspended Uber’s ability to test autonomous, self-driving vehicles in the state. Arizona’s governor referenced the video of the accident and stated, “I found the video to be disturbing and alarming, and it raises many questions about the ability of Uber to continue testing in Arizona.” However, the suspension applies only to Uber; other companies will be allowed to continue testing self-driving vehicles in the state. At the time of the incident, Uber was testing its self-driving cars in both Arizona and California. In order to test its self-driving cars in California, Uber was required to periodically report the results of its tests to the state. Arizona, however, imposed no such reporting requirement on Uber.
Problems in self-driving systems have been well documented. For more information on failures in Tesla’s Autopilot system, see our previous blog HERE. Drivers using autopilot or self-driving systems can reduce pedestrian injuries and fatalities by staying actively engaged in the act of driving. Even if a car has a self-driving feature, the driver should always stay alert, actively watch for pedestrians, and avoid distractions, such as texting.
Even though Illinois does not yet have a law concerning self-driving cars, Illinois’ Motor Vehicle Code applies to drivers using systems similar to autonomous driving systems, such as autopilot systems. For example, Section 5/11-1002 of the Illinois Motor Vehicle Code outlines the duty of drivers to yield to pedestrians in crosswalks. 625 ILCS 5/11-1002. Under the Code, drivers of vehicles, including self-driving vehicles and vehicles with autopilot systems, must yield to pedestrians who are within a plainly marked crosswalk between intersections. A driver who is not paying attention can easily strike a pedestrian who is lawfully crossing in a crosswalk and cause serious and painful injuries to that pedestrian.
As injuries from cars’ self-driving systems increase, it is clear that many of these accidents could easily be prevented if drivers stayed engaged in the act of driving and applied the vehicle’s brakes to avoid a collision. If you or a loved one has been struck and injured by a car as a pedestrian, regardless of whether the driver was using an autopilot or self-driving system, contact the car accident lawyers at the law firm of John J. Malm & Associates to learn more about how you may be entitled to receive compensation for your injuries.