Is It Time to “Pump the Brakes” on Self-Driving Cars?
A driverless SUV operated by Uber struck and killed a woman crossing a street in Tempe, Arizona, on Sunday, March 18. Since the incident, numerous concerns have been voiced over the safety of these vehicles and whether they should become a part of our communities.
For a little over a year now, Uber has been testing its self-driving cars in cities like Tempe, Pittsburgh, San Francisco, and Toronto. Although a “backup driver” has been present in the vehicle during most tests, the car primarily operates using an Automated Driving System (ADS), which relies on several cameras and sensors to navigate. Since the incident, Uber has suspended its tests until the primary cause of the accident is determined. Uber CEO Dara Khosrowshahi expressed his condolences on Twitter, stating, “our hearts go out to the victim’s family”. He also assured that the company would cooperate fully with investigators from the Tempe police and the National Transportation Safety Board.
The Purpose Behind Self-Driving Vehicles
Driverless cars are becoming more popular and are likely here to stay. The idea behind them is that they will improve road safety. According to the National Highway Traffic Safety Administration (NHTSA), ninety-four percent of crashes are caused by human error. By removing the human from the equation and allowing a computer to control the car’s movements, this technology could potentially decrease the number of car accidents and fatalities the US experiences every year.
Uber isn’t alone in the testing phase. Companies like Google-affiliated Waymo and Tesla have their own versions of driverless vehicles that are slowly rolling out in several cities. Although each company has invested years of research and development in these projects, the issue remains the same: the cars’ computers are using our cities and communities as an educational driving course. As they navigate our roads, they are learning proper maneuvers and driving behaviors.
“Our roads are the training ground for these vehicles,” said attorney Gene Riddle. “These test cars may not be able to react to something they don’t understand yet. It’s like the first day at a new job. These cars are learning as they go. If a pedestrian steps out into the road, these cars may not know how to properly avoid hitting them until they have learned or been programmed to do so.”
Other Autonomous Incidents
In early 2017, Uber experienced its first self-driving accident in Tempe. The Uber car was reportedly knocked on its side after another vehicle hit it. Those involved in the crash did not sustain any injuries.
Other companies have also experienced issues with their self-driving vehicles. Tesla’s Model S, equipped with an “autopilot mode”, has been involved in two fatalities allegedly related to the feature since its launch.
The Future of Self-Driving Cars
In the months to come, we can expect to see more self-driving cars, and possibly interstate trucks, deployed in the United States. During this time, they will gather the data needed to improve their driving abilities. While the vehicles learn the rules of the road, it is just as important for our lawmakers to learn what it will take to make these vehicles safe. Although several states have already begun discussing this issue, very few regulations have been put in place to protect our fellow drivers and pedestrians.
Ultimately, car manufacturers, technology leaders, and our nation’s lawmakers need to agree on reasonable and sensible regulations that put the safety of humans above that of machines. Until we get there, the likelihood of another incident such as this one will remain high.
“This is a problem that isn’t going away anytime soon,” says Gene. “We know it could become more frequent as these cars hit our streets, so we must stay informed on how the government will regulate them so that we can protect our people if they’re injured in a similar accident.”