Getting an extra 30 minutes of sleep while you’re on the road… finishing yesterday’s homework on the way to your senior year of high school… tending to your baby while cruising down the highway. All of these scenarios seemed too good to be true a few years ago, but America is now on the cusp of the age of autonomous vehicles (AVs). In 2016, 87.5 percent of Americans ages 16 and older held a driver’s license and spent, on average, 17,600 minutes a year on the road. A car that could take over the tedious, time-consuming work of driving is a dream that is quickly becoming a reality.
For the disabled, the benefits of autonomous cars are even greater – at a minimum, they allow for increased independence. The legally blind, for example, would finally be able to travel by car safely on their own. A 2012 video of a legally blind man stopping at a Taco Bell drive-thru in a self-driving car generated considerable excitement. The technology to let the blind ride safely is still in development, but as one report notes, “At Waymo, Google’s self-driving car company that was launched nearly a decade ago, officials say visually impaired employees contribute to design and research. While no specific system for blind riders has been completed, the company says it’s developing a mobile app, Braille labels and audio cues.” As Americans get older, a self-driving car could also help those experiencing a “loss [of] flexibility, vision and hearing” and delayed reaction times. Of course, these are some of the same impairments suffered by those with brain injuries.
However, this may all seem too good to be true because, for now, it is. Car fatalities have been on a nearly steady decline, from a high of more than 50,000 per year in the 1970s to the low-to-mid 30,000s this decade. (“An additional 2.35 million are injured or disabled.”) That is still an extremely high number, but how will fully autonomous or semi-autonomous cars affect it? Beyond testing, no one knows whether, or by how much, self-driving cars will reduce it.
Cars don’t have the same “sense” that people do. Only a month ago, on March 18, 2018, in Arizona, a homeless woman became the first pedestrian fatality attributed to this new technology. “If there is any real-world scenario where it would be seemingly safe to operate in an automated mode, this should have been it. Something went seriously wrong,” said an urban planning professor after the incident. (The car that struck her was a self-driving Uber; Uber has since suspended its self-driving car tests.) In Mountain View, CA, headquarters of self-driving car company Waymo, Walter Huang was killed when his Tesla, which had signaled that it needed him to take the wheel, drove straight into a highway median; sun glare may have played a role. Two years ago, in Florida, a man was killed when he failed to take the wheel after numerous notifications from his semi-autonomous car. (The National Transportation Safety Board released a report of findings about that incident.)
In a horrifying test reported by Psychology Today this month, “some recent demonstrations have shown that a few black stickers on a stop sign can fool the algorithm into thinking that the stop sign is a 60 mph sign.” As for accidents, in Pittsburgh, PA in late February, a “Woman claim[ed a] self-driving Uber struck her car, left the scene.” Did the driver choose not to stop, or did the car leave on its own?
The above are just a few examples of accidents, or near-accidents, attributed to problems with autonomous cars. Tesla said in 2016, “Autopilot is by far the most advanced such system on the road, but it does not… allow the driver to abdicate responsibility.” Presumably the technology has gotten much safer in the past two years, because California just legalized testing of fully autonomous vehicles on public roads. Nationally, H.R. 3388 passed the House unanimously. The bill’s stated purpose is “to provide for information on highly automated driving systems to be made available to prospective buyers.” Reading further, though, one finds that the goal of the bill is “encouraging the testing and deployment of such vehicles.” (Read also: California proposes new rules for self-driving cars to pick up passengers.)
Self-driving cars have already been tested in multiple states with positive results. In California, for example – the state with the most drivers in America and the one testing AVs most extensively – Waymo just applied to the state to do what the above law allows: test self-driving cars without a back-up driver on public roads. (Besides California, many other states already have laws, or proposed laws, on the legality of self-driving cars.) Six months ago, GM announced its plan to begin testing its Chevy Bolt EV in Manhattan later in 2018. In Connecticut, Governor Dannel P. Malloy created a pilot program, which will soon launch, to test fully automated cars. And this month, the Pentagon announced that it intends to become the next big AV developer, as it plans to use self-driving vehicles in combat. As Michael Griffin, the undersecretary of defense for research and engineering, put it, “52 percent of casualties in combat zones can be attributed to military personnel delivering food, fuel and other logistics.” Removing humans from that equation would save many lives.
Since there has been no final determination of the safety or legality of self-driving cars, either for the general population or for the disabled, no conclusion can be reached in this post. Some car manufacturers are addressing the public’s worries about fully autonomous cars by building cars that are not truly autonomous at all. One company, Phantom Auto, for example, has developed a remote-control system in which the car is “driven” remotely by an employee miles away.
But perhaps the worry about autonomous cars is similar to the worry that arose when America moved from the horse and buggy to the modern car? The concern and the extensive testing are understandable, but some states recognize that the testing phase must end at some point. Is that time now? And should those who are currently kept from driving by age or disability finally be allowed to get a key?
* Another issue some have with self-driving cars is that “AVs will record everything that happens in and around them. When a crime is committed, the police will ask nearby cars if they saw anything.” For car accidents and other physical or vehicular traumas, this is a plus. But while a person or their family may want to know which vehicle caused their child’s accident, do they want to give the government the ability to know exactly when they left for work, went to Walmart, refilled their gas tank, and so on? Will self-driving cars become a means of social control?