Inquiry-Based Research Essay

 

 

What are Autonomous Vehicles and How Do They Affect Society?

 

Autonomous vehicles are thought to be the future of transportation. Requiring no human interaction, they are seamless and efficient. Passengers can relax and enjoy the ride without any worries as the car makes its way to its destination. Despite recent advances toward fully autonomous vehicles, we are still only partway to fully capable self-driving cars. However, as more and more partially automated cars hit the road, people have started to wonder about ethics and morality. If a semi-self-driving car were to cause an accident, would the driver or the manufacturer be at fault? Or would it be the people who program the functionality of the car? If a self-driving car with no driver seated were pulled over, who would the police officer ticket?

 

What Are Autonomous Vehicles?

 

Autonomous vehicles are vehicles capable of driving and handling scenarios on the road without human input. The National Highway Traffic Safety Administration classifies self-driving cars into six levels, from no automation at level 0, through driver assistance at level 1 and partial automation at level 2, up to full automation at level 5. However, that full automation is far from ready. Currently, one of the most prominent automakers pursuing the goal of fully autonomous driving is Tesla.

According to Tesla’s website, Tesla’s vehicles are equipped with “Eight surround cameras…Twelve ultrasonic sensors…”(Tesla) and “A forward-facing radar”, hardware about which Tesla claims, “All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances.” Working together, these cameras and sensors provide data the car uses to make adjustments accordingly. Currently, Tesla’s Autopilot capabilities are limited, placing it at level 2 autonomy according to the Society of Automotive Engineers. The system is capable of steering, acceleration, and deceleration, but the driver must constantly monitor the environment and intervene when necessary. From Tesla’s website, current features of the Autopilot package include “Adaptive Cruise Control”, which allows Autopilot to follow another car while maintaining a safe distance; “Autosteer”, which keeps the car centered in its lane and can change lanes; “Navigate”, which allows Autopilot to make automatic lane changes, adjust speed based on lane, and navigate freeway interchanges; and safety features such as automatic swerving to avoid collisions and automatic emergency braking.

Another company is Waymo, which grew out of Google’s self-driving car project and became a stand-alone subsidiary. Compared to Tesla, its hardware costs more because of its LIDAR (Light Detection and Ranging) system. From the NOAA’s website, LIDAR “is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances)… These light pulses… generate precise, three-dimensional information about the shape of the Earth and its surface characteristics”(NOAA, “What is LIDAR.”). What does this mean for cars? Most LIDAR schemes used in cars have “mechanically rotating multiple optical transmitters and receivers on the vehicle’s roof”(Nature Photonics, “LiDAR Drives Forwards”). Compared to Tesla’s use of cameras, radar, and ultrasonic sensors, this places large components on the car, steering away from its traditionally sleek look. Another fault of LIDAR is its enormous cost. But where Tesla cannot compete is Waymo’s level of autonomy. Waymo’s vehicles can do everything Tesla’s are capable of, and can also fully drive themselves under many conditions without human input. This specialized hardware is what allows Waymo’s self-driving system to be categorized at level 4 autonomy. Waymo currently does not sell any cars with its self-driving feature and instead deploys its self-driving system in a taxi service.

 

Safety

 

With all this hardware working together in our vehicles, each company pursues a few shared goals. One such goal is safety. From self-driving systems, you can expect to see not only cars driving themselves, but cars evading dangerous threats to protect the driver and passengers.

According to Tesla’s website, its 2019 Q3 Vehicle Safety Report boasted “one accident for every 4.34 million miles driven in which drivers had Autopilot engaged”(Tesla), “For those driving without Autopilot but with our active safety features, we registered one accident for every 2.70 million miles driven”, and “For those driving without Autopilot and without our active safety features, we registered one accident for every 1.82 million miles driven.” By comparison, NHTSA’s most recent data show that “in the United States there is an automobile crash every 498,000 miles.”

What this means for drivers in the United States is that they are 5.42 times less likely to get into an accident with Tesla’s active safety features than the average driver, and 8.71 times less likely with Tesla’s self-driving system engaged.
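These ratios can be verified with a quick calculation from the miles-per-accident figures quoted above (a simple sketch; the variable names are illustrative):

```python
# Miles driven per accident, from Tesla's 2019 Q3 Vehicle Safety Report
# and NHTSA's U.S. average, as cited in this essay.
miles_autopilot = 4_340_000       # Autopilot engaged
miles_active_safety = 2_700_000   # active safety features only
miles_us_average = 498_000        # NHTSA U.S. average

# A higher miles-per-accident figure means fewer accidents per mile,
# so dividing by the U.S. average gives the "times less likely" ratio.
ratio_active_safety = miles_active_safety / miles_us_average
ratio_autopilot = miles_autopilot / miles_us_average

print(f"Active safety features: {ratio_active_safety:.2f}x less likely")  # 5.42x
print(f"Autopilot engaged: {ratio_autopilot:.2f}x less likely")           # 8.71x
```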

Unrelated to autonomous systems, Tesla’s vehicles were “found by the National Highway Traffic Safety Administration to have the lowest and second lowest probability of injury of all cars ever tested.” Tesla’s website also states that “not only has Model 3 achieved a perfect 5-star safety rating in every category and sub-category, but NHTSA’s tests also show it has the lowest probability of injury of any car the agency has ever tested.” The Model 3 went on to earn a 5-star rating from the European New Car Assessment Programme, which evaluates “a car’s ability to protect adults, children, vulnerable road users like cyclists and pedestrians, as well as its safety assistance features.”

Tesla’s robust chassis is another key benefit of its vehicles. Because Tesla’s cars are electric, the battery pack sits at the bottom of the car. This gives the vehicle a low center of gravity, keeping it firmly planted on the road. Besides handling benefits, such as taking tighter turns, this also makes the vehicle less likely to flip over.

 

Controversy

 

Within the development of fully autonomous vehicles, there are issues regarding its progress. One of the most concerning issues is accidents, leading people to question whether self-driving cars should exist on public roads. One such incident involved Uber.

Uber is a ride-sharing company, offering ridesharing, food delivery, and more. Self-driving technology lets Uber expand toward those goals. Unfortunately, one of its self-driving cars struck and killed a pedestrian who was walking a bicycle across the road at night. LIDAR can detect objects far away regardless of whether it is day or night, yet according to the National Transportation Safety Board, Uber’s self-driving system was unable to determine what the obstacle was, classifying it as unknown. It wasn’t until “1.3 seconds before the impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision”(NTSB, “PRELIMINARY REPORT HIGHWAY HWY18MH010”). The data showed that “all aspects of the self-driving system were operating normally at the time of the crash, and that there were no faults or diagnostic messages.”

Another incident involved Waymo. A Washington Post article reports, “A Waymo backup driver seized control of a self-driving minivan in Mountain View, Calif., last month to avoid an encroaching car, then quickly steered to the right and collided with a motorcyclist”(Laris, “A Waymo Safety Driver Collided with a Motorcyclist. The Company Says a Self-Driving Minivan Would Have Done Better.”). This, however, was not a fault of the autonomous system itself; the driver took control in response to what felt like an oncoming collision. Waymo apologized, stating, “we’re sorry that a member of the community was injured in a collision with one of our cars.”

 

Society

 

With autonomous vehicles posing a change to the status quo, people have voiced concerns over self-driving and its progress. A Washington Post article profiles Karen Brenchley, a “computer scientist with expertise in training artificial intelligence… former product manager, who has worked for Microsoft and Hewlett-Packard”(Siddiqui, “Silicon Valley Pioneered Self-Driving Cars. But Some of Its Tech-Savvy Residents Don’t Want Them Tested in Their Neighborhoods.”) and a “longtime Silicon Valley resident,” who fears self-driving cars. She wonders “how engineers could teach the robocars operating on her tree-lined streets to make snap decisions, speed and slow with the flow of traffic and yield to pedestrians”. As the article puts it, “The problem isn’t that she doesn’t understand the technology. It’s that she does, and she knows how flawed nascent technology can be.” She believes that progress in self-driving cars shouldn’t involve guinea pigs.

Many of her neighbors feel similarly: “residents believe in the power of technology to change the world for the better, but they are skeptical of the role it might play in their daily lives.” George Azzari, 39, a Mountain View resident, sees the cars “form a trail down a small road near his home at rush hour, clogging up traffic.” Residents of the Mountain View and Palo Alto communities expressed concerns at a community meeting Waymo held on its driverless-car testing. One local said, “We’re going to storm City Hall if these cars come to Palo Alto.”

Contrary to the perspective of these locals, David Friedman, interviewed on the American Psychological Association’s Speaking of Psychology podcast, offers his insight on humans and self-driving cars. In the interview, he describes two ways self-driving cars could progress. He states, “in one future we have a utopia where we have better mobility. We effectively eliminate traffic crashes. We cut costs and have a great convenience feature for consumers, so they can get wherever they want to go whenever they want to go.” The other possibility he sees is a “dystopian world… we have self-driving cars that aren’t safe, put people at risk that increase the amount we travel and increase the emissions we create and create a world where we’re all basically locked into our cars instead of spending time with our family and friends.”

Another good point Friedman brings up is the classic trolley problem. On how self-driving cars should handle it, he admits, “we have no idea.” He states that “there are no standards or requirements or policies for ethics when it comes to self-driving cars… the agency involved with overseeing auto safety, the National Highway Traffic Safety Administration… eliminated ethics from the guidelines.” This is an important point, as self-driving ethics is uncharted territory that society will soon have to explore in depth.

On the ethics and logistics of achieving self-driving, the authors of a journal article in Science and Engineering Ethics provide a sketch of how to plan toward a self-driving future. From their evaluation, “Considerable further attention to both the technical and ethical dimensions of this issue are needed before such a vision could be made policy-ready. On the technical side, social-scientific investigation is needed in order to determine whether our vision can be implemented in a way that avoids the distinctive challenges it faces. On the ethical side, our discussion has set aside several ethically important issues—including safety, sustainability, and privacy—that need to be cogently examined and addressed before any proposal can be considered ethically acceptable overall”(Mladenovic and Mcpherson, “Engineering Social Justice into Traffic Control for Self-Driving Vehicles?”). In their view, self-driving is still in its infancy and must be approached more cautiously if it is to succeed.

Mladenovic and Mcpherson raise a point that matters for the future. They state, “The overall question—how should we engineer social justice into traffic control—is of great importance, and is something that cannot be answered in this paper alone. The intention of this paper is to initiate a much-needed discussion of this question by providing one ethical perspective on technological development in this area.”

 

Conclusion

 

This situation is just the beginning of programming ethics. There is no perfect, clear, and concise solution to this issue. What if the self-driving system took control and still crashed? To guarantee passengers’ safety, the system should step in at every possible chance of danger. But what if doing so risks others’ safety as well? What is the best solution? In the end, it won’t be until all the obstacles are passed that we can look back on which issues needed more thought and care, and what we could have done better. Until then, we must figure out how to proceed with the current issues facing self-driving.

 

Works Cited

 

  • NOAA. “What Is LIDAR.” National Ocean Service Website, 25 June 2018,

https://oceanservice.noaa.gov/facts/lidar.html.

  • “LiDAR Drives Forwards.” Nature Photonics, vol. 12, no. 8, 27 July 2018, p. 441,

doi:10.1038/s41566-018-0235-z.

  • Mladenovic, Milos N., and Tristram Mcpherson. “Engineering Social Justice into Traffic

Control for Self-Driving Vehicles?” Science and Engineering Ethics, vol. 22, no. 4, Jan. 2015, pp. 1131–1149, doi:10.1007/s11948-015-9690-9.

  • Laris, Michael. “A Waymo Safety Driver Collided with a Motorcyclist. The Company

Says a Self-Driving Minivan Would Have Done Better.” The Washington Post, WP Company, 6 Nov. 2018, https://www.washingtonpost.com/technology/2018/11/06/waymo-safety-driver-collides-with-motorcyclist-company-says-self-driving-minivan-would-have-done-better/.

  • Griggs, Troy, and Daisuke Wakabayashi. “How a Self-Driving Uber Killed a Pedestrian

in Arizona.” The New York Times, The New York Times, 20 Mar. 2018, https://www.nytimes.com/interactive/2018/03/20/us/self-driving-uber-pedestrian-killed.html.

  • National Transportation Safety Board. “PRELIMINARY REPORT HIGHWAY

HWY18MH010.”

  • “Tesla Vehicle Safety Report.” Tesla, 24 Oct. 2019,

https://www.tesla.com/VehicleSafetyReport.

  • Siddiqui, Faiz. “Silicon Valley Pioneered Self-Driving Cars. But Some of Its Tech-Savvy

Residents Don’t Want Them Tested in Their Neighborhoods.” The Washington Post, WP Company, 3 Oct. 2019, https://www.washingtonpost.com/technology/2019/10/03/silicon-valley-pioneered-self-driving-cars-some-its-tech-savvy-residents-dont-want-them-tested-their-neighborhoods/.

  • “Speaking of Psychology: Self-Driving Cars.” American Psychological Association,

American Psychological Association, https://www.apa.org/research/action/speaking-of-psychology/self-driving-cars.