The potential benefits of autonomous cars include lower mobility and infrastructure costs, increased safety and mobility, greater customer satisfaction, and reduced crime. Chief among these is a significant reduction in traffic collisions, the resulting injuries, and related costs, including a reduced need for insurance. Autonomous cars are predicted to increase traffic flow; provide enhanced mobility for children, the elderly, the disabled, and the poor; relieve travelers of driving and navigation chores; lower fuel consumption; significantly reduce the need for parking space; reduce crime; and facilitate business models for transportation as a service, especially via the sharing economy. This shows the vast disruptive potential of the emerging technology.
Despite the various benefits of increased vehicle automation, challenges remain: technological hurdles; disputes concerning liability; individuals' reluctance to forfeit control of their cars; customer concern about the safety of driverless cars; the implementation of a legal framework and establishment of government regulations; risks to privacy and security, such as hacking or terrorism; concern about the resulting loss of driving-related jobs in the road transport industry; and the risk of increased suburbanization as travel becomes less costly and time-consuming. Many of these issues arise because autonomous vehicles, for the first time, allow computers to roam freely, with many attendant safety and security concerns.
In 2017, Audi stated that its latest A8 would be autonomous at speeds of up to 60 km/h using its "Audi AI". The driver would not have to perform safety checks such as frequently gripping the steering wheel. The Audi A8 was claimed to be the first production car to reach Level 3 autonomous driving, and Audi would be the first manufacturer to use laser scanners in addition to cameras and ultrasonic sensors in its system.
In November 2017, Waymo announced that it had begun testing driverless cars without a safety driver in the driver's seat; however, an employee was still present in the car. In February 2018, Waymo announced that its test vehicles had traveled autonomously for over 5 million miles.
Autonomous means self-governance. Many historical projects related to vehicle autonomy have been automated (made to be automatic) due to a heavy reliance on artificial hints in their environment, such as magnetic strips. Autonomous control implies satisfactory performance under significant uncertainties in the environment and the ability to compensate for system failures without external intervention.
One approach is to implement communication networks both in the immediate vicinity (for collision avoidance) and further away (for congestion management). Such outside influences in the decision process reduce an individual vehicle's autonomy, while still not requiring human intervention.
Wood et al. (2012) wrote "This Article generally uses the term 'autonomous,' instead of the term 'automated.' " The term "autonomous" was chosen "because it is the term that is currently in more widespread use (and thus is more familiar to the general public). However, the latter term is arguably more accurate. 'Automated' connotes control or operation by a machine, while 'autonomous' connotes acting alone or independently. Most of the vehicle concepts (that we are currently aware of) have a person in the driver’s seat, utilize a communication connection to the Cloud or other vehicles, and do not independently select either destinations or routes for reaching them. Thus, the term 'automated' would more accurately describe these vehicle concepts". As of 2017, most commercial projects focused on autonomous vehicles that did not communicate with other vehicles or an enveloping management regime.
The Tesla Autopilot system is considered an SAE Level 2 system.
A classification system based on six different levels (ranging from fully manual to fully automated systems) was published in 2014 by SAE International, an automotive standardization body, as J3016, Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems. This classification is based on the amount of driver intervention and attentiveness required, rather than on the vehicle's capabilities, although the two are loosely related. In the United States, the National Highway Traffic Safety Administration (NHTSA) released its own formal classification system in 2013, but abandoned it in favor of the SAE standard in 2016. Also in 2016, SAE updated its classification, now called J3016_201609.
In SAE's autonomy level definitions, "driving mode" means "a type of driving scenario with characteristic dynamic driving task requirements (e.g., expressway merging, high speed cruising, low speed traffic jam, closed-campus operations, etc.)"
Level 0: Automated system issues warnings and may momentarily intervene but has no sustained vehicle control.
Level 1 ("hands on"): The driver and the automated system share control of the vehicle. An example is Adaptive Cruise Control (ACC), where the driver controls steering and the automated system controls speed. With Parking Assistance, steering is automated while speed is manual. The driver must be ready to retake full control at any time. Lane Keeping Assistance (LKA) Type II is a further example of Level 1 self-driving.
Level 2 ("hands off"): The automated system takes full control of the vehicle (accelerating, braking, and steering). The driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. The shorthand "hands off" is not to be taken literally: contact between hand and wheel is often mandatory during SAE 2 driving, to confirm that the driver is ready to intervene.
Level 3 ("eyes off"): The driver can safely turn their attention away from the driving task, e.g. to text or watch a movie. The vehicle will handle situations that call for an immediate response, like emergency braking. The driver must still be prepared to intervene within some limited time, specified by the manufacturer, when called upon by the vehicle to do so. The 2018 Audi A8 luxury sedan was the first commercial car to claim to be capable of Level 3 self-driving. The car has a so-called Traffic Jam Pilot: when activated by the human driver, the car takes full control of all aspects of driving in slow-moving traffic at up to 60 kilometers per hour. The function works only on highways with a physical barrier separating oncoming traffic.
Level 4 ("mind off"): As Level 3, but no driver attention is ever required for safety, i.e. the driver may safely go to sleep or leave the driver's seat. Self-driving is supported only in limited areas (geofenced) or under special circumstances, like traffic jams. Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, i.e. park the car, if the driver does not retake control.
Level 5 ("steering wheel optional"): No human intervention is required. An example would be a robotic taxi.
In the formal SAE definition below, note in particular what happens in the shift from SAE 2 to SAE 3: the human driver no longer has to monitor the environment. This is the final aspect of the "dynamic driving task" that is passed from the human to the automated system. At SAE 3, the human driver still has the responsibility to intervene when asked to do so by the automated system. At SAE 4 the human driver is relieved of that responsibility, and at SAE 5 the automated system will never need to ask for an intervention.
SAE J3016 levels (abridged). For each level, the standard specifies who executes steering and acceleration/deceleration, who monitors the driving environment, who performs the fallback for the dynamic driving task, and the system capability (driving modes).

Levels 0-2: the human driver monitors the driving environment.

Level 0: the full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems.

Level 1: the driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration, using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task. Execution is shared between the human driver and the system; applies in some driving modes.

Level 2: the driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration, using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task. Applies in some driving modes.

Levels 3-5: the automated driving system monitors the driving environment.

Level 3: the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene. Applies in some driving modes.

Level 4: the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene. Applies in many driving modes.

Level 5: the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver.
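The division of responsibility across the levels can be summarized in a small lookup structure. The sketch below is an illustrative restatement of the taxonomy, not part of the J3016 standard itself:

```python
# Illustrative summary of the SAE J3016 responsibility split (not the standard
# itself). For each level: who executes steering/acceleration, who monitors
# the driving environment, and who performs the fallback when things go wrong.
SAE_LEVELS = {
    0: {"execution": "human", "monitoring": "human", "fallback": "human"},
    1: {"execution": "human and system", "monitoring": "human", "fallback": "human"},
    2: {"execution": "system", "monitoring": "human", "fallback": "human"},
    3: {"execution": "system", "monitoring": "system", "fallback": "human"},
    4: {"execution": "system", "monitoring": "system", "fallback": "system"},
    5: {"execution": "system", "monitoring": "system", "fallback": "system"},
}

def driver_must_monitor(level: int) -> bool:
    """True when the human driver still monitors the driving environment."""
    return SAE_LEVELS[level]["monitoring"] == "human"
```

Note how the two key transitions in the prose show up directly: monitoring passes to the system between Levels 2 and 3, and fallback responsibility passes to the system between Levels 3 and 4.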
Modern self-driving cars generally use Bayesian simultaneous localization and mapping (SLAM) algorithms, which fuse data from multiple sensors and an offline map into current location estimates and map updates. SLAM with detection and tracking of other moving objects (DATMO), which also handles obstacles such as cars and pedestrians, is a variant being developed at Google. Simpler systems may use roadside real-time locating system (RTLS) beacons to aid localization. Typical sensors include lidar, stereo vision, GPS and IMU. Visual object recognition uses machine vision, including neural networks. Udacity is developing an open-source software stack.
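The Bayesian fusion idea behind such localization can be illustrated in one dimension with a Kalman filter that blends a motion prediction from odometry with a noisy GPS fix. This is a minimal sketch; the noise variances are made-up illustrative numbers, not real sensor specifications:

```python
# Minimal 1-D Kalman filter: fuse a predicted position (from odometry) with a
# noisy GPS measurement. Variances are illustrative, not real sensor specs.
def kalman_step(x, p, u, z, q=0.5, r=4.0):
    """x: position estimate, p: its variance, u: odometry displacement,
    z: GPS measurement, q: motion noise variance, r: GPS noise variance."""
    # Predict: move by the odometry displacement; uncertainty grows.
    x_pred = x + u
    p_pred = p + q
    # Update: weight the GPS fix by the Kalman gain.
    k = p_pred / (p_pred + r)          # gain in [0, 1]
    x_new = x_pred + k * (z - x_pred)  # pull the estimate toward the fix
    p_new = (1 - k) * p_pred           # fused variance shrinks
    return x_new, p_new

# Drive 10 steps of 1 m; each GPS reading is slightly off the true position.
x, p = 0.0, 1.0
for step in range(1, 11):
    x, p = kalman_step(x, p, u=1.0, z=step + 0.3)
```

Full SLAM generalizes this to many dimensions and simultaneously estimates the map, but the predict-then-update structure is the same.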
Autonomous cars are being developed with deep learning, a class of neural-network techniques. Deep neural networks have many computational stages, or layers, of simulated neurons that are activated by inputs from the environment. The network depends on an extensive amount of data extracted from real-life driving scenarios and "learns" to perform the best course of action. Deep learning has been applied to real-life situations and is used in the programming of autonomous cars. In addition, sensors such as the LIDAR units already used in self-driving cars, cameras to perceive the environment, and precise GPS navigation will be used in autonomous cars.
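As a toy illustration of learning a driving behavior from data (not an actual autonomous-driving model), the sketch below fits a one-parameter steering rule by gradient descent on synthetic lane-offset examples; the target policy steer = -0.5 * offset is an assumption made for the example:

```python
# Toy gradient-descent learner: map a lane offset (m) to a steering correction.
# Synthetic data assumes an ideal policy steer = -0.5 * offset; real systems
# instead train deep networks on millions of recorded driving frames.
data = [(k / 10.0, -0.5 * (k / 10.0)) for k in range(-10, 11)]

w = 0.0    # single learnable weight: steer = w * offset
lr = 0.1   # learning rate
for _ in range(200):
    for offset, target in data:
        pred = w * offset
        grad = 2 * (pred - target) * offset  # derivative of squared error
        w -= lr * grad
```

After training, the weight converges to the value implied by the data, which is the same mechanism, scaled down enormously, by which a deep network absorbs driving behavior from recorded scenarios.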
Testing vehicles with varying degrees of autonomy can be done physically, in closed environments, on public roads (where permitted, typically with a license or permit or adhering to a specific set of operating principles) or virtually, i.e. in computer simulations.
When driven on public roads, autonomous vehicles require a person to monitor their proper operation and "take over" when needed.
Apple is currently testing self-driving cars, and increased the number of test vehicles from 3 to 27 in January 2018. The number rose further, to 45, by March 2018.
One way to assess the progress of autonomous vehicles is to compute the average number of miles driven between "disengagements", when the autonomous system is switched off, typically by a human driver. In 2017, Waymo reported 63 disengagements over 352,545 miles of testing, or 5,596 miles between disengagements on average, the highest among companies reporting such figures. Waymo also traveled more miles in total than any other company. Its 2017 rate of 0.18 disengagements per 1,000 miles was an improvement on 0.2 disengagements per 1,000 miles in 2016 and 0.8 in 2015. In March 2017, Uber reported an average of 0.67 miles per disengagement. In the final three months of 2017, Cruise Automation (now owned by GM) averaged 5,224 miles per disengagement over 62,689 miles.
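The disengagement figures above follow from simple ratios, computed here from the publicly reported 2017 Waymo numbers:

```python
# Disengagement metrics from reported totals.
def miles_per_disengagement(miles: float, disengagements: int) -> float:
    return miles / disengagements

def disengagements_per_1000_miles(miles: float, disengagements: int) -> float:
    return 1000.0 * disengagements / miles

# Waymo's reported 2017 figures: 63 disengagements over 352,545 miles.
waymo_2017 = miles_per_disengagement(352_545, 63)        # about 5,596 miles
waymo_rate = disengagements_per_1000_miles(352_545, 63)  # about 0.18
```

The same arithmetic applied to Cruise's reported 62,689 miles at 5,224 miles per disengagement implies roughly a dozen disengagements over the quarter.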
Several companies are said to be testing autonomous technology in semi trucks. Otto, a self-driving trucking company that was acquired by Uber in August 2016, demonstrated its trucks on the highway before being acquired. In May 2017, San Francisco-based startup Embark announced a partnership with truck manufacturer Peterbilt to test and deploy autonomous technology in Peterbilt's vehicles. Google's Waymo is also said to be testing autonomous technology in trucks, though no timeline has been given for the project.
In Europe, cities in Belgium, France, Italy and the UK are planning to operate transport systems for autonomous cars, and Germany, the Netherlands, and Spain have allowed public testing in traffic. In 2015, the UK launched public trials of the LUTZ Pathfinder autonomous pod in Milton Keynes. Beginning in summer 2015, the French government allowed PSA Peugeot-Citroën to conduct trials in real conditions in the Paris area. The experiments were planned to be extended to other cities such as Bordeaux and Strasbourg by 2016. The alliance between the French companies Thales and Valeo (provider of the first self-parking system, fitted to premium Audi and Mercedes models) is testing its own system. New Zealand is planning to use autonomous vehicles for public transport in Tauranga and Christchurch.
According to the motorist website TheDrive.com, operated by Time magazine, none of the driving-safety experts they were able to contact would rank driving under an autopilot system as having yet achieved a greater level of safety than traditional fully hands-on driving.
Autonomous cars could reduce labor costs; relieve travelers of driving and navigation chores, thereby replacing behind-the-wheel commuting hours with more time for leisure or work; and lift constraints on occupants, allowing people to travel by car even when distracted, texting, intoxicated, prone to seizures, or otherwise impaired. For the young, the elderly, people with disabilities, and low-income citizens, autonomous cars could provide enhanced mobility. The removal of the steering wheel, along with the remaining driver interface and the requirement for any occupant to assume a forward-facing position, would give the interior of the cabin greater ergonomic flexibility. Large vehicles, such as motorhomes, would become appreciably easier to use.
Additional advantages could include higher speed limits, smoother rides, increased roadway capacity, and minimized traffic congestion, due to the decreased need for safety gaps and the higher speeds. Currently, maximum controlled-access highway throughput, according to the U.S. Highway Capacity Manual, is about 2,200 passenger vehicles per hour per lane, with about 5% of the available road space taken up by cars. One study estimated that autonomous cars could increase capacity by 273% (to about 8,200 cars per hour per lane). The same study estimated that with 100% connected vehicles using vehicle-to-vehicle communication, capacity could reach 12,000 passenger vehicles per hour per lane (up 445% from 2,200) traveling safely at 120 km/h (75 mph) with a following gap of about 6 m (20 ft). Currently, at highway speeds drivers keep 40 to 50 m (130 to 160 ft) from the car in front. These increases in highway capacity could have a significant impact on traffic congestion, particularly in urban areas, and could even effectively end highway congestion in some places. The ability of authorities to manage traffic flow would also increase, given the extra data and the predictability of driving behavior, combined with less need for traffic police and even road signage.
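The 12,000 vehicles-per-hour figure is consistent with a simple spacing calculation, flow = speed / (gap + vehicle length); the 4 m vehicle length below is an assumed value for illustration, not from the cited study:

```python
# Lane capacity from speed and spacing: flow = speed / (gap + vehicle length).
# The 4 m vehicle length is an assumption used only for illustration.
def lane_capacity_per_hour(speed_kmh: float, gap_m: float,
                           car_length_m: float = 4.0) -> float:
    speed_ms = speed_kmh / 3.6             # km/h -> m/s
    spacing_m = gap_m + car_length_m       # distance consumed per vehicle
    return 3600.0 * speed_ms / spacing_m   # vehicles per hour per lane

connected = lane_capacity_per_hour(120, 6.0)   # 6 m gaps: ~12,000 veh/h
today = lane_capacity_per_hour(120, 45.0)      # ~45 m gaps: same order as ~2,200
```

The calculation shows where the capacity gain comes from: shrinking the following gap from roughly 45 m to 6 m at the same speed multiplies throughput by about five.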
By reducing the (labor and other) cost of mobility as a service, autonomous cars could reduce the number of cars that are individually owned, replaced by taxi/pooling and other car sharing services. This could dramatically reduce the need for parking space, freeing scarce land for other uses. This would also dramatically reduce the size of the automotive production industry, with corresponding environmental and economic effects. Assuming the increased efficiency is not fully offset by increases in demand, more efficient traffic flow could free roadway space for other uses such as better support for pedestrians and cyclists.
The vehicles' increased awareness could aid the police by reporting on illegal passenger behavior, while possibly enabling other crimes, such as deliberately crashing into another vehicle or a pedestrian.
In spite of the various benefits of increased vehicle automation, some foreseeable challenges persist: disputes concerning liability; the time needed to convert the existing stock of vehicles from nonautonomous to autonomous; individuals' resistance to forfeiting control of their cars; customer concern about the safety of driverless cars; and the implementation of a legal framework and establishment of government regulations for self-driving cars. Other obstacles could be drivers' lack of experience in potentially dangerous situations, ethical problems where an autonomous car's software is forced during an unavoidable crash to choose between multiple harmful courses of action, and possibly insufficient adaptation to gestures and non-verbal cues from police and pedestrians.
Possible technological obstacles for autonomous cars are:
Artificial intelligence is still not able to function reliably in chaotic inner-city environments.
A car's computer could potentially be compromised, as could a communication system between cars.
Susceptibility of the car's sensing and navigation systems to different types of weather (such as snow) or deliberate interference, including jamming and spoofing.
Avoidance of large animals requires recognition and tracking, and Volvo found that software suited to caribou, deer, and elk was ineffective with kangaroos.
Autonomous cars may require very high-quality specialized maps to operate properly. Where these maps are out of date, vehicles need to be able to fall back on reasonable driving behavior.
Competition for the radio spectrum desired for the car's communication.
Field programmability for the systems will require careful evaluation of product development and the component supply chain.
Current road infrastructure may need changes for autonomous cars to function optimally.
Discrepancies in people's beliefs about the necessary government intervention may delay acceptance of autonomous cars on the road. Whether the public desires no change in existing laws, federal regulation, or another solution, the framework of regulation will likely produce differences of opinion.
A direct impact of widespread adoption of autonomous vehicles is the loss of driving-related jobs in the road transport industry. There could be resistance from professional drivers and unions who are threatened by job losses. In addition, there could be job losses in public transit services and crash repair shops. The automobile insurance industry might suffer as the technology makes certain aspects of these occupations obsolete. A frequently cited paper by Michael Osborne and Carl Benedikt Frey found that autonomous cars would make many jobs redundant.
Privacy could be an issue when the vehicle's location and position are integrated into an interface to which other people have access. In addition, there is a risk of automotive hacking through the sharing of information via V2V (vehicle-to-vehicle) and V2I (vehicle-to-infrastructure) protocols. There is also a risk of terrorist attacks: self-driving cars could potentially be loaded with explosives and used as bombs.
The lack of stressful driving, more productive time during the trip, and the potential savings in travel time and cost could become an incentive to live far away from cities, where land is cheaper, and work in the city's core, thus increasing travel distances and inducing more urban sprawl, more fuel consumption and an increase in the carbon footprint of urban travel. There is also the risk that traffic congestion might increase, rather than decrease. Appropriate public policies and regulations, such as zoning, pricing, and urban design are required to avoid the negative impacts of increased suburbanization and longer distance travel.
Some believe that once automation in vehicles reaches higher levels and becomes reliable, drivers will pay less attention to the road. Research shows that drivers in autonomous cars react later when they have to intervene in a critical situation than if they were driving manually. Depending on the capabilities of autonomous vehicles and the frequency with which human intervention is needed, this may still increase safety compared to all-human driving.
Ethical and moral reasoning come into consideration when programming the software that decides what action the car takes in an unavoidable crash: whether the autonomous car crashes into a bus, potentially killing people inside, or swerves elsewhere, potentially killing its own passengers or nearby pedestrians. A question that programmers find difficult to answer is: "what decision should the car make that causes the 'smallest' damage to people's lives?" The ethics of autonomous vehicles remain unresolved and could lead to controversy. However, human drivers are known to make biased ethical decisions when driving, often unconsciously, such as avoiding harm to themselves. In many cases, human thought and reaction time are too slow to detect an upcoming crash, think through the ethical implications of the available options, and act on an ethical choice.
A Mercedes-Benz S 450 4MATIC Coupe. The forward-facing Distronic sensors are usually placed behind the Mercedes-Benz logo and front grille.
In 2005, Mercedes refined the system (from this point called "Distronic Plus"), with the Mercedes-Benz S-Class (W221) being the first car to receive the upgrade. Distronic Plus could now bring the car to a complete halt if necessary, on the E-Class and most Mercedes sedans. In an episode of Top Gear, Jeremy Clarkson demonstrated the effectiveness of the cruise control system in the S-Class by coming to a complete halt from motorway speed at a roundabout and getting out, without touching the pedals.
By 2017, Mercedes had vastly expanded its autonomous driving features on production cars: in addition to standard Distronic Plus features such as active brake assist, Mercedes now includes a steering pilot, a parking pilot, a cross-traffic assist system, night-vision cameras with automated danger warnings and braking assist (for example, in case animals or pedestrians are on the road), and various other autonomous-driving features. In 2016, Mercedes also introduced its Active Brake Assist 4, the first emergency braking assistant with pedestrian recognition on the market.
Because Mercedes has gradually introduced extensively tested advancements of its autonomous driving features, few crashes caused by them are known. One known crash dates back to 2005, when the German news magazine Stern was testing Mercedes' old Distronic system. During the test, the system did not always manage to brake in time. Ulrich Mellinghoff, then Head of Safety, NVH, and Testing at the Mercedes-Benz Technology Centre, stated that some of the tests failed because the vehicle was being tested in a metallic hall, which caused problems with the system's radar. Later iterations of the Distronic system have an upgraded radar and numerous other sensors, and are no longer susceptible to a metallic environment. In 2008, Mercedes conducted a study comparing the crash rates of its vehicles equipped with Distronic Plus against those without it, and concluded that vehicles equipped with Distronic Plus have an approximately 20% lower crash rate. In 2013, German Formula One driver Michael Schumacher was invited by Mercedes to try to crash a Mercedes C-Class equipped with all the safety features Mercedes offered for its production vehicles at the time, including Active Blind Spot Assist, Active Lane Keeping Assist, Brake Assist Plus, Collision Prevention Assist, Distronic Plus with Steering Assist, Pre-Safe Brake, and Stop&Go Pilot. Due to the safety features, Schumacher was unable to crash the vehicle in realistic scenarios.
In mid-October 2015, Tesla Motors rolled out version 7 of its software in the U.S., which included Tesla Autopilot capability. On 9 January 2016, Tesla rolled out version 7.1 as an over-the-air update, adding a new "summon" feature that allows cars to self-park at parking locations without the driver in the car. Tesla's autonomous driving features can be classified as somewhere between Level 2 and Level 3 under the U.S. Department of Transportation's National Highway Traffic Safety Administration (NHTSA) five levels of vehicle automation. At this level the car can act autonomously but requires the full attention of the driver, who must be prepared to take control at a moment's notice. Autopilot should be used only on limited-access highways, and it sometimes fails to detect lane markings and disengages itself. In urban driving the system will not read traffic signals or obey stop signs. The system also does not detect pedestrians or cyclists.
On 20 January 2016, the first known fatal crash of a Tesla with Autopilot occurred in China's Hubei province. According to China's 163.com news channel, this marked "China's first accidental death due to Tesla's automatic driving (system)." Initially, Tesla pointed out that the vehicle was so badly damaged by the impact that its recorder could not conclusively prove the car had been on Autopilot at the time. However, 163.com noted that other factors, such as the car's complete failure to take any evasive action prior to the high-speed crash and the driver's otherwise good driving record, indicated a strong likelihood that the car was on Autopilot. A similar fatal crash occurred four months later in Florida. In 2018, in a subsequent civil suit between the father of the driver killed and Tesla, Tesla did not deny that the car had been on Autopilot at the time of the accident, and sent the victim's father evidence documenting that fact.
The second known fatal accident involving a vehicle being driven by itself took place in Williston, Florida on 7 May 2016 while a Tesla Model S electric car was engaged in Autopilot mode. The occupant was killed in a crash with an 18-wheel tractor-trailer. On 28 June 2016, the National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicated the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled-access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer. The NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, covering a population of an estimated 25,000 Model S cars. On 8 July 2016, the NHTSA requested that Tesla Motors provide the agency with detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned update schedule for the next four months.
According to Tesla, "neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car drove at full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S." Tesla also claimed that this was its first known Autopilot death in over 130 million miles (208 million km) driven by customers with Autopilot engaged; with this statement, Tesla apparently declined to acknowledge claims that the January 2016 fatality in Hubei, China had also been the result of an Autopilot system error. According to Tesla, there is a fatality every 94 million miles (150 million km) among all types of vehicles in the U.S. However, that figure includes fatalities from all kinds of crashes, for instance those between motorcycles and pedestrians.
In July 2016, the U.S. National Transportation Safety Board (NTSB) opened a formal investigation into the fatal accident that occurred while Autopilot was engaged. The NTSB is an investigative body that has the power to make only policy recommendations. An agency spokesman said, "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible." In January 2017, the NTSB released a report concluding that Tesla was not at fault; the investigation revealed that for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed.
According to Tesla, starting 19 October 2016, all Tesla cars are built with hardware to allow full self-driving capability at the highest safety level (SAE Level 5). The hardware includes eight surround cameras and twelve ultrasonic sensors, in addition to the forward-facing radar with enhanced processing capabilities. The system will operate in "shadow mode" (processing without taking action) and send data back to Tesla to improve its abilities until the software is ready for deployment via over-the-air upgrades. After the required testing, Tesla hopes to enable full self-driving by the end of 2019 under certain conditions.
On 15 March 2018, a Tesla Model S that was parked near a primary school in Wormerveer, Netherlands, set itself in motion following a technical failure. It drove towards a group of people before hitting a bicycle, a scooter and another car, finally coming to a standstill against a fence. One person was taken to hospital for a checkup. In an update, police mentioned a handling error, not a technical failure, and arrested the 70-year-old driver. According to Tesla, the car was not equipped with Autopilot.
In August 2012, Google announced that its vehicles had completed over 300,000 autonomous-driving miles (500,000 km) accident-free, typically involving about a dozen cars on the road at any given time, and that it was starting to test with single drivers instead of in pairs. In late May 2014, Google revealed a new prototype that had no steering wheel, gas pedal, or brake pedal, and was fully autonomous. As of March 2016, Alphabet had test-driven its fleet in autonomous mode a total of 1,500,000 mi (2,400,000 km). In December 2016, Google announced that the technology would be spun off into a new subsidiary called Waymo.
Based on Google's accident reports, their test cars have been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car's software caused a crash.
In June 2015, Brin confirmed that 12 vehicles had suffered collisions as of that date. Eight were rear-end collisions at a stop sign or traffic light, in two the vehicle was side-swiped by another driver, in one another driver rolled through a stop sign, and in one a Google employee was controlling the car manually. In July 2015, three Google employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake at a traffic light; this was the first collision to result in injuries. On 14 February 2016, a Waymo vehicle attempted to avoid sandbags blocking its path and struck a bus during the maneuver. Google stated, "In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision." Google characterized the crash as a misunderstanding and a learning experience. No injuries were reported.
In March 2017, an Uber test vehicle was involved in a crash in Tempe, Arizona when another car failed to yield, flipping the Uber vehicle. There were no injuries in the accident.
By December 22, 2017, Uber had completed 2,000,000 miles in autonomous mode.
On March 18, 2018, Elaine Herzberg became the first pedestrian to be killed by a self-driving car in the United States after being hit by an Uber test vehicle, also in Tempe. It was the third known fatality involving a self-driving car, after two involving Tesla Motors vehicles, and the first in which the victim was outside the car; the death of an essentially uninvolved third party is likely to raise new questions and concerns about the safety of autonomous cars in general. Herzberg was crossing outside of a crosswalk, approximately 400 feet from an intersection. The cause of the accident is not clear, though some experts say a human driver could have avoided the fatal crash. Arizona Governor Doug Ducey later suspended Uber's ability to test and operate its autonomous cars on public roadways, citing an "unquestionable failure" to meet the expectation that Uber make public safety its top priority. Uber also pulled out of all self-driving-car testing in California as a result of the accident.
On 9 November 2017, an autonomous self-driving bus carrying passengers was involved in a crash with a truck. The truck driver was found to be at fault, having reversed into the stationary bus. The bus took no evasive or defensive action, such as flashing its headlights or sounding its horn; as one passenger commented, "The shuttle didn’t have the ability to move back. The shuttle just stayed still."
According to a Wonkblog reporter, if fully autonomous cars become commercially available, they have the potential to be a disruptive innovation with major implications for society. The likelihood of widespread adoption is still unclear, but if they are used on a wide scale, policy makers face a number of unresolved questions about their effects.
One fundamental question is about their effect on travel behavior. Some people believe that autonomous cars will increase car ownership and car use, because the cars will be easier to use and ultimately more useful; this may in turn encourage urban sprawl and increase total private vehicle use. Others argue that they will make car sharing easier, thus discouraging outright ownership, decreasing total usage, and making cars a more efficient form of transportation than at present.
Policy-makers will have to take a new look at how infrastructure is built and how money is allotted to build for autonomous vehicles. The need for traffic signals could be reduced by the adoption of smart highways. Smart highways and related technological advances, implemented through policy change, may reduce dependence on oil imports because individual cars would spend less time on the road, which could in turn affect energy policy. On the other hand, autonomous vehicles could increase the overall number of cars on the road, leading to a greater dependence on oil imports if smart systems are not enough to curtail the impact of more vehicles. Given the uncertainty about the future of autonomous vehicles, policy makers may want to plan by implementing infrastructure improvements that benefit both human drivers and autonomous vehicles. Care is also needed with respect to public transportation: if infrastructure policy caters to autonomous vehicles, transit use may be greatly reduced, resulting in job loss and increased unemployment.
Other disruptive effects will come from the use of autonomous vehicles to carry goods. Self-driving vans have the potential to make home deliveries significantly cheaper, transforming retail commerce and possibly making hypermarkets and supermarkets redundant. The U.S. government currently defines six levels of driving automation, from level 0, in which the human driver does everything, to level 5, in which the automated system performs all driving tasks. Under current law, manufacturers bear all the responsibility for self-certifying vehicles for use on public roads; as long as a vehicle is compliant within the regulatory framework, there are no specific federal legal barriers to a highly automated vehicle being offered for sale. Iyad Rahwan, an associate professor at the MIT Media Lab, said, "Most people want to live in a world where cars will minimize casualties, but everyone wants their own car to protect them at all costs." Furthermore, industry standards and best practices are still needed before such systems can be considered reasonably safe under real-world conditions.
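The six-level scheme can be made concrete in code. A minimal sketch follows; the level names are paraphrased from the SAE-style levels the U.S. guidance adopts, and the helper function is purely illustrative, not part of any regulation:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving-automation levels 0-5 (labels paraphrased, illustrative only)."""
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # system assists with steering or speed
    PARTIAL_AUTOMATION = 2      # system steers and controls speed; human monitors
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # system drives within a limited domain
    FULL_AUTOMATION = 5         # system performs all driving tasks

def human_must_monitor(level: AutomationLevel) -> bool:
    """At levels 0-2 the human driver must monitor the environment at all times."""
    return level <= AutomationLevel.PARTIAL_AUTOMATION
```

The Audi A8 claim mentioned earlier, for instance, corresponds to level 3: the driver may disengage but must be ready to resume control when requested.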
The 1968 Vienna Convention on Road Traffic, subscribed to by over 70 countries worldwide, establishes principles to govern traffic laws. One of the fundamental principles of the Convention has been the concept that a driver is always fully in control of, and responsible for, the behavior of a vehicle in traffic. The progress of technology that assists and takes over the functions of the driver is undermining this principle, implying that much of the groundwork must be rewritten.
U.S. states that allow testing of driverless cars on public roads as of 1 February 2018.
In the United States, a non-signatory country to the Vienna Convention, state vehicle codes generally do not envisage — but do not necessarily prohibit — highly automated vehicles. To clarify the legal status of and otherwise regulate such vehicles, several states have enacted or are considering specific laws. By 2016, seven states (Nevada, California, Florida, Michigan, Hawaii, Washington, and Tennessee), along with the District of Columbia, had enacted laws for autonomous vehicles. Incidents such as the first fatal accident involving Tesla's Autopilot system have led to discussion about revising laws and standards for autonomous cars.
In September 2016, the US National Economic Council and Department of Transportation released federal standards that describe how automated vehicles should react if their technology fails, how to protect passenger privacy, and how riders should be protected in the event of an accident. The new federal guidelines are meant to avoid a patchwork of state laws, while avoiding being so overbearing as to stifle innovation.
In June 2011, the Nevada Legislature passed a law to authorize the use of autonomous cars. Nevada thus became the first jurisdiction in the world where autonomous vehicles might be legally operated on public roads. According to the law, the Nevada Department of Motor Vehicles (NDMV) is responsible for setting safety and performance standards and the agency is responsible for designating areas where autonomous cars may be tested. This legislation was supported by Google in an effort to legally conduct further testing of its Google driverless car. The Nevada law defines an autonomous vehicle to be "a motor vehicle that uses artificial intelligence, sensors and global positioning system coordinates to drive itself without the active intervention of a human operator." The law also acknowledges that the operator will not need to pay attention while the car is operating itself. Google had further lobbied for an exemption from a ban on distracted driving to permit occupants to send text messages while sitting behind the wheel, but this did not become law. Furthermore, Nevada's regulations require a person behind the wheel and one in the passenger’s seat during tests.
A Toyota Prius modified by Google to operate as a driverless car.
In 2014, the Government of France announced that testing of autonomous cars on public roads would be allowed in 2015. 2,000 km of road would be opened across the national territory, especially in Bordeaux, in Isère, Île-de-France and Strasbourg. At the 2015 ITS World Congress, a conference dedicated to intelligent transport systems, the very first demonstration of autonomous vehicles on open road in France was carried out in Bordeaux in early October 2015.
In 2015, a preemptive lawsuit against various automobile companies such as GM, Ford, and Toyota accused them of "hawking vehicles that are vulnerable to hackers who could hypothetically wrest control of essential functions such as brakes and steering."
In spring of 2015, the Federal Department of Environment, Transport, Energy and Communications in Switzerland (UVEK) allowed Swisscom to test a driverless Volkswagen Passat on the streets of Zurich.
On 19 February 2016, Assembly Bill No. 2866 was introduced in California that would allow completely autonomous vehicles to operate on the road, including those without a driver, steering wheel, accelerator pedal, or brake pedal. The Bill states the Department of Motor Vehicles would need to comply with these regulations by 1 July 2018 for these rules to take effect. This bill has yet to pass the house of origin.
In 2016, the Singapore Land Transport Authority, in partnership with UK automotive supplier Delphi Automotive Plc, began preparations for a test run of a fleet of automated taxis, aiming for an on-demand autonomous cab service to take effect in 2017.
In September 2016, the U.S. Department of Transportation released its Federal Automated Vehicles Policy, and California published discussions on the subject in October 2016.
In December 2016, the California Department of Motor Vehicles ordered Uber to remove its self-driving vehicles from the road in response to two red-light violations. Uber immediately blamed the violations on human error and suspended the drivers involved.
As of April 2017, it is possible to conduct public-road tests of development vehicles in Hungary. In addition, construction of a closed test track suitable for testing highly automated functions, the Zala Zone test track, is under way near the city of Zalaegerszeg.
Individual vehicles may benefit from information obtained from other vehicles in the vicinity, especially information relating to traffic congestion and safety hazards. Vehicular communication systems use vehicles and roadside units as the communicating nodes in a peer-to-peer network, providing each other with information. As a cooperative approach, vehicular communication systems can allow all cooperating vehicles to be more effective. According to a 2010 study by the National Highway Traffic Safety Administration, vehicular communication systems could help avoid up to 79 percent of all traffic accidents.
In 2012, computer scientists at the University of Texas at Austin began developing smart intersections designed for autonomous cars. The intersections will have no traffic lights and no stop signs, instead using computer programs that communicate directly with each car on the road.
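One way such lightless intersections have been described in the research literature is as a reservation scheme: each approaching car requests a time slot to cross, and an intersection manager grants the request only if the slot is free. A deliberately simplified sketch of that idea (the class and method names are invented for illustration and are not taken from the UT Austin system, which reserves fine-grained space-time tiles rather than whole-intersection ticks):

```python
class IntersectionManager:
    """Grants crossing reservations on a first-come, first-served basis.

    Time is modeled as discrete ticks; a real system would reserve
    space-time tiles inside the intersection instead. Illustrative only.
    """

    def __init__(self):
        self.reserved = set()  # ticks already promised to some vehicle

    def request(self, vehicle_id: str, arrival_tick: int, ticks_needed: int):
        """Return the granted (start, end) ticks, or None if the slot is taken."""
        slot = range(arrival_tick, arrival_tick + ticks_needed)
        if any(t in self.reserved for t in slot):
            return None  # vehicle must slow down and re-request a later slot
        self.reserved.update(slot)
        return (arrival_tick, arrival_tick + ticks_needed - 1)


manager = IntersectionManager()
print(manager.request("car-1", arrival_tick=10, ticks_needed=3))  # (10, 12): granted
print(manager.request("car-2", arrival_tick=11, ticks_needed=2))  # None: conflict
print(manager.request("car-2", arrival_tick=13, ticks_needed=2))  # (13, 14): granted
```

A denied vehicle simply decelerates and asks again for a later slot, which is how such schemes replace the stop-and-wait behavior enforced by lights and signs.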
Among connected cars, an unconnected one is the weakest link and will be increasingly banned from busy high-speed roads, predicted a Helsinki think tank in January 2016.
In a 2011 online survey of 2,006 US and UK consumers by Accenture, 49% said they would be comfortable using a "driverless car".
A 2012 survey of 17,400 vehicle owners by J.D. Power and Associates found 37% initially said they would be interested in purchasing a fully autonomous car. However, that figure dropped to 20% if told the technology would cost $3,000 more.
In a 2012 survey of about 1,000 German drivers by automotive researcher Puls, 22% of the respondents had a positive attitude towards these cars, 10% were undecided, 44% were skeptical and 24% were hostile.
A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% "stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver", with Brazil, India and China the most willing to trust autonomous technology.
In a 2014 US telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper. 31.7% said they would not continue to drive once an autonomous car was available instead.
In a February 2015 survey of top auto journalists, 46% predicted that either Tesla or Daimler would be first to market with a fully autonomous vehicle, while 38% predicted that Daimler's would be the most functional, safe, and in-demand.
In 2015, a questionnaire survey by Delft University of Technology explored the opinions of 5,000 people from 109 countries on automated driving. Respondents, on average, found manual driving the most enjoyable mode of driving, and 22% did not want to spend any money on a fully automated driving system. Respondents were most concerned about software hacking and misuse, and were also concerned about legal issues and safety. Finally, respondents from more developed countries (in terms of lower accident statistics, higher education, and higher income) were less comfortable with their vehicle transmitting data. The survey also found that 37% of surveyed car owners were either "definitely" or "probably" interested in purchasing an automated car.
In 2016, a survey in Germany examined the opinion of 1,603 people, who were representative in terms of age, gender, and education for the German population, towards partially, highly, and fully automated cars. Results showed that men and women differ in their willingness to use them. Men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. The gender difference towards anxiety was especially pronounced between young men and women but decreased with participants’ age.
In 2016, a PwC survey of 1,584 people in the United States found that "66 percent of respondents said they think autonomous cars are probably smarter than the average human driver". Respondents remained worried about safety, above all about the car being hacked. Nevertheless, only 13% of interviewees saw no advantages in this new kind of car.
A Pew Research Center survey of 4,135 U.S. adults conducted May 1–15, 2017 found that many Americans anticipate significant impacts from various automation technologies in the course of their lifetimes — from the widespread adoption of autonomous vehicles to the replacement of entire job categories with robot workers.
The emergence of autonomous automobiles raises various ethical issues. While the introduction of autonomous vehicles to the mass market seems morally inevitable, given a predicted reduction of crashes by up to 90% and their accessibility to disabled, elderly, and young passengers, some ethical issues have not yet been fully resolved. These include, but are not limited to: moral, financial, and criminal responsibility for crashes; the decisions a car must make immediately before a (fatal) crash; privacy issues; and potential job loss.
There are different opinions on who should be held liable in case of a crash, in particular one in which people are hurt. Many experts hold the car manufacturers themselves responsible for crashes that occur due to a technical malfunction or faulty design. Beyond the fact that the manufacturer would be the source of the problem when a car crashes due to a technical issue, there is another important reason to hold manufacturers responsible: it would encourage them to innovate and invest heavily in fixing such issues, not only to protect the brand image but also because of the financial and criminal consequences. However, others argue that those using or owning the vehicle should be held responsible, since they know the risks involved in using such a vehicle. Experts suggest introducing a tax or insurance scheme that would protect owners and users of autonomous vehicles from claims made by accident victims. Other parties that could be held responsible in case of a technical failure include the software engineers who programmed the code for the vehicle's autonomous operation, and the suppliers of its components.
Setting aside legal liability and moral responsibility, the question arises how autonomous vehicles should be programmed to behave in an emergency in which either passengers or other traffic participants are endangered. A vivid example of the moral dilemma that a software engineer or car manufacturer might face in programming the operating software is the trolley problem, an ethical thought experiment: the driver of a trolley can stay on the planned track and run over five people, or turn onto a track where the trolley would kill only one person, assuming there is no traffic on it. Two main considerations need to be addressed. First, what moral basis would an autonomous vehicle use to make decisions? Second, how could that basis be translated into software code? Researchers have suggested two ethical theories in particular as applicable to the behavior of autonomous vehicles in emergencies: deontology and utilitarianism. Asimov’s three laws of robotics are a typical example of deontological ethics: the theory requires an autonomous car to follow strict, written-out rules in every situation. Utilitarianism instead holds that any decision must be made with the goal of maximizing utility, which requires a definition of utility, for example maximizing the number of people who survive a crash. Critics suggest that autonomous vehicles should adopt a mix of multiple theories to be able to respond in a morally right way in the instance of a crash.
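The contrast between the two theories can be made concrete in code. The following is a deliberately simplified sketch; the candidate maneuvers, the rule, and the utility function are all invented for illustration, and no real system would reduce ethics to a few lines:

```python
# Each candidate maneuver is described by the people endangered if it is chosen.
candidates = {
    "stay_in_lane": {"pedestrians_harmed": 5, "passengers_harmed": 0},
    "swerve":       {"pedestrians_harmed": 1, "passengers_harmed": 0},
}

def deontological_choice(options, rules):
    """Pick a maneuver that violates no rule; if every option violates one, give up."""
    allowed = [name for name, outcome in options.items()
               if not any(rule(outcome) for rule in rules)]
    return allowed[0] if allowed else None

def utilitarian_choice(options):
    """Pick the maneuver that minimizes total harm (one possible utility)."""
    return min(options, key=lambda name: sum(options[name].values()))

# A deontological rule: never actively harm anyone.
never_harm = lambda outcome: sum(outcome.values()) > 0

print(deontological_choice(candidates, [never_harm]))  # None: every option harms someone
print(utilitarian_choice(candidates))                  # swerve: 1 harmed instead of 5
```

The sketch makes the dilemma visible: the strict rule offers no answer at all when every option causes harm, while the utilitarian scoring always produces an answer but requires someone to have decided, in advance and in code, how harms are weighed.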
Privacy-related issues arise mainly from the interconnectivity of autonomous cars, which makes the car just another mobile device that can gather information about an individual. The information gathered ranges from the routes taken, voice and video recordings, and preferences in the media consumed in the car to behavioral patterns and many other streams of data.
The introduction of autonomous vehicles to the mass market might cost up to 5 million jobs in the US alone, almost 3% of the workforce. Those jobs include drivers of taxis, buses, vans, trucks, and e-hailing vehicles. Many industries, such as the auto insurance industry, are indirectly affected. That industry alone generates annual revenue of about $220 billion and supports 277,000 jobs, roughly comparable to the number of mechanical engineering jobs in the US. The potential loss of most of those jobs, due to an estimated decline in accidents of up to 90%, would have a tremendous impact on the individuals involved. Both India and China have placed bans on automated cars, with the former citing protection of jobs.
The film Demolition Man (1993), starring Sylvester Stallone and set in 2032, features vehicles that can be self-driven or commanded to "Auto Mode" where a voice-controlled computer operates the vehicle.
The film Minority Report (2002), set in Washington, D.C. in 2054, features an extended chase sequence involving autonomous cars. The vehicle of protagonist John Anderton is transporting him when its systems are overridden by police in an attempt to bring him into custody.
In the film Eagle Eye (2008), Shia LaBeouf and Michelle Monaghan are driven around in a Porsche Cayenne controlled by ARIIA, a giant supercomputer.
The film I, Robot (2004), set in Chicago in 2035, features autonomous vehicles driving on highways, allowing the car to travel safer at higher speeds than if manually controlled. The option to manually operate the vehicles is available.
Blade Runner 2049 (2017) opens with LAPD Replicant cop K waking up in his 3-wheeled autonomous flying vehicle (featuring a separable surveillance roof drone) on approach to a protein farm in northern California.
Intelligent or self-driving cars are a common theme in science fiction literature. Examples include:
In Isaac Asimov's science-fiction short story, "Sally" (first published May–June 1953), autonomous cars have "positronic brains" and communicate via honking horns and slamming doors, and save their human caretaker.
In Robert A. Heinlein's novel The Number of the Beast (1980), Zeb Carter's driving and flying car "Gay Deceiver" is at first semi-autonomous and later, after modifications by Zeb's wife Deety, becomes sentient and capable of fully autonomous operation.
"Driven", series 4 episode 11 of the 2003 TV series NCIS features a robotic vehicle named "Otto," part of a high-level project of the Department of Defense, which causes the death of a Navy Lieutenant, and then later almost kills Abby.
The TV series "Viper" features a silver/grey armored assault vehicle, called The Defender, which masquerades as a flame-red 1992 Dodge Viper RT/10 and later as a 1998 cobalt blue Dodge Viper GTS. The vehicle's sophisticated computer systems allow it to be controlled via remote on some occasions.