begin quote from:
Fatal Tesla crash exposes lack of regulation over autopilot technology
Los Angeles Times
Jim Puzzanghera
The fatal crash of a Tesla electric car using an autopilot feature still in beta testing -- and never reviewed by regulators -- highlighted what some say is a gaping pothole on the road to self-driving vehicles: the lack of federal rules.
Automakers do not need to get the technology approved by the National Highway Traffic Safety Administration before rolling it out to the public. They just have to attest that their vehicles meet federal safety standards -- yet there still are no such standards for autonomous driving features.
That enables carmakers to, at their own discretion, roll out features when they deem them to be ready to hit the road. In a lengthy statement Thursday, Tesla acknowledged that its autopilot mode was “new technology and still in a public beta phase,” and said that “as more real-world miles accumulate … the probability of injury will keep decreasing.”
But critics, lawmakers and safety advocates say carmakers should not be using their customers as guinea pigs. They’re questioning whether the companies are moving too fast and say regulators should put the brakes on a nascent technology that might have been rolled out too hastily.
“The Tesla vehicles with autopilots are vehicles waiting for a crash to happen -- and it did in Florida,” said Clarence Ditlow, executive director of the Center for Auto Safety. He is calling for the company to issue a recall and disable the autopilot function until NHTSA issues safety guidelines.
“If you don’t have a radar system on a car that can tell you there’s a truck ahead of you, there’s a problem,” he said.
On the afternoon of May 7, in clear, dry conditions, a Tesla Model S driven by 40-year-old Joshua Brown slammed into a tractor-trailer attempting to turn in front of it in Williston, Fla.
The accident sheared the roof off the car, which skidded under the truck and off the road, plowing through two wire fences before crashing into a utility pole, the accident report said. Brown, a former Navy SEAL from Canton, Ohio, was pronounced dead at the scene.
Tesla and other automakers say that ultimately, self-driving cars and autonomous features will make driving safer and that eventually cars will be able to drive themselves better than humans can.
As it is now, the cars are able to manage normal driving conditions well, but have trouble in inclement weather or in unfamiliar driving situations, such as when bicyclists zoom by. In the case of the Tesla fatality, the company blamed the white side of the tractor trailer against a brightly lighted sky, which it said confused the car.
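One way to picture why a low-contrast obstacle can slip through: vision systems typically score each candidate detection with a confidence value and discard anything below a threshold, so a white trailer against a bright sky may simply fail to clear that bar. The snippet below is a purely illustrative sketch with invented labels and numbers; it is not drawn from Tesla's software.

```python
# Illustrative only: why a confidence threshold can drop a low-contrast obstacle.
# The detections and threshold below are invented numbers, not real perception output.

DETECTION_THRESHOLD = 0.6   # candidates scoring below this are treated as noise

candidates = [
    {"label": "car ahead, dark against pavement", "confidence": 0.92},
    {"label": "white trailer against bright sky", "confidence": 0.41},
]

accepted = [c for c in candidates if c["confidence"] >= DETECTION_THRESHOLD]
for c in candidates:
    status = "tracked" if c in accepted else "discarded as noise"
    print(f'{c["label"]}: {c["confidence"]:.2f} -> {status}')
```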
NHTSA is investigating the accident, thought to be the auto industry’s first fatality involving an autonomous driving feature.
"This tragic incident raises some serious concern about whether current autonomous vehicle technology is ready for use on public roads,” said Sen. Bill Nelson (D-Fla.), the top Democrat on the committee that oversees the agency. “NHTSA needs to take a hard look at this incident and determine whether the autopilot feature in this car is actually safe."
Transportation Secretary Anthony Foxx, who oversees NHTSA, is expected to issue guidelines for autonomous vehicles this month. He and NHTSA chief Mark Rosekind have touted the potential for the technology to reduce traffic fatalities.
Those guidelines were expected to focus on fully self-driving cars, which are different from the Tesla Model S that was involved in the fatal crash; that car was simply equipped with an autopilot mode. But the Florida accident could lead transportation officials to add guidelines for autonomous features such as autopilot, said Karl Brauer, senior analyst at Kelley Blue Book.
“The manufacturer is kind of able to roll it out with whatever limitations and guidance options they want,” Brauer said. “We might see a shift to having more aggressive guidelines for those driver-assistance functions as well.”
Tesla’s autopilot feature -- which must be turned on by the driver -- uses cameras, radar and sensors to steer the vehicle, change lanes and adjust speed based on traffic, the company said.
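For readers curious how a loop like that can be organized, here is a rough, generic sketch of a driver-assist control cycle: it acts only while the driver has enabled it, fuses camera and radar detections, and slows to follow whatever it sees ahead. All class names, function names and thresholds below are invented for illustration; this is not Tesla's code or design.

```python
# Simplified sketch of a driver-assist control loop (illustrative only;
# not Tesla's implementation -- all names and thresholds are invented).

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float      # distance to the object ahead, in meters
    relative_speed: float  # closing speed in m/s (positive = closing)

def fuse(camera: Optional[Detection], radar: Optional[Detection]) -> Optional[Detection]:
    """Combine camera and radar tracks, preferring the nearer (more conservative) one."""
    candidates = [d for d in (camera, radar) if d is not None]
    if not candidates:
        return None
    return min(candidates, key=lambda d: d.distance_m)

def control_step(enabled: bool, hands_on_wheel: bool,
                 camera: Optional[Detection], radar: Optional[Detection],
                 set_speed: float, current_speed: float) -> float:
    """Return a target speed for this control cycle."""
    if not enabled:
        return current_speed          # driver has full control
    if not hands_on_wheel:
        return current_speed * 0.95   # e.g. warn and ease off until the driver re-engages
    lead = fuse(camera, radar)
    if lead is not None and lead.distance_m < 50:
        return min(set_speed, current_speed - lead.relative_speed)  # fall back to following
    return set_speed                  # road clear: resume the set speed

# Example cycle: radar sees a vehicle 40 m ahead closing at 5 m/s, so the car eases off.
print(control_step(True, True, None, Detection(40, 5.0), 29.0, 27.0))
```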
On Friday, the Palo Alto company said autopilot hardware is included in all its vehicles and that it was “continually advancing the software through robust analysis of hundreds of millions of miles of driving data.”
“We do not and will not release any features or capabilities that have not been robustly validated in-house,” a spokesperson said.
In owners manuals, corporate filings and other documents, Tesla makes clear that it believes drivers are responsible for maintaining control of their vehicles when using autopilot features.
In the owners manual for the Tesla Model X, for instance, the company notes that it is the driver’s responsibility “to stay alert, drive safely and be in control of the vehicle at all times.” It goes on to say drivers should “be prepared to take corrective action” and that “failure to do so can result in serious injury or death.”
Owners manuals also say drivers should always keep their hands on the steering wheel, even when using autopilot. The question, though, is whether those warnings will provide enough protection to shield Tesla from liability when autopilot causes or fails to avoid collisions.
Greg Keating, a law professor at USC, said they might not.
If a Tesla owner sued, he said, the owner might argue that he or she had a reasonable expectation that the autopilot system would live up to its name, regardless of the warnings in the manual.
Product liability claims could be costly for Tesla, which notes in Securities and Exchange Commission filings that the company -- not an insurer -- would be on the hook to pay any such claims.
“Any product liability claims will have to be paid from company funds, not by insurance,” the company notes in the “risk factors” section of its latest annual report.
In the absence of federal guidelines, several states have developed their own, leading to an inconsistent set of rules.
Since 2014, California has required manufacturers testing autonomous vehicles to submit detailed information, including the number of self-driving cars the companies are testing and the drivers who are testing them, Department of Motor Vehicles spokeswoman Jessica Gonzalez said.
So far, 111 autonomous vehicles from 14 manufacturers have been approved for testing on California roads.
Under California regulations, Tesla’s autopilot feature is not considered autonomous technology because it does not have the ability to drive without the “active physical control or monitoring” by a human operator, according to the DMV.
Brown, who called his car Tessy, had posted YouTube videos praising the autopilot function. Tesla Chief Executive Elon Musk touted one video on Twitter that was posted a month before the accident. In several of the videos, Brown’s hands were not on the wheel when the car was in motion.
“I have always been impressed with the car, but I had not tested the car’s side collision avoidance,” Brown wrote in comments posted April 5 with the video, which showed the technology saving him from being sideswiped by a truck. “I am VERY impressed. Excellent job Elon!”
The technology has been used in 130 million miles of driving without a fatality, Tesla said.
The driver of the tractor-trailer in the accident, Frank Baressi of Palm Harbor, Fla., told the Associated Press that Brown was “playing ‘Harry Potter’ on the TV screen” in the car when the crash took place. He acknowledged he couldn’t see the movie playing and that he only heard the audio.
Baressi did not respond to a voicemail requesting comment.
Sgt. Kim Montes, a spokeswoman for the Florida Highway Patrol, told The Times that a portable DVD player was found in the Tesla but said she did not know whether it had been in use.
“At the time of the impact, we don’t know what the status of that DVD player was. Investigators are looking into it,” she said. “We may never know.”
In a February conference call, executives touted the autopilot feature, saying early Tesla customers whose leases were about to expire were interested in leasing new models equipped with autopilot.
“Autopilot is certainly one of the core stories of what’s going on here at Tesla,” said Jonathan McNeil, the automaker’s president of global sales. “It’s really exciting.”
jim.puzzanghera@latimes.com
Times staff writers James Rufus Koren, Samantha Masunaga and Tracey Lien contributed to this report.
end quote.
I have mentioned before that I was driving south on Interstate 80 across the Bay Bridge from Berkeley to San Francisco. I looked to my right and saw a Tesla in an obvious autopilot configuration. Looking down from my large four-wheel-drive Tundra, I could clearly see that the driver (a male) did not have his hands on the wheel. I realized I didn't want to be driving next to a computer driving that car, so I pulled ahead, knowing that the Tesla's driver, with a scared look on his face from letting go of the steering wheel completely, was having some sort of out-of-body experience while driving.
Out-of-body experiences are best left for when you are sitting down or lying on a bed, where you can survive them, not for when your beta app (one that was never meant to be "left alone to drive") isn't fully functional because it is still in a beta, testing-only phase.
2nd note: Another thought I had was that the beta autopilot on a Tesla tells you to hold onto the wheel while the computer is driving. That didn't make sense to me at first: wouldn't holding the wheel interfere with the wheel turning? Or is the technology separate from the wheel, deferring to you when you start moving it?
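From what I understand, lane-keeping systems generally steer through the power-steering motor and yield the moment they sense the driver applying torque to the wheel, so holding the wheel doesn't fight the computer; it just lets the car hand control straight back to you. Here is a rough, generic sketch of that hand-back logic (invented names and thresholds, not any manufacturer's actual code):

```python
# Generic sketch of a driver-override check in a lane-keeping system.
# Invented names and thresholds; not taken from any manufacturer's code.

DRIVER_TORQUE_THRESHOLD_NM = 1.5   # steering torque above this means the driver is steering

def steering_command(assist_engaged: bool, driver_torque_nm: float,
                     assist_angle_deg: float, driver_angle_deg: float) -> float:
    """Pick whose steering input reaches the wheels this cycle."""
    if assist_engaged and abs(driver_torque_nm) < DRIVER_TORQUE_THRESHOLD_NM:
        return assist_angle_deg     # computer keeps steering
    return driver_angle_deg         # driver torque detected: the driver's input wins

# Hands resting lightly on the wheel (0.3 Nm): the assist keeps steering.
print(steering_command(True, 0.3, 2.0, 0.0))   # -> 2.0
# Driver turns the wheel firmly (3.0 Nm): control hands back to the driver.
print(steering_command(True, 3.0, 2.0, -5.0))  # -> -5.0
```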
So the whole technology is confusing, and it reminds me of the Asiana crash in San Francisco a few years ago, when a pilot training another pilot had so much of the automation engaged that they could not safely land the plane; the plane's tail was sheared off on the seawall short of the runway, and several people died.
Whenever you try to do too many things in too many ways at the same time, whether flying or driving, some people are eventually going to die. To me, this is only logical. Simpler is always better.
However, programmers, designers and engineers often are not thinking of the practical survival applications of these complications. All these devices are great, but when you combine too many variables into any situation, problems are eventually going to result: deaths, near deaths, maimings, or just fender benders.
If something like this happens on a computer, the computer freezes or shuts down or whatever. When something like this happens in a vehicle (plane, car or truck), people get maimed or die! So the need for accuracy is hundreds or thousands of times greater in a moving vehicle than in a computer sitting on a desk.
And to me