Tesla Autopilot Most Often Used For Highway-Speed Driving, MIT Researchers Say
By Marco della Cava, USA TODAY
May 31, 2018
SAN FRANCISCO — Tesla's innovative and controversial Autopilot software — which powers the partially self-driving features of its electric cars — is most often used for highway driving, according to the initial findings of an MIT study using volunteer owners.
The research, shared at a conference in Cambridge, Mass. Wednesday, came a day after the latest crash of a Tesla using Autopilot, and as two consumer groups renewed criticism of the software's name and marketing, which they say dangerously misleads drivers.
Launched in 2015, the software Tesla CEO Elon Musk once said could be "safer than humans" is receiving more scrutiny as the number of Teslas on the road increases and other automakers unveil their own partially autonomous vehicles.
The Massachusetts Institute of Technology, in an ongoing study of 34 Tesla owners who volunteered for the project, found Autopilot was used during 36% of the miles driven by the 22-car test group (some cars are owned by couples).
"That it's actually being used quite often by most drivers jibes with what Musk has said," Bryan Reimer, research scientist in the MIT AgeLab and associate director of MIT's New England University Transportation Center, told USA TODAY.
Drivers are most likely to use it for highway-speed driving, with the next biggest cluster between 25 and 45 miles per hour, Reimer said in an address Wednesday to the New England Motor Press Association's annual tech conference.
Tesla owners seem to come in a few flavors. Some adore their cutting-edge tech sedans but never engage Autopilot, while most are so enamored of the semi-autonomous features that they use them regularly to take some of the monotony out of driving. (In more extreme cases, they film themselves while the car drives itself.)
Reimer says that interviews with participants in other research test groups show a "glaring gap" in the understanding of automation and safety technology. He says that demands an increase in driver education on the part of stakeholders such as automakers, dealers and perhaps even licensing authorities.
That confusion may well have played a role in some of the recent crashes involving Teslas on Autopilot.
The latest one happened Tuesday in Laguna Beach, Calif., where a Model S hit and totaled a parked, but empty, police car.
In an earlier Utah crash, the driver — who sustained only a broken ankle after hitting a stopped fire truck at 60 mph — was looking at her phone and disengaged from driving for 80 seconds before impact. In March, the driver of a Model X died after his Autopilot-enabled car steered into a highway divider in Mountain View, Calif.
In each case, Tesla has responded by reminding consumers that the system is not meant to turn the vehicle into a self-driving car and that it requires constant driver oversight.
But Musk's enthusiastic forecasts for the capabilities of Autopilot, as well as its name, often override those admonitions, say two consumer groups.
Consumer Watchdog and the Center for Auto Safety held a press conference in Los Angeles Wednesday to urge state and federal regulators to push Tesla to rename Autopilot and possibly require that it be tested further.
"People relying on Tesla (Autopilot) are getting killed, and that's what we're trying to stop," said John Simpson, privacy and technology project director at Consumer Watchdog, which has been dogging Tesla for years.
Simpson said the two groups are urging California Department of Motor Vehicles officials to investigate how the electric automaker's claims about the technology match up to Autopilot's reality.
Jason Levine, executive director of the Center for Auto Safety, said that the consumer groups also are asking the Federal Trade Commission to look into what they call "dangerously misleading and deceptive marketing practices" associated with Autopilot.
"Tesla has captured the imagination of the buying public with cars pitched directly to consumers by a celebrity CEO," Levine said. "But the software and hardware (of Autopilot, which uses radar and cameras to scan the road ahead) need to be improved, or the name has to be changed. These recent deaths should give politicians pause."
One of the trickiest aspects of semi-self-driving cars involves the moment when a situation requires handing control back to the driver, also known as the "handoff."
Automakers from Tesla to Nissan to Cadillac have used different kinds of feedback to force the driver to re-engage with the vehicle once this driver-assist tech is deployed.
In the MIT study, of the nearly 20,000 studied Autopilot "disengagements" — when control was handed back to drivers — a scant 0.5% were initiated by the car, with humans taking over for reasons ranging from planned maneuvers to complex road scenarios.
That result suggests this particular test group does not seem to be abusing the system, which would trigger various warnings to take back control of the vehicle.
The general public has been fascinated by the coming age of self-driving cars, a future heralded by some of technology's biggest names. But the effort took a hit in March when an Uber autonomous car killed a pedestrian in Arizona, leading the company to pull out of testing in the state.
Just how much confusion exists is hammered home by a finding from an MIT AgeLab survey question that asked respondents: "To your awareness, are self-driving vehicles available for purchase today?"
Nearly 23% said yes. But no self-driving vehicles are available for purchase today; only cars with semi-autonomous features such as Autopilot are on the market.
"As I mentioned," says Reimer, "there's a lot of confusion out there."
Follow USA TODAY tech writer Marco della Cava on Twitter.