Just days after it became available, Tesla's new Smart Summon feature has come under fire from owners and professional testers alike.
The feature is supposed to offer a way for Tesla owners to have their car navigate itself out of a parking space, across the lot, and right to where they are standing — provided the route doesn't involve any public roads.
Smart Summon is part of a package Tesla calls Full Self-Driving, or FSD, which currently costs $6,000 and which Tesla says will one day make the car fully autonomous. Tesla says "automatic driving on city streets" is due via a software update before the end of 2019.
However, soon after Smart Summon became available via the Version 10 software update earlier in October, owners began posting videos to social media demonstrating the system's shortcomings. The videos showed how the cars were sometimes unable to see approaching traffic, and how they could not always tell the difference between grass and asphalt, driving across the former to reach their owners.
Now, Consumer Reports — which has called out Tesla's autonomous driving technology in the past — has had its say. Pulling no punches, the publication described the Smart Summon system of its own Tesla Model 3 as acting like a drunk or distracted driver.
Jeff Plungis, the site's lead automotive tester, said: "The vehicle drove in the middle of the traffic lane, not on the side closer to the parked cars, as a human driver would. It would wander left and right as it drove - erratically, like a drunken or distracted driver. In another instance, the Model 3 drove itself the wrong way on a one-way lane. The tester had to run out to the car to move it to allow traffic to begin flowing again."
However, the report also accepted that "under the right circumstances," the Smart Summon system would "slowly and successfully make its way to the person summoning it with a smartphone - and in those circumstances the car was indeed controlling itself, steering, braking, and making decisions about its route."
The report said the system "was glitchy and at times worked intermittently, without a lot of obvious benefit for consumers."
Jake Fisher, senior director of auto testing for Consumer Reports, said: "What consumers are really getting is the chance to participate in a kind of science experiment. This is a work in progress."
During one test, the Model 3 is said to have become confused about where it was, mistakenly identifying the parking lot as a public road — which caused Smart Summon to stop working and forced the driver to go and move the car manually. Smart Summon stops (and the car immediately halts) when the owner removes their finger from a button on the Tesla smartphone app.
These findings fall into line with numerous videos posted to Twitter and YouTube in recent days, showing Tesla cars struggling to navigate themselves around parking lots.
Tesla says Smart Summon is currently in the beta stage of development, is not a finished product, and should therefore be treated with care. Owners using Smart Summon remain ultimately responsible for the vehicle at all times, and are urged to only use the system in a safe environment.
The US National Highway Traffic Safety Administration (NHTSA) says it is aware of reports of Smart Summon working incorrectly, and has ongoing contact with Tesla through which it plans to gather more information on the system and its failings.
"Tesla once again is promising 'full self-driving' but delivering far less, and now we're seeing collisions," said Ethan Douglas, a senior policy analyst at Consumer Reports in Washington DC. "Tesla should stop beta-testing its cars on the general public by pushing out experimental features before they're ready."