Tesla prides itself on cutting-edge innovations in car manufacturing, particularly its famous Autopilot system.
However, recent developments have indicated that this technology may not be as safe as previously thought.
The story has gained traction because, according to Harris County Constable Mark Herman, no one was driving the car at the time of the crash.
While safeguards are supposedly in place to ensure that someone is in the driver’s seat and holding the steering wheel, Herman said he believes no human was driving the car when the crash occurred.
In response to this tragedy, Consumer Reports set out to test how easy it is to get a Tesla to drive with no one in the driver’s seat.
Its findings were alarming.
“Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot, the automaker’s driver assistance feature, without anyone in the driver’s seat — a scenario that would present extreme danger if it were repeated on public roads,” the report said.
Jake Fisher, CR’s senior director of auto testing, performed the test. He first engaged the Autopilot feature with the car in motion on a closed track and then set the speed dial to zero, which caused the car to stop.
Next, he simulated the weight of a driver’s hand by draping a weighted chain over the left side of the steering wheel. He then moved into the passenger seat without unbuckling the driver’s seat belt or opening the car’s doors.
Finally, he used the steering wheel dial to make the car accelerate again. It stopped only when he returned the dial to zero.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher said.
“It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
CR noted that comparable systems from other automakers include stronger driver-monitoring safeguards. General Motors’ Super Cruise, for example, uses a camera to verify that the driver is in the seat and watching the road.
Jalopnik argued that it is not just the occupants of the self-driving car who are put at risk by Tesla’s lax safety measures, but also other drivers on the road.
“You might argue that anyone intentionally using Autopilot while not in the driver’s seat deserves whatever comes next, but the people that this driverless Tesla could crash into don’t,” the tech outlet said.
“After all, Autopilot is imperfect. Further, the point of pretty much all car safety tech is to save people from themselves.”
Any way you slice it, this is not a good look for Tesla. CEO Elon Musk attempted to clear the company’s name in a tweet after the crash in Texas.
“Data logs recovered so far show Autopilot was not enabled & this car did not purchase [full self-driving],” he said. “Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”
It was not immediately clear whether Autopilot was to blame for the fatal accident. Either way, CR’s research exposes a serious flaw in Tesla’s safety features that should be addressed immediately.