One of the major problems with Tesla’s implementation of its semi-automated, Level 2 driver-assist system, confusingly called Autopilot or, in other variants, Full Self-Driving, is that it has inadequate systems in place to confirm there’s a human at the wheel, ready to take over in an instant, and, like all Level 2 systems, has no safe failover behavior to get the car out of the way if no driver is detected or something else goes wrong. While I’ve had many Tesla-lovers scream at me about how these things just aren’t problems, the existence of morons like this guy arrested for misusing his Tesla’s Autopilot is proof that they are.
And this dipshit, oh boy, is an especially egregious example of the problem, as the not-driver, a 25-year-old noted Instagram tumor named Param Sharma, claims to have driven in a Tesla with no one in the driver’s seat for a total of 40,000 miles.
Now, just to clarify, I’m not saying the fault here all lies with Tesla, not by a long shot—Teslas do provide a bunch of warning messages and it’s very clearly stated that, no, dummy, you should not ignore what’s going on and not be ready to take over, because the system is not designed to work that way.
Sharma is a fucking idiot, and chose deliberately to do these things, putting people all around him on the roads in danger, and, potentially less tragically, himself.
Sharma was arrested and cited for disobeying an officer, and after his arrest, in an interview with the Associated Press, Sharma revealed that his conception of what his Tesla is capable of is very different from the company’s official line:
“It was actually designed to be ridden in the back seat…I feel safer in the back seat than I do in the driver’s seat, and I feel safer with my car on Autopilot, I trust my car Autopilot more than I trust everyone on the road.”
He also told AP that he felt Elon Musk wants him to do dumb shit like this, an idea that comes up in some of his cringy Instagram videos, like here where in the comments he mentions that Elon Musk is “blue collaring” for him, one of his many embarrassing digs at people who actually work, as opposed to the “gold collar” he imagines himself to wear.
I’m not certain I’ve ever encountered anyone less likable. Is this a joke? A parody account? Or does this person actually suck this dramatically?
In fact, right after he was arrested, he posted this video on Instagram of him doing the same miserably stupid thing:
The video also stays very much on-brand for Sharma, who reminds us both that he’s so very rich and, simultaneously, reminds us that money is in no way an indicator of a human’s actual worth, which in this case appears to be quite minimal.
He seems to be enjoying the attention his dumb Tesla stunt got him:
So, yes, Sharma is a whole lot of garbage crammed into a wet, dripping Hermès bag, and while that is a problem, Tesla’s poor driver monitoring systems are a problem, too.
Defeating the steering wheel torque sensor is pretty trivial; Consumer Reports did it with some string and a couple rolls of tape. That, plus buckling the seat belt on an empty seat, is all it takes to get a Tesla driving in semi-automated mode without a person in the driver’s seat, as Sharma claims to have done for tens of thousands of miles.
And while, sure, for most highway driving the system works reasonably well, it’s by no means infallible; there are many examples of poor decision-making by the system, confusion about billboards, other issues, and even fatal crashes. Even if it were a great driver, the fact that it still requires a person to take over at any moment is a huge problem: would you hire a narcoleptic chauffeur who was almost always a safe driver, but could potentially fall asleep at any moment?
I’m guessing you wouldn’t.
Again, it doesn’t matter how well the system drives: if a driving-assist system requires a person to be alert and ready to take over, then the design of that system needs to be as resistant to idiotic abuse as possible. Because idiots will always put too much faith in a system, and then we get shit like this.
Currently, the best a Tesla can do if it doesn’t detect a driver at the wheel even after warning chimes is to slow to a stop and turn on its emergency flashers, which, if this were to happen on a highway, could be disastrous—nobody should be just stopping in their lane on a highway. If steps are taken to fool the system, as in the case of Sharma or any of these other morons, then there’s not even this minimal failsafe.
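Just to make the shape of that failover problem concrete, here’s a tiny sketch of the logic described above as a state machine. To be clear, every name and threshold here is my own illustrative assumption, not Tesla’s actual code; the point is simply that the "best case" end state is stopping in a live lane:

```python
from enum import Enum, auto

class State(Enum):
    DRIVER_ENGAGED = auto()
    WARNING = auto()            # chimes sounding, driver not detected
    STOPPING_IN_LANE = auto()   # the problematic "best case": slow to a
                                # stop in the lane, flashers on

# Assumed timeout, purely for illustration
WARNING_TIMEOUT_SECONDS = 10

def failover_step(state, driver_detected, warning_seconds):
    """Advance the hypothetical monitoring state machine one tick."""
    if driver_detected:
        # Any sign of a driver (easily spoofed, per the above) resets things
        return State.DRIVER_ENGAGED
    if state is State.DRIVER_ENGAGED:
        return State.WARNING
    if state is State.WARNING and warning_seconds > WARNING_TIMEOUT_SECONDS:
        return State.STOPPING_IN_LANE
    return state
```

Note that nothing in this sketch gets the car *out of the way*; the terminal state is a stopped car in a travel lane, which is exactly the Level 2 limitation being criticized here. And if the "driver detected" signal is fooled with string and tape, the machine never leaves the first state at all.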
It’s clear that among at least some subset of Tesla owners, there’s both a severe misunderstanding of the capabilities of Autopilot and a willingness to blithely misuse the system, which at worst leads to crashes and at best leads to more difficulties in the development and adoption of future automated driving systems.
It would be one thing if it just had the potential to harm the person inside the car, like not wearing a seat belt; in Sharma’s case, one could just listen to this craptacular track he recorded and think, well, if he wants to do dangerous shit, okay:
Unfortunately, much like this unlistenable song, what he’s doing has the potential to harm other people as well, people who didn’t sign up to be part of one dork’s insecure tribute to colossal shittiness.
Tesla needs to accept that there are jackasses who will misuse its systems, and it has a responsibility to do more to keep that from happening.
The smartest path would be to push for safe failover systems, like what you’d have in basic Level 3; failing that, better driver monitoring is needed.
Even if that driver is a dipshit.