
It’s time for the government to come down on Tesla and “Full Self Driving” before more people die

It’s time for some government intervention.

Before I get accused of being a bleeding-heart liberal who wants government control of everything, let me be clear: I think there’s a time and place for government, but it also doesn’t need to be all up in my business all the time.

One of the places where I think regulation plays an important part is public safety. In automobiles, that falls to the National Highway Traffic Safety Administration (NHTSA). They make the rules for cars that can be sold in the U.S. They crash test the cars. They are the ones who are supposed to ensure the vehicles sold here are safe.

That doesn’t mean automakers don’t want to make safe cars, but there needs to be a safety net in place to ensure that they do.

When it comes to Tesla, the NHTSA has been doing a woefully inadequate job. When a Tesla is in a crash, it’s actually a safe vehicle. But the driver-assist technology that’s supposed to prevent crashes in the first place seems to also be causing them.

Tesla’s Autopilot software and the horribly named “Full Self Driving” add-on are advanced driver assistance features. They aren’t self-driving, and they aren’t designed to let the driver stop paying attention; the driver needs to be ready to take over at any time.

You wouldn’t know that second part based on the name, though. Most people assume Autopilot means the computer does everything, just like a plane’s autopilot.

Except a plane’s autopilot doesn’t do everything. Ask a layperson and they’ll say “planes fly themselves,” but a pilot won’t. Elon Musk seems to think people understand that difference.

When asked by Automotive News whether people know that it’s just an assistance feature, in the wake of people climbing into the back seats of their cars and sleeping, he said, “No, they obviously know.”

If they do obviously know, they don’t care. That’s the problem. There might be warning boxes and indicators and whatnot to remind drivers that it’s an assist feature and that it’s in “beta,” but out in the real world people are bypassing those warnings.

They can do that because there’s no active driver monitoring. There’s no camera to look at the driver to make sure they’re paying attention. So drivers just add some weight to the steering wheel, keep the seat belt buckled, and climb into the passenger seat to take a snooze.

“Full Self Driving” is even worse, because while it might be a more comprehensive system, it still requires just as much driver intervention and vigilance. It’s also horribly unpredictable.

If any automaker other than Tesla put this software into a car and sold it to the general public (“FSD” is in a limited “beta” but has rolled out to more people outside the company), they’d be sued out of business. But not Tesla. Those who stan Tesla put up with it because Elon Musk is “changing the world.” If you question the cultlike logic, you’re accused of being jealous of his success.

But this isn’t about Elon Musk. Elon is on brand here, pushing the limits of what he and his company can get away with when it comes to government oversight. He’s laughed off the Securities and Exchange Commission (SEC). He’s even faced legal trouble for calling someone a pedophile on Twitter. He just doesn’t care.

But crashes involving Tesla’s driver assist software keep happening. Just this past week there was this event with a commercial truck, and this event where someone hit the back of a Michigan State Trooper’s vehicle. In the latter case, the driver had a suspended license, so they must’ve thought that a Tesla driving itself means they don’t need one.

Driver training in the United States is terrible compared to the rest of the world. People aren’t using driver-assist functions the way they were intended because they either don’t care or don’t understand how they work. With names like Autopilot and “Full Self Driving,” it’s easy to see how people could get confused. Heck, there are still automotive journalists today calling these things self-driving cars, when no self-driving car is available for the public to buy.

Tesla is beta testing with people’s lives. I didn’t agree to the software license agreement for either of those technologies. I didn’t agree to put my life at risk because some idiot doesn’t see the danger in climbing into the fucking back seat of a car driving down the highway.

The government needs to step in. Autopilot and “Full Self Driving” are horrible names for the technology and should be changed. Tesla vehicles should check for weight in the driver’s seat, a buckled seat belt, and hands on the wheel, and disable the driver-assist functions if the driver isn’t actually there. The more advanced features should be hobbled until Tesla installs an active driver monitoring system to ensure the driver is actually paying attention.

Finally, any software tagged as “beta” needs to be limited to certain areas and to people who have had advanced driver training.

People are dying because of Tesla’s driver assistance software. Countless crashes are occurring. The only reason there still hasn’t been oversight is that the “right person” hasn’t died yet. How about instead of waiting for that moment, we do something now?

Written by Chad Kirchner