Teslas are now smart enough to avoid McAfee’s self-driving car hack

A little electrical tape has caused quite the stir.

This week, McAfee security researchers released 18 months' worth of research demonstrating how easily a "smart" vehicle can be tricked into misreading a speed limit and accelerating past it. The finding that a few strategically placed strips of black tape on a speed limit sign could fool a car equipped with Mobileye cameras (used for advanced driver-assistance systems) into targeting 85 mph instead of the posted 35 mph limit certainly seemed alarming.

But there were some major caveats to the research. Chiefly, for fully self-driving vehicles (i.e., cars that rely entirely on computer control, as opposed to a hybrid system like Tesla's, which splits piloting between human and software), the weakness discovered in older Teslas isn't an actual issue.

You can check out McAfee's successful hack, in which the car's cruise control zooms past 35 mph, in the team's demonstration video.

But before freaking out about all the ways self-driving and automated vehicles are doomed, consider that the McAfee Advanced Threat Research team tested this hack on two 2016 Teslas; newer Tesla models have since dropped Mobileye cameras in favor of the company's own proprietary cameras.

Also, the version of the Mobileye camera used in those models has since been updated, and the current version is no longer susceptible to the hack.

As Steve Povolny, head of McAfee Advanced Threat Research, explained in a phone call, most advanced driver-assistance systems with collision avoidance or adaptive cruise control don't rely solely on camera sensors. Regardless, his team wanted to highlight machine learning vulnerabilities that the industry needs to constantly monitor and improve. "We're here to show these weaknesses exist," he said.

For the researchers, this eye-opening hack highlights how makers of parts for autonomous and automated vehicles could preempt dangerous scenarios. "We're starting to change the topic of conversation," Povolny said. "We didn't get a chance to do that with browsers and operating systems."

Look at a self-driving car company like Google spin-off Waymo, whose autonomous vehicles are representative of most self-driving systems. These vehicles rely primarily on mapping data that the company's cars have manually collected; those maps mark everything from stop-sign placement to curb height. In the electrical-tape scenario, Waymo has a safeguard built into its sensor and computer system: even if a sign appears to show a higher speed limit than what's in the map database, the car will never go faster than what's been programmed.
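To make that safeguard concrete, here is a minimal, hypothetical Python sketch of a "mapped limit wins" rule. The names (MAP_SPEED_LIMITS, choose_target_speed) are illustrative assumptions, not Waymo's actual code.

    # Hypothetical sketch of a "mapped limit wins" safeguard; names here
    # are illustrative and not drawn from Waymo's actual software.
    MAP_SPEED_LIMITS = {"road_segment_42": 35}  # mph, from pre-collected maps

    def choose_target_speed(segment_id, perceived_limit):
        """Cap the target speed at the mapped limit, so a tampered or
        misread sign can never push the car past what's programmed."""
        mapped_limit = MAP_SPEED_LIMITS.get(segment_id, perceived_limit)
        return min(perceived_limit, mapped_limit)

    # A tape-altered sign misread as 85 mph still yields the mapped 35 mph:
    assert choose_target_speed("road_segment_42", 85) == 35

Under a rule like this, the tape attack becomes a no-op: the camera's misreading is simply outvoted by the map.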

Waymo's vehicles don't necessarily ignore all the visual information their cameras read as they drive around, but they also don't immediately acquiesce to a posted sign. If a Waymo vehicle comes upon a construction zone, it's trained to respond appropriately, like slowing down to the posted speed. Waymo's machine learning process allows its robotaxis to read road signs, text on emergency vehicles, and other signage on cars and trucks, like "Oversized" or "Student driver," as the company explained in a recent blog post.

So, you can try to troll a Waymo or another self-driving car with some black tape, but don't expect it to budge past the legal limit.
