
Tesla autopilot feature hacked to risk oncoming traffic

Tesla’s high-end vehicles’ lane recognition system is not free from technical glitches, Keen Labs claims in new research.

Cybersecurity firm Keen Labs published a research paper [PDF] on Saturday describing three hacks it found that can be used to manipulate a Tesla Model S. The first two hacks target the Autopilot lane recognition system, while the third targets the vehicle’s automated rain detection system for the windshield wipers.

See: Bug bounty: Hack Tesla Model 3 to win your own Model 3

The researchers first tried to confuse the system by altering the physical road surface, placing blurred patches on the left lane markings. They found, however, that this method was not easy to pull off in real life because Tesla’s computer could identify the flaw. They were unable to disable the lane recognition feature in a moving Tesla because the company has already included abnormal lanes in the Autopilot function’s training miles, allowing it to keep a good sense of lane direction.


Still determined to compromise the Autopilot lane recognition system, the Keen Labs researchers tried the reverse approach: making the system believe there was a traffic lane where none actually existed. To do so, they painted three small squares in the traffic lane.


The purpose was to imitate merge striping and feed the car’s Autopilot system misleading lane-direction information, causing it to veer into oncoming traffic in the left lane. According to Keen, this was a far more dangerous tactic than making the system believe there is no lane at all.


The researchers also targeted the automatic windshield wipers. Rather than conventional moisture sensors, Tesla’s vehicles identify rain through the Autopilot cameras and AI, switching the wipers on automatically. A 120-degree fisheye camera captures clear images of the windshield, which are preprocessed and sent to a neural network.

The neural network, the researchers explained, then outputs a float value between 0 and 1 representing the probability of moisture on the windshield. The researchers mimicked raindrops by exposing the vehicle’s cameras to an image crafted to trigger the required reaction from the neural network.
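The pipeline described above (camera frame → neural network → moisture probability → wiper trigger) can be sketched as follows. This is an illustrative assumption only: the function names, the brightness heuristic standing in for the neural network, and the 0.5 threshold are all hypothetical, not Tesla’s actual implementation.

```python
# Illustrative sketch of a camera-based rain-detection loop.
# All names and the threshold are hypothetical; Tesla's real
# pipeline is not public.

def moisture_probability(frame):
    """Stand-in for the neural network: returns a float in [0, 1]
    estimating the likelihood of moisture on the windshield.
    Here a simple average-brightness heuristic fakes the model."""
    avg = sum(frame) / len(frame)
    return max(0.0, min(1.0, avg / 255.0))

def should_activate_wipers(frame, threshold=0.5):
    # Wipers trigger when the estimated moisture probability
    # exceeds a (hypothetical) decision threshold.
    return moisture_probability(frame) > threshold

# A dark, dry-looking frame vs. a bright frame crafted to push
# the stand-in model over the threshold:
dry_frame = [30] * 100      # low pixel intensities
spoofed_frame = [220] * 100  # intensities chosen to fool the model

print(should_activate_wipers(dry_frame))      # False
print(should_activate_wipers(spoofed_frame))  # True
```

The attack in the paper works on the same principle: because the decision rests entirely on what the camera sees, an adversarial image shown to the camera can push the model’s output past the trigger point without any actual rain.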

A Tesla spokesperson stated that the identified issues do not pose a real threat, since no drivers have so far encountered or complained about any such problems. The company’s official statement read:

“In this demonstration, the researchers adjusted the physical environment (e.g. placing tape on the road or altering lane lines) around the vehicle to make the car behave differently when Autopilot is in use. This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”

Furthermore, Tesla’s spokesperson said that Keen Labs’ findings did not qualify for Tesla’s bug bounty program, but that the company was nonetheless grateful to the researchers for identifying potential flaws.

“We know it took an extraordinary amount of time, effort, and skill, and we look forward to reviewing future reports from this group,” Tesla’s spokesperson said, acknowledging Keen’s efforts. However, the company maintains that such an attack would not succeed in the real world and hence cannot be termed a security concern.

See: Researchers demonstrate how to unlock Tesla wireless key fobs in 2 seconds

Regarding the windshield hack, Tesla notes that the wipers’ auto mode is still in beta and that the Owner’s Manual already carries a warning, so in case of technical glitches drivers can always switch to the manual setting.

Keen Labs also released a video demonstrating all three hacks its researchers identified.

