
Investigating Problems with the Tesla HomeLink RF Signal with a HackRF and GNU Radio

Tesla vehicles have a feature that lets them copy and mimic a garage door remote via a built-in transmitter on the car itself. This frees you from having to carry around a garage door key fob; instead you can open your garage door by pressing a button on the car's LCD screen.

However, some people have reportedly been having trouble with this feature: in some cases the garage door begins opening and then suddenly stops, as if the key fob button had been pressed twice.

Over on YouTube, CWNE88 decided to investigate this problem using his HackRF and GNU Radio. From a simple waterfall plot he was able to determine that the Tesla actually transmits the mimicked garage door signal for a full two seconds.

As a key press on the original key fob would typically result in a much shorter transmission, CWNE88 believes that the long two-second transmission could in some cases be interpreted as two separate transmissions by the garage door opener, resulting in an open command followed by a close command.
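The burst length CWNE88 read off the waterfall can also be measured offline from a recorded capture. Below is a minimal sketch that thresholds the signal envelope and counts the active time; it assumes a 1 MS/s capture and uses synthetic data in place of a real HackRF recording (the sample rate, smoothing window, and half-peak threshold are all assumptions, not details from the video):

```python
import numpy as np

fs = 1_000_000                 # assumed sample rate of the capture (1 MS/s)
n = int(fs * 3.0)              # 3 s of synthetic "recording"

rng = np.random.default_rng(0)
sig = 0.01 * rng.standard_normal(n)          # noise floor
sig[int(0.5 * fs):int(2.5 * fs)] += 1.0      # a 2 s burst, like the Tesla transmission

# Envelope -> smooth -> threshold at half the peak, then count active samples
env = np.abs(sig)
smoothed = np.convolve(env, np.ones(1000) / 1000, mode="same")
active = smoothed > smoothed.max() / 2
duration_s = active.sum() / fs
print(f"burst duration = {duration_s:.2f} s")
```

On a real capture you would replace the synthetic `sig` with samples loaded from a HackRF IQ file; the same envelope-and-threshold approach distinguishes a two-second transmission from a typical short key fob press.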

Tesla HomeLink RF Signal

Running a Tesla Model 3 on Autopilot off the Road with GPS Spoofing

Regulus is a company that deals with sensor security issues. In one of their latest experiments they performed GPS spoofing with several SDRs to show how easy it is to divert a Tesla Model 3 driving on Autopilot away from its intended path. Autopilot is Tesla's semi-autonomous driving feature, which allows the car to decide its own turns and lane changes using information from the car's cameras, Google Maps, and its Global Navigation Satellite System (GNSS) sensors. Previously drivers had to confirm upcoming lane changes manually, but a recent update allows this confirmation to be waived.

The Regulus researchers noted that the Tesla is highly dependent on GNSS reliability, and so they were able to use an SDR to spoof GNSS signals, causing the Model 3 to perform dangerous maneuvers like "extreme deceleration and acceleration, rapid lane changing suggestions, unnecessary signaling, multiple attempts to exit the highway at incorrect locations and extreme driving instability". Regarding exiting at the wrong location, they write:

Although the car was a few miles away from the planned exit when the spoofing attack began, the car reacted as if the exit was just 500 feet away— slowing down from 60 MPH to 24 KPH, activating the right turn signal, and making a right turn off the main road into the emergency pit stop. During the sudden turn the driver was with his hands on his lap since he was not prepared for this turn to happen so fast and by the time he grabbed the wheel and regained manual control, it was too late to attempt to maneuver back to the highway safely.
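The quoted speeds mix units (60 MPH but 24 KPH). A quick conversion to a common unit makes the scale of the deceleration clearer:

```python
MPH_TO_KPH = 1.609344  # km per international mile (exact)

start_kph = 60 * MPH_TO_KPH  # the quoted 60 MPH
end_kph = 24.0               # the quoted 24 KPH

print(f"{start_kph:.1f} km/h -> {end_kph:.1f} km/h")
print(f"roughly a {100 * (1 - end_kph / start_kph):.0f}% drop in speed")
```

In other words, the car slowed from about 97 km/h to 24 km/h (about 15 mph), roughly a 75% drop, in reaction to the spoofed exit.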

In addition, they also tested spoofing on a Model S and found a link between the car's navigation system and the automatically adjustable air suspension system: the Tesla appears to adjust its suspension depending on the type of road it is on, which is recorded in its map database.

In their work they used an ADALM-PLUTO SDR ($150) for their jamming tests and a bladeRF SDR ($400) for their spoofing tests. Their photos also show a HackRF.

Regulus are also advertising that they are hosting a webinar on July 11, 2019 at 09:00 PM Jerusalem time. During the webinar they plan to talk about their Tesla Model 3 spoofing work and release previously unseen footage.

GPS/GNSS spoofing is not a new technique. In the past we've posted several times about it, including stories about using GPS spoofing to cheat at Pokémon Go, misdirect drivers using Google Maps for navigation, and even a story about how the Russian government uses GPS spoofing extensively.

Some SDR tools used to spoof the Tesla Model 3.

Stealing a Tesla Model S in Seconds by Cloning its Wireless Keyfob

Recently wired.com ran a story that explains how researchers from KU Leuven university in Belgium have been able to clone a Tesla car key fob within seconds. With the cloned key fob they are then able to open the Tesla's doors, start the motor, and drive away. The researchers believe this attack could also work on cars sold by McLaren and Karma, as well as Triumph motorcycles.

Like most automotive keyless entry systems, Tesla Model S key fobs send an encrypted code, based on a secret cryptographic key, to a car's radios to trigger it to unlock and disable its immobilizer, allowing the car's engine to start. After nine months of on-and-off reverse engineering work, the KU Leuven team discovered in the summer of 2017 that the Tesla Model S keyless entry system, built by a manufacturer called Pektron, used only a weak 40-bit cipher to encrypt those key fob codes.

The researchers found that once they gained two codes from any given key fob, they could simply try every possible cryptographic key until they found the one that unlocked the car. They then computed all the possible keys for any combination of code pairs to create a massive, 6-terabyte table of pre-computed keys. With that table and those two codes, the hackers say they can look up the correct cryptographic key to spoof any key fob in just 1.6 seconds.
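The reported numbers line up with some back-of-the-envelope arithmetic: a 40-bit cipher has about 1.1 trillion possible keys, and dividing the reported 6 TB table by that keyspace suggests only a few bytes stored per entry (the bytes-per-entry figure below is our inference, not something the researchers state):

```python
keyspace = 2 ** 40                 # 40-bit cipher: ~1.1 trillion candidate keys
print(f"{keyspace:,} keys")        # 1,099,511,627,776

table_bytes = 6 * 10 ** 12         # the reported 6 TB precomputed table
bytes_per_entry = table_bytes / keyspace
print(f"~{bytes_per_entry:.2f} bytes per precomputed entry")
```

A keyspace that small is well within reach of a precompute-once, look-up-many time/memory trade-off, which is exactly what the 1.6-second lookup exploits.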

The attack hardware consists of a Yardstick One dongle, a Proxmark RFID/NFC radio, and a Raspberry Pi connected to the 6TB hard drive containing the database of pre-computed keys. Altogether the cost of such a system is under $600.

The actual attack works by first bringing the RFID antenna and radio near the car and recording the vehicle's identifier code, which is periodically transmitted by the car. The antenna is then brought near the owner's key fob and impersonates the car using the identifier code. This tricks the key fob into sending out encrypted response codes, which are then decrypted via the 6TB lookup table on the hard drive. The Yardstick One is then used to transmit the final unlock code at 433.92 MHz.
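The core idea, that two challenge/response pairs are enough to pin down the key, can be demonstrated with a toy cipher. The sketch below uses a made-up 16-bit keyed function (NOT the real 40-bit Pektron cipher; the keyspace is shrunk so an exhaustive search finishes instantly), and the exhaustive loop stands in for what the real attack does via the 6 TB precomputed table:

```python
# Toy stand-in for the key fob cipher: a made-up 16-bit keyed mix.
def toy_cipher(challenge: int, key: int) -> int:
    x = (challenge ^ key) & 0xFFFF
    return (x * 0x9E37 + key) & 0xFFFF

secret_key = 0xBEEF                            # the fob's unknown key
challenges = (0x1234, 0x5678)                  # two challenges sent by the fake "car"
pairs = [(c, toy_cipher(c, secret_key)) for c in challenges]

# Exhaustive search: the first (challenge, response) pair narrows the
# candidates, the second pins the key down (up to cipher collisions).
candidates = [k for k in range(2 ** 16)
              if all(toy_cipher(c, k) == r for c, r in pairs)]
print([hex(k) for k in candidates])            # secret key is among the survivors
```

With one pair many keys may survive; with two, the candidate set collapses to (usually) just the true key, which is why the attack only needs to coax two response codes out of the fob.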

Tesla have since responded by noting that cars sold after June 2018 have improved encryption and aren't vulnerable to this attack, and that owners of earlier cars can enable an option that requires a PIN code to be entered before driving. Owners could also take extra precautions, such as keeping the key fob in an RFID-blocking pouch. Tesla vehicles also have built-in GPS tracking, which may deter thieves.

The video below shows the attack in action, and a short overview paper by the researchers can be found here.

COSIC researchers hack Tesla Model S key fob