Moving to a digital key is not inherently a bad idea, as long as the new system is as good as, or better than, the old solution. But perhaps the new digital key specification has too many unknowns, says Craig Smith, transportation security research director, Rapid7.
This summer, the Car Connectivity Consortium (CCC) announced a new specification that’s being developed to allow car owners to turn their smartphones into digital keys. According to reports, the new system will be available to all CCC member companies, which include BMW, Honda and Toyota. Work is already underway to roll out the capability, which could include functions such as unlocking and locking, starting engines, and even sharing access to cars.
The car key fobs in use today have been plagued by problems, many of which stem from proprietary cryptography and the hope that bad guys won't understand the technology used to make the keys work. Will using a smartphone to perform vehicle key operations simply bring its own set of security issues and vulnerabilities?
Lack of standards means security gaps
Ultimately, the backend is set up to fail, as there are no standards for any of the technology pillars the digital key system is built on. And the use of proprietary algorithms and secure chips to hide the data from public view will simply mean that the flaws in current key fobs follow drivers into the digital space.
One key area of concern around the use of secure chips is the practice of placing private security data in a 'protected area'. Protected areas in the digital arena are only as strong as protected areas in real life: given enough time, somebody can break in.
Moving proprietary code onto chips also has its disadvantages. The old adage of ‘to protect software, move it into hardware’ doesn’t apply: rather than adding protections, it just makes the code harder to get at than if it were on a key fob. Attackers may need to use a scanning electron microscope (SEM) to study the chip, but this is no longer cost-prohibitive.
Hackers can rent the equipment from a university if they want it, and it’s not particularly expensive. They can also use open source tools like ChipWhisperer to perform complex attacks at low cost.
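Tools like ChipWhisperer work by measuring physical side channels such as power draw and timing. A minimal sketch of the class of flaw they are built to find (an illustrative example, not code from any real fob): a naive byte-by-byte secret comparison returns early on the first mismatch, so its running time leaks how many leading bytes of a guess are correct.

```python
import hmac


def naive_compare(secret: bytes, guess: bytes) -> bool:
    # Early exit: comparison time grows with the length of the
    # correct prefix -- a classic timing side channel that lets an
    # attacker recover a secret one byte at a time.
    if len(secret) != len(guess):
        return False
    for s, g in zip(secret, guess):
        if s != g:
            return False
    return True


def constant_time_compare(secret: bytes, guess: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where
    # the first mismatch occurs, removing the timing signal.
    return hmac.compare_digest(secret, guess)
```

Both functions give the same answers; only the naive one leaks information through how long it takes to give them.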
Security by obscurity
A more pressing issue, however, is that keys embedded in hardware are difficult to update. If a key is extracted by a hacker, or leaked by a disgruntled or absent-minded employee, there is no easy way to revoke or replace it.
Once a key is exposed, there’s no longer any security in place – and the vehicle manufacturer is left with no recourse other than a recall if they want to fix the problem.
Yes, ‘secret algorithms’ will provide protection against this for a while, but for how long? Anything can be cracked with enough effort, time and money – and once it is, and the details have been published, you can’t put the genie back in the bottle.
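To see why a leaked key is game over, consider how fob authentication typically works: a challenge-response over a shared symmetric key, so whoever holds the key *is* the fob as far as the car can tell. A minimal sketch (hypothetical names and scheme, not any vendor's actual protocol):

```python
import hashlib
import hmac
import secrets

# Hypothetical: one symmetric key burned into both car and fob at manufacture.
SHARED_KEY = secrets.token_bytes(16)


def car_challenge() -> bytes:
    # Car sends a fresh random nonce to whatever claims to be the fob.
    return secrets.token_bytes(8)


def fob_response(key: bytes, challenge: bytes) -> bytes:
    # Fob proves possession of the key by MACing the challenge.
    return hmac.new(key, challenge, hashlib.sha256).digest()


def car_verify(challenge: bytes, response: bytes) -> bool:
    # The car accepts ANY party that computes the correct MAC -- so an
    # attacker who extracted SHARED_KEY from the chip is indistinguishable
    # from the legitimate fob, and only replacing the key fixes that.
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The scheme itself is sound; the point is that its entire security rests on the key staying secret, which is exactly what a decade of physical access makes hard to guarantee.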
A vehicle spends around 11.5 years on the road, supported by the manufacturer. That means car makers are responsible for the security of their keys for more than a decade. That's a very long time to support a system, and a manufacturer cannot do it by relying on 'security by obscurity'. They must assume the bad guy will eventually see their code, and design the system so that it remains secure even then.
Open standards on the chip
What would work, however, is the introduction of an open standard and methodology that can be implemented in a chip, and openly reviewed and updated. Rather than trying to build another mystery black box to protect vehicles, this will enable data to be shared widely, so that exploits can be understood and quickly patched out of existence.
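One concrete property an open, updatable design can provide is key rotation. As a sketch (a hypothetical design, assuming an HKDF-style derivation, not part of the CCC specification): derive the key actually deployed to a vehicle from a master key plus a version counter, so a compromised key can be retired by bumping the counter rather than recalling hardware.

```python
import hashlib
import hmac


def derive_vehicle_key(master_key: bytes, version: int) -> bytes:
    # HKDF-style derivation via HMAC-SHA256: if version N of the key
    # leaks, the manufacturer rotates to version N+1. Security comes
    # from the published scheme and the master secret, not from an
    # unreviewable 'secret algorithm'.
    info = b"vehicle-unlock-key|" + version.to_bytes(4, "big")
    return hmac.new(master_key, info, hashlib.sha256).digest()
```

Because the construction is public, its weaknesses can be found and fixed in the open, which is the whole point of the approach described above.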
A good example of this is Atmel, the first company to release an open immobiliser standard, which was then subjected to academic scrutiny by Stefan Tillich and Marcin Wojcik of the University of Bristol. That scrutiny made for a better final product, and the approach ought to be encouraged throughout the industry.
The digital key system specification is still in its early stages – and manufacturers are absolutely not required to implement proprietary algorithms or rely on secure chips to keep their secrets. That is up to them to decide. This is, after all, only the first version of the specification, and in future iterations we’ll hopefully see less of an emphasis on ‘security through obscurity’.
The author of this blog is Craig Smith, transportation security research director, Rapid7