What are the top automotive cybersecurity vulnerabilities?

on Oct 13, 15 • by Jeff Hildreth • with No Comments

How can nearly 90 percent of all vulnerabilities be traced back to just the top ten? If you missed our recent webinar, you'll want to read this...

In our webinar, “The top ten automotive cybersecurity vulnerabilities of 2015,” we took a deep dive into the most common vulnerabilities to see how they can be discovered in real software, what the problems might look like in code, and what simple changes can be made to coding style or processes in order to avoid them. Missed the webinar?

Watch it now.

We had so much great content that we actually ran out of time. In this blog post I’ll recap what we didn’t get to, so we don’t leave any attendees wondering what was coming next. And in the weeks to come I’ll do a deeper dive into each vulnerability, so people who didn’t attend the webinar can still protect themselves from these common problems.

What we couldn’t get to during the webinar

There are four best practices to ensure you’re protected from common vulnerabilities: clean design, methodical process, careful analysis, and good tools.

Clean design: Make a design that cleanly separates processes with different security needs, and that has a tightly controlled interface between components. Don’t provide more access to system components than is strictly necessary. And don’t reinvent the security wheel—use tried and true technologies to authenticate, encrypt, and secure your product. If you’re at all in doubt about the security of your design, don’t be afraid to ask for assistance from a security professional.

Methodical process: Process only works when you’re following it. Don’t let process take all the joy out of programming, but let it help create a safety net for you and the software you produce. Knowing that your process can catch silly mistakes doesn’t make you careless—it makes you confident.

Careful analysis: Make sure that you think carefully about every place that a hacker can approach the system. Don’t be trapped by conventional thinking about the existing interfaces you’re designing—if you didn’t have the source and had to reverse engineer your way into your own box, what attack vectors would you attempt? Carefully examine those and make sure that they’re as resilient as possible, and where your program is consuming data, don’t take anything for granted.

Good tools: Tools are there to help you; many of the errors found by static code analysis translate directly into closed vulnerabilities. Using tools to clean up sloppy coding practices results in tighter, more secure code. And many tools have special configurations designed to look for specific security problems.

Tools only work if you use them, so make sure they’re part of your processes. You need tools you can rely on during all stages of your development lifecycle.

Build: Integrate your tools into your build process from the very beginning. You get consistent checks in your incremental daily builds, and you avoid the daunting task of retrofitting tools into a mature build process later. Don’t wait until near the end of the project, when high inertia and the risk of introducing bugs will lead you to opt out of many of the little fixes the tools suggest.
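As one illustrative sketch of that wiring (the file names and the `scan-build` invocation are examples, not a prescription; substitute whatever analyzer your team uses), a Makefile can make the analysis pass a prerequisite of the default build so it runs on every incremental build:

```make
# Illustrative Makefile fragment: static analysis as part of the default
# build, not an optional extra step near the end of the project.
CC     := gcc
CFLAGS := -Wall -Wextra -Werror    # treat warnings as errors from day one

all: analyze app

app: main.c
	$(CC) $(CFLAGS) -o $@ $<

# scan-build drives Clang's static analyzer; --status-bugs makes the
# build fail when the analyzer reports a bug.
analyze: main.c
	scan-build --status-bugs $(CC) $(CFLAGS) -c $< -o /dev/null
```

The key design choice is that `all` cannot complete without the analysis pass, so nobody has to remember to run the tool.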

Validation: Validation depends on input from not only your code, but code that you rely on. Find tools that can help you flow integrity checks throughout the entire integration process. And don’t stop running tools just because you’ve hit production. Rely on them to double-check bug fixes or feature additions after your golden build to make sure that any new code added adheres to the same high quality bar you set initially.

Don’t ignore black boxes; they’re part of the system design too. Things like font engines, web browsers, PDF viewers, speech recognition engines, graphics toolkits, and Adobe Flash and AIR all need to be considered when you’re developing. Just because that code doesn’t come from you doesn’t mean it can’t be hacked. Look for ways to validate any inputs those modules receive, or to isolate those components as much as possible from the rest of the system, so that any inadvertent security breach cannot propagate far.

Bugs can be found in very stable code, so plan on over-the-air updates to address deployed systems. Paraphrasing a famous security quote: a software product is never 100 percent free of security risk unless it’s disconnected from every possible input and encased in a block of concrete, and even then it’s doubtful. It is a maxim of the security world that security updates are a necessary evil because software is never perfect. Cars (or any embedded system, for that matter) are no different: without a way to issue security patches, you have already lost the game. Build your software processes assuming that production does not stop software development, and make sure the system has the technology and infrastructure in place to receive updates.

Finally, we were asked a great question that we didn’t have time to answer. The question was:

Would you advocate encrypted logfiles?

The answer, given by my co-presenter Andy Gryc:

Ideally, you would have log files encrypted (with industry standard encryption, not just XOR/shift) and located on a RAM disk or other temporary storage so they aren’t persistent. Decryption can take place with developer tools that have the appropriate keys pre-baked in or configured through a developer/target key pairing process. Any measure like this makes it significantly more difficult to derive any value from inadvertently leaked information.

Of course, encrypted logs are not always convenient for debugging purposes, especially for board bring-up, time-sensitive logs, or long-duration problems. Another option is to encrypt the filesystem where any configuration or logging data is stored. Anything you can do to put barriers between a hacker and your data raises the cost of an attack. Make careful choices about where you can live with encrypted logs and where you absolutely can’t, and minimize any information left out in the clear.

You may be surprised to know that awareness of these top ten issues can actually help with nearly 90 percent of all vulnerabilities in embedded software. So congratulations on taking the first big step. Close that other 10 percent by relying on the right tools. Also consider assessing where you stand right now.
