The development of embedded software for automotive applications has changed dramatically in recent years, and the connected car now finds itself under the spotlight because of its vulnerability to attack by malicious hackers. The automotive industry’s response has been to propose a consolidated cybersecurity framework, with a multi-faceted approach to the defence of the connected vehicle. In this article, LDRA focuses on one piece of that complex jigsaw puzzle, reflecting on the fact that, left unchecked, an increasing volume of application software will inevitably lead to a proportionate increase in the number of system vulnerabilities associated with software defects and design flaws.
Any system providing an interface to the outside world has the potential to contain security vulnerabilities. In particular, any accessibility via the internet requires a strategy to deal not only with a few malicious specialists, but with a whole world of hackers. A cursory search of historic news articles throws up dozens of examples of why that is also a safety threat. When security researcher Barnaby Jack used a modified antenna and software in 2011 to wirelessly attack and take control of Medtronic’s implantable insulin pumps, he demonstrated how such a pump could be commanded to release a fatal dose of insulin. A Polish teenager adapted a TV remote control in 2008 to hack into the Lodz tram system and use it as a giant train set. Miller and Valasek’s work, “Remote Exploitation of an Unaltered Passenger Vehicle”, is a famous example of how similar threats are very real in the automotive world.
In response to such risks, and to lay down the foundations of a safe and secure ecosystem, the SAE (Society of Automotive Engineers) has introduced a set of guidelines in its SAE J3061 publication, “Cybersecurity Guidebook for Cyber-Physical Vehicle Systems”, where a system is defined as “a collection of hardware and software to perform a function or functions in a vehicle”.
SAE J3061 provides “a cybersecurity process framework and guidance to help organisations identify and assess cybersecurity threats and design cybersecurity into cyber-physical vehicle systems throughout the entire development lifecycle process”. This framework can be tailored to suit vehicle and organisational development processes throughout the product lifecycle from concept, through development, production, utilisation (operation) and support (service), and on to retirement (decommissioning). (1) illustrates the relationships between the security concepts, stakeholders and actors.
SAE J3061 is complementary to the long-adopted ISO 26262 standard, which promotes the functional safety of on-board electrical and electronic systems. ISO 26262 is itself an industry-focused adaptation of the more generic IEC 61508 standard.
Initially, a system-level vulnerability analysis is performed to identify potential threats, and cybersecurity requirements are derived from it, (2). These requirements are allocated to software and/or hardware sub-systems so that they can later be verified in each of those sub-systems, and subsequently at an integrated system level.
Software cybersecurity requirements are referenced throughout the development and implementation of software units and modules. Software vulnerability analysis generally focuses on software architecture and source code review. During source code review, the code implementation must be checked for vulnerabilities such as those identified in the CWE (Common Weakness Enumeration) list, the CVE (Common Vulnerabilities and Exposures) list, and elsewhere.
Manual code review is challenging, labour-intensive and error-prone, and so the use of a qualified tool is advisable. Where a vulnerability is highlighted, such a tool can also guide the user on replacing it with a compliant, secure coding construct.
Software vulnerability analysis generates a list of possible vulnerabilities, security flaws, and their trigger conditions. That list includes any software defects that could potentially expose the system or application to attack and it equips the development team with the information required to devise a mitigation plan.
The detail of software vulnerability analysis differs between programming languages. The C programming language is designed to be an efficient high-level language with a small footprint, and the later C++ language introduced additional object-oriented features. They share vulnerabilities such as the ability to write or copy beyond the boundaries of an array, an inability to detect integer overflows and truncations, and the potential to call functions with the wrong number of arguments. Neither language is type safe, leading to the possibility of undesirable behaviour and security vulnerabilities when typing inconsistencies are introduced.
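Two of those weaknesses can be seen in a minimal C sketch; the function names are illustrative only, and the truncation result assumes a typical 16-bit two’s-complement short:

```c
#include <string.h>

/* Weakness: strcpy() writes past 'dest' whenever 'src' holds more
 * than 15 characters plus the terminator (CWE-120, classic buffer
 * overflow). */
void copy_name_unchecked(char *dest, const char *src)
{
    strcpy(dest, src);                /* no bounds check at all */
}

/* Mitigated: the copy is bounded by the destination size and
 * termination is guaranteed, so over-long input is truncated
 * rather than overflowing the buffer. */
void copy_name_bounded(char dest[16], const char *src)
{
    strncpy(dest, src, 15);
    dest[15] = '\0';
}

/* Weakness: silent integer truncation (CWE-197). A wider value is
 * narrowed with no run-time diagnostic; on a 16-bit
 * two's-complement short, 100000 becomes -31072. */
short narrow(long value)
{
    return (short)value;
}
```

Neither the unchecked copy nor the truncating cast is rejected by a standard C compiler, which is precisely why such constructs must be found by review or by tooling.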
Java shares few of these language-specific vulnerabilities with C and C++, but like them it is susceptible to design- and implementation-level flaws.
CODING STANDARDS & SOFTWARE WEAKNESS LISTS
The SEI CERT C, C++, and Oracle Java secure coding standards are designed to eliminate insecure coding practices and undefined behaviours that can lead to exploitable vulnerabilities and unreliable applications. Compliance with the SEI CERT C, C++, or Java coding standards is a basic mitigation strategy likely to be specified during vulnerability analysis, as part of the cybersecurity lifecycle.
Similarly, MISRA C and MISRA C++ each define a subset of the respective language in which the opportunity to make mistakes is either reduced or removed. They are designed for use in the development of any safety- and/or security-critical embedded application. Qualified static analysis tools (TCL1 or TCL2 as per ISO 26262) are designed to automate the verification of source code compliance with such standards.
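As an example of the kind of transformation such standards demand, consider CERT C rule ERR34-C, which requires that string-to-number conversions detect errors. The sketch below is illustrative; the function names and the 0–300 plausibility range are hypothetical choices, not taken from any standard:

```c
#include <errno.h>
#include <stdlib.h>

/* Non-compliant with CERT C ERR34-C: atoi() gives no indication of
 * conversion failure or overflow, so malformed input silently
 * yields a wrong value. */
int parse_speed_unsafe(const char *text)
{
    return atoi(text);
}

/* Compliant alternative: strtol() reports range errors and exposes
 * trailing garbage, so the caller can reject malformed input.
 * Returns 0 on success, -1 on rejection. */
int parse_speed(const char *text, int *out)
{
    char *end = NULL;
    errno = 0;
    long value = strtol(text, &end, 10);
    if (end == text || *end != '\0')
        return -1;                 /* not a complete decimal number */
    if (errno == ERANGE || value < 0 || value > 300)
        return -1;                 /* outside the plausible range */
    *out = (int)value;
    return 0;
}
```

A static analysis tool enforcing CERT C would flag the first form and accept the second.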
In contrast to the CERT and MISRA coding standards, the CWE provides a detailed list of known weaknesses in software, whether they result from language misuse or from poor application implementation. CERT guidelines contain cross-references to CWE entries where their application can mitigate such an identified weakness. To illustrate how coding standards contribute to system security, it is useful to consider how they might have helped prevent a well-known hack. The “Blaster Worm” (also known as Lovsan, Lovesan or MSBlast) was a computer worm that spread on computers running Windows NT 4.0, Windows 2000, Windows XP, and Windows Server 2003. It was first noticed on August 11, 2003, and the number of infections peaked two days later.
The worm exploited a buffer overflow weakness in the DCOM RPC service, allowing remote attackers to execute arbitrary code via a malformed message. The weakness manifested itself in the copying of the server name from an activation request on port 135: the buffer allocated for that server name was of the maximum size allowed for a computer name, but the length of the incoming name was neither checked nor otherwise mitigated, so a longer, attacker-supplied name overflowed the buffer.
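The flawed pattern can be sketched as follows. This is not the actual Windows code; the names and the buffer size are hypothetical, chosen only to show the unchecked copy and its bounded replacement:

```c
#include <stddef.h>
#include <string.h>

#define MAX_COMPUTERNAME 32u  /* hypothetical fixed buffer size */

/* The pattern behind the flaw: the destination is sized for a
 * legal computer name, but nothing stops a longer name taken from
 * the request from being copied into it (CWE-120 style). */
void get_machine_name_unsafe(char *dest, const char *request)
{
    strcpy(dest, request);    /* copies to the terminator,
                                 regardless of dest's size */
}

/* Mitigated version: the copy is bounded by the destination size
 * and over-long names are rejected outright.
 * Returns 0 on success, -1 on rejection. */
int get_machine_name(char dest[MAX_COMPUTERNAME],
                     const char *request)
{
    size_t len = strnlen(request, MAX_COMPUTERNAME);
    if (len >= MAX_COMPUTERNAME)
        return -1;            /* reject instead of overflowing */
    memcpy(dest, request, len + 1);
    return 0;
}
```

The unsafe form compiles cleanly, which is why this class of defect survives into shipped products unless review or analysis tooling catches it.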
Successful exploitation of this vulnerability allowed an attacker to run arbitrary code with local system privileges on a compromised machine, enabling the Blaster Worm to spread through the internet and affect millions of computers. The commercial losses due to the infection were reportedly around $500 million (3).
Internal code construct weaknesses of the kind exploited by the Blaster Worm can be identified during the development phase by a static analysis tool capable of enforcing the CERT C/C++ and MISRA standards and of detecting CWE entries, prompting their replacement with fail-safe code.
For example, parsing the DCOM RPC code exploited by the Blaster Worm through LDRA’s TBvision highlights a number of weaknesses as referenced by CERT C++ and the CWE, (4) & (5).
The automotive industry’s response to the challenges posed by the connected vehicle has been to propose a consolidated cybersecurity framework, with a multi-faceted approach to the defence of the connected vehicle. The resulting SAE J3061 document is pivotal to ensuring the security and safety of these vehicles and the people affected by them.
This security framework and its associated recommendations should be observed from the very beginning of the software development lifecycle. With regard to the development of application code, the use of a qualified static analysis tool helps developers and integrators alike to identify defects, flaws and vulnerabilities ahead of time, and hence to produce a fail-safe and secure system implementation.
The ever-evolving techniques of hackers guarantee that the best practices of today will need modification or supplementation tomorrow. If they are to provide safe and secure products for the future, vehicle manufacturers, OEMs and suppliers alike therefore need to closely observe the responding evolution of security standards and guidelines, and to build a skills base capable of implementing their recommendations.
MEDCITY News, “Hacker shows off vulnerabilities of wireless insulin pumps”, March 2012, http://medcitynews.com/2012/03/hacker-shows-off-vulnerabilities-of-wireless-insulin-pumps/
The Telegraph, “Schoolboy hacks into city’s tram system”, January 2008, http://www.telegraph.co.uk/news/worldnews/1575293/Schoolboy-hacks-into-citys-tram-system.html
Dr. Charlie Miller & Chris Valasek, “Remote Exploitation of an Unaltered Passenger Vehicle”, August 2015, http://illmatics.com/Remote%20Car%20Hacking.pdf
SAE J3061: Cybersecurity Guidebook for Cyber-Physical Vehicle Systems
:: Section 8.1: Applying a Cybersecurity Process Separately with Integrated Communication Points to a Safety Process
:: Section 8.6: Product Development at the Software Level
:: Section 8.6.4: Software Vulnerability Analysis
:: Section 8.6.5: Software Unit Design and Implementation
:: Section 8.6.6: Software Implementation Code Reviews
:: Appendix E: References to some available vulnerability databases and vulnerability classification schemes
:: Section 126.96.36.199: Product Development: Software Level (Figure 10-1 Software Vulnerability Analysis)
ISO 15288 Systems and software engineering – systems life cycle processes
ISO 26262 Road vehicles – functional safety
IEC 61508 Functional safety of electrical/ electronic/ programmable electronic safety-related systems
Robert C. Seacord, Secure Coding in C and C++, Section 1.2 Security Concepts (Figure 10-2 Security Concepts and Interaction between various stakeholders/actors)
PRIYASLOKA ARYA is Senior Technical Manager at LDRA Certification Services in Bengaluru (India)
SHRIKANT SATYANARAYAN is Technical Manager at LDRA Technology Pvt Ltd in Bengaluru (India)
DEEPU CHANDRAN is Senior Technical Consultant at LDRA Technology Pvt Ltd in Bengaluru (India)