Adam Cohen

DevOps and the Fate of Secure Software Development

Reconciling Technology Development, Security and the Lawyer's Role (originally published in the Cybersecurity Law Report)

No matter how much new law is written on the topic of cybersecurity or data privacy, technology development refuses to slow down. The runaway freight train of information technology smashes through the legal checkpoints we set up as if they were wisps of smoke. In the race to market, software products are routinely released to consumers with "bugs" – including security vulnerabilities. By contrast, the legislative process is not accelerating at all.

Inevitably, the distance between technology and law is growing, with serious implications for risk management. Notwithstanding the cornucopia of associated risk, including potential personal liability for corporate officers and directors, lawyers are rarely invited to discussions about software testing.

Legal counsel might prefer to avoid getting involved in this conflict, but professional responsibility calls for identifying emerging risks that could crystallize into future legal disputes. In the era of "software-defined-everything," these risks increasingly involve software development. Legal claims arising from software defects can take almost any form, ranging from "garden variety" contract and tort claims to consumer fraud and unfair or deceptive trade practices to regulatory enforcement, and even criminal prosecution. In-house and outside counsel must anticipate and prepare for this litigation blitzkrieg.

Business lawyers, therefore, need some general understanding of software development security and how approaches to testing software are evolving. They cannot allow the “Sec” in DevSecOps to be a “black box” that falls outside the dominion of legal scrutiny and compliance.

The Significance of Software Development Security

A global cybersecurity "arms race" spurs allocation of vast resources to development of a broad range of technology, offensive and defensive. A major domain of focus for both sides involves identifying weaknesses, a/k/a vulnerabilities, in software, a/k/a applications (the latter term may be defined more narrowly as software built to fulfill a specific purpose for an end user; in this sense, "end user" is distinguished from its literal meaning to carve out software developers from the typical user), and exploiting (hacking) or remediating (fixing) them.

The scope of this phenomenon is witnessed by, inter alia, two observable features of the cybersecurity landscape:

  1. organizations with effectively unlimited resources maintain bug bounty programs, where rewards are offered for discovery and reporting of zero-day (previously unknown) vulnerabilities; and

  2. a vast gap persists between employment demand and supply in cybersecurity, despite the increasing numbers of information technology professionals devoting their careers to looking for undisclosed software vulnerabilities as a means of reliable income – whether through bug bounty programs in one form or another, public or private, as part of a defensive cybersecurity effort, or through exploiting newly discovered vulnerabilities illegally, e.g., to gain further access to a target system connected to the enterprise offering the program.

Moreover, a cursory review of public data breaches shows that exploiting flawed and unremediated programming is frequently essential to the attack methodology. For example, in the Equifax breach, attackers reportedly exploited an unpatched Apache Struts vulnerability to access a crucial server (compounded by, among other things, the failure of a scanning tool to detect indicators of compromise).

Even where social engineering in the form of phishing begins a cyber attack, malware unleashed when a hapless user/phish clicks on an email attachment or link often includes capabilities to exploit software vulnerabilities, in aid of achieving its ultimate aim.

DevOps Phenomenon Creates Reduced Visibility

DevOps is a mash-up of enterprise processes where software development, deployment to production and operational use are seamlessly integrated. This abstraction is often compared to other, "agile," software engineering concepts. Agile software development is in some ways an evolutionary predecessor to DevOps: it is a software development model that broke the paradigm of traditional models (which have been depicted as structured, linear or hierarchical), with the promise of unleashing unprecedented creativity, speed and, of course, profit. "Continuous Integration" and "Continuous Delivery (or Deployment)" (collectively "CI/CD") make "Agility" a 24/7 operation that extends beyond development and into production (operation or commercial availability).

DevOps is largely engendered by the major public cloud service providers (tech titans like Amazon, Microsoft and Google), which provide developers a Pandora's box of infinitely and automatically scalable environments, platforms and tools at pricing that is elastic with use and requires virtually no upfront investment. Moreover, most organizations simply have chosen to rely on the development practices of their vendors and third-party service providers much of the time, as evidenced by recent studies showing the overwhelming adoption of software-as-a-service by business enterprises, such as the 2018 IDG Cloud Computing Study.

This environment requires a fundamental change in approach to risk management, including by cybersecurity and legal professionals, because visibility, long considered an essential foundation for security, is radically reduced. Factors reducing visibility include not only the use of third party infrastructure, platforms and applications, but also the speed and automation of the development process. There is simply not enough time to perform traditional security testing and monitoring if we are to continue accelerating the acceleration of development. This raises the question of what can be done about software security.

Traditional Secure Software Development

DevSecOps is the answer to addressing security risk in the current fast-paced development landscape. In order to comprehend the benefits of DevSecOps, however, it is important to understand the background of traditional secure software development.

Old-Fashioned Lifecycle of Traditional Secure Software Development

A review of the classical secure software development lifecycle (SSDLC) conveys a clear sense of how painfully meticulous and manual (i.e., old-fashioned) the traditional process looks when compared with DevOps, where the brake lines have been cut. Traditional SSDLC identifies threats and risks within the framework of well-established models like STRIDE (spoofing identity, tampering with data, repudiation, information disclosure, denial of service and elevation of privilege) and DREAD (damage potential, reproducibility, exploitability, affected users and discoverability), before doing any testing.

Microsoft's tome on secure development emphasizes that security cannot be "tested into" a product: "[t]esting is time consuming, laborious and expensive and therefore its ability to identify vulnerabilities is limited," as stated in Writing Secure Code, by Howard and LeBlanc. Given that the later in the development process a problem is discovered, the (much) more expensive it is to fix, the treatise ("required reading at Microsoft," according to Bill Gates) recommends involving testers early in design and threat-modeling, and reviewing specifications for security issues.

In the traditional process, security testing plans are based on the threat model. A testing plan might include, inter alia, these steps:

  1. decompose application into fundamental components;

  2. identify component interfaces;

  3. rank interfaces by potential vulnerability risk;

  4. ascertain data structures used by each interface; and

  5. find security problems by injecting mutated data.
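The last step, injecting mutated data into a component interface, is the essence of what is now commonly called "fuzz" testing. The sketch below is a minimal illustration in Python; the `parse_record` interface and its `key=value` format are hypothetical stand-ins for a real component under test, not drawn from any particular application:

```python
import random

def mutate(data: bytes, n_flips: int = 4, seed: int = 0) -> bytes:
    """Return a copy of data with a few randomly altered bytes."""
    rng = random.Random(seed)
    buf = bytearray(data)
    for _ in range(n_flips):
        buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

def parse_record(data: bytes) -> str:
    """Hypothetical component interface under test; expects b'key=value'."""
    key, _value = data.split(b"=", 1)  # raises ValueError if '=' is missing
    return key.decode("ascii")         # raises UnicodeDecodeError on non-ASCII bytes

def fuzz_interface(valid_input: bytes, rounds: int = 100) -> list:
    """Feed mutated variants of a known-good input to the interface and
    record any mutation the parser fails to handle gracefully."""
    failures = []
    for i in range(rounds):
        sample = mutate(valid_input, seed=i)
        try:
            parse_record(sample)
        except (ValueError, UnicodeDecodeError):
            pass                         # rejected cleanly: acceptable behavior
        except Exception as exc:         # unexpected crash: potential vulnerability
            failures.append((sample, exc))
    return failures
```

The point of the exercise is the distinction in the last lines: inputs the interface rejects in a controlled way are acceptable, while any unanticipated failure mode is a candidate vulnerability worth investigating.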

Each threat from the threat model should be accounted for in the test plan along with the expected result for each test. A testing plan should also consider complexities such as the impact of a particular component of the application on others and backup plans for development and further testing where indicated. STRIDE and other models help organize the variety of attacks used to test applications. The categories represented by the acronym are abstract or conceptual; the details (the “how”) of carrying them out will be highly specific and tailored to technical context.

For example, the attacks/tests used for "spoofing identity," the "S" in STRIDE, might include attempts to do the following:

  • force the application to bypass authentication;

  • force the authentication protocol to use a less secure version;

  • view user credentials in transit or at rest;

  • replay security tokens like cookies to circumvent authentication; and

  • brute force credentials.

Bear in mind that even subtle changes in error messages can help the attacker gain useful information about the target.
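To make the error-message point concrete, here is a minimal sketch (the user store, passwords and error strings are invented for illustration): a login routine that returns different errors for an unknown user versus a wrong password lets an attacker enumerate valid usernames, while a uniform error does not.

```python
# Hypothetical login checks; the user store and error strings are invented.
VALID_USERS = {"alice": "s3cret"}

def login_leaky(user: str, password: str) -> str:
    """Distinguishable errors let an attacker enumerate valid usernames."""
    if user not in VALID_USERS:
        return "unknown user"
    if VALID_USERS[user] != password:
        return "wrong password"
    return "ok"

def login_uniform(user: str, password: str) -> str:
    """One generic error, regardless of which check failed."""
    if VALID_USERS.get(user) != password:
        return "invalid credentials"
    return "ok"

def leaks_usernames(login) -> bool:
    """Spoofing-style test: do a bad username and a bad password
    produce different error messages?"""
    return login("mallory", "x") != login("alice", "x")
```

A security tester probing the "S" in STRIDE would run exactly this kind of comparison against a real authentication endpoint.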

Traditional Approach Still a Legal Barometer

Techniques of testing software for security have been explored for decades and a wealth of literature has emerged reflecting lessons learned. Laws regulating software development security were written with this history in mind. Moreover, the traditional approach is most likely to be used as a barometer in evaluating compliance with laws regulating cybersecurity or business practices more broadly, many of which apply a standard of "reasonableness" or a similarly nebulous and moving target. When an adverse party, regulator, judge or other legal actor (jury, arbitrator, etc.) is assessing responsibility for a vulnerability that allows a massive data privacy breach, the developer will be hard-pressed to explain the failure to use any of the classic SSDLC techniques. Opposing or investigating lawyers will find a test that was not performed, that arguably could have led to discovery of the exploited vulnerability, and then cross-examine the developer about why it wasn't performed. Citing delays testing might impose on the road to profit may not be the most helpful response.

DevSecOps Keeps the Development Train Moving

In contrast to the laborious gait of traditional software testing, in DevSecOps, automated security tools integrated with the development environment promise to provide visibility into code as it moves through the “pipeline” towards release. There is no stopping the development train on its one-way journey to bring applications to customers or make them operational, even for testing. Instead, testing will have to be done while the train with no brakes is plunging downhill at a steep angle.

Traditionally, a key part of software development security was carefully planned security testing in a carefully segregated test environment, as distinct from "live" testing through the use of released software in a production environment. But the world is moving too fast for such old-fashioned ways. This is the age of DevOps, where there is no gap between development and operations.

We might be misunderstood to imply such a gap by inserting the "Sec" in DevSecOps, but the reality is that DevOps has swallowed security whole. It's not that we don't do security testing anymore, but we don't wait for security testing. Actually, we don't even slow down. Instead, we use technology that automates testing and integrates it with the integrated development and operations that are DevOps.

Satisfying Buyers’ Needs Creates Conflict

Summarizing the insatiable demand for compressing the SDLC timeline, Gartner, in its 2019 Magic Quadrant for Application Security Testing, states: “Buyers expect offerings to fit earlier in the development process with testing often driven by developers rather than security specialists, and tightly integrated as part of the build and release process. As a result, this market evaluation focuses more heavily on the buyer’s needs when it comes to supporting rapid and accurate testing that is capable of being integrated in an increasingly automated fashion throughout the software development life cycle (SDLC).”

Classically trained cybersecurity professionals will note that this DevOps approach defies a basic principle of cybersecurity and, in particular, software security testing, i.e., segregation of duties – which mandates that development is conducted by different personnel than those who do the security testing. Lawyers should readily appreciate the “conflict of interest.”

Three Categories of Application Security Testing

Gartner divides the application security testing (AST) solutions market into three categories – static application security testing (SAST), dynamic application security testing (DAST) and interactive application security testing (IAST) – which can be delivered as a tool, a subscription service or a combination of the two. Gartner defines these styles of AST as follows:

SAST ...analyzes an application’s source, bytecode or binary code for security vulnerabilities, typically at the programming and/or testing software life cycle (SLC) phases.

DAST ...analyzes applications in their dynamic, running state during testing or operational phases...simulates attacks against an application (typically web-enabled applications and services), analyzes the application's reactions and, thus, determines whether it is vulnerable.

IAST ...combines elements of SAST and DAST simultaneously...typically implemented as an agent within the test runtime environment...that observes operation or attacks and identifies vulnerabilities.
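As a rough illustration of what a SAST-style check does, the sketch below scans source text for what look like hardcoded credentials. Real SAST tools parse and analyze code far more deeply; the regex, variable names and sample snippet here are invented for illustration only:

```python
import re

# Toy SAST-style check: flag lines of source text that appear to embed
# credentials. Real SAST tools parse code; this regex is only illustrative.
SECRET_PATTERN = re.compile(
    r'(password|passwd|api_key|secret)\s*=\s*["\'][^"\']+["\']',
    re.IGNORECASE,
)

def scan_source(source: str) -> list:
    """Return (line_number, line) pairs that look like hardcoded secrets."""
    return [
        (lineno, line.strip())
        for lineno, line in enumerate(source.splitlines(), start=1)
        if SECRET_PATTERN.search(line)
    ]

sample = '''db_host = "db.example.com"
password = "hunter2"
timeout = 30'''
```

Because a check like this reads source code rather than a running application, it can run automatically on every commit, which is precisely why SAST fits the "earlier in the development process" demand Gartner describes.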

Gartner urges that RASP and SCA capabilities are fundamental must-haves for secure development and favors testing solutions that include these capabilities. RASP stands for runtime application self-protection (i.e., a technology allowing applications to protect themselves from vulnerability exploitation at runtime). Software composition analysis, or SCA, is used to identify the open-source and third-party components in use in an application and their known security vulnerabilities.
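In the same spirit, an SCA-style check can be sketched as a lookup of an application's declared dependencies against a database of known-vulnerable versions; the package names and advisory labels below are made up, while real SCA tools draw on curated vulnerability feeds. This is the class of check that would flag an unpatched component like the Apache Struts instance in the Equifax breach:

```python
# Toy SCA-style check: compare an application's declared dependencies
# against a database of known-vulnerable versions. The package names and
# advisory labels below are made up for illustration.
KNOWN_VULNERABLE = {
    ("webframework", "2.3.1"): "ADVISORY-2017-0001 (remote code execution)",
    ("xmlparser", "1.0.4"): "ADVISORY-2018-0042 (XXE injection)",
}

def audit_dependencies(deps: dict) -> dict:
    """Map each vulnerable (name, version) pair found in deps to its advisory."""
    return {
        (name, version): KNOWN_VULNERABLE[(name, version)]
        for name, version in deps.items()
        if (name, version) in KNOWN_VULNERABLE
    }

app_deps = {"webframework": "2.3.1", "jsonlib": "3.1.0"}
```

The value of SCA lies in the currency and completeness of the vulnerability database, not in the lookup itself, which is trivial by comparison.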

Accordingly, software development security, like everything else these days in IT, is “solved” by automated technology, provided, or even operated, by third-party service providers. It is too early to tell whether or to what degree the new DevSecOps solutions will be effective, especially in comparison with the classical SSDLC.

Unfortunately, we may never find out given how rapidly IT changes – the relentless cyber-arms race makes the lifespan of any security technology or technique, offensive or defensive, uncertain in duration but certainly brief.

The Lawyer’s Role in Technology Development

There is no avoiding the fact that providing effective legal counsel increasingly requires understanding a broadening scope of topics related to information technology. This is pretty easy to understand when you consider the trajectory of technology in every aspect of our lives. And further reflection inevitably leads to the realization that software applications are at the core of modern enterprise IT (i.e., "software-defined everything").

Accordingly, software or application development security has been considered one of the primary domains of cybersecurity since early in the field's development. Through this status in the classic cybersecurity literature, software development security found its way into the law. Sometimes the recognition is direct and explicit in the form of express rules requiring development security, and other times it is obscured by the clouds of workhorse "reasonableness" standards for cybersecurity, applied in laws written by lawmakers oblivious to the finer points of software development security.

Avoid the Black Box: Understand Technologies and Solutions

Lawyers cannot allow the "Sec" in DevSecOps to be a "black box" that falls outside the dominion of legal scrutiny and compliance. Given that any legal process questioning a client's software development security is likely to involve evaluating it against traditional, established SSDLC standards different from DevSecOps (the latter arguably too new to even have established standards), lawyers need to get an understanding of the technologies their DevSecOps clients are using so that they can explain how these "solutions" manage secure development risk at least as well as the "old" testing practices.

Ability to Explain and Defend the Process

A business enterprise's legal counsel should be able to describe the client's approach to software development security in a way that supports the defensibility of the approach under applicable legal standards. To the extent the client is relying on a technology solution, questions to answer include: What is it? How does it work? What security tests does it ostensibly perform? What verification/validation of its effectiveness is available? What specific risk assessment supports its use?

If a vulnerability is exploited and becomes the subject of a legal process or inquiry, questions will be asked about software security testing. For attorneys, mere understanding of the testing process is not enough – they must be able to explain it to a non-expert finder-of-fact, which involves other skills (possibly more challenging). Many business enterprises analyzing costs and assessing risks rely on in-house legal counsel to understand the company’s testing process, although they rely on outside litigation counsel when disputes become more formalized.

The resulting game of telephone, especially where the message is technical in nature as it is here, can introduce additional risk or challenges. For lawyers with clients practicing DevSecOps, there is a lot of explaining to do – and no long history of well-established best practices compiled in voluminous treatises. Potential harm from deploying bad code is no longer limited to loss of consumer or business partner goodwill or internal operational problems. Instead, software vulnerabilities that could or should have been identified by testing, and eliminated before deployment, play feature roles in litigation or regulatory enforcement with the potential to destroy business enterprise clients.

Managing Third-Party Risk

Usually, deploying one of the latest and greatest AST solutions involves bringing in yet another third party, introducing additional risk management complications (it is axiomatic in cybersecurity that complexity is antithetical to security). Regardless of what representations and agreements are in the contract with the service provider to protect the client, legal counsel may need to remind the client that its legal obligations to others are unaffected. In other words, the client must be made aware that using a third-party product or service to facilitate legal compliance does not transfer the risk of non-compliance to the third party. Ultimately, the choices the client makes in addressing software development security should be traceable to roots in a risk assessment process (as opposed to, e.g., lower price, higher profit, etc.).

Appreciation of a Holistic Vulnerability Management Program

As key enterprise risk management professionals, lawyers should appreciate the broader topic of vulnerability management and the potential benefits of a programmatic approach, i.e., establishing a holistic vulnerability management program. An increasing number of organizations have allocated resources to an alternative, supplement or complement to testing, in the form of the "bug bounty" program. This generally involves offering monetary rewards for the identification of previously unknown vulnerabilities in the offeror's software.

The details of implementing a bug bounty program and related legal issues are beyond the scope of this article, along with other aspects of enterprise vulnerability management broadly. However, appropriate context for understanding the role of testing, and the kinds of challenges DevOps presents for security and legal compliance, requires awareness of such alternative candidates for security budget resources. Moreover, while the relevance of secure development varies across industries, organizational structures, jurisdictions and other contextual variables, an ever-increasing number of companies that would not be considered members of the "technology industry" are heavily, if not existentially, dependent on, or at least engaged in, the "digital transformation" of their businesses – which typically involves some form of software development as a cornerstone.

Companies whose primary business is software, whether licensing, selling or providing related services, have a different risk calculus when it comes to software security (as it tends to dwarf all other risks). Moreover, the typical bug bounty program involves software that is already in production, or use. While it's nice to find flaws in software at this stage and important to issue patches for those flaws that are identified, it really isn't the same thing as finding them before deploying the flawed application into production. The fact that a software bug is identified through a bug bounty program and then patched doesn't mean that the bug hasn't been previously exploited by threat actors.