Dr. Gary McGraw is an authority on software security who has written many security books. This book, Software Security: Building Security In, is the third in a series.
While Exploiting Software: How to Break Code focuses on the black hat side of security, and Building Secure Software: How to Avoid Security Problems the Right Way focuses on the white hat side, this book brings the two perspectives together.
Chapter 1, Defining a Discipline, explains the security problem that we have with software. It also introduces the three pillars of software security:
- Applied Risk Management
- Software Security Touchpoints
- Knowledge
Chapter 2, A Risk Management Framework, explains that security is all about risks and mitigating them. McGraw argues the need to embed this in an overall risk management framework to systematically identify, rank, track, understand, and mitigate security risks over time. The chapter also explains that security risks must always be placed into the larger business context. The chapter ends with a worked out example.
Chapter 3, Introduction to Software Security Touchpoints, starts the second part of the book, about the touchpoints. McGraw uses this term to denote software security best practices. The chapter presents a high level overview of the touchpoints and explains how both black and white hats must be involved to build secure software.
Chapter 4, Code Review with a Tool, introduces static code analysis with a specialized tool, like Fortify. A history of such tools is given, followed by a bit more detail about Fortify.
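To make this concrete, here is a minimal sketch of the kind of implementation bug such tools are built to find (my example, not the book’s): a SQL query assembled by string concatenation from untrusted input, next to the parameterized version a tool like Fortify would typically recommend. The function and table names are made up for illustration.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Untrusted input flows straight into the query string: the kind of
    # SQL injection data flow a static analysis tool is built to flag.
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Parameterized query: the input is passed as data, not as SQL text.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()
```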
Chapter 5, Architectural Risk Analysis, shifts the focus from implementation bugs to design flaws, which account for about half of all security problems. Since design flaws can’t be found by a static code analysis tool, we need to perform a risk analysis based on architecture and design documents. McGraw argues that Microsoft mistakenly calls this threat modeling, but he seems to have lost that battle.
Chapter 6, Software Penetration Testing, explains that functional testing focuses on what software is supposed to do, and how that is not enough to guarantee security. We also need to focus on what should not happen. McGraw argues that this “negative” testing should be informed by the architectural risk analysis (threat modeling) to be effective. The results of penetration testing should be fed back to the developers, so they can learn from their mistakes.
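To illustrate what such a “negative” test can look like (my sketch, not the book’s), the hypothetical parse_amount function below is supposed to accept only plain decimal amounts, and the test asserts that hostile or malformed inputs are rejected rather than processed.

```python
import unittest

def parse_amount(text: str) -> int:
    # Hypothetical input handler: accept plain decimal digits only,
    # and reject everything else rather than trying to repair it.
    if not text.isdigit():
        raise ValueError("invalid amount")
    return int(text)

class NegativeAmountTests(unittest.TestCase):
    # Tests for what must NOT happen: malformed or hostile input
    # must be rejected, never silently accepted.
    def test_rejects_malformed_input(self):
        for bad in ["1; DROP TABLE accounts", "-1", "1.5e9", "", " 42"]:
            with self.subTest(bad=bad):
                with self.assertRaises(ValueError):
                    parse_amount(bad)

if __name__ == "__main__":
    unittest.main()
```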
Chapter 7, Risk-Based Security Testing, explains that while black box penetration testing is helpful, we also need white box testing. Again, this testing should be driven by the architectural risk analysis (threat modeling). McGraw also scorns eXtreme Programming (XP). Personally, I feel that this is based on some misunderstandings about XP.
Chapter 8, Abuse Cases, explains that requirements should not only specify what should happen, in use cases, but what should not, in abuse cases. Abuse cases look at the software from the point of view of an attacker. Unfortunately, this chapter is a bit thin on how to go about writing them.
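To give one hypothetical example of the idea: where a use case says “a member resets their password via an e-mail link”, the corresponding abuse case might say “an attacker requests password resets for other members and guesses or intercepts the reset links”, which then drives requirements for token randomness and expiry.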
Chapter 9, Software Security Meets Security Operations, explains that developers and operations people should work closely together to improve security. We of course already knew this from the DevOps movement, but security adds some specific focal points. Some people have recently started talking about DevOpsSec. This chapter is a bit more modest, though, and talks mainly about how operations people can contribute to the various phases of software development.
Chapter 10, An Enterprise Software Security Program, starts the final part of the book. It explains how to introduce security into the software development lifecycle to create a Security Development Lifecycle (SDL).
Chapter 11, Knowledge for Software Security, explains the need for catalogs of security knowledge. Existing collections of security information, like CVE and CERT, focus on vulnerabilities and exploits, but we also need collections for principles, guidelines, rules, attack patterns, and historical risks.
Chapter 12, A Taxonomy of Coding Errors, introduces the 7 pernicious kingdoms of coding errors that can lead to vulnerabilities:
- Input Validation and Representation
- API Abuse
- Security Features
- Time and State
- Error Handling
- Code Quality
- Encapsulation
It also mentions the kingdom Environment, but doesn’t give it number 8, since it focuses on things outside the code. For each kingdom, the chapter lists a collection of so-called phyla, with more narrowly defined scope. For instance, the kingdom Time and State contains the phylum File Access Race Condition. This chapter concludes with a comparison with other collections of coding errors, like the 19 Deadly Sins of Software Security and the OWASP Top Ten.
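To give a feel for what a phylum covers, here is a small sketch of a File Access Race Condition, also known as a time-of-check-to-time-of-use (TOCTOU) bug (my example, not the book’s): the file is checked first and opened later, leaving a window in which an attacker can swap the file underneath the program.

```python
import os

def append_audit_line_unsafe(path: str, line: str) -> None:
    # Time of check: is the file writable right now?
    if os.access(path, os.W_OK):
        # Time of use: between the check above and the open below, the
        # file at `path` can be replaced (e.g. with a symlink to a
        # sensitive file), so the check no longer applies to what is
        # actually opened.
        with open(path, "a") as f:
            f.write(line + "\n")

def append_audit_line_safer(path: str, line: str) -> None:
    # Narrower window: drop the separate access() check and let the
    # open itself fail under the process's effective privileges.
    try:
        with open(path, "a") as f:
            f.write(line + "\n")
    except OSError:
        pass  # report or log the failure instead of pre-checking
```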
Chapter 13, Annotated Bibliography and References, provides a list of must-read security books and other recommended reading.
The book ends with four appendices: Fortify Source Code Analysis Suite Tutorial, ITS4 Rules, An Exercise in Risk Analysis: Smurfware, and Glossary. There is a trial version of Fortify included with the book.
All in all, this book provides a very good overview for software developers who want to learn about security. In some places it doesn’t go into enough detail for you to apply the described practices, but it teaches you enough to know what to learn more about, and it ties everything together very nicely.
Filed under: Application Security, Books, Blogs, etc. Tagged: 19 deadly sins, 7 pernicious kingdoms, abuse case, best practice, black hat, CERT, code review, CVE, design flaw, DevOps, DevOpsSec, exploit, OWASP top 10, penetration testing, risk management, Security Development Lifecycle, security touchpoint, static code analysis, threat modeling, vulnerability, white hat