
Internet Security 2 (IntSec2)

6 Software Security "Building Security In"


Prof. Dr. Andreas Steffen
Institute for Internet Technologies and Applications (ITA)


6 Software Security
NIST statistics on vulnerabilities
The trinity of trouble (connectivity, extensibility, complexity)
Bugs, flaws, defects
Software artifacts
The three pillars of software security (risk management, best practices, knowledge)
Pillar I: Applied risk management
Pillar II: Software security best practices
Barry Boehm's cost of change law
Microsoft's trustworthy computing initiative
Pillar III: Software security knowledge
Code review tools
Code rule sets
Architectural risk analysis

Literature

Gary McGraw, Software Security: Building Security In, 408 pages, 2006, Addison-Wesley, ISBN 0-321-35670-5


NIST Statistics on Vulnerabilities with High Severity


[Bar chart, 1996-2006: number of high-severity vulnerabilities per year, split into input validation errors (buffer overflows) and other errors; y-axis 0 to 3000]

Source: http://nvd.nist.gov/statistics.cfm


High-severity input validation errors (buffer overflows and boundary condition errors):
Year        2006  2005  2004  2003  2002  2001  2000  1999  1998  1997  1996
# of vulns  2275  1498   634   412   569   437   206   146    68    74    24
% of total   34%   31%   27%   32%   29%   26%   20%   16%   28%   29%   32%

Vulnerabilities of high severity:
Year        2006  2005  2004  2003  2002  2001  2000  1999  1998  1997  1996
# of vulns  2758  1946   955   632   935   777   436   317   129   145    46
% of total   42%   40%   40%   49%   48%   46%   43%   35%   52%   58%   61%

Vulnerabilities of medium severity:
Year        2006  2005  2004  2003  2002  2001  2000  1999  1998  1997  1996
# of vulns  1247   496   203   159   183   194   120   254    29    35    12
% of total   19%   10%    9%   12%    9%   12%   12%   28%   12%   14%   16%

Vulnerabilities of low severity:
Year        2006  2005  2004  2003  2002  2001  2000  1999  1998  1997  1996
# of vulns  2595  2449  1214   498   845   701   462   347    88    72    17
% of total   39%   50%   51%   39%   43%   42%   45%   38%   36%   29%   23%

The Trinity of Trouble

Connectivity

The growing connectivity of computers through the Internet has increased both the number of attack vectors and the ease with which an attack can be made. Due to SOA, legacy applications that were never intended to be internetworked are now published as services.

Extensibility

The plug-in architecture of Web browsers allows easy installation of viewer extensions for new document types. Operating systems support extensibility through dynamically loadable device drivers and modules. Applications support extensibility through scripting, controls, components, and applets thanks to Java and the .NET framework.
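As a minimal sketch of this kind of runtime extensibility in C (the plug-in file name and the plugin_init symbol are hypothetical), a host process loads and runs code it has never seen at build time, with its own privileges:

/* Minimal POSIX plug-in loader sketch (compile with -ldl). */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("./plugin.so", RTLD_NOW);    /* load foreign code    */
    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }
    void (*plugin_init)(void) =
        (void (*)(void))dlsym(handle, "plugin_init");  /* look up entry point  */
    if (plugin_init)
        plugin_init();      /* plug-in code now runs inside the host process   */
    dlclose(handle);
    return 0;
}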

Complexity

Unrestrained growth in the size and complexity of modern software systems. In practice, the defect rate tends to go up as the square of the code size.
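As a rough, hedged illustration of this rule of thumb (the constant k is hypothetical and project-specific):

    defects(L) ≈ k · L²,   so   defects(50 MSLOC) / defects(5 MSLOC) ≈ (50/5)² = 100

A tenfold growth in code size would thus be expected to bring roughly a hundredfold growth in defects.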

Windows Complexity
[Bar chart: Million Source Lines of Code (SLOC) per Windows release, scale 0 to 50 MSLOC: Win 3.1 (1990), NT 3.5 (1994), Win 95 (1996), NT 4.0 (1998), Win 98 (1999), Win 2K (2000), XP (2001), Vista (2006)]

Source: Gary McGraw, Software Security


Bugs + Flaws = Defects

Security Bug

A security bug is an implementation-level vulnerability that can be easily discovered and remedied using modern code review tools. Examples: buffer overflows, race conditions, unsafe system calls.
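A hedged one-function sketch of such an implementation-level bug (the function and buffer names are made up); this is exactly the kind of defect a code review tool reports:

#include <string.h>

/* Classic security bug: unbounded copy into a fixed-size stack buffer. */
void save_username(const char *input)      /* input is attacker-controlled */
{
    char name[32];
    strcpy(name, input);    /* BUG: overflows 'name' for input of 32+ bytes */
    /* ... */
    (void)name;             /* silence unused-variable warnings in this sketch */
}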

Security Flaw

A security flaw is a design-level vulnerability that cannot [yet] be detected by automated tools but usually requires a manual risk analysis of the software architecture done by experts. Examples: method overriding, error handling, type safety confusion.

Security Defect

Both bugs and flaws are defects, which may lie dormant in software for years only to surface in fielded systems with major consequences. In practice, software security problems are divided 50/50 between bugs and flaws.


Bug or Flaw? Find the Defects!


1 read(fd, userEntry, sizeof(userEntry));
2 comparison = memcmp(userEntry, correctPasswd, strlen(userEntry));
3 if (comparison != 0)
4     return BAD_PASSWORD;

Bug in line 1

Return value from read() ignored. Always a bad sign, but not directly resulting in an attack.

Bug in line 2

The function strlen() relies on read() putting a null terminator at the end of the string. No guarantee of that.

Flaw in line 2

The system stores user passwords in plaintext (as opposed to storing a hash). This flaw is best found during an architectural risk analysis.

Flaw [or bug?] in line 3

The comparison also succeeds if the entered password is of length zero. This problem can be uncovered with good testing.

This code example is adapted from an interview with Paul Kocher on Slashdot: http://interview.slashdot.org/interviews/03/03/27/1357236.shtml?tid=172
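A minimal hardened sketch of the same fragment (not a complete function), assuming hypothetical helpers hash_password(), storedPasswdHash and HASH_LEN to stand in for a real password hashing scheme:

/* Sketch only: addresses the two bugs and the line-3 flaw discussed above. */
ssize_t n = read(fd, userEntry, sizeof(userEntry) - 1);
if (n <= 0)                              /* bug 1: check the result of read()   */
    return BAD_PASSWORD;
userEntry[n] = '\0';                     /* bug 2: guarantee null termination   */

if (strlen(userEntry) == 0)              /* line-3 flaw: reject empty passwords */
    return BAD_PASSWORD;

unsigned char entryHash[HASH_LEN];
hash_password(userEntry, entryHash);     /* design flaw: compare password hashes, */
if (memcmp(entryHash, storedPasswdHash,  /* never plaintext passwords             */
           HASH_LEN) != 0)
    return BAD_PASSWORD;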

Bug-Free Software


Software Artifacts

Common set of software artifacts that are nearly independent of the chosen software development process and its life cycle (e.g. Waterfall model, Spiral model, Rational Unified Process (RUP), eXtreme Programming (XP), Agile software development, Capability Maturity Model Integration (CMMI)):

Requirements and Use Cases

Architecture and Design

Test Plans

Code

Tests and Test Results

Feedback from the Field


The Three Pillars of Software Security

Software security rests on three pillars: Risk Management, Best Practices, Knowledge


Pillar I: Applied Risk Management

1. Understand the business context (who cares?)
2. Identify the business and technical risks
3. Synthesize and prioritize the risks, producing a ranked set
4. Define the risk mitigation strategy
5. Carry out required fixes and validate that they are correct


Risk Management Framework (RMF)

Inputs: business context and artifact analysis; measurement and reporting accompany all five stages.

1. Understand the business context
2. Identify the business and technical risks
3. Synthesize and rank the risks
4. Define the risk mitigation strategy
5. Carry out fixes and validate


Pillar II: Software Security Best Practices


In order of effectiveness:

1. Code review
2. Architectural risk analysis
3. Penetration testing
4. Risk-based security tests
5. Abuse cases
6. Security requirements
7. Security operation

The touchpoints span constructive activities (white hat), weakly constructive activities, destructive activities (black hat), and the use of historical knowledge.

Code review: Use of static analysis tools that scan source code for common vulnerabilities. Discovers about 50% of all security problems (implementation bugs).

Architectural risk analysis: Required at the specifications-based architecture stage and at the class-hierarchy design stage. Discovers about 50% of all security problems (design flaws).

Penetration testing: Gives a good understanding of fielded software in its real environment. Black-box tests that do not take the software architecture into account won't uncover anything interesting.

Risk-based security tests: Based on attack patterns, risk analysis results, and abuse cases. Standard QA is unlikely to uncover all critical security issues because it focuses on the good cases; risk-based security testing makes sure bad things don't happen.

Abuse cases: Abuse cases describe the system's behavior under attack. Building abuse cases is a great way to get into the mind of the attacker and requires explicit coverage of what should be protected, from whom, and for how long.

Security requirements: Security must be explicitly worked in at the requirements level. Good security requirements cover both overt functional security (e.g. the use of cryptography) and emergent characteristics (best captured by abuse cases and attack patterns).

Security operation: Software security can benefit greatly from network security. Knowledge gained by understanding attacks and exploits should be cycled back into software development.


Best Practices applied to Software Artifacts


5 Abuse Cases, 6 Security Requirements → Requirements and Use Cases
2 Risk Analysis → Architecture and Design
4 Risk-based Security Tests → Test Plans
1 Code Review (Tools) → Code
2 Risk Analysis, 3 Penetration Testing → Tests and Test Results
7 Security Operation → Feedback from the Field

Security Development Lifecycle


Important: All software security best practices are best applied by people not involved in the original design and implementation of the system. Lots of work for external consultants ;-)


Barry Boehm's Cost of Change Law


[Bar chart: cost per defect by life-cycle phase, y-axis $0 to $15'000]

Consequence: Push Left Strategy!



Source: Prof. Barry Boehm, TRW


Microsoft's Trustworthy Computing Security Development Lifecycle

Source: Steve Lipner & Michael Howard, The Trustworthy Computing Security Development Lifecycle, MSDN 2005

Pillar III: Software Security Knowledge

Prescriptive Knowledge

Principles, Guidelines, Rules


Diagnostic Knowledge

Vulnerabilities, Exploits, Attack patterns


Historical knowledge

Historical risks


Principles: High-level architectural principles, e.g. the principle of least privilege.

Guidelines: Mid-level guidelines, e.g. make all Java objects and classes final, unless there is a reason not to.

Rules: Tactical code-level rules, e.g. avoid the use of the library function gets() in C.

Vulnerabilities: Descriptions of software vulnerabilities experienced and reported in real systems (often with a bias towards operations).

Exploits: Descriptions of how instances of vulnerabilities are used to compromise the security of particular systems.

Attack Patterns: Descriptions of common sets of exploits in a more abstract form that can be applied across multiple systems.

Historical Risks: Detailed descriptions of specific issues uncovered in real-world software development efforts. Must include a statement of impact on the business or mission proposition. As a resource, this knowledge offers tremendous value in helping to identify similar issues in new software efforts without starting from scratch.

Source: Gary McGraw, Software Security


Software Security Unified Knowledge Architecture

Knowledge catalogs: Principle, Guideline, Rule, Vulnerability, Exploit (realized vulnerability), Attack Pattern, Historical Risk


Knowledge required for Best Practices


The touchpoints are applied to the software artifacts as on the previous slide:

5 Abuse Cases, 6 Security Requirements → Requirements and Use Cases
2 Risk Analysis → Architecture and Design
4 Risk-based Security Tests → Test Plans
1 Code Review (Tools) → Code
2 Risk Analysis, 3 Penetration Testing → Tests and Test Results
7 Security Operation → Feedback from the Field

They draw on the knowledge catalogs: Principles, Guidelines, Rules, Vulnerabilities, Exploits, Attack Patterns, and Historical Risks.


Code Review Tools

First Generation Code Scanners (intelligent grep)

ITS4: www.cigital.com/its4/
RATS: www.fortifysoftware.com/security-resources/rats.jsp
Flawfinder: www.dwheeler.com/flawfinder/
First-generation code scanners generate a lot of false positives. ITS4 (Cigital license), RATS and Flawfinder (GPLv2) are free software.

Advanced Source Code Analysis Tools

Coverity Prevent: www.coverity.com/html/prod_prevent.html
Fortify SCA: www.fortifysoftware.com/products/sca/
Ounce Labs Prexis/Engine: www.ouncelabs.com/prexis_engine.html
Commercial tools try to minimize false positives by interpreting the software context obtained through parsing of the source code.
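A hedged sketch of why the false-positive rates differ (function, macro and buffer names are invented): a grep-style scanner flags every strcpy() call, whereas a tool that parses the code can see that this particular copy cannot overflow:

#include <string.h>

#define GREETING "hello"        /* 6 bytes including the terminating '\0' */

void copy_greeting(void)
{
    char buf[16];
    /* ITS4/RATS-style scanners report this strcpy() as dangerous.
     * A context-aware analyzer can prove the source is a 6-byte constant
     * copied into a 16-byte buffer: a false positive. */
    strcpy(buf, GREETING);
    (void)buf;
}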


Coding Rules SourceScope vs. ITS4/RATS

In 2003 Cigital licensed SourceScope to the start-up Fortify Software, which renamed it Fortify Source Code Analysis.


Coding Rule Sets

A lot of expert knowledge is contained in the coding rule sets used by commercial and free source code analysis tools. The US Department of Homeland Security's Build Security In portal currently documents 173 basic C/C++ coding rules derived mainly from ITS4.

http://buildsecurityin.us-cert.gov/
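As a hedged illustration of one such code-level rule (the gets() rule mentioned earlier), with the unbounded call replaced by a bounded one:

#include <stdio.h>

void read_line(void)
{
    char line[128];

    /* Violates the rule: gets() cannot limit the input length
     * (the function was removed from the language in C11).      */
    /* gets(line); */

    /* Compliant alternative: bound the read to the buffer size. */
    if (fgets(line, sizeof(line), stdin) != NULL) {
        /* ... process line ... */
    }
}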


Architectural Risk Analysis

Attack resistance analysis

Checklist approach using catalogs of attack patterns and vulnerabilities. Good at finding known problems but not new or creative attacks.
Ambiguity analysis

Best carried out by a team of experienced analysts who first study the available software artifacts independently and then discuss their findings. Where good architects disagree, there usually lie interesting things (and sometimes new flaws). Good at finding ambiguity and inconsistency in a design.

Weakness analysis

Good at understanding the impact of external software dependencies, e.g. software built on top of .NET or J2EE frameworks or using security libraries, as well as dependencies on network topology or the platform environment.


Architectural Risk Analysis


Input documents: exploit graphs, attack patterns, secure design literature, requirements, architectural documents, regulations, standards, external resources.

Build a one-page architecture overview, then:

Perform attack resistance analysis: identify general flaws and noncompliance to guidelines and standards, map applicable attack patterns, show risks and drivers in the architecture, show viability of known attacks.

Perform ambiguity analysis: ponder design implications, generate separate architecture diagrams and documents, unify understanding, uncover ambiguity, perform sufficiency analysis, unravel convolutions, uncover poor traceability.

Perform weakness analysis: find and analyze flaws in COTS components, frameworks, network topology and platform; identify services used by the application; map weaknesses to assumptions made by the application.

Output documents: software flaws, architectural risk assessment report.


The End

"LISTEN, KID, HOW MANY TIMES DO I HAVE TO EXPLAIN IT TO YOU...


BUGS DON'T BECOME PROGRAMMERS."
