Security Vulnerability Verification through Contract-Based Assertion Monitoring at
Runtime
by
Alexander M. Hoole
B.Sc., University of Victoria, 2003
M.A.Sc., University of Victoria, 2006
A Dissertation Submitted in Partial Fulfillment of the
Requirements for the Degree of
DOCTOR OF PHILOSOPHY
in the Department of Electrical and Computer Engineering
©Alexander M. Hoole, 2016
University of Victoria
All rights reserved. This dissertation may not be reproduced in whole or in part, by photocopying or other means, without the permission of the author.
Supervisory Committee

Dr. I. Traore, Supervisor (Department of Electrical and Computer Engineering)
Dr. T.A. Gulliver, Departmental Member (Department of Electrical and Computer Engineering)
Dr. K.F. Li, Departmental Member (Department of Electrical and Computer Engineering)
Dr. J. Weber, Outside Member (Department of Computer Science)
ABSTRACT

In this dissertation we seek to identify ways in which the systems development life cycle (SDLC) can be augmented with improved software engineering practices to measurably address security concerns arising from security vulnerability defects in software. We propose a general model for identifying potential vulnerabilities (weaknesses) and using runtime monitoring to verify their reachability and exploitability during development and testing, thereby reducing security risk in delivered products. We propose a form of contract for our monitoring framework that specifies the environmental and system security conditions necessary for the generation of probes that monitor security assertions at runtime to verify suspected vulnerabilities. Our assertion-based security monitoring framework, based on contracts and probes and known as the Contract-Based Security Assertion Monitoring Framework (CBSAMF), can be employed for verifying and reacting to suspected vulnerabilities in the application and kernel layers of the Linux operating system. Our methodology for integrating CBSAMF into the SDLC during development and testing to verify suspected vulnerabilities reduces human effort by allowing developers to focus on fixing verified vulnerabilities. Metrics intended for the weighting, prioritizing, establishing confidence, and detectability of potential vulnerability categories are also introduced.
These metrics and weighting approaches identify deficiencies in security assurance programs/products and also help focus resources on a class of suspected vulnerabilities, or a detection method, that may presently be outside the requirements and priorities of the system. Our empirical evaluation demonstrates the effectiveness of using contracts to verify the exploitability of suspected vulnerabilities across five input-validation-related vulnerability types, combining our contracts with existing static analysis detection mechanisms, and measurably improving security assurance processes/products used in an enhanced SDLC. As a result of this evaluation we introduced two new security assurance test suites, through collaborations with the National Institute of Standards and Technology (NIST), replacing existing test suites. The new and revised test cases provide numerous improvements in consistency, accuracy, and precision, along with enhanced test case metadata to aid researchers using the Software Assurance Reference Dataset (SARD).
Contents
Supervisory Committee ii
Abstract iii
Table of Contents v
List of Tables ix
List of Figures xii
Acknowledgements xiii
Dedication xiv
1 Introduction 1
1.1 Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 SDLC and Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3 Research Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.4 Proposed Approach . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.5 Research Contributions . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.6 Dissertation Organization . . . . . . . . . . . . . . . . . . . . . . . . 12
2 Related Work 14
2.1 Monitors and Intrusion Detection . . . . . . . . . . . . . . . . . . . . 14
2.2 Contracts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.3 Measurement and Metrics . . . . . . . . . . . . . . . . . . . . . . . . 29
2.3.1 History of Metrics in Security . . . . . . . . . . . . . . . . . . 30
2.3.2 Applicable AppSec Metrics . . . . . . . . . . . . . . . . . . . . 33
2.3.3 Metrics Summary . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.4 Security Analysis Tools . . . . . . . . . . . . . . . . . . . . . . . . . . 37
2.4.1 Static Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . 37
2.4.2 Runtime Monitoring . . . . . . . . . . . . . . . . . . . . . . . 38
2.4.3 Current State of Static and Dynamic Approaches . . . . . . . 39
2.4.4 HP Fortify SCA . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3 Weakness Identification and Vulnerability Verification 44
3.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
3.1.1 Specific Diction in Application Security . . . . . . . . . . . 45
3.1.2 Weaknesses . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.2 Verification of Weaknesses . . . . . . . . . . . . . . . . . . . . . . . . 47
3.3 Prioritizing weakness verification . . . . . . . . . . . . . . . . . . . . 49
3.4 Integrating Security Monitoring in a SDLC . . . . . . . . . . . . . . . 50
3.4.1 Secure Software Development Life Cycle . . . . . . . . . . . . 50
3.4.2 Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
3.5 Specific Weaknesses . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
3.5.1 Format String Vulnerability . . . . . . . . . . . . . . . . . . . 53
3.5.2 Resource Injection/Path Manipulation . . . . . . . . . . . . . 53
3.5.3 OS Command Injection . . . . . . . . . . . . . . . . . . . . . . 54
3.5.4 SQL Injection (SQLi) . . . . . . . . . . . . . . . . . . . . . . . 54
3.5.5 Basic Cross-Site Scripting (XSS) . . . . . . . . . . . . . . . . 55
3.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4 CBSAMF 56
4.1 Model for Security Assertion Monitoring . . . . . . . . . . . . . . . . 56
4.1.1 Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
4.1.2 Semantics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
4.2 Case Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
4.2.1 The Need for Verification . . . . . . . . . . . . . . . . . . . . . 67
4.2.2 Summary of System Requirements . . . . . . . . . . . . . . . 69
4.2.3 Security Requirements Analysis and Design . . . . . . . . . . 70
4.2.4 Contract-based Runtime Monitoring . . . . . . . . . . . . . . 72
4.3 Realization of Contracts . . . . . . . . . . . . . . . . . . . . . . . . . 80
4.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
5 Metrics for Assessing and Improving Security Assurance 85
5.1 Metrics for Security Assurance Products . . . . . . . . . . . . . . . . 86
5.2 Alternate Evaluation Metrics . . . . . . . . . . . . . . . . . . . . . . 89
5.2.1 Default: Arithmetic Mean . . . . . . . . . . . . . . . . . . . . 90
5.2.2 Artificial Scaling . . . . . . . . . . . . . . . . . . . . . . . . . 91
5.2.3 Consumer: Weighted Mean . . . . . . . . . . . . . . . . . . . 92
5.2.4 Verification Metrics . . . . . . . . . . . . . . . . . . . . . . . . 93
5.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
6 Experimental Evaluation 97
6.1 Experimental Context . . . . . . . . . . . . . . . . . . . . . . . . . . 98
6.1.1 Environment Description . . . . . . . . . . . . . . . . . . . . . 98
6.1.2 Taxonomies . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
6.1.3 Test Suite Datasets for Experiments . . . . . . . . . . . . . . 99
6.2 Selected Vulnerability Datasets . . . . . . . . . . . . . . . . . . . . . 99
6.2.1 Challenges: Test Suites 45 and 46 . . . . . . . . . . . . . . . . 101
6.2.2 Purpose of Test Suites . . . . . . . . . . . . . . . . . . . . . . 101
6.2.3 Coverage of CWEs . . . . . . . . . . . . . . . . . . . . . . . . 102
6.3 PART I: Verification of Test Suites 45 and 46 . . . . . . . . . . . . . 104
6.3.1 Experiment 1: Verifying Exploitability with Probes . . . . . . 105
6.3.2 Experiment 2: Manual Review of Datasets . . . . . . . . . . . 107
6.3.3 Improving Vulnerability Datasets . . . . . . . . . . . . . . . . 111
6.4 PART II: Applying Security Metrics . . . . . . . . . . . . . . . . . . . 112
6.4.1 Experiment 3: Static Verification of Datasets . . . . . . . . . . 113
6.4.2 Applying Alternate Evaluation Metrics . . . . . . . . . . . . . 120
6.4.3 Experiment 4: Improved Security Assurance . . . . . . . . . . 123
6.5 Experimental Summary . . . . . . . . . . . . . . . . . . . . . . . . . 130
7 Conclusions 133