Tutorials
Monday |
---|---
M1 | Web Application Security
M2 | Common Criteria Version 3
M3 | Trust Management
M4 | Defenses Against Viruses, Worms, and Malicious Software

Friday |
---|---
F5 | Acquisition and Analysis of Large Scale Network Data V.2
F6 | Practical Security Policy Modeling
F7 | Securing Enterprise & Government Web Service Applications: A Lifecycle Perspective
F8 | Identifying and Addressing Mobile Security Issues
M1: Web Application Security
Mr. David Wichers
Aspect Security
Abstract
The security of an organization's web applications is critical to a successful online presence. In fact, for some organizations, particularly e-commerce and financial organizations, the security of their web applications may be the most important IT security issue they are facing today. Unfortunately, the security of their custom web applications is frequently an organization's weakest area.
Most developers learn what they know about security on the job, usually by making mistakes. Security is just not a part of many computer science curricula today. This powerful one-day course focuses on the most common application security problems facing web applications today. It describes the most common vulnerabilities present in today's web applications, including the OWASP Top Ten, and practical techniques for avoiding, or identifying and removing, such vulnerabilities from web applications.
Prerequisites:
None. Technical background suggested, but not required.
High Level Outline:
This course starts with a module designed to raise awareness of just how insecure most web applications are. We demonstrate how hackers are able to attack web applications, and what some of the common vulnerabilities are. The next module presents the technological fundamentals of HTTP, the language of web applications. The basics of developing a good security policy for a custom application are then described. The course then covers the following application security vulnerability areas:
• Authentication
• Access Control
• Parameter Use
• Command/SQL Injection
• Cross Site Scripting
• Buffer Overflows
• Error Handling
• Cryptography
• System Administration
• Server Configuration
• Unnecessary and Malicious Code
• Thread Safety
• Denial of Service
• Privacy and Legislative Compliance
• Accountability and Logging
• Integrity
• Caching, Pooling, and Reuse
• Code Quality
• and more
For each area, we cover the following:
The course then concludes with recommendations on how to improve security in your application development lifecycle so your teams can produce more secure code in the first place.
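As a concrete illustration of just one of the vulnerability areas listed above (Command/SQL Injection), the short sketch below contrasts a query assembled by string concatenation with a parameterized query. It is a minimal, hypothetical example using Python's built-in sqlite3 module; the table, column names, and data are invented for illustration and are not taken from the course material.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: the username is spliced directly into the SQL text,
    # so input such as  ' OR '1'='1  changes the meaning of the query.
    query = "SELECT id, email FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver passes the value separately from
    # the SQL text, so it is treated as data, never as SQL syntax.
    query = "SELECT id, email FROM users WHERE name = ?"
    return conn.execute(query, (username,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")
    malicious = "nobody' OR '1'='1"
    print(find_user_unsafe(conn, malicious))  # returns every row
    print(find_user_safe(conn, malicious))    # returns nothing
```

The same separation of SQL text from data is the usual defense regardless of language or database driver.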
About the Instructor:
Mr. David Wichers is the Chief Operating Officer (COO) of Aspect Security, a company that specializes in application security services. Mr. Wichers has over seventeen years of experience in the information security field, in areas such as application security, security architectures, secure designs, security policies, models, database security, multilevel security, system and software development, and security testing. He has supported the design and development of enterprise web applications, trusted operating systems, trusted databases, secure routers, secure guards, and large integrated systems for a wide variety of Commercial and Government customers. He previously ran the Application Security Services Group at Exodus Communications. Mr. Wichers has a BSE in Computer Systems Engineering from Arizona State University and a Masters degree in Computer Science from the University of California at Davis. Mr. Wichers is a CISSP and a CISM, is currently the OWASP Conferences Chair (www.owasp.org), and is a coauthor of the OWASP Top Ten.
M2: Common Criteria Version 3
Mr. Ron Bottomly
NIAP/CCEVS
Dr. Dirk-Jan Out
TNO-ITSEF
Abstract
This tutorial introduces the Common Criteria version 3 and its associated methodology, in terms of its philosophy, structure, principles, and requirements. The tutorial will be useful for developers, in order to understand the requirements against which their products will be measured; for evaluators, who will use the criteria to perform security evaluations; and for consumers, who need to understand the meaning of results of CC evaluations.
Prerequisites:
Basic knowledge of the Common Criteria is strongly recommended.
High Level Outline:
About the Instructors:
Mr. Ron Bottomly is a chief validator of the Common Criteria Evaluation and Validation Scheme. He was an early reviewer of the Common Criteria, and the US scheme representative to the Common Evaluation Methodology Editorial Board, where he was a co-author of the CEM v1.0. He has also served as a member of the NATO CC working group. He is the US scheme representative to the Common Criteria Interpretations Management Board, where he co-authored the CC/CEM v3.0.
Dr. Dirk-Jan Out is the general manager of TNO-ITSEF, the Dutch Security Evaluation Facility, accredited under both the BSI (Germany) and the NSCIB (Netherlands) CC Certification Bodies. He was an early reviewer of the Common Criteria and the Dutch representative to the Common Evaluation Methodology Editorial Board, where he co-authored the CEM v1.0. He is currently a Dutch representative to the Common Criteria Interpretations Management Board, where he was a co-author of the CC/CEM v3.0.
M3: Trust Management
Dr. Scott D. Stoller
State University of New York at Stony Brook
Abstract
As computer systems become increasingly inter-connected, there is a growing need to establish and enforce security policies in the information systems of large enterprises, coalitions, and other organizations that lack globally trusted security administrators. Trust management offers flexible solutions for decentralized management of security policies in such systems.
Trust management has three characteristic features: (1) each policy statement is associated with a principal, called its source or issuer; (2) each principal's policy specifies which sources it trusts for which kinds of statements, thereby delegating some authority to those sources; and (3) policies may refer to domain-specific attributes of and relationships between principals, resources, and other objects. Thus, in the trust management approach, the overall security policy is formed from decentrally managed policies interacting through delegation.
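To make the three features above a little more concrete, here is a minimal, hypothetical sketch of policy statements that carry an issuer, a delegation rule, and a membership check computed by closing the policy under delegation. It is a toy model written for illustration only, not any particular trust management language or system covered in the tutorial.

```python
# Toy trust-management policy evaluator (illustrative only).
# A "member" statement:   issuer says that principal holds issuer's role.
# A "delegate" statement: issuer grants its role to everyone that another
#                         issuer places in that other issuer's role.

def role_members(statements):
    """Map each (issuer, role) pair to the set of principals holding it,
    closing the policy under delegation until a fixed point is reached."""
    holders = {}

    def add(key, principal):
        group = holders.setdefault(key, set())
        if principal in group:
            return False
        group.add(principal)
        return True

    changed = True
    while changed:
        changed = False
        for stmt in statements:
            if stmt[0] == "member":
                _, issuer, role, principal = stmt
                changed |= add((issuer, role), principal)
            else:  # ("delegate", issuer, role, other_issuer, other_role)
                _, issuer, role, o_issuer, o_role = stmt
                for p in list(holders.get((o_issuer, o_role), set())):
                    changed |= add((issuer, role), p)
    return holders

policy = [
    # HospitalA (the issuer) asserts that alice holds its "doctor" role.
    ("member",   "HospitalA", "doctor", "alice"),
    # The insurer trusts HospitalA's doctors to act as claim approvers,
    # delegating part of its policy to HospitalA.
    ("delegate", "Insurer",   "claim_approver", "HospitalA", "doctor"),
]
print(role_members(policy)[("Insurer", "claim_approver")])  # {'alice'}
```

The point of the sketch is only that the overall policy emerges from statements issued by different principals and linked by delegation, which is the defining idea of the trust management approach.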
Prerequisites:
None
High Level Outline:
About the Instructor:
Dr. Scott D. Stoller is an Associate Professor in the Computer Science Department at the State University of New York at Stony Brook. His primary research interests are computer security, distributed systems, concurrency, and software analysis, testing, and verification. He received his Bachelor's degree in Physics, summa cum laude, from Princeton University in 1990 and his Ph.D. degree in Computer Science from Cornell University in 1997. He received an NSF CAREER Award in 1999 and an Office of Naval Research Young Investigator Award in 2002. He is a member of the team that won the NASA Turning Goals Into Reality Award for Engineering Innovation in 2003. He is the author or co-author of over 50 refereed research publications and has given over 50 presentations at conferences, workshops, universities, and research labs.
M4: Defenses Against Viruses, Worms, and Malicious Software
Dr. Tom Chen
SMU, Dept of Electrical Engineering
Abstract
The Internet has created a fertile environment for viruses and worms, since virtually every computer is now interconnected into a global community. Almost every PC user has had an experience with a virus or worm at one time or another. Unlike many types of security attacks directed at compromising a specific target, the self-replicating nature of viruses and worms creates a large-scale attack on the general community. The Internet itself is also affected by the resulting congestion.
This tutorial will give an overview of computer viruses, worms, and Trojan horses. The tutorial is organized into three major parts. The first part introduces the audience to the self-replicating mechanisms of viruses and worms and describes how malicious software programs function. The possible effects on hosts and networks are described with real-life examples.
The second part of the tutorial gives an overview of current host-based and network-based defenses. Hosts are protected by antivirus software and operating system patching. Network-based defenses consist of various network equipment such as firewalls, intrusion detection systems, server proxies, and routers. In addition to explaining each type of defense, the limitations of each defense are pointed out. The limitations are important to understanding why malware outbreaks continue to be a major problem today and into the foreseeable future.
The third part of the tutorial gives an overview of some current research areas in improving defenses. The automation of defenses will be critical in the face of new worms that can be much faster than today's manual, reactive defenses. Automated defenses will first depend on accurate detection of new outbreaks. New outbreaks must be detected before a virus/worm signature is available, so new behavior-based detection methods must be used. Unfortunately, behavior-based detection can result in a high number of false positives, so current research is seeking to improve the accuracy of behavior-based detection. After detection of a new outbreak, automated defenses will exercise some action to quarantine the worm. Examples proposed by Cisco and Microsoft will be described. Also, the use of tarpits and rate throttling to slow down outbreaks will be explained.
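As a hedged illustration of the rate-throttling idea mentioned above, the sketch below limits how quickly a host may contact previously unseen destinations, so that a fast-spreading worm is slowed while ordinary repeat traffic passes unhindered. The class name, parameters, and interface are invented for illustration and do not correspond to any specific product or proposal discussed in the tutorial.

```python
import time
from collections import deque

class NewContactThrottle:
    """Toy outbound-connection throttle: allow at most `rate_per_second`
    connections to previously unseen destinations per second; excess
    requests are queued (delayed) instead of being sent immediately."""

    def __init__(self, rate_per_second=1, history_size=64):
        self.rate = rate_per_second
        self.recent = deque(maxlen=history_size)  # recently contacted hosts
        self.window_start = time.monotonic()
        self.sent_this_window = 0
        self.delay_queue = deque()                # held-back new contacts

    def request(self, dest):
        now = time.monotonic()
        if now - self.window_start >= 1.0:        # start a new one-second window
            self.window_start = now
            self.sent_this_window = 0
        if dest in self.recent:
            return "allow"                        # repeat contact, not throttled
        if self.sent_this_window < self.rate:
            self.sent_this_window += 1
            self.recent.append(dest)
            return "allow"
        self.delay_queue.append(dest)             # worm-like burst gets delayed
        return "delay"

throttle = NewContactThrottle(rate_per_second=1)
decisions = [throttle.request(f"10.0.0.{i}") for i in range(5)]
print(decisions)  # ['allow', 'delay', 'delay', 'delay', 'delay']
```

A burst of connections to many new addresses, which is characteristic of a scanning worm, is forced to proceed at the configured rate, while traffic to recently contacted hosts is unaffected.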
Prerequisites:
None.
High Level Outline:
About the Instructor:
Dr. Thomas M. Chen is an associate professor in the Department of Electrical Engineering at Southern Methodist University in Dallas, Texas. He received the BS and MS degrees in electrical engineering from the Massachusetts Institute of Technology in 1984, and the PhD in electrical engineering from the University of California, Berkeley, in 1990. He is currently the associate editor-in-chief of IEEE Communications Magazine, a senior technical editor for IEEE Network, an associate editor for ACM Transactions on Internet Technology, and past founding editor of IEEE Communications Surveys. He serves as the treasurer for the IEEE Technical Committee on Security and Privacy. He is a member of the Technical Advisory Board for the Voice over IP Security Alliance. Prior to joining SMU, he was a senior member of the technical staff at GTE Laboratories (now Verizon Labs) working on ATM research. He is the co-author of ATM Switching Systems (Artech House, 1995). He was the recipient of the IEEE Communications Society's Fred W. Ellersick best paper award in 1996. His research interests include network security, traffic modeling, network performance, and network management.
F5: Acquisition and Analysis of Large Scale Network Data V.2
Dr. John McHugh
Dalhousie University
Abstract
Detecting malicious activity in network traffic is greatly complicated by the large amounts of noise, junk, and other questionable traffic that can serve as cover for these activities. With the advent of low-cost mass storage devices and inexpensive computer memory, it has become possible to collect and analyze large amounts of network data covering periods of weeks, months, or even years. This tutorial will present techniques for collecting and analyzing such data, both from network flow data (which can be obtained from many routers or derived from packet header data) and directly from packet data such as that collected by TCPDump, Ethereal, and Network Observer.
Because of the quantity of the data involved, we develop techniques, based on filtering of the recorded data stream, for identifying groups of source or destination addresses of interest and extracting the raw data associated with them. The address groups can be represented as sets or multisets (bags) and used to refine the analysis. For example, the set of addresses within a local network that appear as source addresses for outgoing traffic in a given time interval approximates the currently active population of the local network. These can be used to partition incoming traffic into that which might be legitimate and that which is probably not since it is not addressed to active systems. Further analysis of the questionable traffic develops smaller partitions that can be identified as scanners, DDoS backscatter, etc. based on flag combinations and packet statistics. Traffic to and from hosts whose sources appear in both partitions can be examined for evidence that its destinations in the active set have been compromised. The analysis can also be used to characterize normal traffic for a customer network and to serve as a basis for identifying anomalous traffic that may warrant further examination.
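The set-based partitioning described above can be sketched in a few lines. The following Python fragment is an illustrative toy that assumes flow records have already been reduced to (source address, destination address) pairs; the sample addresses are invented, and this is not the actual analysis tool suite used in the tutorial.

```python
# Toy illustration of partitioning incoming traffic using the active set
# of local hosts (addresses seen as sources of outgoing traffic).

outgoing = [            # (local source, external destination) pairs
    ("192.0.2.10", "198.51.100.5"),
    ("192.0.2.11", "203.0.113.7"),
]
incoming = [            # (external source, local destination) pairs
    ("198.51.100.5", "192.0.2.10"),   # reply to an active host
    ("203.0.113.9",  "192.0.2.200"),  # aimed at an inactive address
    ("203.0.113.9",  "192.0.2.201"),  # likely a scanner sweeping the block
]

# The active set: local addresses observed as sources of outgoing traffic.
active = {src for src, _ in outgoing}

maybe_legitimate = [f for f in incoming if f[1] in active]
questionable     = [f for f in incoming if f[1] not in active]

print(len(maybe_legitimate), "possibly legitimate flows")
print(len(questionable), "questionable flows (no active destination)")
```

In practice the same idea is applied to very large flow archives, with the address sets and multisets refined further by flags, packet counts, and timing to separate scanners, backscatter, and other traffic classes.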
Prerequisites:
General familiarity with IP network protocols. Elementary familiarity with simple statistical measures.
High Level Outline:
About the Instructor:
Dr. John McHugh holds a Canada Research Chair in Privacy and Security at Dalhousie University in Halifax, NS where he leads the Privacy and Security Laboratory. Prior to joining Dalhousie, he was a senior member of the technical staff with the CERT Situational Awareness Team, where he did research in survivability, network security, and intrusion detection. Recently, he has been involved in the analysis of large scale network flow data. He was a professor and former chairman of the Computer Science Department at Portland State University in Portland, Oregon. His research interests include computer security, software engineering, and programming languages. He has previously taught at The University of North Carolina and at Duke University. He was the architect of the Gypsy code optimizer and the Gypsy Covert Channel Analysis tool. Dr. McHugh received his PhD degree in computer science from the University of Texas at Austin. He has a MS degree in computer science from the University of Maryland, and a BS degree in physics from Duke University.
F6: Practical Security Policy Modeling
Dr. Steven J. Greenwald
Independent Consultant
Abstract
Security Policies are the basis for the design and implementation of security mechanisms, among other things. This tutorial starts with a definition of "policy" and the important top-level policy properties for the INFOSEC field. Notions of security policies are examined, as well as some sample policy objectives in the context of enterprise/organizational and automated security policies. The differences between formal and informal security policy models will be examined, and some of the most influential policy models in the field will be presented (in the areas of access control, confidentiality, integrity, and organization). An example of the security policy modeling process will be given with an actual example model. Security policy modeling guidelines will then be presented.
The ideal student is someone who knows nothing about Security Policy Modeling, or needs a refresher regarding the basics. Since it is unrealistic to assume that the students can absorb all of this material in a one-day tutorial, each will be given an annotated bibliography of seminal papers and reports (most available on the web) that will be covered during the tutorial and which they may use for future study and reference. A major goal of this tutorial is that the student should be able to effectively understand, research, and apply such material when it is later encountered.
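As a small, hypothetical illustration of the kind of formal confidentiality model mentioned above, the sketch below encodes a lattice of levels and the classic "no read up, no write down" checks in Python. It is a toy written for illustration only and is not necessarily one of the specific models presented in the tutorial.

```python
# Toy confidentiality-model check in the "no read up, no write down" style.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level, object_level):
    # Simple security property: a subject may read only objects at or
    # below its own level ("no read up").
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    # *-property: a subject may write only to objects at or above its
    # own level ("no write down").
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("secret", "confidential"))   # True  (read down is allowed)
print(can_read("confidential", "secret"))   # False (read up is denied)
print(can_write("secret", "confidential"))  # False (write down is denied)
```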
Prerequisites:
None. Novice level.
High-Level Outline:
About the Instructor:
Dr. Steven J. Greenwald is an independent consultant in the field of Information Systems Security specializing in distributed security, formal methods, security policy modeling, resource-based security, and related areas. He also works with organizational security policy consulting, evaluation, training, and auditing. He is a Research Fellow at Virginia's Commonwealth Information Security Center (CISC) and is a member of the adjunct faculty at James Madison University's Computer Science department, teaching in their graduate INFOSEC program (a National Security Agency-designated Center of Academic Excellence in Information Security Assurance). Dr. Greenwald was formerly employed as a computer scientist in the Formal Methods Section of the US Naval Research Laboratory, and is also past General Chair and past Program Chair of the New Security Paradigms Workshop (NSPW). Dr. Greenwald earned his Ph.D. degree in Computer and Information Science from the University of Florida (with a dissertation in the field of information systems security). He has an M.S. degree in Computer Science and Information Systems from Barry University and a bachelor's degree in Chemistry from Emory University.
F7: Securing Enterprise & Government Web Service Applications: A Lifecycle Perspective
Mr. Steve Orrin
Sarvega
Abstract
Organizations that are implementing Web Services are discovering that there are security challenges unique to Web Services that can surface throughout the various phases of Web Service lifecycle. Traditional network and application protection and infrastructure systems lack the functionality, performance, and operational efficiencies needed to provide a secure, cost effective solution.
Web Services and SOA provide significant benefits and efficiencies to organizations that implement them. However, they also introduce a risk structure not seen before in other applications or technology solutions. Web services introduce a "Perfect Storm" of security/risk requirements. These requirements for security, access control, business continuity, and risk management can be divided into three categories or groupings: Trust services, Operationalization/Reliability, and Risk Mitigation.
This tutorial explores the process for securing Web Services as well as what is needed to address security through all phases of the Web Service lifecycle of Build, Publish, Deploy, and Run. The tutorial investigates the nature of XML Web Services threats, including threat models, threat types, attack vectors, and risk.
The tutorial covers what to expect from Web Service security standards and proposes practical approaches to secure Web Services and ensure the data integrity and confidentiality of Web Services transactions. The tutorial reviews an approach to XML Web Services security and threat prevention that provides a logical separation between XML Web Services trust enablement and XML Web Services threat prevention. The tutorial will cover what is needed to proactively protect Web Services data and communications. It will describe what can be implemented to protect against known and unknown XML network threats that could impact the availability and integrity of the Web Services application.
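To make the idea of proactive XML threat prevention slightly more concrete, the sketch below shows one simple pre-parsing defense: rejecting messages that are oversized or excessively nested before they reach the service logic. It uses only the Python standard library; the size and depth limits are invented for illustration, and this is not the approach of any particular product described in the tutorial.

```python
import io
import xml.etree.ElementTree as ET

MAX_BYTES = 64 * 1024   # illustrative limits, not recommendations
MAX_DEPTH = 32

def check_xml_message(raw_bytes):
    """Reject obviously abusive XML (oversized or deeply nested payloads)
    before handing it to the Web Service layer."""
    if len(raw_bytes) > MAX_BYTES:
        return False, "message too large"
    depth = 0
    try:
        for event, _elem in ET.iterparse(io.BytesIO(raw_bytes),
                                         events=("start", "end")):
            if event == "start":
                depth += 1
                if depth > MAX_DEPTH:
                    return False, "element nesting too deep"
            else:
                depth -= 1
    except ET.ParseError as exc:
        return False, f"malformed XML: {exc}"
    return True, "ok"

ok, reason = check_xml_message(b"<a>" * 100 + b"</a>" * 100)
print(ok, reason)   # False element nesting too deep
```

Checks of this kind sit alongside, rather than replace, the trust-enablement standards (signing, encryption, access control) that the tutorial also covers.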
Prerequisites:
None
High Level Outline:
About the Instructor:
Mr. Steve Orrin is CSO of Sarvega, Inc., and is responsible for the security products strategy and direction for the company. Steve was formerly Vice President of Security Solutions for Watchfire, Inc., and was responsible for the product strategy and direction of Watchfire's web application security and privacy software product lines. Steve was CTO of Sanctum, a pioneer company in Web application security testing and firewall software, prior to Watchfire's acquisition of Sanctum.
Prior to Sanctum, Steve was CTO and co-founder of Lockstar Inc., which provided a secure XML Web Services environment that enabled legacy and enterprise applications for e-business. Steve joined Lockstar from SynData Technologies, Inc., where he was CTO and chief architect of its desktop email and file security products.
Steve Orrin is a well-known speaker on Web Services, XML, and application security topics and has spoken on numerous occasions at CSI, RSA, ISACA, N+1, TEPR, Vanguard, and SANS conferences. Steve was named one of InfoWorld's Top 25 CTOs of 2004 and has developed several patent-pending technologies covering user authentication, secure data access, and steganography, with one patent issued to him in steganography. Steve is a member of the Network and Systems Professional Association (NaSPA), the Computer Security Institute (CSI), the Information Systems Security Association (ISSA), and the Software Engineering Institute (SEI). He participates in several working groups of OASIS and the IETF.
F8: Identifying and Addressing Mobile Security Issues
Mr. Norm Laudermilch and Mr. Bill Supernor
Trust Digital
Abstract
Is your perimeter security really protecting your perimeter? Are you sure that you understand where all of your access points are? Do you even know where the processing and storage of your company's proprietary information starts and stops? If you have even one employee with a Palm Pilot, smartphone, Pocket PC, or other mobile device, chances are you answered that last question with a "no". Do you know what your company is spending on mobility next year? Have you figured out what you need to do, and how much it's going to cost, to extend your current security mechanisms to the mobile edge?

As mobile devices become more a part of our daily work lives, and as more industries come to rely on the convenience of mobile data to achieve their goals, security at the mobile edge becomes critical. The mobile devices of today are far more advanced than those of just a few months ago, and their networking, processing, and storage capabilities are doubling every few months. Your employees (and non-employees) are buying and using them, and whether you like it or not they are in your corporate environment - connected. After this session you will know how to discover your mobile edge, understand how to protect it, and feel confident in the security and trust at your mobile edge.
Prerequisites:
Working knowledge of TCP/IP networking, both wired and wireless; basic knowledge of standard security components and practices (firewalls, IDS, policy definition, etc.).
High Level Outline:
About the Instructors:
Mr. Norm Laudermilch is a security expert with sixteen years of technical and executive experience. Mr. Laudermilch has been involved in the design, development, and implementation of numerous security products and services and is a sought-after speaker at global computer security conferences. His definitive writings and teaching range from TCP/IP to UNIX to numerous security topics, including several research papers on firewall design. Mr. Laudermilch has held the highest national security clearances and has experience in numerous classified environments. Mr. Laudermilch is currently the Chief Security Officer at Trust Digital, where he plays an active role in the evolution of their Mobile Edge Security Architecture. He recently served as Vice President of Managed Security Services at the global infrastructure services provider VeriSign. Prior to VeriSign, he served as Vice President of Managed Security Services at Riptech/Symantec from 2000-2002. Before joining Riptech/Symantec, Laudermilch co-founded the Internet Security Advisors Group as well as the Internet Service Provider Security Working Group (ISPSEC). Laudermilch was also Director of Global Security at UUNET.
Mr. Bill Supernor, CISSP, is the Trust Digital Vice President of Engineering. Mr. Supernor brings over 12 years of experience with enterprise application development, IT security, networking, and team management to the company. He has built a reputation for developing quality software for on-time releases against aggressive schedules. Prior to Trust Digital, Mr. Supernor directed software development at Cognio, building its wireless spectrum management prototype into an enterprise customer-ready product. Prior to Cognio, he worked at MountainWave/Symantec, at first directing the development of the CyberWolf security information management system, and then managing the evolution of CyberWolf into Symantec Incident Manager. Mr. Supernor also managed development at Network Associates/McAfee on the Gauntlet firewall, where he enhanced the firewall to perform gateway virus scanning over multiple protocols. Later, he was instrumental in the conversion of this function to the WebShield Internet gateway appliances. Following the sale of Gauntlet to Secure Computing, Mr. Supernor worked for a time as an architect on the SideWinder G2 firewall. Mr. Supernor has also worked at a number of defense contracting and US military organizations.