Introduction, Overview, and Update
Mike Jerbic (Trusted Systems Consulting Group) introduced the agenda for this meeting.
Mike explained that as a result of working with seasoned professionals in government,
academia, the high-tech IT industry, and global information technology customers, we have
embarked upon a comprehensive program that includes:
- Assessing the scope of vulnerability management requirements for mission-critical
enterprise services
- Developing and documenting liability limiting best practices that can be used to satisfy
those requirements
- Assessing technological gaps that users and product and services suppliers can close to
improve system defenses against evolving system security, safety, and dependability
stresses
- Advising public and private policy makers with pragmatic assessments of threats,
impacts, and remediation
In a separate discussion in the Security Forum earlier in this San Diego Conference, we
decided how we will progress with our existing liaisons to support NIST-FISMA
requirements, and how we will explore methods that produce consistent and
credible results for measuring risk and vulnerability.
Mike reviewed the agenda and list of speakers for this VM meeting day. He looked forward
to us learning a lot about what the vendors are doing, and to constructive
question-and-answer discussion that brings out the problem issues needing to be
addressed, so that we can then decide which of those issues we in the Security Forum
should take up. We should also be thinking about ways to collaborate with other
organizations to seek practical business solutions in the VM space.
The CERT Risk Assessment Methodology: OCTAVE
Steve Kruse - Managing Partner, Impruve Inc. - gave a presentation
in which he described how organizations struggle to find a structured starting point for
information security. Typically the journey begins with firewall and
anti-virus issues, and quickly becomes unmanageable as more security issues continue to
surface. Carnegie Mellon University's Software Engineering Institute (SEI), the
organization that manages the Computer Emergency Response Team (CERT), has developed a
risk assessment methodology designed to bring a level of order to the dilemma. How would
an approach like this impact the work being done by the Vulnerability Management
Initiative of The Open Group? Is there value in reviewing this in relation to the
Initiative's work? Best practices, scoping, and terminology issues are some of the areas
which could be considered and explored. Steve presented how the OCTAVE process helps here,
including providing the ability to tailor the methodology. Steve explained that Impruve is
a licensed facilitation and training partner of the SEI for the OCTAVE approach.
OCTAVE - Operationally Critical Threat, Asset, and Vulnerability Evaluation - is
"asset-centric", with an information focus. It comprises three phases:
organizational view, then technological view, and strategy and plan development.
Implementing OCTAVE involves running these phases in a series of workshops, and includes
an education process for those new to information security. Steve described in
his slides the detailed processes in each phase. He noted that the overall process is
scenario-based: it involves building up threat trees in spreadsheets, mapping them
against ISO 17799 or a similar category-of-practice compliance methodology, and
identifying intersections that represent high-risk areas, which you then select to
mitigate. You then draw up a mitigation plan for each area.
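By way of illustration only (this sketch was not part of Steve's presentation), the
intersection-finding step he described might look like the following, with hypothetical
threat scenarios, impact scores, and ISO 17799 practice-area scores:

    # Hypothetical sketch of OCTAVE-style threat/practice mapping.
    # Threat scenarios, scores, and thresholds are illustrative only.

    # Each threat scenario records the asset at risk, its business impact
    # (1 = low .. 5 = high), and the ISO 17799 practice area it maps to.
    threats = [
        {"asset": "customer database", "impact": 5, "practice": "Access Control"},
        {"asset": "payroll server",    "impact": 4, "practice": "Operations Management"},
        {"asset": "public web site",   "impact": 2, "practice": "Systems Development"},
    ]

    # Compliance scores per practice area from a (hypothetical) ISO 17799
    # self-assessment: 1 = weak .. 5 = strong.
    practice_scores = {
        "Access Control":         2,
        "Operations Management":  4,
        "Systems Development":    3,
    }

    # An intersection of high impact and weak practice marks a high-risk
    # area to select for a mitigation plan.
    def high_risk(threats, scores, impact_floor=4, score_ceiling=2):
        return [t for t in threats
                if t["impact"] >= impact_floor
                and scores[t["practice"]] <= score_ceiling]

    for t in high_risk(threats, practice_scores):
        print(f"Mitigation plan needed: {t['asset']} ({t['practice']})")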
Further information on Steve and Impruve is available at www.impruve.com. OCTAVE materials are available from www.cert.org/octave.
In Q&A discussion, Art Robinson queried the acceptability of OCTAVE for resolving
liability disputes. Eliot Solomon wondered how OCTAVE measures up as an engineering
methodology against other techniques, and Bob Blakley asked how OCTAVE handles defect
reports. Steve said that from his quality engineering background he looks forward to
meaningful metrics methodologies being developed, and that it is not well defined where
cybersecurity ends and physical security begins; however, he likes the OCTAVE
approach, especially when it is combined with the categories of ISO 17799. Art mentioned
NIST SP800-26, which is relevant in this area; Steve acknowledged this but expressed
reservations about its ease of use and effectiveness for business organizations.
Vulnerabilities in Outsourcing
Francoise Gilbert - Managing Director, IT Law Group - noted (see presentation) that many companies are outsourcing some of their
operations offshore in the hope of saving money and refocusing on more pressing issues or
projects. However, reports indicate that a large proportion of outsourcing deals
fail. Why? And what could have been done to anticipate and prevent the problems? Her
presentation analyzed potential benefits and pitfalls in structuring an outsourcing
relationship, focusing on legal issues and risk management, from evaluation and due
diligence to the contract terms, the ongoing relationship, and termination.
For further information, see Francoise's website at www.itlawgroup.com.
Information Security: The New Frontier of Corporate Liability?
Scott Pink - GrayCary Law Firm - is deputy chair of the American Bar Association (ABA)
in California. The ABA is focusing on the Patriot Act's implications, particularly with
the aim of setting up a repository of compliance information, and Scott invited anyone
with specific interests or concerns on this subject to let him know.
Scott gave a presentation that explored emerging trends in
information security law, including why cybersecurity matters. We are seeing a slow
evolution in legal cases towards establishing whether someone has acted reasonably. It
matters because you need to prevent damage and loss, to maintain customer and employee
confidence, to meet contractual requirements, to ensure legal compliance, and to avoid
potential legal liability. In the USA there are sets of laws addressing consumer
protection, national security, protection of organizational assets against misuse, and
standards of care, and Scott discussed relevant issues under each of these areas. He then
discussed recent legal developments, such as California SB 1386 requiring disclosure of
security breaches, and the compliance requirements raised by HIPAA, Gramm-Leach-Bliley,
Sarbanes-Oxley, and Federal Trade Commission enforcement (which invokes security
requirements imposed under unfair trade practice law). Compliance calls for risk
assessment, privacy and security audits, and appropriate procedures for implementing
information security.
Scott considered that future trends are likely to be based on reasonable standards of
care, and following of best practice. However, where regulations are in force then
regulatory compliance is the essential issue if an organization is to avoid significant
penalties for non-compliance.
Worm and Virus Defense: The Laws of Vulnerabilities
Gerhard Eschelbeck - CTO, Qualys Inc. - discussed (see presentation)
how worm and virus attacks are increasing in sophistication and are capable of spreading
faster than any possible human response effort. To date, all automated attacks, including
the recent Blaster outbreak, have involved known vulnerabilities for which patches were
available.
The Laws of Vulnerabilities are the result of research on vulnerability prevalence from
more than 1,500,000 security audit scans during the past 18 months. These results provide
valuable insight into the evolution of network security attacks and how we can prepare
networks for protection against automated threats of the future. Gerhard suggested that
the vulnerabilities to watch in 2004 are those involving Remote Procedure Calls (RPC).
Qualys' pro-active approach to vulnerability management involves a discovery phase,
then an assessment phase, then an analysis phase, and finally a remediation phase. These
phases run in a continuous cycle. Qualys' product is QualysGuard - a scalable and
distributed web service architecture. Gerhard concluded with a short demonstration of this
product.
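As a rough sketch of that continuous cycle (the phase functions below are hypothetical
placeholders, not QualysGuard's actual interfaces):

    # Hypothetical sketch of the discover -> assess -> analyze -> remediate
    # cycle; these functions are placeholders, not Qualys' API.
    import time

    def discover():   return ["10.0.0.5", "10.0.0.9"]        # find live hosts
    def assess(h):    return [(h, "open RPC port")]           # scan each host
    def analyze(fs):  return sorted(fs, key=lambda f: f[0])   # rank findings
    def remediate(f): print("ticket raised for", f)           # drive the fix

    def vm_cycle(interval_seconds=86400):
        while True:                       # the phases run continuously
            findings = []
            for host in discover():
                findings.extend(assess(host))
            for finding in analyze(findings):
                remediate(finding)
            time.sleep(interval_seconds)  # then the cycle begins again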
Vulnerability Management Lifecycle: A Practical Approach
Stuart McClure - President and CTO, Foundstone Inc. - began his presentation
with an introduction to his career background including his authorship of books on
information security, hacking, and vulnerabilities.
Stuart identified several vectors of attack - direct, email/instant messaging, browser,
wireless, war dialing, voicemail, social engineering, physical, misuse (taking advantage
of trust), and moles/sleepers (plants). The sophistication of attackers is rising.
According to CERT, vendor vulnerabilities were on the rise in 2002, but declined slightly
in 2003. Feedback metrics also indicate that the vulnerability-to-worm cycle is shrinking.
What is a vulnerability? He defined it as a weakness in a process, administration, or
technology that can be exploited to compromise IT security. There are vendor
vulnerabilities, developer vulnerabilities, mis-configurations, policy violations, etc. In
fact, almost every security technology depends on vulnerabilities to sell it.
Risk management includes risk avoidance, risk acceptance, risk transfer, and risk
mitigation (mitigation being the most often-used approach to vulnerability). Vulnerability
management involves assessment, analysis, and remediation. Risk is defined by the
criticality to your business of the vulnerabilities, assets, and threats involved.
The 80:20 rule is traditionally a good way to decide what to address as your highest
priority. Stuart noted that keeping track of your organization's assets is difficult
enough, but correctly understanding the assets on those systems, determining their value
to the business, and managing the vulnerabilities and threats on those assets, is even
harder. He presented a risk management lifecycle - this is where he believes that
vulnerability management truly delivers practical value and solves real customer problems.
If we then combine risk metrics and remediation workflow, we will have the Nirvana of
security risk management. After all - Stuart claimed - if you cannot measure it, you
cannot manage it!
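A toy illustration of the prioritization Stuart described might look like this, taking
risk as the product of asset criticality, threat likelihood, and vulnerability severity
(the 1-5 scoring scale is an assumption for illustration) and applying the 80:20 cut:

    # Toy sketch of risk-based prioritization. The data and the 1-5
    # scoring scale are assumptions for illustration only.
    findings = [
        {"asset": "billing system", "criticality": 5, "threat": 4, "severity": 5},
        {"asset": "intranet wiki",  "criticality": 2, "threat": 3, "severity": 4},
        {"asset": "build server",   "criticality": 4, "threat": 2, "severity": 3},
    ]

    for f in findings:
        f["risk"] = f["criticality"] * f["threat"] * f["severity"]

    # 80:20 rule: a small top slice of findings usually accounts for most
    # of the risk, so address the highest-scoring items first.
    ranked = sorted(findings, key=lambda f: f["risk"], reverse=True)
    total = sum(f["risk"] for f in ranked)
    running = 0
    for f in ranked:
        running += f["risk"]
        print(f"{f['asset']}: risk={f['risk']}")
        if running >= 0.8 * total:
            break  # the remaining items fall below the 80% cut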
Several attendees questioned Stuart's conclusion in the last bullet of his last slide.
Trusted eGovernance - Security as a Business Enablement Strategy
Jacques Francoeur - CEO, Trustera Inc. - gave a presentation
on e-governance. He noted that enterprise e-governance has rapidly become a critical
business driver for the Board of Directors and Senior Executive Management.
He first defined what he means by a trusted electronic enterprise - it requires
e-Integrity, e-Compliance, and e-Enforceability. From the CEO's perspective, the goal is
for the digital world to map to the traditional business model for trust in the business.
Revenue performance and cost control have always been the main business drivers; with
regulation and the penalties for non-compliance now impacting governance, accountability
has joined them as an increasingly important third imperative. He presented an assurance
model, motivated by the penalties for non-compliance with the Sarbanes-Oxley Act and
future regulations such as the Corporate Information Security Accountability Act. These
Acts will hold CEOs and CFOs legally
accountable for the financial integrity of their organization and for the protection of
its critical information assets and systems. Electronic Signature laws now enable
organizations to make the transition to being an electronic enterprise, providing for
great efficiency and effectiveness gains. In fact, many industry regulations are driving
the transition. However, the potential for misrepresentation, information falsification,
and repudiation of decisions and acts are far greater in the electronic world.
Trusted Electronic Enterprise Governance is a vision and strategy of ensuring the
electronic integrity, enforceability, and compliance of an electronic enterprise, its
business models, and its processes. Today the key challenge is all about accountability,
and how to manage it. Jacques perceived a shift of accountability inward down the
value-chain of an organization, and claimed that security is at the core of this model of
who is accountable for doing what, and when. The value chain from the center of this model
can also be represented as working its way up to the top - at enterprise governance
management level. Security needs to be positioned strategically within enterprise
governance. In order for electronic governance to be worthy of trust by its stakeholders,
the traditional purview of security must be enriched and elevated to a new form. In this
new form, which Jacques called Enterprise Digital Trust Management,
security is a Governing Board and senior executive-level business enablement
strategy, which must be core to the organization's business integrity and
competitiveness.
Discussion focused on what the product of this offering is. Jacques represented his
presentation as a holistic approach to how security should be re-positioned.
Gerlinde noted that businesses tend to be loosely coupled, so accountability is a much
more complex issue than Jacques' model would suggest. Jacques agreed, but proposed that
the solution is to ensure an even more rigorous chain of accountability. Gerlinde also
observed that developers can erase all audit trails in their software; so if we have a
malicious internal developer, how can this threat be resisted? The answer to this issue
was to put in place effective tamper-proof audit and archive controls.
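One common technique for making such audit controls tamper-evident (an illustration only;
Jacques did not specify a mechanism) is a hash chain, in which each record binds in a
digest of its predecessor, so that any later edit or erasure breaks verification:

    # Illustrative hash-chained audit log: each entry binds in the digest
    # of the previous entry, so altering or erasing an earlier record
    # invalidates every later digest. (Sketch only; a real deployment
    # would also write entries to write-once or off-host storage.)
    import hashlib, json

    def append(log, event):
        prev = log[-1]["digest"] if log else "0" * 64
        digest = hashlib.sha256(
            json.dumps({"event": event, "prev": prev}, sort_keys=True).encode()
        ).hexdigest()
        log.append({"event": event, "prev": prev, "digest": digest})

    def verify(log):
        prev = "0" * 64
        for r in log:
            expected = hashlib.sha256(
                json.dumps({"event": r["event"], "prev": prev}, sort_keys=True).encode()
            ).hexdigest()
            if r["prev"] != prev or r["digest"] != expected:
                return False
            prev = r["digest"]
        return True

    log = []
    append(log, "developer X modified payroll module")
    append(log, "release 1.2 deployed")
    log[0]["event"] = "nothing happened"   # a malicious edit...
    print(verify(log))                      # ...is detected: False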
An Advanced Approach to Application Security Assessment and Testing
Kapil Raina - Senior Product Manager, Cenzic - considered (see presentation)
that web application security remains the weakest link in the enterprise security chain.
More and more application functionality is being rolled out on corporate web sites,
creating increasing risk and increased liabilities for corporations. Existing approaches
to application security are primitive and relatively inconsistent. CVE (Common
Vulnerabilities and Exposures) scanning, manual penetration testing (pen testing), code
scanning, and QA tools are all good starts, but provide inadequate coverage. These
approaches provide a false sense of security without any measurable return on investment
(ROI). Kapil gave a top-ten list of examples of web vulnerabilities, to indicate the type
of problems that users want fixed. Challenges with web application vulnerability
assessment include accuracy, performance, usability, and more. Kapil took accuracy as a
specific example of the issues he wished to discuss.
Kapil pointed to new directions for web application security assessment and testing.
These emerging trends combine the automation advantage of CVE testing and the flexibility
of pen testing into a single, comprehensive framework. Efficiency (and ROI) is gained
through the focus on the creation of re-usable security tests that can be leveraged over
multiple releases and multiple applications. When combined with the development of
application security models, these tools provide the most comprehensive security
assessment and test coverage possible.
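A hypothetical sketch of that reuse idea: a security test written once against an
abstract interface (here a naive SQL-injection probe; all names and checks below are
invented for illustration) and run unchanged over multiple applications or releases:

    # Hypothetical sketch of a reusable web-application security test:
    # written once against an abstract "submit a form field" interface,
    # then leveraged across applications and releases.
    SQLI_PROBE = "' OR '1'='1"  # a classic (naive) SQL-injection payload

    def test_sql_injection(app):
        """Reusable test: passes if the app rejects the probe input."""
        response = app.submit("username", SQLI_PROBE)
        return "error" not in response.lower() and "syntax" not in response.lower()

    class LegacyApp:                       # stand-ins for real applications
        def submit(self, field, value):
            return "SQL syntax error near ''1'='1'"    # vulnerable behavior

    class PatchedApp:
        def submit(self, field, value):
            return "Invalid input rejected"            # sanitized behavior

    # The same test runs unchanged over multiple releases/applications.
    for name, app in [("release 1.0", LegacyApp()), ("release 2.0", PatchedApp())]:
        verdict = "pass" if test_sql_injection(app) else "FAIL"
        print(f"{name}: {verdict}")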
Next Steps, First Projects
Mike Jerbic reviewed the set of presentations that we had received during this meeting
and led discussion on what conclusions we should draw from them. What value do we see a
need for, and which of those needs do we want to address? He summarized the presentations
and the key issues he identified in them:
- OCTAVE - an interesting methodology for identifying vulnerabilities and relating them to
NIST-FISMA, ISO 17799, or any other compliance methodology.
- Legal presentations - gave some new input, but not a lot that we did not already know.
The outsourcing presentation did note that outsourcing to low-cost areas can be
profitable, but carries the risk of losing control and being unable to recover your
intellectual property rights.
- Qualys presentation - what sort of standard metrics can we develop across the industry?
- Foundstone presentation - perhaps we could get the Gartner perspective to help flesh out
the issues Stuart McClure raised.
- Trustera presentation on the digital chain of trust - accountability is a key issue in
governance models so represents a lever to set the right IT security priorities in
business organizations.
- Cenzic presentation on application security assessment and testing - what VM issues does
it highlight?
Art Robinson asked if we should first get our terminology clear - should we be talking
about Vulnerability Management at all? Or are we addressing Risk Management? He noted that
there is an existing NIST Risk Self-Assessment document (SP500-53) available, and also two
NSA risk documents, all of which would help us make a good start.
Eliot Solomon felt we should aspire to become thought-leaders. Bob Blakley noted we
could choose to work on several topics in the VM space to develop standards, guides,
processes, and metrics; it is not yet clear which of these have the best starting
platform, but good starting documentation seems to exist and there are some tools to help.
On the legal side maybe the ABA would be a good partner with us in progressing this topic;
however, our experience with The Open Group Active Loss Prevention Initiative (ALPI)
demonstrated that it is probably best to go to the ABA rather than try to bring them into
The Open Group - even if it is the Security Forum. Bob believed that the public policy and
regulatory approach will largely determine what regulatory regime we will ultimately be
compelled to comply with, so we must accept it as a key driver for all work we do on
vulnerability and risk management.
Eliot suggested our most valuable contribution would be on how to bridge the gap
between software developers and security technology and engineering practice, to enable
developers to write software that is more secure, robust, etc. through the entire
lifecycle, preventing vulnerabilities in new software and addressing vulnerabilities in
legacy software.
Mike proposed that on the operational and management side of VM, a current pain point
is compliance with regulatory requirements - do we want to do any work in this area? Craig
suggested not, though his customers might be interested in some kind of trust mark, as
well as rights management. Mike asked for reactions to the potential for providing risk
and vulnerability certification assessments, and for developing a metrics methodology that
produces consistent results.
Steve Whitlock said he was uncomfortable with the semantics of today's discussions and
a risk vocabulary would be useful. We could use ALPI's embryonic Risk Vocabulary draft
as a basis for this.
Bob thought that we could work on guidance for security modes - non-functional, not
line-of-business, integrated into products, intrusion-tolerant, security-fault-tolerant.
Many present showed interest in Bob Blakley's suggestion that the group focus its
attention on issues related to providing fall-back modes as system resources fail. That
goal would, by its nature, require us to take an integrated approach to addressing
detection and recovery from system security, dependability, and safety stresses. It is
also likely to drive us to more detailed consideration of resources that humans will need,
to contribute to recovering to system states that can still fulfill the basic requirements
of critical system services.
From this discussion, Mike and Ian took an action to map out problem statements for each
area where a proposal appears to exist - aiming to give suitable visibility to possible
value-add VM projects - and to evaluate the nature and size of the resources each would
need to deliver acceptable value-add.