Attendees
Bob Blakley, IBM, blakley@us.ibm.com
Ian Dobson, The Open Group, i.dobson@opengroup.org
Gerhard Eschelbeck, Qualys Inc., geschelbeck@qualys.com
Ben J Halpert, Lockheed Martin, benjamin.j.halpert@lmco.com
Tony Higgins, Legato, thiggins@legato.com
Mike Jerbic, Trusted Systems Consulting, mjerbic@trustedsystemsconsulting.com
Jacqueline D Knoll, Boeing, jacqueline.d.knoll@boeing.com
Narender Mangalam, AIG, narender.mangalam@aig.com
Art S Robinson, s/tdc, artrobinson@stdc.com
Ron Ross, NIST, rross@nist.gov
Eliot M Solomon, Eliot M Solomon Consulting, eliot@eliotsolomon.com
Gary Stoneburner, NIST, stoneburner@nist.gov
Stephen J Swenson, US Naval Undersea Warfare Center, swensonsj@npt.nuwc.navy.mil
Dennis Taylor, NASA SEWP, dtaylor@sewp.nasa.gov
Manny Vlastaks, DISA, vlastake@ftm.disa.mil
Steve T Whitlock, Boeing, stephen.whitlock@boeing.com
Introduction
Mike Jerbic welcomed all present to the meeting. He recalled the
invitation-only exploratory EVM Initiative meeting held in Boston on 24 July 2003, where the attendees concluded that:
- Users don't care why systems fail - they are only concerned that a failure has happened, and they demand restoration of service without delay.
- There are significant vulnerabilities outside of
security. Other disciplines have much to offer - Safety, Security, and Dependability are
the key watchwords that describe the essential disciplines involved.
- We should apply knowledge in these three fields to the
general-case problem of system vulnerability.
Mike introduced the agenda for this meeting (see slides) and welcomed the presenters, noting that they would each set the scene to enable us to decide how to move forward.
Presentation on NIST's FISMA
Dr. Ron Ross (NIST) gave a presentation based on his wider presentation to the Architecture plenary on Monday of this Conference. He focused on the key issues on which EVM depends. NIST is developing certification and accreditation guidelines for FISMA-regulated systems. These are relevant and valuable for the private sector, especially the heavily regulated private sector.
- Today's climate - a highly interactive environment of powerful computing devices and interconnected systems across global networks; federal agencies routinely interact with industry, private citizens, state and local governments, and the governments of other nations; the complexity of today's systems and networks presents great security challenges for both producers and consumers of information technology.
- The links in the whole chain involve non-technology as well as technology components.
- FISMA (the Federal Information Security Management Act of 2002) tasks NIST with implementing its requirements.
- The US Office of Management & Budget (OMB) Circular A-130 (Management of Federal Information Resources) requires all Federal Agencies to plan for security, ensure that appropriate officials are assigned security responsibility, and authorize system processing prior to operations and periodically thereafter.
Ron's slides summarize NIST's project objectives to comply with these responsibilities,
and the significant benefits that are expected to accrue as a result of successful
implementation. His final slide summarizes the "big picture" for his Information
Security Program, which embraces:
- Risk Assessment
- Security Planning
- Security Control Selection and Implementation
- Security Authorization
- Verification of Security Control Effectiveness
- Categorization of Information and of Information Systems
Ron emphasized that their priority is to direct funds to the issues that have the most impact, so the verification component in this big picture is a vital tool for ensuring effectiveness. They are working to hard deadlines, following a major change of focus toward the risk to systems that are core to the mission of key services and their continuity. Flaws that can be exploited are vulnerabilities, and the highest-impact (core) vulnerabilities are the ones they will focus on. The risk to information in this context is inseparable from the risk to the mission. It will always remain true that we live in a risk environment; however, we can work "smart" to minimize vulnerabilities in core services, and coverage here must include public-private collaborations between government and commercial enterprises.
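To make the prioritization concrete, the following is a minimal sketch (in Python) of ranking vulnerabilities by expected mission impact. The scoring model, the sample vulnerabilities, and the numbers are illustrative assumptions, not NIST's methodology.

    # Illustrative only: a toy prioritization of vulnerabilities by expected
    # mission impact. The scoring model and sample data are assumptions, not
    # NIST's methodology.
    vulnerabilities = [
        # (description, likelihood of exploitation 0-1, mission impact 1-10)
        ("unpatched core service", 0.6, 9),
        ("default vendor configuration on gateway", 0.7, 8),
        ("weak password policy on internal wiki", 0.4, 2),
    ]

    def risk_score(likelihood, impact):
        """Expected mission impact: a simple likelihood x impact product."""
        return likelihood * impact

    # Spend remediation effort on the highest expected-impact items first.
    for name, likelihood, impact in sorted(
            vulnerabilities, key=lambda v: risk_score(v[1], v[2]), reverse=True):
        print(f"{name}: score {risk_score(likelihood, impact):.1f}")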
Phase II is to deliver an accreditation program for verifying an organization's competence to do certification and accreditation (C&A) work. It is primarily aimed at organizations providing certification services for federal systems. It is a three-step process: quality manual planning, proficiency tests, and on-site assessment. Ron and Art asserted that many things help fix system vulnerabilities - and around 80% of them can be fixed by having good configuration out-of-the-box. This statement aligns with the submission that the Security Forum made in its comments on the September 2002 US Critical Infrastructure for Cyberspace draft. Ron noted that, as an initial practical step, he would welcome the Security Forum reviewing NIST's SP 800-53, which should be out next week.
Presentation on ASC's RPI
Narender Mangalam (AIG) gave a presentation on the American Security Consortium (ASC) and its Information Security Risk Preparedness Index (RPI), which the ASC aims to establish as a national rating system for organizational risk preparedness, accepted by the audit and insurance industries. The Big Four audit firms - Deloitte Touche Tohmatsu, Ernst & Young, KPMG, and PricewaterhouseCoopers - are all supportive.
Mike Jerbic explained that the ASC and The Open Group Security Forum Chair initiated a discussion centered on seeking The Open Group's endorsement of the RPI. To this end, the Security Forum conducted a lightweight fast-track review of the RPI, and will consolidate its feedback in a members-only meeting this week.
Narender's presentation covered:
- The history of the ASC.
- The driver is that company officers and directors have liability for compliance with government regulations and case law (Sarbanes-Oxley, HIPAA, GLBA, Basel II).
- The initial goal for the RPI is to provide a risk measurement model and tool that has the support of the Big Four audit firms, to demonstrate due diligence in following best practice for the insurance industry.
- The case for establishing a quantitative measure of risk preparedness (as opposed to the Baldrige "quality" approach).
- What the ASC's RPI is, its current target applicability, and its development team (which includes AIG together with the Big Four audit firms).
- The ASC's plans to launch the RPI, and its goals for endorsements and support.
Narender said the ASC plans to launch the RPI in November 2003.
Discussion on the RPI noted that the weightings applied in the calculation will heavily skew the resulting numbers, so these weightings must be validated to give the RPI credibility. This suggests that the algorithms used in the RPI should be disclosed, so that those being measured can have confidence in the resulting quantitative measurement. Mike undertook that the Security Forum would provide its feedback on the RPI within the next week, and would seek to work with the ASC.
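To illustrate why the weightings matter, the following is a minimal sketch (in Python) of a weighted preparedness index. The RPI's actual algorithm has not been disclosed; the categories, scores, and weights here are hypothetical assumptions, not the ASC's model.

    # Hypothetical weighted-index sketch: the categories, scores, and weights
    # are invented for illustration and are not the ASC's RPI algorithm.
    def preparedness_index(scores, weights):
        """Weighted average of per-category scores (each 0-100)."""
        total_weight = sum(weights.values())
        return sum(scores[c] * weights[c] for c in scores) / total_weight

    scores = {"policy": 90, "technology": 40, "incident_response": 60}

    # The same assessment yields very different index values under different
    # weightings, which is why the weights need independent validation.
    weights_a = {"policy": 0.50, "technology": 0.25, "incident_response": 0.25}
    weights_b = {"policy": 0.20, "technology": 0.60, "incident_response": 0.20}

    print(preparedness_index(scores, weights_a))  # 70.0
    print(preparedness_index(scores, weights_b))  # 54.0

The same organization scores 70 under one weighting and 54 under the other, so without validated, disclosed weights the single number carries little meaning.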
Presentation on Concepts of EVM
Dr. Arthur (Art) Robinson gave a presentation
on the concepts of Enterprise Vulnerability Management, including a Security Development
Lifecycle:
- An open non-proprietary approach to EVM is needed, to facilitate
cooperative government-industry-academia participation in addressing critical system
missions.
- Assessments of system vulnerabilities need an open, holistic
end-to-end approach.
- Multiple application environments need to be addressed with a multi-disciplinary approach involving dependability, safety, and security.
- System VM requires additional processes to address stress and
stress effects, to ensure we arrive at the improvement points we aim to reveal.
- Evolving government C&A processes (from NIST) support
dependability, safety, and security VM processes.
- A documented body of Case Evidence will demonstrate the validity
of this EVM approach.
In discussion, it was brought out that conventional system upgrade processes often do not address important vulnerability management issues. Risk is hard for people to buy into because it is not easy to measure, and so not easy to take seriously, though the visible effects of such events as the huge power failure in the NE US and Canada in August 2003 did highlight the vulnerability issues.
Presentation on EOIF
Tony Higgins (Legato) gave a presentation
on the Electronic Original Initiative Foundation (EOIF) and its advancement of the
requirements and best practice guidelines for admissibility of e-legal documents for
formal audit and evidential purposes. In his presentation:
- Tony first described the EOIF organization, its origins, its structure, and its objectives.
- He listed its commercial sponsors, and its industry associations.
- He identified the duty of care requirements involved in record
retention of company documents, particularly of financial records, for evidential purposes
for audit, and in the event of legal proceedings.
- He noted that government regulations and professional body best practices are not uniform across different geographical regions and jurisdictions, and that they frequently omit coverage of the duty-of-care aspects.
- The EOIF seeks to address the retention of electronic records so as to respond to regulatory compliance issues and e-legal evidence issues, and also to promote international harmonization of standards and regulations.
- He listed record-keeping system requirements, the challenge of establishing the "authenticity" of electronic documents (a simple integrity-check sketch follows this list), what constitutes "usable evidence", and what is needed for acceptable e-document management, accessibility, and security.
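As a concrete illustration of one aspect of "authenticity" (detecting alteration of a stored record), the following is a minimal sketch (in Python) using a cryptographic hash. This is an assumed illustration of one building block only, not the EOIF's specification, which also addresses signatures, trusted timestamps, and audit trails.

    # Minimal integrity-check sketch for a stored electronic record.
    # A hash alone does not establish who created the record or when; it is one
    # building block alongside signatures, trusted timestamps, and audit trails.
    import hashlib

    def fingerprint(record_bytes: bytes) -> str:
        """Return a SHA-256 digest that can be stored separately from the record."""
        return hashlib.sha256(record_bytes).hexdigest()

    original = b"Invoice 2003-1147: total USD 12,500.00"   # hypothetical record
    stored_digest = fingerprint(original)

    # Later, on retrieval, recompute and compare to detect any alteration.
    retrieved = b"Invoice 2003-1147: total USD 12,500.00"
    assert fingerprint(retrieved) == stored_digest, "record has been altered"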
Tony summarized the needs and benefits of addressing the whole
subject of e-document retention and admissibility in a holistic manner. The EOIF goal is
to do exactly this.
It was noted that the British Standards Institution has published best practice guidelines for e-document retention, so this work is not starting from a blank sheet. There are many other government regulations and professional institute practices that have established islands of legal acceptance, and all of these need to be brought together and reconciled into a single best practice standard. Much of the work required is to raise the profile of the need to harmonize these best practices by establishing international standards in this area.
Drawing Conclusions
Mike Jerbic drew out of the presentations in this meeting (see final slide) that we are spanning three areas with which CxOs are heavily concerned - Risk, Compliance, and Business Performance - and that vulnerability management connects these three areas.
In discussion it was proposed that we could bring the right constituencies together to harmonize best practices, and perhaps to develop information interoperability standards, test suites, and certifications for processes or products - work that this group is well qualified to take on, whether it starts here or elsewhere. Historically it has been difficult to present the business case for security to business managers. This approach seems to connect with business management concerns: harmonizing the NIST standards and the RPI with commercial industry, and using the VM lifecycles to architect and implement mission-critical goals. We should include in the business case the cost of not doing security. We have an opportunity to bring an industry perspective to this work, which will gain greater government acceptance and also stimulate outreach to industry and the enterprise. Amid all this enthusiasm, however, we also need to consider what we can do most effectively here: while it is all worthy, we must consider what we can realistically achieve and who we must involve to make the effort successful.
The Security Forum agreed to look critically at what we can and want to do, and produce
a report by the end of this week's Conference, which will then be shared and developed
with NIST, ASC, and EOIF.