
The EA Challenge - Customer Perspective
Allen Brown
Introduction
John Gilligan
Integrated Global Information
Dennis C Moran
Interconnecting the Edges
Fred Riedl
Introducing Open Architectures into Naval Systems
Fatma Dandashi
DoD Architecture Framework
Amy Wheelock
DHS Enterprise Architecture
Ron Ross
NIST Security Certification & Accreditation Project
Jeremy Kaplan
Net-Centric Enterprise Services (NCES)
Vendor Architecture Development
Eliot Solomon
Progress towards the Boundaryless Information Flow Reference Architecture
Bret Greenstein
IBM's On-Demand Strategy
Kazuo Hajikano
TRIOLE: Fujitsu IT Infrastructure
Leo Laverdure
HP Darwin Reference Architecture Framework
Ed Harrington
Boundaryless Directories
Brian Breton
Identity Management: The Next Critical Step
Joseph Sprute
Remote Access Virtual Environment (RAVE)

Terry Blevins / Bill Estrem
So What? Facilitated Q&A Session

Commercial Architecture Development
Allen Brown
Introduction
George S Paras
Emerging Industry Best Practices for EA
Mike Baker
Architecture for an Adaptive Enterprise
Jonathan Willey
Architecture in the ERP Domain
Andras Szakal
Architecting an On-Demand Enterprise using FEA
Mark W Maier
Architectural Elements for Breaking Boundaries

PLENARY: Boundaryless Information Flow
and Enterprise Architecture

This Conference Plenary is focused on one aspect of the work of The Open Group membership: Enterprise Architecture. It is designed for enterprises at all stages of creating an enterprise architecture, from developing the business scenarios, gaining buy-in, developing the architecture framework, through to developing an architecture. Additional considerations brought about by the practical concerns of mobility, security, real-time and embedded systems, and more are presented.

Session 1 (Day 1, Monday October 20):
The Enterprise Architecture Challenge - Customer Perspective

With key speakers from the US government and military, this session focuses upon understanding the challenges shared by organizations when addressing enterprise architecture, and the strategies they have used for addressing those challenges.

Introduction

Allen Brown, President and CEO of The Open Group

Allen welcomed all present to this conference, and introduced the speakers.

Enterprise Architecture: Integrated Global Information

John Gilligan, Air Force Chief Information Officer and member of the Senior Executive Service, Washington D.C.

John said he wanted to share the US Air Force's view on architectures. He told the joke about Sherlock Holmes and Dr Watson, in which Watson noticed the sky was visible but did not take in the reality that he could only see the sky because someone had stolen their tent. He observed that architecture has similar characteristics - how do we use architecture to drive our decisions, rather than miss the value of the messages it carries? Architecture helps us deal with the problem: access to information all the time, seamless information exchange, and responsible stewardship of data. In our current environment we have a lack of integration; what we need is a network-centric infrastructure where data can be shared and used effectively to drive decision-making. The big problem architects and engineers now have to solve is not why systems and enterprise architecture are a good thing, but how to translate the information that enterprise architecture provides into forms that are digestible and useful for informing good decision-making. Architecture is not valuable unless it informs decision-making.

The Federal CIO Council integrates the Office of Management & Budget (OMB) and CIO Council architecture efforts, to develop a consistent taxonomy and terminology, facilitate cross-agency efforts, and oversee "operationalization" of architecture efforts to forge a common understanding of architecture frameworks and structures across the federal government agencies. Architecture must inform decision-making in three key areas - requirements, budget, and design and development. The relationships in this federal CIO Council bring together Governance, Components (repository of qualified components representing capabilities), and Emerging Technology (qualified technology that enables implementation of solution components), identifying where these core entities sit and how they fit in and link to IT architecture - the business reference model, the systems components reference model, the technical reference model, and the data reference model.

Within the US DoD, a DoD/Joint Enterprise architecture has existed for several years. It includes the Global Information Grid (GIG) and the Business Enterprise Architecture (BEA). Below it is the Air Force Enterprise Architecture. While the low-level architectures are not proving difficult, it is the top-level architectures that are proving difficult to do in ways that yield valuable assistance in decision-making - to help with this problem they are deploying visualization techniques. The top decision-makers do understand the importance of architecture; however, they continue to find it hard to use the outputs from architecture work to inform their decision-making.

Looking more closely at the Business Enterprise Architecture, the DoD has contracted IBM to help develop this. A key part of this is to get into each stovepipe by integrating it using an enterprise process view. All this work continues to leverage existing work - not to develop new architectures. This contract was mandated to deliver solutions that demonstrably inform decision-making, within 60 days (a very short time deliberately chosen so the project can only focus on the top-level solutions space), and that time is up at the end of October 2003. The end game is to get the right information to the right people at the right time and presented in the right way, so that decision-makers can make the right (informed) decisions.

In answer to questions, John explained that plain English did not prove adequate to describe architectures, because engineers use different terminology to describe what is essentially the same thing in different contexts. He also repeated the DoD experience that all enterprises - small, medium, and large - need architecture, not for its own sake but to inform decision-making; system architects are not doing themselves any favors unless they urgently solve this large and present difficulty: how to translate architecture information into forms that decision-makers can use.

Interconnecting the Edges

Brigadier General (P) Dennis C. Moran, Director of Information Operations, Networks and Space, Headquarters Department of The Army

As a consumer of architectures, he said the US Army is a victim of the use of architectures. He endorses everything John Gilligan said and will build on John's Air Force approach in presenting the Army approach. The Global C4 challenge is to absorb and use information in a seamless integrated way to inform good decision-making. In his context the most important entity in the whole GIG system is the soldier - "Space to Mud"; "Factory to Foxhole". The soldier needs a common relevant operational picture. The sustaining base is an Army Knowledge Enterprise Architecture (AKEA) to enable the Army Knowledge Management (AKM) system. They have identified five pillars in the AKM, each breaking down into lower-level components. For the battlefield environment, the US Army are using a new combat system architecture, at the center of which is a Network-Centric Force. From the communications viewpoint, they use a joint tactical radio system (JTRS) and a warfighter information network - tactical (WIN-T). The danger to avoid here - and Dennis used a cartoon slide to show all the information systems and communications facilities that are potentially involved - is overburdening the soldier in the field with all the technology equipment involved in making all this information available to him/her.

Returning to the initial GIG chart, Dennis noted that there are sets of common architectures that enable (and do not inhibit) interoperability between all the armed services. A key goal is to identify these areas of commonality to merge and share solutions between all the armed services so that interoperability and resultant cost savings can be achieved.

A questioner asked how the differences in architectures across the many projects currently underway will be resolved - before or after deployment? Dennis said there is no simple answer to this question; no-one claims the multiple projects involved will be easy to coordinate. Nevertheless, awareness of the end-goal is at the back of everyone's mind when evaluating these outcomes, and no project contributing to this architecture work should underestimate the determination of the DoD to make architecture work so that it informs decision-making.

Introducing Open Architecture into Naval Systems

Fred Riedl, Open Architecture, Program Executive Office, Integrated Warfare Systems, United States Navy

Fred Riedl, Program Executive Office for Navy IWS, said he aims to explain how the Navy is getting into open architectures. The Navy is with the Air Force and the Army in wanting the same things from investing in architecture - to inform good decision-making. His PEO is a newly-created office, representing a user community tasked with providing an open architecture across all branches of the Navy - air, surface, and underwater. Their approach is bottom-up as well as top-down. Some of the top-down approach is driven by the business case as well as by technical complexity and the fact that much of their large fleet uses different architecture solutions; they are looking for commonality in these solutions as the fleet modernizes and evolves. This commonality is a goal not just across the Navy, but across all the armed services where the opportunity arises. They understand that improved warfighting capability depends on open architecture. The US Navy's open architecture comprises a navy-wide technical architecture and functional architecture. These embrace the GIG concepts shared with the Air Force and the Army. Broad participation with industry partners is an important part of this approach. Their objective is a common and reusable base of IT components, enabled by the significant efficiencies derived from using standards - decoupling the computing environment from applications and allowing affordable technology refresh. Certification and testing are vital pieces of this work, ensuring that the required functionality, performance, and robustness result.

Their open architecture strategy is an enabler to get to where the Navy wants to go. It is focused on modernization (levels 3 & 4) and new construction (level 5) areas, leaving legacy systems (levels 1 & 2) to follow where they can be fitted. The Navy's integrating component on its open architecture is its FORCEnet, which has open architecture links to all the functional areas in its warfighting model.

In response to a question on JTA overlap with other areas, Fred said the Navy published three documents in March 2003 to recognize these overlaps and indicate their aim to resolve them across all the armed services. The US Navy also has an open architecture test facility to exercise and validate conformance to open standards and real-time performance requirements, and considers this facility vital to its operations. Regarding situations where there are gaps in available standards - dynamic resource management, for example - they look to encourage development of standards to fill these gaps, but in the interim they define a set of requirements which they expect will become part of an eventual standard.

DoD Architecture Framework

Dr Fatma Dandashi, Architecture Framework Director, Office of the Assistant Secretary of Defense for Networks and Information Integration, DoD CIO

Dr. Fatma Dandashi, Mitre, gave an outline of the DoD Architecture Framework and the history and policies that underlie it. It is needed to bring all DoD warfighter systems into an interoperable enterprise architecture for sharing information, the framework being the uniform common way that the architectures for all DoD systems must be modeled. To be eligible for federal funding, IT developments must conform to the DoD architecture frameworks, and must also conform to DoD policy on using architectures. Fatma gave a brief history of the evolution of the C4ISR framework into the now-approved DoD Architecture Framework. She noted the alignment with the IEEE and TOGAF definitions of architecture frameworks. The DODAF provides a common approach for defining and implementing DoD architectures. It defines three views - operational, systems, and technical standards - for interoperability. The four product formats involved are graphical, textual, tabular, and taxonomy and dictionary relationships, all used to capture, communicate, and analyze information.

DODAF comprises:

  • Volume 1 - definitions, background, and guidelines, aimed at decision-makers and managers.
  • Volume 2 - product descriptions - definition, purpose, detailed description, UML representation, data elements definitions, CADM support. This volume is aimed at the product manager, the architect and engineering team, and the architecture data modelers.
  • Volume 3 - the Deskbook on Architecture Guidance.

These documents are available online at http://aitc.aitcnet.org/dodfw/. They have developed a capabilities-based methodology, to map where resources are being used and where they need to be applied.

Future evolution areas include development of a common ontology of architecture elements, work to address baseline and objective (target) architectures, and use of architectures to measure effectiveness.

Q: Bill Estrem - how do you translate the architecture information?
A: Agree it's hard to translate from geek-speak to useful information for decision-making. It's not easy to present examples at this time, but it is a main objective.

Q: How does this relate to TAFIM?
A: TOGAF is derived from TAFIM. DISA gave TAFIM to The Open Group some seven years ago.

Q: What problem is this DODAF actually trying to solve?
A: Cannot continue to operate with separate systems when these systems have to share information not only between each US military service but also internationally with coalition forces, so this interoperability problem has to be solved and the DODAF is the way that is being used to solve it.

Q: Can this DODAF be brought into the TOGAF work in The Open Group?
A: There is no real reason why not on a standards basis; however, it is being driven as a matter of high priority within the Mitre team and will necessarily remain there until the urgent goals have been achieved.

Q: Are you going to make a distinction between mission-critical/safety-critical versus non-critical systems?
A: There is no such distinction in this DODAF work at present.

Comment: There is a common misunderstanding between what a framework is versus what an architecture is - a framework says you need a consistent mechanism for representing architectures; it does not define the architecture.

Q: Is the move to using UML to define architectures a good one, bearing in mind UML takes a lot of space and effort compared to other tools which take much less space and time?
A: Agree you need to know what you want to achieve with these definitions. Maybe a business process modeling approach will prove better in the long run.

DHS Enterprise Architecture

Amy Wheelock, Office of the Chief Information Officer, Department of Homeland Security

The challenge when she started on 1 March 2003 was to produce an enterprise architecture for the DHS in four months. They had to start with "what is", and have a transition plan to move to a "what is to be" architecture - covering business, data, application, and technology. The results of this challenging project are summarized in Amy's presentation. In such a timeframe they had to focus on clear objectives, and these were the business benefits.

The business architecture is a value chain that maps to a business model. They have identified about 80 items in the chain, having integrated many agencies' operations and squeezed out the duplications. The data (information sharing) architecture has to accommodate three communities of interest - DHS internal, shared with external, and externally-owned.

They approached the target architecture for applications using a "CURE" process. Their technical architecture approach was through patterns, models, and profiles. All this led to the concept of the virtual application, viewed from the user's viewpoint, which helps to validate the output.

The transition strategy proposes a sequence:

  • Conceptual projects and associated capabilities are enabled over time.
  • Each conceptual project aligns to objectives.
  • Each conceptual project is sequenced based on its alignment to objectives.
  • The detailed sequence strategy displays all sub-projects required by the conceptual project.
  • Conceptual projects provide target capabilities and include other non-technical projects focused on business process and organizational improvements.

This strategy is the first step in developing a detailed transition plan. Their enterprise architecture products fall into three areas: business model, target enterprise architecture, and transition strategy. Amy's slide on this details the components involved under each.

Q: Did you consider applying existing change management solutions to the DHS EA?
A: Yes. Because of the timeframes involved, they wanted to get the first version of the DHS EA out, but they are now very much involved with working out how to manage their EA.

Q: To what extent does governance play a role in change management?
A: From her background in the US Immigration Service, her experience is that where the Service put good governance processes in place, it gained credibility and therefore better funding to improve its service. She therefore believes that good governance is a critical aspect that must be established.

Q: While pleased to see a DHS EA model, when standardizing, are DHS going to mandate a specific technology tool (e.g., MS Exchange for messaging) or standardize on open standards tools?
A: This is a difficult question to answer - open standards are not yet proven to reduce costs and improve interoperability in a business case analysis. This issue has yet to be worked through.

NIST Security Certification and Accreditation Project: An Integrated Strategy Supporting FISMA

Dr. Ron Ross, Senior Computer Scientist and Information Security Researcher, National Institute of Standards and Technology (NIST)

Ron explained that, in today's climate of very powerful information systems in place at national and state level and among the enterprise business partners who support governmental operations, together with the US federal reorganization that brings many federal agencies under the one Homeland Security umbrella, NIST is developing certification and accreditation guidelines for the Federal Information Security Management Act (FISMA) of 2002. These are relevant and valuable for the private sector, especially the heavily regulated private sector.

  • Today's climate - highly interactive environment of powerful computing devices and interconnected systems across global networks; federal agencies routinely interact with industry, private citizens, state and local governments, and the governments of other nations; the complexity of today’s systems and networks presents great security challenges for both producers and consumers of information technology, particularly their vulnerability to attack.
  • Also in the current IS climate, there is massive cooperation and interdependencies between the public and private sector, all of whom use much of the same technology and applications. The vulnerability and security problems are common to all, and the sophistication of the attackers is increasing, so in this climate it makes sense to cooperate and share the responses we need to make to combat potential attacks.
  • The links in the whole security chain involve non-technology as well as technology components. Risk assessment is key to understanding vulnerabilities in the chain. Every link in the security chain has to be robust, and Ron's slide on this lists the key elements in both the non-technology and technology categories. Bearing in mind that adversaries attack the weakest link, Ron asked: where is yours?
  • The Federal Information Security Management Act (FISMA) of 2002 sets standards to be used by Federal agencies to categorize information and information systems in specific areas. It tasks NIST with the job of implementing its requirements.
  • The US Office of Management & Budget (OMB) circular A-130 (Management of Federal Resources) requires all Federal Agencies to plan for security, ensure that appropriate officials are assigned security responsibility, and authorize system processing prior to operations and periodically thereafter.

Ron's presentation slides summarize NIST's project objectives to comply with these responsibilities, and the significant benefits expected to accrue from successful implementation. There is no way we can achieve full security; however, we can minimize vulnerability to attack and failure. The approach is to ask which controls will have the greatest impact on minimizing vulnerabilities across the IT critical infrastructure: identify those vulnerabilities, decide which controls are most effective, implement them, check/audit and verify their effectiveness, and feed back lessons learned to refine understanding of the vulnerabilities and make the controls more effective - then repeat the cycle of measuring effectiveness and feeding back lessons learned. The essence of this approach is that funding is tight, so the limited resources will be put towards addressing those vulnerabilities seen as most critical to maintaining the mission of each area of the US government critical infrastructure.
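The prioritization loop Ron described - spend a limited budget on the controls that most reduce critical vulnerabilities, then re-measure and repeat - can be sketched as a simple greedy selection. This is an illustrative model only; the control names and scores below are invented, not NIST data:

```python
def select_controls(controls, budget):
    """Greedy pick: fund controls with the best impact-per-cost ratio
    until the budget is exhausted. controls: list of (name, cost, impact)."""
    chosen, spent = [], 0
    for name, cost, impact in sorted(
        controls, key=lambda c: c[2] / c[1], reverse=True
    ):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical controls: (name, relative cost, vulnerability impact)
controls = [
    ("patch management", 3, 9),
    ("user training", 2, 4),
    ("intrusion detection", 5, 8),
    ("audit logging", 1, 2),
]
print(select_controls(controls, budget=8))
# -> ['patch management', 'user training', 'audit logging']
```

In the full cycle Ron described, the verify-and-feed-back step would then update the impact estimates from audit results, and the selection would be run again in the next funding round.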

Today's challenges are to adequately protect information systems within constrained budgets, and to change the current culture of connecting first and checking security later. We need to bring standards for security controls in information systems, and verification procedures to assess their effectiveness, into our everyday thinking. Building more secure systems requires well-defined system-level security requirements and security specifications, well-designed component products, sound practices, etc.

We also need supporting tools and programs - building more secure systems is enhanced by standardized security requirements and specifications, component-level product testing and evaluation programs, security implementation guidance, test and certification.

Ron's "big picture" slide for his Information Security Program embraces all the components that he has introduced in this presentation:

  • Risk Assessment
  • Security Planning
  • Security Control Selection and Implementation
  • Security Authorization
  • Verification of Security Control Effectiveness
  • Categorization of Information and of Information Systems

He exemplified the importance of understanding this in practical terms - for example, in recent weeks when 50 million people in the northeast United States and eastern Canada lost all electric power supply in nine seconds. He emphasized their priority is to switch funds to address the issues that have most impact. The verification component in this big picture is a vital tool for ensuring the controls' effectiveness. They are working to hard deadlines, having had a major change of focus to address the risk to systems that are core to the mission of key services and their continuity. Flaws that can be exploited are vulnerabilities, and the highest (core) vulnerabilities are the ones they will bring resources to bear on solving. The risk to information in this context is inseparable from the risk to the mission. It will forever remain true that we live in a risk environment; however, we can work "smart" to minimize vulnerabilities in core services, and coverage here must include public-private collaborations between government and commercial enterprises.

Phase II is to deliver an accreditation program for verifying an organization's competence to do certification and accreditation (C&A) type work. It is primarily aimed at certification services to federal systems. It's a three-step process: a planning quality manual, proficiency tests, and on-site assessment. Ron asserted that there are many things that help fix system vulnerabilities - and it is estimated that around 85% of them can be fixed by having good configuration out-of-the-box.

Ron explained that he has a more complete set of slides which he will supply on request to anyone present (send him an email) - he finds that other organizations can use these slides to significant beneficial effect when advancing this message.

NCES - Net-Centric Enterprise Services

Mike Krieger, Director for Information Management, DASD, Technical Integration Services, Defense Information Systems Agency

Mike Krieger presented a roadmap of what the DASD considers are key net-centric initiatives. These enable rapid exploitation of diverse data sources by the GIG users in a manner that can be customized to meet specific mission demands. He identified nine core enterprise services and APIs. Information Assurance and Security is embedded in all of them.

He then discussed how transitioning to an enterprise services architecture transforms IT systems development, and how NCES has taken over development funds that were allocated to the Common Operating Environment (COE) to address the approved GIG architecture requirements for common capabilities, noting what NCES currently supports as part of this. He summarized the objectives of the program, the GIG enterprise services scope, and identified all the component GIG enterprises involved.

Mike showed the current status on what is currently being done, and noted that FY04 program new start provides excellent opportunity for input from The Open Group's members. He then presented a timeline for their NCES program. The NSA has the responsibility to provide the Information Assurance/Security component. He noted some of the more important issues that have arisen, and closed with the message that NCES is the US approach to providing the GIG infrastructure needed for timely, secure, ubiquitous edge user access to decision quality information.

Q: Given a simple application like messaging and the complexity this revealed, how will you manage applications?
A: Take the requirements that have been identified and analyze the complexities raised in applications, reconciling the two into tractable solutions.

Q: Bill Estrem - is IPv6 part of your plan?
A: Yes - all new programs have to work with IPv4 and IPv6. By 2008 all have to work with IPv6.
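The dual-stack mandate means applications must handle both address families throughout the transition. As a minimal illustration (not part of the NCES program itself), Python's standard `ipaddress` module can distinguish the two families:

```python
import ipaddress

def ip_family(addr: str) -> str:
    """Classify an address literal as IPv4 or IPv6."""
    return f"IPv{ipaddress.ip_address(addr).version}"

# RFC-reserved documentation addresses, used here purely as examples
print(ip_family("192.0.2.10"))    # prints "IPv4"
print(ip_family("2001:db8::10"))  # prints "IPv6"
```

A dual-stack program branches on this family when opening sockets, so the same code path can serve both IPv4-only legacy systems and new IPv6 deployments.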

Q: How do web services fit into the NCES program?
A: Web services are included where appropriate; the NCES program wants to be standards-based.

Session 2 (Day 1, Monday October 20)
Vendor Architecture Development - Current/Future Concepts

Senior architecture experts outlined vendor architectures that address the problems with Boundaryless Information Flow.

Why should you care? - Progress toward the Boundaryless Information Flow Reference Architecture

Eliot Solomon, Principal, Eliot M. Solomon Consulting, Inc.

Eliot introduced the subject of architecting boundarylessness. We have six short presentations, selected from responses to the call made to members at the previous conference for specific technologies, products, standards, and architectures that address the specific models of boundarylessness. From these presentations, we aim to review the salient features that create "boundarylessness", and make plans for next steps; e.g., standards development, conformance testing, creation of guides, etc.

You should care about this because in every organization you have at least some "boundaryless business" requirements. You need to address the challenge of reducing boundaries, but without having to undo what you already have, and without incurring huge re-engineering costs, and also without having to bend your business to the demands of a specific technology or IT product.

The architectural problem here is that the vendors are addressing boundaryless information flow, but they don’t all do it the same way, and different "solution spaces" often get different approaches. The challenge is deciding how to choose the right approach, and to decide whether to make your approaches align or converge.

Key things to look for are:

  • Architectural approaches that meet your needs... and ones that don’t
  • Models of boundarylessness that address your business model and competitive strategy
  • Ways to unify your existing IT with your (new) boundaryless systems
  • Opportunities for you and your organization to work on boundaryless information flow architectures that work for you.

IBM's On-Demand Strategy

Bret Greenstein, Director of Strategy, e-business on demand, IBM

Bret's sub-title for this presentation is "unlocking the value in your business". Major forces are deeper integration of IT with business, and accelerating advances in technology. At the apex of these is the emergence of industry ecosystems.

Continuing advances in technology (processor, storage, communications, systems) affect the component cost but not the whole enterprise. Also the growth of the Internet is stunning and core to future expectations. Traditional models are no match for current market realities in change, competition, driving down costs, and unpredictable threats. On-demand business equates to creating real added value. So IBM addresses industry-specific on-demand points of view. The transformation requires an approachable, adaptive, integrated, and reliable infrastructure delivering on-demand services for on-demand business operations. These have to be open, integrated, virtualized, and autonomic. The resulting Operating Environment is based on open standards for integration, automation, and virtualization, and Bret expanded on each of these to demonstrate the holistic approach IBM has in its on-demand computing.

TRIOLE: Fujitsu IT infrastructure

Kazuo Hajikano, Director, TRIOLE Business Development Division, Fujitsu Ltd.

Hajikano-san explained that the current trend in information systems is the fusion of systems that were optimized individually. Customers expect of new IT systems: services-oriented integration; development and migration (not replacement); expansion to meet business growth; and reduction of operating costs. Fujitsu's TRIOLE enables all of this. It allows evolution of new product applications, and integration into each platform product. He described how TRIOLE's platform integration model provides tight linkage between products and guarantees the stability of the entire system through system verification. Fujitsu has set up a Platform Integration (Pi) center which uses Pi templates for integration of new functionality into existing systems. Their goal is to provide for up to 80% of customer requirements through use of around 100 Pi templates.

TRIOLE also includes service-oriented integration, comprising business processing integration, information integration, and people integration. To reduce total cost of ownership, Fujitsu have sought to establish a quality chain, from component to system level. TRIOLE also includes a Resource Coordinator to allocate resources for handling variable loads.

HP Darwin Reference Architecture Framework

Leo Laverdure, Enterprise Architect, HP Adaptive Enterprise Program Office

Leo explained that Darwin is an adaptive enterprise architecture framework. Enterprise architecture and EA frameworks are coming of age - they have to be linked in to strategy, planning, EPMO, and financial controls. Change is the hardest part of this problem - due to the Internet, the current IT business environment keeps changing. In this context the two problems are how to stay relevant, and how to be agile enough to change sufficiently rapidly.

The new dimension in business agility requires us to manage cost, improve agility, increase quality, and mitigate risk. To respond to this challenge, Darwin has a simple enterprise model, which Leo transitioned to an Adaptive Enterprise Reference Architecture, and then listed the key principles that apply to it. Leo illustrated the value of overlaying real business scenario use cases to demonstrate the value of their model. He presented the progress chart as moving from stable through efficient to agile, and thence evolution to an adaptive enterprise, and the four key steps towards the adaptive enterprise.

Wrapping up, Leo characterized Darwin as a journey:

  • That is evolutionary
  • That requires a continuous architecture effort (think "city planning") in which the main journey stages are stable, efficient, and agile
  • That requires collaboration and support across IT and the business
  • In which we need to break down unhelpful barriers

The HP Adaptive Enterprise architecture framework focuses on enabling change and balance across cost, quality, risk, and agility imperatives. New investments should enable the journey.

Boundaryless Directories for Boundaryless Information Flow

Ed Harrington, Principal Consultant & CEO, EPH Associates LLC

Ed considered identity management to be key to Boundaryless Information Flow: people authorized to use an information system must first be identified and authenticated. Directories are an enabling technology for identity management - though, given his known background in directories, this will come as no surprise to this audience. Ed listed the business drivers for identity management, and then the requirements for an identity management system. He noted that every identity management system needs its own practice and policy.

Architecting an identity management system requires the following resources:

  • Identity store; e.g., a directory
  • Enterprise logic - for implementing policy and design practice
  • Bought-in products and services - SSO, provisioning, biometrics, administration, federation, etc.

He identified the existing standards for interoperability of identity management products as follows:

  • For directory: LDAP, X.500, and DSML
  • For application interaction: SAML, XACML, SPML
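The split Ed describes - an identity store holding identities, with enterprise logic implementing policy on top of it - can be sketched in a few lines. This is a minimal illustration only: the in-memory dict stands in for a real LDAP/X.500 directory, and all names (the DN, roles, the `authorize` function) are hypothetical, not taken from Ed's presentation.

```python
# Identity store: maps an LDAP-style distinguished name to attributes.
# A real deployment would query a directory over LDAP rather than a dict.
identity_store = {
    "uid=asmith,ou=people,dc=example,dc=com": {
        "cn": "Alice Smith",
        "roles": {"purchasing", "reporting"},
        "active": True,
    },
}

def authorize(dn: str, required_role: str) -> bool:
    """Enterprise logic: the policy that only active identities
    holding the required role may gain access."""
    entry = identity_store.get(dn)
    if entry is None or not entry["active"]:
        return False
    return required_role in entry["roles"]
```

The design point is the separation itself: the store answers "who is this and what attributes do they have?", while the policy layer - which could equally consume SAML assertions or XACML decisions - answers "what may they do?".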

An example of identity management in Boundaryless Information Flow is "Strategic Decision Support" as described in The Open Group's reference architecture paper, and the Directory Interoperability Forum will be presenting this in another meeting this week.

Ed discussed some supply chain management issues in the context of identity management, and concluded that:

  • Identity Management enables personalized services in a boundaryless organization.
  • It is typically just a part, not the whole, of a business solution.
  • It is implemented using an identity store, bought-in products and services, and enterprise logic.
  • LDAP is currently the only reliable interoperability standard for identity management.

Identity Management: The Next Critical Step

Brian Breton, Senior Product Marketing Manager, RSA Security

Brian said that after Ed Harrington's presentation, he could just say "ditto" and end there. However, he did have a particular view on identity architecture issues, and this is what he put forward.

So what is identity? - a user has many identities. When we move identity to the online environment, we need to think of the composition of identity - it comprises identifiers, authenticators, and profiles. The problem with online identity is that we have lots of identities, and in a boundaryless enterprise we are moving towards wanting to consolidate these into one identity for single sign-on through some kind of identity and access management system. Brian offered a definition for identity and access management: the people, processes, and technologies dedicated to creating, managing, and revoking digital identities, as well as developing and enforcing policies governing authentication and access to information systems both inside and outside the enterprise.
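Brian's decomposition of online identity into identifiers, authenticators, and profiles - and the consolidation of many per-application identities into one for single sign-on - can be sketched as a data structure. This is an illustrative sketch only; the type and field names are assumptions, not Brian's or RSA's model.

```python
from dataclasses import dataclass
from typing import Dict, Set

@dataclass
class DigitalIdentity:
    identifiers: Set[str]           # e.g. username, email, employee number
    authenticators: Dict[str, str]  # mechanism -> credential reference
    profile: Dict[str, str]         # attributes used for personalization

def consolidate(identities) -> DigitalIdentity:
    """Merge many per-application identities into one consolidated
    identity, the direction an identity and access management
    system takes for single sign-on."""
    merged = DigitalIdentity(set(), {}, {})
    for ident in identities:
        merged.identifiers |= ident.identifiers
        merged.authenticators.update(ident.authenticators)
        merged.profile.update(ident.profile)
    return merged
```

In a federated model, by contrast, the identities would stay where they are and only assertions about them (e.g., via SAML) would cross organizational boundaries.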

Looking at the business value of this, enterprises and agencies want to:

  • Manage identities and access for a growing number of applications and users cost effectively
  • Connect users and applications across business boundaries seamlessly
  • Establish and maintain trust in the identity
  • Define and enforce security policy for varying business requirements flexibly, but with a single infrastructure to manage it
  • Extend the infrastructure beyond users to support web services authentication and authorization requirements

He represented this in an overview infrastructure and related it to the identity management models that exist today - the centralized model and the open federated model. In the web services standards stack, the Liberty Alliance is very active; SAML, WS, and SOAP are key technologies. In a further slide Brian listed the benefits of an integrated identity and access management solution - increased competitive advantage, reduced management cost, improved security, and maximized value through increased interoperability.

Remote Access Virtual Environment (RAVE) — A VPN Knowledge Grid

Joseph Sprute, Founder, CyberRAVE

Joseph asked us to imagine the real world and a virtual world next to it; this presentation is about the junction between the two. His point of reference is the semantic web as a VPN Knowledge Grid, and the intersection between this and the 1998 Presidential Directive PDD-63 and its implementation. He considers this raises a number of issues. Among them is the ability and willingness of the private sector to cooperate with the federal government in sharing information. To what extent will the federal government get involved in the monitoring of privately operated infrastructures, and what are the privacy implications?

CyberRAVE's mission is to provide Communities-of-Interest (COI) with ever-increasing levels of information security, actionable intelligence, and simplified access to remote resources by establishing Vertical Communities governed by democratic online Advisor Groups (VCAG). Its vision is to establish a structured network environment, accessible from anywhere, that maximizes free interchange of information while continuously addressing and adapting to the needs of Network Members, and the issues and probable conflicts identified in PDD-63 ... in other words - a RAVE.

Joseph went into some detail on the background to where he is coming from, and his slides list the RAVE objectives, the current requirements as he sees them, how unstructured (raw) data is an obstacle to information efficiency, and that a Knowledge Grid is the right solution. He identified existing data protection standards and semantic mediation, and introduced his view of a RAVE community organic model that encompasses a Vertical Community Advisory Group (VCAG). His slides include a list of the existing standards that are relevant in this space, and his concept of a boundaryless organization architecture for data - its benefits and its challenges.

So What? Facilitated Q&A Session

Terry Blevins, CIO, The Open Group
Moderator: Bill Estrem

Terry Blevins introduced Bill Estrem to moderate this panel session.

Q: What is the vendors' view of certification?
A: For HP, Walter replied that certification is a desired end-point, but currently there is nothing to certify against. From the systems infrastructure viewpoint, maturity is high and standards and certification are correspondingly strong; at the middle level things are less mature, and at the application level maturity is quite low. The current state of certification reflects the current maturity of the area.
A: For IBM, Andras Szakal noted that they certify to the present verification demands from their customers, and then there is the issue of certifying against standards. The cost of certification is very high at present and this is a serious barrier to doing more certification programs. 
A: For RSA, Brian noted that from a vendor perspective it is mostly customers who drive certifications.

Q: From the morning presentations it seems that we need more re-usable components so would a library or repository be a good thing to set up?
A: Leo - libraries of reusable components are an attractive idea, but what is useful and reusable is a difficult judgement - each case needs to be assessed on its specific merits.

Allen closed the session with an invitation that if customers want more say, The Open Group's Customer Council is the most effective place to do it.

Session 3 (Day 2, Tuesday October 21)
Boundaryless Information Flow & Enterprise Architecture Commercial Development - Focussing on Architecture Concerns within the Industry

Analysts and commercial users outline enterprise architecture developments and consider future requirements and concerns.

Introduction

Allen Brown, President and CEO of The Open Group

Allen welcomed all present to this second day of this Enterprise Architecture Conference, and introduced the speakers.

The Evolution of Enterprise Architecture: Emerging Industry Best Practices

George S. Paras, Vice President, Enterprise Planning and Architecture Strategies, META Group, Inc.

George noted that the META Group is not well known for its expertise as a consultant on management practices, including enterprise architecture. However, it has a significant track record in this business, working at all levels from small organizations to governments. A great deal of its business in this area is spent coaching CIOs and architects on enterprise architecture and how to be effective in doing it for their business.

We can all learn from many published books on the theory and case histories of how to do enterprise architecture; it also has to do with EA's evolution and technical underpinnings, and that evolution has begun. George listed the drivers of fundamental change - we live with these every day and they will always exist. How can we be better at aligning IT with the long-term business requirements of the organization? This is the process of listening to the customer, and specifically the key operational people who run the operations that make the business work. If you can execute against a good plan then you will be successful, and the enterprise architecture process is able to provide that long-term context for successful execution of the plan.

Evolution from Enterprise Technology Architecture to Enterprise Business Architecture and Enterprise Information Architecture, to get to Enterprise Solutions Architecture, requires a dedicated and disciplined approach. George summarized this in his slides. The evolution started from ETA, and many early EA projects failed to be productive because they did not link well to the business processes and the information needs of the organization. The lifeblood of any business is its information. So effective EA goes beyond technology to embrace business strategy and information needs, and how these will need to change as the business develops.

George asserted that EA needs to move beyond business and information into portfolio management, if it is to realize its full potential. In his definition, an IT portfolio is a set of managed IT investments that are allocated to investment strategies according to an optimal mix, and based on assumptions about future performance, to maximize value versus risk tradeoffs. He believes there are few real "IT projects" - only business projects. This captures the notion that everyone is a business user as well as an IT user.

Portfolios contain assets that continually change and need to be managed, and require projects that enable you to manage that change. Managing the IT portfolio requires process disciplines. The bottom line is "think about yourselves and where your organization is in the EA continuum".

Q: A CIO is important in an organization but in the UK government there is no single CIO so what should we do in this situation to coordinate this approach?
A: It is very difficult if you don't have centralization of IT strategy. In such situations you have to build a community of EA interest to generate a culture that creates a virtual CIO in the organization.

Q: Can you give examples of how competitive advantage derives from EA?
A: Procter & Gamble and the Canadian Government immediately spring to mind, and he will offer others if requested.

Architecture for an Adaptive Enterprise

Mike Baker, Vice President & Chief Information Technology Officer, Hewlett-Packard

Mike began by saying that HP has been a customer of the Meta Group over the years and has benefited from George Paras' expertise.

Why an adaptive enterprise architecture? George has already explained why architecture. HP sees the need to add "adaptive" to this, so HP refers to its Adaptive Enterprise Architecture. HP is a complex company that has evolved through mergers and acquisitions. It is both a product vendor company and a major customer of IT. Its portfolio is vast. So it considers EA to be central to its operations, and has adopted the name "Darwin Reference Architecture" to embrace its adaptive culture. In this environment HP looks at what has to change to accommodate its business processes. The recent merger of HP and Compaq depended heavily on using this Darwin Reference Architecture to make the right business operational and IT decisions to achieve the planned goals.

HP is blessed with a huge portfolio - as demonstrated by their slide showing IT by the operational numbers - so a key HP strategy is to use Darwin inside HP to consolidate its business processes and achieve the best return on the investments involved, across each of its business processes, applications portfolio, and all its infrastructure domains. The results have been excellent: they have achieved and booked the $3 billion of cost savings that were predicted when the HP-Compaq merger was decided, and done so in less time than was predicted. An essential part of their strategy is a "sunset" program - application optimization - where business process decisions have identified those applications that should be retired, reducing the applications portfolio to a more efficient and manageable level. This sunset process needs to continue to achieve their planned goals for reducing the number of applications in the portfolio. All this has enabled HP to plan its migration from its current state to a future state in which the focus of the company's operations moves to where they want it to be. Architecture has enabled them to achieve this migration.

Q: What decision guidelines are in place to ensure appropriate decisions are made on what to change in HP's very large and distributed organization?
A: The management structure was carefully set up to handle exactly this problem, and it has proved successful as part of the Darwin EA.

Architecture in the ERP Domain

Jonathan Willey, Director, Business Development, Glovia International Inc. A Fujitsu Company

Jonathan explained that Glovia is a Fujitsu company. Glovia originated from Xerox Computer Services, and has been in the business of providing manufacturing solutions since 1970. Their business evolution moved to ERP in the 1990s. All ERP product companies have a common set of basic modules; however, the number of modules required a huge number of "CRUD" (create, read, update, delete) applications. Their products were written in a 4GL (ProIV), and this created boundaries in an ERP environment: their products provided a complete solution, but were proprietary and so not interoperable with other products. As a result, in the early 1990s they could provide the full solution but could not interoperate with part solutions from other vendors.
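The module-count problem Jonathan alludes to can be made concrete: every master-data entity in every ERP module ends up needing the same four maintenance operations. A minimal sketch of such a generic CRUD layer is below - the class and names are purely illustrative, not Glovia's or ProIV's actual design.

```python
# Hedged sketch: a generic in-memory CRUD layer of the kind each
# ERP module effectively re-implemented per entity. Illustrative only.
class CrudStore:
    def __init__(self):
        self._rows = {}     # record id -> record attributes
        self._next_id = 1

    def create(self, record: dict) -> int:
        rid = self._next_id
        self._next_id += 1
        self._rows[rid] = dict(record)   # copy to isolate caller's dict
        return rid

    def read(self, rid: int) -> dict:
        return dict(self._rows[rid])

    def update(self, rid: int, changes: dict) -> None:
        self._rows[rid].update(changes)

    def delete(self, rid: int) -> None:
        del self._rows[rid]
```

Multiply one such store by every entity (parts, orders, suppliers, routings, ...) in every module and the "huge number of CRUD applications" follows directly - which is why generating them from a 4GL was attractive, and why the resulting proprietary layer became the interoperability boundary.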

The world changed for ERP vendors in the late 1990s, when the business model went global and the technology went global with it, so customers demanded that their ERP solutions and products be interoperable. The way customers viewed ERP systems also changed: the scope of ERP expanded significantly, not only in the functional operations view but also from the viewpoint of global distributed manufacturing. So ERP products had to change, requiring a different approach by Glovia and its competitors in order to remain competitive in the marketplace. This new model demanded an entirely new architecture for ERP, and a realization of which parts of this ERP architecture Glovia wanted to play in. The architecture dictated how they needed to migrate from the old Glovia into the current company that remains competitive in the ERP industry.

Summarizing, Jonathan saw Glovia's evolution as follows: Glovia built its own boundaries, then its market changed, so its boundaries had to change, and the enterprise architecture process showed them how to do this successfully and remain a major player in the ERP business.

Architecting an On-Demand Enterprise using FEA

Andras Szakal, Chief Architect, Federal Software Sales Division, IBM

Andras noted that Bret Greenstein had already described IBM's "on-demand" approach in a previous presentation. Technology gives us a constantly changing environment, and autonomic computing is going to be an important part of this technology driver. Business needs require constant improvement in business design and business process, which means changing business processes dynamically, on-the-fly, rather than following traditional development processes. IBM believe that this business transformation will drive competitive advantage. Looking at the US Federal Enterprise Architecture, which is being used as the driver for US e-government evolution: the performance and business reference models are owned by the business managers (OMB); the service component reference model falls into both the business and CIO domains; and the DRM and TRM are the technical architecture layers that fall into the CIO domain. The Performance Reference Model links to measurement. Organizational productivity means that business operations have to shift from a vertical to a horizontal focus, and the PRM and BRM have to change to become "on-demand" if they are to meet the requirements made of them.

Andras showed the attributes of an on-demand enterprise, and asserted that if your architecture does not respond directly to these attributes you will become marginalized or irrelevant. He then listed the significant organizational changes that transforming to an on-demand business requires, and showed the structure that IBM believes an on-demand operating environment requires. In future, applications will be created by business architects, by giving them a fully virtualized environment to create the functional characteristics they require, and to then be able to verify that their changed application works correctly.

Andras went on to describe the Federal Enterprise Architecture PRM and its proposed control and oversight process, and how this FEA moves to an on-demand enterprise. The FEA lifecycle drives on-demand reinvestment.

Architectural Elements for Breaking Boundaries

Dr. Mark W. Maier, Distinguished Engineer, The Aerospace Corporation

Mark began by asserting a contrasting view to all the previous speakers - he is not very much interested in Enterprise Architecture. However, he wanted to talk about enterprise architecture in a positive and concrete way. He does not come from the IT camp, but rather from a systems and systems engineering background and perspective. His fundamental purpose is support for the US Government aerospace program. He presented three intersecting questions:

  • If I had something that exhibited Boundaryless Information Flow how would I know?
  • If I had a system-of-systems that had Boundaryless Information Flow, what would be its architecture?
  • What is the architecture of the Internet?

He discussed "boundaryless" concepts, and noted these are essentially the same as the thinking that gave rise to the Internet in the 1970s. He reviewed his own view of the architecture of the Internet - the organizing structure is IP, the choice of IP is a direct consequence of the main objective, collaborative bodies control the Internet architectural evolution, and this structure repeats in the Web. He went on to look at the IETF, its architecture, its collaborative nature, and its RFC processes, and concluded that not much of it (apart from the IP protocol) represents architecture at all.

Mark concluded that the key lessons about recognizing an architecture are:

  • The bigger the system, the smaller the architecture.
  • Architectures provide interoperation, not architecture descriptions.
  • Collaborative systems need collaborative architectures, and collaboration-enhancing processes.
  • Standards for architectures and standards for architecture descriptions are not the same thing, and do not fulfill the same purpose.

Mark closed with a short list of relevant references in support of his presentation.


© The Open Group 1995-2012  Updated on Wednesday, 29 October 2003