Note that whilst all summaries are available to everyone, availability of slide presentations is limited to Members of The Open Group and Conference attendees.

Allen Brown
Cutting through Stovepipes
Development of Global Information Society & Effective Dissemination to the End-User - European Commission Initiatives
Christian Devillers
European Commission Initiatives
Joao Serres Pereira
The Extended Enterprise - Netframe
Ensuring Usability and the Currency of Information across the Organization, and Limiting Information Overflow - The User Perspective
Skip Slone
Barriers and Breakthroughs
Q&A Panel Session
Managing Information Flow: The Development of the Global Information Society and the Effective Dissemination to the End User - Defense and Global Initiatives
Robert Walker
The Net-Centric Environment
Ivan Herman
The Semantic Web
Securing Information Flow: Understanding the Security Issues that arise when Boundaries become Permeable
Mark O'Neill
Web Services Security
Securing the Information Flow
David Lacey
De-Perimeterization - Jericho
Andy Leigh
Broadcasting Requirements
Tim Parsons
The Networked Enterprise
Q&A Panel Session
Key Issues behind Information Management and the Transition to Boundaryless Information
Jamie Clark
Standards in Transactional Web Services
Ed Harrington
Identity Management Issues
Dean Richardson
Examining the Flow
Next Generation Information Applications: What Suppliers are Providing to Help Customers Achieve Boundaryless Information Flow
David McCaskill
Procter & Gamble - Case Study
Gavenraj Sodhi
Controlling Access to Information
Steve Harriman
Auto-Control for Application Resources

PLENARY
Boundaryless Information Flow:
Managing the Flow

The overriding requirement for customers of IT products and services has become enabling secure, reliable, and timely access to the right information, unlocking the information, and overcoming the boundaries within and between organizations.

Vendors are responding with their own interpretation of this requirement. With terms like adaptive, seamless, and on-demand computing, workflow, and process management, solutions are becoming more visible in product offerings. The Open Group vision of Boundaryless Information Flow is a vendor and technology-neutral way of describing what today’s business IT environment needs in order to deliver.

As organizations everywhere strive to do more with less, it is important to be aware of best practices, what others are doing, their experiences so far, and what is coming in the near future.

This conference provides a platform for updating this awareness - presenting valuable information from customers, vendors, and analysts on how to better manage the flow in an organization.

The Information Revolution - cutting through the stovepipes

Allen Brown, President and CEO of The Open Group

Allen welcomed all present to this conference. He referred to the recent writings of Peter Drucker (Harvard) on the information revolution. Drucker now argues that the latest information revolution, 50 years on, has yet to happen: it should be a revolution not in technology but in content. So far, the information revolution has focussed on technology rather than data - the "I" part (information), as opposed to the "T" part (technology), is now the important part of IT. Looking at what current industry leaders were saying in 2002:

  • Bill Gates - Microsoft's vision is seamless computing - it's the boundaries between applications, and between departments, that need to be broken down
  • Sam Palmisano - IBM's e-business on-demand, requiring end-to-end integration and horizontally integrating disparate silos of information
  • Carly Fiorina - HP addressing issues around overcoming islands of automation and needing to think horizontally

All are about breaking down boundaries to enable flow of information. What were we in The Open Group doing in 2002? In January 2002, we developed a problem statement, and our Open Group Customer Council went on to produce a business scenario on The Interoperable Enterprise. This identified a common problem, where specialists had organized themselves into bounded departments in which their information was held in stovepipes and silos that were inaccessible to other departments. Jack Welch at General Electric addressed this problem by reorganizing GE into a "boundaryless organization". In an effective organization, information must flow in a boundaryless manner. This led the members of The Open Group to formulate its present vision statement. Of course, the boundaryless information flow statement does not mean that organizations have no boundaries, but that the boundaries are permeable, allowing ease of access to information by those who are authorized to have it.

The Open Group does not have a product for boundaryless information flow. Our role is not to sell boundaryless information flow, but to enable it in four areas:

  • Working with customers to understand and address requirements
  • Bringing the industry together, working with vendors and other consortia and standards bodies to evolve solutions that are open and integrated
  • Offering services to enable consortia to cooperate
  • Certification of conformance to standards so purchasers know a product does what it claims to do

In our February 2004 San Diego conference, Dawn Meyerreicks called for integration of all types of information to service their "agile netcentric warfare" program, in which vast quantities of different types of intelligence information from many sources must be gathered, published, and analyzed in order to decide what action to take and how to execute it, and then the cycle repeats. The important feature here is that at each stage a transformation of the information takes place in order to make it valuable. Taking logistics information as another example, this can be broken down into requirements, ordering, and delivery, and at each stage the information is transformed into forms that make it useful. Large-scale manufacturing provides a special example where the trend is to outsource - using partner suppliers to supply the components, keeping only the brand in-house; in these situations IT systems have to integrate across a large number of outsourcing suppliers, and in a later presentation in this conference we will hear how Netframe is enabling this.

What is information? Consideration of all the available definitions in different contexts leads us to asking what is the meaning and purpose of information - and its quality. It's not always knowledge, it's not a collection of data, it's not the computing science definition (data), it's not the legal definition. Information is something you need to do your job or conduct everyday life, presented in a manner that enables effective action to be taken - make a decision, and take informed action. Again, quality of information is important, and what we mean by quality depends on context: quality of the information itself (accuracy); quality of access (accessibility, availability, readability, timeliness); and quality of presentation.

So, managing the flow: the "T" in IT is well established; the "I" needs to catch up. Today at lunchtime a group of our members will gather to talk about working on information-centric issues. All members are invited to join that lunchtime gathering. One idea we already have for this is a standard framework to improve information quality, but we will see what members decide.

Development of Global Information Society & Effective Dissemination to the End-User - European Commission Initiatives

Government Programs enabling the Global Information Society - the IDA Program

Christian Devillers, Coordinator of IDA Common Tools and Techniques, European Commission, DG Enterprise

Christian outlined IDA's mission and legislative context over the period 1999 to 2004: to facilitate effective and secure exchange of information between Member State administrations and the EC institutions, through establishing operational interoperable telematic networks across all Member States, in support of implementing EC policies, achieving the internal market, and effective decision-making.

Christian explained the legal basis within the EC for this IDA program. The IDA's activities cover a wide range of projects of common interest, and in one of his slides he listed the generic services this range embraces. One example was support of migrant workers through multilingual web portals. Another was support for emergence of a single pharmaceutical market. More information is available from their web page at http://www.europa.eu.int/ISPO/ida.

The IDA program's horizontal "infrastructure" actions and measures have included spread of good practice, business applications, and technology solutions. An example of technology solutions is the IDA communication platform. In business applications, an example is their "your Europe" portal.

The eEurope 2005 action plan aims for an information society for all, covering online public services (e-Government, e-learning services, e-health services), a dynamic e-business environment, and the affordable broadband and secure information infrastructure to support these. Christian put these into the context of the EC 6th Framework program, feeding into the IDA and eTEN network and applications work, in turn feeding into the e-Content work.

The Pan-European dimension for IDA development has four levels of sophistication - information only, on-line forms, individual transactions, and automated processes. Drivers for e-Government services include public services both for citizens and for business. Underlying principles are accessibility, multilingualism, security and privacy, subsidiarity, use of open standards, assessing the benefits of open source software, and use of multilateral solutions (a common solution that suits all Member States). Interoperability is a prerequisite, so there is a European Interoperability Framework - complementing national frameworks for pan-European e-Government services - operating at the organizational, semantic, and technical levels. The program for Interoperable Delivery of pan-European e-Government Services to Administrations, Businesses, and Citizens (IDABC) runs from 2005 through 2009 - it is the successor to the IDA program.

The Extended Enterprise - Netframe

Joao Serres Pereira, Director, Innovation Relay Center, ISQ and Netframe Project Leader

Joao described his model of a supplier having a complex product that is built from many high-precision components whose manufacture is outsourced to many different suppliers, and whose assembly depends on total accuracy and high quality of each component. Having run several successful pilot projects, they extended the concept by adopting the challenge presented by two big manufacturing industries - the automotive industry and the aeronautic industry. Both use outsourcing from many suppliers and have many components which need to fit together with high precision. Netframe is a framework for the extended enterprise, in which companies cooperate to share electronic documents providing design information that must be sufficient and timely, and able to respond to design changes and all their consequential effects on related components and their suppliers.

Netframe is set to deliver a comprehensive integrated set of tools - including a self-assessment tool, a benchmarking tool, an audit tool, plus a guidebook and a tutorial, and an industry-consensus model. Netframe delivers an IT architecture supporting the cross-industry and supply chain steps. They use the TOGAF methodology and the ISO 15504 quality model to transform their goals into deliverable value results. The steps involved are:

  • From business scenarios, they developed a process model
  • From the process model, they identified practices, metrics, and technologies
  • They categorized these practices and technologies
  • They correlated the practices to performance
  • From the correlations, they produced the Guidebook
  • They derived an XML model and accompanying tools (the XML model enabled transition to an entity-relationship model)
  • They built a pilot
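The XML-model-to-entity-relationship transition in the steps above can be sketched as follows. This is a hypothetical illustration - Netframe's actual schema is not given here, so the element names and fields are invented:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of a Netframe-style XML design-exchange record;
# the element and attribute names are illustrative, not the project's real schema.
DESIGN_XML = """
<component id="wing-bracket" revision="3">
  <supplier>AeroParts Lda</supplier>
  <dependsOn ref="wing-spar"/>
  <dependsOn ref="rivet-set-12"/>
</component>
"""

def to_entity(xml_text):
    """Map one XML component record onto a flat entity-relationship view:
    an entity row plus a list of (component, depends_on) relationship rows."""
    root = ET.fromstring(xml_text)
    entity = {
        "id": root.get("id"),
        "revision": int(root.get("revision")),
        "supplier": root.findtext("supplier"),
    }
    relations = [(entity["id"], dep.get("ref")) for dep in root.findall("dependsOn")]
    return entity, relations

entity, relations = to_entity(DESIGN_XML)
print(entity["supplier"])   # AeroParts Lda
print(relations)            # [('wing-bracket', 'wing-spar'), ('wing-bracket', 'rivet-set-12')]
```

The relationship rows are what make design-change propagation tractable: a change to "wing-spar" can be traced to every dependent component and hence to every affected supplier.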

Ensuring Usability and the Currency of Information across the Organization, and Limiting Information Overflow - The User Perspective

Information Flow: Barriers and Breakthroughs

Skip Slone, Principal Systems Architect, Lockheed Martin

Skip gave a brief overview of Lockheed Martin: 132,500 employees, including 55,000 scientists and engineers and 30,000 software and systems engineers, with operations in 45 US states and 56 countries. They had sales exceeding $31 billion in 2003, and are active in aeronautics, space systems, electronic systems, IT services, and Integrated Systems and Solutions.

Their new business priorities are to be responsive to the new customer priorities since "9-11", which include the war on terrorism, requiring a transformed armed services capability, a new strategic posture, and Homeland Security. In the defense industry, the driver for change is a major transformation from product-focus to solutions-focus. The net-centric environment requires total integration of information services in the battlefield environment, which itself ranges from space and high-altitude surveillance, through ground-based information, to delivery of information and its analysis to those in the battlefield, on the ground and in the air, who need it. Skip noted that one of the biggest challenges to providing the right information in the right form is the context of language (e.g., "trim it on the CAT" in the US Navy refers to the catapult on an aircraft carrier, nothing to do with the household pet). Another is the human challenge of different generations, with widely different cultural and lifestyle approaches, working together. The 21st century also gives us conflicting pulls - the 20th century model was lift versus gravity and thrust versus drag, whereas the 21st century model is more innovation versus inertia and investment versus risk.

Skip suggested that changing IT perspectives indicate three dimensions of information flow - information access, infrastructure, and application integration, and he identified an information flow capability envelope. The infrastructure services have to bridge operating systems, directory, security, system & network management, and cross-platform systems. Applications services have to bridge the application platform, component development, COTS integration, host integration, and lifecycle management. He then presented his view of an information access model enabling the required functionality for today's vision of information availability and flow.

Skip proposed the Joint Strike Fighter (JSF) as a real-world example. It is a jewel in Lockheed Martin's crown of achievements, and a fine example of a multi-service (air force, navy, marines) global program. It has three base variants: the F-35A CTOL (USAF), the F-35B STOVL (USMC & UK), and the F-35C CV (USN). Nine countries are involved in its development and usage, and access to information from these nine countries itself poses major information flow problems. Skip listed several "amazing facts" about the JSF. What does all this have to do with information flow? It requires collaborative development, with global design and production from the same digital data. An example of the scale of information involved is that a single test flight generates four terabytes of data; the ability to capture and handle that data is a major challenge. An integrated management framework is needed, along with centralized data handling and distribution. Information flow is essential to achieving cost goals and the five-month assembly timeframe. Skip closed with a short video clip, and a reminder of Lockheed Martin's motto - we never forget who we are working for.

Q&A Panel Session

Christian Devillers, Joao Serres Pereira, Skip Slone

Q: In Joao's presentation, he mentioned Open Source in the same context as Open Standards - what did he intend here?
A: Joao - Netframe is not focused on developing software - open source or otherwise - or on the tools themselves, but more on the models and processes.

Q: Terry - how does the EC drive procurement?
A: Christian - Inform the market through e-Notices. It is driven by the internal market.

Q: Which areas covered in your presentations are not addressed by standards?
A: Christian - The Open Document Format (ODF) is an area in which dialog with Sun and Microsoft is underway.
A: Skip - Look at where standards are mature rather than immature. Each barrier in his portrayal of the infrastructure and application models generates issues where standards can add value.

Q: What is the role for certification?
A: Joao - Certification is a key element for Netframe. The whole approach in the Netframe project depends on self-certification.
A: Skip - Certification of products and of skill-sets is important. For products, certification gives greater assurance of out-of-the-box interoperability. For open source it is hard to persuade the open source community to spend money or effort to get certified - and this is not just for Linux, because there are numerous non-Linux projects around.
A: Christian - The EC has an approach where an approved open standard has to be used for it to have value.

Q: Who does the modeling for the process and who maintains these models?
A: Joao - In the Netframe project this involves all of the members of the team, and includes keeping down auditing costs.
A: Skip - The ability to model business processes is critical to controlling efficient delivery across multiple suppliers
A: Christian - Business interoperability interfaces are a key approach to the IDA program.

Q: In the Netframe project - is it also expected that new requirements will shape new standards?
A: Joao - The project team try to draw parallels on the way different companies use Netframe - they find that many SMEs do useful prototyping which help develop the process.

Q: In most projects we have to work on specific domains - is some work going on in The Open Group to address generic Netframe standards that will encourage deployment?
A: Joao - We appreciate the goal here and need help from the EC in promoting this.
A: Terry - The Open Group is aware of this problem and considering the solution space.

Q: Terry - Each one of the panel - if you have a single message that would help you solve the Boundaryless Information Flow problem, what would it be?
A: Skip - To think of interoperability as a baseline requirement, not a point of differentiation
A: Christian - Interoperability is a key point and the EC is looking at openness as a key requirement for ease of implementation.
A: Joao - Listen to the supply chain, because that is what makes interoperability work best.

Managing Information Flow: The Development of the Global Information Society and the Effective Dissemination to the End User - Defense and Global Initiatives

Standards, Workflow, and Orchestration in the Net-Centric Environment

Robert Walker, NCC Project Director, Defense Information Systems Agency (DISA)

Rob gave his presentation by teleconference. He showed that the broad approach to net-centric warfare is transitioning from today's pre-web through networking, to joint/enterprise pervasive networked services in the future. The Global Information Grid (GIG) enterprise services program comprises four main areas - the Core Enterprise Services, the Business Mission area, the Warfighting Mission Area, and the National Intelligence Domain. Today we have institutional communities of interest (CoI) as well as expedient CoIs, plus cross-domain CoIs. The future CoIs need to be provided by pervasive network services. Rob reminded the audience of what he means by net-centricity - the US DoD is looking at a service-oriented architecture with a service-enabled infrastructure that enables publishing (posting) of information and also enables users to discover (find) information. The net-centric design principles are an open architecture that is independent of underlying object models and programming languages, accommodation of asynchronous change, decentralization of operations and management, integrated layered security, and ubiquitous IP networks. Rob listed the top ten items in their focus for implementation. A major requirement is to provide agility to put together composable services to provide just-in-time capabilities. Rob showed a set of initial Net-Centric Capabilities (NCC) pilot software components that they are using.
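The publish (post) / discover (find) pattern Rob described can be sketched as a toy in-memory service registry. The service names, endpoints, and metadata fields here are invented for illustration; a real net-centric infrastructure would use a distributed, standards-based registry:

```python
# A toy in-memory registry illustrating the publish/discover pattern
# behind a service-oriented infrastructure.
class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def publish(self, name, endpoint, **metadata):
        """Post a service description so that consumers can later find it."""
        self._services[name] = {"endpoint": endpoint, **metadata}

    def discover(self, **criteria):
        """Find every service whose metadata matches all given criteria."""
        return {
            name: desc for name, desc in self._services.items()
            if all(desc.get(k) == v for k, v in criteria.items())
        }

registry = ServiceRegistry()
registry.publish("imagery-feed", "https://example.org/imagery", domain="intel")
registry.publish("logistics-status", "https://example.org/logistics", domain="logistics")

# A consumer discovers services by metadata rather than by hard-wired address -
# this decoupling is what makes services composable just-in-time.
print(list(registry.discover(domain="intel")))  # ['imagery-feed']
```

Because consumers bind to metadata rather than to fixed endpoints, services can be replaced or recomposed without changing the consumers - the "accommodation of asynchronous change" named in the design principles.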

Assessing standards maturity in the application integration stack, Rob indicated DISA's views on where more work is required, marking in red on his slides those items that they consider are immature. He then suggested trends in enterprise computing where data-centric movement is leading to process-centric capability. They are also looking at Service and Event-oriented architectures, noting the differences, and how to solve the flow and relationship problems that these differences identify. As we go through the taxonomy of event systems we find that events become an integral part of the applications infrastructure. There are different planes for different paradigms. Rob showed a representative hosting view of the infrastructure for their NCC pilot.

In summary, Rob confirmed that the DoD is actively pursuing the SOA approach to enable the net-centric environment, and that open standards are an essential element of that approach.

Q: Moving from a world of individual to composable services, who becomes responsible for maintaining these services and how will you manage the metadata issues?
A: Rob - The supplier of each service will accompany their service with an ownership and performance characteristic, and will be expected to manage all aspects of it, including cooperation on integration issues and metadata.

Introduction to the Semantic Web

Ivan Herman, Head of Offices, W3C

Ivan provided a link to his slide presentation on the W3C web - http://www.w3.org.

Moving to a semantic web: the current web represents information using natural language, leaving machines to second-guess its meaning. An example is searching - the best-known example is Google. Another is an automatic assistant that knows about your preferences. A further example is data(base) integration. Yet others are digital libraries and the semantics of web services. What is needed? A resource should provide information about itself - the "semantic web" is a metadata-based infrastructure for reasoning on the web. It extends - not replaces - the current web.

Ivan offered a problem example - conveying the description of a figure through text. The important thing here is to provide statements about these resources. You then need the Resource Description Framework (RDF), which is W3C's general model for such statements. Ivan gave a simple example of an RDF statement. URIs play a fundamental role in uniquely identifying all resources on the web. However, RDF is not enough - adding metadata and using it from a program works, but the semantic web needs the support of ontologies as well. W3C's ontology language is called OWL. Ivan listed several possibilities that OWL has the potential to enable. This work is going on now in W3C.
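A simple RDF statement of the kind Ivan showed boils down to a (subject, predicate, object) triple in which URIs identify the resources. A minimal sketch in Python - the predicate is the real Dublin Core "creator" term, but the subject URI is invented for illustration:

```python
# An RDF statement is a (subject, predicate, object) triple;
# URIs uniquely identify the subject and the predicate.
triple = (
    "http://www.example.org/report.html",          # subject: the resource described
    "http://purl.org/dc/elements/1.1/creator",     # predicate: Dublin Core "creator"
    "Ivan Herman",                                 # object: here, a literal value
)

def to_ntriples(s, p, o):
    """Serialize one triple in N-Triples syntax, quoting a literal object
    and angle-bracketing URI terms."""
    obj = f'"{o}"' if not o.startswith("http") else f"<{o}>"
    return f"<{s}> <{p}> {obj} ."

statement = to_ntriples(*triple)
print(statement)
# <http://www.example.org/report.html> <http://purl.org/dc/elements/1.1/creator> "Ivan Herman" .
```

Once statements take this machine-readable form, a program can aggregate and reason over them - which is exactly the step from metadata to the ontology support that OWL provides.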

Semantic web applications already exist. A SWAD-Europe survey provides interesting information on this. Dublin Core is a well-known application example which has become a de facto standard. Web content syndication is another application - see http://purl.org/rss.

Further application examples are:

  • Web Services Descriptions
  • Gene Ontology
  • Mozilla
  • Creative Commons for Digital Rights over content on the web
  • Baby CareLink for information on treatment for premature babies - a perfect example for the synergy between web services and the semantic web

The semantic web is not just research. Metadata has to be supplied by the author to start with, but it can then be augmented by machine-supplied information.

Ivan offered his email address to accept questions and field further interest - ivan@w3.org.

Q: When metadata is created, how do you assure it is not spoofing or masking what you really mean?
A: Ivan - You can't have this assurance, though security solutions can help reduce the risk. Market forces tend to steer users toward the valid information.

Q: Architecture frameworks required a big education step to move from IT to general vocabulary of how the terminology can help business strategists in more generic business ways; how can we learn from the past to move more smoothly to using a common vocabulary for the semantic web?
A: Ivan - Taking the example of libraries, the knowledge is already there and people should be willing to use their experience to reflect this back and use the results they deduce.

Securing Information Flow: Understanding the Security Issues that arise when Boundaries become Permeable

Web Services Security

Mark O'Neill, CTO, Vordel Web Services Security

Mark explained that Vordel is a vendor of products to protect web services. He has written a book with collaboration from others on this subject, and it is published by Osborne/McGraw-Hill.

What are the web services security challenges?

  • Authorizing the end-user. The solution is to maintain end-user continuity end-to-end. Security information within SOAP messages can be passed using SAML (Liberty Alliance), or Web Services Security.
  • Blocking the new application layer threats that make use of XML and SOAP.
  • Blocking unauthorized clients from accessing the web service itself. This is where SSL and network security come into play.

These three aspects of web services security are mutually independent.

In any given security context there are at least three primary actors: the user, the transaction, and the data. Security descriptors may be used to convey the security requirements of each actor. Mark showed three examples of initiatives to annotate web services with security descriptors. He showed an example of using SAML injection and SAML validation to add security context to the information flow. Vordel's dispatcher and enforcer - controlled by VordelDirector - does this. Another example showed mapping of security to an XML transaction, adding security of the user, transaction, and data to the information flow.
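Vordel's actual products are not described at this level of detail, but the SAML-injection step Mark showed can be sketched in outline: an intermediary adds an identity assertion to the SOAP header so the end-user's identity travels with the message. The envelope, subject name, and assertion fields below are illustrative only:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SAML_NS = "urn:oasis:names:tc:SAML:1.0:assertion"

def inject_saml(envelope_xml, subject_name):
    """Add a minimal SAML-style assertion to the SOAP Header so the
    end-user's identity is carried end-to-end with the message; a
    downstream service can then perform the matching SAML validation."""
    ET.register_namespace("soap", SOAP_NS)
    ET.register_namespace("saml", SAML_NS)
    root = ET.fromstring(envelope_xml)
    header = root.find(f"{{{SOAP_NS}}}Header")
    if header is None:                       # create a Header if the message has none
        header = ET.Element(f"{{{SOAP_NS}}}Header")
        root.insert(0, header)
    assertion = ET.SubElement(header, f"{{{SAML_NS}}}Assertion")
    ET.SubElement(assertion, f"{{{SAML_NS}}}NameIdentifier").text = subject_name
    return ET.tostring(root, encoding="unicode")

envelope = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}">'
    '<soap:Body><GetBalance account="1234"/></soap:Body>'
    "</soap:Envelope>"
)
secured = inject_saml(envelope, "alice@example.com")
print("NameIdentifier" in secured)  # True
```

A real deployment would of course sign the assertion and bind it to the message per the WS-Security SAML token profile; the sketch shows only where the identity information is placed in the flow.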

Summarizing: web services require that security is added to XML messages; the semantic web has a lot to offer on privacy, authorization, and permissions; and effective security solutions for web services are fairly well understood and available today.

Q: What certification is available to assure conformance?
A: Mark - What really matters is conforming to the WS-I profile; there is no third party doing it at present; however, neither is the WS-I profile agreed beyond draft stage at present, so any certification to it would be premature.

Securing the Information Flow

Introducing the Jericho Forum

David Lacey, Director, Security and Risk Management, Technology Services and Innovation, Royal Mail Group

David said he takes a forward-looking approach to IT security issues. The latest approach is something he has termed "de-perimeterization". This new approach is necessary because existing solutions are not sustainable. David regarded de-perimeterization as recognition of the disappearing perimeter in IT systems, and acceptance that the industry must develop new pragmatic solutions that are workable, scalable, and cost-effective.

The vision of the Jericho Forum is seamless security management. David listed what he believes is needed to realize this vision, and went on to discuss the benefits, not least of which is that there is no alternative because it is essential to survive the future. Because Jericho is forward-looking, this de-perimeterization approach is not complete. Rather it is anticipating future challenges, and is addressing future trends in technology. We need to understand the consequences of all this for information security, in a world of increasing openness and complexity. The Jericho Forum grew out of informal exchanges between leading organizations who share a common vision for a new approach to security. The members see themselves as a catalyst to accelerate the achievement of the vision. David showed his current assessment of the problem space, and listed the current members.

Broadcast Networking - An Example of De-Perimeterization

Andy Leigh, Information Security Strategy Manager, Technology Direction, BBC

Andy explained that the BBC story to date is that they have built a citadel with a very strong wall and a similarly strong gate, and they trust it. They check the credentials of all who work inside the citadel, and so have encouraged a culture in which they trust everyone inside and distrust everyone outside. However, this has to change. De-regulation and independent production company contributions mean that they now need to cooperate flexibly with external partner organizations and individuals, have mobile employees, receive and accept secured (encrypted) inputs that they can't first open to examine or verify, and take huge volumes and increasing ranges of inputs from people who are no longer in that band of trusted insiders. The truth is that they always did include a small number of rogue citizens on the inside, and the reality is they have had to create holes in their wall.

They addressed this set of changes initially by splitting their citadel into sections with their own walls, compartmentalizing their operating areas and applying appropriate security to each compartment while minimizing the risk to the other compartments.

Andy explained why the needs of the broadcast industry are different. They operate two types of network: distribution (outward) and contribution (inward). These networks are real-time, and they demand zero latency, high bandwidth, and high-speed transmission of digital data. A lot of contribution comes from agency suppliers, yet the security of these contributors is virtually non-existent. This did not matter when the networks were mostly private, because contributions came from already trusted sources. But in today's environment, most radio programs are being contributed from PCs. Suddenly, broadcast material looks like (huge amounts of) business data, sent over IP connections, over the Internet.

So broadcasters like the BBC now need a perimeter, but they would rather have a de-perimeterization model so they can continue to operate the same way they have been for so long. Problems that still have to be faced include forming communities of trust between broadcasters. Clearly, for the broadcast industry, de-perimeterization looks like a good thing, and they hope it will provide the solutions they need. That is why the BBC is an enthusiastic supporter of the Jericho Forum.

Securing Information across the Networked Enterprise

Tim Parsons, Head of Information Security, Future Systems, BAE Systems

Tim said when he came away from his first meeting with the Jericho Forum founders, he was very keenly supportive of their vision, which matched his own. In his role in BAE Systems, he liaises with R&D and with customers, and also works with product planners.

Networking since the 1980s has been enabled by technology, and was really driven by business requirements for changes in operating processes. Tim listed the prime characteristics of the transformed enterprise operating in a global information infrastructure, and what the consequential issues were. In 2000, he reviewed the current focus in R&D on information security, and noted it was working on intrusion resistance, and service recovery as a key business differentiator.

Fast-forwarding to 2002/3, there had been a major shift towards adaptation, evolution, and service recovery within open systems - supported by the European Community's Framework 5 & 6 programs, and similar DARPA programs. The historical approach was that the infrastructure should protect content by ring-fencing it. This approach is increasingly less useful in an increasingly networked world. Tim therefore concluded in the early 2000s that information security issues must be decoupled from infrastructure and protection issues. What then are the architectural implications of this? In architecture, this decoupling has been evolving over a number of years. Looking at the infrastructure from a business perspective, Tim built up an assessment of the scope of relevant initiatives at the enterprise, services, systems and architectures, application, content, and object (ensemble of data) levels, and mapped each of these levels to relevant associated security initiatives that we encounter today.

Q&A Panel Session

David Lacey, Andy Leigh, Tim Parsons

Q: How is the Jericho Forum setting about approaching their de-perimeterization work?
A: David - We are making good progress on defining the problem space well. We plan to hold several workshops and run expert analytical sessions to arrive at what needs to be done, then prioritize each component of this solution space on a roadmap, and identify the successive phases of implementation to realize the vision.

Q: What is the concept of rights on content in broadcasting?
A: Andy - Broadcasters in general want their content broadcast as widely as possible, so DRM is not currently such a big issue. However, this could easily change into a major problem, and they do not have a pragmatic solution yet.

Q: How does the Jericho Forum plan to get organizations to adopt their radically different approach?
A: David - Getting key organizations together is the main supportive move, and agreeing the problem space and the priorities. The members of the Jericho Forum are big IT consumers with huge purchasing power and therefore big influence on the IT community. These big IT consumers are outsourcing a lot of their IT - how do they do this securely? - how do they classify systems and data to apply the right levels of security? These are among the issues the Jericho Forum are including in their activities.
A: Tim - From his BAE Systems viewpoint, defence systems are never produced in isolation - the collaborative effort and outsourcing involved heightens his company's requirement for the Jericho security solution.

Q: Is the Jericho Forum looking at trust models?
A: Tim - Yes, we are looking at trust models, both in the academic community and in our own R&D, using the Zachman model as a starting point to scope the problems.
A: David - This is not such a difficult issue - ISO 17799 in his organization (Royal Mail) is well established and thoroughly audited. In his previous appointment at Shell International he set up a certification scheme for their trust model.

Key Issues behind Information Management and the Transition to Boundaryless Information

What do Standards tell us about Workflow and Transactional Web Services?

Jamie Clark, Manager for Technical Standards Development, OASIS

Jamie noted that OASIS produces industry standards for web services, and he listed the quantity and range of these. What is a standard? OASIS holds that only published specifications that are available, accessible, and used are "open standards"; anything else is proprietary. OASIS produces open standards on web services, which represent the return on investment for e-Commerce. Key issues in these standards are interoperability and convergence. Jamie showed OASIS' standardization map, covering some 65 projects, and explained that he would focus on five OASIS projects relevant to information workflow:

  • ASAP - Asynchronous Service Access Protocol
  • BTP - Business Transaction Protocol
  • ebXML Business Process
  • WS-BPEL - Web Services Business Process Execution Language
  • WS-CAF - Web Services Composite Application Framework

He outlined the basic problem these are addressing - making sense of multiple messages, enforcing logical relationships between them, and grouping them logically to preserve metadata. He noted that it is important to keep in mind what you want to achieve when developing standards in response to requirements, particularly how lightweight or heavyweight you want the solution to be, and how many transactional methods are enough. Standards center on real needs, and users largely influence the outcome, so it is the users who will decide which standards are important, which must interoperate, and which must converge.
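The core pattern all five protocols wrestle with - correlating related messages into one logical transaction - can be sketched in a few lines (the field names here are illustrative, not taken from any OASIS specification):

```python
from collections import defaultdict

def group_messages(messages):
    """Group a flat stream of messages into logical transactions,
    preserving arrival order within each correlation group."""
    transactions = defaultdict(list)
    for msg in messages:
        transactions[msg["correlation_id"]].append(msg["payload"])
    return dict(transactions)

stream = [
    {"correlation_id": "order-1", "payload": "request"},
    {"correlation_id": "order-2", "payload": "request"},
    {"correlation_id": "order-1", "payload": "acknowledge"},
    {"correlation_id": "order-1", "payload": "confirm"},
]
grouped = group_messages(stream)
```

The real standards differ precisely in how much machinery they wrap around this grouping - from lightweight correlation headers to full transactional coordination.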

Identity Management: A Necessary Step towards the Boundaryless Organization

Ed Harrington, Principal Consultant and CEO, EPH Associates

Ed presented some initial thoughts on widely different concepts of what "identity" has been represented to be through history. He then noted the background that led to development of The Open Group's now published Identity Management White Paper. Key concepts in this Paper are Trust, Authentication, Provisioning, Authorization, Federation, and Repositories, and he went on to discuss each of these in more detail:

  • Trust - its definition, what it is not, the balance between trust and risk, and the ultimate decision being a human one
  • Authentication - its relationship to identity, verification processes and elements, revocation
  • Provisioning - authorities as the agents of provisioning, account provisioning, resource provisioning
  • Authorization - appropriate use of resources, more than just access control
  • Federation - not much covered in the IdM White Paper, based on circles of trust, enabling simplified sign-on
  • Repositories - storage of security and identity information

Ed then assessed the areas of impact, in terms of business value, security, personnel and privacy, and technical issues involving standards and interoperability, plus regulatory and legal issues. The business value addresses escalation of threats, essential administration of identities, and balancing risk and costs. Privacy creates conflicts - ownership, security, personalization, efficiency - all highly context-dependent, and regulation complicates these issues even further. Ed listed some of the technical issues - the elusive core identity (naming) issue, source of authority, trust and trust models, semantic framework, interoperability issues, and standards agreement among all the organizations active in this field. Ed also listed the main regulation and legislation impacts on IdM requirements.

Ed closed with The Open Group's Identity Management program forward plan, to produce an Architecture Guide, to develop an Identity Management certification program, to work on the "core identity" problem, to continue developing our Identity Management Catalog, and to maintain and develop our liaisons with other consortia who are active in Identity Management, particularly the Liberty Alliance.

Examining the Flow of Information

Dean Richardson, Vice President of Technology, Messagegate, Inc.

Dean gave a brief introduction to his organization (Messagegate), to indicate that the information he was about to present comes from large customers. In his role as chair of the Messaging Forum he has presided over several "messaging" projects, including Secure Messaging (and SMGateway certification), Coping with Spam (a newly published Managers' Guide), and recently started projects on Unified Messaging and on Instant Messaging.

Managing the flow is about enabling delivery of the right information to the right person, in the right format, at the right time. Dean listed examples of wrong information - the best current example of wrong inbound messages is Spam, while for outbound messages a prime example is unauthorized disclosure; examples that flow in both directions include viruses, worms, hate mail, hoaxes, chain letters, and phishing.

The current situation in messaging is that email is the "killer application"; SMTP is not the best technology, but it was easy to implement; spoofing flourishes in the absence of identity management; Spam is out of control; IPR is regularly stolen; and companies now have to audit their email traffic to control their business processes, which is revealing shocking results. The current trend is that, with 80-90% of email traffic being non-essential to business, filtering solutions inevitably block some legitimate business email. Legislation has not helped to limit the Spam problem. Technology has helped, but it has also caused problems. New identity/authentication standards are being proposed, such as Sender Policy Framework (SPF), Microsoft's Caller-ID, and Yahoo's DomainKeys. Phishing is also proving up to 20% successful, despite widespread publicity warning that it is a scam.
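All three proposals share the same underlying idea: the receiving server checks whether the connecting host is authorized to send mail for the domain the message claims to come from. A minimal sketch of that check (the policy table and return values are simplified assumptions, not the actual SPF record format):

```python
# Hypothetical published policies: sender domain -> authorized IPs.
PUBLISHED_POLICIES = {
    "example.com": {"192.0.2.10", "192.0.2.11"},
}

def sender_check(claimed_domain, sending_ip):
    """Return 'pass' if the IP may send for the claimed domain,
    'fail' if it may not, or 'none' if no policy is published."""
    policy = PUBLISHED_POLICIES.get(claimed_domain)
    if policy is None:
        return "none"
    return "pass" if sending_ip in policy else "fail"
```

A 'fail' result tells the receiver the sender identity is spoofed, which is exactly the gap in bare SMTP that these standards aim to close.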

Dean closed with a list of recommendations on measures to combat Spam and phishing. Most of these measures were of an educational nature, warning users to be wary and not release their personal details, however tempting the invitation messages displayed on their screen are.

Next Generation Information Applications: What Suppliers are Providing to Help Customers Achieve Boundaryless Information Flow

Procter & Gamble - Case Study

David McCaskill, Section Manager, GBS Infrastructure Services, Global Security Solutions, Procter & Gamble

David described a case study of Procter & Gamble's deployment of their Common Interface Bus (CIB) - its architecture, its implementation, and how the next generation will move forward into the brave new web services world. David noted that point-to-point integration resulted in what he termed "integration spaghetti". The best solution was application integration through a common interface bus which transports business object documents and uses application adaptors to connect to the applications. David listed the design goals of their CIB, and then explained the services it provides - a messaging backbone, transformation and routing, a message warehouse, a development repository, security services, and administration to monitor and control CIB services.
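The bus-and-adaptor pattern David described can be reduced to a few lines - applications subscribe to business document types, and an adaptor transforms each document on its way to the subscriber (a simplified sketch; the names are illustrative, not P&G's actual interfaces):

```python
class InterfaceBus:
    """Minimal message bus: applications subscribe to document types,
    and adaptors transform documents before delivery."""
    def __init__(self):
        self.subscribers = {}  # doc_type -> list of (adaptor, handler)

    def subscribe(self, doc_type, handler, adaptor=lambda doc: doc):
        self.subscribers.setdefault(doc_type, []).append((adaptor, handler))

    def publish(self, doc_type, document):
        # Route the business object document to every subscriber,
        # applying each subscriber's own adaptor on the way.
        for adaptor, handler in self.subscribers.get(doc_type, []):
            handler(adaptor(document))

received = []
bus = InterfaceBus()
bus.subscribe("PurchaseOrder", received.append,
              adaptor=lambda doc: {**doc, "currency": "USD"})
bus.publish("PurchaseOrder", {"id": 42})
```

Because producers only ever talk to the bus, adding an application means writing one adaptor rather than N point-to-point connections - which is what dissolves the "integration spaghetti".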

Their technical architecture implementation uses TIB ("The Information Bus") to integrate all the components in their system. The next generation requires they accommodate new applications and web services into their information flow, which makes them very interested in the new information-centric initiative in The Open Group. David showed a conceptual diagram for their next-generation integration broker, and listed the main standards, trust and policy framework, integration broker web service (not point-to-point), and an industry standard discovery mechanism.

Controlling Access to Information - Within Organizational Internal and External Boundaries - Security and User Viewpoints

Gavenraj Sodhi, Product Manager, eTrust Security Management Solutions

Gavenraj asserted that identities are at the core of everyone's business, so identities need to be managed, and centralizing this control is the only efficient way to do it. He listed key concepts around corporate social networking - dynamically associating contact identities related to roles and responsibilities within the organization, controlling the information between organizations, and dynamically constructing relationships across the business. The business drivers are facilitation with risk mitigation, and the business benefits include operational efficiencies, an improved security environment, and improved productivity and satisfaction. He then listed what he believes businesses can do to improve their lot. He reviewed the issues that he sees as central to corporate users and the federated enterprise, and extended these into considering internal and external boundary requirements, where identity and access management become even more important and where identity theft needs to be addressed.

Looking at the identity and access management issues further, Gavenraj considered the key components are provisioning/de-provisioning, single sign-on (where directories have a major role), web access control, access control policy mapping (and enforcement), directory, and web services. Areas of concern include identity theft. Gavenraj listed possible federal and standards resolutions. He presented a Federated Identity & Access Management architecture for controlling the flow of information - because Federated Identity Services require an infrastructure.
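The circle-of-trust idea underpinning federated identity can be illustrated with a small sketch (the domain names and data structure are hypothetical, invented for illustration):

```python
# Hypothetical circle of trust: which identity providers each
# relying organization accepts identity assertions from.
CIRCLE_OF_TRUST = {
    "retailer.example": {"idp.bank.example", "idp.telco.example"},
}

def accept_assertion(relying_party, issuing_idp, user):
    """Federated sign-on: accept an identity assertion only if the
    issuing provider is inside the relying party's circle of trust."""
    trusted = CIRCLE_OF_TRUST.get(relying_party, set())
    return {"user": user, "authenticated": issuing_idp in trusted}
```

The user signs on once at their own provider; every relying party in the circle then accepts that assertion instead of managing its own credential store - which is why the supporting infrastructure matters so much.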

Enabling Multi-Vendor Solutions to Monitor & Automatically Control Heterogeneous Hardware & Software Application Resources

Steve Harriman, Vice President Marketing, VIEO Inc.

Steve started out with the IT Management paradox - we must reduce IT costs, drive IT innovation, and do it now, across the whole business. It is impossible to do all of these at once. However, we can work smart by using automated process controls to handle the mechanistic parts of the tasks.

Today's control system comprises silos of presentation/application/database services, with overlaps and redundancies. We need to reduce and integrate these - and the lessons learned from previous business evolution show that a rational approach to this problem is to apply automated process control. Just as aircraft control systems have evolved to effectively fly the aircraft themselves under the supervisory control of the pilot, so too can IT control systems evolve to do the automatic parts of the tasks, leaving humans free to monitor progress, oversee operations, and override the automated system when specific optimization decisions are appropriate.
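The supervisory-control idea - automation handles routine adjustments while a human can always override - can be sketched as a single control step (the thresholds and scaling factors are invented for illustration):

```python
def control_step(load, capacity, override=None):
    """One step of an automated resource controller: scale capacity
    toward measured load unless a human override is in force."""
    if override is not None:
        return override          # supervisory human decision wins
    if load > 0.8 * capacity:
        return capacity * 2      # scale up before saturation
    if load < 0.2 * capacity and capacity > 1:
        return capacity // 2     # scale down when mostly idle
    return capacity              # within band: leave it alone
```

The controller makes the mechanistic decisions continuously; the operator only intervenes when a specific optimization calls for judgment, mirroring the pilot-and-autopilot relationship.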

As part of its initial work on establishing and defining its development strategy, VIEO did research to arrive at ten rules for intelligent control systems, and Steve described each of them in turn. Steve then listed the business drivers, the maturing technology, the proven concepts that are accepted norms in the industry, and invention (especially mapping resources to applications). He reviewed management evolution from the 1980/90s, through the late 1990s, up to today, where we can expect a virtualized application infrastructure. We need standards in this area, without which the evolution will be painful and slow. Steve listed the key requirements that he sees are needed, noted that Tom Bishop of VIEO is chair of the ARM Initiative (Application Quality & Resource Management), and invited everyone to give their feedback to him or Tom. Steve looked forward to offers of support to move this work forward.

Summarizing, Steve felt it is insane to do the same things over and over again when they can be automated, and so provide all the advantages of improved productivity and accuracy/predictability. The business drivers remain the same as ever, fundamental changes in managing modern application infrastructures are needed, next generation management solutions should bring process control into the data center, and we need new management standards to provide the key business enablers for interoperable solutions.


© The Open Group 1995-2012  Updated on Wednesday, 28 April 2004