Lamont 5Dn Mission Control Software

Computational Orchestration in Oil Production Control Environments


What are 5Dn technologies and how do we get there?

5Dn Technologies

5Dn Technologies consist of the real-time analysis of multiple datasets -- the products of both monitoring and simulation -- collaboratively fused, visualized, and realized. Data is routinely collected in real time over packet-switched communication links by sensor networks placed in wells within producing fields. Computation is distributed across a high-speed information infrastructure -- all steered at the visualization and realization nexus by controlling personnel.


LAMONT'S 5Dn MISSION CONTROL SOFTWARE advances six critical areas of development

1. Component-Based Software and Agents

The information-infrastructure change required for a company to convert to 4D and 5Dn technologies is enormous and profound. It implies a large change in the way computer products interact across organizations and disciplines in the oil and gas business, and it will require enterprise re-engineering of all computer-aided science and engineering aspects. The problem begins at the wellsite, where we would integrate and install a commercially available array of sensors to detect temperature, pressure, fluid fluxes, oil/gas/water mixes, and viscosity changes. The sensors have been invented (some by Lamont), but never integrated into what we call a "Smart Wells" configuration. The information must then be transported in near-real-time to the control center, where it is merged and interpreted by the interpretation team using "component-based software" introduced with the sensor arrays. Intelligent agents must also be introduced to move software components to the distributed database, because the data sizes are often too large for transport. Agents also act as projectors, which extract only the components of the datasets of immediate interest.
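
As a rough illustration of the projector idea, the Python sketch below shows an agent that executes at the data store and ships back only the slice of sensor history of immediate interest, rather than the full dataset. The WellDataStore and ProjectorAgent names and the synthetic data are our own illustrative assumptions, not an existing interface.

    # A minimal sketch of the "agent as projector" idea; all names are
    # illustrative assumptions, not a real Smart Wells interface.
    import numpy as np

    class WellDataStore:
        """Stands in for a distributed store holding full sensor histories."""
        def __init__(self):
            # 1000 time samples x 4 channels: temperature, pressure, flux, viscosity
            self.history = np.random.rand(1000, 4)

    class ProjectorAgent:
        """Runs *at* the data store and ships back only the projection of interest."""
        def __init__(self, channels, t_start, t_end):
            self.channels = channels
            self.t_start, self.t_end = t_start, t_end

        def run(self, store):
            # Extract only the requested channels and time window, so the
            # network carries a small projection instead of the full dataset.
            return store.history[self.t_start:self.t_end, self.channels]

    store = WellDataStore()
    agent = ProjectorAgent(channels=[1], t_start=900, t_end=1000)  # recent pressure only
    print(agent.run(store).shape)   # (100, 1)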

2. Orchestration

A new orchestration layer must be developed that can act intelligently to interoperate among the varying demands of the applications within the Field controller. This layer will exploit lower-level networking capabilities to implement interoperations more efficiently than coordination-language environments alone. Unless the lower network layers help deliver the high-level service requirements of applications in an informed way, it will be impossible to cope with the interoperating demands of future oil and gas production computation. In fact, the temporal and spatial scales of these demands reflect the temporal and spatial structure of the domain. The orchestration layer will translate domain-specific, high-level interoperating demands into lower-level service requirements, managing and optimizing in order to corral lower-level layer functionality to meet these constraints. Persistence will be necessary, not just for buffering, but for checkpoint-restart, interrupt, and redirection functions.
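
A minimal sketch of these two orchestration functions follows, assuming an illustrative mapping from domain-level demands to network service requirements and a file-based checkpoint-restart store. The demand names and the numbers in the table are placeholders, not measured requirements.

    # Sketch: translating high-level demands into service requirements,
    # plus checkpoint-restart persistence. All values are assumptions.
    import json, pathlib

    SERVICE_MAP = {
        # domain-level demand            -> lower-level service requirements
        "stream-4d-seismic":   {"bandwidth_mbps": 100, "latency_ms": 50,  "persistent": True},
        "steer-reservoir-sim": {"bandwidth_mbps": 1,   "latency_ms": 10,  "persistent": True},
        "bulk-model-transfer": {"bandwidth_mbps": 10,  "latency_ms": 500, "persistent": False},
    }

    class OrchestrationLayer:
        def __init__(self, checkpoint_dir="checkpoints"):
            self.dir = pathlib.Path(checkpoint_dir)
            self.dir.mkdir(exist_ok=True)

        def translate(self, demand):
            """Map a domain-specific demand onto network service requirements."""
            return SERVICE_MAP[demand]

        def checkpoint(self, flow_id, state):
            """Persist flow state so an interrupted interoperation can restart."""
            (self.dir / f"{flow_id}.json").write_text(json.dumps(state))

        def restart(self, flow_id):
            return json.loads((self.dir / f"{flow_id}.json").read_text())

    layer = OrchestrationLayer()
    print(layer.translate("stream-4d-seismic"))
    layer.checkpoint("flow-7", {"offset": 42_000_000})
    print(layer.restart("flow-7"))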

3. Visualization

We require the development of novel 3-D visualization technologies and software modules (controllers), and networked, on- and off-platform seismic processing, modeling, and integration software modules (with remote packet switching to deal with the computing communications needs). The technology involves the development of true 4-D color imaging (e.g., BBN's Color Spacegraph) and other Virtual Reality ways of interacting with the controller, using probes. The probes will be intelligent -- some will have pattern-recognition capability to assist with quantitative questions such as volumetric improvements, gas/oil/water mixes, etc.
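
The following sketch suggests, under simplifying assumptions, the kind of quantitative question such a probe might answer: given a 3-D property volume and an interactively placed region, it estimates the volume of cells above a cutoff. The array, threshold, and cell size are illustrative only.

    # Sketch of an "intelligent probe" answering a volumetric question.
    import numpy as np

    def probe_volume(cube, region, threshold, cell_volume_m3):
        """Estimate the volume of cells above `threshold` inside `region`.
        cube: 3-D array of a reservoir property (e.g., oil saturation).
        region: (slice, slice, slice) placed interactively by the viewer."""
        sub = cube[region]
        return np.count_nonzero(sub > threshold) * cell_volume_m3

    cube = np.random.rand(64, 64, 32)                    # stand-in property volume
    region = (slice(10, 30), slice(10, 30), slice(0, 16))
    print(probe_volume(cube, region, threshold=0.8,
                       cell_volume_m3=25.0 * 25.0 * 2.0))  # 25 m x 25 m x 2 m cells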

4. Further Development of 4D Seismic Interpretation Technologies

The development of quick-look 4-D seismic analysis modules is one of the major drivers of the project. The critical requirement is to deliver modular software that allows solutions to time-dependent observations in time for production decisions. That is, realistic production time-scales must drive our ability to visualize 4-D seismic data and its integration with other geological information and seismic/reservoir models. The goal is to take production engineering out of the 2-1/2-D world (2-D bounding horizons plus production logs in time) into a true 4-D world of observing the remaining volume, and volume change, of a producing reservoir over time. Where is the oil and gas coming from, and when, within the context of the true intricacies of the drainage pattern? Another goal is to give production engineers a more global reservoir picture than is presently available. This involves the further development of pattern-recognition and tracking software to extract quantitative production predictions from the 4-D seismic datasets. These will be the first products of the strategic alliance. Additional 4-D detection technologies will then be developed, such as time-dependent well logging and well sensor arrays that are cemented into the casing during the initial stages of every new well -- the 5Dn technologies.
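
A quick-look 4-D differencing step might look like the following sketch, which subtracts two 3-D snapshots of the same reservoir and flags cells whose amplitude change exceeds a noise threshold, as a crude proxy for drained volume. The synthetic data and threshold are assumptions for illustration.

    # Sketch: quick-look time-lapse (4-D) differencing of two 3-D surveys.
    import numpy as np

    def drained_cells(survey_t0, survey_t1, noise_threshold):
        """Return a boolean mask of cells with significant time-lapse change."""
        diff = np.abs(survey_t1 - survey_t0)
        return diff > noise_threshold

    t0 = np.random.rand(64, 64, 32)
    t1 = t0.copy()
    t1[20:30, 20:30, 10:15] += 0.5            # synthetic drainage signature
    mask = drained_cells(t0, t1, noise_threshold=0.25)
    print("cells changed:", int(mask.sum()))  # 10 * 10 * 5 = 500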

5. Database Management

Database services for the Field Control Environment must be distributed -- especially when considering the intellectual property rights of the data product spectrum across collaborating companies. Graded access levels will be needed. An object-oriented representation of the combined schemas (i.e., a mixed approach combining traditional relational with object-oriented databases) will make browsing the data appear seamless.

We must develop a GeoModel Server (GMS) to support uniform mechanisms for scalable access to heterogeneous geological models. The GMS bears a functional analogy to WWW servers, except that the patterns of access and manipulation it must support are substantially richer and address the special features of geo-models. We will build the GMS as a generalization of WWW servers: GMS technologies link related heterogeneous sources of geological information -- stored seismic models, on-line sensor information, and the results of running computations -- and extend the WWW to support efficient access by computational agents. In analogy with HTML, we will develop a Geo-Modeling Language (GML) to create a common representation of geo-models and the linkages between them. The GMS will support mechanisms to control and account for access, permitting organizations to leverage their geo-model resources for commercial benefit, and will organize computational access to distributed stores of geological models as well as to sources of real-time measurements.

The GMS architecture generalizes the WWW in several ways. First, we generalize from documents to a mix of stored and computed geological information. Second, we generalize the retrieval of documents via URLs to the launching of computational agents. Third, navigation and execution of geo-model computations are controlled by agents rather than by an end user. Fourth, we must develop mechanisms to identify ownership by controlling authorizations of agents and accounting for their usage. These generalizations should be valuable for large-scale sharing of complex information resources in other fields as well.
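
To make the GML/GMS analogy concrete, here is a minimal sketch of linked geo-models navigated by an agent rather than an end user. The gms:// URLs, the link relation names, and the in-memory "server" are hypothetical; no actual GML syntax is implied.

    # Sketch: geo-models with typed links between them, traversed by an agent.
    geomodels = {
        "gms://lamont/fieldA/reservoir-sim": {
            "kind": "reservoir-simulation",
            "links": {"calibrated-against": "gms://lamont/fieldA/4d-seismic"},
        },
        "gms://lamont/fieldA/4d-seismic": {
            "kind": "stored-seismic-model",
            "links": {"constrained-by": "gms://lamont/fieldA/well-logs"},
        },
        "gms://lamont/fieldA/well-logs": {"kind": "on-line-sensor-feed", "links": {}},
    }

    def traverse(gms, url, depth=0):
        """An agent (not an end user) navigating linked geo-models."""
        model = gms[url]
        print("  " * depth + f"{url} [{model['kind']}]")
        for relation, target in model["links"].items():
            print("  " * (depth + 1) + f"--{relation}-->")
            traverse(gms, target, depth + 2)

    traverse(geomodels, "gms://lamont/fieldA/reservoir-sim")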

6. Integrator

The strategic alliance will utilize multiple 3-D seismic datasets (4-D), including ongoing seismic reservoir monitoring, to control production through a modular reservoir controller. The controller connects modules that give the production engineering team the capability of calculating the optimal extraction strategy for each reservoir in an oil field. Modules for acoustic and elastic seismic models will be linked to 4-D seismic data analysis modules and reservoir simulation modules in order to computationally control production in an oil field -- for the first time ever. Models and datasets will be jointly visualized and interactively compared utilizing component-based software. The development of the 4-D nexus between seismic data analysis and forward models of the different species -- reservoir, elastic, acoustic -- will allow for interaction and "data fusion" among the production engineer, geologist, and geophysicist. People will be able to explore and interact with the volume in an interactive, unimpeded, and intimate way.
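
The sketch below illustrates the controller-of-modules idea under strong simplifying assumptions: each module exposes a single-method interface, and the controller chains them into a production recommendation. The module behaviors and the toy extraction rule are placeholders, not a real strategy calculation.

    # Sketch: a modular reservoir controller chaining pluggable modules.
    class SeismicAnalysis:
        def step(self, state):
            state["drained_fraction"] = 0.30      # from 4-D differencing (stub)
            return state

    class ReservoirSimulator:
        def step(self, state):
            # Toy rule: recommend higher rates where less of the reservoir has drained.
            state["recommended_rate"] = (1.0 - state["drained_fraction"]) * 1000.0
            return state

    class ReservoirController:
        def __init__(self, modules):
            self.modules = modules
        def run(self, state):
            for module in self.modules:           # each module refines the shared state
                state = module.step(state)
            return state

    controller = ReservoirController([SeismicAnalysis(), ReservoirSimulator()])
    print(controller.run({}))   # {'drained_fraction': 0.3, 'recommended_rate': 700.0}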


The most influential force shaping the computer processing methodology behind 5Dn oil and gas production is that drainage monitoring and remediation is an ill-posed inverse problem. The 5Dn mission is to build a simulator of what is happening "down there" at all spatial and time scales of interest, including seismic, well logs, prior knowledge of geological structure and process, etc. Control of well production, guided by such models, is like a multi-armed Shiva, with information-gathering and control arms working at different time and spatial scales. The challenge is TO INTEGRATE INTO A VISUALIZATION AND REALIZATION NEXUS both the instrumentation of the field (seafloor seismic sensors, for example) and modeling (high-resolution seismic models that can match well logs AND surface seismic images). The 5Dn implementation goal is to progressively couple today's seismic scale to tomorrow's reservoir scale.

Seismic processing, tracking of drainage, and forward and inverse models OF SEISMIC CHANGE must be "orchestrated" by intelligent AGENTS over a high-speed network. Databasing, integrated control, and the eventual merger of offshore with onshore operations, pipeline, and delivery with daily economic models are the goals and objectives of 5Dn.


Component-Based Software Built on the Orchestration Layer

The component-based interoperation of hydrodynamic models with the results of 4D seismic monitoring poses unique computational challenges. We envisage the development of an architecture for "lego" models. These models will be able to describe themselves to each other, and an interface layer will be able to adapt to a mutually specified set of needs, such as spatial and temporal resolution (auto re-gridding). This will involve the development of network-based protocols and a new orchestration protocol layer that can a) understand the requirements of the models, such as the variable mesh geometries of seismic vs. reservoir simulators vs. fluid flow models, and b) provide interpolation and unit conversion services between the models. Providing this concert function as a network service can lead to profound data-exchange bandwidth efficiencies. Providing semantic descriptions of the component-based modules to the orchestration layer can also allow this layer to migrate modules among different hardware to balance load, as well as provide security within the project. Cost, computer-cycle, and communications effectiveness will follow from the orders-of-magnitude improvement in the scientific understanding of how fluids drain into wellbores in the subsurface over time.
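
A minimal sketch of the "lego" idea follows: each model carries a self-description, and an interface layer re-grids and unit-converts one model's output to fit the next model's stated needs. The descriptions, the 1-D re-gridding, and the ft-to-m conversion are simplifying assumptions standing in for variable mesh geometries.

    # Sketch: self-describing models plus an adapting interface layer.
    import numpy as np

    class Model:
        def __init__(self, name, grid_spacing_m, units):
            self.description = {"name": name, "dx": grid_spacing_m, "units": units}

    def adapt(data, producer, consumer):
        """Re-grid and convert units so `consumer` can ingest `producer` output."""
        p, c = producer.description, consumer.description
        n_in = len(data)
        n_out = int(round(n_in * p["dx"] / c["dx"]))   # auto re-gridding (1-D)
        x_in = np.arange(n_in) * p["dx"]
        x_out = np.arange(n_out) * c["dx"]
        regridded = np.interp(x_out, x_in, data)
        if p["units"] == "ft" and c["units"] == "m":   # unit conversion service
            regridded = regridded * 0.3048
        return regridded

    seismic = Model("seismic", grid_spacing_m=25.0, units="ft")
    reservoir = Model("reservoir-sim", grid_spacing_m=50.0, units="m")
    depths = np.linspace(0.0, 10_000.0, 101)            # seismic-scale profile (ft)
    print(adapt(depths, seismic, reservoir).shape)      # (50,)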


Functions of the Orchestration Layer

Once a quantitative approach to Field Control is created, the challenge remains of enabling the disparate software components to interoperate in a distributed but secure way across collaborating organizations. We must produce a major new network protocol layer, which we call the "orchestration layer", to steer and manage the distributed collaborative computation needed for the Field Control concept.


An Orchestration Layer for Oil and Gas Computation, Visualization and Production Control

There are major changes in the way the oil and gas industry is approaching its task of exploration and production. These changes are due to the pervasive downsizing in the industry, even as its technical challenges inflate because new reserve discoveries lie in ever more difficult-to-produce areas. These reserves, such as the ultradeep Gulf of Mexico, hold great promise, but the challenges are becoming too large for any single company to bear the risk. In fact, the development and exploitation of new oil fields now bear many parallels to the business of airframe manufacturing -- in both, we are truly in the era of strategic alliances. In addition, there is much renewed interest in more effective production of existing fields.

These alliances have information-infrastructure needs. Because of the above trends in the oil and gas industry, the methodologies used to discover fields and to produce from them are merging. Seismic data, coupled with acoustic/elastic modeling and reservoir simulation, will become essential to the efficient production of existing and new fields. The expertise for this has, due to downsizing, become more distributed across organizational boundaries. This trend, combined with the evolving coupling between interpretation and modeling, creates a need to interoperate among the major modeling, analysis, and interpretation applications used to explore for and produce oil and gas.


These interoperating needs include:

  1. Data Services, which involve translating data between interoperating codes; this function could be thought of as "smart transport".
  2. The execution of agents on remote data.
  3. Load Balancing.
  4. Multi-Flow Synchronization.
  5. Model Steering (i.e., real-time control, persistence, checkpoint, interrupt, and restart functions, including Timewarping).
  6. Accountability for both security and costing.

Interoperation between scientific and engineering codes creates a need to deliver data in a format acceptable to the consumer of the data. These needs often concern spatial-temporal resolution and extent. The data-services function of the orchestration layer will provide interpolation and extrapolation functions as part of its basic data-delivery role for interpreting programs. This functionality can also be used for coordinate-system conversion. The distributed buffering of the underlying networking support can be used for efficient data transport in cases where extrapolation, interpolation, or downsizing is provided by the orchestration layer. For example, data can be downsized close to (in terms of network connectivity) the generator of the data, whereas it is advantageous to execute extrapolation closer to the consumer. The interpolation of unevenly spaced data to full grids (kriging) is an interesting application of this functionality, as sketched below.
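
Here is a small sketch of that gridding service. True kriging fits a variogram model; as a simplifying stand-in, this uses inverse-distance weighting to interpolate unevenly spaced well observations onto a full grid. The well locations and pressure values are illustrative.

    # Sketch: gridding scattered well data (inverse-distance weighting
    # as a stand-in for kriging).
    import numpy as np

    def idw_grid(xy, values, grid_x, grid_y, power=2.0, eps=1e-12):
        """Interpolate scattered (x, y, value) samples onto a regular grid."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        out = np.empty_like(gx)
        for i in range(gx.shape[0]):
            for j in range(gx.shape[1]):
                d2 = (xy[:, 0] - gx[i, j])**2 + (xy[:, 1] - gy[i, j])**2
                w = 1.0 / (d2**(power / 2.0) + eps)     # nearer wells weigh more
                out[i, j] = np.sum(w * values) / np.sum(w)
        return out

    wells = np.array([[100.0, 200.0], [400.0, 150.0], [250.0, 450.0]])  # well locations (m)
    pressure = np.array([310.0, 295.0, 320.0])                          # bar
    grid = idw_grid(wells, pressure, np.linspace(0, 500, 11), np.linspace(0, 500, 11))
    print(grid.shape)   # (11, 11)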


How BBN's Multi-Flow Synchronization Protocol Works

The orchestration layer involves the concept of an agent to the ether -- the ether agent. High-level knowledge of applications at the orchestration layer will enable ether agents. With this generalized concept of agent, we introduce the idea of extending the WWW concept of the URL to include agents -- the agent resource locator (ARL) -- and a query mechanism, possibly KQML (the Knowledge Query and Manipulation Language developed by Tim Finin) or SQL. We use the concept of data-services functions being encapsulated as projection operations, or "projectors", in a Geo-Model Server (GMS). Agents and ether agents are mediators for data services. The Multi-Flow Synchronization protocol developed at BBN is a way to deal with the synchronization of multiple parallel streams of data. Accountability functions will need fine-grained integration of accountability, such as The Digital Silk Road proposal of Norman Hardy and Eric Dean Tribble, or agents and ether agents that are accountability-aware (secret agents). Agents, ether agents, ARLs, the GMS, Multi-Flow Synchronization, and concepts like the Digital Silk Road together provide all six categories of interoperating functionality. Finally, we are developing the programmatic interface to the orchestration layer and are investigating its relationship to new distributed programming environments such as Nexus from Argonne National Laboratory, Concert/C (IBM), etc.
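
The sketch below conveys the multi-flow synchronization idea only in spirit (it is not BBN's actual protocol): packets from several timestamped flows are buffered, and an aligned frame is released once every flow has contributed data for that timestamp. The flow names and payloads are illustrative assumptions.

    # Sketch: aligning multiple parallel timestamped data streams.
    from collections import defaultdict

    class MultiFlowSynchronizer:
        def __init__(self, flows):
            self.flows = set(flows)
            self.buffers = defaultdict(dict)   # timestamp -> {flow: payload}

        def push(self, flow, timestamp, payload):
            """Buffer one packet; return the aligned frame once complete."""
            self.buffers[timestamp][flow] = payload
            if set(self.buffers[timestamp]) == self.flows:
                return self.buffers.pop(timestamp)   # all flows present: release
            return None

    sync = MultiFlowSynchronizer(["seismic", "well-pressure"])
    print(sync.push("seismic", 1000, "trace-block-17"))   # None: still waiting
    print(sync.push("well-pressure", 1000, "p=305bar"))   # aligned frame released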

An example HLL is AKCESS.* from Computational Mechanics Corporation


Orchestration Layer Slide Set

NPAC WebWork (Geoffrey Fox's Slides)

HPC at the Crossroads -- Academic Niche or Economic Development Cornucopia
NII Compute & Communications Capability in Year 2000 --> 2005
What will National Information Infrastructure give us?
The Next Generation Home Computers include Settop Boxes and Videogame Controllers
Some Implications of HPCC Observations
However we need more than fast enough machines
The 33 Application areas were studied in detail -- Hottest
HPCC needs a large enough market to sustain technology (systems and software)

WebWork

WebWork -- NPAC, Boston University, Cooperating Systems Collaboration
What Is WebWork I?
What Is WebWork II?
WebWork Architecture
WebWork Architecture Diagram
World-Wide Virtual Machine
PCRC Naturally Fits in with WebWork
The Hyperworld of New Interactive Media
Future Components of WebWindows
Server-to-Server Communication Diagram
General WebScript and Agents
WebWork Integration Model
WebWork Technologies Table

WebFlow

WebFlow Paradigm
Java demo (NPAC) -- WebFlow Editor prototype
Java demo (NPAC) -- WebFlow application prototype: Project Manager

Halloween Presentation

RCI Presentation on HPCC and NII for Industry and Education
World Wide Web (WWW) is key to HPCC Implementation
WebWindows is Open Portable Environment
Architecture of Web Software
Illustration of WebWindows Concept for Presentation Software
Full Index for LOCAL Additional Material for WebWork Presentations
Full Index for LOCAL General Foils on PCRC -- Parallel Compiler Runtime Consortium
Full Index for LOCAL WebWork -- MetaComputing and Distributed Software Engineering
Links to HPF Information Elsewhere
NPAC Research Projects
WebTools
WebTools Foils
Status of Foilsets of Geoffrey Fox Presentations

