Spectral Decomposition - The Road Ahead

Presented by Greg Partyka on October 27, 2003, at the "Recent Advances and Road Ahead" Session of the Seventy-Third Annual Meeting of the Society of Exploration Geophysicists, Dallas, U.S.A.


Slide 3

Spectral Decomposition is directly analogous to remote sensing.

In remote sensing, we use sub-bands of much higher electromagnetic frequencies (e.g. infrared and UV) to map characteristics of the earth's surface, thereby detecting and resolving contrasts caused by surface variability in air, vegetation, rock, fluids, etc.

In spectral decomposition, we use sub-bands of substantially lower seismic frequencies (typically less than 100 Hz) to map interference in the subsurface, thereby detecting and resolving subsurface variability in rock, fluid pressure, etc.
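The sub-band mapping described above can be sketched numerically: a short analysis window around the zone of interest is correlated against a single-frequency Fourier kernel, giving one amplitude value per trace and hence one map per frequency. This is a minimal illustration with assumed inputs and a hypothetical `spectral_slice` helper, not the implementation used in the talk.

```python
import numpy as np

def spectral_slice(traces, dt, freq):
    """Amplitude of one seismic frequency sub-band.

    traces : 2-D array (n_traces, n_samples) holding a short analysis
             window around the zone of interest (assumed input).
    dt     : sample interval in seconds (e.g. 0.004 for 4 ms data).
    freq   : analysis frequency in Hz (typically well below 100 Hz).
    """
    n = traces.shape[1]
    t = np.arange(n) * dt
    kernel = np.exp(-2j * np.pi * freq * t)   # single-frequency DFT kernel
    taper = np.hanning(n)                     # reduce window edge effects
    # One amplitude per trace -> a map when the traces are gridded
    return np.abs((traces * taper) @ kernel) * dt
```

Repeating this for a suite of frequencies yields a stack of maps, one per sub-band, which is the basic display that the following slides interpret.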

How does it help?

Slide 4

In seismic work, the goal is typically to use the available signal bandwidth to optimally resolve or detect variability in impedance, thickness, and layer-stacking.
Traditional seismic approaches are plagued by interference and tuning that wreak havoc on our ability to understand and characterize the underlying geologic information.
Traditional methods require constant attention to the wavelet-imposed tuning-thickness and wavelet-imposed dominant frequency.
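As a concrete illustration of that wavelet-imposed limit, tuning thickness is commonly approximated as a quarter of the dominant wavelength. The velocity and frequency values below are illustrative assumptions, not numbers from the talk:

```python
def tuning_thickness(velocity_mps, dominant_freq_hz):
    """Approximate quarter-wavelength tuning thickness in metres.

    Below roughly this thickness, reflections from a bed's top and
    base interfere and can no longer be separated at the wavelet's
    dominant frequency.
    """
    wavelength = velocity_mps / dominant_freq_hz
    return wavelength / 4.0

# e.g. a 3000 m/s interval imaged with a 30 Hz dominant frequency:
print(tuning_thickness(3000.0, 30.0))  # -> 25.0 (metres)
```

Beds thinner than this threshold are exactly where the wavelet overprint dominates, which is the regime spectral decomposition is designed to see through.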

Slide 5

Spectral decomposition, on the other hand, quickly and easily gets underneath the wavelet overprint to reveal geologic content.
It moves detection and resolution out from under the control of the wavelet, and allows impedance, thickness, and layer-stacking to be examined with respect to signal and noise at each frequency. It maximizes your ability to seismically characterize.

Why is that important?

Slide 6

Seismic technology has come a long way, and continues to evolve. It is mature enough that we now routinely expect it to reveal structure and stratigraphy, and in some cases the presence or absence of fluid type. The quantitative tie to reservoir properties, and reservoir performance in particular, is still an evolving science with a lot of uncharted territory. Two of the bigger industry-wide research topics in this area relate to sensitivity and scale, and include questions such as:

Getting past these question marks requires squeezing out as much seismic resolution and detection as possible. Spectral decomposition does that.

Let's look at a couple of examples, both qualitative and quantitative.

Slide 7

Here, for example, is a 3D perspective view of a subsurface structure map over an undeveloped offshore West Africa reservoir. Draped on the structure is a spectral decomposition image. The reservoir is a sandy turbidite filling an erosional valley.

The dominant frequency of the input seismic data was significantly lower than what was required to adequately image the reservoir. Spectral decomposition exposed the higher-frequency information required to image the stacking pattern of flow units.

The draped image conveys a 3D perspective by showing three different slices through the reservoir section with different colour bars (red, green, and blue, from top to bottom). The wider underlying amalgamated fill shows up mostly in the 'deeper' greens and blues, whereas the overlying sinuous channel deposition is exposed in the 'shallower' reds. The three colour bands are allowed to mix such that, for example, features that show up in both red and green mix to create yellow.

For many reservoirs such as this, understanding the vertical stacking patterns of flow units is important for determining optimum drainage with the right number of wells.

We're also pushing the boundaries of quantitative characterization.

Slide 8

Here is an example (from the 2000 SEG) showing the power of spectral decomposition in quantitative reservoir imaging. In this case, an 8 Hz spectral amplitude image reveals the details of a channel geometry with levee deposits. Black represents low amplitude; white represents high amplitude. The channel itself is shale-filled and therefore dim. The oil-filled porosity within the sand-rich levees creates a strong impedance contrast that results in high-amplitude reflections.

Slide 9

Here, we've calibrated that spectral decomposition data and transformed it to reservoir thickness. Colour in this case represents thickness, ranging from 0 to 100 ft. Spectral decomposition enabled us to get underneath the wavelet overprint to expose thickness and impedance contrasts that would otherwise straddle the wavelet-imposed tuning threshold.
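One widely published basis for such a thickness transform (a hedged sketch; the actual calibration used in this example is not described in the talk) is the periodicity of thin-bed spectral notches: a bed of two-way time thickness T notches the reflection amplitude spectrum every 1/T Hz. The simplified picker below, with an assumed `thickness_from_notches` helper, illustrates the idea:

```python
import numpy as np

def thickness_from_notches(freqs, amp_spectrum, velocity_mps):
    """Estimate bed thickness (metres) from spectral notch spacing.

    A thin bed notches the reflection amplitude spectrum every 1/T Hz,
    where T is the bed's two-way time thickness; this simplified
    picker finds the notches and inverts that spacing (illustrative).
    """
    mid = amp_spectrum[1:-1]
    is_notch = (mid < amp_spectrum[:-2]) & (mid < amp_spectrum[2:])
    notch_freqs = freqs[1:-1][is_notch]        # local spectral minima
    if len(notch_freqs) < 2:
        return None                            # need two notches for a spacing
    spacing = np.mean(np.diff(notch_freqs))    # Hz between notches
    two_way_time = 1.0 / spacing               # seconds
    return 0.5 * two_way_time * velocity_mps   # one-way thickness in metres
```

In practice the notch spacing is measured per map location across the frequency sub-bands, producing the kind of thickness map shown on this slide.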

At this point it is worth mentioning that throughout any characterization it is important to keep in mind scales of measurement, potential uncertainty, and errors.

Slide 10

Different scales of measurement respond to different rock properties, representative of the rock-mass sensed by the measurement tool. Unfortunately, we have to live with uncertainty and errors at all measurement scales. For example:

These are just a few of the many examples.

A key message here is that there is often no one absolutely "correct" data source to which all other datasets can be calibrated. We are often best off balancing insight that comes from multiple data sources spanning the scales of measurement. Spectral decomposition maximizes your ability to seismically detect and resolve. It quickly reveals underlying geologic information with greater clarity and certainty, thereby facilitating the road ahead.

What are some of the building blocks of a workflow in the road-ahead?

Slide 11

One of our starting points is to select an appropriate seismic volume that targets the reflective material you are trying to image, pick a rough guide horizon, and use spectral decomposition technologies to predict layer stacking that is no longer constrained by the resolving power of just the dominant frequency of the source wavelet.
That is shown over here, where you can see a cross-section through the input data along with a map of the rough guide horizon. The predicted layer stacking is shown here, both in cross-section and plan view.

The predicted 3-D distribution of layering reveals geological processes, as well as the architecture and heterogeneity of flow and storage units.

It is very important at this point to calibrate and assess the geophysical and geological reasonableness of the resulting layer stacking and its 3-D distribution. For example, please note the presence of channels, the more sheet-like bodies, as well as faults.
One such piece of geophysical feedback and QC comes from the creation of a seismic model derived from the geological model. A mismatch between the seismic model and the input data is cause for re-examination and iteration. As I mentioned previously, we're not after a "black-box", "one-button-push" approach, but rather a very fast iterative approach that facilitates quality control and feedback.

With a geological model in hand, you can experiment with reservoir simulation 'what-ifs': flowing and capturing time-lapse snapshots of fluid and pressure behaviour within the 3-D distribution. This allows you to scope out potential compartmentalization and drainage options via simulation of producer/injector scenarios. An example is shown here on the right, as an animation, where the movement of contours indicates dynamic fluid and pressure behaviour, controlled by the specific producer-injector scenario in conjunction with the predicted reservoir architecture. It points to significant baffles and barriers, such as the faults.

I can't stress enough this "Dynamic Modeling" piece of the road-ahead. It is a very important, but currently under-utilized piece of the puzzle.

A key part of making this "Road-Ahead" happen, that is: Seismic, Geology and Simulation in Real-Time, is to approach the pieces concurrently rather than piece-wise linearly.

Slide 12

For example, a linear workflow may look something like this: seismic imaging work, followed by geologic interpretation, then detailed surface mapping, leading to the building of a detailed geological model. Unfortunately, at this stage, the geologic model often needs to be upscaled to something that a simulator can handle, and finally you end up with dynamic modelling.

Along the way, there are many opportunities for derailment that may go undiscovered until the very end, at which point, after months of work, you may discover that the resulting static and dynamic characterization does not match reality, forcing you to re-evaluate assumptions and rebuild all the way from step 1. All in all, a tremendous loss of time.

Instead...

Slide 13

Consider seismic imaging, geologic interpretation, and reservoir simulation as a package (in real time), to determine whether, in combination, they lead to a reasonable static and dynamic characterization. In the process, flag areas of poor fit for re-examination. Doing so reduces uncertainty, dead-ends, and cycle time.

At BP, we have started down this road ahead, with applications to dozens of areas, and what we are finding is that our spectral decomposition-based technologies are playing a major role in making it happen. Sure, we've got a long way to go, but the road traveled so far has provided insight into what it will take to get there. I'd like to finish up with some of that insight, starting with:

Slide 14

Quality Data

The whole premise of "Garbage In - Garbage Out" really applies. We are often faced with many variations in noise types, both random and coherent. Prior to spectral decomposition analysis, it is important to:

In other words, spectral decomposition exposes the signal that is present; it does not create signal from noise. Having said that, in poor-signal areas, spectral decomposition is often able to reveal geologic content that would otherwise go unnoticed.

Slide 15

Another thing it takes is an evolving toolkit that facilitates creative thinking, and provides a framework for research, algorithm development, implementation, and application. At BP we have such a toolkit in the form of Unix Seismic Processing (USP). There is a non-proprietary version of that toolkit out on the internet, and if you're interested, you can find out more information about it at: www.freeusp.org

Slide 16

Another thing it takes is our evolving High Performance Computing Center.

On site at BP, we have one of the most powerful commercial computing centers in any industry, anywhere in the world. Tasks that would traditionally be unrealistic to perform, taking many months of computing time, become not only realistic but are also turned around in a fraction of the time. This not only enables us to pursue the road ahead, it allows significantly greater scope, breadth, and certainty along the way.

Slide 17

And last but not least....It takes a stable and conducive working environment that fosters collaboration.

Really two parts to this:

The two parts are complementary and feed off each other.

To be spectral-decomposition specific: developing this technology, and extending its impact to areas such as reservoir simulation and time-lapse characterization, has come about, and continues to do so, via collaboration among people with complementary multi-disciplinary skills.

At BP we've been extraordinarily fortunate in terms of both the environment and exceptional colleagues and collaborations. I'd like to acknowledge, in particular, key individuals who as a team are forging down this road ahead, and whom I'm representing here today:

I'd also like to acknowledge Craig Cooper for fostering an environment to make that happen.

In addition, I'd like to thank the user community for their support and feedback, and BP and the SEG for the opportunity to present this material.
