AAPG Explorer Interview – Seismic data conversations, January 2010

January 2010

Seismic Data Talk

For AAPG
By Louise S. Durham

(NOTE to Art: Here is a draft of the story for you to review for message and accuracy of content. Please highlight any changes and return it to me, so I can do a final copy and submit it to the editor. Titles are lower case, as this is the style they use. – Thanks, Louise.)

Seismic Data Conversations Very Revealing

Seismic data processing technology has progressed markedly over the past decade or so.

In fact, the research folks sometimes come up with a new technology that doesn’t just push the envelope; it appears to blast all the way through it.

New seismic methods that allow data to ‘talk’ to itself no doubt fall into this category. The processor or interpreter decides only on the topic for the data to talk about and then instructs the data to “talk amongst themselves,” staying on topic (i.e., staying focused on a specific seismic processing goal) until the conversation delivers that processing objective. The details of the individual data discussions are neither known nor needed; all that matters is that a focused conversation has been set up and has taken place, and that it delivered on its purpose. Data talking to data?

No joke.

That type of thinking has already resulted in new forms of coherent noise removal: free-surface and internal multiples are now removed by distinct algorithms (distinct data conversations) that are widely used within the petroleum industry and require absolutely no subsurface information. They are particularly effective, and demonstrate their mettle and added value compared with other multiple-removal methods, under highly complex and difficult-to-define geologic circumstances, such as subsalt plays in the Gulf of Mexico, offshore Brazil and the Red Sea.
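
To make “data talking to data” concrete, here is a minimal one-dimensional sketch in Python, an illustration rather than M-OSRP’s production algorithm, assuming an impulsive unit source, normal incidence, a free-surface reflection coefficient of -1 and a single subsurface reflector. Under those textbook assumptions, the recorded trace contains the primary plus its free-surface multiple train, and the multiples can be removed by letting the trace ‘converse’ with itself through repeated self-convolution, with no subsurface information used anywhere:

    import numpy as np

    # Toy 1-D trace: one primary at two-way time T (reflection coefficient R),
    # followed by its free-surface multiple train R*(-R)**(n-1) at times n*T.
    n_samples, T, R = 512, 100, 0.5
    data = np.zeros(n_samples)
    for n in range(1, n_samples // T + 1):
        data[n * T] = R * (-R) ** (n - 1)   # n = 1 is the primary

    # Let the data "talk to itself": the series d + d*d + d*d*d + ...
    # (* denotes convolution) cancels the free-surface multiples order by
    # order, using nothing but the recorded data itself.
    primaries = np.zeros(n_samples)
    term = data.copy()
    for _ in range(5):                      # truncate the series at 5 terms
        primaries += term
        term = np.convolve(term, data)[:n_samples]  # next round of "conversation"

    print("multiple at 2T before:", data[2 * T])       # -0.25
    print("multiple at 2T after: ", primaries[2 * T])  # 0.0
    print("primary at T intact:  ", primaries[T])      # 0.5

The multidimensional, prestack generalization is far more involved, but the principle is the one described above: the multiples are predicted and removed by the data interacting with itself.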

The idea now is to extend that earlier noise-removal capability to the extraction of useful subsurface information from signal. Researchers have developed, and are preparing to field test, a method that enables seismic events arriving at the recorder to “converse” with each other to reveal a raft of critical subsurface information. Among the current goals are depth imaging, target delineation and Q compensation, each using a distinct data conversation that focuses on one of these goals and avoids any explicit or implicit need for velocity or any other subsurface information.

The added kicker is that all of these can be accomplished directly, without any particular knowledge of the earth.

This near-mystical-sounding development stems from the Mission-Oriented Seismic Research Program (M-OSRP), established in 2001 at the University of Houston (UH).

The program is supported by more than a dozen major oil and service companies, including Shell, ExxonMobil, ConocoPhillips, ENI, BHP Billiton, Petrobras, Anadarko, EnCana, WesternGeco, IBM, PGS, Devon, PetroChina, IONGEO, Statoil, Landmark/Halliburton, Chevron, BP and Repsol. The program functions under the leadership of its founder, Arthur Weglein, the Hugh Roy and Lillie Cranz Cullen distinguished professor of physics at UH, where he holds appointments in both the physics department and the department of earth and atmospheric sciences.

The research effort is complex, but the goal is defined succinctly.

“We want to make the currently inaccessible petroleum target accessible,” Weglein said, “and the accessible target better defined.”

When a seismic source sends a wave into the earth during the data-acquisition phase, the wave continues traveling until it hits an interface, where a part of it is reflected back to the seismic recorder. The larger the contrast in properties at the interface, the larger the amplitude, or size, of the reflection. The arrival time reveals how long the round trip took.
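
For the simplest case, a wave hitting an interface head-on (normal incidence), the standard textbook relationship behind that statement is the reflection coefficient

$$R = \frac{Z_2 - Z_1}{Z_2 + Z_1},$$

where $Z_1$ and $Z_2$ are the acoustic impedances above and below the interface; the larger the impedance contrast, the larger $R$, and hence the larger the reflection amplitude.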

“We classify events by whether they go straight down and back up, which we call a primary,” Weglein said, “or, if the wave bounces around a bit and then comes back up, we call it a multiply-reflected event, or a multiple.

“You want to get rid of multiples because they hit too many reflectors, and you can’t decipher and isolate the information encoded in a multiply reflected event’s complicated history,” Weglein noted. “We’ve become known for getting rid of multiples without knowing anything about the earth.”

Here’s the blueprint.

Suppose someone observes events, i.e., arrivals of energy at the geophone or hydrophone, and treats the events individually. Historically, the only way to know whether a signal went straight down and back, or bounced around and hit multiple reflectors before returning, is to know the earth, in particular to be able to determine the velocity of the signal as it traversed the subsurface.

“If we mathematically make events talk to each other, we set up a math-physics conversation,” he said. “By getting events to communicate with each other through a certain conversation, they tell us which events are down-and-back, i.e., primaries, and which are multiples, without our knowing anything whatsoever about the earth.”

“The inverse scattering series, or ISS, is a math-physics program we’ve developed that allows that kind of communication between events for different seismic purposes,” Weglein said. “The ISS has the unique ability to achieve all processing goals directly, and in precisely the same manner that free-surface multiples are removed, i.e., without subsurface information.

“The methods we originally developed 20 years ago for removing multiples were highly controversial and radical when we first introduced them, because their claim of not needing any subsurface information was entirely counterintuitive. Back then, our claim of free-surface and internal multiple removal ran counter to the entire evolution of seismic experience and intuition, in which more complex and difficult targets, and the correspondingly more capable processing methods, had at every step of progress demanded more detailed and accurate subsurface information, including velocity. However, our earlier ‘radical’ ideas for removing all multiples have now become fully mainstream: delivered to our sponsors, imitated by other consortia and in widespread industry use worldwide.”

“Now we are focused on primaries and on target-information extraction, and we claim that we can directly determine the depth of the target without any need for a velocity model,” Weglein added. “That is the current controversial and radical thought. As in our earlier history with multiples, we try our best to communicate, but at some point we ignore the naysayers, work hard and effectively to earn and deserve the large support we have from the petroleum industry, and make it happen. Field data tests and impact are where philosophy becomes science, and where the rubber meets the road.”

Ordinarily, when working with an individual primary event, the questions are how deep in the earth the down-going wave encountered a reflector, what specifically it experienced at that depth, and, finally, whether what resides at the reflector is something that interests the petroleum industry.

Consider a simple analog: in a homogeneous geologic setting where the wave velocity is known to be, say, 60 mph, and the wave makes the round trip into the subsurface and back in one hour, determining the depth of the reflector to be 30 miles is a slam dunk.
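
In symbols, that slam dunk is just the round-trip formula

$$\text{depth} = \frac{v\,t}{2} = \frac{60\ \text{mph} \times 1\ \text{hour}}{2} = 30\ \text{miles},$$

where the factor of two accounts for the down-and-back travel path.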

Venture out to subsalt plays in the deepwater Gulf of Mexico, however, and such simplicity disappears.

“The problem with current imaging in the deepwater Gulf of Mexico subsalt environment is that they can’t figure out the velocity above the target, because the salt is very complex,” Weglein said. “They can’t get that 60 mph, so to speak.

“If I have a top-salt primary, or bounce, a primary from the bottom of the salt and the subsalt target primary, and I know the velocity experienced in reaching each of these, then I can figure the depths,” Weglein said. “But all too often I can’t, because it’s a very complicated problem in such complex subsalt plays.

“We have a roughly 90 percent failure rate in deepwater Gulf of Mexico drilling, with only 25 percent of wells even reaching the target,” Weglein said. “If we were more effective in determining velocity and acquiring images under salt, we would have a much higher drilling success rate.”

Well, you say, something’s obviously wrong, so maybe the industry folks just need to collect more data and latch on to more computer speed.

Won’t work.

“Collecting more wide-azimuth data and having faster computers is useful, but by themselves they neither recognize nor address the underlying problems; by themselves they don’t represent a comprehensive response to the imaging challenge. What’s missing, and what’s wrong, is what we call a breakdown or violation of algorithmic assumptions, violations not caused by limited data or computers, and hence not addressed solely by those two important and useful factors,” Weglein noted. “The current ability to find velocity fails under complex geology, so we’ve been looking for a method to find depth without needing velocity, aiming to locate and delineate target reservoirs without having to know anything above them.

“If you have a problem finding velocity, there are two approaches,” he noted. “One is to find a new and improved way to determine the velocity; then you can use current imaging methods, which depend on having an accurate velocity model. Unfortunately, there is no candidate method or concept today with that improved-velocity promise or potential.

“Or you can take the second approach: find a totally new imaging method that doesn’t need velocity, directly or indirectly, and that’s what we’re doing.

“These primaries from the top of the salt, the bottom of the salt and the subsalt target have to have a conversation, a math-physics conversation,” Weglein emphasized. “There’s a certain math-physics communication, which the ISS allows, that will output the depth directly, without the velocity.

“If you allow all those primaries, or single bounces, from top salt, base salt and the subsalt target to communicate with each other, they will locate each of their reflectors,” Weglein said, “without needing, in principle or practice, to know the velocity or anything else about the earth.”

“We’re after a game-changing new imaging capability that will be effective where current imaging methods fail, thereby making currently inaccessible targets accessible,” Weglein noted.

He emphasized that this new target-location and imaging capability applies to complex geology beyond subsalt, and to shallow-water as well as deepwater environments. It also applies to the daunting onshore challenges of internal multiple removal and depth imaging.

The first field test of the ISS imaging theory is scheduled within a year and likely will occur in the deepwater Gulf of Mexico. Actually, this will be a sequence of tests, kicking off with a significant imaging challenge (e.g., a fault shadow zone), though not the worst and most daunting challenge first, and moving in stages to the more difficult, thereby enabling the M-OSRP team to build its imaging experience on field data. The first field data test of this new imaging capability is an important and exciting moment, and the next delivery, both within ISS history and for exploration seismology.

Weglein said the team began tackling this imaging problem about seven years ago, noting that new capabilities from long-range, high-impact, fundamental research usually require seven to 10 years. For example, the ISS internal multiple removal delivery took about eight years, and in the 1970s the pioneering development of finite-difference seismic migration by Jon Claerbout of Stanford University took 10 years from embryonic concept to field data application.

“We are very fortunate to be given the opportunity and privilege to address this next generation of pressing seismic exploration imaging challenges,” Weglein stated. “The M-OSRP program clearly indicates that the petroleum industry will support fundamental, high-impact, potentially game-changing research if you can describe to the sponsors, in terms that make sense to them, what benefits would derive, and be delivered, if the research succeeds. The petroleum industry is definitely not risk averse, and it will invest and partner to develop new and relevant high-impact predictive capability. That industry interest fully aligns with our university mandate and with our program’s responsibility and commitment to support and encourage fundamental scientific advancement in exploration seismology and to serve the educational and career interests of our students.”