30 STRONG: Computer tech revolutionizes seismic
Better data resolution, high-tech processing create collaborative approach to subsurface
Alan Bailey
Wind back 30 years in the seismic industry, and you’d probably have seen an expert geophysicist huddled with a bunch of colored pencils over a long sheet of paper covered with the squiggly plots of seismic traces. The interpreter would mark up his or her ideas on how those squiggles represented underground geologic structures, before handing off the interpreted plot to, say, a geologist as an authoritative assessment of what lay under the ground.
Those days are long gone. A quantum leap in the amount of detailed information that geoscientists and others have been able to glean from the seismic data has revolutionized seismic interpretation.
“The days of having a seismic interpreter sit at his workstation in a cubicle and just basically do his interpretation and hand it off to someone — that type of thing just isn’t done any more because you really need to be working in an integrated fashion with everyone,” Tom Walsh, principal partner and manager of Petrotechnical Resources of Alaska, told Petroleum News.
The seismic process
A seismic survey involves sending sound waves from a surface vibrator or, offshore, from an array of air guns into the subsurface rocks. The sound waves bounce off the boundaries between different types of rock strata, and devices called geophones on the surface detect the resulting echoes.
The echoes appear as peaks in plots of the sound signals received at a geophone. And by laying the plots from an array of geophones side by side, it is possible to use those echo “peaks” to pick out the images of subsurface rock structures.
To minimize unwanted noise that can obliterate the echoes, surveyors shoot multiple recordings for the same survey point, using sound sources and geophones located over a range of different offsets from that point. Adding together the results of the multiple recordings has the effect of canceling out random noise while enhancing the coherent signals from the echoes.
The signals from the geophones are computer processed and plotted.
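As a rough illustration of the stacking step described above, the sketch below (not drawn from any contractor's actual processing software) averages repeated recordings of the same synthetic echo; the trace length, number of recordings and noise level are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 500       # samples per synthetic trace (invented)
n_recordings = 60     # repeated recordings of the same survey point (invented)
echo_time = 250       # sample index where the reflection echo arrives

# The coherent part of the signal: a simple wavelet at the echo arrival time.
t = np.arange(n_samples)
echo = np.exp(-0.5 * ((t - echo_time) / 5.0) ** 2)

# Each recording is the same echo buried in independent random noise.
recordings = echo + rng.normal(scale=2.0, size=(n_recordings, n_samples))

# Stacking: averaging the recordings cancels random noise while the
# coherent echo adds constructively.
stacked = recordings.mean(axis=0)

def snr(trace):
    """Rough signal-to-noise estimate: echo peak over off-echo scatter."""
    return trace[echo_time] / np.std(trace[:200])

print(f"single recording SNR ~ {snr(recordings[0]):.2f}")
print(f"stacked recording SNR ~ {snr(stacked):.2f}")
```

Because independent random noise averages toward zero while the echo adds coherently, the signal-to-noise ratio improves roughly with the square root of the number of recordings stacked.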
Massive data volumes
The vast amount of data from a seismic survey used to place severe limitations on how many recordings, or channels, could be made from a single sound source shot. But rapidly evolving computer and recording technology over the past few decades has enabled an exponential increase in the number of channels that a seismic survey can gather — data are now recorded on small computer disk drives, rather than on roomfuls of magnetic tape reels, while modern high-power computers can organize and process the data very rapidly.
In 1977 you might have had 200 channels if you were lucky, but nowadays a survey typically involves around 10,000 channels, Jon Anderson, chief geophysicist, exploration and land for ConocoPhillips Alaska, told Petroleum News.
The vast increase in the number of channels has enabled the distances between geophones to be reduced and the resolution of the images of the subsurface to be greatly increased — the effect is a bit like increasing the pixel density in a digital camera.
That improved resolution has resulted, for example, in an ability to resolve small geologic faults that can cause significant disruption to the reservoirs of oil fields — knowing the fault locations assists with precision well planning and improved reservoir modeling.
“That’s been a great success story for North Slope fields which are fairly well shattered by faults,” Walsh said.
2-D, 3-D, 4-D
The increase in the number of seismic channels recorded during a survey has also helped in the development of a technique known as 3-D surveying.
In a traditional survey, known as a 2-D survey, seismic signals are triggered and the corresponding geophone recordings made along a single line. The data collected then result in a two-dimensional image of the subsurface below the line of the survey. Multiple 2-D surveys along lines in different directions across a region then give a picture of the subsurface geology.
In a 3-D survey, the seismic sound sources and geophones are placed in a surface grid, rather than along a line. The resulting data enables a three-dimensional image of the subsurface to be produced.
It’s a bit like the difference between a CAT scan and a regular X-ray, Michael Faust, offshore exploration manager for ConocoPhillips Alaska, told Petroleum News.
“Now you can see a three-dimensional image and you can rotate it around and slice through it in any direction and really see what’s going on,” Faust said.
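The data-cube idea behind Faust's description can be sketched in a few lines: a 3-D survey behaves like a volume indexed by two horizontal positions and travel time, and slicing it in different directions produces vertical sections or map-view time slices. The array sizes and amplitudes below are purely synthetic assumptions.

```python
import numpy as np

# A made-up 3-D seismic volume: inline x crossline x two-way travel time.
rng = np.random.default_rng(1)
volume = rng.normal(size=(200, 150, 400))   # amplitudes, purely synthetic

# A 2-D survey line corresponds to a single vertical slice of the cube...
vertical_section = volume[100, :, :]    # one inline: crossline vs. time
crossline_section = volume[:, 75, :]    # one crossline: inline vs. time

# ...while the full cube also allows horizontal "time slices", which have
# no 2-D equivalent and show map-view patterns such as fault traces.
time_slice = volume[:, :, 200]          # amplitude map at one travel time

print(vertical_section.shape, crossline_section.shape, time_slice.shape)
```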
Jon Konkler, senior development geophysicist for BP Exploration (Alaska), said that the use of 3-D seismic started as a technique to assist with oilfield development, rather than for oil and gas exploration.
The pioneering use of 3-D seismic on Alaska’s North Slope dated back to 1978, when a 3-D survey was first used to image the Prudhoe Bay gas cap, Konkler said. Since that time 3-D seismic has been shot at least once in every field on the slope and in some fields there have been three or four surveys of this type.
And, as in other types of seismic survey, the spatial resolution has continuously improved over time, as geophone spacings have shrunk.
Early 3-D surveys proved beneficial in mapping geologic structures but only had a horizontal resolution of perhaps 400 to 500 feet. A 3-D survey in a field such as Prudhoe Bay now typically involves thousands of geophones, resulting in a resolution as small as 55 by 55 feet, Konkler said.
“In doing that we found that we were able to pinpoint our faults a lot better, understand what faults might be there … and which ones aren’t,” Konkler said.
And with well spacings becoming ever shorter in a mature field such as Prudhoe Bay, 3-D seismic data have become critical in determining where the remaining oil is located.
“We want to make sure that we drill the most economic target that we can find,” Konkler said.
The most recent stage of evolution in seismic technology involves what is termed 4-D surveying, in which a 3-D survey is conducted periodically in the same oil field. By finding subtle differences between the data from one survey to the next, interpreters can try to glean information about the movement of oil, gas and water through a field reservoir.
The frequency with which 3-D surveys are done to form a 4-D survey depends on the speed of migration of fluids through the reservoir — in a mature field such as Prudhoe Bay a 3-D survey might be carried out every three to five years, but surveys might be done more frequently in a newer field, Konkler said.
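A minimal sketch of the 4-D differencing idea, under strong simplifying assumptions: two co-registered, identically processed synthetic volumes stand in for a baseline survey and a repeat survey, and cells whose amplitude change rises well above the background scatter are flagged as candidate fluid-movement signals. Real time-lapse workflows require careful matching of the two surveys before any subtraction.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic, co-registered 3-D volumes standing in for a baseline survey
# and a later repeat ("monitor") survey of the same field.
baseline = rng.normal(size=(100, 100, 300))
monitor = baseline + rng.normal(scale=0.1, size=baseline.shape)  # imperfect repeatability

# Pretend fluid movement has changed the reflection amplitudes in one zone.
monitor[40:60, 40:60, 150:170] += 1.5

# The 4-D signal is the difference between the two surveys; where nothing
# has changed, the difference should be close to zero.
difference = monitor - baseline

# Flag cells where the change stands well above the background scatter,
# using a quiet corner of the volume as a noise estimate.
threshold = 5 * np.std(difference[:20, :20, :20])
changed = np.abs(difference) > threshold
print(f"cells flagged as changed: {changed.sum()}")
```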
But 4-D surveying is very much in its infancy as a technique that could make a major impact on oilfield development.
“The North Slope is one of those places where we’ve started investigating, does it work?” Konkler said. “We’ve got a couple of places where we’ve overlain successive surveys and we’re in the process of evaluating can we see the fluid movements and, if we can … how do we use that.”
Improved processing
As well as enabling the collection of vast amounts of data, modern computer technology has opened the door to a whole new world of seismic data processing and display.
People can now evaluate workstation displays of many different attributes of the data, including the sound wave amplitudes, signal coherence, signal frequency and signal phase, Anderson said. And the raw sound data from the seismic survey can be converted into inferred rock properties such as the rock density or the sound velocity in the rock — interpreters can then link those rock properties back to similar properties that are measured in wells in the area of a survey, Walsh said.
Those linkages to well data, combined with sophisticated computer processing and display, now enable interpreters to use seismic data to determine much more about the subsurface geology than just the basic structure of the rock strata.
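The attribute calculations Anderson describes can be sketched for a single trace: the analytic signal obtained from a Hilbert transform yields the amplitude envelope, instantaneous phase and instantaneous frequency. This is the generic textbook computation rather than any vendor's workflow, and the wavelet, sample interval and trace length below are invented.

```python
import numpy as np
from scipy.signal import hilbert

dt = 0.002                       # sample interval in seconds (2 ms, assumed)
t = np.arange(0, 2.0, dt)        # a 2-second synthetic trace

# Invented trace: a 30 Hz wavelet whose amplitude varies along the trace.
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 1.0) / 0.3) ** 2)

# Analytic signal from the Hilbert transform.
analytic = hilbert(trace)

# Standard instantaneous attributes derived from the analytic signal.
envelope = np.abs(analytic)                      # reflection strength
phase = np.unwrap(np.angle(analytic))            # instantaneous phase (radians)
inst_freq = np.diff(phase) / (2 * np.pi * dt)    # instantaneous frequency (Hz)

print(f"peak envelope: {envelope.max():.2f}")
print(f"median instantaneous frequency: {np.median(inst_freq):.1f} Hz")
```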
Computer displays can overlay different types of seismic data, well data and petroleum engineering data in composite plots that enable new insights into the data. And data depicted in three dimensions can be rotated and tilted, so that people can assess whether faults and other geologic interpretations appear to make sense, Konkler said.
And this ability to simultaneously view several different types of data has driven the need for teams of different specialists to work collaboratively on seismic interpretations.
“Four-D doesn’t just stand for the fourth dimension in time; it stands for the — at least — four disciplines it takes to interpret that data,” Konkler said. “You need a team of different disciplines to do 4-D interpretation, because you have to have a geophysicist to understand what’s making the signal change; and you have to have your reservoir engineer and your petroleum engineer, and your driller sometimes, and your geologist, to talk about the geology, the production history, etc.”
And as data interpretation becomes more multi-disciplinary, experts within each discipline tend to need some level of understanding of the other disciplines.
“As geophysicists we have to learn a lot of things outside of geophysics,” Konkler said. “We have to learn about reservoir engineering. We always have to understand and be able to know geologically that our interpretation makes sense. We have to understand petrophysics to understand what the (well) logs are telling us, versus what we think is in the reservoir.”
A team of specialists now often uses a purpose-built room to meet and discuss how to interpret data. Simultaneous or overlaid computer displays of seismic, well and geologic data, coupled with interactive interpretation software, facilitate the discussions and enable the interpretations to be captured as the discussions progress.
BP uses rooms that it calls “collaborative visualization environments,” or COVEs. Communication links between COVEs at different BP sites, such as Anchorage and Houston, enable specialists across different sites to assess the same data collaboratively, Konkler said.
BP also has a facility known as the “highly interactive visualization environment,” or HIVE, in which a team can view and manipulate large three-dimensional images of subsurface data.
Exploration
Thirty years ago, traditional 2-D seismic surveying was the primary exploration technique, used to find large underground structures that might trap oil and gas.
“We used to just look for big trapping structures, big major features that were easy to see on half-a-dozen seismic lines,” Faust said.
But success with 3-D seismic in oilfield development led to the subsequent use of 3-D techniques in exploration. And the use of high-resolution 3-D seismic exploration has really come into its own on the North Slope, where many of the major structures have been drilled and companies now tend to focus on the search for small stratigraphic traps.
“Alpine is a great example of very significant leveraging using seismic data,” Walsh said. “That whole NPR-A area is now something that people are using very sophisticated tools to do their interpretations. … We’re not just looking for bumps any more. We are looking for those stratigraphic traps. And to find those you have to use these more sophisticated tools.”
Walsh also cited the Tarn field as an example of a situation where high-resolution 3-D surveying proved critical to exploration success — after drilling several unsuccessful wells, ConocoPhillips used 3-D seismic to find the Tarn reservoir, Walsh said.
“The 3-D seismic of Tarn is fascinating in what it tells you about the architecture of that reservoir,” Walsh said. “It really dramatically shows exactly where the reservoir channels are and where the high-quality reservoir is.”
Still use 2-D
Not that 2-D seismic has disappeared from exploration programs. 2-D seismic is much cheaper than 3-D seismic when it comes to surveying large areas of territory. So, companies tend to shoot 2-D seismic surveys over relatively wide areas, and then use 3-D seismic to home in on specific prospects, Faust said.
But the ability of high-resolution 3-D surveying, coupled with modern visualization techniques, to pinpoint drilling targets really has opened up a world of exploration that was not available 30 years ago. In fact, the use of 3-D seismic has significantly improved the success rate for exploration wells. The average exploration well success rate 20 years ago was 10 percent, Faust said.
“With the advent of 3-D data that jumped to almost 50 percent,” Faust said. “So suddenly you were drilling far fewer wells to find the same amount of oil.”
“Obviously we’re very dependent on 3-D seismic on the slope,” Walsh said. “Pretty much everything that is prospective is now shot with 3-D seismic data.”