Agile

Innovations of the decade

Exploration geophysics and subsurface geoscience have come a long way since 2001. I thought I could just sneak under the wire before the end of January with a look back at the ideas and technologies that have changed how we find oil and gas today. The list isn't definitive, or even objective: I have a bias towards the realm of integrated subsurface interpretation. Anyone with another perspective would, I’m certain, pick different highlights of the previous decade.

It’s fun to think back to the year 2000. It’s the year I emigrated to Canada from Norway, so I remember it clearly. I was at university for most of the 1990s, but my recollection is that exploration geoscience was all about the emergence of computer-based interpretation, the commoditization of 3D seismic data, huge integrated databases, and the acceptance of amplitude-versus-offset methods (or AVO) as a valid approach.

Here’s what I think were the greatest advances of the noughties, the decade 2001 to 2010:

  • The rise of the Linux workstation. Even in oil-and-gas tech-hotspot Calgary, it only really started in about 2003, and was regarded by many as a fad until about 2006. It changed everything. For example: in 2000, one integrated oil company in Calgary paid well over a million dollars for a single Silicon Graphics machine for its viz room. It had fast graphics and 12 GB of RAM. Five years later, the same company put 16 GB of RAM and better graphics on every interpreter’s desk. For $20 000.
  • Cheap high-performance computing. Connected to this trend, the rise of Linux and denser silicon chips have also made high-performance computing, especially multi-node clusters, much, much more affordable. This has enabled countless advances that I am too lowly to know much about, but I know the list includes reverse time migration, anisotropic migration, high trace densities, massive fold, vibrator arrays, and lots of other wonderful things.
  • Real-time voxel picking. Volume interpretation was born in the mid-1990s, and you could make a case for its continued obscurity compared to traditional line picking. But I think it took off in the early part of the decade as Paradigm’s VoxelGeo software fought Magic Earth’s eye-opening GeoProbe for the crown of Best Seismic Video Game. The tools are now widely available, even free (see below), but in my opinion still not widely enough used. (There’s a toy sketch of seed-based voxel picking after this list.)
  • Seismic monitoring matured into an almost ubiquitous tool in the management of large fields. Active time-lapse surveys are commonplace on older oil and gas fields as they undergo enhanced recovery methods. But today monitor surveys are being acquired after as little as a few weeks on oil sands fields under steam-assisted gravity drainage (or SAGD), as steam-chamber conformance in the early weeks is critical to achieving good field performance. Passive monitoring of fracture stimulation jobs is worthy of special mention because of its popularity in the ongoing shale gas boom. I think one of the biggest achievements of the new decade will be realizing the full utility of these huge new data sets.
  • Frequency analysis was a rather obscure corner of seismic analysis until Greg Partyka wrote his game-changing paper in 1999 (Partyka et al., 1999, The Leading Edge 18). However, it took the release of Landmark’s Spectral Decomposition software in about 2001 to open up the workflows to interpreters at large oil companies, and they are now widely available (if not all that widely understood, or appreciated!). A bare-bones sketch of the short-window transform behind these workflows follows this list.
  • Discontinuity analysis: volume attributes like coherency and semblance are part of the everyday workflow, while newer attributes like curvature (introduced to most of the community by Andy Roberts, 2001, First Break 19) have enabled interpreters to see and exploit subtle and discontinuous fracture systems. (A toy semblance calculation appears below the list.)
  • Azimuthal methods. Perhaps an extension of these stack-based discontinuity attributes, amplitude variation with azimuth (AVAz) and velocity variation with azimuth (VVAz) are emerging as powerful tools for understanding the 3D distribution of fractures. Along with high-resolution image logs like Schlumberger's FMI tool, and other azimuthal logs like the SonicScanner tool, there is a lot of work going on here. The next step will be integrating these technologies. (A simple AVAz fit is sketched below the list.)
  • Wavefield sampling. Seismic acquisition is not my domain of expertise, but I know there has been a revolution in both land and marine recording. There have been giant leaps in trace density. These have been made possible by new technologies like wireless receiver units and cheaper supercomputers (see above), but also by processing methods which allow simultaneous recording of overlapping vibrator sweeps, for example. And there have been at least a couple of revolutions in receiver design, principally driven by the desire to record all three orthogonal components of motion.

  • The open source software movement began to flourish in the last decade. It has always existed in some form, and is by no means mainstream yet, but I think it went from niche and academic to widespread and industrial. The internet and Linux are largely responsible for this, of course; it’s part of modern science and technology. Sometimes it’s easy to feel like our corner of the scientific world is left out of these trends, but it isn’t. Tools like Madagascar (seismic processing) and OpendTect (interpretation) are just the beginning!
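
If you haven't met voxel picking, here's a toy sketch of the idea in Python: start from a single seed voxel and grow a geobody by collecting every connected voxel whose amplitude exceeds a threshold. The array sizes, the threshold, and the NumPy approach are my own invention for illustration, not how any particular commercial tool does it.

```python
from collections import deque
import numpy as np

def grow_geobody(volume, seed, threshold):
    """Boolean mask of voxels connected to `seed` with amplitude >= threshold."""
    mask = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        i, j, k = queue.popleft()
        if mask[i, j, k] or volume[i, j, k] < threshold:
            continue
        mask[i, j, k] = True
        # Visit the six face-connected neighbours of this voxel
        for di, dj, dk in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
            ni, nj, nk = i + di, j + dj, k + dk
            if (0 <= ni < volume.shape[0] and
                0 <= nj < volume.shape[1] and
                0 <= nk < volume.shape[2]):
                queue.append((ni, nj, nk))
    return mask

# Synthetic volume: dim background plus one bright, connected body
rng = np.random.default_rng(1)
vol = rng.random((20, 20, 50)) * 0.4
vol[8:12, :, 20:30] = 0.9
body = grow_geobody(vol, seed=(10, 10, 25), threshold=0.5)
print(body.sum(), "voxels in the picked geobody")   # 4 x 20 x 10 = 800
```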
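
The spectral decomposition item boils down to a short-window Fourier transform, in the spirit of Partyka's paper: slide a tapered window down each trace and keep the amplitude of the frequency bin you care about. Here is a minimal sketch on one synthetic trace, with made-up window length, sample rate, and a 30 Hz target frequency.

```python
import numpy as np

dt = 0.004                                   # 4 ms sample interval
t = np.arange(0, 1.0, dt)                    # one second of trace
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)

win, hop = 64, 8                             # 256 ms window, 32 ms step
freqs = np.fft.rfftfreq(win, dt)
bin_30 = np.argmin(np.abs(freqs - 30.0))     # nearest bin to 30 Hz

amp_30hz = []
for start in range(0, len(trace) - win + 1, hop):
    w = trace[start:start + win] * np.hanning(win)
    amp_30hz.append(np.abs(np.fft.rfft(w))[bin_30])   # 30 Hz amplitude in this window

print(len(amp_30hz), "window positions; peak 30 Hz amplitude:", round(max(amp_30hz), 2))
```

Do the same for every trace and every frequency of interest and you have a tuning cube to slice through.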
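
The discontinuity attributes are built on a similarly simple idea: measure how alike neighbouring traces are inside a small window. Here is a toy semblance calculation; real implementations add dip steering, tapering, and true 3D windows, and the traces below are synthetic.

```python
import numpy as np

def semblance(patch):
    """Semblance of an (n_traces, n_samples) window: 1 for identical traces, about 1/n for noise."""
    n_traces = patch.shape[0]
    stacked_energy = np.sum(np.sum(patch, axis=0) ** 2)   # energy of the stacked trace
    total_energy = n_traces * np.sum(patch ** 2)          # total energy of all traces
    return stacked_energy / total_energy if total_energy > 0 else 0.0

rng = np.random.default_rng(0)
coherent = np.tile(rng.standard_normal(16), (5, 1))   # five identical traces
noisy = rng.standard_normal((5, 16))                  # five unrelated traces

print(round(semblance(coherent), 2))   # 1.0 -> continuous reflector
print(round(semblance(noisy), 2))      # roughly 0.2 -> discontinuity or noise
```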
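
Finally, the kernel of the AVAz method is a fit of amplitude against source–receiver azimuth with something like A(φ) ≈ A + B·cos 2(φ − φsym), which linearizes neatly for least squares. The picked azimuths and amplitudes below are invented, so treat this as a sketch of the arithmetic rather than a workflow.

```python
import numpy as np

az = np.radians([0, 30, 60, 90, 120, 150])              # picked azimuths, degrees -> radians
amp = np.array([1.20, 1.10, 0.90, 0.80, 0.90, 1.10])    # picked amplitudes (synthetic)

# Rewrite A + B*cos(2*(phi - phi_sym)) as A + C*cos(2*phi) + S*sin(2*phi)
G = np.column_stack([np.ones_like(az), np.cos(2 * az), np.sin(2 * az)])
(A, C, S), *_ = np.linalg.lstsq(G, amp, rcond=None)

B = np.hypot(C, S)                             # magnitude of the azimuthal variation
phi_sym = 0.5 * np.degrees(np.arctan2(S, C))   # symmetry azimuth, in degrees
print(f"A = {A:.2f}, B = {B:.2f}, symmetry azimuth = {phi_sym:.0f} degrees")
```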

Please let us know what you think in the comments. I know I have glossed over a lot in this post, and probably missed your own favourite innovation.

Next week I'll have a look at what the next decade might bring.

Linux is a trademark of Linus Torvalds, and the Tux mascot is attributable to Larry Ewing and the GIMP. Landmark and its software are trademarks of Landmark and Halliburton. OpendTect is a trademark of dGB Earth Sciences. VoxelGeo is a trademark of Paradigm. FMI and SonicScanner are trademarks of Schlumberger. Seismic image shows Penobscot data from Sable Island, Nova Scotia, available from Open Seismic Repository.