G is for Gather

When a geophysicist speaks about pre-stack data, they are usually talking about a particular class of gather. A gather is a collection of seismic traces which share some common geometric attribute. The term gather usually refers to a common depth point (CDP) or common midpoint (CMP) gather. Gathers are sorted from field records in order to examine how amplitude, signal-to-noise ratio, moveout, frequency content, phase, and other attributes important for data processing and imaging vary across the data.

Common shot or receiver gather: Basic quality assessment tools in field acquisition. When the traces of a gather come from a single shot and many receivers, it is called a common shot gather; a single receiver recording many shots gives a common receiver gather. These displays make it very easy to inspect traces for bad shots or bad receivers.

Common midpoint gather, CMP: The stereotypical gather: traces are sorted by surface geometry to approximate a single reflection point in the earth. Data from several shots and receivers are combined into a single gather. The traces are sorted by offset in order to perform velocity analysis for data processing and hyperbolic moveout correction. Only shot–receiver geometry is required to construct this type of gather.
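The sorting described above can be sketched in a few lines of code. This is an illustrative toy, not production geometry handling: the function name, the 1D coordinates, and the bin size are all invented for the example.

```python
# Sketch: sorting traces into common midpoint (CMP) gathers using
# shot-receiver geometry alone. Coordinates are 1D (distance along
# the line); names and bin size are invented for illustration.
from collections import defaultdict

def sort_into_cmp_gathers(traces, bin_size=25.0):
    """Group traces by midpoint bin, then sort each gather by offset.

    traces: iterable of (shot_x, receiver_x, trace_id) tuples.
    Returns {bin_centre: [(offset, trace_id), ...]}, each gather
    sorted from near to far offset, ready for velocity analysis.
    """
    gathers = defaultdict(list)
    for shot_x, rec_x, trace_id in traces:
        midpoint = (shot_x + rec_x) / 2.0   # halfway between shot and receiver
        offset = abs(rec_x - shot_x)        # shot-receiver separation
        bin_centre = round(midpoint / bin_size) * bin_size
        gathers[bin_centre].append((offset, trace_id))
    for gather in gathers.values():
        gather.sort()                       # near offsets first
    return dict(gathers)

# Tiny synthetic line: two shots recorded into overlapping receivers.
traces = [(0, 100, 'a'), (0, 200, 'b'), (50, 150, 'c'), (50, 250, 'd')]
cmp_gathers = sort_into_cmp_gathers(traces, bin_size=50.0)
# Traces 'b' and 'c' come from different shots but share the midpoint
# bin at 100 m, so they land in the same gather.
```

That mixing of traces from different shots into one gather is exactly what distinguishes a CMP gather from a shot gather; real surveys do the same thing with 2D midpoints and inline/crossline bins.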

Common depth point gather, CDP: A more sophisticated collection of traces that takes dipping reflector geometry and other subsurface properties into account. CDPs can be stacked to produce a structure stack, and could be used for AVO work, though most authors recommend using image gathers or CIPs [see the update below for a description of CIPs]. A priori information about the subsurface, usually a velocity model, must be applied along with the shot–receiver geometry in order to construct this type of gather. [This paragraph has been edited to reflect the update below].
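Velocity enters both the CMP and CDP workflows through the hyperbolic moveout relationship, t(x) = sqrt(t0² + (x/v)²), which velocity analysis tries to flatten. A minimal sketch, with made-up values for t0 and the NMO velocity:

```python
# Sketch: hyperbolic moveout, the relationship exploited in velocity
# analysis on CMP gathers. t0 and v_nmo are made-up example values.
import math

def traveltime(offset, t0, v_nmo):
    """Two-way time (s) to a flat reflector: a hyperbola in offset."""
    return math.sqrt(t0 ** 2 + (offset / v_nmo) ** 2)

def nmo_correction(offset, t0, v_nmo):
    """Time shift (s) that flattens the hyperbola back to t0."""
    return traveltime(offset, t0, v_nmo) - t0

t0, v = 1.0, 2000.0   # 1 s zero-offset time, 2000 m/s NMO velocity
for x in (0.0, 1000.0, 2000.0):
    print(f"offset {x:6.0f} m: t = {traveltime(x, t0, v):.3f} s, "
          f"shift = {nmo_correction(x, t0, v):.3f} s")
```

Picking the v that best flattens real gathers is the essence of velocity analysis; too slow a velocity over-corrects the far offsets, too fast a velocity under-corrects them.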

Common offset gather, COFF: Used for basic quality control, because it approximates a structural section. Since all the traces are at the same offset, it is also sometimes used in AVO analysis; one can quickly inspect the approximate spatial extent of a candidate AVO anomaly. If the near offset trace is used for each shot, this is called a brute stack.

Variable azimuth gather: If the offset between source and receiver is held constant but the azimuth is varied, the gather can be used to study variations in traveltime anisotropy arising from elliptical stress fields or reservoir fracturing. The fast and slow traveltime directions can be mapped from the sinusoidal curve. It can also be used as a pre-stack data quality indicator.
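The sinusoidal mapping above amounts to a small least-squares problem. The cos(2θ) model for elliptical anisotropy is standard, but the function and the synthetic traveltimes below are invented for illustration:

```python
# Sketch: fitting t(az) = a0 + a1*cos(2az) + a2*sin(2az) to constant-
# offset traveltimes at varying azimuth, then reading off the fast and
# slow directions. Synthetic numbers are invented for illustration.
import math

def fit_azimuthal_anisotropy(azimuths_deg, times):
    """Return (mean_time, amplitude, fast_azimuth_deg) of the sinusoid.

    Assumes azimuths evenly sample 0-180 degrees, so the cos/sin basis
    is orthogonal and the Fourier sums below decouple exactly.
    """
    n = len(times)
    a0 = sum(times) / n
    a1 = 2 * sum(t * math.cos(2 * math.radians(a))
                 for a, t in zip(azimuths_deg, times)) / n
    a2 = 2 * sum(t * math.sin(2 * math.radians(a))
                 for a, t in zip(azimuths_deg, times)) / n
    amplitude = math.hypot(a1, a2)
    # The slow direction is where traveltime peaks; fast is 90 deg away.
    slow = 0.5 * math.degrees(math.atan2(a2, a1))
    fast = (slow + 90.0) % 180.0
    return a0, amplitude, fast

# Synthetic gather: slow direction at 120 deg, 20 ms of anisotropy.
azimuths = [0, 30, 60, 90, 120, 150]
times = [1.0 + 0.02 * math.cos(2 * math.radians(a - 120)) for a in azimuths]
mean_t, amp, fast = fit_azimuthal_anisotropy(azimuths, times)
# Recovers the 0.02 s amplitude and a fast azimuth of 30 degrees.
```

The 2θ periodicity is why the curve repeats every 180° of azimuth: a fracture set looks the same whether you shoot across it from the north or the south.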

Check out the wiki page for more information. Are there any gather types or applications that we have missed?

Find other A to Z posts

AVO* is free!

The two-bit experiment is over! We tried charging $2 for one of our apps, AVO*, as a sort of techno-socio-geological experiment, and the results are in: our apps want to be free. Here are our download figures, as of this morning: 

You also need to know when these apps came out. I threw some of the key statistics into SubSurfWiki and here's how they stack up when you account for how long they've been available:

It is clear that AVO* has performed quite poorly compared to its peers! The retention rate (installs/downloads) is 100%; perhaps the price tag buys loyalty, or even a higher perceived value. But the hit in adoption is too much to take.

There are other factors: quality, relevance, usefulness, ease-of-use. It's hard to be objective, but I think AVO* is our highest quality app. It certainly has the most functionality, hence this experiment. It is rather niche: many geological interpreters may have no use for it. But it is certainly no more niche than Elastic*, and has about four times the functionality. On the downside, it needs an internet connection for most of its juicy bits.

In all, I think that we might have expected 200 installs for the app by now, from about 400–500 downloads. I conclude that charging $2 has slowed down its adoption by a factor of ten, and hereby declare it free for everyone. It deserves to be free! If you were one of the awesome early adopters that paid a toonie for it, I have only this to say to you: we love you.
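The comparison rests on two simple ratios; here they are as a sketch, with placeholder numbers rather than our real figures:

```python
# Sketch of the two metrics behind the comparison: retention
# (installs per download) and adoption rate (downloads per day
# on the market). The numbers are placeholders, not real figures.
def app_metrics(downloads, installs, days_available):
    retention = installs / downloads     # loyalty of downloaders
    rate = downloads / days_available    # adoption, corrected for age
    return retention, rate

retention, rate = app_metrics(downloads=50, installs=50, days_available=100)
print(f"retention {retention:.0%}, {rate:.1f} downloads/day")
```

Dividing by days available is what makes apps released at different times comparable at all; raw download counts alone would flatter the older apps.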

So, if you have an Android device, scan the code or otherwise hurry to the Android Market!

News of the week

Dips from pics

In collaboration with the Geological Survey of Canada, Pangaea Software have built a very nifty tool, Orion, for computing dip from satellite images and digital elevation models. With these two pieces of data, and some assumptions about scale, it's possible to deduce the dip of strata without getting your boots muddy. Matt heard all about this tool from the GSC collaborator, Paul Budkewitsch, at the 3P Arctic conference in Halifax last week; here's their abstract.

Ocean bottom investment

CGGVeritas has made a commitment to manufacture 800 new Trilobit four-component deepwater nodes for seismic acquisition, to add to its existing pool. The device has three oriented accelerometers plus a hydrophone in addition to an onboard battery and recording system. This all-in-one design can be deployed on the seabed by most ROVs, making it easy to place near platforms and other infrastructure that towed streamer and cable systems cannot access. 

Arguably the industry leader in cableless systems is FairfieldNodal, who are already deploying more than a thousand nodes. It's great to see a big player like CGGVeritas coming to compete with this potentially transformative technology.

Update for Insight Earth

Colorado-based software company TerraSpark has just announced the release of Insight Earth 1.6, an integrated volume interpretation tool. Enhancements include a more interactive data import and export interface, improved velocity modeling, and upgrades to the automated fault extraction. In a January post, Evan highlighted an article by Stan Hammon of TerraSpark on the computational and psychological factors affecting intelligent design. It's inspired stuff.

Re-introducing SubSurfWiki

AgileWiki is now SubSurfWiki, at subsurfwiki.org. Please change your bookmarks! We felt the old name was a little too Agile-centric, and we want the wiki to be an open web space for anything subsurface. We want it to grow, deepen and diversify, and above all be useful. So check it out and let us know if you have any feedback on utility, appearance and content.

More news... If you like this, check out previous news posts from Agile*

Orion is a trademark of Pangaea Software. Insight Earth is a trademark of TerraSpark. SubSurfWiki is a trademark of Agile Geoscience. The satellite image is copyright of Google. This regular news feature is for information only. We aren't connected with any of these organizations, and don't necessarily endorse their products or services.

What we did over the summer holidays

The half-life of a link is hilariously brief, so here is an attempt to bring some new life back into the depleted viewership of our summer-time blogging. Keep in mind that you can search for any of the articles on our blog using the search tool, shown here, or sign up for email updates lower down on the side bar, for hands-free, automated Agile goodness every time we post something new.  

Well worth showing off, 4 July: This post was a demonstration of the presentation tool Prezi applied to pseudo-digital geoscience data. Geoscience is inherently visual and scale-dependent, so we strive to work and communicate in a helicoptery way. I used Prezi to navigate a poster presentation on sharing geo-knowledge beyond the experts.

Geophysical stamps—Geophone, 15 July: Instalment 3 of Matt's vintage German postage stamps was a tribute to the geophone. This post prompted a few readers to interject with suggestions and technical corrections. We strive for an interactive, dynamic and malleable blog, and their comments certainly improved the post. It was a reminder to be ready to react when you realize someone is actually reading your stuff. 

Petrophysics cheatsheet, 25 July and its companion post: Born out of a desire to make a general quick reference for well logs, we published the Petrophysics cheatsheet, the fourth in our series of cheatsheets. In the companion post, you can read why petrophysics is hard. It sits in a middle ground between drilling operations, geoscience, and reservoir engineering, and ironically petrophysical measurements seldom measure the properties we are actually interested in. Wireline data is further complicated by the multitude of service providers, tool options, and data formats, as well as historical and exhaustive naming conventions.

How to cheat at spot the difference, 3 Aug: Edward Tufte says, "to clarify, add detail". Get all your data into one view to assist your audience in making a comparison. In this two-part post Matt demonstrated the power of visual crossplotting using two examples: a satellite photo of a pyroclastic flow, and a subsurface horizon with seismic attributes overlain. Directly mapping spatially varying properties is better than data abstractions (graphs, tables, numbers, and so on). Richer images convey more information, and he showed us how to cheat at spot the difference using simple image processing techniques.
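One way to "cheat", in the spirit of that post, is a pixel-wise difference mask. The nested lists below stand in for greyscale images, and the function and threshold are invented for the sketch; real photos would use NumPy arrays, but the idea is identical.

```python
# Sketch: "cheating" at spot the difference with a pixel-wise
# difference mask on two same-size greyscale images.
def highlight_differences(img_a, img_b, threshold=10):
    """True wherever the images differ by more than `threshold`,
    which soaks up noise and compression artefacts."""
    return [[abs(pa - pb) > threshold for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

a = [[100, 100], [100, 100]]
b = [[100, 100], [100, 180]]
mask = highlight_differences(a, b)
# Only the changed pixel is flagged: [[False, False], [False, True]]
```

Overlaying that mask on either image puts the comparison into one view, which is Tufte's point: the eye no longer has to shuttle back and forth between two pictures.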

Digital rocks and accountability, 10 Aug: At the First International Workshop in Rock Physics, I blogged about two exciting talks on the first day of the conference on the promise of digital rock physics and how applied scientists should strive to be better in their work. Atul Gawande's ternary space of complexity could serve as a tool for mapping out geoscience investigations. Try it out on your next problem and ask your teammates to expose the problem as they see it.

Wherefore art thou, Expert?, 24 Aug: Stemming from a LinkedIn debate on the role of service companies in educating and empowering their customers, Matt reflected on the role of the bewildered generalist in today's upstream industry. Information systems have changed, perfection is a myth, and domain expertise runs too deep. Generalists can stop worrying about not knowing enough, specialists can build shallower and more accessible tools, and service companies can serve instead of sell.

Pseudogeophysics, 31 Aug: Delusion, skepticism, and how to crack a nut. This post drew comments about copyright control and the cost of lost opportunity; make sure to read the comments section of this post.

So yeah, now go catch up on your reading. 

Bad Best Practice

Applied scientists get excited about Best Practice. New professionals and new hires often ask where 'the manual' is, and senior technical management or chiefs often want to see such documentation being spread and used by their staff. The problem is that the scientists in the middle strata of skill and influence think Best Practice is a difficult, perhaps even ludicrous, concept in applied geoscience. It's too interpretive, too creative.

But promoting good ideas and methods is important for continuous improvement. At the 3P Arctic Conference in Halifax last week, I saw an interesting talk about good seismic acquisition practice in the Arctic of Canada. The presenter was Michael Enachescu of MGM Energy, well known in the industry for his intuitive and integrated approach to petroleum geoscience. He outlined some problems with the term best practice, advocating instead phrases like good practice:

  • There's a strong connotation that it is definitively superlative
  • The corollary to this is that other practices are worse
  • Its existence suggests that there is an infallible authority on the subject (an expert)
  • Therefore the concept stifles innovation and even small steps towards improvement

All this is reinforced by the way Best Practice is usually written and distributed:

  • Out of frustration, a chief commissions a document
  • One or two people build a tour de force, taking 6 months to do it
  • The read-only document is published on the corporate intranet alongside other such documents
  • Its existence is announced and its digestion mandated

Unfortunately, the next part of the story is where things go wrong:

  • Professionals look at the document and find that it doesn't quite apply to their situation
  • Even if it does apply, they are slightly affronted at being told how to do their job
  • People know about it but lack the technology or motivation to change how they were already working
  • Within 3 years there is enough new business, new staff, and new technology that the document is forgotten about and obsolete, until a high-up commissions a document...

So the next time you think to yourself, "We need a Best Practice for this", think about trying something different:

  • Forget top-down publishing, and instead seed editable, link-rich documents like wiki pages
  • Encourage discussion and ownership by the technical community, not by management
  • Request case studies, which emphasize practical adaptability, not theory and methodology
  • Focus first on the anti-pattern: common practice that is downright wrong

How do you spread good ideas and methods in your organization? Does it work? How would you improve it?