Your child is dense for her age

Alan Cohen, veteran geophysicist and Chief Scientist at RSI, secured the role of provocateur by posting this question on the rock physics group on LinkedIn. He has shown that even the simplest concepts are worthy of debate.

From a group of 1973 members, 44 comments ensued over the 23 days since he posted it. This has got to be a record for this community (trust me, I've checked). It turns out the community is polarized, and heated emotions surround the topic. The responses that emerged are a fascinating narrative of niche and tacit assumptions seldom articulated.

Any two will do

Why are two dimensions used, instead of one, three, four, or more? Well, for one, it is hard to look at scatter plots in 3D. More fundamentally, a key lesson from the wave equation and continuum mechanics is that, given any two elastic properties, any other two can be computed. In other words, for any seismically elastic material, there are two degrees of freedom: two parameters to describe it.

  • P- and S-wave velocities
  • P-impedance and S-impedance
  • Acoustic and elastic impedance
  • R0 and G, the normal-incidence reflectivity and the AVO gradient
  • Lamé's parameters, λ and μ 

Each pair has its time and place, and as far as I can tell there are reasons that you might want to re-parameterize like this:

  1. one set of parameters contains discriminating evidence, not visible in other sets;
  2. one set of parameters is a more intuitive or more physical description of the rock—it is easier to understand;
  3. measurement errors and uncertainties can be elucidated better for one of the choices. 

Something missing from this thread, though, is the utility of empirical templates to make sense of the data, whichever domain is adopted.

Measurements with a backdrop

In child development, body mass index (BMI) is plotted versus age to characterize a child's physical properties against the backdrop of an empirically derived template sampled from a large population. It is not so interesting to say, "13-year-old Miranda has a BMI of 27"; it is much more telling to learn that Miranda is above the 95th percentile for her age. But BMI, which is defined as weight divided by height squared, is not particularly intuitive. If kids were rocks, we'd submerge them Archimedes-style into a bathtub, measure their volume, and determine their density. That would be the ultimate description. "Whoa, your child is dense for her age!"

We do the same thing with rocks. We algebraically manipulate measured variables in various ways to show trends, correlations, or clustering. So this notion of a template is very important, albeit local in scope. Just as a BMI template for Icelandic children might not be relevant for children in Papua New Guinea, rock physics templates are seldom transferable outside their respective geographic regions.

For reference see the rock physics cheatsheet.

Thermogeophysics, whuh?

Earlier this month I spent an enlightening week in Colorado at a peer review meeting hosted by the US Department of Energy. The meeting was well attended, with about 300 people from organizations like Lawrence Livermore Labs, Berkeley, Stanford, Sandia National Labs, and *ahem* Agile. Delegates heard about a wide range of cost-shared projects in the Geothermal Technologies Program. Approximately 170 projects were presented, representing a total US Department of Energy investment of $340 million.

I was at the meeting because we've been working on some geothermal projects in California's Imperial Valley since last October. It's fascinating, energizing work. Challenging too, as 3D seismic is not a routine technology for geothermal, but it is emerging. What is clear is that geothermal exploration requires a range of technologies and knowledge. It pulls from all the tools you could dream up: active seismic, passive seismic, magnetotellurics, resistivity, LiDAR, hyperspectral imaging, not to mention borehole and drilling technologies. The industry has an incredible learning curve ahead of it if Enhanced Geothermal Systems (EGS) are going to be viable and scalable.

The highlights of the event for me were not the talks that I saw, but the people I met during coffee breaks:

John McLennan & Joseph Moore at the University of Utah have done some amazing laboratory experiments on large blocks of granite. They constructed a "proppant sandwich", pumped fluid through it, and applied polyaxial stress to study geochemical and stress effects on fracture development and permeability pathways. Hydrothermal fluids altered the proppant and gave rise to wormhole-like collapse structures, similar to those in the CHOPS process. They combined diagnostic imaging (CT scans, acoustic emission tomography, X-rays) with sophisticated numerical simulations. A sign that geothermal practitioners are working to keep science up to date with engineering.

Stephen Richards bumped into me in the corridor after lunch, having overheard me talking about the geospatial work I did with the Nova Scotia Petroleum database. Not five minutes passed before he rolled up his sleeves, took over my laptop, and was hacking away. He connected the WMS extension that he built as part of the State Geothermal Data project to QGIS on my machine, and showed me some of the common file formats and data interchange content models for curating geothermal data on a continental scale. The hard part isn't necessarily the implementation; the hard part is curating the data. And it was a thrill to see it thrown together, in minutes, on my machine. A sign that there is a huge amount of work to be done around opening data.

Dan Getman, Geospatial Section lead at NREL, gave a live demo of the fresh prospector interface he built, which is accessible through OpenEI. I mentioned OpenEI briefly in the poster presentation I gave in Golden last year, and I can't believe how much it has improved since then. Dan once again confirmed the notion that the implementation wasn't rocket science (surely any geophysicist could figure it out), and in doing so renewed my motivation for extending the local petroleum database in my backyard. A sign that geospatial methods are at the core of exploration and discovery.

There was an undercurrent of openness surrounding this event. By and large, the US DOE is paying for half of the research, so full disclosure is practically one of the terms of service. Not surprisingly, it feels more like science going on here, where innovation is being subsidized and intentionally accelerated because there is a demand. It makes me think that activity is a necessary but not sufficient metric for innovation.

Geophysicists are awesome

Thirty-nine amazing, generous, inspiring authors contributed to our soon-to-be-released book about exploration geoscience. A few gave us more than one chapter: Brian Russell, Rachel Newrick, and Dave Mackidd each gave us three, and Clare Bond, José M Carcione, Don Herron, and Rob Simm did two. We humbly thank them for their boundless energy — we're happy to have provided an outlet! Evan and I each did three chapters too, partly because I was obsessed with getting to the completely arbitrary number 52. It just seemed 'right'.

There are biographies on all the authors in the book, so you can find out for yourself what a diversity of backgrounds there is. By the numbers, out of 39 authors...

  • 10 are connected in some way to academia (5 of them full-time)
  • 19 are North American
  • 22 currently work in North America
  • 1515 papers and 14 books have been written by this crowd (not including this one :)

Update on the book: we got our proof copies on Friday and spent the weekend combing them for errors. There was nothing catastrophic, so the bugs were fixed and the book is ready! We are completely new to this self-publishing lark, so I'm not certain how the next bit goes, but I think it will be live on Amazon this Friday, a whole week early. Probably.

What happens next is also not completely clear yet. We are working on the Kindle edition, which should be out soon. In terms of layout, digital books are less complicated than print books, so we are mostly removing things that don't work or don't make sense in ebooks: page numbers, fancy formatting, forced line-breaks, etc. Once the Kindle edition is out, we will have a go at other platforms (iBooks, Google Books). Then we will turn to the web and start getting the material online, where it will no doubt be different again.

We'll keep you, dear reader, up to date right here. 

What's inside? 52 things!

On Tuesday we announced our forthcoming community collaboration book. So what's in there? So much magic, it's hard to know where to start. Here's a list of the first dozen chapters:

  1. Anisotropy is not going away, Vladimir Grechka, Shell
  2. Beware the interpretation-to-data trap, Evan Bianco, Agile
  3. Calibrate your intuition, Taras Gerya, ETH Zürich
  4. Don’t ignore seismic attenuation, Carl Reine, Nexen
  5. Don’t neglect your math, Brian Russell, Hampson-Russell
  6. Don’t rely on preconceived notions, Eric Andersen, Talisman
  7. Evolutionary understanding in seismic interpretation, Clare Bond, University of Aberdeen
  8. Explore the azimuths, David Gray, Nexen
  9. Five things I wish I’d known, Matt Hall, Agile
  10. Geology comes first, Chris Jackson, Imperial College London
  11. Geophysics is all around, José M Carcione, OGS Trieste, Italy
  12. How to assess a colourmap, Matteo Niccoli, MyCarta blog
  13. ...

When I read that list, I cannot wait to read the book — and I've read it three times already! This is not even one quarter of the book. You can guess from the list that some are technical, others are personal, a few may be controversial.

One thing we had fun with was organizing the contents. The chapters are, as you see, in alphabetical order. But each piece has thematic tags. Some were a little hard to classify, I admit, and some people will no doubt wonder why, say, Bill Goodway's The magic of Lamé is labeled 'basics', but there you go.

One thing I did was try to group the chapters according to their tags. Each chapter has three tags. If we connect the three tags belonging to an individual chapter, and do the same for every chapter, then we can count the connections and draw a graph (right). I made this one in Gephi.

The layout is automatic: relative positions are calculated by modeling the connections as springs whose stiffness depends on the number of links. Node size is a function of connectedness. Isn't it great that geology is in the middle?
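The counting step behind the graph is simple. Here is a minimal Python sketch of the idea; the chapter titles are from the book, but the tag assignments below are my illustrative guesses, not the book's actual tags:

```python
from collections import Counter
from itertools import combinations

# Hypothetical tag assignments: three tags per chapter, as in the book.
chapters = {
    "The magic of Lame": ["basics", "rock physics", "mathematics"],
    "Geology comes first": ["geology", "interpretation", "basics"],
    "Explore the azimuths": ["pre-stack", "attributes", "geology"],
}

# Count each pairwise link between tags that share a chapter.
edges = Counter()
for tags in chapters.values():
    for pair in combinations(sorted(tags), 2):
        edges[pair] += 1

# The weighted edge list is exactly what a tool like Gephi wants as input.
for (a, b), weight in edges.items():
    print(a, "--", b, weight)
```

Tags that co-occur in many chapters accumulate heavy edges, which is what pulls them together in the spring layout.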

Now, without worrying too much about the details, I used the graph to help group the chapters non-exclusively into the following themes:

  • Fundamentals: basics, mapping (16 chapters)
  • Concepts: geology, analogs (12 chapters)
  • Interpretation: needed a theme of its own (21 chapters)
  • Power tools: attributes, ninja skills (9 chapters)
  • Pre-stack: rock physics, pre-stack, processing (11 chapters)
  • Quantitative: mathematics, analysis (20 chapters)
  • Integration: teamwork, workflow (15 chapters)
  • Innovation: history, innovation, technology (9 chapters)
  • Skills: learning, career, managing (15 chapters)

I think this accurately reflects the variety in the book. Next post we'll have a look at the variety among the authors — perhaps it explains the breadth of themes. 

Today's the day!

We're super-excited. We said a week ago we'd tell you why today. 

At the CSEG-CSPG conference last year, we hatched a plan. The idea was simple: ask as many amazing geoscientists as we could to write something fun and/or interesting and/or awesome and/or important about geophysics. Collect the writings. Put them in a book and/or ebook and/or audiobook... and sell it at a low price. And also let the content out into the wild under a Creative Commons license, so that others can share it, copy it, and spread it.

So the idea was conceived as Things You Should Know About Geophysics. And today the book is born... almost. It will be available on 1 June, but you can see it right now at Amazon.com, pre-order or wish-list it. It will be USD19, or about 36 cents per chapter. For realz.

The brief was deliberately vague: write up to 600 words on something that excites or inspires or puzzles you about exploration geophysics. We had no idea what to expect. We knew we'd get some gold. We hoped for some rants.

Incredibly, within 24 hours of sending the first batch of invites, we had a contribution. We were thrilled, beyond thrilled, and this was the moment we knew it would work out. Like any collaborative project, it was up and down. We'd get two or three some days, then nothing for a fortnight. We extended deadlines and crossed fingers, and eventually called 'time' at year's end, with 52 contributions from 38 authors.

Like most of what we do, this is a big experiment. We think we can have it ready for 1 June but we're new to this print-on-demand lark. We think the book will be in every Amazon store (.ca, .co, .uk, etc), but it might take a few weeks to roll through all the sites. We think it'll be out as an ebook around the same time. Got an idea? Tell us how we can make this book more relevant to you!

News of the month

Welcome to our more-or-less regular news post. Seen something awesome? Get in touch!

Convention time!

Next week is Canada's annual petroleum geoscience party, the CSPG CSEG CWLS GeoConvention. Thousands of applied geoscientists will descend on Calgary's downtown Telus Convention Centre to hear about the latest science and technology in the oilfield, and catch up with old friends. We're sad to be missing out this year — we hope someone out there will be blogging!

GeoConvention highlights

There are more than fifty technical sessions at the conference this year. For what it's worth, these are the presentations we'd be sitting in the front row for if we were going:

Now run to the train and get to the ERCB Core Research Centre for...

Guided fault interpretation

We've seen automated fault interpretation before, and now Transform have an offering too. A strongly tech-focused company, they have a decent shot at making it work in ordinary seismic data — the demo shows a textbook example:

GPU processing on the desktop

On Monday Paradigm announced their adoption of NVIDIA's Maximus technology in their desktop applications. Getting all gooey over graphics cards seems very 2002, but this time it's not about graphics — it's about speed. Reserving a Quadro processor for graphics, Paradigm computes seismic attributes on a dedicated Tesla graphics processing unit, or GPU, rather than on the central processing unit (CPU). This is cool because GPUs are massively parallel and much, much faster at certain kinds of computation; they don't have the process management, I/O, and other overheads that CPUs carry. This is why seismic processing companies like CGGVeritas are adopting them for imaging. Cutting-edge stuff!

In other news...

This regular news feature is for information only. We aren't connected with any of these organizations, and don't necessarily endorse their products or services. 

One week countdown

We're super-excited, dear reader. Even more than usual.

At the Calgary GeoConvention last year, we hatched a plan. The idea was simple: ask as many amazing geophysicists as we could to help us create something unique and fun. Now, as the conference creeps up on us again, it's almost ready. A new product from Agile that we think will make you smile.

Normally we like to talk about what we're up to, but this project has been a little different. We weren't at all sure it was going to work out until about Christmas time. And it had a lot of moving parts, so the timeline has been, er, flexible. But the project fits nicely into our unbusiness model: it has no apparent purpose other than being interesting and fun. Perfect!

In an attempt to make it look like we have a marketing department, or perhaps to confirm that we definitely do not, let's count down to next Tuesday morning, in milliseconds of course. Come back then — we hope to knock your socks at least partly off...

K is for Wavenumber

Wavenumber, sometimes called the propagation number, is in broad terms a measure of spatial scale. It can be thought of as a spatial analog to the temporal frequency, and is often called spatial frequency. It is often defined as the number of wavelengths per unit distance, or in terms of wavelength, λ:

$$k = \frac{1}{\lambda}$$

The units are \(\mathrm{m}^{–1}\), which are nameless in the International System, though \(\mathrm{cm}^{–1}\) are called kaysers in the cgs system. The concept is analogous to frequency \(f\), measured in \(\mathrm{s}^{–1}\) or Hertz, which is the reciprocal of period \(T\); that is, \(f = 1/T\). In a sense, period can be thought of as a temporal 'wavelength' — the length of an oscillation in time.

If you've explored the applications of frequency in geophysics, you'll have noticed that we sometimes don't use ordinary frequency f, in Hertz. Because geophysics deals with oscillating waveforms, ones that vary around a central value (think of a wiggle trace of seismic data), we often use the angular frequency. This way we can also express the close relationship between frequency and phase, which is an angle. So in many geophysical applications, we want the angular wavenumber. It is expressed in radians per metre:

$$k = \frac{2\pi}{\lambda}$$

The relationship between angular wavenumber and angular frequency is analogous to that between wavelength and ordinary frequency — they are related by the velocity V:

$$k = \frac{\omega}{V}$$

It's unfortunate that there are two definitions of wavenumber. Some people reserve the term spatial frequency for the ordinary wavenumber, or use ν (that's a Greek nu, not a vee — another potential source of confusion!), or even σ for it. But just as many call it the wavenumber and use k, so the only sure way through the jargon is to specify what you mean by the terms you use. As usual!
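To make the two conventions concrete, here is a minimal Python sketch of the relationships above; the wave parameters are made up for illustration:

```python
import math

def wavenumber(wavelength):
    """Ordinary wavenumber: wavelengths (cycles) per metre."""
    return 1.0 / wavelength

def angular_wavenumber(wavelength):
    """Angular wavenumber: radians per metre."""
    return 2 * math.pi / wavelength

# A 30 Hz wave travelling at 3000 m/s has a 100 m wavelength.
f, v = 30.0, 3000.0
lam = v / f                       # wavelength: 100 m
k = wavenumber(lam)               # ordinary wavenumber: 0.01 m^-1
omega = 2 * math.pi * f           # angular frequency, rad/s

# Check the relationship k = omega / V for the angular convention.
assert abs(angular_wavenumber(lam) - omega / v) < 1e-12
```

The factor of 2π is the whole difference between the two definitions, which is why stating your convention matters.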

Just as for temporal frequency, the portal to wavenumber is the Fourier transform, computed along each spatial axis. Here are two images and their 2D spectra — a photo of some ripples, a binary image of some particles, and their fast Fourier transforms. Notice how the more organized image has a more organized spectrum (as well as some artifacts from post-processing on the image), while the noisy image's spectrum is nearly 'white':
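The same trip through the Fourier transform is easy to try in one dimension. A minimal Python sketch using NumPy, with a synthetic signal standing in for the ripples:

```python
import numpy as np

# A hypothetical 1D spatial signal: a ripple with a 32 m wavelength,
# sampled every 1 m over 512 m.
dx = 1.0                                # spatial sample interval, m
x = np.arange(512) * dx
signal = np.cos(2 * np.pi * x / 32.0)

# The FFT along the spatial axis takes us to the wavenumber domain;
# rfftfreq returns ordinary wavenumber, in cycles per metre.
spectrum = np.abs(np.fft.rfft(signal))
k = np.fft.rfftfreq(signal.size, d=dx)

# The spectral peak sits at k = 1/32 = 0.03125 m^-1.
peak_k = k[np.argmax(spectrum)]
```

For an image, the same transform is applied along both spatial axes, giving the 2D spectra shown here.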

Explore our other posts about scale.

The particle image is from the sample images in FIJI. The FFTs were produced in FIJI.

Update

on 2012-05-03 16:41 by Matt Hall

Following up on Brian's suggestion in the comments, I added a brief workflow to the SubSurfWiki page on wavenumber. Please feel free to add to it or correct it if I messed anything up.