Poisson's controversial stretch-squeeze ratio

Before reading this, you might want to check out the previous post about Siméon Denis Poisson's life and career. Then come back here...


Physicists and mathematicians knew about Poisson's ratio well before Poisson got involved with it. Thomas Young described it in his 1807 Lectures on Natural Philosophy and the Mechanical Arts:

We may easily observe that if we compress a piece of elastic gum in any direction, it extends itself in other directions: if we extend it in length, its breadth and thickness are diminished.

Young didn't venture a rigorous formal definition, and the quantity was referred to simply as the 'stretch-squeeze ratio'.

A new elastic constant?

Twenty years later, at a time when France's scientific muscle was fading along with the reign of Napoleon, Poisson published a paper attempting to restore his slightly bruised (by his standards) reputation in the mechanics of physical materials. In it, he stated that for a solid composed of molecules tightly held together by central forces on a crystalline lattice, the stretch-squeeze ratio should equal 1/2 (equivalent to what we now call a Poisson's ratio of 1/4). In other words, Poisson regarded the stretch-squeeze ratio as a physical constant: the same value for all solids. He claimed that 'this result agrees perfectly' with an experiment that one of his colleagues, Charles Cagniard de la Tour, had recently performed on brass.

Poisson's whole-hearted subscription to the corpuscular school certainly prejudiced his work. But the notion of discovering a new physical constant — as Newton had done for gravity, and as Einstein would later do for light — must have been a powerful driving force. A would-be universal elastic constant could unify calculations for materials soft or stiff, in contrast to the elastic moduli, which vary over several orders of magnitude.

Poisson's (silly) ratio

Later, between 1850 and 1870, the physics community acquired more evidence that the stretch-squeeze ratio differs from material to material, as more materials were tested with more reliable measurements. Worse still, de la Tour's experiments on the elasticity of brass, upon which Poisson had hung his hat, turned out to be flawed. The stretch-squeeze ratio became known as Poisson's ratio not as a tribute to Poisson, but as a way of labeling a flawed theory. Indeed, the falsehood became so apparent that it drove the scientific community towards treating elastic materials as continuous media, as opposed to ensembles of particles.

Today we define Poisson's ratio in terms of strain (deformation), or Lamé's parameters, or the velocities \(V_\mathrm{P}\) and \(V_\mathrm{S}\) of P- and S-waves:

$$ \nu = -\frac{\varepsilon_\mathrm{trans}}{\varepsilon_\mathrm{axial}} = \frac{\lambda}{2(\lambda + \mu)} = \frac{V_\mathrm{P}^2 - 2V_\mathrm{S}^2}{2\left(V_\mathrm{P}^2 - V_\mathrm{S}^2\right)} $$

Interestingly, if Poisson had turned out to be correct, and Poisson's ratio were in fact a constant, then only one elastic constant — instead of two — would be needed to describe an isotropic material. It wasn't until Augustin Louis Cauchy used the notion of a stress tensor to describe the state of stress at a point within a material, with its three normal stresses and three shear stresses, that the need for two elastic constants became apparent. Tensors provided the mathematical framework needed to define Hooke's law in three dimensions. Continuum mechanics, now found in the opening chapter of any modern textbook on seismology or mechanical engineering, is a rare example of a scientific advance set in motion by the undoing of a famously false deduction backed by insufficient data.
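You can sanity-check that 'one constant' claim symbolically. This is my own sketch using SymPy (a library that appears in another post below), applying the Lamé-parameter definition of \(\nu\) above and setting it to Poisson's universal 1/4:

>>> import sympy
>>> lamda, mu = sympy.symbols("lamda, mu", positive=True)
>>> nu = lamda / (2 * (lamda + mu))    # Poisson's ratio from the Lamé parameters
>>> sympy.solve(sympy.Eq(nu, sympy.Rational(1, 4)), lamda)
[mu]

If \(\nu\) really were 1/4 for every solid, then \(\lambda = \mu\) everywhere, and a single number would suffice to describe the elasticity of any isotropic material.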

References

Greaves, G N (2013). Poisson's ratio over two centuries: challenging hypotheses. Notes & Records of the Royal Society 67, 37–58. DOI: 10.1098/rsnr.2012.0021

Editorial (2011). Poisson's ratio at 200. Nature Materials 10 (11). Available online.

 

Great geophysicists #13: Poisson

Siméon Denis Poisson was born in Pithiviers, France, on 21 June 1781. While still a teenager, Poisson entered the prestigious École Polytechnique in Paris, and published his first papers in 1800. He was immediately befriended — or adopted, really — by Lagrange and Laplace. So it's safe to say that he got off to a pretty good start as a mathematician. The meteoric trajectory continued throughout his career, as Poisson received more or less every honour a French scientist could accumulate. Along with Laplace and Lagrange — as well as Fresnel, Coulomb, Lamé, and Fourier — his is one of the 72 names on the Eiffel Tower.

Wrong Poisson

In the first few decades following the French Revolution, which ended in 1799, France enjoyed a golden age of science. The Société d'Arcueil was a regular meeting of savants, hosted by Laplace and the chemist Claude Louis Berthollet, and dedicated to the exposition of physical phenomena. The group worked on problems like the behaviour of gases, the physics of sound and light, and the mechanics of deformable materials. Using Newton's then 120-year-old law of gravitation as an analogy, the prevailing school of thought accounted for all physical phenomena in terms of forces acting between particles.

Poisson was not flawless. As one of the members of this intellectual inner circle, Poisson was devoted to the corpuscular theory of light. Indeed, he dismissed the wave theory of light completely, until proven wrong by Thomas Young and, most conspicuously, Augustin-Jean Fresnel. Even Poisson's ratio, the eponymous elastic constant, wasn't the result of a dogged search for truth; instead, it represents a controversy that drove the development of the three-dimensional theory of elasticity. More on this next time.

The workaholic

He did make time for his wife and four children — but only after 6 pm. Otherwise, Poisson apparently had little time for much besides mathematics. His catchphrase was

Life is only good for two things: doing mathematics and teaching it.

In the summer of 1838, he learned he had a form of tuberculosis. According to James (2002), he was unable to take time away from work for long enough to recuperate. Eventually, insisting on conducting the final exams at the Polytechnique for the 23rd year in a row, he took on more than he could handle. He died on 20 April 1840. 


References

Grattan-Guinness, I (1990). Convolutions in French Mathematics, 1800–1840: From the Calculus and Mechanics to Mathematical Analysis and Mathematical Physics. Vol. 1: The Setting. Springer Science & Business Media. 549 pages.

James, I (2002). Remarkable Mathematicians: From Euler to Von Neumann. Cambridge University Press, 433 pages.

The University of St Andrews MacTutor archive article on Poisson.

Deriving equations in Python

Last week I wrote about the elastic moduli, and showed the latest version of my table of equations. Here it is; click on it for a large version:

Making this grid was a bit of an exercise in itself. One could spend some happy hours rearranging things by hand; instead, I spent some (mostly) happy hours learning to use SymPy, a symbolic maths library for Python. For what it's worth, you can see my flailing in this Jupyter Notebook. Warning: it's pretty untidy.

Wrangling equations

Fortunately, SymPy is easy to get started with. Let's look at getting an expression for \(V_\mathrm{P}\) in terms of \(E\) and \(K\), given that I already have an expression in terms of \(E\) and \(\mu\), plus an expression for \(\mu\) in terms of \(E\) and \(K\).

First we import the SymPy library, set it up for nice math display in the Notebook, and initialize some parameter names:

 
>>> import sympy
>>> sympy.init_printing(use_latex='mathjax')
>>> lamda, mu, nu, E, K, rho = sympy.symbols("lamda, mu, nu, E, K, rho")

lamda is not a typo: lambda is a reserved word in Python — it creates a sort of unnamed function — so SymPy spells its symbol without the b.

Now we're ready to define an expression. First, I'll import SymPy's own square root function for convenience. Then I define an expression for \(V_\mathrm{P}\) in terms of \(E\) and \(\mu\):

 
>>> vp_expr = sympy.sqrt((mu * (E - 4*mu)) / (rho * (E - 3*mu)))
>>> vp_expr

$$ \sqrt{\frac{\mu \left(E - 4 \mu\right)}{\rho \left(E - 3 \mu\right)}} $$

Now we can give SymPy the expression for \(\mu\) in terms of \(E\) and \(K\) and substitute:

 
>>> mu_expr = (3 * K * E) / (9 * K - E)
>>> vp_new = vp_expr.subs(mu, mu_expr)
>>> vp_new

$$\sqrt{3} \sqrt{\frac{E K \left(- \frac{12 E K}{- E + 9 K} + E\right)}{\rho \left(- E + 9 K\right) \left(- \frac{9 E K}{- E + 9 K} + E\right)}}$$

Argh, what is that?? Luckily, it's easy to simplify:

 
>>> sympy.simplify(vp_new)

$$\sqrt{3} \sqrt{\frac{K \left(E + 3 K\right)}{\rho \left(- E + 9 K\right)}}$$

That's more like it! What's really cool is that SymPy can even generate the \(\LaTeX\) code for your favourite math renderer:

 
>>> print(sympy.latex(sympy.simplify(vp_new)))
\sqrt{3} \sqrt{\frac{K \left(E + 3 K\right)}{\rho \left(- E + 9 K\right)}}

That's all there is to it!

What is the mystery X?

Have a look at the expression for  \(V_\mathrm{P}\) in terms of \(E\) and \(\lambda\):

 

$$\frac{\sqrt{2}}{2} \sqrt{\frac{1}{\rho} \left(E - \lambda + \sqrt{E^{2} + 2 E \lambda + 9 \lambda^{2}}\right)}$$

I find this quantity — I call it \(X\) in the big table of equations — really curious:

 

$$ X = \sqrt{9\lambda^2 + 2E\lambda + E^2} $$

As you can see from the similar table on Wikipedia, a comparable quantity appears in expressions in terms of \(E\) and \(M\). These quantities look like elastic moduli, and even have the same units and order of magnitude as the others. If anyone has thoughts on what significance they might have, if any, or on why expressions in terms of \(E\) and \(\lambda\) or \(M\) should be so uncommonly clunky, I'm all ears.
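For what it's worth, SymPy hints at where \(X\) comes from. This is just my own poking around, continuing the session from above: ask SymPy for \(\mu\) in terms of \(E\) and \(\lambda\), starting from the standard expression for Young's modulus in terms of the Lamé parameters:

>>> E_expr = mu * (3*lamda + 2*mu) / (lamda + mu)   # Young's modulus via Lamé's parameters
>>> sympy.solve(sympy.Eq(E, E_expr), mu)

The two roots are \((E - 3\lambda \pm X)/4\). In other words, \(X\) is the square root of the discriminant of the quadratic you have to solve to get \(\mu\) from \(E\) and \(\lambda\) — which may be why it haunts every expression derived by that route.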

One last thing... I've mentioned Melvyn Bragg's wonderful BBC radio programme In Our Time before. If you like listening to the radio, try this recent episode on the life and work of Robert Hooke. Not only did he invent the study of elasticity with his eponymous law, he was also big in microscopy, describing things like the cellular structure of cork in detail.

All the elastic moduli

An elastic modulus is the ratio of stress (pressure) to strain (deformation) in an isotropic, homogeneous elastic material:

$$ \mathrm{modulus} = \frac{\mathrm{stress}}{\mathrm{strain}} $$

OK, what does that mean?

Elastic means what you think it means: you can deform it, and it springs back when you let go. Imagine stretching a block of rubber. If you measure the stress \(F/W^2\) (the force per unit of cross-sectional area) and the strain \(\Delta L/L\) (the stretch as a proportion of the original length) along the direction of stretch ('longitudinally'), then the stress/strain ratio gives you Young's modulus, \(E\).

Since strain is unitless, all the elastic moduli have units of pressure (pascals, Pa), and for rocks they are usually on the order of tens of GPa (tens of billions of pascals).

The other elastic moduli are the bulk modulus \(K\), the shear modulus \(\mu\), Lamé's first parameter \(\lambda\), and the P-wave modulus \(M\).

There's another quantity that doesn't fit our definition of a modulus, and doesn't have units of pressure — in fact it's unitless — but is always lumped in with the others: Poisson's ratio, \(\nu\).

What does this have to do with my data?

Interestingly, and usefully, the elastic properties of isotropic materials are described completely by any two moduli. This means that, given any two, we can compute all of the others. More usefully still, we can also relate them to \(V_\mathrm{P}\), \(V_\mathrm{S}\), and \(\rho\). This is great because we can get at those properties easily via well logs and less easily via seismic data. So we have a direct path from routine data to the full suite of elastic properties.

The only way to measure the elastic moduli themselves is on a mechanical press in the laboratory. The rock sample can be subjected to confining pressures, then squeezed or stretched along one or more axes. There are two ways to get at the moduli:

  1. Directly, via measurements of stress and strain, under so-called static conditions.

  2. Indirectly, via sonic measurements and the density of the sample. Because of the oscillatory and transient nature of the sonic pulses, we call these dynamic measurements. In principle, these should be the most comparable to the measurements we make from well logs or seismic data.

Let's see the equations then

The elegance of the relationships varies quite a bit. Shear modulus \(\mu\) is just \(\rho V_\mathrm{S}^2\), but Young's modulus is not so pretty:

$$ E = \frac{\rho V_\mathrm{S}^2 (3 V_\mathrm{P}^2 - 4 V_\mathrm{S}^2) }{V_\mathrm{P}^2 - V_\mathrm{S}^2} $$
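In fact, since \(V_\mathrm{P}\), \(V_\mathrm{S}\), and \(\rho\) determine everything else, the whole grid collapses into a few lines of code. Here's a minimal sketch in Python — the function is mine, the inputs are assumed to be in SI units, and the example numbers are merely plausible:

def moduli(vp, vs, rho):
    """Elastic moduli from P- and S-wave velocities (m/s) and density (kg/m³)."""
    mu = rho * vs**2                            # shear modulus
    lam = rho * (vp**2 - 2 * vs**2)             # Lamé's first parameter
    M = rho * vp**2                             # P-wave modulus
    K = rho * (vp**2 - 4 * vs**2 / 3)           # bulk modulus
    E = rho * vs**2 * (3*vp**2 - 4*vs**2) / (vp**2 - vs**2)   # Young's modulus
    nu = (vp**2 - 2*vs**2) / (2 * (vp**2 - vs**2))            # Poisson's ratio
    return {'mu': mu, 'lam': lam, 'M': M, 'K': K, 'E': E, 'nu': nu}

moduli(vp=2300.0, vs=1200.0, rho=2500.0)   # roughly sandstone-like numbers

Going the other way — from a pair of moduli back to velocities — is just algebraic rearrangement, which is exactly what the next post is about.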

You can see most of the other relationships in this big giant grid I've been slowly chipping away at for ages. Some of it is shown below. It doesn't have most of the P-wave modulus expressions, because no-one seems too bothered about P-wave modulus, despite its obvious resemblance to acoustic impedance. They are in the version on Wikipedia, however (but it lacks the \(V_\mathrm{P}\) and \(V_\mathrm{S}\) expressions).

Some of the expressions for the elastic moduli and velocities — click the image to see them all in SubSurfWiki.

In this table, the mysterious quantity \(X\) is given by:

$$ X = \sqrt{9\lambda^2 + 2E\lambda + E^2} $$

In the next post, I'll come back to this grid and tell you how I've been deriving all these equations using Python.


Top tip... To find more posts on rock physics, click the Rock Physics tag below!

Why Python beats MATLAB for geophysics

MATLAB — the scientific computing environment which includes a programming language — is amazing. It has probably done as much for the development of new geophysical methods, and for the teaching and learning of geophysics, as any other tool or language. A purely anecdotal assertion, but it's rare to meet a geophysicist who has not at least dabbled in MATLAB, and it is used daily in geophysics labs and classrooms. Geophysics <3 MATLAB.

It's easy to see why — MATLAB definitely has some advantages.

Advantages of MATLAB

  • Matrices. MATLAB implicitly treats arrays as matrices (the name means 'matrix laboratory'). As a result, the notation is quite intuitive for mathematicians: a*b means standard matrix multiplication (for vectors, the dot product). Slightly confusingly, to get Python-style element-wise multiplication, you add a dot: a.*b.
  • Lots of functions. MATLAB has been around for over 30 years, so there are many, many useful functions. Find them either in the core product, in one of the toolboxes, or in MATLAB Central.
  • Simulink. This block-based system design and simulation engine is much-loved by engineers. It allows users to model physical systems in an intuitive, graphical environment.
  • Easy to install. The MATLAB environment is a desktop application, so it is instantly familiar and can be managed under the same processes as any other software on your machine or in your organization.
  • MATLAB is widespread in academia. Thanks to one of those generous schemes where software corporations give free software to universities, just because they're awesome and definitely not for any other reason, students and profs have easy and free access to MATLAB. Outside academia, however, you're looking at tens of thousands of dollars.

So far so good, but it's time for geophysics to switch to Python. On the face of it, the language has a lot in common with MATLAB: they're both easy to learn, and both have broad ecosystems that make things like image processing, statistics, and signal processing easy. But Python has some special features that make it a fantastic platform for scientific computing...

Advantages of Python

  • Free and open. Thanks to one of those generous schemes where people make software and let anyone use it for any purpose for free, Python is free! Not only is it free of charge, you are free to inspect and modify the code. Open is awesome. (There are other free alternatives to MATLAB, notably GNU Octave and SciLab.)
  • General purpose. One of the things I love about Python is its flexibility. You can use it in the shell on microtasks, or interactively, or in scripts, or to write server software, or to build enterprise software with GUIs.
  • Namespaces. Everything in MATLAB lives in the main namespace, whereas Python keeps things inherently modular. To access NumPy, say, you have to import it and then use its namespace to get at its contents: numpy.array([1, 2, 3]). This has various advantages, including flexibility, readability, learnability, and portability. (See the snippet after this list.)
  • Introspection. A powerful idea in Python, introspection means that you (or your code) can see inside every module, class, and function. You can access private variables, or write code that 'knows' about other objects' interfaces.
  • Portable. You can run your Python code on any architecture, whereas to run MATLAB code you need licenses for MATLAB and every toolbox the code uses — or another pricey product to build standalone executables.
  • Popular. Python is the 7th most popular tag in Stack Overflow, whereas MATLAB is the 58th. While programming is not a popularity contest, think of your career, or the careers of your students. Once they graduate, Python will serve them better than MATLAB. There are over 300 jobs for Pythonistas on Stack Overflow Jobs right now. MATLAB jobs? Nine.
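To make the matrices and namespaces points concrete, here's what those idioms look like in NumPy. A quick sketch — the arrays are just examples, and the @ operator needs Python 3.5 or later:

>>> import numpy as np
>>> a = np.array([[1, 2], [3, 4]])
>>> b = np.array([[5, 6], [7, 8]])
>>> a * b    # element-wise, like MATLAB's a.*b
array([[ 5, 12],
       [21, 32]])
>>> a @ b    # matrix multiplication, like MATLAB's a*b
array([[19, 22],
       [43, 50]])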

So there you have it. It's time to switch to Python. If you're new to programming, there's no contest. I suppose if you're productive in MATLAB, and have access to all the toolboxes, then admittedly it's hard to say you should switch.

But I'll still say it.


I was inspired to write this post after talking to a geophysicist about using programming languages in the classroom, and by the lists in this nice post on pyzo.org. It would be interesting to hear what you use in the classroom — as an instructor or as a student. I know geophysics is being taught with the help of MATLAB (in many places), Java (e.g. at Colorado School of Mines), Mathematica (e.g. by Chris Liner). I wonder if there's anyone using JavaScript, which wouldn't be a terrible choice. Or C++? Or Fortran?? Let us know in the comments!

Tools for drawing geoscientific figures

This is a response to Boyan Vakarelov's useful post on LinkedIn about tools for creating geological figures. I especially liked his SketchUp tip.

It's a while since we wrote about our toolset, so I thought I'd document what we're currently using for making figures. You won't be surprised to hear that they're mostly open source. 

Our figure creation toolbox

  • QGIS — if it's a map, you should make it in a GIS, it's as simple as that.
  • Inkscape — for most drawing and figure creation tasks. It's just as good as Illustrator.
  • GIMP — for raster editing tasks. Rasters are no good for editable figures or line art though.
  • TimeScale Creator — a little-known tool for making editable chronostratigraphic columns. Here's an example from way back on this very blog. The best thing: you can export SVG files, then edit them in Inkscape.
  • Python, R, etc. — the best way to make reproducible scientific figures is not to draw them at all. Instead, create data visualizations programmatically; see the sketch after this list.
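To illustrate what 'programmatically' means in practice, here's about the smallest possible matplotlib figure — a sketch only, with made-up data and filename:

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 500)
plt.plot(x, np.sin(x))
plt.xlabel('time (s)')
plt.ylabel('amplitude')
plt.savefig('figure.svg')   # vector output — editable later in Inkscape

Because the figure is just the output of a script, anyone can re-run it, tweak it, or regenerate it when the data change.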

To really appreciate how fantastic the programmatic approach is, check out Sergey Fomel's treasure trove of reproducible documents, in which every figure is really just the output of a little program that anyone can run. Here's one of my own, adapted from a previous post and a sneak peek of an upcoming Leading Edge tutorial:

Different sample interpolation styles give different amplitudes for inter-sample positions, as shown at the red 'horizon' time pick. From upcoming tutorial in the April edition of The Leading Edge

Everything you wanted to know about images

Screenshots often form part of a figure, because they're so much easier than trying to figure out how to export an image, or trying to wrangle the data from scratch. If you find yourself grabbing a screenshot, and any time you're providing an image for someone else — especially if it's destined for print — you need to know all about image resolution. Read my post Save the samples for my advice. 

If you still save your images as JPEG, you also need to read my post about How to choose an image format. One day you might need the fidelity you are throwing away! Here's the short version: save everything as a PNG.

Last thing: know the difference between vector and raster graphics. Make vectors when you can.

Stop using PowerPoint!

The only bit of Boyan's post I didn't like was the bit about PowerPoint. I admit, fifteen years ago I was a bit of a slave to PowerPoint. I'd have preferred to use Illustrator at the time, but it was well beyond corporate IT's ken, and I hadn't yet discovered Inkscape. But I'm over it now — and just as well because it's a horrible drawing tool. The main limitation is not having layers, which is a show-stopper for me, but there's also the generic typography, simplistic spline editing, the inability to handle standard formats like SVG, and no scripting or plug-ins.

Getting good

If you want to learn about making effective scientific figures, I strongly recommend reading anything you can by Edward Tufte, Robert Kosara, Alberto Cairo, and Mike Bostock. For some quick inspiration check out the #dataviz hashtag on Twitter, or feast your eyes on this amazing collection of graphics, or Mike Bostock's interactive examples, or... there are too many resources to choose from.

How about you? Share your favourite tools in the comments or on Boyan's post.

A European geo-gaming hackathon

I'm convinced that hackathons are the best way to get geoscientists and engineers inventing and collaborating in new ways. They are better for learning than courses. They are better for networking than parties. And they nearly always have tacos! 

If you are unsure what a hackathon is, or why I'm so enthusiastic about them, you can read my November article in the Recorder (Hall 2015, CSEG Recorder, vol 40, no 9).

The next hackathon will be 28 and 29 May in Vienna, Austria — right before the EAGE Conference and Exhibition. You can sign up right now! Please get it in your calendar and pass it along.

Throwing down the gauntlet

Colorado School of Mines has dominated the student showing at the last two autumn hackathons. I know there are plenty more creative research groups out there. Come out and show the world your awesomeness — in teams of up to four people — and spend a weekend learning and coding. Also: there will be beer.

To everyone else: this is not a student event, it's for everyone. Most of the participants in the past have been professionals, but the more diverse it is, the more we all get out of it. So don't ask yourself if you'll fit in — you will. 

A word about the fee

Our previous hackathons have been free, but this one has a small fee. It's an experiment. Like most free events, no-shows are a challenge; I'm hoping the fee reduces the problem. If the fee makes it difficult for you to join us, please get in touch — I do not want it to be a barrier.

Just to be clear: these events do not make money. Previous events have been generously sponsored — and that's the only way they can happen. We need support for this one too: if you're a champion of creativity in science and want to support this event, you can find me at matt@agilegeoscience.com, or you can read more about sponsorship here.

Details

The dates are 28 and 29 May. The event will run 8 till 6 (or so) on both the Saturday and the Sunday. We don't have a venue finalized yet. Ideas and contributions of any kind are welcome — this is a community event.

The theme this year will be Games. If you have ideas, share them in the comments! Here are some random project ideas to get you going...

  • Acquisition optimizer: lay out the best geometry to image the geology.
  • Human inversion: add geological layers to match a seismic trace.
  • Drill wells on a budget to make the optimal map of an unseen surface.
  • Which geological section matches the (noisy) seismic section?
  • Top Trumps for global 3D seismic surveys, with data scraped from press releases.
  • Set up the best processing flow for a modeled, noisy shot gather.

It's going to be fun! If you're traveling to EAGE this year, I hope we see you there!


Photo of Vienna by Nic Piégsa, CC-BY. Photo of bridge by Dragan Brankovic, CC-BY.

Images as data

I was at the Atlantic Geoscience Society's annual meeting on Friday and Saturday, held this year in a cold and windy Truro, Nova Scotia. The AGS is a fairly small meeting — maybe a couple of hundred geoscientists make the trip — but usually good value, especially if you're working in the area. 

A few talks and posters caught my attention, as they were all around a similar theme: getting data from images. Not in an interpretive way, though — these papers were about treating images fairly literally. More like extracting impedance from seismic than, say, making a horizon map.

Drone to stereonet

Amazing 3D images generated from a large number of 2D images of outcrop. Left: the natural colour image. Middle: all facets generated by point cloud analysis. Right: the final set of human-filtered facets. © Joseph Cormier 2016

Probably the most eye-catching poster was that of Joseph Cormier (UNB), who is experimenting with computer-assisted structural interpretation. Using dozens of high-res photographs collected by a UAV, Joseph reconstructs the 3D scene of the outcrop — just from photographs, no lidar or other ranging technology. The resulting point cloud reveals the orientations of the outcrop's faces, as well as fractures, exposed faults, and so on. A human interpreter can then apply her judgment to filter these facets into tectonically significant sets, at which point they can be plotted on a stereonet. Beats crawling around with a Brunton or Suunto for days!

Hyperspectral imaging

There was another interesting poster by a local mining firm that I can't find in the abstract volume. They had some fine images from CoreScan, a hyperspectral imaging and analysis company operating in the mining industry. The technology, which can discern dozens of rock-forming minerals from their near infrared and shortwave infrared absorption characteristics, seems especially well-suited to mining, where mineralogical composition is usually more important than texture and sedimentological interpretation. 

Isabel Chavez (SMU) didn't need a commercial imaging service. To help correlate Laurasian shales on either side of the Atlantic, she presented results from using a handheld Konica-Minolta spectrophotometer on core. She found that CIE L* and a* colour parameters correlated with certain element ratios from ICP-MS analysis. Like many of the students at AGS, Isabel was presenting her undergraduate thesis — a real achievement.

Interesting aside: one of the chief applications of colour meters is measuring the colour of chips. Fascinating.

The hacker spirit is alive and well

The full spectrum (top), and the CCD responses with IR, red, green, and blue filters (bottom). All of the filters admitted some infrared light, causing problems for calibration. © Robert McEwan 2016

After seeing those images, and wishing I had a hyperspectral imaging camera, Rob McEwan (Dalhousie) showed how to build one! In a wonderfully hackerish talk, he showed how he's building a $100 mineralogical analysis tool. He started by removing the IR filter from a second-hand Nikon D90, then — using a home-made grating spectrometer — measured the CCD's responses in the red, green, blue, and IR bands. After correcting the responses, Rob will use the USGS spectral library (Clark et al. 2007) to predict the contributions of various minerals to the image. He hopes to analyse field and lab photos at many scales. 

Once you have all this data, you also have to be able to process it. Joshua Wright (UNB) showed how he segments photomicrographs into regions representing grains using FIJI, then post-processes the image data as giant arrays in an Excel spreadsheet, driven by a suite of Visual Basic macros (really!). I can see how a workflow like this might initially be more accessible to someone new to computer programming, but I felt like he may have passed Excel's sweet spot. The workflow would be much smoother in Python with scikit-image, or MATLAB with the Image Processing Toolbox. Maybe that's where he's heading. You can check out his impressive piece of work in a series of videos.
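For the curious, the core of that segmentation step might look something like this in Python with scikit-image — a rough sketch on my part, with a hypothetical filename:

from skimage import io, filters, measure

img = io.imread('photomicrograph.png', as_gray=True)   # hypothetical input image
binary = img > filters.threshold_otsu(img)             # global threshold
grains = measure.label(binary)                         # label connected regions
areas = [region.area for region in measure.regionprops(grains)]   # per-grain stats

No giant spreadsheet arrays required — the labelled image and the region properties stay in memory, ready for whatever statistics you want.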

Looking forward to 2016

All in all, the meeting was a good kick-off to the geoscience year — a chance to catch up with some local geoscientists, and meet some new ones. I also had the chance to update the group on striplog, which generated a bit of interest. Now I'm back in Mahone Bay, enjoying the latest winter storm and the feeling of having something positive to blog about!

Please be aware that, unlike the images I usually include in posts, the images in this post are not open access and remain the copyright of their respective authors.


References

Isabel Chavez, David Piper, Georgia Pe-Piper, Yuanyuan Zhang, St Mary's University (2016). Black shale Selli Level recorded in Cretaceous Naskapi Member cores in the Scotian Basin. Oral presentation, AGS Colloquium, Truro NS, Canada.

Clark, R.N., Swayze, G.A., Wise, R., Livo, E., Hoefen, T., Kokaly, R., Sutley, S.J., 2007, USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231

Joseph Cormier, Stefan Cruse, Tony Gilman, University of New Brunswick (2016). An optimized method of unmanned aerial vehicle surveying for rock slope analysis, 3D modeling, and structural feature extraction. Poster, AGS Colloquium, Truro NS, Canada.

Robert McEwan, Dalhousie University (2016). Detecting compositional variation in granites – a method for remotely sensed platform. Oral presentation, AGS Colloquium, Truro NS, Canada.

Joshua Wright, University of New Brunswick (2016). Using macros and advanced functions in Microsoft Excel™ to work effectively and accurately with large data sets: An example using sulfide ore characterization. Oral presentation, AGS Colloquium, Truro NS, Canada.

Old skool plot tool

It's not very glamorous, but sometimes you just want to plot a SEG-Y file. That's why we crafted seisplot. OK, that's why we cobbled seisplot together out of various scripts and functions we had lying around, after a couple of years of blog posts and Leading Edge tutorials and the like.

Pupils of the old skool — when everyone knew how to write a bash script, pencil crayons and lead-filled beanbags ruled the desktop, and Carpal Tunnel Syndrome was just the opening act to the Beastie Boys — will enjoy seisplot. For a start, it's command line only: 

    python seisplot.py -R -c config.py ~/segy_files -o ~/plots

Isn't that... reassuring? In this age of iOS and Android and Oculus Rift... there's still the command line interface.

Features galore

So what sort of features can you look forward to? Other than all the usual things you've come to expect of subsurface software — like a complete lack of support or documentation (LOL, I'm kidding) — there are these awesome selling points:

  • Make wiggle traces or variable density plots... or don't choose — do both!
  • If you want, the script will descend into subdirectories and make plots for every SEG-Y file it finds.
  • There are plenty of colourmaps to choose from, or if you're insane you can make your own.
  • You can make PNGs, JPGs, SVGs or PDFs. But not CGM, sorry about that.

Well, I say 'selling points', but the tool is 100% free. We think this is a fair price. It's also open source of course, so please — seriously, please — improve the source code, then share it with the world! The code is on GitHub, natch.

Never go full throwback

There is one more feature: you can go full throwback and add scribbles and coffee stains. Here's one for your wall:


The 2D seismic line in this post is from the USGS NPRA Seismic Data Archive, and is in the public domain. This is line number 31-81-PR (links directly to SEG-Y file).

White magic: calibrating seismic attributes

This post is part of a series on seismic attributes; the previous posts were...

  1. An attribute analysis primer
  2. Attribute analysis and statistics

Last time, I hinted that there might be an often-overlooked step in attribute analysis:

Calibration is a gaping void in many published workflows. How can we move past "that red blob looks like a point bar so I drew a line around it in PowerPoint" to "there's a 70% chance of finding reservoir quality sand at that location"?

Why is this step such a 'gaping void'? A few reasons:

  • It's fun playing with attributes, and you can make hundreds without a second thought. Some of them look pretty interesting, geological even. "That looks geological" is, however, not an attribute calibration technique. You have to prove it.
  • Nobody will be around when we find out the answer. There's a good chance that well will never be drilled, but when it is, you'll be on a different project, in a different company, or have left the industry altogether and be running a kayak rental business in Belize.
  • The bar is rather low. Many published examples of attribute analysis include no proof at all — just a lot of maps with convincing-looking polygons on them, and claims of 'better reservoir quality over here'.

This is getting discouraging. Let's look at an example. Now, it's hard to present this without seeming over-critical, but I know these gentlemen can handle it, and this was only a magazine article, so we needn't make too much of it. But it illustrates the sort of thing I'm talking about, so here goes.

Quoting from Chopra & Marfurt (AAPG Explorer, April 2014), edited slightly for brevity:

While coherence shows the edges of the channel, it gives little indication of the heterogeneity or uniformity of the channel fill. Notice the clear definition of this channel on the [texture attribute — homogeneity].
We interpret [the] low homogeneity feature [...] to be a point bar in the middle of the incised valley (green arrow). This internal architecture was not delineated by coherence.

A nice story, making two claims:

  1. Coherence incompletely represents the internal architecture of the channel, while the texture attribute delineates it.
  2. The labeled feature on the texture attribute is a point bar.

I know explorers have to be optimists, and geoscience is all about interpretation, but as scientists we must be skeptical optimists. Claims like this are nice hypotheses, but you have to take the cue: go off and prove them. Remember confirmation bias, and Feynman's words:

The first principle is that you must not fool yourself — and you are the easiest person to fool.

The twin powers

Making geological predictions with seismic attribute analysis requires two related workflows:

  1. Forward modeling — the best way to tune your intuition is to make a cartoonish model of the earth (2D, isotropic, homogeneous lithologies) and perform a simplified seismic experiment on it (convolutional, primaries only, noise-free). Then you can compare attribute behaviour to the known model. (A toy example follows this list.)
  2. Calibration — you are looking for an explicit, quantitative relationship between a physical property you care about (porosity, lithology, fluid type, or whatever) and a seismic attribute. A common way to show this is with a cross-plot of the seismic amplitude against the physical property.
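Here's about the simplest possible version of that forward-modeling idea, in Python with NumPy. It's a toy — every number is invented for illustration:

import numpy as np

dt = 0.002                          # sample interval (s)
t = np.arange(0, 0.5, dt)           # a half-second, zero-offset 'trace'

# Reflectivity: a single interface at 0.25 s
rc = np.zeros_like(t)
rc[int(0.25 / dt)] = 0.1

# A 25 Hz Ricker wavelet
f = 25.0
tw = np.arange(-0.1, 0.1, dt)
wavelet = (1 - 2*(np.pi*f*tw)**2) * np.exp(-(np.pi*f*tw)**2)

# Convolutional model: primaries only, noise-free
trace = np.convolve(rc, wavelet, mode='same')

Compute your attribute on trace, vary the model, and check that the attribute responds the way you assumed it would — that's the intuition-tuning step.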

When these foundations are not there, we can be sure that one or more bad things will happen:

  • The relationship produces a lot of type I errors (false positives).
  • It produces a lot of type II errors (false negatives).
  • It works at some wells and not at others.
  • You can't reproduce it with a forward model.
  • You can't explain it with physics.

As the industry shrivels and questions — as usual — the need for science and scientists, we have to become more stringent, more skeptical, and more rigorous. Doing anything else feeds the confirmation bias of the non-scientific contingent. Because it says, loud and clear: geoscience is black magic.


The image is part of the figure from Chopra, S and K Marfurt (2014). Extracting information from texture attributes. AAPG Explorer, April 2014. It is copyright of the Authors and AAPG.