Interpreting spectral gamma-ray logs

Before you can start interpreting spectral gamma-ray logs (or, indeed, any kind of data), you need to ask about quality.

Calibrate your tool...

The main issues affecting the quality of the logs are tool calibration and drilling mud composition. I think there's a tendency to assume that delivered logs have been rigorously quality checked, but... they haven't. The only safe assumption is that nobody cares about your logs as much as you. (There is a huge opportunity for service companies here — but in my experience they tend to be focused on speed and quantity, not quality.)

Calibration is critical. The measurement device in the tool consists of a thallium-doped NaI crystal and a photomultiplier. Both of these components are sensitive to temperature, so calibration is especially important when the temperature of the tool changes frequently. If the surface temperature is very different from the downhole temperature (winter in Canada, for example), calibrate often.

Drilling mud containing KCl (to improve borehole stability) increases the apparent potassium content of the formation, while barite acts as a gamma-ray absorber and reduces the count rates, especially in the low energies (potassium).

One of the key quality control indicators is negative readings on the uranium log. A few negative values are normal, but many zero-crossings may indicate that the tool was improperly calibrated. It is imperative to quality control all of the logs, for bad readings and pick-up effects, before doing any quantitative work.
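To make that concrete, here's a minimal sketch of the sort of check I mean; the file name and column labels are hypothetical, and what counts as 'too many' zero-crossings is your call.

```python
import numpy as np
import pandas as pd

# Hypothetical table of spectral GR curves: URAN (ppm), THOR (ppm), POTA (fraction).
logs = pd.read_csv("spectral_gr.csv")
uran = logs["URAN"].to_numpy()

n_negative = np.sum(uran < 0)                          # a few are normal
zero_crossings = np.sum(np.diff(np.sign(uran)) != 0)   # many suggest poor calibration

print(f"Negative U readings: {n_negative} of {uran.size}")
print(f"Zero-crossings in U:  {zero_crossings}")
```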

...and your interpretation

Most interpretations of spectral-gamma ray logs focus on the relationships between the three elemental concentrations. In particular, Th/K and Th/U are often used for petrophysical interpretation and log correlation. In calculating these ratios, Schlumberger uses the following cut-offs: if uranium < 0.5 then uranium = 0.5; if potassium < 0.004 then potassium = 0.001 (according to my reference manual for the natural gamma tool).
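If you want to compute those ratios yourself, something like this minimal Python sketch does it, applying the quoted cut-offs before dividing; the array names are hypothetical, with K as a fraction and Th and U in ppm.

```python
import numpy as np

def spectral_ratios(k, th, u):
    """Th/K and Th/U with the cut-offs quoted above applied first.
    k is fractional (v/v); th and u are in ppm."""
    k = np.asarray(k, dtype=float).copy()
    u = np.asarray(u, dtype=float).copy()
    th = np.asarray(th, dtype=float)

    u[u < 0.5] = 0.5        # if uranium < 0.5 then uranium = 0.5
    k[k < 0.004] = 0.001    # if potassium < 0.004 then potassium = 0.001

    return th / k, th / u

th_k, th_u = spectral_ratios(k=[0.02, 0.002], th=[8.0, 4.0], u=[2.0, 0.2])
```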

In general, high K values may be caused by the presence of potassium feldspars or micas. Glauconite usually produces a spike in the K log. High Th values may be associated with the presence of heavy minerals, particularly in channel deposits. Increased Th values may also be associated with an increased input of terrigenous clays. Increases in U are frequently associated with the presence of organic matter. For example, according to the ODP, particularly high U concentrations (> 5 ppm) and low Th/U ratios (< 2) often occur in black shale deposits.
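Those ODP numbers translate directly into a simple screening flag. Here's a hedged sketch using exactly the thresholds quoted above; the arrays are hypothetical, and the flag is a prompt to look closer, not a diagnosis.

```python
import numpy as np

def black_shale_flag(u_ppm, th_ppm):
    """Flag samples with high U (> 5 ppm) and low Th/U (< 2),
    which often occur in black shale deposits (per the ODP note above)."""
    u_ppm = np.asarray(u_ppm, dtype=float)
    th_ppm = np.asarray(th_ppm, dtype=float)
    th_u = th_ppm / np.where(u_ppm < 0.5, 0.5, u_ppm)   # same cut-off as before
    return (u_ppm > 5.0) & (th_u < 2.0)
```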

The logs here, from Kansas Geological Survey open-file report 90-27 by Macfarlane et al., show a quite overt interpretive approach, with the Th/K log labelled with minerals (feldspar, mica, illite–smectite) and the Th/U log labelled with uranium 'fixedness', a proxy for organic matter.

Sounds useful. But really, you can probably find a paper to support just about any interpretation you want to make. Which isn't to say that spectral gamma-ray is no use — it's just not diagnostic on its own. You need to calibrate it to your own basin and your own stratigraphy. This means careful, preferably quantitative, comparison of core and logs.

What is spectral gamma-ray?

The spectral gamma-ray log is a measure of the natural radiation in rocks. The amplitude of the signal from the gamma-ray tool, which is just a sensor with no active source, is proportional to the energy of the gamma-ray photons it encounters. Being able to differentiate between photons of different energies turns out to be very handy. Compared to the ordinary gamma-ray log, which ignores the energies and only counts the photons, it's like seeing in colour instead of black and white.

Why do we care about gamma radiation?

First, what are gamma rays? Highly energetic photons: electromagnetic radiation with very short wavelengths. 

Being able to see different energies, or 'colours', means we can differentiate between the radioactive decay of different elements. Elements decay by radiating energy, and the 'colour' of that energy is characteristic of that element (actually, of each isotope). So, we can tell by looking at the energy of a photon if we are seeing a potassium atom (40K) or a uranium atom (238U) decay. These are very different isotopes, with very different habits. We can do geology!

In fact, all sorts of radioisotopes occur naturally in the earth. By far the most abundant are potassium 40K, thorium 232Th and uranium 238U. Of these, potassium is the most abundant in sedimentary rocks, but thorium and uranium are present in small quantities, and have particular sedimentological implications.

What exactly are we measuring?

Potassium 40K decays to argon about 10% of the time, with γ-emission at 1.46 MeV (the other 90% of the time it decays to calcium). However, all of the decay in the 232Th and 238U decay series occurs by α- and β-particle decay, which don't always result in photon emission. The tool in fact measures γ-radiation from the decay of thallium 208Tl in the 232Th series, and from bismuth 214Bi in the 238U series. The spectral gamma-ray tool must be calibrated to known samples to give concentrations of 232Th and 238U from its readings. Proper calibration is vital, and is temperature-sensitive (of note in Canada!).
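For reference, the characteristic lines involved sit at well-known energies: roughly 1.46 MeV for 40K, 1.76 MeV for 214Bi, and 2.61 MeV for 208Tl. Here's a tiny sketch that labels a peak by the nearest line; the tolerance is an arbitrary choice of mine.

```python
# Characteristic gamma-ray lines used in spectral GR logging (MeV).
PEAKS = {
    1.46: "40K",      # potassium
    1.76: "214Bi",    # proxy for the 238U series
    2.61: "208Tl",    # proxy for the 232Th series
}

def identify_peak(energy_mev, tol=0.1):
    """Return the isotope whose characteristic line is nearest, or None."""
    nearest = min(PEAKS, key=lambda e: abs(e - energy_mev))
    return PEAKS[nearest] if abs(nearest - energy_mev) <= tol else None

print(identify_peak(1.45))   # '40K'
```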

The concentrations of the three elements are estimated from the spectral measurements. The concentration of potassium is usually measured in percent (%) or per mil (‰), or sometimes in kilograms per tonne, which is equivalent to per mil. The other two elements are measured in parts per million (ppm).
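Under the hood, turning spectral measurements into concentrations amounts to solving a small linear system: the count rates in a few energy windows are modelled as a sensitivity matrix times the elemental concentrations, and that matrix comes from logging known calibration formations. Here's a toy sketch of the idea; the matrix and count rates below are invented for illustration, not a real tool characterization.

```python
import numpy as np

# Toy sensitivity matrix: rows are energy windows, columns are K (%), U (ppm), Th (ppm).
# A real matrix comes from calibration pits / known formations; these numbers are made up.
S = np.array([
    [12.0, 0.8, 0.3],   # low-energy window
    [ 1.5, 4.0, 0.9],   # mid-energy window
    [ 0.2, 0.6, 2.5],   # high-energy window
])

window_counts = np.array([150.0, 48.0, 21.0])   # hypothetical count rates

# Least-squares estimate of [K, U, Th] from the window counts.
conc, *_ = np.linalg.lstsq(S, window_counts, rcond=None)
k_pct, u_ppm, th_ppm = conc
print(f"K ≈ {k_pct:.1f} %, U ≈ {u_ppm:.1f} ppm, Th ≈ {th_ppm:.1f} ppm")
```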

Here is the gamma-ray spectrum from a single sample from 509 m below the sea-floor at ODP Site 1201. The final spectrum (heavy black line) is shown after removing the background spectrum (gray region) and applying a three-point mean boxcar filter. The thin black line shows the raw spectrum. Vertical lines mark the interval boundaries defined by Peter Blum (an ODP scientist at Texas A&M). Prominent energy peaks relating to certain elements are identified at the top of the figure. The inset shows the spectrum for energies >1500 keV at an expanded scale. 
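For what it's worth, the background subtraction and three-point mean boxcar mentioned in the caption are very simple operations; here's a minimal sketch with synthetic, made-up spectra.

```python
import numpy as np

# Hypothetical arrays: counts per energy bin for the raw and background spectra.
raw = np.random.poisson(lam=50, size=256).astype(float)
background = np.full(256, 30.0)

cleaned = raw - background
smoothed = np.convolve(cleaned, np.ones(3) / 3, mode="same")  # three-point mean boxcar
```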

We wouldn't normally look at these spectra. Instead, the tool provides logs for K, Th, and U. Next time, I'll look at the logs.

Spectrum illustration by Wikipedia user Inductiveload, licensed GFDL; decay chain by Wikipedia user BatesIsBack, licensed CC-BY-SA.

Machines can read too

The energy industry has a lot of catching up to do. Humanity is faced with difficult, pressing problems in energy production and usage, yet our industry remains as secretive and proprietary as ever. One rich source of innovation we are seriously under-utilizing is the Internet. You have probably heard of it.

Machine experience design

Web sites are just the front-end of the web. Humans have particular needs when they read web pages — attractive design, clear navigation, etc. These needs are researched and described by the rapidly growing field of user experience design, often called UX. (Yes, the ways in which your intranet pages need fixing are well understood, just not by your IT department!)

But the web has a back-end too. Rather than being for human readers, the back-end is for machines. Just like human readers, machines—other computers—also have particular needs: structured data, and a way to make queries. Why do machines need to read the web? Because the web is full of data, and data makes the world go round. 

So website administrators need to think about machine experience design too. As well as providing beautiful web pages for humans to read, they should provide a widely accepted machine-readable format such as JSON or XML, and a way to make queries.
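Concretely, here's the sort of thing I mean: a course listing as a machine-readable JSON record, parsed in a few lines of Python. The fields and values are entirely hypothetical; the point is the shape of the idea.

```python
import json

# A hypothetical machine-readable version of a course listing.
record = """{
  "title": "Sequence stratigraphy for subsurface interpretation",
  "provider": "Example Training Co.",
  "start": "2013-06-10",
  "location": "Calgary",
  "fee_cad": 2950
}"""

course = json.loads(record)
print(course["title"], "--", course["start"], "in", course["location"])
```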

What can we do with the machine-readable web?

The beauty of the machine-readable web, sometimes called the semantic web, or Web 3.0, is that developers can build meta-services on it. For example, a website like hipmunk.com that finds the best flights, wherever they are. Or a service that provides charts, given some data or a function. Or a mobile app that knows where to get the oil price. 

In the machine-readable web, you could do things like:

  • Write a program to analyse bibliographic data from SEG, SPE and AAPG.
  • Build a mobile app to grab log mnemonics info from SLB's, HAL's, and BHI's catalogs.
  • Grab course info from AAPG, PetroSkills, and Nautilus to help people find training they need.

Most wikis have a public application programming interface, giving direct, machine-friendly access to the wiki's database.
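For example, MediaWiki-powered wikis expose an api.php endpoint. Here's a minimal sketch that pulls a page's raw wikitext as JSON, using Wikipedia as a stand-in; the page title is just an example, and the exact response shape can vary with the wiki's configuration.

```python
import json
from urllib.request import urlopen
from urllib.parse import urlencode

# Query the MediaWiki API for the raw source of a page, as JSON.
params = urlencode({
    "action": "parse",
    "page": "Gamma ray logging",
    "prop": "wikitext",
    "format": "json",
})

with urlopen(f"https://en.wikipedia.org/w/api.php?{params}") as resp:
    data = json.load(resp)

wikitext = data["parse"]["wikitext"]["*"]
print(wikitext[:300])   # the machine-readable source of the page
```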

At SEG last year, I suggested to a course provider that they might consider offering machine access to their course catalog—so that developers can build services that use their course information and thus send them more students. They said, "Don't worry, we're building better search tools for our users." Sigh.

In this industry, everyone wants to own their own portal, and tends to be selfish about their data and their users. The problem is that you don't know who your users are, or rather who they could be. You don't know what they will want to do with your data. If you let them, they might create unimagined value for you—as hipmunk.com does for airlines with reasonable prices, good schedules, and in-flight Wi-Fi. 

I can't wait for the Internet revolution to hit this industry. I just hope I'm still alive.

Dream geoscience courses

MOOCs mean it's never been easier to learn something new. This is an appeal for opinions. Please share your experiences and points of view in the comments.

Are you planning to take any technical courses this year? Are you satisfied with the range of courses offered by your company, or the technical societies, or the commercial training houses (PetroSkills, Nautilus, and so on)? And how do you choose which ones to take — do you just pick what you fancy, seek recommendations, or simply aim for field classes at low latitudes?

At the end of 2012, several geobloggers wrote about courses they'd like to take. Some of them sounded excellent to me too... which of these would you take a week off work for?

Here's my own list, complete with instructors. It includes some of the same themes...

  • Programming for geoscientists (learn to program!) — Eric Jones
  • Solving hard problems about the earth — hm, that's a tough one... Bill Goodway?
  • Communicating rocks online — Brian Romans or Maitri Erwin
  • Data-driven graphics in geoscience — the figure editor at Nature Geoscience
  • Mathematics clinic for geoscientists — Brian Russell
  • Becoming a GIS ninja — er, a GIS ninja
  • Working for yourself — needs multiple points of view

What do you think? What's your dream course? Who would teach it?

Making images or making prospects?

Well-rounded geophysicists will have experience in each of the following three areas: acquisition, processing, and interpretation. Generally speaking, these three areas make up the seismic method, each requiring highly specialized knowledge and tools. Historically, energy companies controlled the entire spectrum, owning the technology, the know-how, and the risk, but that is no longer the case. Now, service companies do the acquisition and the processing. Interpretation is largely hosted within E&P companies, the ones who buy land and drill wells. Not only has it become unreasonable for a single geophysicist to be proficient across the board, but organizational structures constrain any particular technical viewpoint.

Aligning with the industry's strategy, if you are a geophysicist, you likely fall into one of two camps: those who make images, or those who make prospects. One set of people to make the data, one set of people to do the interpretation.

This seems very un-scientific to me.

Where does science fit in?

Science, the standard approach of rational inquiry and accruing knowledge, is largely absent from the applied geophysical business landscape. But when science is used as a model, making images and making prospects are inseparable.

Can applied geophysics use scientific behaviour as a central anchor across disciplines?

There is a significant amount of science that is needed in the way that we produce observations, in the way that we make images. But the business landscape built on linear procedures leaves no wiggle room for additional testing and refinement. How do processors get better if they don't hear about their results? As a way of compensating, processing has deflected away from being a science of questioning, testing, and analysis, and moved more towards, well,... a process.

The sure-fire way to build knowledge and decrease uncertainty is through experimentation and testing. In this sense, the notion of selling 'solutions' is incompatible with scientific behaviour. Science doesn't claim to give solutions, science doesn't claim to give answers, but it does promise to address uncertainty; to tell you what you know.

In studying the earth, we have to accept a lack of clarity in our data, but we must not accept mistakes, errors, or mediocrity due to shortcomings in our shared methodologies.

We need a new balance. We need more connectors across these organizational and disciplinary divides. That's where value will be made as industry encounters increasingly tougher problems. Will you be a connector? Will you be a subscriber to science?

Hall, M (2012). Do you know what you think you know? CSEG Recorder 37 (2), February 2012, p 26–30. Free to download from CSEG. 

Filters that distort vision

Almost two weeks ago, I had LASIK vision correction surgery. Although the recovery took longer than average, I am seeing better than I ever did before with glasses or contacts. Better than 20/20. Here's why.

Low order and high order refractive errors

Most people (like me) who have (had) poor vision fall short of pristine correction because lenses only correct low order refractive errors. Still, any correction gives a dramatic improvement to the naked eye; further refinements may be negligible or imperceptible. Higher order aberrations, caused by small scale structural irregularities of the cornea, can still affect one's refractive power by up to 20%, and they can only be corrected using customized surgical methods.

It occurs to me that researchers in optometry, astronomy, and seismology face a common challenge: how to accurately measure, and subsequently correct for, structural deformations in refractive media, and the aberrations in wavefronts caused by such higher-order irregularities.

The filter is the physical model

Before surgery, a wavefront imaging camera was used to make detailed topographic maps of my corneas, and estimate point spread functions for each eye. The point spread function is a 2D convolution operator that blurs the otherwise clear image. It shows how a ray is scattered and smeared across the retina. Above all, it is a filter that represents the physical eye.

Point spread function (similar to mine prior to LASIK) representing refractive errors of the cornea (top two rows), and corrected vision (bottom row). Point spread functions are filters that distort both the visual and seismic realms. The seismic example is a segment of inline 25, Blake Ridge 3D seismic survey, available from the Open Seismic Repository (OSR).

Observations in optics and seismology alike are only models of the physical system, models that are constrained by the filters. We don't care about the filters per se, but they do get in the way of the underlying system. Luckily, the behaviour of any observation can be expressed as a combination of filters. In this way, knowing the nature of reality literally means quantifying the filters that cause distortion. Change the filter, change the view. Describe the filter, describe the system.
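As a small illustration of 'the filter is the physical model', here's a sketch that blurs a synthetic point of light with a Gaussian point spread function via 2D convolution; all of the numbers are arbitrary.

```python
import numpy as np
from scipy.signal import convolve2d

# A synthetic 'scene': a single bright point on a dark background.
scene = np.zeros((64, 64))
scene[32, 32] = 1.0

# An arbitrary Gaussian point spread function standing in for a real optical PSF.
x = np.arange(-7, 8)
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))
psf /= psf.sum()

# The observation is the scene seen through the filter: a 2D convolution.
observed = convolve2d(scene, psf, mode="same")
```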

The seismic experiment yields a filtered earth; a smeared reality. Seismic data processing is the analysis and subsequent removal of the filters that distort geological vision. 

This image was made using the custom filter manipulation tool in FIJI. The seismic data is available from OpendTect's Open Seismic Repository.