Old skool plot tool

It's not very glamorous, but sometimes you just want to plot a SEG-Y file. That's why we crafted seisplot. OK, that's why we cobbled seisplot together out of various scripts and functions we had lying around, after a couple of years of blog posts and Leading Edge tutorials and the like.

Pupils of the old skool — when everyone knew how to write a bash script, pencil crayons and lead-filled beanbags ruled the desktop, and Carpal Tunnel Syndrome was just the opening act for the Beastie Boys — will enjoy seisplot. For a start, it's command line only:

    python seisplot.py -R -c config.py ~/segy_files -o ~/plots

Isn't that... reassuring? In this age of iOS and Android and Oculus Rift... there's still the command line interface.

Features galore

So what sort of features can you look forward to? Other than all the usual things you've come to expect of subsurface software, like a complete lack of support or documentation. (LOL, I'm kidding.) Only these awesome selling points:

  • Make wiggle traces or variable density plots... or don't choose — do both!
  • If you want, the script will descend into subdirectories and make plots for every SEG-Y file it finds.
  • There are plenty of colourmaps to choose from, or if you're insane you can make your own.
  • You can make PNGs, JPGs, SVGs or PDFs. But not CGM, sorry about that.

Well, I say 'selling points', but the tool is 100% free. We think this is a fair price. It's also open source of course, so please — seriously, please — improve the source code, then share it with the world! The code is on GitHub, natch.
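
If you're curious what a tool like this does under the hood, the gist is only a few lines of Python. Here's a minimal sketch of a variable density plot (not seisplot's actual code): it assumes segyio for the reading, though any SEG-Y reader will do, and the filename is just an example.

    import segyio
    import matplotlib.pyplot as plt

    # Load every trace into a 2D array, one row per trace.
    # ignore_geometry treats the file as a simple 2D line.
    with segyio.open('31_81_PR.sgy', ignore_geometry=True) as f:
        data = f.trace.raw[:]                        # shape (n_traces, n_samples)
        dt = f.bin[segyio.BinField.Interval] / 1e6   # sample interval in seconds

    # Variable density plot: traces across, two-way time down.
    plt.imshow(data.T, cmap='Greys', aspect='auto',
               extent=[0, data.shape[0], dt * data.shape[1], 0])
    plt.xlabel('Trace')
    plt.ylabel('Time (s)')
    plt.savefig('31_81_PR.png', dpi=200)

Wiggles are scarcely harder: plot each trace against time, offset horizontally by its trace number. Most of the rest is convenience: config handling, directory walking, and so on.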

Never go full throwback

There is one more feature: you can go full throwback and add scribbles and coffee stains. Here's one for your wall:


The 2D seismic line in this post is from the USGS NPRA Seismic Data Archive, and is in the public domain. This is line number 31-81-PR (links directly to SEG-Y file).

White magic: calibrating seismic attributes

This post is part of a series on seismic attributes; the previous posts were...

  1. An attribute analysis primer
  2. Attribute analysis and statistics

Last time, I hinted that there might be an often-overlooked step in attribute analysis:

Calibration is a gaping void in many published workflows. How can we move past "that red blob looks like a point bar so I drew a line around it in PowerPoint" to "there's a 70% chance of finding reservoir quality sand at that location"?

Why is this step such a 'gaping void'? A few reasons:

  • It's fun playing with attributes, and you can make hundreds without a second thought. Some of them look pretty interesting, geological even. "That looks geological" is, however, not an attribute calibration technique. You have to prove it.
  • Nobody will be around when we find out the answer. There's a good chance that well will never be drilled, and if it ever is, you'll be on a different project, in a different company, or have left the industry altogether and be running a kayak rental business in Belize.
  • The bar is rather low. Many published examples of attribute analysis include no proof at all, just a lot of maps with convincing-looking polygons on them, and claims of 'better reservoir quality over here'.

This is getting discouraging. Let's look at an example. Now, it's hard to present this without seeming over-critical, but I know these gentlemen can handle it, and this was only a magazine article, so we needn't make too much of it. But it illustrates the sort of thing I'm talking about, so here goes.

Quoting from Chopra & Marfurt (AAPG Explorer, April 2014), edited slightly for brevity:

While coherence shows the edges of the channel, it gives little indication of the heterogeneity or uniformity of the channel fill. Notice the clear definition of this channel on the [texture attribute — homogeneity].
We interpret [the] low homogeneity feature [...] to be a point bar in the middle of the incised valley (green arrow). This internal architecture was not delineated by coherence.

A nice story, making two claims:

  1. Coherence incompletely represents the internal architecture of the channel.
  2. The labeled feature on the texture attribute is a point bar.

I know explorers have to be optimists, and geoscience is all about interpretation, but as scientists we must be skeptical optimists. Claims like this are nice hypotheses, but you have to take the cue: go off and prove them. Remember confirmation bias, and Feynman's words:

The first principle is that you must not fool yourself — and you are the easiest person to fool.

The twin powers

Making geological predictions with seismic attribute analysis requires two related workflows:

  1. Forward modeling — the best way to tune your intuition is to make a cartoonish model of the earth (2D, isotropic, homogeneous lithologies) and perform a simplified seismic experiment on it (convolutional, primaries only, noise-free). Then you can compare attribute behaviour to the known model.
  2. Calibration — you are looking for an explicit, quantitative relationship between a physical property you care about (porosity, lithology, fluid type, or whatever) and a seismic attribute. A common way to show this is with a cross-plot of the seismic amplitude against the physical property.
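
Here's what those twin workflows look like in a minimal Python sketch. Everything is invented for illustration: a 1D impedance model with a sand whose impedance falls with porosity, a Ricker wavelet, primaries-only convolution, and then a crossplot of a picked trough amplitude against the porosity we pretended to know. Cartoonish on purpose.

    import numpy as np
    import matplotlib.pyplot as plt

    def ricker(f, dt, n):
        """Ricker wavelet with peak frequency f Hz, n samples at dt seconds."""
        t = (np.arange(n) - n // 2) * dt
        return (1 - 2 * (np.pi * f * t)**2) * np.exp(-(np.pi * f * t)**2)

    rng = np.random.default_rng(42)

    # Forward modeling: toy 1D earth models with varying sand porosity.
    porosity = rng.uniform(0.05, 0.30, size=50)
    amplitude = []
    for phi in porosity:
        imp = np.full(200, 6.0e6)                 # background impedance
        imp[100:120] = 6.0e6 * (1 - 1.5 * phi)    # sand: impedance falls with porosity
        rc = np.diff(imp) / (imp[1:] + imp[:-1])  # reflection coefficients
        syn = np.convolve(rc, ricker(25, 0.002, 64), mode='same')
        amplitude.append(syn.min())               # pick the trough at the sand top

    # Calibration: an explicit, quantitative relationship.
    plt.scatter(porosity, amplitude)
    plt.xlabel('Porosity (fraction)')
    plt.ylabel('Trough amplitude')
    print('Correlation:', np.corrcoef(porosity, amplitude)[0, 1])
    plt.show()

If the crossplot is a structureless cloud, you don't have a calibration, and the attribute has no business on a prospect map.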

When these foundations are not there, we can be sure that one or more bad things will happen:

  • The relationship produces a lot of type I errors (false positives).
  • It produces a lot of type II errors (false negatives).
  • It works at some wells and not at others.
  • You can't reproduce it with a forward model.
  • You can't explain it with physics.
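
The first two failure modes are at least easy to keep score on, if you have the discipline to tabulate predictions against outcomes at the wells. A minimal sketch, with invented results:

    import numpy as np

    predicted = np.array([1, 1, 0, 1, 0, 1, 1, 0, 0, 1])  # attribute says 'sand'
    actual    = np.array([1, 0, 0, 1, 1, 1, 0, 0, 0, 0])  # what the wells found

    false_pos = np.sum((predicted == 1) & (actual == 0))  # type I errors
    false_neg = np.sum((predicted == 0) & (actual == 1))  # type II errors
    print(false_pos, 'false positives,', false_neg, 'false negatives in', len(actual), 'wells')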

As the industry shrivels and questions — as usual — the need for science and scientists, we have to become more stringent, more skeptical, and more rigorous. Doing anything else feeds the confirmation bias of the non-scientific contingent. Because it says, loud and clear: geoscience is black magic.


The image is part of the figure from Chopra, S and K Marfurt (2014). Extracting information from texture attributes. AAPG Explorer, April 2014. It is copyright of the authors and AAPG.

A coding kitchen in Stavanger

Last week, I travelled to Norway and held a two-day session of our Agile Geocomputing Training. We convened at the newly constructed Innovation Dock in Stavanger, and set up shop in an oversized, swanky kitchen. Despite the industry-wide squeeze on spending, the event still drew a modest turnout of seven geoscientists. That's way more traction than we've had in North America lately, so thumbs up to Norway! And, since our training is designed to be very active, a group of seven is plenty comfortable.

A few of the participants had some prior experience writing code in languages such as Perl, Visual Basic, and C, but the majority showed up without any significant programming experience at all. 

Skills start with syntax and structures 

On the first day we covered the basic principles of programming but, because Python is awesome, we dived into live coding right from the start. As an instructor, I find that live coding has two hidden benefits: it stops me from racing ahead, and making mistakes in the open gives students permission to do the same.

Using geoscience data right from the start, students learned about the key data structures (lists, dicts, tuples, and sets), and why they might choose one over another for a given job. They wrote their own mini-module containing functions and classes for getting stratigraphic tops from a text file.

Since syntax is rather dry and unsexy, I see the instructor's main role as inspiring and motivating, through examples that connect to things learners already know well. The ideal container for stratigraphic picks is a dictionary. Logs, surfaces, and seismic are best cast into 1-, 2-, and 3-dimensional NumPy arrays, respectively. And so on.
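
To give you a flavour, here's a stripped-down version of the kind of function the students wrote. The file format (one 'name, depth' pair per line) is just an assumption for illustration, as are the names and numbers in the comment:

    def get_tops(fname):
        """Read stratigraphic tops from a text file with one
        'name, depth' pair per line, returning a dict that maps
        top names to depths."""
        tops = {}
        with open(fname) as f:
            for line in f:
                name, depth = line.split(',')
                tops[name.strip()] = float(depth)
        return tops

    # e.g. get_tops('tops.txt') might return {'Wyandot': 867.3, 'Baccaro': 2472.5}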

Notebooks inspire learning

We've seen it time and time again. People really like the format of Jupyter Notebooks (formerly IPython Notebooks). It's like there is something fittingly scientific about them: narrative, code, output, repeat. As a learning document, they aren't static — in fact they're meant to be edited. But they aren't so open-ended that learners fail to launch. Professional software developers may not 'get it', but scientists really do. Start at the start, end at the end, and you've got a complete record of your work.

You don't get that with the black-box, GUI-heavy software applications we're used to. Maybe all legitimate work should be done in notebooks: self-contained, fully reproducible, and extensible. Maybe notebooks, in their modularity and granularity, will be the new go-to format for technical work.

Outcomes and feedback

By the end of day two, folks were parsing stratigraphic and petrophysical data from text files, then rendering and stylizing illustrations. A few were even building interactive animations of 3D seismic volumes. One recommendation was to create a sort of FAQ or cookbook: "How do I read a log?", "How do I read SEG-Y?", "How do I calculate elastic properties from a well log?". A couple of people remarked that they would have liked even more coached exercises, maybe even an extra day, in recognition of the virtue of sustained and structured practice.
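
In that spirit, here's the sort of entry such a cookbook might contain. This sketch assumes the log lives in an LAS file and uses lasio, one of several libraries that can read them:

    import lasio

    las = lasio.read('well.las')   # read an LAS file (path is illustrative)
    gr = las['GR']                 # a curve, e.g. gamma-ray, as a NumPy array
    depth = las.index              # the depth reference for every sample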


Want training too?

Head to our courses page for a list of upcoming courses, or more details on how you can train your team.


Photographs in this post are courtesy of Alessandro Amato del Monte, via aadm on Flickr.

Why I don't flout copyright

Lots of people download movies illegally. Or spoof their IP addresses to get access to sports fixtures. Or use random images they found on the web in publications and presentations (I've even seen these with the watermark of the copyright owner on them!). Or download PDFs for people who aren't entitled to access (#icanhazpdf). Or use sketchy Russian paywall-crumbling hacks. It's kind of how the world works these days. And I realize that some of these things don't even sound illegal.

This might surprise some people, because I go on so much about sharing content, open geoscience, and so on. But I am an annoying stickler for copyright rules. I want people to be able to re-use any content they like, without breaking the law. And if people don't want to share their stuff, then I don't want to share it.

Maybe I'm just getting old and cranky, but FWIW here are my reasons:

  1. I'm a content producer. I would like to set some boundaries on how my stuff is shared. In my case, the boundaries amount to nothing more than attribution, which is only fair. But still, it's my call, and I think that's reasonable, at least until the material is, say, 5 years old. But some people don't understand that open is good, that shareable content is better than closed content, that this is the way the world wants it. And that leads to my second reason:
  2. I don't want to share closed stuff as if it was open. If someone doesn't openly license their stuff, they don't deserve the signal boost — they told the world to keep their stuff secret. Why would I give them the social and ethical benefits of open access while they enjoy the financial benefits of closed content? This monetary benefit comes from a different segment of the audience, obviously. At least half the people who download a movie illegally would not, I submit, have bought the movie at a fair price.

So make a stand for open content! Don't share stuff that the creator didn't give you permission to share. They don't deserve your gain filter.

Rockin' Around The Christmas Tree

I expect you know at least one geoscientist. Maybe you are one. Or you want to be one. Or you want one for Christmas. It doesn't matter. The point is, it'll soon be Christmas. If you're going to buy someone a present, you might as well get them something cool. So here's some cool stuff!

Gadgets

There isn't a single geologist alive who wouldn't think this was awesome. It's a freaking Geiger counter! It goes in your pocket! It only costs USD 60, or CAD 75, or less than 40 quid! Absurd: these things normally cost a lot more.

OK, if you didn't like that, you're not going to like this IR spectrometer. Yes, a pocket molecular sensor, for sensing molecules in pockets. It does cost USD 250 though, so make sure you really like that geologist!

Back down to earth, a little USB microscope ticks most of the geogeek boxes. This one looks awesome, and is only USD 40 but there are loads, so maybe do some research.

Specimens

You're going to need something to wave all that gadgetry at. If you go down the well-worn path of the rock & mineral set, make sure it's a good size, like this 100-sample monster (USD 70). Or go for the novelty value of fluorescent specimens (USD 45) — calcite, sphalerite, and the like.

If minerals seem passé for a geologist, then take the pure line with a tour of the elements. This set — the last of its kind, by the way — costs USD 565, but it looks amazing. Yet it can't hold a candle to this beauty, all USD 5000 of it — which I badly want but, let's face it, will never get.

Home

If you have a rock collection, maybe you want a mineralogical tray (USD 35) to put them in? The same store has all sorts of printed fabrics by designers Elena Kulikova and Karina Eibitova. Or how about some bedding?

These steampunk light switch plates are brilliant and varied (USD 50). Not geological at all, just awesome.

I don't think they are for sale, but check out Ohio artist Alan Spencer's ceramic pieces reflecting each of the major geological periods. They're pretty amazing.

Lego

My kids are really into Lego at the moment. Turns out there are all sorts of sciencey kits you can get. I think the Arctic Base Camp (USD 90) is my favourite that's available at the moment, and it contains some kind of geological-looking type (right).

I don't condone the watching of television programmes, except Doctor Who obviously, but they do sometimes make fun Lego sets. So there's the Doctor, naturally, and other things like Big Bang Theory.

You can fiddle with these while you wait for the awesome HMS Beagle model to come out.

Books etc.

A proven success — winner of the Royal Society's prestigious Winton Prize for science books this year — is Adventures in the Anthropocene: A Journey to the Heart of the Planet We Made, by Gaia Vince, Milkweed Editions, September 2015. Available in hardback and paperback.

Lisa Randall's Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe (HarperCollins) just came out, and is doing remarkably well at the moment. It's getting decent reviews too. Randall is a cosmologist, and she reckons the dinosaurs were obliterated by a comet nudged out of orbit by mysteriousness. Hardback only.

If those don't do it for you, I reviewed some sciencey comic books recently... or there's always Randall Munroe.

Or you could try poking around in the giftological posts from 2011, 2012, 2013, or 2014.

Still nothing? OK, well, there's always chocolate :)


The images in this post are all someone else's copyright and are used here under fair use guidelines. I'm hoping the owners are cool with people helping them sell stuff!

The big data eye-roll

First, let's agree on one thing: 'big data' is a half-empty buzzword. It's shorthand for 'more data than you can look at', but really it's more than that: it branches off into other hazy territory like 'data science', 'analytics', 'deep learning', and 'machine intelligence'. In other words, it's not just 'large data'. 

Anyway, the buzzword doesn't bother me too much. What bothers me is when I talk to people at geoscience conferences about 'big data', about half of them roll their eyes and proclaim something like this: "Big data? We've been doing big data since before these punks were born. Don't talk to me about big data."

This is pretty demonstrably a load of rubbish.

Here's the thing: the 'big data' movement is not acquiring loads of data and then throwing 99% of it away. They are not processing it in a purely serial pipeline, making arbitrary decisions about parameters along the way. They are not losing most of it in farcical enterprise data management train-wrecks. They are not locking most of their data up in systems so closed that even they don't know they have it.

They are doing the opposite of all of these things.

If you think 'big data', 'data science' and 'machine learning' are old hat in geophysics, then you have some catching up to do. Sure, we've been toying with simple neural networks for years, e.g. probabilistic neural nets with 1 hidden layer — though this approach is very, very far from being mainstream in subsurface — but today this is child's play. Over and over, and increasingly so in the last 3 years, people are showing how new technology — built specifically to handle the special challenge that terabytes bring — can transform any quantitative endeavour: social media and online shopping, sure, but also astronomy, robotics, weather prediction, and transportation. These technologies will show up in petroleum geoscience and engineering. They will eat geostatistics for breakfast. They will change interpretation.
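
How child's play? A network with one hidden layer is now a couple of lines in scikit-learn. A minimal sketch, with random arrays standing in for your attributes and well outcomes:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))                  # 3 seismic attributes at 200 locations
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic 'sand' labels

    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000)
    net.fit(X[:150], y[:150])                      # train on 150 locations
    print('Accuracy:', net.score(X[150:], y[150:]))  # test on the held-out 50

The point is not this toy; it's that the same few lines scale to the terabyte-sized, thousand-feature problems these libraries were built for.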

So when you read that Google has open sourced its TensorFlow deep learning library (9 November), or that Microsoft has too (yesterday), or that Facebook has too (months ago), or that Airbnb has too (in August), or that there are a bazillion other super easy-to-use packages out there for sophisticated statistical learning, you should pay a whole heap of attention! Because machine learning is coming to subsurface.

Moving ahead with social interpretation

After quietly launching Pick This — our social image interpretation tool — in February, we've been busily improving the tool and now we're moving into 2016 with a plan for world domination. I summed up the first year of development in one of the interpretation sessions at SEG 2015. Here's a 13-minute version of my talk:

In 2016 we'll be exploring ways to adapt the tool to in-house corporate use, mainly by adding encryption and private groups. This way, everyone with @awesome.com email addresses, say, would be connected to each other, and their stuff would only be shared among the group, not with the general public.

Some other functionality is on the list of things to do:

  • Other types of interpretation than points, lines and polygons.
  • Ways to find content more easily, for example with tags like 'Seismic' or 'Outcrop'.
  • Ways to follow individuals, or get notifications of new interpretations on an image.
  • More ways to visualize and generally get at the data Pick This produces.

We're always open to suggestions. Please get in touch if you have a neat idea!

What now?

Times are rock hard in industry right now.

If you have a job, you're lucky — you have probably already survived one round of layoffs. There will likely be more, especially when the takeovers start, which they will. I hope you survive those too. 

If you don't have a job, you probably feel horrible, but of course that won't get you anywhere. I heard one person call it an 'involuntary sabbatical', and I love that: it's the best chance you'll get to re-invent, re-learn, and find new direction. 

If you're a student, you must be looking out over the wasteland and wondering what's in store for you. What on earth?

More than one person has asked me recently about Agile. "You got out," they say, "how did you do it?" So instead of bashing out another email, I thought I'd blog about it.

Consulting in 2015

I didn't really get out, of course, I just quit and moved to rural Nova Scotia.

Living out here does make it harder to make a living, and things on this side of the fence, so to speak, are pretty gross too, I'm afraid. Talking to others at SEG suggested that I'm not alone among small companies in this view. A few of the larger outfits, IKON and GeoTeric for instance, seem to be doing well, but they also have product, which at least offers some income diversity.

Agile started as a 100% bootstrapped effort to be a consulting firm that's more directly useful to individual professional geoscientists than anyone else. Most firms target corporate accounts and require permission, a complicated contract, an AFE, and 3 months of bureaucracy to hire. It turns out that professionals are unable or unwilling to engage at that lower, grass-roots level, though: almost everyone assumes you actually need the permission, contracts, AFEs, and so on, to get hired in any capacity, even for something as small as "Help me tie this well." So usually we are hired into larger, longer-term projects, just like anyone else.

I still think there's something in this original idea — the Uberification of consulting services, if you will — maybe we'll try again in a few years.

But if you are out of work and were thinking of getting out there as a consultant... I'm an optimistic person, but unless you are very well known (for being awesome), it's hard for me to honestly recommend even trying. It's just not the reality right now. We've been lucky so far, because we work in geothermal and government as well as in petroleum, but oil & gas was over half our revenue last year. It will be about 0% of it this year, maybe slightly less.

The transformation of Agile

All of which is to explain why we are now, since January, consciously and deliberately turning ourselves into a software technology R&D company. The idea is to be less dependent on our dysfunctional industry, and less dependent on geotechnical work. We build new tools for hard problems — data problems, interpretation problems, knowledge sharing problems. And we're really good at it.

We hired another brilliant programmer in August, and we're all learning more every day about our playground of scientific computing and the web — machine learning, cloud services, JavaScript frameworks, etc. The first thing we built was modelr.io, which is still in active development. Our latest project is around our tool pickthis.io. I hope it works out because it's the most fun I've had on a project in ages. Maybe these projects spin out of Agile, maybe we keep them in-house.

So that's our survival plan: invent, diversify, and re-tool like crazy. And keep blogging.

F**k it

Some people are saying, "things will recover, sit it out" but I think that's awful — the very worst — advice. I honestly think your best bet right now* is to find an accomplice, set aside 6 months and some of your savings, push everything off your desk, and do something totally audacious. 

Something you can't believe no-one has thought of doing yet.

Whatever it was you just thought of — that's the thing.

You might as well get started.


* Unless you have just retired, are very well connected in industry, have some free time, and want to start a new, non-commercial project that will profoundly benefit the subsurface community for the next several decades at least. Because I'd like to talk to you about another audacious plan...

Notes from a hackathon

The spirit of invention is alive and well in exploration geophysics! Last weekend, Agile hosted the 3rd annual Geophysics Hackathon at Propeller, a large and very cool co-working space in New Orleans, Louisiana.

A community of creative scientists

Commensurate with the lower-than-usual turnout at the SEG Annual Meeting, which our event preceded, we had 15 hackers. Not all of them were competing; the rest were hanging out, self-teaching, or hacking around with code.

As in Denver, we had an amazing showing from Colorado School of Mines, with 6 participants. I don't know what's in the water over there in the Rockies, or what the profs have been feeding these students, but it works. Such smart, creative talent. But it can't stay this one-sided... one day we'll provoke Stanford into competitive geophysics programming.

Other than the Mines crew, we had one other student (Agile's Ben Bougher, who's at UBC) and the dynamic wiki duo from SEG; the rest were professional geoscientists from large and small companies. So it was pretty well balanced between academia and industry.

Thank you

As always, we are indebted to the sponsors and supporters of the hackathon. The event would be impossible without their financial support, and much less fun without their eager participation. This year we teamed up with three companies:

  • OpenGeoSolutions, a fantastic group of geophysicists based in Calgary. You won't find better advice on signal processing problems. Jamie Alison and Greg Partyka also regularly do us the honour of judging our hackathon demos, which is wonderful.
  • EMC, a huge cloud computing company, generously supported us through David Holmes, their representative for our industry, and a fellow Landmark alum. David also kindly joined us for much of the hackathon, including the judging, which was great for the teams.
  • Palladium Consulting, a Houston-based bespoke software house run by Sebastian Good, were a new sponsor this year. Sebastian reached out to a New Orleans friend and business partner of his, Graham Ganssle, to act as a judge, and he was beyond generous with his time and insight all weekend. He also acted as a rich source of local knowledge.

Although he craves no spotlight, I have to recognize the personal generosity of Karl Schleicher of UT Austin, who is one of the most valuable assets our community has. His tireless promotion of open data and open source software is an inspiration.

And finally, Maitri Erwin again visited to judge the demos on Sunday. She brings the perfect blend of a deep and rigorous expertise in exploration geoscience and a broad and futuristic view of technology in the service of humankind. 

I will do a round up of the projects in the next couple of weeks. Look out for that because all of the projects this year were 'different'. In a good way.


If this all sounds like fun, mark your calendars for 2016! I think we're going to try running it after SEG next year, so set aside 22 and 23 October 2016, and we'll see you there. Bring a team!

PS You can already sign up for the hackathon in Europe at EAGE next year!

More highlights from SEG

On Monday I wrote that this year's Annual Meeting seemed subdued. And so it does... but as SEG continued this week, I started hearing some positive things. Vendors seemed pleasantly surprised that they had made some good contacts, perhaps as many as usual. The technical program was as packed as ever. And of course the many students here seemed to be enjoying themselves as much as ever. (New Orleans might be the coolest US city I've been to; it reminds me of Montreal. Sorry Austin.)

Quieter acquisition

Pramik et al. (of Geokinetics) reported on field testing of the AquaVib, their marine vibrator source. This instrument has been around for a while; indeed, it was first tested over 20 years ago by IVI and later Geco (e.g. see J Bird, TLE, June 2003). If perfected, it will allow much quieter marine seismic acquisition, reducing harm to marine mammals, with no loss of quality (the images below are from their abstract, and are copyright of the authors and SEG):

Ben told me one of his favourite talks was Schostak & Jenkerson's report from a JIP (a joint industry project involving Shell, ExxonMobil, Total, and Texas A&M) trying to build a new marine vibrator. Three designs are being tested by the current consortium: an electrical model manufactured by PGS, a mechanical piston by APS, and a bubble resonator by Teledyne.

In other news:

  • Talks at Dallas 2016 will only be 15 minutes long. Hopefully this is to allow room in the schedule for something else, not just more talks.
  • Dave Hale has retired from Colorado School of Mines, and apparently now 'writes software with Dean Witte'. So watch out for that!
  • A sure sign of industry austerity: "Would you like Bud Light, or Miller Light?"
  • Check out the awesome ribbons that some clever student thought of. I'm definitely pinching that idea.

That's all I have for now, and I'm flying home today so that's it for SEG 2015. I will be reporting on the hackathon soon I promise, and I'll try to get my paper on Pick This recorded next week (but here's a sneak peek). Stay tuned!


References

Pramik, B, M L Bell, A Grier, and A Lindsay (2015). Field testing the AquaVib: an alternate marine seismic source. SEG Technical Program Expanded Abstracts 2015, p 181–185. doi: 10.1190/segam2015-5925758.1

Schostak, B, and M Jenkerson (2015). The Marine Vibrator Joint Industry Project. SEG Technical Program Expanded Abstracts 2015, p 4961–4962. doi: 10.1190/segam2015-6026289.1