Openness is a two-way street

Last week the Data Analysis Study Group of the SPE Gulf Coast Section announced a new machine learning contest (I’m afraid registration is now closed, even though the contest has not started yet). The task is to predict shear-wave sonic from other logs, similar to the SPWLA PDDA contest last year. This is a valuable problem in the subsurface, because the shear sonic log is essential for computing the elastic properties of rocks, and therefore for predicting rock and fluid properties or for processing seismic. Indeed, TGS has built a business on predicted logs with their ARLAS product. There’s money in log prediction!
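
To make the task concrete, here is a minimal sketch of the kind of log-prediction workflow such a contest asks for, in Python with scikit-learn. The curve mnemonics (GR, RHOB, NPHI, DTC, DTS) and the synthetic data are my own placeholders, not the contest dataset:

```python
# A minimal sketch of the log-prediction task, not the contest workflow itself.
# The curve mnemonics and the synthetic data are placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
logs = pd.DataFrame({
    'GR':   rng.normal(75, 20, n),      # gamma-ray
    'RHOB': rng.normal(2.45, 0.15, n),  # bulk density
    'NPHI': rng.normal(0.25, 0.08, n),  # neutron porosity
    'DTC':  rng.normal(90, 15, n),      # compressional sonic
})
# Fake a shear sonic (DTS) that depends on the other curves, plus noise.
logs['DTS'] = 1.7 * logs['DTC'] + 40 * logs['NPHI'] + rng.normal(0, 5, n)

X, y = logs[['GR', 'RHOB', 'NPHI', 'DTC']], logs['DTS']
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE on held-out samples: {rmse:.1f}")
```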

The task looks great, but there’s one big problem: the dataset is not open.

Why is this a problem?

Before answering that, let’s look at some context.

What’s a machine learning contest?

Good question. Typically, an organization releases a dataset (financial timeseries, Netflix viewer data, medical images, or whatever). They invite people to predict some valuable property (when to sell, which show to recommend, how to treat the illness, or whatever). And they pick the best, measured against known labels on a hidden dataset.
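
In miniature, the scoring side usually looks something like this: the organizers keep the true labels hidden and score every submission against them with an agreed metric. The metric (RMSE here) and the numbers are purely illustrative:

```python
# Contest scoring in miniature: the organizers hold back the true labels
# and score each submitted prediction against them. Illustrative values only.
import numpy as np

def score(y_true, y_pred):
    """Root-mean-square error between hidden labels and a submission."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

y_hidden = [100.0, 102.5, 98.0]        # known only to the organizers
submission_a = [101.0, 103.0, 97.5]    # participant A's predictions
submission_b = [95.0, 110.0, 90.0]     # participant B's predictions

print(score(y_hidden, submission_a))   # lower is better; A wins here
print(score(y_hidden, submission_b))
```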

Kaggle is one of the largest platforms hosting such challenges, and they often attract thousands of participants — competing for large prizes. TGS ran a seismic salt-picking contest on the platform, attracting almost 74,000 submissions from 3220 teams with a $100k prize purse. Other contests are more grass-roots, like the one I ran with Brendon Hall in 2016 on lithology prediction, and like this SPE contest. It’s being run by a team of enthusiasts without a lot of resources from SPE, and the prize purse is only $1000 — representing about 3 hours of the fully loaded G&A of an oil industry professional.

What has this got to do with reproducibility?

Contests that award a large prize in return for solving a hard problem are essentially just a kind of RFP-combined-with-consulting-job. It’s brutally inefficient: hundreds or even thousands of people spend hours on the problem for free, and a handful are financially rewarded. These contests attract a lot of attention, but I’m not that interested in them.

Community-oriented events like this SPE contest — and the recent FORCE one that Xeek hosted — are more interesting and I believe they are more impactful. They have lots of great outcomes:

  • Lots of people have fun working on a hard problem and connecting with each other.

  • Solutions are often shared after, or even during, the contest, so that everyone learns and grows their toolbox.

  • A new open dataset emerges that might even become a much-needed benchmark for the task at hand.

  • Researchers can publish what they did, or do later. (The SEG ML contest tutorial and results article have 136 citations between them, largely from people revisiting the dataset to show new solutions.)

New open-source machine learning code is always exciting, but if the data is not open then the work is by definition not reproducible. It seems especially unfair — cheeky, even — to ask participants to open-source their code, but to keep the data proprietary. For sure TGS is interested in how these free solutions compare to their own product.

Well, life’s not fair. Why is this a problem?

The data is being shared with the contest participants on the condition that they may not share it. In other words it’s proprietary. That means:

  • Participants are encumbered with the liability of a proprietary dataset. Sure, TGS is sharing this data in good faith today, but who knows how future TGS lawyers will see it after someone accidentally commits it to their GitHub repo? TGS is a billion-dollar company; they will win a legal argument with you. (Having said that, there’s no NDA or anything, just a checkbox in a form. I don’t know how binding it really is… but I don’t want to be the one that finds out.)

  • Participants can’t publish reproducible papers on their own work. They can publish classic oil-industry, non-reproducible work — I did this thing but no-one can check it because I can’t give you the data — but do we really need more of that? (In the contest introductory Zoom, someone asked about publishing plots of the data. The answer: “It should be fine.” Are we really still this naive about data?)

If anyone from TGS is reading this and thinking, “Come on, we’re not going to sue anyone — we’re not GSI! — it’s fine :)” then my response is: Wonderful! In that case, why not just formalize everything by releasing the data under an open licence — preferably Creative Commons Attribution 4.0? (Unmodified! Don’t make the licensing mistakes that Equinor and NAM have made recently.) That way, everyone knows their rights, everyone can safely download the data, and the community can advance. And TGS looks pretty great for contributing an awesome dataset to the subsurface machine learning community.

I hope TGS decides to release the data with an open licence. If they don’t, it feels like a rather one-sided deal to me. And with the arrangement as it stands, there’s no way I would enter this contest.

The new reality

In Calgary last week I heard the phrase "when the industry recovers" several times. Dean Potter even went so far as to say:

Don’t believe anyone who says ‘It’s different this time’. It isn’t.

He knows what he's talking about — the guy sold his company to Vermilion in 2014 for $427 million.

But I think he's dead wrong.

What's different this time?

A complete, or at least non-glacially-slow, recovery seems profoundly unlikely to me. We might possibly be through the 'everything burns to the ground' phase, but the frenzy of mergers and takeovers has barely started. That will take at least a couple of years. If and when any stability returns to operations, it seems highly probable that it will have these features:

  1. It will be focused on shale. (Look at the Permian Basin today.)
  2. It will need fewer geoscientists. (There are fewer geological risks.)
  3. It will be driven by data. (We have barely started on this.)
  4. It will end in another crash. (Hungry animals bolt their food.)

If you're a geoscientist and have never worked fine-grained plays, I think the opportunities in front of you are going to be different from the ones you're used to. And by 'different', I mean 'scarcer'.

Where else can you look?

It may be time to think about a pivot, if you haven't already. (Pivot is lean-startup jargon for 'plan B' (or C). And I don't think it's a bad idea to think of yourself, or any business, as a start-up. Indeed, if you don't, you're headed for obsolescence.)

What would you pivot to? What's your plan B? If you think of petroleum geoscience as having a position in a matrix, think about our neighbours in that matrix. Industries are vertical; disciplines are horizontal.

Opportunities in neighbouring cells are probably within relatively easy reach. Think about:

  • Near surface: archaeology, UXO detection, engineering geophysics.
  • Geomatics, remote sensing, and geospatial analysis. Perhaps in mining or geothermal energy.
  • Stepping out of industry into education or government. People with applied knowledge have a lot to offer.
  • Making contacts in a new industry like finance or medicine. Tip: go to a conference. Talk to everyone you can find.

Think about your technical skills more broadly

I don't know where those new opportunities will come from, but I think it only takes a small shift in perspective to spot them. Think of your purpose, not your tasks. For example:

  • Many geophysicists are great quantitative scientists. If you know linear algebra or geostatistics and write code too, you have highly sought-after skills in any industry.
  • Many geologists are great at spatial analysis. If you can wield geodatabases and GIS software like a wizard, you are a valuable asset to any industry.
  • Many engineers are great at project management and analytics. If you have broken out of Excel and can drive Spotfire or Tableau, you are gold in any industry.

If you forgot to keep your skills up to date and are locked into clicking buttons in Petrel, or making PowerPoint maps of the Cardium, or fiddling with charts in Excel, I'm not sure what to tell you. Everyone has those skills. You're yesterday's geoscientist and you don't have a second to lose. 

Running away from easy

Matt and I are in Calgary at the 2017 GeoConvention. Instead of writing about highlights from Day 1, I wanted to pick out one awesome thing I saw. Throughout the convention, there is an air of sadness, of nostalgia, of struggle. But I detect a divide among us. There are people who are waiting for things to return to how they were, when life was easy. Others are exploring how to be a part of the change, instead of a victim of it. Things are no longer easy, but easy is boring.


Want to start an oil and gas company? What resources are you going to need? Computers, pricey software applications, data. Purchase all of this stuff as a one-time capital expense, build a team, get an office lease, buy desks and a Keurig. Then if all goes well, 18 months later you'll have a slide deck outlining a play that you could pitch to investors. 

Imagine getting started without laying down a huge amount of capital for all those things. What if you could rent a desk at a co-working space, access the suite of software tools that you're used to, and use their Keurig? The computer infrastructure and software are managed and maintained by an IT service company, so you don't have to worry about it.

Yesterday at the Calgary GeoConvention I heard all about how ReSourceYYC, a co-working space catering to oil and gas professionals, has introduced ResourceNET, a subscription-based cloud workstation environment for freelancers, consultants, startups, and the newly and not-so-newly underemployed community of subsurface professionals.

In making this offering, ReSourceYYC has partnered up with a number of software companies: Entero, Seisware, Surfer, ValNav, geoLOGIC, and Divestco, to name a few. The limitations and restrictions around this environment, if any, weren't totally clear. I wondered: Could I append or swap my own tools with this stack? Can I access this environment from anywhere?

It could be awesome. I think it could serve just as many freelancers and consultants as "oil and gas startups". It seems a bit too early to say, but I reckon there are literally thousands of geoscientists and engineers in Calgary that'd be all over this.

I think it's interesting and important and I hope they get it right.

The disappearing lake trick

On Sunday 20 November it's the 36th anniversary of the 1980 Lake Peigneur drilling disaster. The shallow lake — almost just a puddle at about 3 m deep — disappeared completely when the Texaco wellbore penetrated the Diamond Crystal Salt Company mine at a depth of about 350 m.

Location, location, location

It's thought that the rig, operated by Wilson Brothers Ltd, was in the wrong place. It seems a calculation error or misunderstanding resulted in the incorrect coordinates being used for the well site. (I'd love to know if anyone knows more about this as the Wikipedia page and the video below offer slightly different versions of this story, one suggesting a CRS error, the other a triangulation error.)

The entire lake sits on top of the Jefferson Island salt dome, but the steep sides of the salt dome, and a bit of bad luck, meant that a few metres were enough to spoil everyone's day. If you have 10 minutes, it's worth watching this video...

Apparently the accident happened at about 0430, and the crew abandoned the subsiding rig before breakfast. The lake was gone by dinner time. Here's how John Warren, a geologist and proprietor of Saltworks, describes the emptying in his book Evaporites (Springer 2006, and repeated on his awesome blog, Salty Matters):

Eyewitnesses all agreed that the lake drained like a giant unplugged bathtub—taking with it trees, two oil rigs [...], eleven barges, a tugboat and a sizeable part of the Live Oak Botanical Garden. It almost took local fisherman Leonce Viator Jr. as well. He was out fishing with his nephew Timmy on his fourteen-foot aluminium boat when the disaster struck. The water drained from the lake so quickly that the boat got stuck in the mud, and they were able to walk away! The drained lake didn’t stay dry for long, within two days it was refilled to its normal level by Gulf of Mexico waters flowing backwards into the lake depression through a connecting bayou...

The other source that seems reliable is Oil Rig Disasters, a nice little collection of data about various accidents. It ends with this:

Federal experts from the Mine Safety and Health Administration were not able to apportion blame due to confusion over whether Texaco was drilling in the wrong place or that the mine’s maps were inaccurate. Of course, all evidence was lost.

If the bit about the location is true, it may be one of the best stories of the perils of data management errors. If anyone (at Chevron?!) can find out more about it, please share!

10 ways to improve your data store

When I look at the industry's struggle with the data mess, I see a parallel with science's struggle with open data. I've written lots about that before, but the basic idea is simple: scientists need discoverable, accessible, documented, usable data. Does that sound familiar?

I wrote yesterday that I think we have to get away from the idea that we can manage data like we might manage a production line. Instead, we need to think about more organic, flexible strategies that cope with and even thrive on chaos. I like, or liked until yesterday, the word 'curation', because it implies ongoing care and a focus on the future. But my friend Eric Marchand was right in his comment yesterday — the dusty connotation is too strong, and dust is bad for data. I like his supermarket analogy: packaged, categorized items, each with a cost of production and a price. A more lively, slightly chaotic market might match my vision better — multiple vendors maintaining their own realms. One can get carried away with analogies, but I like this better than a library or museum.

The good news is that lots of energetic and cunning people have been working on this idea of open data markets. So there are plenty of strategies we can try, alongside the current strategy of giving giant service companies millions of dollars for their TechCloud® Integrated ProSIGHT™ Data Management Solutions.

Serve your customer:

  • Above all else, build what people need. It's amazing that this needs to be said, but ask almost anyone what they think of IT at their company and you will know that it is not how it works today. Everything you build should be in response to the business pulling. 
  • This means you have to get out of the building and talk to your customers. In person, one-on-one. Watch them use your systems. Listen to them. Respond to them.

Knock down the data walls:

  • Learn and implement open data practices inside the organization. Focus on discoverability, accessibility, and documentation of good-enough data, not on building The One True Database.
  • Encourage and fund open data practices among providers of public data. There is a big role here for our technical societies, I believe, but I don't think they have seen it yet.

I've said it before: hire loads of geeks:

  • The web (well, intranet) is your pipeline. Build and maintain proper machine interfaces (APIs and web APIs) for data; there's a sketch of what I mean after this list. What, you don't know how to do this? I know; it means hiring web-savvy data-obsessed programmers.
  • Bring back the hacker technologists that I think I remember from the nineties. Maybe it's a myth memory, but sprinkled around big companies there used to be super-geeks with degrees in astrophysics, mad UNIX skills, and the Oracle admin password. Now it's all data managers with Petroleum Technology certificates who couldn't write an awk script if your data depended on it (it does). 
  • Institute proper data wrangling and analysis training for scientists. I think this is pretty urgent. Anecdotal evidence: the top data integration tool in our business is PowerPoint... or an Excel chart with two y-axes if we're talking about engineers. (What does E&P mean?)
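
For the bullet about machine interfaces, here's a toy sketch of the sort of thing I mean, using Flask. The endpoints and the little in-memory 'database' are made up; a real service would sit in front of your actual data stores:

```python
# A toy 'machine interface' for well data: a tiny read-only web API.
# The endpoint names and the in-memory data are illustrative only.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for a real data store.
WELLS = {
    "well-001": {"name": "Example 1", "td_m": 3240.0, "curves": ["GR", "RHOB", "DTC"]},
    "well-002": {"name": "Example 2", "td_m": 2875.5, "curves": ["GR", "NPHI"]},
}

@app.route("/wells")
def list_wells():
    """Return the identifiers of every well we know about."""
    return jsonify(sorted(WELLS))

@app.route("/wells/<well_id>")
def get_well(well_id):
    """Return the metadata for one well, or 404 if it isn't there."""
    if well_id not in WELLS:
        abort(404)
    return jsonify(WELLS[well_id])

if __name__ == "__main__":
    app.run(port=5000)
```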

Three more things:

  • Let data live where it wants to live (databases, spreadsheets, wikis, SharePoint if you must). Focus on connecting data with APIs and data translators. It's pointless trying to move data to where you want it to be — you're just making it worse. ("Oh, you moved my spreadsheet? Fine, I will copy my spreadsheet.")
  • Get out of the company and find out what other people are doing. Not the other industry people struggling with data — they are just as clueless as we are. Find out what the people who are doing amazing things with data are doing: Google, Twitter, Facebook, data.gov, Wikipedia, Digital Science, The New York Times, The Guardian,... there are so many to choose from. We should invite these people to our conferences; they can help us.
  • If you only do one thing, fix search in your company. Stop tinkering with semantic ontologies and smart catalogs, just buy a Google Search Appliance and fix it. You can get this one done by Christmas.

Last thing. If there's one mindset that will really get in the way, it's the project mindset. If we want to go beyond coping with the data mess, far beyond it to thriving on it, then we have to get comfortable with the idea that this is not a project. The word is banned, along with 'initiative', 'governance', and Gantt charts. The requirements you write on the back of a napkin with three colleagues will be just as useful as the ones you get back from three months of focus groups.

No, this is the rest of your career. This is never done, next year there are better ideas, more flexible libraries, faster hardware, and new needs. It's like getting fit: this ain't an 8-week get-fit program, it's an eternity of crunches.

The photograph of Covent Market in London, Ontario is from Boris Kasimov on Flickr.

Lusi's 8th birthday

Lusi is the nickname of Lumpur Sidoarjo — 'the mud of Sidoarjo' — the giant mud volcano in the city of Sidoarjo, East Java, Indonesia. This week, Lusi is eight years old.

Before you read on, I recommend taking a look at it in Google Maps. Actually, Google Earth is even better — especially with the historical imagery. 

The mud flow was [may have been; see comments below — edit, 26 June 2014] triggered by the Banjar Panji 1 exploration well, operated by Lapindo Brantas, though the conditions may have been set up by a deadly earthquake. Mud loss events started in the early hours of 27 May 2006, seven minutes after the 6.2 Mw Yogyakarta earthquake that killed about 6,000 people. About 24 hours later, a large kick was killed and the blow-out preventer activated. Another 22 hours after this, while the crew was fishing in the killed well, mud, steam, and natural gas erupted from a fissure about 200 m southwest of the well. A few weeks after that, it was venting 180,000 m³ every day — enough mud to fill 72 Olympic swimming pools.
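
That pool figure is easy to check, if you assume the commonly quoted 50 m × 25 m × 2 m (2,500 m³) Olympic pool:

```python
# Back-of-the-envelope check on the '72 pools a day' figure.
# Assumes a nominal 50 m x 25 m x 2 m (2,500 m3) Olympic pool.
pool_m3 = 50 * 25 * 2
rate_m3_per_day = 180_000
print(rate_m3_per_day / pool_m3)   # 72.0 pools per day
```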

Thousands of years

In the slow-motion disaster that followed, as hot water from Miocene carbonates mobilized volcanic mud from Pleistocene mudstones, at least 15,000 people — and maybe as many as 50,000 people — were displaced from their homes. Davies et al. (2011) estimated that the main eruption may last 26 years, though recent sources suggest it is easing quickly. Still, during this time, we might expect 95–475 m of subsidence. And in the long term? 

By analogy with natural mud volcanoes it can be expected to continue to flow at lower rates for thousands of years. — Davies et al. (2011)

So we're only 8 years into a thousand-year man-made eruption. And there's already enough mud thrown up from the depths to cover downtown Calgary...

References and further reading

Quite a bit has been written about LUSI. The Hot Mud Flow blog tracks a lot of it. The National University of Singapore has a lot of satellite photographs, besides those you'll find in Google Earth. The Wikipedia article links to a lot of information, as you'd expect. The Interweb has a few others, including this article by Tayvis Dunnahoe in E&P Magazine. 

There are also some scholarly articles. These two are worth tracking down:

Davies, R, S Mathias, R Swarbrick and M Tingay (2011). Probabilistic longevity estimate for the LUSI mud volcano, East Java. Journal of the Geological Society 168, 517–523. DOI 10.1144/0016-76492010-129

Sawolo, N, E Sutriono, B Istadi, A Darmoyo (2009). The LUSI mud volcano triggering controversy: was it caused by drilling? Marine & Petroleum Geology 26 (9), 1766–1784. DOI 10.1016/j.marpetgeo.2009.04.002


The satellite images in this post are © DigitalGlobe and Google, captured from Google Earth, and are used here in accordance with their terms of use. The maps are © OpenStreetMap and licensed ODbL. The seismic section is from Davies et al. 2011 and © The Geological Society of London and is used here in accordance with their terms of use. The text of this post is © Agile Geoscience and openly licensed under the terms of CC-BY, as always!

Try an outernship

In my experience, consortiums under-deliver. We can get the best of both worlds by making the industry–academia interface more permeable.

At one of my clients, I have the pleasure of working with two smart, energetic young geologists. One recently finished, and the other recently started, a 14-month super-internship. Neither one had more than a BSc in geology when they started, and both are going on to do a postgraduate degree after they finish with this multinational petroleum company.

This is 100% brilliant — for them and for the company. After this gap-year-on-steroids, what they accomplish in their postgraduate studies will be that much more relevant, to them, to industry, and to the science. And corporate life, the good bits anyway, can teach smart and energetic people about time management, communication, and collaboration. So by holding back for a year, I think they've actually got a head-start.

The academia–industry interface

Chatting to these young professionals, it struck me that there's a bigger picture. Industry could get much better at interfacing with academia. Today, it tends to happen at a few key relationships, in recruitment, and in a few long-lasting joint industry projects (often referred to as JIPs or consortiums). Most of these interactions happen on an annual timescale, and strictly via presentations and research reports. In a distributed company, most of the relationships are through R&D or corporate headquarters, so the benefits to the other 75% or more of the company are quite limited.

Less secrecy, free the data! (Worksheet from the Unsolved Problems Unsession in 2013.)

Instead, I think the interface should be more permeable and dynamic. I've sat through several JIP meetings as researchers have shown work of dubious relevance, using poor or incomplete data, with little understanding of the implications or practical possibilities of their insights. This isn't their fault — the petroleum industry sucks at sharing its goals, methods, uncertainties, and data (a great unsolved problem!).

Increasing permeability

Here's my solution: ordinary human collaboration. Send researchers to intern alongside industry scientists for a month or two. Let them experience the incredible data and the difficult problems first hand. But don't stop there. Send the industry scientists to outern (yes, that is probably a word) alongside the academics, even if only for a week or two. Let them experience the freedom of sitting in a laboratory playground all day, working on problems with brilliant researchers. Let's help people help each other with real side-by-side collaboration, building trust and understanding in the process. A boring JIP meeting once a year is not knowledge sharing.

Have you seen good examples of industry, government, or academia striving for more permeability? How do the high-functioning JIPs do it? Let us know in the comments.


If you liked this, check out some of my other posts on collaboration and knowledge sharing...

Scientists not prospects

If you've never worked on 'the dark side' — selling technical products and services — you may not think much about marketing. If you work for an energy company, and especially if you're a 'decision maker' (wait, don't we all make decisions?), you may not realize that it's all aimed at you. Every ad, every sponsored beer, every trade show booth and its cute bunnies. The marketers are the explorers, and you are the prospect.

My question is: are you OK with their exploration methods?

The cost of advertising

Marketing futurists have been saying for ages that interruption advertising is dead. Uncurated, highest-bidder, information-free ads, 'inserted' (that's what they call it) into otherwise interesting and useful 'content' do not work. Or at least, people can't agree on whether they work or not. And that means they don't. 

The price of print advertising does not reflect this, however. Quite the opposite. Here's what a year of premium full-page ads will cost you in three leading publications: 

Still not bad compared to Wired ($1.67 million). You start to understand why companies hire marketing people. Negotiating volume pricing and favourable placements is a big deal; think of the money you can save. What a shame ads bring nothing at all to our community. All that money — so little impact. Well, zero impact. 

Conferences are where it really gets serious. Everything has a price. Want to buy 250 gallons of filtrate, I mean 'sponsor a coffee break'? That will be $5000 but don't worry, you get a little folded card with your name on it (and some coffee stains). How about a booth in the exhibition? They are only $23 per sq ft (about $250/m²), so that big shiny booth? That'll be about $75,000. That's before you bring in carpet, drywall, theatrical lighting, displays, and an espresso machine.

No wonder one service company executive once told me: "It's not a waste of money. It's a colossal waste of money." He said they only went because people would talk if they didn't.

Welcome to the oil industry

Walking around the trade show at SEG the other week, we were not very surprised to be accosted by a troop of young women dressed in identical short, tight dresses, offering beer tickets. Where's the beer? At their booth, obviously, about half a kilometre away (Manhattan distance). Apparently the marketing department assumed that no-one in their right mind would visit their booth on the basis of their compelling products or their essential relationships with an engaged and enthusiastic user community. Come to think of it, they were probably correct.

One innovative company has invented time travel, but unfortunately only to 1975. At least, that's the easiest way to explain the shoeshine stand at Ovation's booth. You can imagine the marketing meeting: "Let's get some women in short skirts, and get them to shine people's shoes!" I expect someone said, "Wait, isn't this a technical conference for subsurface scientists? Shouldn't we base our marketing strategy on delighting the industry with our unbeatable services?" — and after a moment's reflection, raucous laughter confirmed that yes, the sexy shoeshine stand was indeed an awesome idea.

Let's be clear: marketing, as practised in this industry, is a waste of money. And this latter kind of marketing — remarkable for all the wrong reasons — is an insult to our profession and our purpose. 

Are we cool with this?

Last year I asked whether we (our community of technical practitioners and scientists) are okay with burning 210 person-years of productivity at a major conference and having very little to show for it at the end. 

Today I'm asking a different question: are we okay with burning millions of dollars on glossy ads, carpeted booths, nasty coffee, and shoeshine stands? Is this an acceptable price for our attention? Is the signal:noise ratio high enough?

I am not sure where I'm going with all of this — I am still trying to figure out what I think about it all. But I know one thing: I can't stand it. I will not step into another exhibition. I am withdrawing my attention — which I suppose means that yours is now worth a tiny bit more. Or less.

Update on 2013-10-16 17:10 by Matt Hall

Anyone involved in marketing who has actually read this far without surfing off in disgust might like to carry on reading the follow-up to this post — Do something that scares you.

Key technology trends in earth science

Yesterday, I went to the workshop entitled Grand challenges and research opportunities in geophysics, organized by Cengiz Esmersoy, Wafik Beydoun, Colin Sayers, and Yoram Shoham. I was curious whether there'd be overlap with the Unsolved Problems Unsession we hosted in Calgary, and had reservations about it being an overly fluffy talkshop, but it was much better than I expected.

Ken Tubman, VP of Geosciences and Reservoir Engineering at ConocoPhillips, gave a splendid talk to open the session. But it was the third talk, from Mark Koelmel, General Manager of Earth Sciences at Chevron, that resonated most with me. He highlighted five trends in applied earth science.

Data and information management

Data volumes are expanding with Moore's law. Chevron has more than 15 petabytes of data; by 2020 they will have more than 100 PB. Koelmel postulated that spatial metadata and tagging will become pervasive and our data formats will have to evolve accordingly. Instead of managing ridiculously large amounts of data, a better solution may be to 'tag it and chuck it in the closet' — Google's approach to the web (and we know the company has been exploring the use of Hadoop). Beyond hardware, he stressed that new industry standards are needed now. The status quo is holding us back.
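
As a rough illustration of what 'expanding with Moore's law' implies, here's the projection you get if the volume doubles every two years, starting from 15 PB in 2013. The two-year doubling period is my assumption, not Koelmel's:

```python
# Illustrative projection only: ~15 PB in 2013, doubling every two years.
# The doubling period is an assumed figure, not from the talk.
start_year, start_pb, doubling_years = 2013, 15, 2

for year in range(start_year, 2021):
    volume_pb = start_pb * 2 ** ((year - start_year) / doubling_years)
    print(f"{year}: {volume_pb:6.0f} PB")
# By 2020 this gives roughly 170 PB, comfortably past the 100 PB mark.
```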

Full azimuth seismic data

Only recently have we been able to wield the computing power to deal with the kind of processes needed for full-waveform inversion. It's not only because of data volumes that new processing facilities will not be cheap — or small. He predicted processing centres that resemble small cities in terms of their power consumption. An interesting notion of energy for energy, and the reason for recent massive growth in Google's power production capability. (Renewables for power, oil for cooling... how funny would that be?)

Interpretive seismic processing and imaging

Interpretation and processing are actually the same thing. The segmented seismic workflow will have to be stitched back together. Imagine the interpreter working on field data, with a mixing board to produce just the right image for today's work. How will service companies (who acquire data and make images) and operators (who interpret data and make prospects) merge their efforts? We may have to consider different business relationships.

Full-cycle interpretation systems

The current state of integration is sequential at best: each node in a workflow produces static inputs for the next step, with minimal iteration in between. Each component of the sequence typically ends with 'throwing things over the wall' to the next node. With this process, the uncertainties are cumulative throughout, which is unnerving because we don't often know what the uncertainties are. Koelmel's desired future state is one of seamless geophysical processing, static model-building, and dynamic reservoir simulation. It won't remove the uncertainties altogether, but by design it will make them easier to identify and address.

Intellectual property

The number of patents filed in this industry has more than tripled in the last decade. I assumed Koelmel was going to give a Big Oil lecture on secrecy and patents, touting them as a competitive advantage. He said just the opposite. He asserted that industries with excessive patenting (think technology, and Big Pharma) make innovation difficult. Chevron is no stranger to the patent process, filing 125 patents in each of 2011 and 2012, but this is peanuts compared to Schlumberger (462 in 2012) and IBM (6457 in 2012). 

The challenges geophysicists are facing are not our own. They stem from the biggest problems in the industry, which are of incredible importance to mankind. Perhaps expanding the value proposition to such heights is more essential than ever. Geophysics matters.

Where have all the geologists gone?

Fresh off the plane from my vacation in Europe, I spent today exploring the Exhibition at ExCel London, at the 2013 EAGE convention. It's a massive venue, and I spent the entire day there having conversations. I didn't look upon a single PowerPoint slide all day, and it was awesome. 

Seismic domination

This is my first time at the EAGE conference, and I was expecting to see a fairly equal spread of geoscientists and engineers. I was wrong. The exhibition hall, at least, is dominated by seismic acquisition and processing companies. Which suggests one thing: there is big business in seismic methods — manufacturing equipment, designing and acquiring surveys, and processing all that data.

Additionally, I counted 17 operating companies out on display. Recruiting and networking hoopla at its finest. In contrast, I don't think I saw one operator on the exhibition floor one month ago at the Calgary GeoConvention.  

Apparently, geophysics is hot. But where have all the geologists gone? Are they lurking in the shadows? Based on the technology represented at the exhibition, are we at risk of homogenizing the industry and all calling ourselves geophysicists one day?

Even though the diversity of disciplines appears to be lacking, marketing creativity certainly is not. Today, I listened to a string quartet perform at one booth, while sipping on a carrot-and-ginger juice freshly squeezed at another. Moments later, I struggled through a hard-to-hear conversation because the noise from a Formula One pit crew demonstration was deafening. It is both amazing and disturbing what expense companies will rack up to try to be remarkable. But such remarks are of the fleeting kind: they fade as soon as the song, the drink, or the tire-change is over.

And being a spectator of it all, I am reminded that the remarkability I am after is of the more enduring kind.