
Wish list of digital ‘tools’ for stratigraphic analysis

October 5, 2013

Matt Hall and Evan Bianco of Agile Geoscience recently ran a two-day ‘hackathon’ where participants got together in a room and created some digital tools for working with geophysical data of subsurface geology. It’s a great idea — get some creative and passionate geoscientists and programmers (many participants are both of those) into the same room for a couple of days and build something. And they did.

The success of the Geophysics Hackathon has initiated discussion among the more stratigraphically oriented subsurface interpreters to have a similar event. The when, where, how, and other logistical matters need to be figured out. It takes a critical mass of interested people willing to do the leg work to make stuff like this happen, so we’ll see what happens.

But, for this post, I figured I would brainstorm some digital tools (apps?) that I would love to have for the work I do. Some may be easy to deal with, some may not … it’s a wish list. My focus is on outcrop-derived data because outcrop-based stratigraphy is a focus of my research group. But, these outcrop data are used directly or indirectly to inform how subsurface interpreters make predictions and/or characterize their data.

  • Tools for converting outcrop measured sections (and core descriptions) into usable data. To me, this is the primary need. I could envision an entire event focused on this. As outcrop stratigraphers, we typically draft our 1-D measured sections into a figure from scanned images of our field notebooks. There is value in simply looking at a detailed section of sedimentological information in graphic form; this is why we spend time drafting these illustrations. However, we also need to extract quantitative information from these illustrations (an image file of some kind): bed-thickness statistics, facies proportions, grain-size 'logs', package/cycle thickness statistics, and more. The key is that it has to be flexible and able to work with an image file. That is, the data-extraction workflow cannot dictate how the original data are collected in the field or how the section is drafted. Sections are drafted in different ways, at different resolutions, and depend on exposure quality, purpose of study, etc. The post-processing must be able to deal with a huge variety.
  • Tools for processing outcrop/core data (above) and generating metrics/statistics. Whether integrated into a single tool or as part of a multi-step workflow, we then need some tools that automate the processing of the outcrop/core data to generate the metrics we are interested in. For example, if I have 10 sections and I want to calculate the percentages of various facies (e.g., as different colors in an image, or based on a grain-size log), it would be amazing to have a script to automate that.
  • A simple and accurate 'ruler' app for subsurface visualization software. I find myself constantly wanting to make quick (but not dirty; I want them to be accurate) measurements of the dimensions of stratigraphy of interest. Some software packages have built-in rulers, and some of these are okay, but they're usually clunky and I end up unsatisfied. Say I want to know how wide (in meters) and how thick (in TWT milliseconds) something is in seismic-reflection data, and I want it in a matter of seconds: launch the app, a couple of clicks, and I've got it. I can move on and don't have to mess around with any display settings. And if I want to do the same in a different subsurface visualization package, I can use the same app. I have no idea if this is even possible across multiple proprietary software packages, but this is what I want.
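To make the first two wish-list items a bit more concrete, here is a minimal sketch in Python (the facies names and thicknesses are made up for illustration). It assumes the hard part, digitizing a drafted section, has already been done, reducing each section to a list of (facies, bed thickness) pairs; once the data are in that form, facies proportions and bed-thickness statistics fall out in a few lines:

```python
from collections import defaultdict

def facies_stats(section):
    """Compute facies proportions and mean bed thicknesses from a
    digitized 1-D measured section given as (facies, thickness_m) beds."""
    total = sum(thickness for _, thickness in section)
    beds_by_facies = defaultdict(list)
    for facies, thickness in section:
        beds_by_facies[facies].append(thickness)
    # proportion = summed thickness of a facies / total section thickness
    proportions = {f: sum(ts) / total for f, ts in beds_by_facies.items()}
    # simple per-facies bed-thickness statistic (mean); others are easy to add
    mean_thickness = {f: sum(ts) / len(ts) for f, ts in beds_by_facies.items()}
    return proportions, mean_thickness

# hypothetical section: four beds, thicknesses in metres
section = [("sand", 2.0), ("mud", 0.5), ("sand", 1.5), ("mud", 1.0)]
props, means = facies_stats(section)
print(props)   # sand = 3.5/5.0 = 0.7, mud = 1.5/5.0 = 0.3
print(means)   # sand = 1.75 m, mud = 0.75 m
```

Running the same function over 10 sections is then just a loop, which is exactly the kind of automation the second bullet asks for. The genuinely hard, unsolved step is the image-to-data conversion that produces the `(facies, thickness)` list in the first place.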

Please add comments, write your own post, start a wiki page, etc. to add to this.


5 Comments
  1. October 5, 2013 12:19 pm

    Hi Brian. I personally think the measuring tools in Petrel, Kingdom Suite, etc, are OK. What would be much more powerful and useful is a tool that allowed measurements in ms TWT in a time-migrated volume to be directly converted to metres, based on a layer-cake (or more sophisticated) velocity model ‘layered’ on top of the time data.
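    The conversion this comment describes can be sketched as follows; this is a minimal example assuming the simplest layer-cake case, a single constant interval velocity over the measured interval, whereas a real tool would sample a full velocity model (the function name and numbers here are hypothetical):

    ```python
    def twt_to_thickness_m(top_twt_ms, base_twt_ms, interval_velocity_mps):
        """Convert a two-way-time interval (milliseconds) to thickness in
        metres, assuming one constant interval velocity (m/s)."""
        one_way_time_s = (base_twt_ms - top_twt_ms) / 2.0 / 1000.0
        return one_way_time_s * interval_velocity_mps

    # e.g. a 50 ms TWT interval at a 2500 m/s interval velocity:
    thickness = twt_to_thickness_m(1200.0, 1250.0, 2500.0)
    print(thickness)  # 25 ms one-way = 0.025 s; 0.025 * 2500 = 62.5 m
    ```

    Extending this to the layered case the comment asks for would mean splitting the TWT interval at each velocity-model boundary and summing the per-layer thicknesses.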

  2. October 6, 2013 4:46 am

    A company I used to work for a few years back makes pretty powerful image analysis software (http://www.definiens.com/solutions-overview/product-list.html). They focus on biomedical applications, but at least in the old days, the software itself was “programmable” with different rule-sets for the interpretation of what was in images. Pricey, but may be worth a look.

  3. October 7, 2013 7:05 am

    The other week I saw a presentation by the UT Dallas folks that touches on part of your wish list. http://www.utdallas.edu/research/cybermapping/work.html

    I think GIS software could do what you are asking: instead of the "normal" map view for querying, the data would be mapped on a cross-section.

  4. bukanepo permalink
    October 20, 2013 9:32 am

    Hi Brian,
    Maybe hyperspectral imaging would be useful in core analysis; it could provide results that are more quantifiable and statistically robust.

  5. April 27, 2014 12:46 am

    Hi Brian,
    You should try WellCAD. It is widely used by oil and gas (and also mining) companies for core descriptions, and it is also usable for outcrops (it runs on Windows, so you could use a Panasonic Toughpad in the field). Everything is done in depth, you can also add wireline logs and borehole image logs, and it can easily give you statistics as well.
    If you want to try it out, please send me an email: rien.corstanje@alt.lu.
