I recently submitted a review paper, along with four co-authors, on signal propagation in sedimentary systems across timescales. The idea that landscapes contain information about controls such as tectonics and climate has been part of our science for a very long time. But recent advances in measuring and calculating process rates (for example, with cosmogenic radionuclides), as well as in theory and modeling of how such ‘signals’ generate sediment and propagate across the Earth’s surface to be, potentially, encoded into stratigraphy, motivated us to write a review. I’ll post more about the paper once it’s gone through the review-and-revise process, but I wanted to write a brief post here on the topic.
Let’s start simple. Consider a sedimentary source-to-sink system with erosional uplands (sediment production) connected to depositional lowlands and/or a marine basin (sediment accumulation). A tectonic or climatic change can alter the rate of sediment production in the uplands, and that change is potentially recorded down-system as a change in deposition. The morphology and length scales of the system play a huge role in this behavior, which, in turn, affects how (or whether) that up-system signal is ‘preserved.’
As an analogy, consider human-made debris basins. These structures, common in steep and tectonically active mountains such as those along the west coast of North America, are designed to mitigate debris-flow hazards to communities built on slopes prone to mass failure, especially during precipitation events. Debris basins are positioned on failure-prone slopes above concentrated population and/or infrastructure and are designed to capture newly liberated sediment as it flows downslope, preventing that sediment from being transferred farther downslope, where it could cause damage and/or injury.
Essentially, these basins are localized sinks that store sediment, thus preventing the signal (in this case, a rain storm) from propagating down system as a mass-wasting event. However, if the magnitude of the event exceeds the storage capacity of the sink, part of the signal will propagate down system anyway. For example, if the volume of liberated material exceeds what the debris basin can hold, the excess mass will bypass the basin once it fills to capacity. For debris basins to remain effective they must be emptied following an event so that the storage capacity is returned to its maximum. So, in addition, time and the accumulation of multiple events play a critical role in system behavior. For example, the sediment volume released by a single rain storm may fill a debris basin to only 10% of its capacity. But material from more than 10 storms of similar magnitude, if not removed, would effectively erase the signal-stopping action of the basin, allowing future events (signals) to propagate down system.
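The bookkeeping behind this is simple enough to sketch in a few lines of code. Here is a minimal, purely illustrative Python model of the fill-and-bypass behavior; the capacities and event volumes are made-up numbers, not data from any real basin:

```python
def route_events(event_volumes, capacity):
    """Return the volume that bypasses the basin for each event,
    assuming the basin is never emptied between events."""
    stored = 0.0
    bypassed = []
    for v in event_volumes:
        space = capacity - stored          # remaining storage
        captured = min(v, space)           # basin traps what it can
        stored += captured
        bypassed.append(v - captured)      # the signal that propagates down system
    return bypassed

# Eleven storms, each liberating 10% of basin capacity: the first ten
# are fully trapped, and the eleventh bypasses entirely.
print(route_events([10.0] * 11, capacity=100.0))

# A single storm exceeding capacity: part of the signal gets through.
print(route_events([150.0], capacity=100.0))
```

The point of the toy model is just that the basin acts as a low-pass filter on event volume until its storage is consumed, after which every subsequent event propagates unfiltered.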
What is exciting (and quite daunting) is applying these concepts to much larger length-scales and much longer timescales. Over longer and longer time periods the only evidence remaining of these mass-transfer dynamics is the stratigraphic record.
See this post from FOP about debris basins. And, if you haven’t already, read John McPhee’s “The Control of Nature,” which has a section about debris flows in the San Gabriel Mtns of southern California.
You’ve probably seen the fantastic How Science Works interactive diagram and website developed by the University of California Museum of Paleontology. If not, I encourage you to check it out. The main message is that the scientific method is not a simple, linear process. This is an important aspect for both budding scientists and the public to appreciate.
A new video on the YouTube channel for the Consortium for Ocean Leadership discusses the nonlinear, iterative, collective, and highly creative endeavor of doing science in the context of Earth science and ocean drilling. The video is only 10 minutes long and includes footage from IODP Expedition 342 that I sailed on in summer 2012. (If the video isn’t embedded below, here’s the YouTube link.)
Hey look at this, I’m posting on this blog again! The academic year has ended and I feel like I can take a breath. When people told me that being a tenure-track junior prof was going to keep me very busy they weren’t kidding, holy moly. But it’s all exciting, fun, exhausting, challenging, and rewarding.
In an attempt to resuscitate this blog, I wanted to post about a few papers in sedimentary geoscience that I’m currently reading (or will read) this summer:
- From gullies to mountain belts: A review of sediment budgets at various scales — Matthias Hinderer, Sedimentary Geology, 2012 — This is a review paper on the importance of considering mass balance in Earth surface systems, with an emphasis on methods for determining sediment budgets at geologic timescales. This is not about calculating sediment budgets in a hydrology/geomorphology sense — where flux can be monitored and measured in real time along different reaches of a stream, for example — this is about what we can (and, importantly, cannot) estimate or infer regarding paleo sediment budgets. For example, what tools are being used and what critical uncertainties exist for determining sediment budgets in Quaternary (<2 million years old) systems? What about further back in time? The paper includes several case study examples that span a range of spatial and temporal scales.
- Scaling laws for aggradation, denudation, and progradation rates: The case for timescale invariance at sediment sources and sinks — Peter Sadler and Douglas Jerolmack, online-early contribution to upcoming GSL Special Publication “Strata and Time: Probing the Gaps in Our Understanding” — I predict this paper will be cited a lot by those of us interested in Earth surface dynamics; it’s a good one. I’ve written before on this blog (nearly seven years ago, time flies!) about Sadler’s contributions, which deal with how unsteady depositional processes create a potentially misleading ‘artifact’ when comparing accumulation rates measured over different durations. Sadler & Jerolmack make the point that denudation (mass removal) rates rarely suffer from this problem because methods like catchment-wide cosmogenics or thermochronology (for exhumation information) already sample the entire area of interest. Depositional (mass accumulation) rates, on the other hand, are typically measured at single locations, e.g., a borehole or outcrop section, which are rarely representative of the entire depositional area. This is perhaps an obvious point, but sometimes really important points are simple, and I think they articulate it very well. They then go on to make the case that a cross-sectional area (a 2-D slice through a depositional volume) is potentially sufficient to approximate the true 3-D mass balance, which is difficult or impossible to constrain in ancient records.
- Mid-Cretaceous to Paleocene North American drainage reorganization from detrital zircons – Mike Blum and Mark Pecha, Geology, 2014 — I have not read this yet, I’m looking forward to diving in soon. This paper uses a database of >5,000 detrital zircon ages to reconstruct continent-scale drainage basin evolution of North America over tens of millions of years of geologic history. The punchline is that during the Cretaceous much of North America drained north, toward the Arctic, and then in the Paleocene drainage switched toward the Gulf of Mexico (i.e., what became the modern Mississippi drainage system).
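The timescale effect at the heart of the Sadler & Jerolmack paper above is easy to reproduce with a toy model. This is my own illustrative sketch, not code from the paper: deposition occurs as rare pulses separated by long hiatuses, and apparent rates are computed only over intervals that preserved some sediment (mimicking the fact that we can only date intervals where deposits exist). All numbers are invented.

```python
import random

random.seed(0)

# Synthetic accumulation history: each year either deposits a 10 mm
# pulse (1% chance) or nothing at all (hiatus).
years = 100_000
history = [10.0 if random.random() < 0.01 else 0.0 for _ in range(years)]

def apparent_rate(window):
    """Mean apparent accumulation rate (mm/yr) over non-overlapping
    windows of the given duration that record at least some deposition."""
    rates = []
    for start in range(0, years - window, window):
        thickness = sum(history[start:start + window])
        if thickness > 0:                     # only preserved intervals are measurable
            rates.append(thickness / window)
    return sum(rates) / len(rates)

# Apparent rate declines as measurement duration grows, because longer
# windows average in more hiatus time.
for w in (10, 100, 10_000):
    print(f"{w:>6} yr window: {apparent_rate(w):.3f} mm/yr")
```

Short windows that happen to capture a pulse imply high rates; long windows dilute the same pulses with hiatuses, which is the ‘artifact’ Sadler documented for accumulation rates measured over different durations.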
This week’s photo is from the mouth of Wildrose Canyon in Panamint Valley, California. My grad student (for scale) and I were out this way doing field work last spring and ended up driving by this nice outcrop on our ‘commute’ from the campsite to the field site.
We didn’t spend much time examining these deposits, but they are almost certainly Pleistocene (maybe Pliocene?) alluvial-fan deposits. Notable features include the poorly sorted, matrix-supported character and the large, inclined surfaces (dipping towards the basin, from right to left).
(You probably see an advertisement right below the post. Sorry to clutter up this content with annoying ads, I don’t like it either, but I would otherwise have to pay for hosting. This option is the least intrusive.)
Remember when I used to post semi-regularly on this blog? Yeah, not so much these days. Incredibly busy with three graduate students working on three different projects, two undergraduate researchers, teaching two courses and a seminar this semester, trouble-shooting instrument issues in my lab, working with collaborators over email/phone, and trying to make progress on a couple of papers and two proposals with whatever time is left.
Overall, I’m lovin’ it, these are exciting times. It’s a joy working with motivated and curious graduate and undergraduate students. They make me want to improve how I do my science. I’m trying to continue to learn new skills (or, in some cases, re-learn skills I once knew).
Perhaps someday I’ll get back to posting on here!
Matt Hall and Evan Bianco of Agile Geoscience recently ran a two-day ‘hackathon’ where participants got together in a room and created some digital tools for working with geophysical data of subsurface geology. It’s a great idea — get some creative and passionate geoscientists and programmers (many participants are both of those) into the same room for a couple of days and build something. And they did.
The success of the Geophysics Hackathon has initiated discussion among the more stratigraphically oriented subsurface interpreters about holding a similar event. The when, where, how, and other logistical matters need to be figured out. It takes a critical mass of interested people willing to do the legwork to make stuff like this happen, so we’ll see.
But, for this post, I figured I would brainstorm some digital tools (apps?) that I would love to have for the work I do. Some may be easy to deal with, some may not … it’s a wish list. My focus is on outcrop-derived data because outcrop-based stratigraphy is a focus of my research group. But, these outcrop data are used directly or indirectly to inform how subsurface interpreters make predictions and/or characterize their data.
- Tools for converting outcrop measured sections (and core descriptions) into useable data. To me, this is the primary need. I could envision an entire event focused on this. As outcrop stratigraphers, we typically draft our 1-D measured sections into a figure from scanned images of our field notebooks. There is value in simply looking at a detailed section of sedimentological information in graphic form; this is why we spend the time to draft these illustrations. However, we also need to extract quantitative information from these illustrations (an image file of some kind) — information including bed-thickness stats, facies proportions, grain-size ‘logs’, package/cycle thickness stats, and more. The key is that it has to be flexible and able to work with an image file. That is, the data-extraction workflow cannot dictate how the original data are collected in the field or how the section is drafted. Sections are drafted in different ways, at different resolutions, and depend on exposure quality, purpose of study, etc. The post-processing must be able to handle a huge variety.
- Tools for processing the outcrop/core data (above) and generating metrics/statistics. Whether integrated into a single tool or as part of a multi-step workflow, we then need tools that automate the processing of the outcrop/core data to generate the metrics we are interested in. For example, if I have 10 sections and I want to calculate the percentages of various facies (e.g., as different colors in an image or based on a grain-size log), it would be amazing to have a script to automate that.
- A simple and accurate ‘ruler’ app for subsurface visualization software. I find myself constantly wanting to make quick (but not dirty; I want them to be accurate) measurements of the dimensions of stratigraphy of interest. Some software packages have built-in rulers, and some of these are okay, but they’re usually clunky and I end up unsatisfied. Say I want to know how wide (in meters) and how thick (in TWT milliseconds) something is in seismic-reflection data, and I want it in a matter of seconds: launch the app, a couple of clicks, and I’ve got it. I can move on and don’t have to mess around with any display settings. And if I want to do the same in a different subsurface visualization package, I can use the same app. I have no idea if this is even possible across multiple proprietary software packages, but this is what I want.
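As a trivial proof of concept for the facies-percentage script in the second wish-list item, here is a hypothetical Python sketch. The facies names, the function name, and the section itself are all invented for illustration; a real tool would ingest a drafted section or image rather than hand-typed pairs:

```python
from collections import defaultdict

def facies_proportions(section):
    """Given a measured section as (thickness_m, facies_code) pairs,
    return the fraction of total thickness occupied by each facies."""
    totals = defaultdict(float)
    for thickness, facies in section:
        totals[facies] += thickness
    grand = sum(totals.values())
    return {f: t / grand for f, t in totals.items()}

# A made-up 5-m section: sandstone accounts for 3.5 of 5.0 m, i.e. 70%.
section = [(2.0, "sandstone"), (0.5, "mudstone"),
           (1.5, "sandstone"), (1.0, "conglomerate")]
print(facies_proportions(section))
```

The hard part, as the wish list says, isn’t this tally — it’s getting from a drafted image to those (thickness, facies) pairs in the first place.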
Please add comments, write your own post, start a wiki page, etc. to add to this.