GeoLog

GeoTalk: To understand how ice sheets flow, look at the bedrock below

GeoTalk is a regular feature highlighting early career researchers and their work. In this interview we speak to Mathieu Morlighem, an associate professor of Earth System Science at the University of California, Irvine, who uses models to better understand ongoing changes in the cryosphere. At the General Assembly he received a 2018 Arne Richter Award for Outstanding Early Career Scientists.

Could you start by introducing yourself and telling us a little more about your career path so far?

Mathieu Morlighem (Credit: Mathieu Morlighem)

I am an associate professor at the University of California Irvine (UCI), in the department of Earth System Science. My current research focuses on better understanding and explaining ongoing changes in Greenland and Antarctica using numerical modelling.

I actually started glaciology by accident… I was trained as an engineer at Ecole Centrale Paris in France, and was interested in aeronautics and space research. I contacted someone at the NASA Jet Propulsion Laboratory (JPL) in the US to do a six-month internship at the end of my master’s degree, thinking that I would be designing spacecraft. This person was actually a famous glaciologist (Eric Rignot), which I did not know. He explained that I was knocking on the wrong door, but that he was looking for students to build a new-generation ice sheet model. I decided to accept this offer and worked on developing a new ice sheet model (ISSM) from scratch.

Even though this was not what I was anticipating as a career path, I truly loved this experience. My initial six-month internship became a PhD, and I then moved to UCI as a project scientist, before getting a faculty position two years later. Looking back, I feel incredibly lucky to have seized that opportunity. I came to the right place, at the right time, surrounded by wonderful people.

This year you received an Arne Richter Award for Outstanding Early Career Scientists for your innovative research in ice-sheet modelling. Could you give us a quick summary of your work in this area?

The Earth’s ice sheets are losing mass at an increasing rate, causing sea levels to rise, and we still don’t know how quickly they could change over the coming centuries. This is a major source of uncertainty in sea level rise projections, and the only way to reduce it is to improve ice flow models, which would in turn help policy makers with coastal planning and the choice of mitigation strategies.

I am interested in understanding the interactions of ice and climate by combining state-of-the-art numerical modelling with data collected by satellites and airplanes (remote sensing) or directly on-site (in situ). Modelling ice sheet flow at the scale of Greenland and Antarctica remains scientifically and technically challenging. Important processes are still poorly understood or missing in models, so we have a lot to do.

Over the past 10 years, together with two colleagues at the Jet Propulsion Laboratory, I have been developing the UCI/JPL Ice Sheet System Model (ISSM), a new-generation, open-source, high-resolution, higher-order physics ice sheet model. I am still actively developing ISSM and it is the primary tool of my research.

More specifically, I am working on improving our understanding of ice sheet dynamics and the interactions between the ice and the other components of the Earth system, as well as improving current data assimilation capability to correctly initialize ice sheet models and capture current trends. My work also involves improving our knowledge of the topography of Greenland and Antarctica’s bedrock (through the development of new algorithms and datasets). Knowing the shape of the ground beneath the two ice sheets is key for understanding how an ice sheet’s grounding line (the boundary where grounded ice begins to float) changes and how quickly chunks of ice will break from the sheet, a process known as calving.

Steensby Glacier flows around a sharp bend in a deep canyon. (Credit: NASA/ Michael Studinger)

At the General Assembly, you presented a new, comprehensive map of Greenland’s bedrock topography beneath its ice and the surrounding ocean’s depths. What is the importance of this kind of information for scientists?

I ended up working on this new map because we could not make any reliable simulations with the bedrock maps that were available a few years ago: they were missing key features, such as deep fjords that extend tens of kilometres under the ice sheet, ridges that stabilize the retreat, and underwater sills (sea floor barriers) that may block warm ocean waters at depth from interacting with the ice.

Subglacial bed topography is probably the most important input parameter in an ice sheet model and remains challenging to measure. The bed controls the flow of ice and its discharge into the ocean through a set of narrow valleys occupied by outlet glaciers. I am hoping that the new product that I developed, called BedMachine, will help reduce the uncertainty in numerical models, and help explain current trends.

3D view of the bed topography and ocean bathymetry of the Greenland Ice Sheet from BedMachine v3 (Credit: Peter Fretwell, BAS)

How did you and your colleagues create this map, and how does it compare to previous models?

The key ingredient in this map is that much of it is based on physics instead of a simple “blind” interpolation. Bedrock elevation is measured by airborne radars, which send electromagnetic pulses into the Earth’s immediate sub-surface and collect information on how this energy is reflected back. By analyzing the echo of the electromagnetic wave, we can determine the ice thickness along the radar’s flight lines. Unfortunately, we cannot determine the topography away from these lines, so the bed needs to be interpolated between the flight lines to provide complete maps.
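To make the radar step concrete, here is a minimal sketch of how a bed echo’s two-way travel time converts into an ice-thickness estimate. This is not the actual processing chain used for these surveys; the permittivity constant is a commonly used value for glacier ice, and the example travel time is invented for illustration.

```python
# Illustrative sketch: ice thickness from a radar echo's two-way travel time.
C_VACUUM = 3.0e8   # speed of light in vacuum (m/s)
EPS_ICE = 3.15     # relative permittivity of glacier ice (assumed constant)

def ice_thickness(two_way_travel_time_s: float) -> float:
    """Thickness implied by the two-way travel time of the bed echo (metres)."""
    wave_speed_in_ice = C_VACUUM / EPS_ICE ** 0.5  # ~1.69e8 m/s in ice
    # The pulse travels down to the bed and back, hence the factor of 2.
    return wave_speed_in_ice * two_way_travel_time_s / 2.0

print(round(ice_thickness(12e-6)))  # a 12 microsecond echo -> ~1014 m of ice
```

In practice the hard part is picking the bed echo out of a noisy radargram; the conversion above is the easy final step.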

During my PhD, I developed a new method to infer the bed topography beneath the ice sheets at high resolution based on the conservation of mass and optimization algorithms. Instead of relying solely on bedrock measurements, I combine them with data on ice flow speed that we get from satellite observations, how much snow falls onto the ice sheet and how much melts, as well as how quickly the ice is thinning or thickening. I then use the principle of conservation of mass to map the bed between flight lines. This method is not free of error, of course! But it does capture features that could not be detected with other existing mapping techniques.
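The mass-conservation idea can be illustrated with a toy one-dimensional sketch. All numbers below are made up, and the real method works in two dimensions with optimization algorithms and formal error control; but the core principle is the same: starting from a radar thickness measurement on one flight line, the ice flux is carried downstream, adjusted only by the apparent mass balance (accumulation minus melt minus thinning), and thickness between flight lines follows as flux divided by the observed speed.

```python
import numpy as np

# Toy 1D mass-conservation bed mapping between two flight lines (made-up data).
dx = 1000.0                          # grid spacing (m)
x = np.arange(0, 20001, dx)          # 20 km between two radar flight lines
u = 200.0 + 5.0 * x / 1000.0         # ice speed from satellites (m/yr)
a_dot = 0.3 * np.ones_like(x)        # apparent mass balance (m of ice/yr)

H_measured = 800.0                   # radar ice thickness at x = 0 (m)
flux = np.empty_like(x)
flux[0] = u[0] * H_measured          # ice flux at the flight line (m^2/yr)
for i in range(1, len(x)):
    # Conservation of mass: flux changes only by what is added or removed.
    flux[i] = flux[i - 1] + a_dot[i] * dx

H = flux / u                         # inferred thickness away from the line
print(round(H[-1]))                  # thickness 20 km downstream
```

Note how the inferred thickness decreases downstream even though mass is being added: the ice speeds up, so the same flux is carried by a thinner column.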

3D view of the ocean bathymetry and ice sheet speed (yellow/red) of Greenland’s Northwest coast (Credit: Mathieu Morlighem, UCI)

What are some of the largest discoveries that have been made with this model? 

Looking at the bed topography alone, we found that many fjords beneath the ice, all around Greenland, extend for tens, and in some cases hundreds, of kilometres and remain below sea level. Scientists had previously thought that the glaciers would not have to retreat much to reach higher ground, subsequently avoiding additional ice melt from exposure to warmer ocean currents. However, with this new description of the bed under the ice sheet, we see that this is not true. Many glaciers will not detach from the ocean any time soon, and so the ice sheet is more vulnerable to melt than we thought.

More recently, a team of geologists in Denmark discovered a meteorite impact crater hidden underneath the ice sheet! I initially thought that it was an artifact of the map, but it is actually a very real feature.

Perhaps more importantly, this map has been developed by an ice sheet modeller, for ice sheet modellers, in order to improve the reliability of numerical simulations. There are still many places where it can be improved, but the models are now really starting to look promising: thanks to this map, we not only understand the variability in ice dynamics and retreat all around the ice sheet, we are now able to model it! We still have a long way to go, but it is an exciting time to be in this field.

Interview by Olivia Trani, EGU Communications Officer

Geosciences Column: Meshing models with the small-scale ocean

The latest Geosciences Column is brought to you by Nikita Marwaha, who explains how a new generation of marine models is letting scientists open up the oceans. The new technique, described in Ocean Science, reveals what’s happening to ocean chemistry and biology at scales that are often hard to model…

Diving into the depths of the ocean without getting your feet wet is possible through biogeochemical modelling – a method used by scientists in order to study the ocean’s living systems. These simulated oceans are a means of understanding the role of underwater habitats and how they evolve over time. Covering nutrients, chlorophyll concentrations, marine plants, acidification, sea-ice coverage and flows, such modelling is an important tool used to explore the diverse field of marine biogeochemistry.

Barents Sea plankton bloom: sub-mesoscale flows may be responsible for the twisted, turquoise contours of this bloom (Credit: Jeff Schmaltz, MODIS Land Rapid Response Team, NASA GSFC)

There is one outstanding problem with this technique, though: very small-scale, or sub-mesoscale, marine processes are not well represented in global ocean models. Sub-mesoscale interactions take place on a scale so small that computational models are unable to resolve them. Short for sub-medium (or ‘sub-meso’) length flows, the smaller flows in question are on the scale of 1–10 km. They are difficult to measure and observe, but their effects are seen in satellite imagery as they twist and turn beautiful blooms of marine algae.

Sub-mesoscale phenomena play a significant role in vertical nutrient supply – the vertical transfer of nutrients from nutrient-rich deep waters to light-rich surface waters where plankton photosynthesise. This is a major area of interest since the growth of marine plants is limited by this ‘two-layered ocean’ dilemma. But the ocean is partially able to overcome this, which is where sub-mesoscale flows come in. Sub-mesoscale flows are important in regions with large temperature differences over short distances – when colder, heavier water flows beneath warmer, lighter water. This movement brings nutrient-rich water up to the light-rich surface. Therefore, accurately modelling these important small-scale processes is vital to studying their effect on ocean life.

Global chlorophyll concentration: red and green areas indicate a high level of growth, whereas blue areas have much less phytoplankton. (Credit: University of Washington)

A group of scientists, led by Imperial College’s Jon Hill, probes the technique of biogeochemical ocean modelling and the issue of studying sub-mesoscale processes in a paper recently published in the EGU journal Ocean Science. Rather than simply increasing the resolution of the models, the team suggests a novel method: utilising recent advances in adaptive mesh computational techniques. This simulates ocean biogeochemical behavior on a vertically adaptive computational mesh – a method of numerically analysing complex processes using a computer simulation.

What makes it adaptive? The mesh changes in response to the biogeochemical and physical state of the system throughout the simulation.

Their model is able to reproduce the general physical and biological behavior seen at three ocean stations (India, Papa and Bermuda), but two case studies really showcase this method’s potential: observing the dynamics of chlorophyll at Bermuda and assessing the sinking detritus at Papa. The team changed the adaptivity metric used to determine the varying mesh sizes, and in both instances the technique suitably determined the mesh sizes required to resolve these sub-mesoscale processes. This suggests that adaptive mesh technology may offer future utility as a technique for simulating seasonal or transient biogeochemical behavior at high vertical resolution – whilst minimising the number of elements in the mesh. Further work will enable this to become a fully 3D simulation.
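The core adaptivity idea – concentrating vertical resolution where a tracer varies rapidly and economising where the water column is uniform – can be sketched in a few lines. This is a toy illustration with a made-up chlorophyll profile and a simple gradient-based metric, not the metric or the remeshing machinery used in the paper.

```python
import numpy as np

# Toy 1D vertical mesh adaptation: cluster nodes where chlorophyll changes fast.
z = np.linspace(0, 200, 2001)              # depth (m), fine reference grid
chl = np.exp(-((z - 40.0) / 10.0) ** 2)    # invented bloom centred at 40 m

grad = np.abs(np.gradient(chl, z))         # adaptivity metric: tracer gradient
density = 0.02 + grad / grad.max()         # target node density (never zero)
cdf = np.cumsum(density)
cdf /= cdf[-1]                             # cumulative distribution of density

n_nodes = 30
# Equidistribute nodes in the metric: invert the CDF at equal increments.
mesh = np.interp(np.linspace(0, 1, n_nodes), cdf, z)

# Spacing near the bloom is much finer than in the quiet deep water.
near = np.diff(mesh)[np.searchsorted(mesh, 40.0)]
deep = np.diff(mesh)[-1]
print(near < deep)  # True
```

The same budget of 30 nodes would give a uniform spacing of about 7 m everywhere; equidistribution instead spends most of them inside the bloom, which is exactly the trade-off the adaptive mesh exploits.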

Comparison of different meshes produced by adaptive simulations: (a) Bermuda, taking the amount of chlorophyll into account (b) the original adaptive simulation at Bermuda, without taking chlorophyll into account (c) adaptive simulation at Papa, taking the amount of detritus into account (d) the original Papa simulation, without taking detritus into account. (Credit: Hill et al., 2014)

The fruits of this adaptive way of studying the small-scale ocean are already emerging as the secrets of the mysterious, sub-mesoscale ocean processes are probed. The ocean holds answers to questions about our planet, its future and the role of this complex, underwater world in the bigger, ecological picture – adapting to life and how we model it may just be the key we’ve been looking for.

By Nikita Marwaha

Reference:

Hill, J., Popova, E. E., Ham, D. A., Piggott, M. D., and Srokosz, M.: Adapting to life: ocean biogeochemical modelling and adaptive remeshing, Ocean Sci., 10, 323–343, 2014.

GeoTalk: The mantle and models and measurements, oh my! Talking geophysics with Juan Carlos Afonso

This week in GeoTalk, we’re talking to Juan Carlos Afonso, a geophysicist from Macquarie University, Sydney. He explains how a holistic approach is crucial to understanding tectonic processes and how a little “LitMod philosophy” can go a long way to achieving this…

First, could you introduce yourself and tell us a little about what you are currently working on?

My name is Juan Carlos Afonso and I’m a geophysicist currently working at Macquarie University in Sydney, Australia.  My research interests lie in the fields of geophysics and geodynamics, and span many different geophysical and geological processes. My current research integrates a lot of different disciplines, such as mineral physics, petrology, geodynamics, lithospheric modelling, nonlinear inversion, and physics of the mantle, to explore and improve our understanding of lithospheric evolution and plate tectonics.

More specifically, I am interested in the thermochemical structure and evolution of the lithospheric mantle, the mechanical and geochemical interactions between tectonic plates and the sublithospheric upper mantle, and their effects on small- and large-scale tectonic processes. The lithosphere is critical to humans because it is the reservoir of most of the natural resources on which modern society depends, as well as the locus of important geological and biological process such as seismic activity, CO2-recycling, mineralisation events, and volcanism!

Juan Carlos out in the field! (Credit: Juan Carlos Afonso)

During EGU 2012, you received a Division Outstanding Young Scientists Award for your research into the lithosphere and its properties. Could you tell us a bit more about your work in this area?

First of all, it was such a humbling experience to receive this award. I really admire the previous awardees and it is a real honour to have received this award.

I was selected for this award based mainly on the work I did on combining different geophysical and geochemical datasets into a single conceptual framework that has become known as the “LitMod approach”. This theoretical and computational framework fully integrates geochemistry, mineral physics, thermodynamics, and geophysics in an internally-consistent manner*, and allows researchers from different disciplines – seismology, geodynamics, petrology, mineral physics, etc. – to construct models of the Earth that satisfy not just one particular set of observations, but a multitude of observations. This is of primary importance because it guarantees consistency between theories and models (i.e. you can’t cheat!), and results in better and more robust data interrogation and interpretation. This approach is being applied to a wide range of geodynamic and geophysical problems, from studying the water content of the mantle to inferring the thermal structure of Venus.

More recently, my colleagues and I presented the idea of multi-observable probabilistic inversion, a technique that is similar to CAT-scanning in medicine, but that we used to study the thermochemical (or thermo-chemical-mechanical) structure of the lithosphere and upper mantle. We showed that it is a feasible, powerful and general method that makes the most out of available datasets and helps reconcile disparate observations and interpretations. This unifying framework brings researchers from diverse disciplines together under a unique holistic platform where everything is connected to everything else and it will hopefully help understand the workings of the Earth in a more complete manner. But there is a lot of work yet to be done to achieve this!!

…and off duty! (Credit: Juan Carlos Afonso)

How can programmes like LitMod help improve our understanding of plate tectonics?

A great scientist recently said “Each single discipline within the geosciences has progressed tremendously over the 20th century; the problems now lie at the interfaces between the sub-disciplines and ensuring that all geoscientific data are honoured in integrated models. We are well beyond the time when scientists can present their interpretations based on mono-discipline thinking. We absolutely must think of the Earth as a single physico-chemical system that we are all observing with different tools.” These sentences capture very well the spirit of the LitMod approach, which forces you to think about and interpret geoscientific data in a manner that ensures consistency (as much as possible!). I think one of the reasons for the interest in such an approach is the need for robust and easy-to-use tools that researchers from different disciplines can apply to their individual datasets (seismic, gravity, magnetotelluric, etc.) and use to explore the connections to other related datasets and disciplines; it helps researchers have a better understanding of the broader implications of their own models. It is also useful to petrologists interested in testing the geophysical and geodynamic implications of their petrological and geochemical models.

LitMod provides a platform wherein chemistry and physics are married such that models of lithosphere and sub-lithospheric mantle must be consistent with petrology, heat flow, topography, gravity, geoid, and seismic and electromagnetic observations. Too often we see models of the Earth, derived from a single dataset, that are incompatible with other observations. Some are better, some are worse. To have a model that explains all observations does not imply that the model is correct, but it does minimise the chances of being wrong! Plate tectonics and science in general use this concept to advance our knowledge of the Earth.

An important (if not the most important!) factor to mention here is that, as with any other project of this magnitude, LitMod would not be possible without the contribution of many scientists who unselfishly helped me to put things together. I’d like to thank Javier Fullea, James Connolly, Nick Rawlinson, Yingjie Yang, Alan Jones, Bill Griffin, Sue O’Reilly and Manel Fernandez for all their help and crucial input to the “LitMod philosophy”.

Sussing out an outcrop. (Credit: Juan Carlos Afonso)

And importantly, how does it work?

The main idea is actually quite simple:  a valid physicochemical model of the Earth has to explain all available data in a consistent manner. In essence, this is one of the main steps of the scientific method, right? The LitMod approach is simply a way of constructing Earth models (either by forward or inverse modelling) that satisfy basic physical principles and observations. In a nutshell, LitMod says “you cannot try to fit an observation by changing one parameter of your model without having to change all other parameters in a physically and thermodynamically consistent way, which in turn will affect the prediction of all the other observations”. This is a nice idea, and it should provide robust results as long as what one thinks is consistent, is actually correct. At this stage, we are confident with most of our choices, but there still is much work to do to get a complete understanding of how to model all available datasets simultaneously and how much we can believe our results.
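A toy sketch can show what this coupling buys you. The linearised relations and constants below are invented for illustration (LitMod uses full thermodynamic calculations, not these toy formulas): the point is simply that every observable is computed from the same state variables, so adjusting the state to fit one observable automatically changes the prediction of the others.

```python
# Toy "internally consistent" forward model: all observables derive from one
# shared state (temperature here), so they cannot be tuned independently.
RHO_REF = 3300.0   # reference mantle density (kg/m^3), assumed
VS_REF = 4500.0    # reference shear-wave speed (m/s), assumed
ALPHA = 3.0e-5     # thermal expansivity (1/K), assumed
DVS_DT = -0.4      # sensitivity of Vs to temperature (m/s per K), assumed

def predict(temperature_anomaly_K: float) -> dict:
    """Predict all observables from one state variable at once."""
    return {
        "density": RHO_REF * (1.0 - ALPHA * temperature_anomaly_K),
        "vs": VS_REF + DVS_DT * temperature_anomaly_K,
    }

cold, hot = predict(-100.0), predict(+100.0)
# Warming the model to lower its density also lowers the predicted wave speed:
print(hot["density"] < cold["density"] and hot["vs"] < cold["vs"])  # True
```

In a single-dataset workflow, density and wave speed would be free to drift apart; here a change made to fit gravity data propagates into the seismic prediction whether you like it or not, which is exactly the "you can't cheat" property described above.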

The problem lies in the details, of course, because it is not easy to explain all data consistently when our understanding of each individual dataset is incomplete to different degrees. Moreover, the resolution and sensitivities of different datasets are markedly different too. This problem has a potential solution though. We just need to study the individual problems more carefully (e.g. more laboratory experiments, field case studies, etc.) until we obtain an understanding of them that is similar to the others. In practice this is not straightforward, and many gaps still exist in the description of some problems. A current example, but not the only one, is the discrepancy between results obtained by the magnetotelluric and seismic methods. But even in this case, an integrated modelling approach helps us to isolate the root causes of these discrepancies and to propose new studies to remediate them; something that could not be done by analysing the data separately.

And don’t forget the computational problems, which I find particularly fascinating and frustrating at the same time. Surprisingly, there is not much written about formal joint inversions of multiple datasets; we are learning as we go, but that is what keeps it entertaining!

Lastly, what are your research plans for the future?

I cannot know for sure what I’ll be doing in 10 years (probably geochemistry!), but I can tell you what I’m going to be doing in the next 5-6. Besides continuing to work on regional scale inversions with LitMod, I am currently starting to work on two fronts that may appear disconnected at first glance, but are actually intimately related. The first front is the construction of whole-Earth thermo-chemical-mechanical models, similar to what we are doing with LitMod, but at planetary scale. The other is modelling multiphase reactive flow in the Earth’s mantle with some new numerical techniques. In the end, 5-6 years from now, I think these two fronts will coalesce into a single thick wall… but no one knows whether the wall will stand solid or collapse like a castle of cards… we have to try though!

Want to know more about LitMod? Check out these resources:

Afonso, J. C., Fullea, J., Griffin, W. L., Yang, Y., Jones, A. G., Connolly, J. A. D., O’Reilly, S. Y.: 3D multi-observable probabilistic inversion for the compositional and thermal structure of the lithosphere and upper mantle. I: a priori petrological information and geophysical observables, J. Geophys. Res., 118, 2586–2617, 2013.

Afonso, J. C., Fullea, J., Yang, Y., Connolly, J. A. D., Jones, A. G.: 3D multi-observable probabilistic inversion for the compositional and thermal structure of the lithosphere and upper mantle. II: General methodology and resolution analysis, J. Geophys. Res., 118, 1650–1676, 2013.

Fullea, J., Afonso, J. C., Connolly, J. A. D., Fernàndez, M., García-Castellanos, D., Zeyen, H.: LitMod3D: an interactive 3D software to model the thermal, compositional, density, rheological, and seismological structure of the lithosphere and sublithospheric mantle, Geochem. Geophys. Geosyst., 10, 2009.

*What is an internally consistent model?

By “internal consistency” I mean that all calculated parameters (e.g. thermal conductivity, bulk modulus, etc.) and observables (e.g. dispersion curves, travel times, etc.) are only and ultimately dependent on temperature, pressure, and composition (the fundamental independent variables), while being linked together by robust and sound (typically nonlinear) physical theories. This guarantees that a local change in properties (like density), which may be required to improve the fitting of a particular observable, will also be reflected in all other observables in a thermodynamically and physically consistent manner. It also implies that no linearity between observables needs to be assumed; each observable responds according to its own governing physical theory (e.g. sound propagation).

If you’d like to suggest a scientist for an interview, please contact Sara Mynott.
Imaggeo on Mondays: Getting a handle on Greenland’s glaciers

The picture below shows several small glaciers surrounding the Greenland ice sheet, in Tassilaq, near Kulusuk, East Greenland. The dark lines are glacial moraines, responsible for the transport of rock material from the mountains towards the sea.

The photographer, Romain Schläppy, highlights that “an important scientific topic consists of placing the recent and ongoing Greenland warming in the broader context of past changes in south Greenland climate, vegetation, sedimentation and ice history”. Indeed, with the recent report produced by the Ice2Sea programme, there is a lot of work being done to investigate glacial mass balance, with one particularly cool model looking at how the edges of the Greenland ice sheet are changing in the greatest detail.

“The power of ice” by Romain Schläppy, distributed by the EGU under a Creative Commons licence.

Most models separate large regions into squares, for surface modelling, or cubes, for something a little more 3D. This makes all the data that goes into a model easier to handle as you simplify the variation in, say, runoff rate, over a large area into a single value for runoff. While this makes information easier to handle, you also lose a lot of resolution, not something you want when big changes are happening on small scales.

This is the case in the Greenland ice sheet. The edges are advancing and retreating year in and year out, as they are influenced by the climate and conditions of the ocean around them, but the centre of the ice sheet remains relatively stable. This means that parameters such as meltwater runoff will be changing lots at the glacier front and relatively little in the middle.

To combat this, climate modellers have produced a new model using triangular blocks rather than square ones, so instead of having many equally large simplifications, you can have large, simple triangles where there’s not much going on and tiny ones to capture all the detail where the excitement is happening!

Reference:

Vaughan, D. G., Aðalgeirsdóttir, G., Agosta, C., et al.: From Ice to High Seas, The ice2sea Consortium, 2013.

Imaggeo is the EGU’s online open access geosciences image repository. All geoscientists (and others) can submit their images to this repository and since it is open access, these photos can be used by scientists for their presentations or publications as well as by the press and public for educational purposes and otherwise. If you submit your images to Imaggeo, you retain full rights of use, since they are licensed and distributed by the EGU under a Creative Commons licence.