Minds over Methods is the second category of our T&S blog, created to give you more insight into the various research methods used in tectonics and structural geology. As a numerical modeller you might sometimes wonder how analogue modellers scale their models to nature, or maybe you would like to know more about how people use the Earth’s magnetic field to study tectonic processes. For each blog post we invite an early career scientist to share the advantages and challenges of their method with us. In this way we can learn about methods we are not familiar with, discover which topics can be studied with them, and maybe even get inspired to use a multi-disciplinary approach! This first edition of Minds over Methods deals with Numerical Modelling and is written by Anouk Beniest, PhD student at IFP Energies Nouvelles (Paris).
Approaching the non-measurable
Anouk Beniest, PhD student at IFP Energies Nouvelles, Paris
‘So, what is it that you’re investigating?’ It’s a question every scientist receives from time to time. In the geosciences, the art of answering it is to explain rather abstract projects in plain words to the interested layman. Try this, for example: “A long time ago, the South American and African plates were stuck together, forming a massive continent called Pangea, for many millions of years. Due to all sorts of forces, the two plates started to break apart and became separated. During this separation, hot material from deep down in the Earth rose to the surface, increasing the temperature at the margins of the two continents. How exactly did this temperature change over time, from the separation until the present day? And how did this change affect the basins along the continental margins?”
These are legitimate questions that are not easy to answer, since we cannot measure temperature at great depth or back in time. In this first post on numerical methods, we will be balancing between geology and geophysics, highlighting the possibilities and limits of numerical modelling.
The migration of temperature through the lithosphere is a process that takes time and depends heavily on the scale you look at. Surface processes that affect the surface temperature can be measured and monitored, yielding interesting results on the present-day state and variations of the temperature field. The influence of mantle convection and radiogenic heat production is already more difficult to identify: these processes take much more time to evolve and might not affect the surface temperature that much. Going back in time to identify a past thermal state of the Earth seems almost impossible. This is where numerical models can be of use, to improve, for example, our understanding of the long-term behaviour of temperature.
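A back-of-the-envelope calculation shows why these timescales differ so much with scale. The characteristic time for a temperature anomaly to diffuse over a distance L is roughly L²/κ, where κ is the thermal diffusivity of rock. The sketch below uses a typical textbook value of κ ≈ 10⁻⁶ m²/s (an illustrative number, not a measurement from any particular study):

```python
# Rough diffusion timescale t ~ L^2 / kappa for a temperature anomaly
# to even out over a distance L. kappa = 1e-6 m^2/s is a typical
# (illustrative) thermal diffusivity for rock.

kappa = 1e-6                                  # thermal diffusivity, m^2/s
seconds_per_myr = 1e6 * 365.25 * 24 * 3600.0  # seconds in a million years

for L_km in (1, 10, 100):
    L = L_km * 1e3                            # distance in metres
    t_myr = L**2 / kappa / seconds_per_myr
    print(f"L = {L_km:3d} km  ->  t ~ {t_myr:.3g} Myr")
```

A 1 km anomaly evens out in tens of thousands of years, but a lithosphere-scale (100 km) anomaly persists for hundreds of millions of years: the thermal imprint of continental break-up is still evolving today, which is exactly why we need models to reconstruct it.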
Temperature is a parameter that affects, and is affected by, a variety of processes. When enough physical principles are combined in a numerical model, we can simulate how the temperature has evolved over time. All kinds of parameters need to be identified and, most importantly, they need to make sense and apply to the observation or process you are trying to reproduce. Some of these parameters can be measured in the lab, like the density or thermal conductivity of different rock types. Others need to be extracted from geophysical or geological observations, or even estimated.
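To make this concrete, here is a minimal sketch of the kind of calculation that sits at the core of many thermal models: 1D heat conduction through a lithospheric column, solved with an explicit finite-difference scheme. This is not the model used in the project described above; the conductivity, density, heat capacity and boundary temperatures are generic textbook values chosen purely for illustration:

```python
# Minimal 1D heat-conduction sketch (illustrative only, not the author's model).
# Solves dT/dt = kappa * d2T/dz2 through a 100 km column with an explicit
# finite-difference scheme. All parameter values are typical textbook numbers.

import numpy as np

k = 3.0          # thermal conductivity, W/(m K) (typical crustal rock)
rho = 2800.0     # density, kg/m^3
cp = 1000.0      # specific heat capacity, J/(kg K)
kappa = k / (rho * cp)        # thermal diffusivity, ~1e-6 m^2/s

L = 100e3                     # column depth, m
nz = 101
dz = L / (nz - 1)
dt = 0.4 * dz**2 / kappa      # time step within the explicit stability limit

# Initial condition: a linear geotherm (0 degC surface, 1300 degC base)
# plus a +200 degC perturbation at depth, crudely mimicking heat added
# during continental break-up.
T = np.linspace(0.0, 1300.0, nz)
T[nz // 2:] += 200.0

years = 365.25 * 24 * 3600.0
nsteps = int(10e6 * years / dt)   # simulate 10 Myr of cooling

for _ in range(nsteps):
    # Update interior nodes with the finite-difference Laplacian.
    T[1:-1] += dt * kappa * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
    T[0], T[-1] = 0.0, 1300.0     # boundary temperatures held fixed

print(f"Temperature at 50 km depth after 10 Myr: {T[nz // 2]:.0f} degC")
```

Even this toy version shows the workflow the text describes: lab-derived parameters (k, rho, cp) go in, assumed boundary and initial conditions encode the geological scenario, and the model returns a thermal history that can then be confronted with observations.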
Once the parameters have been set, the model calculates the thermal evolution. It is not an easy task to decide whether a simulation approaches the ‘real’ history and whether we can answer the questions posed above. We should always realise that thermal model results at best approximate the real world. We can learn about the different ways temperature changes over time, but we should always be on the hunt for measurements and observations that confirm what we have learned from the simulations.