
Work-life balance: insights from geodynamicists

Maintaining a good work-life balance is essential for a steady career and a happy life in academia. However, like all good things, it is not easy to achieve. In this new Wit & Wisdom post, Jessica Munch, PhD student at ETH Zürich, explores how to achieve a good work-life balance.

Jessica Munch
Credit: Manon Lauffenburger

Research is a truly amazing occupation, especially in geodynamics (okay, that might be a bit biased…). However, regardless of the position you hold in academia, research is also a job that demands a lot of commitment, an ability to deal with pressure, (very) good organisational skills and an ability to get everything done on time in order to (hopefully) stay in academia (if you want to learn more about the effects of stress and pressure on researchers, there are plenty of articles about this on the web – if you have not experienced them yourself already – yay!). Hence, it seems that this dream job can sometimes turn into a nightmare, which could partly explain why so many people quit academia.

The solution to avoid having to make such an extreme decision as quitting? The legendary work-life balance: how do you reconcile a job that you cannot get out of your mind once you are done with your day (it’s not like you can easily switch off your brain once you leave your office and forget about all the questions you are trying to answer with your research, right?) with your private life, hobbies, and family?

I wanted to figure out what a work-life balance really means. At the very least, I wanted to find a more meaningful answer than Wikipedia’s definition:

Work-life balance is a concept including the proper prioritisation between work (career and ambition) and lifestyle (health, pleasure, leisure, family)

The definition I would give, based on my limited experience, is that a work-life balance is an (often rather fragile) equilibrium between academia and your private life that allows you to stay efficient and motivated about your research without neglecting, or losing the link with, the world outside. All of that while being happy and healthy. Easy, no?

Given my restricted experience with this balance, I wondered how other researchers at different stages of their careers deal with it. I contacted several researchers, and despite their very tight schedules they took the time to reply to my questions (thank you very much!).

A first insight comes from Susanne Buiter, team leader at the Geological Survey of Norway: “I guess the fact that I am writing you in the weekend, on a Saturday evening, says something about my balancing work and private life at the moment!”. Sounds quite tough, but she then gives some explanations. She appreciates that everyone in our field is very dedicated to their research. This often leads to long and rather unconventional working hours for research and teaching duties. This is fine as long as it is voluntary, but it should not become an expectation. Susanne’s take on this is that there should be flexibility from both sides, and that it is absolutely fine for some weeks to be very busy as long as other times can be more relaxed. Considering her research not only as a job but also as a hobby seems to allow for a lot of dedication while keeping her both happy and motivated. She also raises the question of whether anyone is actually able to do all the work they have to do when working only regular hours.

A second opinion on this precious work-life balance comes from an early career researcher, Marie Bocher, a postdoc at ETH Zürich. In our discussion, she first points out that a work-life balance is not necessarily a condition for doing good research: one should not directly link a lack of balance in your life to a burnout. For Marie it is okay to work a lot, as long as her research is meaningful to her and she is efficient and motivated. Sometimes you work really hard, do not have a real balance, have no time for hobbies, etc., but this does not necessarily mean you are going to end up with a burnout. These kinds of moments can actually be enjoyable, because you often notice that you are efficient and making progress, which is quite rewarding. Hence, what you need in research (even more than a balanced life) is a meaning to what you are doing, a reason for going to work every morning. This is what will prevent you from having a burnout and will help you to be a happy researcher. Support and validation from peers can also help.

However, Marie wonders whether work-life balance has always been an issue in academia. She mentions that sometimes people feel pressure to have a balanced life according to someone else’s definition. Sometimes colleagues comment on the fact that you work late, telling you that this is not normal and that you should have a hobby (pressure to have hobbies – quite paradoxical, no?). This might result in you pushing yourself to do activities even though you would prefer, for once, to hang out at home and relax during your weekend. Not everyone needs to have a hyperactive life. Instead, people should just try to live a life they enjoy.

Finally, she raises the point that work-life balance is actually a dynamic equilibrium: it is something that changes depending on your situation. You cannot organise yourself the same way if you are single, if you have a partner, or if you have kids. It is a balance that is hard to find and that evolves with life and responsibilities.

Potentially dangerous/lethal way of working on your work-life balance
Credit: Antoine Grisart

Speaking about kids and family, the third and last (but definitely not least) thoughts on this topic come from Saskia Goes, lecturer/reader at Imperial College London. For her, having a balanced life means having time for other things besides work and occasionally time for herself – a definition she is not sure she could apply to her own life, in which she constantly has to juggle work and family. Saskia explains that it is a continuous challenge to do enough work to keep the department happy and functioning, but also to say no to enough work so that she still has time for her own research, her students and her family. She also points out that she has very little time to do research herself – only a few hours now and then. Her main research activity at the moment actually consists of working with students and postdocs on their papers.

When asked how she reconciles family life with her work, Saskia replies that it is doable, but only with sufficient support in the form of a partner, school care, family or friends. Moreover, she emphasises that you need to accept that you simply cannot keep up with people who work 60 to 80 hours every week and can attend three to four conferences a year. Some types of research do not work with a family, unless you have a partner who can significantly help out for a while. Bringing up the fact that a job in academia often implies a lot of moving (research positions in different countries, etc.), Saskia mentions that so far she has only moved once with her kids. The main challenge then was the lack of support (for instance from friends and family) after moving to the new place.

Finally, when I asked her for tips on how to manage all of this, she suggested making lists to keep track of what needs to be done when, and then dividing and planning the tasks day by day and week by week so that they look manageable. The main challenge lies in balancing the amount of things you take on with the time you have!

Based on these different insights into the work-life balance, a universal definition seems impossible. Instead, this precious balance appears to be quite personal. It depends on your situation in life, on how much of your time you can actually dedicate to your project, and on your ability to manage the tasks you need to do (or to refuse them). Hence, the work-life balance is a very personal concept that everyone has to figure out for themselves. Ultimately, it is just a matter of being happy with what you do.

Rheological Laws: Atoms on the Move

The Geodynamics 101 series serves to showcase the diversity of research topics and methods in the geodynamics community in an understandable manner. We welcome all researchers – PhD students to Professors – to introduce their area of expertise in a lighthearted, entertaining manner and touch upon some of the outstanding questions and problems related to their fields. For our first ‘101’ for 2018, we have an entry by postdoctoral researcher Elvira Mulyukova from Yale University about rheology and deformation occurring on atomic scales … it’s a fun and informative read indeed! Do you want to talk about your research? Contact us!

Elvira Mulyukova, Yale University

Most of us have an intuitive understanding that different materials resist being moved, or deformed, to different degrees. Splashing around in mud is more energy-consuming (and fun, but never mind that) than splashing around in water, and splashing around in a block of concrete is energy-intensive bordering on deadly. What are the physical reasons for these differences?

For Earth materials (rocks), the answer lies in the restless nature of their atoms: the little buggers constantly try to sneak out of their crystal lattice sites and relocate. Some are more successful at it than others, making those materials more easily deformable. A lattice site is really just other atoms surrounding the one that is trying to escape. You see, atoms are like a bunch of introverts: each is trying to escape from its neighbours, but doesn’t want to get near them. The ones that do escape have to overcome a temporary discomfort (or an increase in their potential energy, for those physically inclined) of getting close to their neighbours. This requires energy. The closer the neighbours – the more energy it takes to get past them. When you exert force on a material, you force some of the neighbours to be further away from our potential atomic fugitive, making it more likely for the atom to sneak in the direction of those neighbours. The fun part (well, fun for nerds like me) is that it doesn’t happen to just one atom, but to a whole bunch of them, wherever the stress field induced by the applied force is felt. A bunch of atoms escaping in some preferential direction is what we observe as material deformation. The more energy you need to supply to induce the mass migration of atoms – the stronger the material. But it’s really a question of how much energy the atom has to begin with, and how much energy is needed overall to barge through its detested neighbours. For example, when you crank up the temperature, atoms wiggle more energetically and don’t need as much energy supplied from external forcing in order to escape – thus the material gets weaker.

“A lattice site is really just other atoms surrounding the one that is trying to escape. You see, atoms are like a bunch of introverts: each is trying to escape from its neighbours, but doesn’t want to get near them.” cartoon by Elvira Mulyukova

One more thing. Where are the atoms escaping to? Well, there happen to be sanctuaries within the crystal lattice – namely, crystalline defects such as vacancies (aka point defects, where an atom is missing from the otherwise ordered lattice), dislocations (where a whole row of atoms is missing), grain boundaries (where one crystal lattice borders another, which is tilted relative to it) and other crystalline imperfections. These regions are sanctuaries because the lattice is more disordered there, which allows for larger distances in between the neighbours. When occupying a regular lattice site, the atom is sort of trapped by the crystalline order. Think of the lattice as an oppressive regime, and the crystalline defects as liberal countries that are welcoming refugees. I don’t know, is this not the place for political metaphors? *…whistling and looking away…*

Ok, enough anthropomorphisms, let’s get to the physics. If this is the last sentence you’ll read in this blog entry, let it be this: rocks are made up of atoms that are arranged into crystal lattices (i.e., ordered rows and columns of atoms), which are further organized into crystal grains (adjacent crystals tilted relative to each other); applying force to a material encourages atoms to move in a preferential direction of the largest atomic spacing, as determined by the direction of the applied force; the ability of the lattice sites to keep their atoms in place (call it a potential energy barrier) determines how easily a material deforms. Ok, so it was more like three sentences, but now you know why we need to get to the atomic intricacies of the matter to understand a material’s macroscopic behaviour.

Alright, so we’re applying a force (or stress, which is simply force per area) to a material and watching it deform (a zen-inducing activity in its own right). We say that a material behaves like a fluid when its response to the applied stress (and not just any stress, but differential stress) is to acquire a strain rate (i.e., to progressively shorten or elongate in one direction or the other at some rate). On geological time scales, rocks behave like fluids, and their continuous deformation (mass migration of atoms within a crystal lattice) under stress is called creep.

The resistance to deformation is termed viscosity (let’s denote it µ), which basically tells you how much strain rate (ė) you get for a given applied differential stress (τ). Buckle up, here comes the math. For a given dimension (say x, and for the record – I’ll only be dealing with one dimension here to keep the math symbols simple, but bear in mind that µ, ė and τ are all tensors, so you’d normally either have a separate set of equations for each dimension, or some cleverly indexed symbols in a single set of equations), you have:
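In its simplest scalar (Newtonian) form – with the conventional factor of two between deviatoric stress and strain rate, which is also what makes the numbers below work out – the relation reads:

\[
\dot{e}_x = \frac{\tau_x}{2\mu} \qquad\Longleftrightarrow\qquad \mu = \frac{\tau_x}{2\,\dot{e}_x}
\]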

So if I’m holding a chunk of peridotite with a viscosity of 10^20 Pa s (that’s units of Pascal-seconds, and that’s a typical upper mantle viscosity) and squeezing it in the horizontal direction with a stress of 10^8 Pa (a typical tectonic stress), it’ll shorten at a rate of 5·10^-13 s^-1 (a typical tectonic rate). A lower viscosity would give me a higher strain rate, or, equivalently, with a lower viscosity I could obtain the same strain rate by applying a smaller stress. If at this point you’re not thinking “oh, cool, so what determines the viscosity then?”, I failed massively at motivating the subject of this blog entry. So I’m just gonna go ahead and assume that you are thinking that. Right, so what controls the viscosity? We already mentioned temperature (let’s call it T), and this one is a beast of an effect. Viscosity depends on temperature exponentially, which is another way of saying that viscosity depends on temperature helluvalot. To throw more math at you, here is what this dependence looks like:
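In its standard Arrhenius form (with µ_0 a material-specific prefactor, and ignoring everything other than temperature for now), it is:

\[
\mu(T) = \mu_0 \exp\!\left(\frac{E}{RT}\right)
\]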

where R = 8.3144598 J K^-1 mol^-1 (that’s Joules per Kelvin per mole) is the gas constant and E is the activation energy. Activation energy is the amount of energy that an atom needs to have in order to even start thinking about escaping from its lattice site, which of course depends on the potential energy barrier set up by its neighbours. Let’s say your activation energy is E = 530·10^3 J mol^-1. If I raised your temperature from 900 to 1000 K (that’s Kelvin, and those are typical mid-lithospheric temperatures), your viscosity would drop by a factor of ~1000. That’s a three orders of magnitude drop.
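If you would like to check that factor yourself, a few lines of throwaway Python will do it – this is nothing more than the Arrhenius expression above evaluated with the numbers quoted in this post, and the little helper function is mine, not from any library:

```python
import math

R = 8.3144598                  # gas constant, J/(K mol)
T_cold, T_hot = 900.0, 1000.0  # the two temperatures quoted above, in K

def viscosity_drop(E):
    """Factor by which an Arrhenius viscosity, mu ~ exp(E/(R*T)),
    drops when the temperature is raised from T_cold to T_hot."""
    return math.exp(E / R * (1.0 / T_cold - 1.0 / T_hot))

print(viscosity_drop(530e3))  # ~1.2e3: roughly three orders of magnitude
# swap in a smaller activation energy to see a milder temperature sensitivity
```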

Like I said, helluvalot. If instead you had a lower activation energy, say E = 300·10^3 J mol^-1, the same temperature experiment would bring your viscosity down by a factor of ~50, which is less dramatic, but still significant. It’s like running through peanut butter versus running through chocolate syrup (running through peanut butter is a little harder… I clearly need to work on my intuition-enhancing examples). Notice, however, that while the temperature dependence is stronger for materials with higher activation energies, it is more energy-consuming to get the creep going in those materials in the first place, since atoms have to overcome higher energy barriers. There’s more to the story. Viscosity also depends on pressure (call it P), which has a say both in the energy barrier the atoms have to overcome in order to escape their neighbours, and in how many lattice defects (called sanctuaries earlier) the atoms have available to escape to. The higher the pressure, the higher the energy barrier and the fewer lattice sanctuaries to resort to, and thus the higher the viscosity. Throwing in the pressure effect, viscosity goes as:
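Keeping the same Arrhenius form and simply adding the pressure term to the exponent, a common way to write it is:

\[
\mu(P,T) = \mu_0 \exp\!\left(\frac{E + PV}{RT}\right)
\]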

The exact dependence of viscosity on pressure is determined by V – the activation volume.

Alright, we’re finally getting to my favourite part – the atoms’ choice of sanctuary sites. If the atomic mass migration happens mainly via point defects, i.e., by atoms hopping from one single lattice vacancy to another, the deformation regime is called diffusion creep. As atoms hop away, vacancies accumulate in regions of compressive stress, and fewer vacancies remain in regions of tensional stress. Such redistribution of vacancies can come about by atoms migrating through the bulk of a crystal (i.e., the interior of a grain, which is really just a crystal that is tilted relative to its surrounding crystals), or atoms migrating along the boundary of a crystal (i.e., a grain boundary). In both cases, the rate at which atoms and vacancies get redistributed depends on grain size (let’s denote it r). The larger the grains – the more distance an atom has to cover to get from the part of the grain that is being compressed to the part that is under tension. More math is due. Here is what the viscosity of a material deforming by diffusion creep looks like:
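Up to the same factor-of-two convention as before, it can be written as:

\[
\mu_{\mathrm{diff}} = \frac{r^{m}}{2B}
\]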

Exponent m depends on whether the atoms are barging through the bulk of a grain (m = 2), or along the grain boundaries (m = 3). What’s that new symbol B in the denominator, you ask? That’s creep compliance (in this case – diffusion creep compliance), and you two have already met, sort of. Creep compliance specifies how a given creep mechanism depends on pressure and temperature:
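For diffusion creep it takes the by-now-familiar Arrhenius form (note the minus sign: a compliance is the inverse of a resistance, so it increases with temperature):

\[
B = B_0 \exp\!\left(-\frac{E_{\mathrm{diff}} + P\,V_{\mathrm{diff}}}{RT}\right)
\]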

For diffusion creep of upper mantle rocks, I typically use m = 3, B_0 ≈ 13 µm^m MPa^-1 s^-1 (which is just a material-specific prefactor), E_diff = 300·10^3 J mol^-1 and V_diff = 5 cm^3 mol^-1 from Karato and Wu (1993). Sometimes I go bananas and set V_diff = 0 cm^3 mol^-1, blatantly ignoring the pressure dependence of viscosity, which is ok as long as I’m looking at relatively modest depth ranges, like a few tens of kilometers.

At sufficiently high stresses, a whole row of atoms can become mobilized and move through the crystal, instead of the meagre one-by-one atomic hopping between the vacancies. This mode of deformation is called dislocation creep. Dislocations are really just a larger scale glitch in the structure of atoms (compared to vacancies). They are linear lattice defects, where a whole row of atoms can be out of order, displaced or missing. It requires more energy to displace a dislocation, because you are displacing more than one atom, but once it’s on the move, it accommodates strain much more efficiently than in the each-atom-for-itself diffusion creep regime. As the material creeps, dislocations get born (nucleated), get displaced and get dead (annihilated). Dislocations don’t care about grain size. What they do care about is stress. Stress determines the rate at which dislocations appear, move and disappear. I know you saw it coming, more math, here is the dislocation creep viscosity:
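With the same conventions as above, it can be written as:

\[
\mu_{\mathrm{disl}} = \frac{1}{2A\,\tau^{\,n-1}}
\]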

Exponent n dictates the stress dependence of viscosity. Stress dependence of dislocation creep viscosity is a real pain, making the whole thing nonlinear and difficult to use in a geodynamical model. Not impossible, but rage-inducingly difficult. Say you’re trying to increase the strain rate by some amount, so you increase the stress, but then the viscosity drops, and suddenly you have a monster of a strain rate you never asked for. Ok, maybe it’s not quite this bad, but it’s not as good as if the viscosity just stayed constant. You wouldn’t be able to have strain localization, form tectonic plate boundaries and develop life on Earth then, but maaaan would you be cracking geodynamic problems like they were peanuts! I’m derailing. Just like all the other creeps, dislocation creep has its own compliance, A, that governs its dependence on pressure and temperature:
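It has the same Arrhenius shape as its diffusion creep sibling:

\[
A = A_0 \exp\!\left(-\frac{E_{\mathrm{disl}} + P\,V_{\mathrm{disl}}}{RT}\right)
\]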

For dislocation creep of upper mantle rocks, I typically use n = 3, A_0 = 1.1·10^5 MPa^-n s^-1 (which is just a material-specific prefactor), E_disl = 530·10^3 J mol^-1 and V_disl = 20 cm^3 mol^-1, similar to Karato and Wu (1993). Just like for diffusion creep, I sometimes just set V_disl = 0 cm^3 mol^-1 to keep things simple.

We’re almost done. Allow me one last remark. A rock has an insane amount of atoms, crystal grains and defects, all subject to local and far-field conditions (stress, temperature, pressure, deformation history, etc.). A typical rock is therefore heterogeneous on the atomic (nano), granular (micro) and outcrop (meter) scales. Thus, within one and the same rock, deformation will likely be accommodated by more than just one mechanism. With that in mind, and sticking to just the two deformation mechanisms described above, we can mix it all together to get:
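A common way of writing the result – a sketch that assumes the two mechanisms operate side by side under the same stress – is as an inverse sum of the two viscosities:

\[
\mu = \left(\frac{1}{\mu_{\mathrm{diff}}} + \frac{1}{\mu_{\mathrm{disl}}}\right)^{-1}
\]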

This is known as composite rheology, where we assumed that the strain rates accommodated by diffusion and dislocation creep can be simply summed up, like so:
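\[
\dot{e} = \dot{e}_{\mathrm{diff}} + \dot{e}_{\mathrm{disl}} = \left(\frac{B}{r^{m}} + A\,\tau^{\,n-1}\right)\tau
\]

And if you would rather let a computer keep track of all the exponents, here is a minimal Python sketch of this composite viscosity using the parameter values quoted above; the temperature, stress and grain size at the bottom are my own, purely illustrative choices, and the pressure terms are switched off (V_diff = V_disl = 0):

```python
import math

R = 8.3144598  # gas constant, J/(K mol)

def compliance(prefactor, E, T):
    """Arrhenius creep compliance, with the pressure term switched off (V = 0)."""
    return prefactor * math.exp(-E / (R * T))

def composite_viscosity(tau_MPa, r_um, T):
    """Composite (diffusion + dislocation creep) viscosity in Pa s.
    Stress in MPa, grain size in micrometres, temperature in K."""
    m, B0, E_diff = 3, 13.0, 300e3    # diffusion creep, B0 in um^m MPa^-1 s^-1
    n, A0, E_disl = 3, 1.1e5, 530e3   # dislocation creep, A0 in MPa^-n s^-1
    B = compliance(B0, E_diff, T)
    A = compliance(A0, E_disl, T)
    strain_rate = (B / r_um**m + A * tau_MPa**(n - 1)) * tau_MPa  # in 1/s
    return tau_MPa * 1e6 / (2.0 * strain_rate)                    # MPa -> Pa

# purely illustrative conditions: 1 MPa stress, 1 mm grains, 1600 K
print(composite_viscosity(tau_MPa=1.0, r_um=1000.0, T=1600.0))  # of order 1e18 Pa s
```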

Alright. If you got down to here, I salute you! Next time you’re squeezing a peridotite, or splashing in the mud, or running through peanut butter – give a shout out to those little atoms that enable you to do such madness. And if you want to get to the physics of it all, you can find some good introductory texts in Karato (2008); Turcotte and Schubert (2002).

References 

S. Karato. Deformation of Earth Materials: An Introduction to the Rheology of Solid Earth. Cambridge University Press, 2008. 

S. Karato and P. Wu. Rheology of the upper mantle: a synthesis. Science, 260(5109):771–778, 1993. 

D.L. Turcotte and G. Schubert. Geodynamics. Cambridge University Press, 2002.

From hot to cold – 7 peculiar planets around the star TRAPPIST-1

Apart from Earth, there are a lot of Peculiar Planets out there! Every 8 weeks, give or take, we look at a planetary body or system worthy of our geodynamic attention. When the discovery of additional Earth-sized planets within the TRAPPIST-1 system was revealed last year, bringing the total to 7 planets, it captured the minds of audiences far and wide. This week, two of the authors from a 2017 Nature Astronomy study on the TRAPPIST-1 planets, Lena Noack from the Department of Earth Sciences at the Free University of Berlin and Kristina Kislyakova from the Department of Astrophysics at the University of Vienna, explain more about this fascinating system. 

Blog authors Lena Noack and Kristina Kislyakova

For Earth scientists, it often seems like a huge endeavour to talk about the geodynamics and other interior processes of the other planets in our Solar System, like Mars or Venus. But what about exoplanets? That is very daring indeed! We have almost no information about the thousands of planets that have been discovered so far in other parts of our galaxy. These planets orbit other stars, some of which are quite similar to our Sun, whereas others behave very differently. But how much do we actually know about planets around these stars?

Exoplanet-hunting missions like Kepler have shown that the majority of exoplanets are actually small-mass planets – not huge gas giants like Jupiter – and are often smaller than Neptune, with some being even smaller than Earth. We have a pretty good idea of what some of these planets could look like: for example, we know their mass and radius, we might even have some spectral information on their atmospheres, we know how much energy they receive from their star, and we might even know something about the star’s composition. This information hints at the composition of the planetary disk from which the planets were made, and at how much radioactive heating they may experience during their later evolution. Putting all these pieces together gives us several clues about how the planets may have evolved over time, and is comparable to the wealth of information we had about our neighbouring planets before the age of space exploration.

However, in contrast to our Solar System, we cannot (at least not with today’s technology) travel to these planets. The only way we can learn more about exoplanets is to combine geophysical, thermodynamical and astrophysical models – derived and tested for Earth and the Solar System – and apply them to exoplanet systems.

Artist’s impression of TRAPPIST-1e. © NASA

One exoplanet system that is quite intriguing is the TRAPPIST-1 system, which has been observed by several different space- and ground-based telescopes, including TRAPPIST (short for TRAnsiting Planets and PlanetesImals Small Telescope, and otherwise also the name of a European monastery-brewed beer) and the Spitzer Space Telescope.

The system contains at least 7 small, densely packed planets around an 8 Gyr old M dwarf. All planets have masses and radii close to Earth’s – from TRAPPIST-1d and -1h, which are both about ¾ the radius of Earth, to TRAPPIST-1g, which is 13% larger than Earth. For comparison, Venus, our sister planet, has a radius 5% smaller than Earth’s, and Mars, our small brother planet, is only about half the size of Earth. And the greatest news: TRAPPIST-1 is actually in our direct neighbourhood, only 39 light years away. That is practically around the corner! For comparison, our closest neighbouring planet outside the Solar System is Proxima Centauri b, at a distance of 4.2 light years. Its host star, Proxima Centauri, belongs to a system of three stars, the best known of which is Alpha Centauri, and is itself the closest star outside the Solar System. Some day it may actually be within our reach to travel to both the Centauri system and TRAPPIST-1, so we should learn as much about these planets now as we can.

What makes the TRAPPIST-1 planets truly peculiar are their tight orbits around the star – the closest planet orbits at a distance of 0.011 AU, so only about 1 percent of Earth’s orbit. Even the furthest planet in the system discovered so far – TRAPPIST-1h – has an orbit of only 0.063 AU. In our Solar System, Mercury, the closest-in planet, orbits at a distance of 0.39 AU. Does this mean that the planets are boiling up due to their close orbit? Not necessarily, since TRAPPIST-1 is a very dim red M dwarf, which emits much less light than the Sun in our system. If we were to place Earth around this red M dwarf, it would actually need to orbit at a distance of about 0.022 AU to receive the same incident flux from the star. In fact, if we look at the range of distances from the star where (depending on the atmospheric greenhouse effect) liquid water could theoretically exist at the surface for a somewhat Earth-like atmosphere (that is, one composed of gases such as CO2 and N2), TRAPPIST-1d, -1e, -1f and -1g could potentially host liquid water at the surface and would thus be habitable places where Earth-like life could, in principle, form. Of course, for that to occur several other factors have to be just right as well. This zone, where liquid water at the surface could exist, is called the Habitable Zone or Temperate Zone, and is indicated in green in the illustration of the TRAPPIST-1 system compared to the inner Solar System below.
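For the curious, the Earth-equivalent distance quoted above simply follows from requiring the same incident flux that Earth receives from the Sun. Taking a luminosity for TRAPPIST-1 of roughly 5·10^-4 times the solar luminosity (a value close to published estimates), this gives

\[
d_{\mathrm{equivalent}} = 1\,\mathrm{AU} \times \sqrt{\frac{L_{\mathrm{TRAPPIST\text{-}1}}}{L_{\odot}}} \approx 1\,\mathrm{AU} \times \sqrt{5\times10^{-4}} \approx 0.022\,\mathrm{AU}.
\]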

The TRAPPIST-1 system compared to the inner Solar System, with the Habitable Zone indicated in green. © Caltech/NASA

So, should we already book our trip to TRAPPIST-1? Well, there are other factors that may endanger the possible habitability of these otherwise fascinating planets. First of all, TRAPPIST-1 is really different from the Sun. Although it is much dimmer and redder, it still emits almost the same amount of harsh X-ray and extreme ultraviolet radiation as our Sun, and in addition produces powerful flares. For the TRAPPIST-1 planets, which are so close to their star, this means that their atmospheres are exposed to much higher levels of short-wavelength radiation, which is known to lead to very strong atmospheric escape. A nitrogen-dominated atmosphere, like the one Earth has, would likely not be stable on the TRAPPIST-1 planets in the habitable zone after gigayears of exposure to this short-wavelength radiation, so carbon dioxide, Venus-like atmospheres are more probable. Besides that, the stellar wind of TRAPPIST-1 may be very dense at the planetary orbits, powering strong non-thermal escape from the planetary atmospheres and leading to further erosion of the atmospheres.

Another interesting feature of M dwarfs, especially low-mass ones such as TRAPPIST-1, is their extremely slow evolution. On the one hand, this means very long main-sequence lifetimes for such stars, with stable radiation levels for many gigayears. Could this maybe allow very sophisticated life forms to evolve? On the other hand, when these stars are young, they go through a contraction phase before entering the main sequence, which is much longer than the contraction phase of G dwarfs such as the Sun. During this phase, the stars are much brighter and hotter than later in their history. For the TRAPPIST-1 planets this would mean they have been grilled by high temperatures for about a billion years! Can they still retain some water after such a violent past? Can life form under such conditions? We don’t really know. In any event, it seems that water retention and delivery might be a critical factor for the habitability of the TRAPPIST-1 planets.

Since the planets are so densely packed in the system, the neighbouring planets as well as the star exert a noticeable gravitational pull on each other – just as the Moon causes the high and low tides of Earth’s oceans. Only, the tidal forces acting on the TRAPPIST-1 planets would be much stronger, and could lead to immense amounts of energy being released in the interiors of the planets due to tidal dissipation. Furthermore, the star itself appears to have a strong magnetic field. An electrical current is produced when a conductive material is embedded in a changing magnetic field – an effect used, for example, to melt iron in induction furnaces. Similarly, the mantles of rocky planets are conductive and can experience enhanced energy release deep in the upper mantle due to induction heating.

Both induction heating and tidal heating can have a negative effect on the potential habitability of a rocky planet, since strong heating in the interior can be reflected by equally strong volcanic activity at the surface. This would make for a hellish surface to live on! The interior may even be partly molten, leading to subsurface oceans of magma, which may actually be the case for TRAPPIST-1b and -1c. Even TRAPPIST-1d may be affected by strong volcanic events due to both induction and tidal heating of the interior. TRAPPIST-1f, -1g and -1h might be too cold at the surface to have liquid water, and might rather resemble the water-rich icy moons orbiting Saturn and Jupiter. Hence TRAPPIST-1e, which receives only a little less stellar flux than Earth, may be the most interesting planet to visit in the system.

But what would life look like on such a planet?

The tidal forces described above also lead to a different effect: the planets always face the star with the same side (this is called tidal locking). Therefore, the planets have a day side that always faces the star, and a night side immersed in eternal darkness, which never receives a single ray of light from the star. Such a tidally locked orbit is similar to the Moon-Earth system, as the Moon always shows us the same face – the “near side” of the Moon. The other side, the “far side” of the Moon, is only known to us thanks to lunar space missions. Can you imagine living in a place where it never gets dark? On the other hand, the luminosity of the star is very weak. Life on the TRAPPIST-1 planets might therefore actually look different than on Earth. To harvest the photons needed for photosynthesis (if this process were also to evolve on these planets), life might evolve to favour a large variety of pigments that would enable it to make use of the full range of visible and infrared light – in other words, plants on these planets would appear black to us.

The TRAPPIST-1 planets certainly still harbour many mysteries. They are a very good example of how diverse the planets in the universe can be. If we set our imagination free… Black trees under a red star in a sky that never sees a sunrise or sunset, and powerful volcanoes filling the air with ash and shaking the ground.
Very different from our Earth, isn’t it?

Further reading:

Kislyakova, K., Noack, L. et al. Magma oceans and enhanced volcanism on TRAPPIST-1 planets due to induction heating. Nature Astronomy 1, 878-885 (2017).

Gillon, M. et al. Seven temperate terrestrial planets around the nearby ultracool dwarf star TRAPPIST-1. Nature 542, 456-460 (2017).

Kiang, N. et al. The Color of Plants on Other Worlds. Scientific American, April 2008, 48-55 (2008).

Barr, A.C. et al. Interior Structures and Tidal Heating in the TRAPPIST-1 Planets. Astronomy and Astrophysics, in press.


Luger, R. et al. A seven-planet resonant chain in TRAPPIST-1. Nature Astronomy 1, 0129 (2017).

Scalo, J. et al. M stars as targets for terrestrial exoplanet searches and biosignature detection. Astrobiology 7(1), 85-166 (2007).

Ramirez, R.M. and Kaltenegger, L. The habitable zones of pre-main-sequence stars. The Astrophysical Journal Letters 797(2), L25 (2014).

Happy new year!

It’s 2018! Another year to finally publish that paper, finish your PhD, find a new job, finish that project, and be happy! The EGU Geodynamics Blog Team looks forward to continuing to brighten your Wednesday mornings with the most interesting and funny blog posts. In this first post of the year, we wish you all, of course, a happy new year!

Iris van Zelst

I wish everyone a very happy, productive, writing-guilt-free 2018 with lots of publications, funding, success, and happiness!

Anne Glerum

Wishing everybody a happy, inspiring and fruitful 2018! Time to start with a clean slate and write another adventurous chapter of life!

Luca Dal Zilio

Run, run, run
It’s time to have
fun, fun, fun 🙂
Sprint to the tree,
it’s the season to be jolly!
Happy Holidays is what you’ve won!

Grace Shephard

Greetings from EGU’s Geodynamics Blog team
We’ve enjoyed our first year and look forward to twenty eighteen
We’ll report on Earth’s secrets from the frontlines
And wish you fruitful collaborations, realistic expectations, and manageable deadlines.