GeoLog


Malawi High School Teacher’s Workshop on Natural Hazards


In July 2017, Professor Bruce Malamud and Dr Faith Taylor from King’s College London travelled to Mzuzu, Malawi, to work in collaboration with Mr James Kushe of Mzuzu University. Together they delivered a workshop at Mzuzu University on natural hazards for high school teachers, with major funding provided by EGU and additional support from Urban ARK and Mzuzu University. Faith and Bruce explain more about the trip…

Malawi is a small (118,000 km2) landlocked country in south-eastern Africa, often referred to as the ‘warm heart of Africa’ due to its stability, safety, beauty and warm welcome to visitors. Yet behind this warm welcome, life for many in Malawi is hard: the average GDP per capita is US$0.82 per day, there is a high (although improving) prevalence of HIV/AIDS, tuberculosis and malaria, and the country experiences a range of natural disasters including earthquakes, floods, lightning, hail, strong winds and drought.

Although we are biased, we think life is particularly hard for Malawian geography teachers, who have a great responsibility to shape the next generation of big thinkers and problem-solvers in the face of challenges such as (i) large class sizes, (ii) limited opportunity for teachers’ continued professional development and (iii) under-resourcing of schools.

With this in mind, Bruce, James and I applied for EGU funding to run a workshop for teachers on natural hazards, focusing particularly on: (i) collating and developing low-cost teaching demonstrations, (ii) equipping teachers with further information about natural hazards and (iii) learning more about their home city of Mzuzu as a resource for field trips.

Professor Bruce Malamud demonstrating seismic waves using a giant slinky.

In the months running up to the workshop, we prepared a 16 GB USB stick for each teacher, which included >35 teaching demos that we had created and/or reviewed and then trialled, 77 videos selected from the many out there, 11 digital posters, 16 factsheets and 14 PowerPoint lectures from our own teaching. We also started to order a few resources that would be hard to come by in Malawi, such as slinkies for teaching about earthquake waves and Mentos tubes for demonstrating volcanic eruptions (try explaining a suitcase of slinkies to a customs official!).

In Mzuzu, James visited each high school to explain the purpose of our workshop and build local interest, planned a field trip to the Massassa region and started to purchase locally available resources for teaching demonstrations, such as jars and sand for teaching about the angle of repose with regard to landslides.

Upon arrival at the Mzuzu University library, where we held the workshop, we were greeted by 27 high school teachers who had travelled from up to a couple of hours away to spend three days with us. The schools they came from varied in terms of resourcing, teachers’ background and experience, but all the teachers were enthusiastic about the opportunity to learn more (a note to others: teachers were particularly keen on further EGU-funded workshops on other topics!).

Over the three days, we delivered interactive undergraduate-level lectures on a range of natural hazards, so that teachers would better understand the processes behind many of the hazards, interspersed with over two dozen activities and teaching demonstrations that they could take back to the classroom. We also had a half-day microadventure, facilitated by one of the teachers, to a local area that had been affected by flooding and landslides. This was a good reminder that geography starts on the doorstep, and does not require expensive field trips to exotic destinations to help students experience environmental phenomena and solidify their classroom-based learning. There were also opportunities for the teachers to share some of their best practice – and from this, we hope the seed has been sown for teachers to establish their own professional network for sharing ideas and resources.

We have travelled to Malawi multiple times over the past few years as part of our work on the Urban ARK project where we look at multi-hazard risk to infrastructure. From this work, we know how challenging it can be for information and ideas to flow to those experiencing and managing risks. We left Malawi feeling hopeful that through those 27 bright and enthusiastic teachers, we might reach >2000 students, and through those students we might also reach their friends and family to help reduce disaster risk across the Mzuzu region.

In the coming months we will share some of the resources we generated and collated online. There is a clear need for further workshops like this across Malawi, and an appetite for building a network of teachers. It took a lot of planning and partnerships with local academics but we would strongly encourage others to consider running similar workshops for teachers in the warm heart of Africa.

By Faith Taylor and Bruce Malamud, King’s College London

Record-setting forest fires in 2017 – what is to blame?


Forest fires have once again seized the public consciousness in both Europe and North America. Extreme drought and temperatures have created tinderbox conditions in many forests, leading to deadly fires across Europe and record-breaking, highly disruptive fires in the USA and Canada, from where I’m currently writing.

A simple way to understand fire is by thinking about the fire triangle – the three ingredients that need to combine to produce fire: heat, oxygen, and fuel. While the concept of the fire triangle offers simple insights into the cause of a fire, each of the three factors is subject to a number of controls.

The availability of fuel in particular is affected by a whole range of influences, from tree species variations, to the impact of pests, to the prior history of wildfires. In many cases there will be a multitude of reasons for both the rate that a given fire spreads, and the spatial extent to which it burns. The extensive media coverage of this year’s record-breaking fire season provides a useful opportunity to explore some of these factors, and how they might have contributed to the fires that we’re seeing this year.

Multi-tiered fires

Not all forest fires are equal. The forestry service in British Columbia defines six different levels of fire, ranging from the slow burning of peat below the surface of a forest (a ground fire), through the burning of scrub and ground-level debris (a surface fire), to far harder-to-control conflagrations that consume the canopy of the trees (a crown fire).

Each type of fire consumes fuel from different levels of the forest, and a transition from a lower to higher level requires at a minimum that there is enough flammable material in the upper layers of the forest. As such, the history of fire in a given stand of woodland will have a significant effect on the potential for future fires; a prior surface fire could remove the under-brush and limit the future fire risk.

Many older trees in a forest can have fire scars from previous, smaller fires that have not burnt them entirely, demonstrating that not all fires make the transition to crown fires. In the US, the largest 2-3% of wildfires contribute 95% of the total area burned annually, and these are generally the largest crown fires. For these large fires, the conditions must be optimal to reach the canopy, but once they make this transition they can be very difficult to stop.

Although the largest fires require a specific set of conditions, the transition to a large fire can lag a long time behind the initial ignition:

“Once ignited, decaying logs are capable of smouldering for weeks, or even months, waiting the time when prevailing conditions (hot, windy, and dry) are conducive for expansion into a full-blown forest fire,” write Logan & Powell in 2001.

The initial trigger in natural settings for the fire is generally lightning, or in rare cases the intense heat from lava flows. However, a recent study has shown that in the continental US, 84% of fires are started by humans. This can range from discarded cigarettes to prescribed fires that have raged out of control.

So, we now have a simplified understanding of the requirements for a fire, but perhaps the more important question remains: why do they spread?

What fans the flames?

So what are the processes that control the strength and spread of a fire? Certain aspects, like climatic conditions, tend to set a long-term propensity for wildfire, while other short-term effects define the local, immediate cause of a fire.

The long-term role of climate is quite diverse. Flannigan and co-authors, in their 2008 paper, summarise the importance of temperature in setting the conditions for fire:

“First, warmer temperatures will increase evapotranspiration, as the ability for the atmosphere to hold moisture increases rapidly with higher temperatures… (Roulet et al. 1992) and decreasing fuel moisture unless there are significant increases in precipitation. Second, warmer temperatures translate into more lightning activity that generally leads to increased ignitions (Price and Rind 1994). Third, warmer temperatures may lead to a lengthening of the fire season (Westerling et al. 2006).”

The importance of temperature at a global scale is demonstrated by studies suggesting that fire outbreaks have systematically increased since the last glacial maximum 21,000 years ago, when average temperatures were several degrees colder.

So once the climatic conditions have set a long-term likelihood for a fire to break out, as well as the distribution of tree species (and therefore the amount of fuel), short-term disturbances can act to further increase susceptibility. We can divide these more immediate factors into weather, ecological, and human influences.

Unsurprisingly, weather conditions that bring hot, dry weather with the chance of thunderstorms will be highly likely to drive fires. A number of studies have shown that this kind of weather is often linked to persistent high pressure systems in the atmosphere, which push away rainfall, and can last for weeks at a time.

These atmospheric conditions can often be linked to longer-term weather patterns, in particular El Niño, and the effects can be long-lived. For example, El Niño brings warm, dry weather to the north-western part of North America, driving increased fire. In the south-west, El Niño brings wetter weather, but the increased vegetation growth provides a larger amount of fuel that can then burn in subsequent drier conditions.

Healthy pines mix with red, decaying trees afflicted with pine beetle infestation in Jasper National Park, Canada (Aug 2017). Credit: Robert Emberson

The growth of plants is just one way that the forest ecosystem can affect the availability of fuel for fire. Many trees are affected by pests. The classic example is the pine beetle, which can kill so many trees that whole swathes of pine forest are left characteristically red as they decay in the aftermath of an infestation. While some studies have shown that this can reduce the chance of a crown fire (since less fuel is available in the canopy), the dead and decaying trees provide a source of drier fuel at ground level that is a concern in many regions.

Elsewhere, we can see cases where trees encourage fire. Eucalyptus trees contain oil that burns strongly; the fire that this produces is suggested to remove the other tree species competing with the Eucalyptus. A strange method, but no doubt effective! The Eucalyptus, introduced to Portugal by humans in the 18th century, has been linked to many of the deadly fires that occurred there this summer. The lodgepole pine, too, has pinecones that require the heat of fire to open and release their seeds.
As such, fire is a natural part of the ecosystem, and in most areas of the boreal forest fire management is limited, with attempts at suppression only made where human settlement is at risk.

Where humans do step in, their actions can have an important role in setting the overall susceptibility to fire. Creating ‘fire breaks’ by felling or controlled burning of woodland in the path of a fire removes fuel and limits the growth of a fire, and a program of controlled burns is an important part of forest management to limit the potential for future fire by clearing scrub vegetation at the surface.

On the other hand, continual suppression of fires can lead to a build-up of fuel – which can then feed a significantly more dangerous fire in the long term.

Ken Lertzman, Professor of Forest Ecology and Management at Simon Fraser University, told me that in general, control mechanisms are only useful for smaller surface fires; once the fire reaches the canopy, fire suppression is extremely difficult.

“It’s a combination of a statistical and philosophical problem”, he says. Fire control is expensive, so unless it’s financed by profits from felled lumber, cost-benefit analysis is necessary, based in part on the statistical probability of a huge fire breaking out in a given location. Philosophically, though, “managing forest stand structure at the boundary conditions [between surface and canopy fires] may be able to keep small fires from becoming extreme”, and thus perhaps it’s worth trying to use control mechanisms even if giant canopy fires would ignore them, just to avoid that transition to the canopy.

As with most natural systems, the factors discussed above don’t necessarily act independently. Many can amplify other effects. For example, the geographical range of pine beetle outbreaks could increase under a warming climate. At a smaller scale, giant fires can create their own weather patterns, acting to dry out the surrounding forest even before it ignites. So, is the greater incidence of fire this year down to this kind of combination of factors?

What’s happening in 2017?

The sun at 10.15 in the morning in Chilliwack, BC, 3rd August 2017 – more than 50km from the nearest fire. Credit: Megan Reich.

The summer of 2017 has been brutal for wildfire in many locations. Europe has been hit hard, with more fires in a single day in Portugal than on any previous day on record. In British Columbia (BC), the largest single fire on record is currently burning, contributing to the largest total area burned in the historical record. The fires have released so much particulate matter and haze that the sun was obscured in large parts of BC.

According to Professor Lertzman, in British Columbia at least, this is “a fire year different in degree, not in kind”. The same processes are at play as always, but in overdrive. An early spring thaw combined with a long, hot and dry summer created the ideal antecedent conditions for fire.

Linking individual intense ‘fire years’ to long-term climate change is challenging, but it is likely that these kinds of conditions will occur more often in a world influenced by anthropogenic climate change. Years like this one, which are clearly exceptional compared to the long-term trend for fire, might begin to occur more often; every 5-10 years rather than every 30-50, for example.

In the short term, fire damage and suppression are expensive, both financially and in terms of lives affected. Smoke is a health risk across wide areas in the face of such intense fire, but ecological damage can be more difficult for humans to see.

Mature forest offers a different niche for organisms to fill than fresh surfaces stripped bare by fire. While each ecosystem has its place in the natural forest, an increased prevalence of fire reduces the mature forest available for the species that prefer that ecological zone. These can range from recognisable mammal species such as bears, deer and caribou, to less well-known but still important canopy lichen species, says Professor Lertzman.

While this year’s fire season is beginning to ease off, it is clear that the range of factors, both natural and those driven by humans, will continue to play a role in years to come. More build-up of infrastructure in developed countries puts more human settlement at risk, so a clear understanding of how fire interacts with the climate, weather, and forest management strategies will be vital to allow us to live alongside fire in the future with fewer problems.

By Robert Emberson, freelance science writer


Mapping Ancient Oceans


This guest post is by Dr Grace Shephard, a postdoctoral researcher in tectonics and geodynamics at the Centre for Earth Evolution and Dynamics (CEED) at the University of Oslo, Norway. This blog entry describes the latest findings of a study that maps deep remnants of past oceans. Her open access study, in collaboration with colleagues at CEED and the University of Oxford, was published this week in the journal Scientific Reports. This post is modified from a version that first appeared on the CEED Blog.

Quick summary:

There are several ways of imaging the insides of the Earth by using information from earthquake data. When these different images are viewed at the same time, a new type of map allows geoscientists to identify the most robust features. These deep structures are likely the remains of extinct oceans, known as slabs, that were destroyed hundreds of millions of years ago. The maps are computed at different depths inside the Earth and the resulting slabs can be resurrected back to the surface. Along with a freely available paper and website, the analysis yields new insights into the structure and evolution of our planet in deep time and space.

Earth in constant motion

The surface of the Earth is in constant motion, and this is particularly true of the rocks found under the oceans. The crust – the outermost layer of the planet – is continually being formed in the middle of oceans, such as at the Mid-Atlantic Ridge. In other places, older crust is being destroyed, such as where the Pacific Ocean floor is moving under Japan. A third type of locality sees the crust shifted along laterally, such as the San Andreas Fault in California. These three types of locations are often referred to as plate boundaries, and they connect up to divide the Earth’s surface into tectonic plates of different sizes and motions.

The places where plates plunge into the mantle are termed subduction zones (red lines in Figure 1, below). The configuration of these subduction zones has changed throughout geological time. Indeed, much of the ocean seafloor (blue area in Figure 1) that existed when the dinosaurs roamed the Earth has long since been lost into the Earth’s mantle; these sunken remnants are now known as slabs. The mantle is the domain beneath the outer shell of our planet and extends to around 2800 km depth, to the boundary with the core.

The age and fabric of the seafloor contains some of the most important constraints in understanding the past configuration of Earth. However, the constant recycling of oceans means that the Earth’s surface as it is today can only tell us so much about the deep geological past – the innards of our planet hold much of this information, and we need to access, visualize, and disseminate it.

Figure 1. A reconstruction of the Earth’s surface from 200 million years ago to the present day in steps of 10 million years. Red lines show the location of subduction zones, other plate boundaries are in black, and plate velocities are also shown. Continents are reconstructed with the present-day topography for reference. Based on the model of Matthews et al. (2016; Global and Planetary Change). Credit: G Shephard (CEED/UiO) using GPlates and GMT software.

Imaging the insides

Using information from earthquake data, seismologists can produce images of the Earth’s interior via computer models – this technique is called seismic tomography. Similar to a medical X-ray scan that looks for features within the human body, these models image the internal structure of the Earth. Thus, a given seismic tomography model is a snapshot into the present-day structure, which has been shaped by hundreds of millions to billions of years of Earth’s history.

However, there are different types of data that can be used to generate these models and different ways they can be created, each with varying degrees of resolution and sensitivity to the real Earth structure. This variability has led to dozens of tomographic models available in the scientific arena, which all have slightly different snapshots of the Earth. For example, deep under Canada and the USA is a well-known chunk of subducted ocean seafloor (see ‘slab’ label in Figure 2). A vertical slice through the mantle for three different tomography models shows that while overall the models are similar, there are some slight shifts in its location and shape.

Importantly, seismic waves pass through subducted, old, cold oceanic plates more quickly than they do through the surrounding mantle (in the same way that sound travels faster through solids than through air). It follows that these subducted slabs can be ‘imaged’ seismically: usually the slab regions show up as blue in tomography models, as in Figure 2 and in this video by co-author Kasra Hosseini, while the red regions might represent thermally hot features like mantle plumes.

Figure 2. Vertical slices through three different seismic tomography models under North America and the Atlantic Ocean (profile running from A to B). The blue region outlined by the black dashed line is related to the so-called Farallon slab. While it is imaged in all three models, the finer details of the slab geometry and depth differ. Model 1 is S40RTS (Ritsema et al., 2011), 2 is UU-P07 (Amaru, 2007) and 3 is GyPSuM-S (Simmons et al., 2010).

For other geoscientists to utilize this critical information – for example, to work out how continents and oceans moved through time – a spectrum of seismic tomography models needs to be considered. But several limiting questions arise:

Which tomography model(s) should be used?

Are models based on certain data types more likely to pick up a feature?

How many models are sufficient to say that a deep slab can be imaged robustly?

Voting maps of the deep

To address these questions, a novel yet simple approach was taken in the study. Different tomography models were combined to generate counts, or votes, of the agreement between models – a sort of navigational guidebook to the Earth’s interior (Figure 3).

Figure 3. An interactive 360° style image for the vote map at 1000 km depth. The black and red regions highlight the most robust features (high vote count = likely to be a subducted slab of ocean) and the blue regions are the least robust areas (low vote count). Coastlines in black for reference. Image: G Shephard (CEED/UiO) using 360player (https://360player.io/) and GMT software. More depth slices and options can be also imaged at our website.

A high vote count (black-red features in Figure 3) means that a larger number of tomography models agree that there could be a slab at that location. For the study in Scientific Reports the focus was on the oldest and deepest slabs, but the process can be undertaken for shallower and younger slabs, and for other features such as mantle plumes. The maps show the distribution of the most robust slabs at different depths – the challenge now is to try to verify the features and potentially link them to subduction zones at the surface back in time.
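To make the vote-counting idea concrete, here is a minimal sketch in Python of how such a map could be assembled. It assumes each tomography model has already been interpolated onto a common latitude-longitude grid at a single depth, and it simply counts, per grid cell, how many models show a fast (slab-like) shear-velocity anomaly; the grid size, model names and threshold are illustrative placeholders rather than the exact choices made in the study.

```python
import numpy as np

def vote_map(models, threshold=0.0):
    """Count, per grid cell, how many models show a fast (dVs > threshold) anomaly."""
    votes = np.zeros_like(next(iter(models.values())), dtype=int)
    for name, dvs in models.items():
        votes += (dvs > threshold).astype(int)   # one vote per model per fast cell
    return votes

# Example with three synthetic 180x360 grids (1-degree spacing) of dVs anomalies (%)
rng = np.random.default_rng(0)
models = {m: rng.normal(0, 1, size=(180, 360)) for m in ("model_A", "model_B", "model_C")}
votes_1000km = vote_map(models, threshold=0.0)
print(votes_1000km.max())   # up to 3 where all models agree on a fast anomaly
```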

One way to link slabs at depth to subduction zones at the surface is to assume that a subducted portion of ocean sinks vertically in the mantle, and then to apply a sinking rate to connect depth and time. This enables pictures that link the surface and deep Earth, like the cover image, to be made. A sinking rate of, say, 1.2 centimeters per year means that a feature that existed at the surface around 100 million years ago might be found at 1200 km depth.
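As a quick illustration of that depth-to-time conversion – assuming, as above, purely vertical sinking at a constant rate, which is itself a simplification – the arithmetic looks like this:

```python
# Convert a slab depth to an approximate age at the surface, assuming a constant
# vertical sinking rate (a first-order assumption; real sinking rates vary).
def depth_to_age(depth_km, sinking_rate_cm_per_yr=1.2):
    """Return the approximate time (in millions of years) since subduction."""
    rate_km_per_myr = sinking_rate_cm_per_yr * 10.0   # 1 cm/yr = 10 km/Myr
    return depth_km / rate_km_per_myr

print(depth_to_age(1200))   # ~100 Myr at 1.2 cm/yr, as in the example above
```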

Many studies have started to undertake a similar exercise on both regional and global scales. However, because these vote maps are free to access, showcase a lot of different models and can be remade with a sub-selection of them, they serve as an easy resource for the community to continue this task.

Secrets in depth

A bit like dessert-time discussions about the best way to cut a cake, there are many ways of slicing, imaging and analyzing the Earth (Figure 4). Do you slice it horizontally and see features that might correspond to the same age all over the globe? Or slice vertically from the surface to see a spectrum of ages (depths) at a given location? Or perhaps 3-D imaging would be most insightful? Whichever choice is made for the vote maps, many interesting features are displayed.

Figure 4. Vote maps visualized using alternative imaging options on a sphere. Credit: G Shephard (CEED/UiO) using GPlates software

By comparing the changes in vote counts with depth, some intriguing results were found. An apparent increase in the amount of slab material was found around 1000-1400 km depth. This could mean that about 130 million years ago more oceanic basins were lost into the mantle. Or perhaps there is a specific region in the mantle that has “blocked” the slabs from sinking deeper for some period of time (for example, due to an increase in viscosity).

The vote maps and their associated depth-dependent changes have implications across disciplines, linking plate tectonics, mantle dynamics, and mineral physics.

Of course, the vote maps are only as good as the tomography models from which they are built – and by its very definition, a model is just one way of representing the true Earth.

A resource for the community

This study drew on a variety of tomography models provided by different research groups and data repositories, and was facilitated using open-source software (Generic Mapping Tools and GPlates).

An important component of reproducible science and advancing our understanding of Earth is to make datasets and workflows publicly available for further investigations.

An online toolkit to visualize seismic tomography data is being developed by the co-authors and a preliminary vote maps page is already online. Here, vote maps for a sub-selection of tomography models can be generated, including with a choice in colour scales and with overlays of plate reconstruction models. More functionality will soon be available – so watch this space!

By Grace Shephard, a postdoctoral researcher in tectonics and geodynamics at the Centre for Earth Evolution and Dynamics (CEED)

Contact information for more details: Grace Shephard – g.e.shephard@geo.uio.no

References:

Amaru, M. L. Global travel time tomography with 3-D reference models. Geol. Ultraiectina 274, 174 (2007).

Matthews, K. J., Maloney, K. T., Zahirovic, S., Williams, S. E., Seton, M. & Müller, R. D. Global plate boundary evolution and kinematics since the late Paleozoic. Global and Planetary Change 146, doi:10.1016/j.gloplacha.2016.10.002 (2016).

Ritsema, J., Deuss, A., van Heijst, H. J. & Woodhouse, J. H. S40RTS: a degree-40 shear-velocity model for the mantle from new Rayleigh wave dispersion, teleseismic traveltime and normal-mode splitting function measurements. Geophysical Journal International 184, 1223-1236, doi:10.1111/j.1365-246X.2010.04884.x (2011).

Simmons, N. A., Forte, A. M., Boschi, L. & Grand, S. P. GyPSuM: A joint tomographic model of mantle density and seismic wave speeds. Journal of Geophysical Research: Solid Earth 115, doi:10.1029/2010JB007631 (2010).


Is it an earthquake, a nuclear test or a hurricane? How seismometers help us understand the world we live in


Although traditionally used to study earthquakes, like today’s M 8.1 in Mexico, seismometers have now become so sophisticated that they are able to detect the slightest ground movements, whether they come from deep within the bowels of the planet or are triggered by events at the surface. But how, exactly, do earthquake scientists decipher the signals picked up by seismometers across the world? And more importantly, how do they know whether they are caused by an earthquake, a nuclear test or a hurricane?

To find out we asked Neil Wilkins (a PhD student at the University of Bristol) and Stephen Hicks (a seismologist at the University of Southampton) to share some insights with our readers.


Seismometers are highly sensitive and they are able to detect a magnitude 5 earthquake occurring on the other side of the planet. Also, most seismic monitoring stations have sensors located within a couple of meters of the ground surface, so they can be fairly susceptible to vibrations at the surface. Seismologists can “spy” on any noise source, from cows moving in a nearby field to passing trucks and trains.

A nuclear test

On Sunday the 3rd of September, North Korea issued a statement announcing it had successfully tested an underground hydrogen bomb. The blast was confirmed by seismometers across the globe. The U.S. Geological Survey registered a magnitude 6.3 tremor, located at the Punggye-ri underground test site in the northwest of the country. South Korea’s Meteorological Administration’s earthquake and volcano center also detected what is thought to be North Korea’s strongest test to date.

However they occur, explosions produce ground vibrations capable of being detected by seismic sensors. Mining and quarry blasts appear frequently at nearby seismic monitoring stations. In the case of nuclear explosions, the vibrations can be so large that the seismic waves they produce can be picked up all over the world, as in the case of this latest test.

It was realised quite early in the development of nuclear weapons that seismology could be used to detect such tests. In fact, the need to have reliable seismic data for monitoring underground nuclear explosions led in part to the development of the Worldwide Standardized Seismograph Network in the 1960s, the first of its kind.

Today, more than 150 seismic stations are operating as part of the International Monitoring System (IMS) to detect nuclear tests in breach of the Comprehensive Test-Ban Treaty (CTBT), which opened for signatures in 1996. The IMS also incorporates other technologies, including infrasound, hydroacoustics and radionuclide monitoring.

The key to determining whether a seismic signal is from an explosion or an earthquake lies in the nature of the waves that are present. There are three kinds of seismic wave seismologists can detect. The fastest, called Primary (P) waves, cause ground vibrations in the same direction that they travel, similar to sound waves in the air. Secondary (S) waves cause shaking in a perpendicular direction. Both P and S waves travel deep through the Earth and are known collectively as body waves. In contrast, the third type of seismic waves are known as surface waves, because they are trapped close to the surface of the Earth. In an earthquake, it is normally surface waves that cause the most ground shaking.

In an explosion, most of the seismic energy is released outwards as the explosive material rapidly expands. This means that the largest signal in the seismogram comes as P waves. Explosions therefore have a distinctive shape in the seismic data when compared with an earthquake, where we expect S and surface waves to have higher amplitude.
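As a rough illustration of that idea, the sketch below compares peak amplitudes in time windows following hypothetical P and S arrivals on a synthetic record. Real explosion discriminants are more sophisticated (for example, band-passed P/S spectral ratios and comparisons of body-wave and surface-wave magnitudes), so the window length, pick times and synthetic trace here are purely illustrative assumptions.

```python
import numpy as np

def p_s_amplitude_ratio(trace, dt, t_p, t_s, window=5.0):
    """Peak |amplitude| in a window after the P arrival divided by that after the S arrival."""
    def peak(t0):
        i0 = int(round(t0 / dt))
        i1 = int(round((t0 + window) / dt))
        return np.max(np.abs(trace[i0:i1]))
    return peak(t_p) / peak(t_s)

# Synthetic illustration only: explosion-like records tend to give a larger P/S ratio
# than earthquake-like records, all else being equal.
dt = 0.01                                      # 100 samples per second
t = np.arange(0, 60, dt)                       # 60 s of synthetic data
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.1, t.size)             # background noise
burst = np.sin(2 * np.pi * 2.0 * np.arange(0, 2, dt))   # a 2 s, 2 Hz wave packet
trace[1000:1000 + burst.size] += 2.0 * burst   # strong "P"-like burst at 10 s
trace[2500:2500 + burst.size] += 0.5 * burst   # weaker "S"-like burst at 25 s
print(p_s_amplitude_ratio(trace, dt, t_p=10.0, t_s=25.0))   # > 1 for this explosion-like case
```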

Forensic seismologists can therefore make measurements of the seismic data to determine whether there was an explosion. An extra indication that a nuclear test occurred can also be revealed by measuring the depth of the source of the waves, as it would not be possible to place a nuclear device deeper than around 10 km below the surface.

Yet while seismic data can tell us that there has been an explosion, there is nothing that can directly identify that explosion as being nuclear. Instead, the IMS relies on the detection of radioactive gases that can leak from the test site for final confirmation of what kind of bomb was used.

The figure shows (at the bottom) the seismic recording of the latest test in North Korea made at NORSAR’s station in Hedmark, Norway. The five upper traces show recordings at the same station for the five preceding tests, conducted by North Korea in 2006, 2009, 2013 and 2016 (two explosions in 2016). The 2017 test is, as can be seen from this figure, clearly the strongest so far. Credit: NORSAR.

When North Korea conducted a nuclear test in 2013, radioactive xenon was detected 55 days later, but this is not always possible. Any detection of such gases depends on whether or not a leak occurs in the first place, and how the gases are transported in the atmosphere.

Additionally, the seismic data cannot indicate the size of the nuclear device or whether it could be attached to a ballistic missile, as the North Korean government claims.

What seismology can give us is an idea of the size of the explosion by measuring the seismic magnitude. This is not straightforward, and depends on knowledge of exactly how deep the bomb was buried and the nature of the rock lying over the test site. However, by comparing the magnitude of this latest test with those from the previous five tests conducted in North Korea, we can see that this is a much larger explosion.

The Norwegian seismic observatory NORSAR has estimated a blast equivalent to 120 kilotons of TNT, six times larger than the atomic bomb dropped on Nagasaki in 1945, and consistent with the expected yield range of a hydrogen bomb.
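To give a flavour of how a yield estimate like this works, forensic seismologists often invert an empirical scaling of the form mb ≈ a + b·log10(Y). The constants in the sketch below are illustrative placeholders only – real estimates, such as NORSAR’s, use values calibrated to the geology and burial depth of the test site – so this should not be expected to reproduce their 120 kiloton figure.

```python
# Generic magnitude-yield scaling mb = a + b*log10(Y), inverted for the yield Y (kilotons).
# The constants a and b are illustrative; real estimates depend on test-site geology
# and how well the explosion is coupled to the surrounding rock.
def yield_from_mb(mb, a=4.45, b=0.75):
    return 10 ** ((mb - a) / b)

print(f"{yield_from_mb(6.0):.0f} kt")   # order-of-magnitude illustration only
```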

Hurriquakes?

Nuclear tests are not the only hazard keeping our minds busy in the past few weeks. In the Atlantic, Hurricanes Harvey, Irma and Katia have wreaked havoc in the southern USA, Mexico and the Caribbean.

Hurricanes in the Atlantic can occur at any time between June and November. According to hurricane experts, we are at the peak of the season, and it is not uncommon for storms to form in rapid succession during August, September and October.

The National Hurricane Center (NHC) is the de facto regional authority for producing hurricane forecasts and issuing alerts in the Atlantic and eastern Pacific. For their forecasts, meteorologists use a combination of ground-based weather sensors (e.g. wind, pressure, Doppler radar) and satellite data.

As hurricane Irma tore its way across the Atlantic, gaining strength and approaching the Caribbean island of Guadeloupe, local seismometers detected its signature, sending the global press into a frenzy. It may come as a slight surprise to some people that storms and hurricanes also show on seismometers.

However, a seismometer detecting an approaching hurricane is not actually that astonishing. There is no evidence to suggest that hurricanes directly cause earthquakes, so what signals can we detect from a hurricane? Rather than “signals”, seismologists tend to refer to this kind of seismic energy as “noise” as it thwarts our ability to see what we’re normally looking out for – earthquakes.

The seismic noise from a storm doesn’t look like distinct “pings” that we would see with an earthquake. What we see are fairly low-pitched “hums” that gradually get louder in the days and hours preceding the arrival of a storm. As the storm gets closer to the sensor, these hums turn into slightly higher-pitched “rustling”. This seismic energy then wanes as the hurricane drifts away. We saw this effect clearly for Hurricane Irma with recordings from a seismometer on the island of Guadeloupe.

What causes these hums and rustles? If you look at the frequency content of seismic data from any monitoring station around the globe, noise levels light up at frequencies of ~0.2 Hz (5 s period). We call this hum “microseism”. Microseism is persistent, low-level seismic energy unrelated to earthquakes, and it occurs over huge areas of the planet. One of the strongest sources of microseism is ocean waves and swell. During a hurricane, swell increases and ocean waves become more energetic, eventually crashing into coastlines and transferring seismic energy into the ground. This effect is more obvious on islands as they are surrounded by water.
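One way to see that microseism peak is to estimate the power spectral density of a noise recording and look for the bump near 0.2 Hz. The snippet below does this for a synthetic hour of data; the sampling rate, amplitudes and the single 0.2 Hz line standing in for ocean-generated noise are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import welch

fs = 20.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 3600, 1 / fs)   # one hour of synthetic ground motion
rng = np.random.default_rng(2)

# Synthetic record: white background noise plus an ocean-driven "hum" near 0.2 Hz
signal = rng.normal(0, 1, t.size) + 5.0 * np.sin(2 * np.pi * 0.2 * t + rng.uniform(0, 2 * np.pi))

freqs, psd = welch(signal, fs=fs, nperseg=4096)   # power spectral density estimate
print(f"Peak noise at ~{freqs[np.argmax(psd)]:.2f} Hz")   # expect ~0.2 Hz
```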

As the hurricane gets closer to the island, wind speeds dramatically increase and may dwarf the noise level of the longer period microseism. Wind rattles trees, telegraph poles, and the surface itself, transferring seismic energy into the ground and moving the sensitive mass inside the seismometer. This effect causes higher-pitched “rustles” as the centre of the storm approaches. Gusts of wind can also generate pressure changes inside the seismometer installation and within the seismometer itself, generating longer period fluctuations.

During Hurricane Irma, a seismic monitoring station located in the Dutch territory of St. Maarten clearly recorded the approach of the storm, leading to an intense crescendo as the eyewall crossed the area. As the centre of the eye passed over, the seismometer seems to have recorded a slightly lower noise level. This observation could be due to the calmer conditions and lower pressure within the eye. The station went down shortly after, probably from a power outage or loss in telemetry which provides the data in real-time.

Seismometers recording storms is not a new observation. Recently, Hurricane Harvey shook up seismometers located in southern Texas. Even in the UK, the approach of winter storms across the Atlantic causes much higher levels of microseism.

It would be difficult to use seismometer recordings to help forecast a hurricane – the recordings really depend on how close the sensor is to the coast and how exposed the site is to wind. In the event of outside surface wind and pressure sensors being damaged by the storm, protected seismometers below the ground could possibly prove useful in delineating the rough location of the hurricane eye, assuming they maintain power and keep sending real-time data.

At least several seismic monitoring stations in the northern Antilles region were put out of action by the effects of the Hurricane. Given the total devastation on some islands, it is likely that it will take at least several months to bring these stations back online. The Lesser Antilles are a very tectonically active and complex part of Earth; bringing these sensors back into operation will be crucial to earthquake and volcano hazard monitoring in the region.

By Neil Wilkins (PhD student at the University of Bristol) and Stephen Hicks (a seismologist at the University of Southampton)

References and further reading

GeoSciences Column: Can seismic signals help understand landslides and rockfalls?

NORSAR Press Release: Large nuclear test in North Korea on 3 September 2017

The Comprehensive Nuclear-Test-Ban Organization Press Release: CTBTO Executive Secretary Lassina Zerbo on the unusual seismic event detected in the Democratic People’s Republic of Korea

First Harvey, Then Irma and Jose. Why? It’s the Season (The New York Times)

NOAA  National Hurricane Center

IRIS education and outreach series: How does a seismometer work?