GeoLog

Geosciences Column: Extreme snowfall potentially worsened Nepal’s 2015 earthquake-triggered avalanche

Three years ago, an earthquake-induced avalanche and rockfalls buried an entire Nepalese village in ice, stone, and snow. Researchers now think the region’s heavy snowfall from the preceding winter may have intensified the avalanche’s disastrous effect.

The Langtang village, just 70 kilometres from Nepal’s capital Kathmandu, is nestled within a valley under the shadow of the Himalayas. The town was popular amongst trekking tourists, as the surrounding mountains offer breathtaking hiking opportunities.

But in April 2015, a 7.8-magnitude earthquake, also known as the Gorkha earthquake, triggered a massive avalanche and landslides, engulfing the village in debris.

Scientists estimate that the force of the avalanche was half as powerful as the Hiroshima atomic bomb. The blast of air generated by the avalanche rushed through the site at more than 300 kilometres per hour, blowing down buildings and uprooting forests.

By the time the debris and wind had settled, only one village structure was left standing. The disaster claimed the lives of 350 people, with more than 100 bodies never located.

Before-and-after photographs of Nepal’s Langtang Valley showing the near-complete destruction of Langtang village. Photos from 2012 (pre-quake) and 2015 (post-quake) by David Breashears/GlacierWorks. Distributed via NASA Goddard on Flickr.

Since then, scientists have been trying to reconstruct the disaster’s timeline and determine what factors contributed to the village’s tragic demise.

Recently, researchers discovered that the region’s unusually heavy winter snowfall could have amplified the avalanche’s devastation. The research team, made up of scientists from Japan, Nepal, the Netherlands, Canada and the US, published their findings last year in the EGU’s open access journal Natural Hazards and Earth System Sciences.

To reach their conclusions, the team drew from various observational sources. For example, the researchers created three-dimensional models and orthomosaic maps, showing the region both before it was hit by the coseismic events and afterwards. The models and maps were pieced together using data collected before the earthquake and aerial images of the affected area taken by helicopter and drones in the months following the avalanche.

They also interviewed 20 villagers local to the Langtang valley, questioning each person on where he or she was during the earthquake and how much time had passed between the earthquake and the first avalanche event. In addition, the researchers asked the village residents to describe the ice, snow and rock that blanketed Langtang, including details on the colour, wetness, and surface condition of the debris.  

Based on their own visual ice cliff observations by the Langtang river and the villager interviews, the scientists believe that the earthquake-triggered avalanche hit Langtang first, followed then by multiple rockfalls, which were possibly triggered by the earthquake’s aftershocks.

A three-dimensional view of the Langtang mountain and village surveyed in this study. Image: K. Fujita et al.

According to the researchers’ models, the primary avalanche event unleashed 6,810,000 cubic metres of ice and snow onto the village and the surrounding area, a frozen flood about two and a half times greater in volume than the Egyptian Great Pyramid of Giza. The following rockfalls then contributed 840,000 cubic metres of debris.  
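
As a rough sanity check on that comparison (not part of the study itself), the reported avalanche volume can be set against the commonly quoted dimensions of the Great Pyramid of Giza, which give a volume of roughly 2.6 million cubic metres; the figures below are purely illustrative.

    # Rough sanity check of the pyramid comparison (illustrative figures only).
    avalanche_ice_snow_m3 = 6_810_000               # avalanche volume reported in the study
    great_pyramid_m3 = (1 / 3) * 230.4**2 * 146.6   # square pyramid, commonly quoted base and height

    print(f"Great Pyramid volume ~ {great_pyramid_m3:,.0f} m^3")
    print(f"Avalanche / pyramid ratio ~ {avalanche_ice_snow_m3 / great_pyramid_m3:.1f}")
    # Gives a ratio of roughly 2.6, i.e. about two and a half pyramids.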

The researchers discovered that the avalanche was made up mostly of snow, and that an unusually large amount of it was involved. They estimated that the average snow depth at the avalanche’s mountainous source was about 1.82 metres, similar to the snow depth found on a neighbouring glacier (1.28-1.52 metres).

A deeper analysis of the area’s long-term meteorological data revealed that the winter snowfall preceding the avalanche was an extreme event, likely to occur only once every 100 to 500 years. This unusually large amount of snow accumulated from four major snowfall events in mid-October, mid-December, early January and early March.

From these lines of evidence, the team concluded that the region’s anomalous snowfall may have worsened the earthquake’s destructive impact on the village.

The researchers believe their results could help improve future avalanche dynamics models. According to the study, they also plan to provide the Langtang community with an avalanche hazard map based on their research findings.

Further reading

Qiu, J. When mountains collapse… GeoLog (2016).

Roberts Artal, L. Geosciences Column: An international effort to understand the hazard risk posed by Nepal’s 2015 Gorkha earthquake. GeoLog (2016).

Geosciences Column: The science behind snow farming

For roughly the last decade, some ski resorts and other winter sport facilities have been using a pretty unusual method to ensure white slopes in winter. It’s called snow farming. The practice involves collecting natural or artificially made snow towards the end of winter, then storing the frozen mass in bulk over the summer under a thick layer of sawdust, woodchips, mulch, or other insulating material.

Many winter sport destinations have adopted the practice. In preparation for the 2014 Winter Olympics, Sochi, Russia stockpiled about 800,000 cubic metres of human-made snow during the warmer season, enough snow to fill 320 Olympic-size swimming pools.
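
That swimming-pool comparison is easy to check against a nominal Olympic-size pool of 50 m × 25 m with a 2 m minimum depth (2,500 cubic metres); the dimensions below are those nominal values, not figures from Sochi.

    # Nominal Olympic-size pool: 50 m x 25 m x 2 m minimum depth.
    pool_volume_m3 = 50 * 25 * 2        # 2,500 m^3
    stockpiled_snow_m3 = 800_000        # snow reportedly stockpiled for Sochi 2014
    print(stockpiled_snow_m3 / pool_volume_m3)   # -> 320.0 pools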

Despite the growing trend, there is still little research on snow farming techniques. Recently, a team of scientists from the WSL Institute for Snow and Avalanche Research (SLF) and the CRYOS Laboratory at the École Polytechnique Fédérale de Lausanne in Switzerland examined the success of snow conservation practices and used models to estimate what factors influence covered snow. Their findings were published in the EGU’s open access journal The Cryosphere.

Why store snow for the winter?

The ski industry has been storing snow for many reasons. The practice is a way for winter sports facilities to accommodate training athletes, start ski seasons earlier, and guarantee snow for major sports events. Snow farming can also be seen as a way to adapt to Earth’s changing climate, according to the authors of the study. Indeed, research published last year in The Cryosphere found that the Alps may lose as much as 70 percent of their snow cover by the end of the century if global warming continues unchecked. Snow loss to this degree could severely threaten the $70 billion (57 billion EUR) industry and the alpine communities that depend on ski tourism.

For some ski resorts, the effects of climate change are already visible. For example, in Davos, Switzerland, a popular venue of the International Ski Federation Cross-Country World Cup, winter temperatures have risen over the last century while snow depth in turn has steadily declined.

Snow heap study

The research team studied two snow heaps: one near Davos, Switzerland (pictured here) and another in South Tyrol. Credit: Grünewald et al.

To better understand snow conservation techniques, the research team studied two artificially made snow heaps: one sitting near Davos and another located in South Tyrol. Each pile contained approximately 7,000 cubic metres of snow, about enough ice and powder to build 13,000 1.8-metre tall snowmen. The piles were also each covered with a 40 cm thick layer of sawdust and chipped wood.
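
One plausible way to arrive at a snowman count of that order is to model each 1.8-metre snowman as three stacked spheres; the proportions below are assumptions chosen for illustration, not values from the study.

    from math import pi

    # Hypothetical snowman: stacked spheres of 0.9 m, 0.6 m and 0.3 m diameter (1.8 m tall in total).
    radii_m = [0.45, 0.30, 0.15]
    snowman_volume_m3 = sum(4 / 3 * pi * r**3 for r in radii_m)   # ~0.51 m^3

    heap_volume_m3 = 7_000
    print(f"Snowmen per heap ~ {heap_volume_m3 / snowman_volume_m3:,.0f}")   # roughly 13,000-14,000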

Throughout the 2015 spring and summer season, the researchers measured changes in snow volume and density and recorded the two sites’ meteorological data, including air temperature, humidity, wind speed and wind direction. The research team also fed this data into SNOWPACK, a model that simulates snow pile evolution and helps determine which environmental processes likely affected the snow.

Cool under heat

From their observations, the researchers found that the sawdust and chipped wood layering conserved more than 75 percent of the Davos snow volume and about two thirds of the snow in South Tyrol. Given the high proportion of remaining snow, the researchers conclude that snow farming appears to be an effective tool for preparing for winter.

According to the SNOWPACK model, while sunlight was the biggest source of snow melt, most of this solar radiation was absorbed by the layer of sawdust and wood chips. The simulations suggest that the covering layer took in the sun’s heat during the day, then released this energy at night, creating a cooling effect on the snow underneath. Moreover, the model found that, when the thick layer was moist, the evaporating water cooled the snow as well. The researchers estimate that only nine percent of the sun’s energy went into melting the snow heaps. Without the insulating layer, the snow would have melted far more rapidly, receiving 12 times as much energy from the sun, according to the study.
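
To get a feel for what those percentages mean, here is a back-of-the-envelope melt estimate using the latent heat of fusion of ice (about 334 kJ per kg); the incident radiation, season length and snow density below are illustrative assumptions, not values from the study.

    # Back-of-the-envelope melt estimate (assumed figures, not from the study).
    mean_shortwave_w_m2 = 200          # assumed mean incoming solar radiation over the season
    season_s = 150 * 86_400            # assumed ~150-day storage season, in seconds
    latent_heat_fusion = 334_000       # J needed to melt one kg of ice or snow
    snow_density = 500                 # kg/m^3, assumed for old, settled snow

    incident_j_m2 = mean_shortwave_w_m2 * season_s
    covered_melt_kg_m2 = 0.09 * incident_j_m2 / latent_heat_fusion   # ~9 % of solar energy melts snow (per the study)
    uncovered_melt_kg_m2 = 12 * covered_melt_kg_m2                   # ~12 times more energy without the cover

    print(f"covered:   ~{covered_melt_kg_m2 / snow_density:.1f} m of snow depth melted")
    print(f"uncovered: ~{uncovered_melt_kg_m2 / snow_density:.1f} m of snow depth melted")

Under these assumed numbers, the covered heap loses on the order of a metre of snow depth over the season, while the same pile left uncovered would lose roughly twelve times as much and disappear entirely.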

Images of the South Tyrol snow heap from (a) 19 May and (b) 28 October. The snow depth (HS) is shown in (c) and (d), and the change in snow height (dHS) is shown in (e). Credit: Grünewald et al.

The researchers found that the thickness of the covering layer was an important factor for snow conservation. When the team modelled potential snow melt under a 20 cm thick cover, the insulating and cooling effects from the layer had greatly diminished.

The simulations also revealed that, while higher air temperatures and wind speed increased snow melt, this effect was not very significant, suggesting that subalpine areas could also benefit from snow farming practices.

In the face of changing climates and disappearing snow, snow farming may be one solution for keeping winters white and skiers happy.

References

Grünewald, T., Wolfsperger, F., and Lehning, M.: Snow farming: conserving snow over the summer season, The Cryosphere, 12, 385-400, https://doi.org/10.5194/tc-12-385-2018, 2018.

Marty, C., Schlögl, S., Bavay, M., and Lehning, M.: How much can we save? Impact of different emission scenarios on future snow cover in the Alps, The Cryosphere, 11, 517-529, https://doi.org/10.5194/tc-11-517-2017, 2017.

Geosciences Column: When could humans last walk, on land, between Asia & America?

Though now submerged under 53 m of ocean water, a land bridge once connected North America with Asia, allowing the passage of species, including early humans, between the two continents. A new study, published in the EGU’s open access journal Climate of the Past, explores when the land bridge was last inundated, cutting off the link between the two landmasses.

The Bering Strait, a narrow passage of water, connects the Arctic Ocean with the Pacific Ocean. Located slightly south of the Arctic Circle, the shallow, navigable, 85 km wide waterway is all that separates the U.S.A and Russia. There is strong evidence to suggest that, not so long ago, it was possible to walk between the two*.

The Paleolithic people of the Americas. Evidence suggests big-animal hunters crossed the Bering Strait from Eurasia into North America over a land and ice bridge (Beringia). Image: The American Indian by Clark Wissler (1917). Distributed via Wikipedia.

In fact, though the subject of a heated, ongoing debate, this route is thought to be one of those taken by some of the very first human colonisers of the Americas, some 16,500 years ago.

Finding out exactly when the Bering Strait last flooded is important, not only because it marks the end of the last period when animals and humans could cross between North America and northeast Asia, but also because an open strait affects the two oceans it connects. It plays a role in how waters move around the Arctic Ocean, as well as how masses of water with different properties (oxygen and/or salt concentrations and temperatures, for example) arrange themselves. The implications are significant: currently, the heat transported to Arctic waters (from the Pacific) via the Bering Strait determines the extent of Arctic sea ice.

As a result, a closed strait has global climatic implications, which adds to the importance of knowing when the strait last flooded.

The new study uses geophysical data which allowed the team of authors to create a 3D image of the Herald Canyon (within the Bering Strait). They combined this map with data acquired from cylindrical sections of sediment drilled from the ocean floor to build a picture of how the environments in the region of the Bering Strait changed towards the end of the last glaciation (at the start of a time known as the Holocene, approximately 11,700 years ago, when the last ‘ice age’ ended).

At depths between 412 and 400 cm in the cores, the sediment shows changes in physical and chemical properties which, the researchers argue, represent the time when Pacific water began to enter the Arctic Ocean via the Bering Strait. Radiocarbon dating puts the age of this transition at approximately 11,000 years ago.
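
For readers unfamiliar with the method, a conventional radiocarbon age follows from the fraction of the original carbon-14 still present in a sample, t = -8033 ln(F), where 8033 years is the Libby mean life; the sketch below uses a purely illustrative fraction, and the resulting age must still be calibrated (for example against the IntCal curves) to obtain calendar ages such as the roughly 11,000-year figure quoted here.

    from math import log

    LIBBY_MEAN_LIFE_YR = 8033   # = 5568-year Libby half-life / ln(2)

    def conventional_radiocarbon_age(fraction_remaining: float) -> float:
        """Conventional radiocarbon age in years BP, before calibration."""
        return -LIBBY_MEAN_LIFE_YR * log(fraction_remaining)

    # Illustrative value only: a sample retaining ~30 % of its original carbon-14.
    print(round(conventional_radiocarbon_age(0.30)))   # ~9,700 (14C yr BP, uncalibrated)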

Above this transition in the core, the scientists identified high concentrations of biogenic silica (which comes from the skeletons of marine organisms such as diatoms – a type of algae – and sponges), a characteristic signature of Pacific waters. Elevated values of the organic carbon isotope ratio (δ13Corg) are further evidence that marine waters were present at that time, as they indicate larger contributions from phytoplankton.

The sediments below the transition consist of sandy clayey silts, which the team interprets as having been deposited close to the shore with an input of terrestrial material. Above the transition, the sediments become olive-grey in colour and are made up exclusively of silt. Combined with the evidence from the chemical data, the team argues, this indicates that these sediments were deposited in an exclusively marine environment, likely influenced by Pacific waters.

Combining geophysical data with information gathered from sediment cores allowed the researchers to establish when the Bering Strait was last flooded. This image is a 3D view of the bathymetry of Herald Canyon and the chirp sonar profiles acquired along crossing transects. Locations of the coring sites are shown by black bars. Figure taken from M. Jakobsson et al. 2017.

The timing of the sudden flooding of the Bering Strait and the submergence of the land bridge which connected North America with northeast Asia coincides with Meltwater Pulse 1B, a period when sea levels were rising rapidly as a result of meltwater flowing into the oceans from the collapse of continental ice sheets at the end of the last glaciation.

The re-establishment of the Pacific-Arctic water connection, say the researchers, would have had a big impact on the circulation of water in the Arctic Ocean, sea ice, ecology and potentially the Earth’s climate during the early Holocene. Now that we are more certain about when the Bering Strait reflooded, scientists can work towards quantifying these impacts in more detail.

By Laura Roberts Artal, EGU Communications Officer

*Author’s note: In fact, during the winter months, when sea ice covers the strait, it is still possible to cross from Russia to the U.S.A (and vice versa) on foot. Eight people have accomplished the feat throughout the 20th Century. Links to some recent attempts can be found at the end of this post.

References and resources:

Jakobsson, M., Pearce, C., Cronin, T. M., Backman, J., Anderson, L. G., Barrientos, N., Björk, G., Coxall, H., de Boer, A., Mayer, L. A., Mörth, C.-M., Nilsson, J., Rattray, J. E., Stranne, C., Semiletov, I., and O’Regan, M.: Post-glacial flooding of the Bering Land Bridge dated to 11 cal ka BP based on new geophysical and sediment records, Clim. Past, 13, 991-1005, https://doi.org/10.5194/cp-13-991-2017, 2017.

Barton, C. M., Clark, G. A., Yesner, D. R., and Pearson, G. A.: The Settlement of the American Continents: A Multidisciplinary Approach to Human Biogeography, The University of Arizona Press, Tucson, 2004.

Goebel, T., Waters, M. R., and O’Rourke, D. H.: The Late Pleistocene Dispersal of Modern Humans in the Americas, Science, 319, 1497–1502, https://doi.org/10.1126/science.1153569, 2008.

Epic explorer crossed frozen sea (BBC): http://news.bbc.co.uk/2/hi/uk_news/england/humber/4872348.stm

Korean team crossed Bering Strait (The Korea Herald): http://www.koreaherald.com/view.php?ud=20120301000341

Enmeshed in the gears of publishing – lessons from working as a young editor

Editors of scientific journals play an important role in the research publication process. They act as the midpoint between authors and reviewers, and set the direction of a given journal. However, for an early career scientist like me (I only defended my PhD in early December 2016), the intricacies of editorial work remained somewhat mysterious. Many academic journals tend to appoint established, more senior scientists to these roles, and while most scientists interact with editors regularly, their role is not commonly taught to more junior researchers. I was fortunate to get the chance to work, short term, as an associate editor at Nature Geoscience during the first four months of this year (2017). During that time, I learned a number of lessons about scientific publishing that I felt could be valuable to the community at large.

What does an editor actually do?

The role of the editor is often hidden from readers; in both paywalled and open-access journals, the notes and thoughts editors make on submitted manuscripts are generally kept private. One of the first things to appreciate is that editors judge whether a manuscript meets a set of editorial thresholds that would make it appropriate for the journal in question, rather than whether the study is correctly designed or the results are robust. I’d argue most editors are looking for a balance between an advance beyond the existing literature and the level of interest a manuscript offers their audience.

At each step of the publication process, from initial submission, through judging referee comments, to making a final decision, the editor is making a judgement whether the manuscript still meets those editorial thresholds.

The vast majority of the papers I got the chance to read were pretty fascinating, but since the journal I was working for is targeted at the whole Earth science community some of these were a bit too esoteric, and as such didn’t fit the thresholds we set to appeal to the journal audience.

I actually found judging papers on the basis of editorial thresholds refreshing – in our capacity as peer reviewers, most scientists are naturally sceptical of the methodology and conclusions in other studies, but as an editor, in most cases I was able to take the authors’ conclusions at face value and leave the critical assessment to referees.

That’s where the important difference lies; even though editors are generally scientists by training, since they are naturally not experts in every field that they receive papers from, it’s paramount to find reviewers who have the appropriate expertise and to ask them the right set of questions. In journals with academic editors, the editors may have more leeway to make critical comments, but impartiality is key.

Much of this may be already clear to many readers, but perhaps less so to more junior scientists. Many of the editorial decisions are somewhat subjective, like gauging the level of interest to a journal audience.

In the context of open access research journals, I think it’s worth asking whether the editorial decisions should also be made openly readable by authors and referees – this might aid potential authors in deciding how to pitch their articles to a given journal. This feeds into my next point – what are journals looking for?

By which metrics do journals judge studies?

The second big thing I picked up is that the amount of work does not always equate to a paper being appropriate for a given journal. Invariably, authors have clearly worked hard, and it’s often really tricky to explain to authors that their study is not a good fit for the journal you’re working for.

Speaking somewhat cynically, journals run for profit are interested in articles that can sell more copies or subscriptions. Since the audiences are primarily scientists, “scientific significance” will be a dominant consideration, but Nature and subsidiary journals also directly compare the mainstream media coverage of some of their articles with that of Science – that competition is important to their business.

Many other authors have discussed the relative merits of “prestige” journals (including Nobel prize winners – https://www.theguardian.com/science/2013/dec/09/nobel-winner-boycott-science-journals). All I’ll add here is that what strikes me most is that ‘number of grad student hours worked’ often bears little relation to whether an article will be of broader interest to the mainstream media. The majority of articles don’t attract media attention of course, but I’d also argue that “scientific significance” is not strongly linked to the amount of time that goes into each study.

In the long run, high quality science tends to ensure a strong readership of any journal, but in my experience as an editor the quality of science in submitted manuscripts tends to be universally strong – the scientific method is followed, conclusions are robust, but in some cases they’re just pitched at the wrong audience. I’d argue this is why some meta-analyses have found that, in the majority of cases, articles that are initially rejected are later accepted in journals of similar ‘prestige’ (Weller, 2001; Moore et al., 2017).

As such, it’s imperative that authors tailor their manuscripts to the appropriate audience. Editors from every journal are picking from the same pool of peer reviewers, so the quality of reviews – which ultimately determines the robustness of a study – should be consistent across journals; to meet editorial thresholds, then, prospective authors should think about who is reading the journal.

It’s certainly a fine line to walk – studies that are confirmatory of prior work tend to attract fewer readers, and as such editors may be less inclined to take an interest, but these are nonetheless important for the scientific canon.

In my short time as an editor I certainly didn’t see a way around these problems, but it was eye-opening to see the gears of the publication system – the machine from within, as it were.

Who gets to review?

One of the most time-consuming jobs of an editor is finding referees for manuscripts. It generally takes as long as, if not far longer than, reading the manuscript in detail!

The ideal set of referees should first have the required set of expertise to properly assess the paper in question, and then beyond that be representative of the field at large. Moreover, they need to have no conflict of interest with the authors of the paper. There are an awful lot of scientists working in the world at the moment, but in some sub-fields it can be pretty hard to find individuals who fit all these categories.

For example, some studies in smaller research fields with a large number of senior co-authors often unintentionally rule out vast swathes of their colleagues as referees, simply because they have collaborated extensively.

Ironically, working with everyone in your field leaves no one left to review your work! I have no doubt that the vast majority of scientists would be able to referee a colleague’s work impartially, but striving for truly impartial review should be an aim of any editor.

As mentioned above, finding referees who represent the field is also important. More senior scientists have a greater range of experience, but tend to have less time available to review, while junior researchers can often provide more in-depth reviews of specific aspects. Referees from a range of geographic locations help provide diversity of opinion, as well as a fair balance in terms of gender.

It was certainly informative to compare the diversity of authors with that of the referees they recommended, who as a group tend to be more male-dominated and more US-centric than the authors themselves.

A positive way of looking at this might be that this represents a diversifying Earth science community; recommended referees tend to be more established scientists, so greater author diversity might represent a changing demographic. On the other hand, it’s certainly worth bearing in mind that since reviewing is increasingly becoming a metric by which scientists themselves are judged, recommending referees who are more diverse is a way of encouraging a more varied and open community.

What’s the job like?

Editorial work is definitely rewarding – I certainly felt part of the scientific process, and providing a service to authors and the readership community is the main remit of the job.

I got to read a lot of interesting science from a range of different places, and worked with some highly motivated people. It’s a steep learning curve, and tends to be consistently busy; papers are always coming in, so there’s always a need to keep working.

Perhaps I’m biased, but I’d also suggest that scientists could work as editors at almost any stage in their careers, and it offers a neat place between the world of academia and science communication, which I found fascinating.

By Robert Emberson, freelance science writer

References

Moore, S., Neylon, C., Eve, M. P., O’Donnell, D. P., and Pattinson, D.: “Excellence R Us”: university research and the fetishisation of excellence, Palgrave Communications, 3, 16105, 2017.

Weller, A. C.: Editorial Peer Review: Its Strengths and Weaknesses, Information Today, Medford, NJ, 2001.