GeoLog

Geosciences Column: Flooded by jargon

When hydrologists and members of the general public use simple water-related words, are they actually saying the same thing? Many wouldn’t consider words like flood, river and groundwater to be very technical terms, also known as jargon, yet water scientists and the general public can attach quite different definitions to them. This is what a team of researchers discovered in a recent study, published in EGU’s open access journal Hydrology and Earth System Sciences. In this post, Rolf Hut, an assistant professor at Delft University of Technology in the Netherlands and co-author of the study, blogs about his team’s findings.

On the television a scientist is interviewed, in a room with a massive collection of books:

“Due to climate change, the once in two years flood now reaches up to…”

“Flood?” interrupts my dad. “We haven’t had a flood in fifteen years; how can they talk about a once in two years flood?”

The return period of floods is an often-used example to illustrate how statistically illiterate ‘the general public’ is supposed to be. But maybe we shouldn’t focus on the phrase ‘once in two years’, but rather on the term ‘flood’. Because: does my dad know what that scientist, a colleague of mine, means when she says “flood”?

In water science, the words that experts use are the same words that people use in daily life: words like ‘flood’, ‘dam’ or ‘river’. Because we have been using these words for our entire lives, we may not stop to think that, because of our training as water scientists, we may define them differently than people outside our field do. While writing a review paper about geoscience on television[1] together with experts on science communication, we got into the discussion “what is jargon?”. We quickly found out that within geoscience this is an open question.

Together with a team of Netherlands-based scientists, including part-time journalist and scientist Gemma Venhuizen, professor of science communication Ionica Smeets, assistant professor on soils Cathelijne Stoof and professor of statistics Casper Albers, we decided to look for an answer to this question. We conducted a survey in which we asked people what they thought words like ‘flood’ meant. People could pick from different definitions. Those definitions were not wrong per se, just different: one might be from Wikipedia and another from an EU policy document. We did not want to test whether people were correct, but rather whether water scientists and lay people attach different meanings to the same words. For completeness, we also added picture questions where people had to pick the picture that best matched a certain word.

The results are in. We recently published our findings in the EGU journal Hydrology and Earth System Sciences[2] and will present them at the EGU General Assembly in April 2019 in Vienna. As it turns out, words like ‘groundwater’, ‘discharge’ and even ‘river’ mean quite different things to lay people than to water scientists. For the pictures, however, people tend to agree more. The figure below shows the misfit distribution between lay people and water scientists: the bigger the misfit, the more the definitions differ. The numbers on the right are the Bayes factors: a value bigger than 10 indicates strong evidence that differences between lay people and water scientists are more likely than similarities. The words with an asterisk are the picture questions, showing that when communicating with pictures people are more likely to share the same definition.

Graph showing the posterior distribution of the misfit between laypeople and experts, with a Bayes factor (BF) for every term used in the survey. Pictorial questions are marked with an asterisk. A BF < 1/10 is strong evidence towards H0: it is more likely that laypeople answer questions the same as experts than differently. A BF > 10 is strong evidence towards H1: differences are more likely than similarities. In addition to a Bayes factor for the significance of the difference, we also calculated the misfit, i.e. the strength of the difference, as a DIF (differential item functioning) score, in which DIF = 0 means a perfect match and DIF = 1 the maximum difference. (Figure from https://doi.org/10.5194/hess-23-393-2019)
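
For readers unfamiliar with the statistic: in general terms (this is the standard definition, not specific to this study), the Bayes factor is the ratio of how likely the observed survey answers are under each of the two hypotheses,

BF = P(data | H1) / P(data | H0),

so a BF above 10 means the answers are more than ten times as likely if laypeople and experts truly differ (H1) than if they agree (H0).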

Maybe that scientist talking about floods on the television should have been filmed at a flood site, not in front of a pile of books.

Finally, the term ‘flood’ proved to be one of the words that we do tend to agree on, so maybe dad should take that class in basic statistics after all…

By dr. ir. Rolf Hut, researcher at Delft University of Technology, the Netherlands

[This article is cross-posted on Rolf Hut’s personal site]

References

[1] Hut, R., Land-Zandstra, A. M., Smeets, I., and Stoof, C. R.: Geoscience on television: a review of science communication literature in the context of geosciences, Hydrol. Earth Syst. Sci., 20, 2507-2518, https://doi.org/10.5194/hess-20-2507-2016, 2016.

[2] Venhuizen, G. J., Hut, R., Albers, C., Stoof, C. R., and Smeets, I.: Flooded by jargon: how the interpretation of water-related terms differs between hydrology experts and the general audience, Hydrol. Earth Syst. Sci., 23, 393-403, https://doi.org/10.5194/hess-23-393-2019, 2019.

Geosciences Column: Scientists pinpoint where seawater could be leaking into Antarctic ice shelves

Over the last few decades, Antarctic ice shelves have been disintegrating at a rapid rate, likely due to warming atmospheric and ocean temperatures, according to scientists. New research reveals that one type of threat to ice shelf stability might be more widespread than previously thought.

A study recently published in EGU’s open access journal The Cryosphere identified several regions in Antarctica where liquid seawater could be leaking into vulnerable layers of an ice shelf.

Scientists have known for some time now that seawater can seep into an ice shelf’s firn layer, the region of compacted snow that is on its way to becoming ice. This seawater infiltration poses a threat to the ice shelf’s stability: as the seawater spreads throughout the firn layer, the water can create fractures and help expand crevasses already present in the frozen material. Past research has shown that the presence of liquid brine from seawater within an ice shelf is correlated with increased fracturing and calving.

While ice shelf collapse doesn’t directly contribute to sea level rise, since the ice is already floating, stable ice shelves often push back on land-based ice sheets and glaciers, slowing down ice flow into the ocean. Past research has suggested that once an ice shelf collapses, the rate of ice flow from unsupported glaciers can greatly accelerate.

To better understand Antarctic ice shelves’ risk of collapse, the researchers involved with this new study sought to identify where ice shelf firn layers are vulnerable to seawater infiltration. Using Antarctic geometry data, they mapped out the potential ‘brine zones’ within the continent’s ice shelves. These are regions of the ice shelf where the firn layer is both below sea level and permeable enough to let seawater percolate through.
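
To illustrate the idea with a toy sketch in R (invented numbers and variable names, not the study’s actual data or code): a grid cell belongs to the potential brine zone when its firn lies below sea level and its density stays under a permeability threshold, such as the 830 kg m−3 used in the study.

# toy example of the brine-zone criterion described above
firn_elevation <- c(-5, -2, 1, -8, 3)         # m relative to sea level (invented)
firn_density   <- c(700, 850, 760, 800, 790)  # kg m^-3 (invented)
threshold      <- 830                         # kg m^-3, one of the study's thresholds

# flag cells that are both below sea level and still permeable:
brine_zone <- firn_elevation < 0 & firn_density < threshold
brine_zone  # TRUE FALSE FALSE TRUE FALSE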

The results of their analysis revealed that almost all ice shelves in Antarctica have spots where seawater can potentially leak through their layers, with about 10-40 percent of the continent’s total ice shelf area possibly at risk of infiltration.

Map of potential brine zone areas around Antarctica. Map shows areas where permeable firn lies below sea level (the brine zone), with the threshold for firn permeability defined as 750 kg m−3 (red), 800 kg m−3 (yellow) and 830 kg m−3 (blue), calculated using Bedmap2 surface elevation. Bar charts show the mean percentage area of selected ice shelves covered by the brine zone. (Credit: S. Cook et al. 2018)

The researchers compared their estimated points to a map of previously confirmed brine zones, observed through ice cores or radar surveys. After reviewing these records, they identified only one record of brine presence that hadn’t been highlighted by their model.

The study found many areas in Antarctica where seawater infiltration could be possible, but has not been previously observed. The findings suggest that this firn layer vulnerability to seawater might be more widespread than previously believed.

The researchers suggest that the most likely new regions where brine from seawater may be present include the Abbot Ice Shelf, Nickerson Ice Shelf, Sulzberger Ice Shelf, Rennick Ice Shelf, and slower-moving areas of Shackleton Ice Shelf. These regions all contain large swathes of permeable firn below sea level while also being subject to relatively warm air temperatures and low flow speeds, the ideal conditions for maintaining liquid brine.

The study points out that there are still many uncertainties in this research, considering the unknowns still present in the data used for mapping and the factors that may influence seawater infiltration. For example, some areas with large predicted brine zones have an unusually thick layer of firn from heavy snowfall. This is the case for Edward VIII Bay in eastern Antarctica. “Our results indicate the total ice shelf area where permeable firn lies below sea level, but this does not necessarily imply that the firn contains brine,” the authors of the study noted in their article.

Given their findings, the researchers involved recommend that this potentially widespread influence on ice shelves should be further examined and assessed by future studies.

By Olivia Trani, EGU Communications Officer

References

Cook, S., Galton-Fenzi, B. K., Ligtenberg, S. R. M. and Coleman, R.: Brief communication: widespread potential for seawater infiltration on Antarctic ice shelves, The Cryosphere, 12(12), 3853–3859, doi:10.5194/tc-12-3853-2018, 2018.

Hoegh-Guldberg, O., D. Jacob, M. Taylor, M. Bindi, S. Brown, I. Camilloni, A. Diedhiou, R. Djalante, K.L. Ebi, F. Engelbrecht, J. Guiot, Y. Hijioka, S. Mehrotra, A. Payne, S.I. Seneviratne, A. Thomas, R. Warren, and G. Zhou, 2018: Impacts of 1.5°C Global Warming on Natural and Human Systems. In: Global Warming of 1.5°C. An IPCC Special Report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty [Masson-Delmotte, V., P. Zhai, H.-O. Pörtner, D. Roberts, J. Skea, P.R. Shukla, A. Pirani, W. Moufouma-Okia, C. Péan, R. Pidcock, S. Connors, J.B.R. Matthews, Y. Chen, X. Zhou, M.I. Gomis, E. Lonnoy, T. Maycock, M. Tignor, and T. Waterfield (eds.)]. In Press

Scambos, T. A.: Glacier acceleration and thinning after ice shelf collapse in the Larsen B embayment, Antarctica, Geophysical Research Letters, 31(18), doi:10.1029/2004gl020670, 2004.

Scambos, T., Fricker, H. A., Liu, C.-C., Bohlander, J., Fastook, J., Sargent, A., Massom, R. and Wu, A.-M.: Ice shelf disintegration by plate bending and hydro-fracture: Satellite observations and model results of the 2008 Wilkins ice shelf break-ups, Earth and Planetary Science Letters, 280(1–4), 51–60, doi:10.1016/j.epsl.2008.12.027, 2009.

State of the Cryosphere: Ice Shelves. National Snow & Ice Data Center

How to increase reproducibility and transparency in your research

Contemporary science faces many challenges in publishing results that are reproducible, due to the increased use of data and digital technologies as well as heightened demands for scholarly communication. These challenges have led to widespread calls from the science community for more research transparency, accessibility, and reproducibility. This article presents current findings and solutions to these problems, including recently released software that makes writing submission-ready manuscripts for journals of Copernicus Publications a lot easier.

While it can be debated whether science really faces a reproducibility crisis, the challenges of computer-based research have sparked numerous articles on new good research practices and their evaluation. The challenges have also driven researchers to develop infrastructure and tools that help scientists effectively write articles, publish data, share code for computations, and communicate their findings in a reproducible way, for example Jupyter, ReproZip and research compendia.

Recent studies showed that the geosciences and geographic information science are not immune to reproducibility issues, just like other domains. Therefore, more and more journals have adopted policies on sharing data and code. However, it is equally important that scientists foster an open research culture and teach researchers how to adopt more transparent and reproducible workflows, for example at skill-building workshops at conferences offered by fellow researchers, such as the EGU short courses, through community-led non-profit organisations such as the Carpentries, in open courses for students and small discussion groups at research labs, or through individual self-learning. In light of the lack of a common definition of reproducibility, Philip Stark, a statistics professor and associate dean of mathematical and physical sciences at the University of California, Berkeley, recently coined the term preproducibility: “An experiment or analysis is preproducible if it has been described in adequate detail for others to undertake it.” The neologism intends to reduce confusion and to embrace a positive attitude towards more openness, honesty, and helpfulness in scholarly communication.

In the spirit of these activities, this article describes a modern workflow made possible by recent software releases. The new features allow the EGU community to write preproducible manuscripts for submission to the large variety of academic journals published by Copernicus Publications. The new workflow might require hard-earned adjustments for some researchers, but it pays off in increased transparency and effectiveness, especially for early career scientists. An open and reproducible workflow enables researchers to build on others’ and their own previous work, and to collaborate better on solving the societal challenges of today.

Reproducible research manuscripts

Open digital notebooks, which interweave data and code and can be exported to different output formats such as PDF, are powerful means to improve the transparency and preproducibility of research. Jupyter Notebook, Stencila and R Markdown let researchers combine the long-form text of a publication and the source code for analysis and visualisation in a single document. Having text and code side by side makes them easier to grasp and ensures consistency, because each rendering of the document executes the whole workflow using the original data. Caching of long-running computations is possible, and researchers working with supercomputing infrastructures or huge datasets may limit the executed code to visualisation purposes, using processed data as input. Authors can transparently expose specific code snippets to readers but also publish the complete source code of the document openly for collaboration and review.
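
As a small illustration (a sketch, not taken from any particular article): a code chunk in an R Markdown document looks like the following, where the cache=TRUE option stores the chunk’s result on disk so that later renderings reuse it instead of recomputing.

```{r slow-simulation, cache=TRUE}
# a long-running computation whose result is cached between renderings
simulated_means <- replicate(1000, mean(rnorm(1e5)))
summary(simulated_means)
```

Setting echo=FALSE in the chunk header would hide the code in the rendered document while keeping its output.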

The popular notebook formats are plain-text based, like Markdown in the case of R Markdown. An R Markdown document can therefore be managed with version control software, i.e. programs for managing multiple versions of, and contributions to, the same documents, even by different people. Version control provides traceability of authorship, a time machine for going back to any previous “working” version, and online collaboration such as on GitLab. This kind of workflow also stops the madness of using file names for versions, yet still lets authors use awesome file names and apply domain-specific guidelines for packaging research.
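
For those who prefer to stay within R, a minimal sketch (assuming the usethis package is installed; running git init on the command line works just as well):

# initialise a Git repository for the current project from within R:
usethis::use_git()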

R Markdown supports different programming languages besides its popular namesake R, and is a sensible solution even if you do not analyse data with scripts or have any code in your scholarly manuscript at all. It is easy to write, lets you manage your bibliography effectively, can be used for websites, books or blogs, and, most importantly, does not fall short when it is time to submit a manuscript to a journal.
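
As a minimal sketch of such a document (the file names and citation key below are made up for illustration), the YAML header declares the metadata and a BibTeX bibliography, and citations are inserted with @-keys:

---
title: "A minimal R Markdown article"
author: "Jane Doe"
bibliography: references.bib
output: pdf_document
---

Citations use keys from the BibTeX file, for example [@hut2016],
and are formatted automatically when the document is rendered.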

The rticles extension package for R provides a number of templates for popular journals and publishers. Since version 0.6 (published Oct 9, 2018) these templates include the Copernicus Publications manuscript preparation guidelines for authors. The Copernicus Publications staff was kind enough to give a test document a quick review, and all seems in order, though of course any problems and questions should be directed to the software’s vibrant community and not the publisher.

The following code snippet and screenshot demonstrate the workflow. Lines starting with # are code comments that explain the steps. The code examples provided here are ready to use and only lack the installation commands for the required packages.

# load required R extension packages:
library("rticles")
library("rmarkdown")

# create a new document using a template:
rmarkdown::draft(file = "MyArticle.Rmd",
                 template = "copernicus_article",
                 package = "rticles", edit = FALSE)

# render the source of the document to the default output format:
rmarkdown::render(input = "MyArticle/MyArticle.Rmd")

The commands created a directory with the Copernicus Publications template’s files, including an R Markdown (.Rmd) file ready to be edited by you (left-hand side of the screenshot), a LaTeX (.tex) file for submission to the publisher, and a .pdf file for inspecting the final results and sharing with your colleagues (right-hand side of the screenshot). You can see how simple it is to format text, insert citations, chemical formulas or equations, and add figures, and how they are rendered into a high-quality output file.

All of these steps may also be completed with user-friendly forms when using RStudio, a popular development and authoring environment available for all operating systems. The left-hand side of the following screenshot shows the form for creating a new document based on a template, and the right-hand side shows the menu for rendering, called “knitting” in R Markdown because code and text are combined into one document like threads in a garment.

And in case you decide at the last minute to submit to a different journal, rticles supports many publishers, so you only have to adjust the template while the whole content stays the same.
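
For example (a sketch; the exact template names depend on the installed rticles version), you can list the bundled templates and draft from another one:

# list the journal templates shipped with the installed rticles version:
dir(system.file("rmarkdown", "templates", package = "rticles"))

# drafting for another publisher only changes the template argument,
# here with a hypothetical PLOS template name:
rmarkdown::draft(file = "MyArticle.Rmd",
                 template = "plos_article",
                 package = "rticles", edit = FALSE)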

Sustainable access to supplemental data

Data published today should be deposited and properly cited in appropriate research data repositories, following the FAIR data principles. Journals require authors to follow these principles, see for example the Copernicus Publications data policy or a recent announcement by Nature. Other publishers used to require, and some still do, that supplemental information (SI), such as dataset files, extra figures, or extensive descriptions of experimental procedures, be stored as part of the article. Usually only the article itself receives a digital object identifier (DOI) for long-term identification and availability. The DOI minted by the publisher is not suitable for direct access to supplemental files, because it points to a landing page about the identified object, and this landing page is designed to be read by humans, not by computers.

The R package suppdata closes this gap. It supports downloading supplemental information using an article’s DOI. This way suppdata enables long-term reproducible data access when data was published as SI in the past, or in exceptional cases today, for example if you write about a reproduction of a published article. In the latest version, available from GitHub (suppdata is on its way to CRAN), the supported publishers include Copernicus Publications. The following example code downloads a data file for the article “Divergence of seafloor elevation and sea level rise in coral reef ecosystems” by Yates et al., published in Biogeosciences in 2017. The code then creates a mostly meaningless plot, shown below.

# load required R extension package:
library("suppdata")

# download a specific supplemental information (SI) file
# for an article using the article's DOI:
csv_file <- suppdata::suppdata(
  x = "10.5194/bg-14-1739-2017",
  si = "Table S1 v2 UFK FOR_PUBLICATION.csv")

# read the data and plot it (toy example!):
my_data <- read.csv(file = csv_file, skip = 3)
plot(x = my_data$NAVD88_G03, y = my_data$RASTERVALU,
     xlab = "Historical elevation (NAVD88 GEOID03))",
     ylab = "LiDAR elevation (NAVD88 GEOID03)",
     main = "A data plot for article 10.5194/bg-14-1739-2017",
     pch = 20, cex = 0.5)

Main takeaways

Authoring submission-ready manuscripts for journals of Copernicus Publications just got a lot easier. Everybody who can write manuscripts with a word processor can quickly learn R Markdown and benefit from a preproducible data science workflow. Digital notebooks not only improve day-to-day research habits, but the same workflow is suitable for authoring high-quality scholarly manuscripts and graphics. The interaction with the publisher is smooth thanks to the LaTeX submission format, but you never have to write any LaTeX. The workflow is based on an established Free and Open Source software stack and embraces the idea of preproducibility and the principles of Open Science. The software is maintained by an active, growing, and welcoming community of researchers and developers with a strong connection to the geospatial sciences. Because of the complete and consistent notebook, you, a colleague, or a student can easily pick up the work at a later time. The road to effective and transparent research begins with a first step – take it!

Acknowledgements

The software updates were contributed by Daniel Nüst from the project Opening Reproducible Research (o2r) at the Institute for Geoinformatics, University of Münster, Germany, but would not have been possible without the support of Copernicus Publications, the software maintainers, most notably Yihui Xie and Will Pearse, and the general awesomeness of the R, R-spatial, Open Science, and Reproducible Research communities. The blog text was greatly improved with feedback by EGU’s Olivia Trani and Copernicus Publications’ Xenia van Edig. Thank you!

By Daniel Nüst, researcher at the Institute for Geoinformatics, University of Münster, Germany

[This article is cross-posted on the Opening Reproducible Research project blog]

Geosciences Column: Using volcanoes to study carbon emissions’ long-term environmental effect

In a world where carbon dioxide levels are rapidly rising, how do you study the long-term effect of carbon emissions?

To answer this question, some scientists have turned to Mammoth Mountain, a volcano in California that has been releasing carbon dioxide for years. Recently, a team of researchers found that this volcanic ecosystem could give clues to how plants respond to elevated levels of carbon dioxide over long periods of time. The scientists suggest that studying carbon-emitting volcanoes could give us a deeper understanding of how climate change will influence terrestrial ecosystems through the decades. The results of their study were published last month in EGU’s open access journal Biogeosciences.

Carbon emissions reached a record high in 2018, as fossil-fuel use contributed roughly 37.1 billion tonnes of carbon dioxide to the atmosphere. Emissions are expected to increase globally if left unabated, and ecologists have been trying to better understand how this trend will impact plant ecology. One popular technique, which involves exposing environments to increased levels of carbon dioxide, has been used since the 1990s to study climate change’s impact.

The method, known as the Free-Air Carbon dioxide Enrichment (FACE) experiment, has offered valuable insight into this matter, but can only give a short-term perspective. As a result, it has been more challenging for scientists to study the long-term impact that emissions have on plant communities and ecosystems, according to the new study.

FACE facilities, such as the Nevada Desert FACE Facility, create 21st-century atmospheric conditions in an otherwise natural environment. Credit: National Nuclear Security Administration / Nevada Site Office via Wikimedia Commons

Carbon-emitting volcanoes, on the other hand, are often well-studied systems and have been known to emit carbon dioxide for decades to even centuries. For example, experts have been collecting data on gas emissions from Mammoth Mountain, a lava dome complex in eastern California, for almost twenty years. The volcano releases carbon dioxide at high concentrations through faults and fissures on the mountainside, subsequently leaving its forest environment exposed to the emissions. In short, the volcanic ecosystem essentially acts like a natural FACE experiment site.

“This is where long-term localized emissions from volcanic [carbon dioxide] can play a game-changing role in how to assess the long-term [carbon dioxide] effect on ecosystems,” wrote the authors in their published study. Research with longer study periods would also allow scientists to assess climate change’s effect on long-term ecosystem dynamics, including plant acclimation and species dominance shifts.

Through this exploratory study, the researchers sought to better understand whether the long-term ecological response to carbon-emitting volcanoes is actually representative of the ecological impact of increased atmospheric carbon dioxide.

Remotely sensed imagery acquired over Mammoth Mountain, showing (a) maps of soil CO2 flux simulated based on accumulation chamber measurements, shown overlaid on an aerial RGB image, (b) above-ground biomass, (c) evapotranspiration, and (d) normalized difference vegetation index (NDVI). Credit: K. Cawse-Nicholson et al.

To do so, the scientists analysed characteristics of the forest ecosystem situated on the Mammoth Mountain volcano. With the help of airborne remote-sensing tools, the team measured several ecological variables, including the forest’s canopy greenness, height and nitrogen concentrations, evapotranspiration, and biomass. Additionally, they examined the carbon dioxide fluxes within actively degassing areas on Mammoth Mountain.

They used all this data to model the structure, composition, and function of the volcano’s forest, as well as how the ecosystem changes when exposed to increased carbon emissions. Their results revealed that the carbon dioxide fluxes from Mammoth Mountain’s soil correlated with many of the ecological variables analysed. Additionally, the researchers discovered that parts of the observed environmental impact of the volcano’s emissions were consistent with outcomes from past FACE experiments.

Given the results, the study suggests that these kinds of volcanic systems could work as natural test environments for long-term climate research. “This methodology can be applied to any site that is exposed to elevated [carbon dioxide],” the researchers wrote. Given that some plant communities have been exposed to volcanic emissions for hundreds of years, this method could help paint a more comprehensive picture of our future environment as Earth’s climate changes.

By Olivia Trani, EGU Communications Officer

References

Cawse-Nicholson, K., Fisher, J. B., Famiglietti, C. A., Braverman, A., Schwandner, F. M., Lewicki, J. L., Townsend, P. A., Schimel, D. S., Pavlick, R., Bormann, K. J., Ferraz, A., Kang, E. L., Ma, P., Bogue, R. R., Youmans, T., and Pieri, D. C.: Ecosystem responses to elevated CO2 using airborne remote sensing at Mammoth Mountain, California, Biogeosciences, 15, 7403-7418, https://doi.org/10.5194/bg-15-7403-2018, 2018.