GeoLog

Geosciences Column: climate modelling the world of Game of Thrones

Disclaimer: This article contains minor spoilers for Season 8 of “Game of Thrones.” A basic understanding of the world of Game of Thrones is assumed in this post.

The Game of Thrones world of ice and fire is an unpredictable place, both politically and environmentally. While the fate of the Iron Throne is yet to be confirmed, a humble steward has been working diligently to make sense of the planet’s peculiar climate. The results could help scholars assess when future winters will be coming, or how wind patterns might steer eastern attacks on Westeros by invading dragons and ships.

It is known that the realms of Westeros and Essos are subject to long-lasting seasons, many of which extend over several years, but Samwell Tarly, the former heir of House Tarly and current steward of the Night’s Watch, has developed a new theory to explain this long seasonal cycle.

His research suggests that the seasons’ extended lifespans could be due to periodic changes in the planet’s tilt as it orbits around the Sun. The results were published in the Philosophical Transactions of the Royal Society of King’s Landing in the Common Tongue, with translations available in Dothraki and High Valyrian.

Tarly carried out his analysis while on sabbatical at the Citadel in Oldtown, Westeros. In the published article he notes that his study was “inspired by the terrible weather on the way here to Oldtown”.

Uncovering climate observations and models

Tarly first developed his theory after studying observational climate records stored in the Citadel library’s collections. Many of these manuscripts contain useful information on the climate conditions of the Game of Thrones world, including the multiyear length of its seasons.

Seasons occur when regions of a planet receive different levels of sunlight exposure throughout a year. The southern and northern hemispheres experience opposite degrees of sunlight exposure due to the natural tilt of the planet’s axis as it orbits around the Sun. For example, when the southern hemisphere is tilted closer to the Sun it experiences a warmer season; at the same time the northern hemisphere is tilted away from the Sun, so it experiences a colder season.

When a planet keeps a constant tilt as it orbits the Sun, the world experiences four seasons during one year. Tarly proposed that seasons could last several years if the tilt of the planet changes during its orbit: “so that the Earth ‘tumbles’ on its spin axis, a bit like a spinning top”, he explains. If the tilt instead tumbled exactly once per orbit, so that the same hemisphere always faced the Sun, the planet would experience permanent seasons.
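
To picture the difference, here is a toy sketch in R contrasting a fixed tilt with a tilt that tumbles once per orbit. This is our illustration rather than code from Tarly’s paper, and the cosine proxy for hemisphere-averaged insolation is a deliberate simplification:

# Toy model: relative sunlight received by the northern hemisphere
# over one orbit (illustrative only, not the model from the paper).
orbital_angle <- seq(0, 2 * pi, length.out = 365)

# Fixed tilt: the spin axis keeps a constant direction in space, so
# each hemisphere's insolation cycles once per orbit -> four seasons.
fixed_tilt <- cos(orbital_angle)

# Tumbling tilt (once per orbit): the axis turns with the orbit, so
# the same hemisphere always faces the Sun -> a permanent season.
tumbling_tilt <- rep(1, length(orbital_angle))

plot(orbital_angle, fixed_tilt, type = "l",
     xlab = "Position in orbit (radians)",
     ylab = "Relative northern insolation (toy proxy)")
lines(orbital_angle, tumbling_tilt, lty = 2)
legend("bottomleft", legend = c("fixed tilt", "tumbling tilt"),
       lty = c(1, 2))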

Caption: an example of Earth’s orbit in which (a) the angle of tilt of the spinning axis of the Earth stays constant through the year (Credit: Dan Lunt, University of Bristol)

Caption: an example of Earth’s orbit in which (b) the tilt “tumbles” as the planet orbits the Sun, such that the angle of tilt changes and the same hemisphere always faces the Sun, giving a permanent season (Credit: Dan Lunt, University of Bristol)

Tarly put this theory to the test with the help of a climate model that he discovered on a computing machine stored in the Citadel cellars. “Luckily I learned how to code when I was back in Horn Hill avoiding sword practice,” Tarly explains in the text.

By running climate simulations with the proposed parameters of his theory, Tarly found that his model was consistent with much of the observational data present within the Citadel library. The models also estimated many climatic features of the world of Game of Thrones, including the seasonal change in temperature, precipitation and wind direction across Westeros.

In the published article, Tarly notes that his theory doesn’t explain how the planet transitions between summer and winter. He guesses that the tumbling pattern of the planet’s tilt persists for a few years, but then flips at one point so that the hemispheres experience new seasons. “The reasons for this flip are unclear, but may be a passing comet, or just the magic of the Seven (or magic of the red Lord of Light if your name is Melisandre),” Tarly writes.

Caption: The Northern Hemisphere winter (top row (a,b,c)) and summer (bottom row (d,e,f)) modelled climate, in terms of surface temperature (°C; left column (a,d)), precipitation (mm/day; middle column (b,e)), and surface pressure and winds (mbar; right column (c,f)). (Credit: Dan Lunt, University of Bristol)

The world of Game of Thrones compared to ‘real’ Earth

Tarly then compared the climate of the world of the Game of Thrones to that of a fictional planet called the ‘real’ Earth; Gilly, his partner and research associate, had found records of this planet’s climate in the Citadel library. The analysis revealed that in winter, The Wall, the northern border of the Seven Kingdoms, was similar in climate to many areas of the ‘real’ Earth, including parts of Alaska in the US, Canada, western Greenland, Russia, and the Lapland region in Sweden and Finland. “I always suspected that Maester St. Nicholas was a member of the Night’s Watch,” Tarly noted.

Caption: (a) High-resolution (0.5° longitude × 0.5° latitude) mountain height for the whole planet. (b) Model-resolution (3.75° longitude × 2.5° latitude) mountain height for the region of Westeros and western Essos. (Credit: Dan Lunt, University of Bristol)

On the other hand, the models showed that the climate of Casterly Rock, the southern home of House Lannister, was similar to that of the Sahel region in Africa, eastern China, and a small region near Houston, Texas in the US.

Climate sensitivity in a world of ice and fire

Finally, Tarly used the climate models to investigate how climate change might impact the world of Game of Thrones. The simulations were run in response to some “worrying reports from monitoring stations on the island of Lys”; the stations have recently observed increasing concentrations of methane and carbon dioxide in the world’s atmosphere. It is suggested that this spike in greenhouse gas concentrations could be due to the rising dragon population in Essos, deforestation from global shipbuilding, and excessive wildfire.

Tarly found that, by doubling the level of atmospheric carbon dioxide in his models, the world would warm on average by 2.1°C over 100 years. The results showed that the greatest warming would occur in the polar regions, since warming-induced sea ice and snow melt can trigger additional warming as a positive feedback.
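
For scale, the way warming grows with CO2 can be sketched with the textbook logarithmic approximation for CO2 radiative forcing. The 2.1°C per doubling is taken from the article; the logarithmic scaling is a standard real-world approximation, not something attributed to Tarly’s model:

# Warming as a function of the CO2 concentration ratio, assuming
# warming proportional to forcing, dF ~ ln(C/C0), and anchored to
# 2.1 degC per doubling of CO2 (illustrative only).
warming_for_co2 <- function(c_ratio, per_doubling = 2.1) {
  per_doubling * log(c_ratio) / log(2)
}

warming_for_co2(2)    # doubling CO2  -> 2.1 degC
warming_for_co2(1.5)  # +50% CO2      -> about 1.2 degC
warming_for_co2(4)    # two doublings -> 4.2 degC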

By comparing this level of warming to the Pliocene period of the ‘real’ Earth 3 million years ago, Tarly predicted that the sea level of the world of Game of Thrones could rise by 10 metres in the long term. This degree of sea level rise is sufficient to flood several coastal cities, including King’s Landing.

In the paper, Tarly stresses that climate action from all the Kingdoms is needed to prevent even more social instability and unrest from climate change. He suggests that all governing bodies should work on reducing their greenhouse gas emissions and invest in renewable energy, such as windmills.

If he survives the war for Westeros, Tarly expects that improving his climate analysis will keep him busy for years to come.

By Olivia Trani, EGU Communications Officer

This unfunded work was carried out by Dan Lunt, from the University of Bristol School of Geographical Sciences and Cabot Institute, Carrie Lear from Cardiff University and Gavin Foster from the University of Southampton during their spare time, using supercomputers from the Advanced Centre for Research Computing at the University of Bristol. You can learn more about the climate models online here.

Back for the first time: measuring change at Narrabeen–Collaroy Beach

Narrabeen–Collaroy Beach in New South Wales, Australia, just north of Sydney, is home to one of the longest-running shoreline-measurement programmes in the world. With colleagues at the University of New South Wales (UNSW) Sydney, Eli Lazarus, an associate professor in geomorphology at the University of Southampton, UK, has been analysing over 40 years of data from Narrabeen–Collaroy to better understand how shorelines recover from major storm events.

In this blog post, Lazarus shares a glimpse of the programme’s history and describes his experience of visiting a field site that for him is both familiar and brand new.

“Want to see what an old GPS unit looks like after it’s been up and down the beach a thousand times?”

Mitchell Harley, a Scientia Fellow and coastal researcher at the UNSW Sydney Water Research Laboratory (WRL), in Manly Vale, Australia, handed me a battered, corroded, steel-cased receiver the size of a grapefruit. “It’s also seen a lot of duct tape.”

He loaded a carbon-fibre survey staff and a yellow Pelican case containing a new, top-of-the-line Trimble GPS handset into the back of a WRL vehicle. With two visiting master’s students – Tim van Dam from TU Delft, and Yann Larré from École Polytechnique – we set off on our afternoon excursion to Narrabeen.

View of Narrabeen Beach, looking south from Narrabeen Headland. Credit: Eli Lazarus

Facing the open South Pacific, Narrabeen and Collaroy are the northern and southern halves of an embayed beach, a reach of sand framed at either end by rocky promontories, that extends approximately three-and-a-half kilometres between Narrabeen Headland and Long Reef Point. Narrabeen is the keystone of the Northern Beaches, a chain of sandy pockets defining the coastal peninsula north of Sydney. The beaches darken in colour with each embayment, from dun in the south to a reddish ochre in the north, representative of the ancient sandstone bedrock units in which they sit.

Narrabeen is a legendary surf break and home turf to a roll of world champions, where, to date, the locals have successfully prevented the installation of anything that resembles a surf cam. But the beach is also home to one of the longest-running and most complete beach-survey programmes in the world (Turner et al., Sci Data 2016).

In 1976, the renowned coastal scientist Andy Short, who used to live in Narrabeen, began the programme from the beach across the street from his house. He and family members, colleagues, friends, and volunteers diligently measured a set of cross-shore profiles along the full Narrabeen–Collaroy embayment every month for 30 years.

All long-term monitoring endeavours are labours of love. But frequent, detailed measurements of beach morphology, maintained consistently over long time scales, are exceptionally rare, and they offer essential quantitative insight into coastal events, changes, and cycles that occur more rapidly than most records tend to capture.

Harley took over the measurement programme in 2006, along with Ian Turner, who now directs the Water Research Laboratory, and recorded more than 120 monthly surveys of the full beach with a quad-bike that Harley would trailer back and forth from Manly Vale.

Harley’s quad-bike – and shoreline-survey workhorse – at the UNSW Sydney Water Research Lab. Credit: Eli Lazarus

The Water Research Laboratory team has continued to experiment with different measurement methods for the Narrabeen–Collaroy system. Mounted on the top floor of the Flight Deck, a beachfront hotel where Narrabeen blends into Collaroy, is an array of five cameras, known as an Argus station, that takes time-averaged photos of the shoreline and surf zone. Tucked in among the cameras is a smoked-glass dome that looks like a space helmet: a lidar unit that uses a laser to measure wave swash and a cross-shore profile of beach elevation five times every second.

On our outing, Harley first drove us up Narrabeen Headland, to get an unobstructed southerly view of the bay. At the overlook was a stainless-steel post with a frame to hold a smartphone. This was the Narrabeen CoastSnap station.

In 2017, Harley, along with collaborators from the New South Wales State Government, launched the CoastSnap programme to collect crowd-sourced observations of beach dynamics (Harley et al., 2019). The process is simple: take a photo, post the image on social media with the station hashtag (#CoastSnapNarra, for example), and if you don’t post it right away, then write in the date and time of the image. With some clever analytical tricks, an algorithm finds the shoreline in your photo. Harley installed the first CoastSnap station at Manly Beach, above the Manly Surf Life Saving Club. There are now more than 35 CoastSnap stations in nine countries around the world.

Harley pointed out the various permanent features the algorithm uses to identify the shoreline position in every #CoastSnapNarra photo: an inlet hazard sign, the corners of prominent buildings in the foreground and in the distance. “We get about an image a day from people up here,” he said. Watching a sparse line-up of surfers work a peeling break at Narrabeen Inlet, we stood eating steak pies from The Upper Crust – like the surfers, another local institution.

Pies finished, we looped back down to the north end of the beach and assembled the GPS. The four of us would take turns walking the GPS receiver down the five main cross-shore transects still sampled at Narrabeen and Collaroy every month, and the three visitors would get our names added to the dataset’s long list of contributors.

Harley, Larré (holding GPS) and van Dam working through a beach profile. Credit: Eli Lazarus

In a reversal of cart and horse, I had written a scientific article about Narrabeen but never seen it. In fact, I was there in Sydney to visit people I had co-authored with but never met in person.

Earlier this year, Harley, Chris Blenkinsopp (of the University of Bath in the UK, and a former postdoc at WRL), Turner, and I published a paper in the EGU journal Earth Surface Dynamics about the information that shoreline records retain or destroy regarding the environmental conditions that shape them (Lazarus et al., 2019).

Extreme storm events, for example, can inscribe dramatic changes in the shape of a coastline. A detailed, high-frequency record of shoreline position presumably should reflect something about the magnitude of those events. But sedimentary systems can be very effective at obscuring or erasing their own histories, and not all evidence of conditions that impact a shoreline gets preserved. This phenomenon is known as ‘signal shredding’. The exceptional data catalogue for Narrabeen–Collaroy enabled us to pursue the first empirical test of signal shredding at a sandy beach, an idea I’d puzzled over since geomorphic signal-shredding was first described for other sediment-transport systems almost ten years ago (Jerolmack & Paola, 2010).

Among our survey crew, I asked to take Profile 4, near the middle of the embayment, because that was the record I had used the most when working through the signal-shredding analysis. To me, Profile 4 seemed to best capture, in a single line, the spatially variable character of the beach overall.

As we leapfrogged our way south, the beach profile became steeper and narrower. Harley mentioned an article that he had published with Turner and Short (Harley et al., 2015) that described, among other patterns at Narrabeen, a spatial pattern in the beach slope. If one end of the beach was steeply sloping toward the water, then the other end would be flat. The steep stretches of the beach tended to be narrow, and the flat stretches tended to be wide. Under certain wave conditions, the narrow, steep end of the beach would switch to being wide and flat, and vice versa – a pattern typical of embayed beaches called ‘rotation’.

As Harley described the slope pattern, the observation struck me as the kind that comes from investing time at a field site: the intuition internalised by surveying the beach over and over again in the seat of a quad-bike, from tipping sideways in the steeps and tracing the long meanders of the shoreline across the flats.

Standing astride the sharp break in beach slope at Collaroy, looking south toward Long Reef. Credit: Eli Lazarus

We finished the day with a walk around Long Reef, at Collaroy, looking back into the embayment we’d spent the afternoon traversing. Hang-gliders drifted in slow figure-eights above us. I was headed back to the UK the next day. There is more work to be done at Narrabeen, for sure, and we talked about what’s coming next: algorithms for predicting shoreline position (Davidson et al., 2017), fresh insights into beach recovery after major storms (Phillips et al., 2019), identifying shorelines from catalogues of satellite imagery (Vos et al., 2019). We talked about possible funding avenues to keep fuelling our collaboration.

The wind picked up, and the waves set to work rearranging the shoreline we had just measured.

Day’s end and hang-gliders at Long Reef, looking northwest toward Collaroy and Narrabeen. Credit: Eli Lazarus

By Eli Lazarus, University of Southampton, UK

Dr Eli Lazarus (@envidynxlab) is an Associate Professor in Geomorphology in the School of Geography & Environmental Science at the University of Southampton, UK.

 

Imaggeo on Mondays: Science above the Amazon rainforest

The colour and symmetry of the Amazon Tall Tower Observatory (ATTO) stick out against the endless green of the rainforest. Built in a remote and pristine location, the ATTO tower is the tallest structure in South America. In this joint Brazilian-German project, atmospheric scientists aim to unravel the interaction of pristine rainforest with the atmosphere. With its height of 325 metres, the ATTO tower allows atmospheric processes to be studied at different spatial scales.

Description by Achim Edtbauer, as it first appeared on imaggeo.egu.eu.

Imaggeo is the EGU’s online open access geosciences image repository. All geoscientists (and others) can submit their photographs and videos to this repository and, since it is open access, these images can be used for free by scientists for their presentations or publications, by educators and the general public, and some images can even be used freely for commercial purposes. Photographers also retain full rights of use, as Imaggeo images are licensed and distributed by the EGU under a Creative Commons licence. Submit your photos at http://imaggeo.egu.eu/upload/.

How to increase reproducibility and transparency in your research

Contemporary science faces many challenges in publishing results that are reproducible, due to the increased use of data and digital technologies as well as heightened demands on scholarly communication. These challenges have led to widespread calls from the science community for more research transparency, accessibility, and reproducibility. This article presents current findings and solutions to these problems, including recently released software that makes writing submission-ready manuscripts for journals of Copernicus Publications a lot easier.

While it can be debated whether science really faces a reproducibility crisis, the challenges of computer-based research have sparked numerous articles on good research practices and their evaluation. These challenges have also driven researchers to develop infrastructure and tools that help scientists effectively write articles, publish data, share code for computations, and communicate their findings in a reproducible way, for example Jupyter, ReproZip and research compendia.

Recent studies have shown that the geosciences and geographic information science are not immune to reproducibility issues, just like other domains. More and more journals have therefore adopted policies on sharing data and code. However, it is equally important that scientists foster an open research culture and teach researchers how to adopt more transparent and reproducible workflows, for example at skill-building workshops offered by fellow researchers at conferences (such as the EGU short courses), through community-led non-profit organisations such as the Carpentries, in open courses for students and small discussion groups at research labs, or through individual self-learning. Given the persistent lack of a common definition of reproducibility, Philip Stark, a statistics professor and associate dean of mathematical and physical sciences at the University of California, Berkeley, recently coined the term preproducibility: “An experiment or analysis is preproducible if it has been described in adequate detail for others to undertake it.” The neologism is intended to reduce confusion and to promote openness, honesty, and helpfulness in scholarly communication.

In the spirit of these activities, this article describes a modern workflow made possible by recent software releases. The new features allow the EGU community to write preproducible manuscripts for submission to the large variety of academic journals published by Copernicus Publications. The new workflow might require some hard-earned adjustments, but it pays off through greater transparency and effectiveness, especially for early career scientists. An open and reproducible workflow enables researchers to build on their own and others’ previous work and to collaborate better on solving today’s societal challenges.

Reproducible research manuscripts

Open digital notebooks, which interweave data and code and can be exported to different output formats such as PDF, are a powerful means to improve the transparency and preproducibility of research. Jupyter Notebook, Stencila and R Markdown let researchers combine the long-form text of a publication and the source code for analysis and visualisation in a single document. Having text and code side by side makes them easier to grasp and ensures consistency, because each rendering of the document executes the whole workflow using the original data. Caching is available for long-running computations, and researchers working with supercomputing infrastructures or huge datasets may limit the executed code to visualisation purposes, using processed data as input. Authors can transparently expose specific code snippets to readers but also publish the complete source code of the document openly for collaboration and review.
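
As a minimal sketch, an R Markdown source file interleaving narrative text and executable code might look like this (the title, citation key, and chunk contents are illustrative placeholders):

---
title: "Seafloor elevation analysis"
author: "A. Author"
bibliography: references.bib
output: pdf_document
---

Narrative text lives next to the code that produces the results,
and citations [@yates2017] are resolved from the bibliography file.

```{r elevation-plot}
# executed on every render, keeping text and figures consistent
plot(pressure)
```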

The popular notebook formats are plain text-based; R Markdown, for example, uses Markdown. An R Markdown document can therefore be managed with version control software – programs that track multiple versions of, and contributions to, the same documents, even from different people. Version control provides traceability of authorship, a time machine for going back to any previous “working” version, and online collaboration, such as on GitLab. This kind of workflow also stops the madness of using file names for versions, yet still lets authors use awesome file names and apply domain-specific guidelines for packaging research.
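
For example, a manuscript directory can be put under Git version control without leaving R; here is a sketch using the usethis helper package (an optional convenience, assumed to be installed, and not required by the workflow itself):

# initialise a Git repository for the current project and
# make a first commit of the manuscript files
library("usethis")
use_git(message = "First draft of MyArticle")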

R Markdown supports different programming languages besides its popular namesake R and is a sensible solution even if you do not analyse data with scripts or have any code in your scholarly manuscript. It is easy to write, lets you manage your bibliography effectively, and can be used for websites, books or blogs; most importantly, it does not fall short when it is time to submit a manuscript to a journal.

The rticles extension package for R provides a number of templates for popular journals and publishers. Since version 0.6 (published 9 October 2018) these templates include the Copernicus Publications manuscript preparation guidelines for authors. The Copernicus Publications staff were kind enough to give a test document a quick review, and all seems in order, though of course any problems and questions should be directed to the software’s vibrant community rather than the publishers.

The following code snippet and screen shot demonstrate the workflow. Lines starting with # are code comments and explain the steps. Code examples provided here are ready to use and only lack the installation commands for required packages.

# load required R extension packages:
library("rticles")
library("rmarkdown")

# create a new document using a template:
rmarkdown::draft(file = "MyArticle.Rmd",
                 template = "copernicus_article",
                 package = "rticles", edit = FALSE)

# render the source of the document to the default output format:
rmarkdown::render(input = "MyArticle/MyArticle.Rmd")


The commands created a directory with the Copernicus Publications template’s files, including an R Markdown (.Rmd) file ready to be edited by you (left-hand side of the screenshot), a LaTeX (.tex) file for submission to the publisher, and a .pdf file for inspecting the final results and sharing with your colleagues (right-hand side of the screenshot). You can see how simple it is to format text, insert citations, chemical formulas or equations, and add figures, and how they are rendered into a high-quality output file.

All of these steps may also be completed with user-friendly forms in RStudio, a popular development and authoring environment available for all operating systems. The left-hand side of the following screenshot shows the form for creating a new document based on a template; the right-hand side shows the menu for rendering, called “knitting” in R Markdown because code and text are combined into one document like threads in a garment.

And in case you decide last minute to submit to a different journal, rticles supports many publishers so you only have to adjust the template while the whole content stays the same.
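
For example, switching the Copernicus article to an Elsevier journal template might look like the following sketch (elsevier_article is one of the templates shipped with rticles; see the package documentation for the current list):

# in the YAML header of MyArticle.Rmd, change the output format from
#   output: rticles::copernicus_article
# to
#   output: rticles::elsevier_article
# then re-render; the body of the document stays the same:
rmarkdown::render(input = "MyArticle/MyArticle.Rmd")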

Sustainable access to supplemental data

Data published today should be deposited and properly cited in appropriate research data repositories, following the FAIR data principles. Journals require authors to follow these principles; see for example the Copernicus Publications data policy or a recent announcement by Nature. Other publishers used to require, and some still do, that supplemental information (SI) such as dataset files, extra figures, or extensive descriptions of experimental procedures be stored as part of the article. Usually only the article itself receives a digital object identifier (DOI) for long-term identification and availability. The DOI minted by the publisher is not suitable for direct access to supplemental files, because it points to a landing page about the identified object, and this landing page is designed to be read by humans, not by computers.

The R package suppdata closes this gap. It supports downloading supplemental information using the article’s DOI. This way suppdata enables long-term reproducible data access when data was published as SI in the past or in exceptional cases today, for example if you write about a reproduction of a published article. In the latest version available from GitHub (suppdata is on its way to CRAN) the supported publishers include Copernicus Publications. The following example code downloads a data file for the article “Divergence of seafloor elevation and sea level rise in coral reef ecosystems” by Yates et al. published in Biogeosciences in 2017. The code then creates a mostly meaningless plot shown below.

# load required R extension package:
library("suppdata")

# download a specific supplemental information (SI) file
# for an article using the article's DOI:
csv_file <- suppdata::suppdata(
  x = "10.5194/bg-14-1739-2017",
  si = "Table S1 v2 UFK FOR_PUBLICATION.csv")

# read the data and plot it (toy example!):
my_data <- read.csv(file = csv_file, skip = 3)
plot(x = my_data$NAVD88_G03, y = my_data$RASTERVALU,
     xlab = "Historical elevation (NAVD88 GEOID03)",
     ylab = "LiDAR elevation (NAVD88 GEOID03)",
     main = "A data plot for article 10.5194/bg-14-1739-2017",
     pch = 20, cex = 0.5)


Main takeaways

Authoring submission-ready manuscripts for journals of Copernicus Publications just got a lot easier. Everybody who can write manuscripts with a word processor can quickly learn R Markdown and benefit from a preproducible data science workflow. Digital notebooks not only improve day-to-day research habits; the same workflow is suitable for authoring high-quality scholarly manuscripts and graphics. The interaction with the publisher is smooth thanks to the LaTeX submission format, but you never have to write any LaTeX yourself. The workflow is based on an established Free and Open Source software stack and embraces the idea of preproducibility and the principles of Open Science. The software is maintained by an active, growing, and welcoming community of researchers and developers with a strong connection to the geospatial sciences. And because the notebook is complete and consistent, you, a colleague, or a student can easily pick up the work at a later time. The road to effective and transparent research begins with a first step – take it!

Acknowledgements

The software updates were contributed by Daniel Nüst from the project Opening Reproducible Research (o2r) at the Institute for Geoinformatics, University of Münster, Germany, but would not have been possible without the support of Copernicus Publications, the software maintainers, most notably Yihui Xie and Will Pearse, and the general awesomeness of the R, R-spatial, Open Science, and Reproducible Research communities. The blog text was greatly improved with feedback from EGU’s Olivia Trani and Copernicus Publications’ Xenia van Edig. Thank you!

By Daniel Nüst, researcher at the Institute for Geoinformatics, University of Münster, Germany

[This article is cross-posted on the Opening Reproducible Research project blog]

References