Geology for Global Development


New Paper: Interconnected Geoscience for International Development

A new paper published in Episodes: The Journal of International Geoscience highlights the importance of geoscience in tackling complex development challenges, and the need for new approaches to overcome barriers preventing its greater application within development. ‘Interconnected geoscience for international development’, written by Professor Michael Petterson of Auckland University of Technology (New Zealand), sets out a conceptual model that combines geoscience expertise with an understanding of developmental situations, conditions, and context.

The Sustainable Development Goals (SDGs) and the Sendai Framework for Disaster Risk Reduction require geoscientists from across all sectors and sub-disciplines to get involved, improve access to their science, and participate in effective and respectful capacity building and knowledge exchange (read more here). In this new article, Petterson (2019) reflects on his experiences as a geoscientist working in two sharply contrasting development contexts (the Solomon Islands and Afghanistan) to synthesise key lessons. With one funding cycle starting as another comes to an end, taking time to reflect on and share lessons learned is sadly the exception rather than the norm. As the SDGs and a renewed focus on science-for-development provide geoscientists with greater opportunities to engage in international development, this reflection is of great importance.

 

One factor discussed by Petterson (2019) is the importance of both understanding and valuing situational context (including local world views), and using this to enrich the design and implementation of projects. Another is the importance of inclusivity, building strong networks and actively including local wisdom. While good technical geoscience knowledge and skills are greatly needed in development programmes, these must be complemented by a suite of other skills (often missing from the traditional education of geoscientists). Recognising this, and helping geoscientists to build these skills, is central to the work of Geology for Global Development.

Petterson (2019) notes: “Developmental setting/conditions are the foundation: these will guide how the geoscience is to be optimally applied. Projects are devised with development goals in mind and outputs/services tailored to meet the needs of policy makers and practitioners. Local affected communities must be at the heart of project outcome design. An interconnected approach places importance on issues such as inclusivity, environment and local focus, indigenous and non-conformist world-views, valuing and incorporating traditional knowledge, the possibilities of citizen-science and geoscientist-community connections/relations. The interconnected approach adopts the equal and respectful inclusive approach from the earliest stages of programme conception and development. Interconnected geoscience approaches, provide a conceptual model for the possibilities of science + social science + community + local world views, to feed into policy and communal acceptance of policy. An interconnected geoscience approach stands a better chance of addressing complex, regional and global development issues, including planetary health and global climate change. The approach improves the probability of practitioners using research results, and researchers undertaking research that addresses the highest level needs of development.”

Read the whole article (open access) here.

The link between development and resource use


This month the GfGD blog revolved around the theme of Resources. Blog author Heather Britton explores the link between the use of natural resources and development. How feasible are the various options available to us for bringing resource use into line with sustainable development? From the idea of a circular economy to a switch to renewable resources and greater efficiency, what might help us break out of an unsustainable pattern? [Editor’s note: This post reflects Heather’s personal opinions. These opinions may not reflect official policy positions of Geology for Global Development.]

Resources play a huge part in determining the character, history and trading power of a country. Many of these resources – such as metal ores, precious stones and fossil fuels – link directly to the geology of a region, which has inspired the theme of ‘resources’ for this month’s selection of blog posts.

This week, I want to look at how, in the past and indeed to this day, the quantity and quality of resources available to a country have acted as a predictor of how developed that country is, and how this will need to change in the future if we are to succeed in meeting the UN Sustainable Development Goals.

The most striking example of development spurred on by the availability of resources is the industrial revolution. The UK is thought to have led the way in becoming an industrialised nation due to a combination of abundant underlying Carboniferous coal and a strong agricultural economy.

Although Britain is thought to have experienced an industrial revolution of its own between the mid-18th century and 1830, the more widely recognised industrial revolution occurred from the mid-19th century into the 20th century and was experienced by other countries and regions, including France, Germany and North America, to name a few.

Without the use of coal as a resource, development might have come to the UK much later.

It is predicted that by 2050, 140 billion tons of minerals, ores, fossil fuels and biomass will be used per year – three times the current average.

The environmental effects of burning coal and other fossil fuels were not fully appreciated at the time of the industrial revolution.

In the UK, as light has been shone on the negative impact of fossil fuel use, carbon emissions have been cut to a fraction of what they were during the industrial revolution. That being said, the UK is in the privileged position of having gone through industrial development prior to the threat of global warming being appreciated.

Many countries, particularly in parts of the world with low GDP, are only now beginning to use the natural resources available to them to undergo similar development to that which the UK experienced a century ago (this website gives an indication of world income by region over time).

This poses a problem for the climate, however, and brings us to the crux of the matter – development needs to be decoupled from resource use, so that countries are able to reap the rewards of development in a sustainable way that does not exacerbate the negative impact that people have had on our planet up until now.

But how can this be achieved?

going from our entrenched linear method of dealing with waste to a circular economy would require huge changes to the way in which property, possessions and businesses function

It is predicted that by 2050, 140 billion tons of minerals, ores, fossil fuels and biomass will be used per year – three times the current average.

Citizens of developed countries consume an average of 16 tons of these same materials per capita (rising to 40 or more tons per person in some developed countries). By comparison, the average person in India consumes only 4 tons per year. This stark contrast demonstrates the extent to which resources are taken for granted in the economically developed world, and how this needs to change.
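As a rough cross-check of how these figures fit together (the global population value below is my own assumption, not a number given in the post), the implied current global total and per-person average can be worked out in a few lines:

```python
# Rough, illustrative arithmetic only. The 2050 projection and the
# "three times the current average" relationship come from the post;
# the world population figure is an assumed round number, not from the post.

projected_2050_tons = 140e9                    # predicted annual use by 2050
current_total_tons = projected_2050_tons / 3   # implied current use (~47 billion tons/year)
world_population = 7.7e9                       # assumed global population, ca. 2019

implied_global_per_capita = current_total_tons / world_population

print(f"Implied current global use: {current_total_tons / 1e9:.0f} billion tons per year")
print(f"Implied global average: {implied_global_per_capita:.1f} tons per person per year")
```

The implied global average of roughly 6 tons per person per year sits between the 4 tons quoted for India and the 16 tons quoted for developed countries, consistent with the comparison above.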

One method of severing the link between development and resource availability is to shift towards a circular economy. This is a model in which there is little to no waste: instead of items being thrown away once used, their worn-out components are continually replaced.

This idea is similar to how natural ecosystems function (there is no waste in nature). Adopting this kind of lifestyle would decouple a nation’s ability to develop from its reliance on resources, but going from our entrenched linear method of dealing with waste to a circular economy would require huge changes to the way in which property, possessions and businesses function.

Although it may be the ideal solution, transitioning to a circular economy would require a huge change in global attitudes, one that will take a great deal of time to come about.

A far more feasible way of working to separate unsustainable resource use from development is … to minimise the use of non-renewable resources

A far more feasible way of working to separate unsustainable resource use from development is simply to minimise the use of non-renewable resources so that it is no longer essential to use them to reach a developed state.

Methods of doing so include adopting new, greener technologies to replace the heavy industries that have been large-scale users of fossil fuels in the past (for example, electric arc furnaces in the iron and steel industry), and improving home insulation, particularly in cooler parts of the world, so that fewer high-carbon fuels need to be burned to heat homes.

By improving the materials, insulation and orientation of buildings (choosing orientations that make the most of solar gains), energy use in buildings can be cut by 80%.

On top of these examples, using more renewable energy in agriculture and continuing to innovate to create alternatives to non-renewable resource use are further options.

Picture by Joyce Schmatz, distributed via imaggeo (CC BY 3.0). By making agriculture more renewable we can take a step towards decoupling development from resource use.

There is no doubt that as a country develops, its resource use will increase. However, with awareness of the environmental challenges facing the planet now growing, developing countries will be able to tap into the expanding renewables industry rather than turning to substantially increased fossil fuel use.

At the end of the day, countries will develop however they are able, and it is not up to anyone to dictate how they do this. However, in the interests of meeting UN Sustainable Development Goal 13 – climate action – encouraging sustainable development may be the best way to ensure that, as development spreads to more countries, our planet is not significantly affected as a result.

**This article expresses the personal opinions of the author (Heather Britton). These opinions may not reflect an official policy position of Geology for Global Development.**

Anthropocene: Are we in the recent age of man?

International Chronostratigraphic Chart

Regular GfGD blog contributor Heather Britton pens this week’s post, in which she discusses the heated topic of whether or not we are living in the Anthropocene. [Editor’s note: This post reflects Heather’s personal opinions. These opinions may not reflect official policy positions of Geology for Global Development.]

Naming a geological epoch the Anthropocene, literally meaning ‘the recent age of man’, is an idea that has been seriously discussed in many scientific circles and has become a scientific buzzword in recent years. Environmentalists are generally great proponents of the idea, stating that it summarises the huge changes that human presence has had on the planet and draws attention to the need for us to change our ways and prevent the damage from extending into the future. Geologists are typically less enthused by the idea. Naming an interval of geological time involves formally recognising that the Earth was permanently changed at the onset of that interval, and although in many ways humans have permanently changed the planet, making this a formal geological epoch requires the identification of a single point in the rock record when this change took place. I wish to explain why I believe the Anthropocene, although suggested for admirable reasons, should not become formally recognised.

The term was first popularised almost 20 years ago, in 2000, by the atmospheric chemist Paul Crutzen, and in 2016 the Working Group on the Anthropocene (WGA) voted to recommend formally designating the Anthropocene epoch and to present that recommendation to the International Geological Congress. The International Commission on Stratigraphy and the International Union of Geological Sciences have not approved this subdivision of geological time, but it may be that a decision is on the horizon [Ed: In July 2018 the International Union of Geological Sciences ratified a decision by the International Commission on Stratigraphy announcing that we are currently living in the Meghalayan Age].

Finding the signal that marks this period exactly is difficult, but not for a lack of options. The prime candidate is the appearance of radioactive nuclides from nuclear bomb tests, which have registered a signal worldwide. Plastic pollution, high levels of nitrogen and phosphate in soils from fertilisers, and a massive increase in the number of fossilised chicken bones are other strong contenders that appear to define the rise of the human population and civilisation. There is certainly strong evidence to suggest humanity’s effect on the planet is permanent, but are we really in a position to state that the planet has undergone a permanent change when humanity itself is still a blip in geological time? To put it another way, if something were to wipe out the human race tomorrow, there would certainly be a distinctive signal of our presence in the rock record, but given the tiny fraction of Earth history that we occupy, how can we guarantee that it will endure for long enough to be significant in geological terms?

Plastic in the rock record could be used as a marker for the base of the Anthropocene. Credit: Guilhem Amin Douillet (distributed via imaggeo.egu.eu)

There are many geologists who would claim that the creation of the International Chronostratigraphic Chart is one of the greatest achievements of mankind. Each Eon, Era and Epoch has been painstakingly identified using signals within the strata that must conform to a set of very strict rules. This ensures that rocks all over the world can be correlated to the same record of geological time, allowing communication and understanding between scientists from different countries where otherwise the use of local nomenclature would cause endless mistakes and confusion. The most common way of marking the base of a stratigraphic unit is the appearance or disappearance of a particular fossil.

This method clearly has its limitations – fossil organisms will only have lived in certain habitats, and it is assumed that the time taken for a new fossil organism to spread from where it evolved to locations across the globe is negligible in comparison to geological time, something we can’t be certain is true for all species. Dating is simpler when volcanic rocks are present, as radiometric dating is able to step in and provide, for the most part, accurate rock ages, but such methods cannot be used with any great certainty in the sedimentary world. As discussed above, fossil evidence for the beginning of the Anthropocene is present, but it seems more likely that a different kind of signal would be used to mark this new epoch. This would not be the first time, as was demonstrated when the Holocene was formally designated in 2008.

In conclusion, making the Anthropocene a formal geological epoch would send out a message that might prompt the public and governments worldwide to take notice of the impact we are having on the planet and, as a result, take action. I question, however, whether this is a sound enough reason to add to the international stratigraphic column. The Holocene, the time period we are currently considered to be occupying, began approximately 12,000 years ago as Earth emerged from the last glacial period into what is currently an extended interglacial showing no sign of slipping back into a glacial state. The time since the start of the Holocene is already only a geological blink of an eye, and cutting it short now to make way for the Anthropocene seems both unnecessary and indicative of a lack of appreciation of the enormity of geological time. The present is not always an appropriate moment to mark a significant event, as it is only afterwards that its significance can really be properly understood. None of this, however, excuses the fact that over the minuscule time period we have spent inhabiting this planet we have had such a detrimental effect on what is a shared home and not ours to ruin. This certainly needs to be put to rights, but I am not certain that announcing the Anthropocene is the best way of doing so.

**This article expresses the personal opinions of the author (Heather Britton). These opinions may not reflect an official policy position of Geology for Global Development.**

 

Private solutions, public science: how to bridge the gap?


The urgency around many sustainability issues leads some billionaire investors to throw caution to the wind, frustrated with the pace of academic research. Robert Emberson sympathises with private projects like the Ocean Cleanup, even when things go wrong. ‘How’, he asks, ‘might we build a constructive bridge between ambitious entrepreneurs and scientific sceptics?’

Reading and writing about sustainable development in 2019 can be tough going, with a seemingly unending series of headlines suggesting that we as a society are lagging behind in the race to achieve our goals and that the deleterious effects of climate change are looming closer and closer, if not already upon us.

So when good news of any kind comes along, it can often be something to cling to – and it can be all the more devastating when that news turns out not to be what it seems. This up-and-down emotional trajectory describes my response to the clean-up operation launched last year to remove plastic waste from the ‘Great Pacific Garbage Patch’, which ran into difficulties early this year.

The story is not yet over, though, and there are lessons to be learned for scientists working on issues related to sustainability more generally – so perhaps a positive outcome is still to come.

For those unaware, plastic pollution, both large and small, often ends up in the ocean, where gyres – large systems of circulating ocean currents – preferentially carry the waste to certain areas, where it accumulates. These patches are hard to delineate: unlike the images of islands of plastic bottles and grocery bags sometimes portrayed in the media, the plastic concentration is relatively low (around 4 particles per cubic metre), but the patch – which may be as large as 15,000,000 square kilometres – likely represents the largest accumulation of waste in the ocean.

The open ocean, while home to diverse ecosystems and vitally important to many food networks, is a challenging thing to govern. Since it is not owned by any given country, the responsibility to clean up waste accumulating within the seas is nigh on impossible to assign. It’s a classic problem of ‘the commons’ – shared resources, like the ocean or the atmosphere, that many users need but none own, can be overexploited and depleted. Resolving those issues can be challenging at best.

For some scientists, problems with the system had been evident from the start

So, in 2012, enter the Ocean Cleanup Project. In a TED talk, the 18-year-old inventor Boyan Slat laid out a plan to use floating booms to gradually gather up the waste in an efficient manner. Investors were intrigued, and the project took off quickly; backing from billionaires allowed the first system to be deployed in mid-2018 – rapid progress by any standard. The clean-up attempt had begun in earnest.

Quickly, though, problems arose; the system of floating booms couldn’t withstand the storms in the open ocean, and by January 2019 the first clean-up system had been towed to Hawaii for repairs after teething problems.

For some scientists, problems with the system had been evident from the start. Kim Martini and Miriam Goldstein, research oceanographers unaffiliated with the Ocean Cleanup, analysed the project and found major issues. While there was communication between the scientists and the engineers involved, and some of the issues raised were addressed, the two oceanographers maintained that, although the aim was laudable, the design fell short. Despite this, the project went ahead, and the scientists’ concerns proved to be well founded.

Clearly, this is a well-intentioned project. But perhaps just as clear is that a communications gulf existed between the scientists and the project developers. And therein lies the key question: how can scientists involved in sustainability issues best communicate their thoughts to private sector projects aiming to solve those issues? It certainly seems unlikely that the Ocean Cleanup will be the last case where such communication matters.

Indeed, it’s not surprising that in some cases private investors and entrepreneurs have stepped in with big ideas to solve problems of the commons. It’s clear that in many cases billionaires have lofty ambitions beyond the business that made them rich – both Amazon’s Jeff Bezos and Tesla’s Elon Musk have moved into space exploration – and for individuals with such a mindset the idea of ‘saving the world’ might well appeal. They may also consider themselves less limited by regulation and national borders than scientists and governments.

In fact, there’s more than just regulation and borders holding back some ideas. The precautionary principle, in both unwritten and legal contexts, prevents some actions where it is unclear whether they could result in harm to the public. This is often applied to geoengineering ideas, since their long-term implications may not be well known. In 2012, a private project to dump iron sulphate into the ocean to encourage plankton growth, and thus a draw-down of carbon dioxide, was cited as falling foul of these principles, having not established the long-term risks of seeding the ocean in this way.

The slower pace of academic research, …, makes it ever more appealing for private individuals to skip those steps and spend a fortune to fix something now, rather than wait until it’s too late

At the same time, however, there is an increasing sense of urgency around many sustainability questions. The slower pace of academic research, the painstaking process of ensuring reproducibility of findings, and the need to establish the long-term effects of potential solutions to climate or sustainability issues make it ever more appealing for private individuals to skip those steps and spend a fortune to fix something now, rather than wait until it’s too late.

I can sympathise with that view. It’s well-meaning, and solving a problem is better than sitting on the sidelines or, worse, profiting from it. Moreover, hindsight is 20/20, so if a solution only becomes problematic after it is deployed, those behind it can always argue that they did what they could in advance. That must be balanced, though, with an abundance of caution, and perhaps this is where scientists can help.

I would argue that we should be realistic – solutions will come from all sectors of society, and private individuals and entrepreneurs may well be the ones leading the charge. While it shouldn’t be incumbent upon research scientists alone to ensure their voices are heard by private projects, we shouldn’t shy away; building bridges, especially in the form of communication channels, would be of great benefit. Goldstein and Martini did a great service to science by reaching out and making their voices heard, even if they might have been perceived as naysayers.

We might not be able to change the minds of those leading private initiatives, but we can at least provide them with the most information possible to make their decisions.

Robert Emberson is a Postdoctoral Fellow at NASA Goddard Space Flight Center, and a science writer when possible. He can be contacted either on Twitter (@RobertEmberson) or via his website robertemberson.com