In this month’s GeoEd post, Sam Illingworth explores the pitfalls of being a scientist in the public eye. Following the recent acquittal of six geoscientists on manslaughter charges after ‘failing’ to predict the 2009 L’Aquila earthquake, is it time we thought about improving how risk is communicated to the wider public?
At the beginning of November this year, six Italian scientists were acquitted of manslaughter, with an appeals court in L’Aquila (a medieval Italian city on the edge of the Aterno river) overturning the 2012 guilty verdicts that were originally cast against the researchers.
In their initial trial, the scientists were convicted on multiple manslaughter charges for failing to predict the devastating earthquake, which struck at 03:32 CEST on 6 April 2009 and was responsible for the deaths of 309 people. It has taken the past two years to acquit these six scientists, and the initial ramifications of the convictions were far-reaching, with researchers from across the globe wondering if a precedent had now been set regarding liability for the conveyance of information.
Sadly, scientists are far from unaccustomed to judicial proceedings, from Galileo vs. the Catholic Church, to more recent examples of scientists being sued by a gym over injury rate statistics, or NASA being sued for trespassing on Mars. However, the recent allegations against the L’Aquila six (there were actually seven experts in total; more on this later) call into question the fundamental belief system of accountability. If a building surveyor were to tell you that the foundations of your house were sound, yet you were later to find evidence of subsidence, you would expect compensation from the surveyor. So why not also from the scientists? After all, are they not also experts in their own field?
Well, for one thing, finding evidence of subsidence is a far more precise art than trying to predict earthquakes. In the first case you are looking for something that already exists; in the second you are searching for something that may or may not come to pass. In addition, surveyors are usually protected by professional indemnity insurance.
However, in the case of scientists communicating risk, is not being able to accurately predict an earthquake or a volcanic eruption really professional negligence, or is it simply to be expected given the impossibility of fully accurate predictions?
What is potentially worrying to scientists is that the line between professional negligence and unforeseen circumstance would appear to be very blurred indeed. In some instances, though, the distinction is far more clear-cut, as with the behaviour of the seventh member of the panel of experts in the L’Aquila case, Bernardo De Bernardinis. The then deputy director of the Civil Protection agency had, prior to the earthquake, advised locals to “sit back and enjoy a nice glass of Montepulciano” wine. De Bernardinis was not acquitted, although his prison sentence was cut from six to two years.
Although many might view De Bernardinis as guilty of nothing more than pompous overconfidence, it is important to remember that as scientists we still have a role in informing the public of the seriousness of any potential dangers, even if we are not ultimately to be held accountable for our inability to predict them. In other words, failing to predict a natural hazard (or other such incident) should not be seen as professional negligence, but failing to adequately inform the general public of the consequences of any potential threats probably should be.
Of course, communicating risk goes well beyond natural disasters, and is something that many of us do when we talk about the effects of both current and predicted climate change. In these situations, scientists also regularly put themselves in the firing line, although this time often with regards to the media and pressure groups with an anti-climate change agenda.
One of the best-known examples of this came when a Competitive Enterprise Institute (CEI) analyst made the following, frankly horrific, statement about Penn State University climate researcher Michael Mann:
“Mann could be said to be the Jerry Sandusky of climate science, except that instead of molesting children, he has molested and tortured data in the service of politicized science.”
Dr Mann has subsequently sued the CEI, but such legal proceedings are both incredibly expensive and time-consuming, and often represent a completely alien world to many scientists who are simply doing their job.
In the US, scientists working for government or federal labs are now offered free legal counsel and support by the organization Protecting Our Employees Who Protect Our Environment (PEER). In addition, some scientific professions now require their researchers to have professional indemnity insurance; for example, in the UK, legislation was recently introduced that requires all healthcare scientists to have a professional indemnity arrangement in place as a condition of their registration with the Health and Care Professions Council.
According to Jeff Ruch, the director of PEER, threatening scientists for their science “is a bully strategy,” and “bullies don’t like to be pushed back at.” Whilst the work of PEER and their contemporaries is admirable, is this a position that scientists should ever be finding themselves in? And is there anything that they could be doing to avoid such potential pitfalls?
In some cases, these pitfalls could be avoided by more careful consideration of how risk is communicated, explaining to the general public that there are many uncertainties associated with the calculations and predictions being made. However, I think this is something that many scientists are already reasonably adept at; if scientists are guilty of anything, it is sometimes of being overcautious with their predictions, or of waiting to comment until they are absolutely 99.9% sure (with the obligatory 0.1% margin of error).
Media and science communication training can help scientists prepare for how to deliver their research and advice in potentially alien and hostile arenas, but there will always be instances where people have a set agenda to follow at any cost.
There may well be a public perception that scientists who fail to predict natural disasters, or who underestimate a particular problem, are like the proverbial bad workmen who blame their tools. However, in trying to communicate risk, I think it may well be a case of “don’t shoot the messenger,” even if it turns out that they have no message to convey.
By Sam Illingworth, Lecturer, Manchester Metropolitan University