Say hello to Sam Illingworth, Young Scientist Representative, Science Communication Lecturer and education enthusiast! Sam will be making regular contributions to the GeoEd series, sharing his experience of science outreach with geoscientists, educators and the public at large. In his GeoEd debut Sam reports on the importance of evaluating outreach activities, one of the key areas covered in EGU 2014’s short course on school outreach…
Hello, and welcome to the latest blog post from GeoEd, the EGU’s series about education and outreach in the geosciences. My name is Sam Illingworth, and I will be making a regular appearance on this GeoLog feature.
I aim to help develop this series as a useful tool for educators, practitioners, and scientists working in geoscience. As a lecturer in science communication, I intend to draw on my experiences of developing and delivering educational activities to produce some useful and informative advice for old hands and newcomers alike, and I sincerely hope that this series can become a focal point for stimulating debate about geoscience education.
Forgive me if I appear reckless in my first post on this site, dear reader, but we really do need to talk about evaluation.
How many times have you delivered a successful outreach activity and then packed up your stuff and returned to the lab/office/tea room with your ears still ringing from the delicate sound of children’s laughter and your heart still racing from the close escape with the school’s fire alarm system, only to swiftly move on to the next experiment/computer model/chocolate digestive that you had on your agenda?
We are all, at times, guilty of failing to properly evaluate our school outreach activities, but in doing so we are wasting a great opportunity not only to further develop and strengthen our own events, but also to assist in the overall understanding and development of educational engagement.
As an absolute minimum we should be recording the metrics (number of students, age range, name of school, etc.) for the outreach activities in which we are involved, both for our own records and also for those of our universities and any external funding bodies. A short personal summary of the activity is also good practice, as by recording your own thoughts on what did and did not work you are able to ensure that the next execution of the activity is an even greater success. Even if it was a one-off event, such summaries can still help you in developing and delivering future activities.
In order to really assess the relative successes of the outreach activity, though, it is necessary to get feedback from the students, educators and demonstrators. Obtaining this feedback needn’t be overly complicated, and I would recommend using the excellent SurveyMonkey to construct straightforward questionnaires (What did you like? What would you do differently? What surprised you?) that can be filled out by all three parties immediately following the activity. By analysing the results from these surveys (again SurveyMonkey can be used to do this) you can really start to get a better picture of what does and does not work in your outreach activity, and how it can be improved for future events.
However, it is my opinion that we need to go one stage further than this, and that in order to truly evaluate our school outreach activities, we need to start applying the ‘scientific process’. The scientific process can be represented pictorially by the Ouroboros (the snake that eats its own tail): you start with a hypothesis, you then test that hypothesis, and based on the outcomes of the test you either accept the original hypothesis or adjust it and continue once more with the cycle.
For school outreach activities the hypothesis would be that “this activity improves the students’ knowledge of subject X.” However, it is impossible to test whether this hypothesis can be accepted or not without first assessing the base level of knowledge that the students have about X. Therefore, the evaluation process really needs to begin before you even set foot in the classroom.
Assessing base knowledge needn’t be overcomplicated; if for example the outreach activity aimed to improve the students’ knowledge of global warming, then their initial familiarity with the subject could be assessed by asking them: 1) What is global warming? 2) What causes global warming? 3) What can be done to reduce global warming? These same questions can then be asked after the outreach activity, and the hypothesis can either be accepted or rejected based on the comparison of the students’ pre- and post-understanding of the subject.
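To make the pre/post comparison concrete, here is a minimal sketch in Python of how the responses might be tallied. The questions are those from above, but the scores and the 0–3 marking scale are invented purely for illustration; in practice you would score the students’ actual answers against a rubric of your own design.

```python
# Hypothetical pre- and post-activity scores for the three global warming
# questions, marked on an assumed 0-3 scale (0 = no answer, 3 = full answer).
# One list entry per student; all numbers here are made up for illustration.
pre_scores = {
    "What is global warming?": [1, 0, 2, 1, 1],
    "What causes global warming?": [0, 1, 1, 0, 2],
    "What can be done to reduce global warming?": [1, 1, 0, 1, 0],
}
post_scores = {
    "What is global warming?": [2, 2, 3, 2, 2],
    "What causes global warming?": [1, 2, 2, 1, 3],
    "What can be done to reduce global warming?": [2, 2, 1, 2, 1],
}

def mean(values):
    """Average score for one question across all students."""
    return sum(values) / len(values)

# Compare the class average before and after the activity for each question.
for question in pre_scores:
    before = mean(pre_scores[question])
    after = mean(post_scores[question])
    print(f"{question}")
    print(f"  before: {before:.2f}  after: {after:.2f}  change: {after - before:+.2f}")
```

A consistent positive change across questions would support accepting the hypothesis; no change, or a negative one, would suggest revising the activity and testing again, in keeping with the cycle described above.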
This particular approach to assessing the prior and posterior level of understanding can, for some students, be overly reminiscent of ‘assessment’, resulting in negative implications for the outreach activity. In such cases it might be better to adopt a more informal ‘focus group’ approach, where the students are encouraged to chat about subject X both before and after the activity, with their comments and remarks recorded and later analysed by the facilitators.
School outreach activities are a wondrous thing; they help in the communication of science to society, and without wishing to sound too much like a politician, they can ultimately help to inspire a future generation of scientists. However, as research scientists we live and work in an industry in which we are ultimately judged by our publication record. It can therefore help to justify the legitimacy of any school outreach activity to the powers that be (your line manager, head of school, or external funding body) if you are able to point them in the direction of peer-reviewed publications that have been produced as a result of your outreach activities. However, in order to publish in pedagogical journals such as Physics Education, it is essential that any analysis and conclusions are defensible, and in order to achieve that it is necessary to approach school outreach activities using the scientific process outlined above.
By constructing a solid evaluation plan for any educational outreach activities during the planning process, we can ensure that we actually learn from our relative successes and failures. In order to do this though, we must first accept that these activities do not end the second we leave the classroom and that in many ways that is when they truly begin.
Further evaluation resources can be found on this excellent webpage, created by the UK’s National Co-ordinating Centre for Public Engagement (NCCPE).
By Sam Illingworth, Lecturer, Manchester Metropolitan University
Need a helping hand planning your outreach activity? Take a look at this handy checklist, developed in preparation for the EGU 2014 short course on school outreach. You can also find the full presentation from the workshop here.