Evaluating public engagement events

By Liz Jeavans of Jenesys Associates, external evaluators for the Being Human festival

In this post Liz gives us the low-down on important things to consider when evaluating public engagement events and measuring impact. Following on from her great workshop at the Being Human masterclass, she lets organisers know how to get the most out of their festival event evaluation this November.

I work with Sarah Jenkins from Jenesys Associates and we are the external evaluators for Being Human Festival this year. As external evaluators we gather feedback across the whole festival – from attendees, contributors and event organisers – in order to:

  • Assess Being Human’s performance against its strategic objectives
  • Understand who attends and takes part in delivering Being Human events and activities
  • Demonstrate the value and impact of Being Human for attendees, participants, partners, event organisers/hub coordinators and funders
  • Identify learning and potential improvements for future Being Human festivals or other humanities public engagement activities

Through conducting the evaluation we can reveal the impacts and outcomes of the Being Human festival and capture learning for the Being Human team and the wider public engagement community. However, with hundreds of activities and events going on, gathering consistent data across the whole festival can be quite a challenge, so we ask for your help in collecting feedback from your own activities and events to inform this work.

We ask you, as event organisers, to tell us about your own experiences of Being Human, and we plan to let you know by September how best to contribute to this year’s evaluation. For audiences attending your activities or events, we are interested in feedback in three broad areas: who attends your activity or event, what their experience was, and what they gained from it (the outcomes).

We will be putting together a series of survey questions that you can use with your audiences to gather this feedback. However, you may find that a survey is not appropriate for some activities, so we will also include some alternative methods. We don’t want the evaluation to be an onerous task for anyone involved, and we want to offer a flexible approach – but one which also delivers the consistent data we need to paint a ‘whole-festival’ picture.

Previous organisers have used responses to the Being Human evaluation to inform their own development and practice, and we will be putting together some guidance and resources for you to use. In the meantime, here are some top tips to help your planning:

  • Think about what you want your activity or event to achieve – what are the aims and outcomes? Ensure that your evaluation can tell you whether and how your activity or event met those aims and outcomes.
  • Capture a mix of quantitative (numbers) and qualitative (words, comments and so on) data. Quantitative data will give you an overview, and qualitative data can add detail.
  • Ensure you provide an opportunity for objective responses – don’t use leading questions.
  • You may find referring to the Generic Learning Outcomes (GLOs) useful in both planning evaluation (thinking about what you want your activity or event to achieve) and analysing qualitative responses. GLOs are a good way to think beyond the outcome of ‘knowledge and understanding’ to consider things like ‘enjoyment’ or ‘attitudes’. They are also useful in categorising responses to open questions.
  • Finally, use evaluation not only as a tool to show the impacts and outcomes of your activities and events, but also to provide you with valuable learning. Feedback from your audiences gives you an insight into what works well and what needs improvement, which you can use to refine your own public engagement practice in the future.