Which Evidence?


What do we mean by 'evidence' in the context of our programme-level evaluation?

 

What we mean by 'evidence' is anything that helps to confirm, illustrate or exemplify a message. A message may be arrived at through formal evaluation processes – when it is likely to be supported by data collected and analysed for that purpose – or through personal reflection, or as part of the emerging collective 'story' of the project.

 

During the pilot phase we used the generic framework to define the problem space in which we expected messages to emerge. We know that unanticipated outcomes, messages and stories can be just as valuable as those we set out to collect – but these too require evidence to support and illustrate them. The framework has been developed throughout the programme and adapted for phase 2 (the phase 2 synthesis framework) and phase 3 (the phase 3 evaluation framework). Both of these versions of the framework are also linked to evidence gathered during those phases.

 

We don't want to tell you what evidence you 'ought' to bring forward to support your key messages, only that the better evidence we have as a programme, the more powerfully we can communicate those messages. Creative forms of evidence such as multimedia, talking heads, screenshots and images have an important role to play in communicating project outcomes. However, the type of evidence needs to match the type of message or the claim being made. For example, if we are claiming that OERs can save on overall development costs, we need fairly robust, quantitative data to back up that message. If we are saying that a number of issues seem to motivate staff to release open resources, we need a different kind and quality of evidence: quotes from staff – written or spoken – are appropriate to illustrate the issues involved.

 

Evidence generated by complex and innovative processes such as OER release is often itself complex, context-specific and difficult to generalise. This is fine, providing the complexity and the limits of generalisability are acknowledged. Remember that what might be 'just anecdotal' evidence in the context of one project, if it is repeated across 20 or 30 other projects, becomes clearer, stronger and more credible evidence at programme level.

 

Examples of 'messages' and 'evidence' are summarised in the table below. There are many more, and we will be adding to this list of suggestions.

Notice that any of these project-level messages, if repeated across enough projects, lends itself to being expressed in a more general way, e.g. 'There is evidence that involvement in OER projects builds staff skills', 'There is evidence that senior management involvement can overcome barriers to OER release', etc.

 

Example evidence for OER projects

Message: Being involved in this project has helped staff to develop their skills

Example evidence:

  • OERs released by staff who were not previously involved in OER (number, type, links, or examples e.g. screenshots, images, video), together with a statement to this effect, ideally quotes from the staff members themselves: 'I never thought I could do this...'
  • Feedback from workshops ('It all seems much simpler now and I'm going to go and put my entire back catalogue of animations into the institutional repository...')
  • Baseline and post-project survey or interview results showing greater confidence, higher levels of self-reported skills etc

Message: Senior management support has helped to overcome barriers to open release

Example evidence:

  • Senior managers have been supportive (podcast, internal email or memo, quotable quote, public statement, meeting minutes)
  • There are real instances where this support has helped overcome an institutional obstacle (policy or strategic document change, relevant quotes, emails, minutes etc)

Message: This project has saved on overall development costs of learning resources

Example evidence:

  • Statements from staff about the time it has taken them to repurpose resources as compared with developing them from scratch
  • Stats showing re-use in a large number of new contexts - this can be argued to show saved development time overall even if the savings are not experienced at the originating institution.

Message: Releasing OERs is enhancing the reputation of the course/dept/institution

Example evidence:

  • Students arriving at the course/dept/inst say that they have seen or downloaded OERs prior to applying (survey, interview, quote)
  • Quote from staff or stats from programme office showing that there has been enhanced interest in a course coincident with release of OERs.
  • Quote from other stakeholder e.g. head of dept, professional body, partner institution ('we see these resources as essential to our strategy for attracting students from outside of the UK...')

Message: Personal recognition has been a strong motivator for individual staff

Example evidence:

  • Quotations from staff involved in open release (talking heads, written quotes etc)
  • Survey of attitudes among staff

Message: Staff early and late in their careers have been more willing to release or share open content

Example evidence:

  • Quotations from staff involved in open release talking about their motivation (informal comparison)
  • Survey of attitudes among staff (statistical comparison)
  • Stats on career stage of staff uploading OERs (how collected...?)

Message: The OERs released are of at least as good quality as materials produced for 'closed' teaching contexts

Example evidence:

  • Resources evaluated against key criteria
  • Peer review outcomes and comments/reviews
  • Feedback from users (surveys, focus groups, interviews, clips etc)
  • Comparative data using comparable criteria or tool (e.g. LOAM tool)