HEFCE Review Report Appendix 3

Go to main report page: HEFCE-OER-Review-Final-Report

Back to: Impact of OER initiatives


Examples of the kinds of indicators that comprised evidence for the UKOER Programme

 

Evidence generated by complex and innovative processes such as OER release and Open Educational Practice (OEP) is often itself complex, context-specific and difficult to generalise. The complexity and the limits of generalisability need to be acknowledged. However, what might be 'just anecdotal' evidence in the context of one project becomes clearer, stronger and more credible evidence at programme level when it is repeated across 20 or 30 other projects.

 

Examples of 'messages' and 'evidence' are summarised in the table below.

Notice that any of these project-level messages, if repeated across enough projects, lends itself to being expressed in a more general way, e.g. 'There is evidence that involvement in OER projects builds staff skills', 'There is evidence that senior management involvement can overcome barriers to OER release', etc.

 

Example evidence for OER projects

Message: Being involved in this project has helped staff to develop their skills

Example evidence:

  • OER released by staff who were not previously involved in OER (number, type, links, or examples e.g. screenshots, images, video), ideally with quotes from the staff members themselves: 'I never thought I could do this...'
  • Feedback from workshops ('It all seems much simpler now and I'm going to go and put my entire back catalogue of animations into the institutional repository...')
  • Baseline and post-project survey or interview results showing greater confidence, higher levels of self-reported skills etc.

Message: Senior management support has helped to overcome barriers to open release

Example evidence:

  • Senior managers have been supportive (podcast, internal email or memo, quotable quote, public statement, meeting minutes)
  • Real instances where this support has helped overcome an institutional obstacle (policy or strategic document change, relevant quotes, emails, minutes etc.)

Message: This project has saved on overall development costs of learning resources

Example evidence:

  • Statements from staff about the time it has taken them to re-purpose resources as compared with developing them from scratch
  • Stats showing re-use in a large number of new contexts, which can be argued to show saved development time overall even if the savings are not experienced at the originating institution
  • Evidence of sharing generic resources across institutions or with other institutions

Message: Releasing OER is enhancing the reputation of the course/dept/institution

Example evidence:

  • Students arriving at the course/dept/institution say that they have seen or downloaded OER prior to applying (survey, interview, quote)
  • Quote from staff, or stats from the programme office, showing enhanced interest in a course coincident with release of OER
  • Quote from another stakeholder e.g. head of dept, professional body, partner institution ('we see these resources as essential to our strategy for attracting students from outside of the UK...')
  • Increase in traffic/hits to the course website via OER links and OER dissemination activities

Message: Personal recognition has been a strong motivator for individual staff

Example evidence:

  • Quotations from staff involved in open release (talking heads, written quotes etc.)
  • Survey of attitudes among staff

Message: Staff early and late in their careers have been more willing to release or share open content

Example evidence:

  • Quotations from staff involved in open release talking about their motivation (informal comparison)
  • Survey of attitudes among staff (statistical comparison)
  • Stats on career stage of staff uploading OER (how collected...?)

Message: The OER released are of at least as good quality as materials produced for 'closed' teaching contexts

Example evidence:

  • Resources evaluated against key criteria
  • Peer review outcomes and comments/reviews
  • Feedback from users (surveys, focus groups, interviews, clips etc.)
  • Comparative data using comparable criteria or tool (e.g. LOAM tool)