phase3approach


Back to  ukoer3 Final Synthesis Report contents page

Back to previous section Phase 3 Introduction and summary

Forward to  Lessons Learned: phase3CultureAndPractice

 

Synthesis and Evaluation approach to phase three

 

In summary, our approach to evaluation and synthesis included:

  • Revision of the Phase 2 synthesis framework, which provided a strong foundation and a common language for collating evidence (see phase 3 framework)
  • Working closely with the programme management team, project teams and their evaluators to ensure all projects arrived at a coherent, feasible evaluation plan that met the needs of the programme
  • Development of an Evaluation Toolkit which offered:
    • information on evaluation methods and resources
    • various pathways into the evaluation framework
    • a mechanism for collating and organising key messages as they emerged that could feed back to interim and final reports
    • a mechanism which fed into final synthesis 
  • Contributing to the design of programme events and documentation to ensure evaluation and synthesis issues remained a priority
  • Responding to support requirements expressed by projects (as appropriate, and in collaboration with other programme management and support teams)
  • Iteratively mapping project outputs and lessons learned to the synthesis framework, and offering feedback to projects on expectations in a timely way
  • Further developing measures of openness and encouraging projects to apply them to their outcomes
  • Setting up and supporting thematic peer review clusters among the projects, to make our interventions with projects more efficient and to ensure that conversations about evaluation were ongoing

 

Evaluation and synthesis have continued to be an iterative, two-way process. Collation of findings and evidence builds on the outcomes of evaluation and synthesis in both of the earlier phases, but takes into account emerging lessons from the final phase of the programme. It will also feed into the HEFCE OER Review, which will provide a cumulative evaluation and synthesis of the entire HEFCE-funded intervention in OER, including all phases of UKOER and the Open University SCORE activities.

 

We have worked with all participants in the programme to identify emerging approaches to OER release which are valuable and relevant to the UK HE and FE context, and to consider wider issues around open educational practice.


Developing an Evaluation Toolkit

In previous phases there were challenges in getting projects to engage with the evaluation and synthesis framework. There were several reasons for this:

  • short project timescales
  • the framework was often introduced to projects after project plans had been written and agreed (a very significant factor, which also happened in phase 3); engaging with the evaluation framework then appeared to be something extra for projects to do
  • final report templates used by the programme do not reflect the framework, which reduces the incentive to engage
  • the framework can appear quite complex and takes time to engage with; it has several levels, moving from four broad focus areas through more focused sections and finally down to very specific evaluation questions

The following table illustrates this for one of the four areas:

 

Broad framework heading: OER Release & Use

Mid level focus area: Collaboration
Detailed aspects:
  • Effect of collaborations

Mid level focus area: OER release/publishing models
Detailed aspects:
  • Making OER available
  • Organising OER
  • Ensuring pedagogic accessibility
  • Ensuring adaptability
  • Making OER discoverable
  • Addressing gaps
  • Drawing on existing OER collections
  • Influence of discipline or sector
  • Kinds of OERs being used

Mid level focus area: Technical/hosting issues
Detailed aspects:
  • Technical issues in collecting OER
  • Ensuring technical accessibility

 

For phase 3 we developed a new Evaluation Toolkit, which attempted to overcome some of these issues. The toolkit is made up of three elements:

  • information and resources to support evaluation activities
  • an interactive tool to guide projects through our Evaluation and Synthesis framework, providing an opportunity to submit findings, observations and links to evidence, and feeding these back to projects for inclusion in their project reporting mechanisms
  • examples of evaluation materials, instruments and reports from other UKOER projects

 

We considered the different themes of this phase and provided an additional thematic route into our broad framework by highlighting which evaluation questions were most pertinent to each theme. Behind these visual pathways were a series of Google forms (see example form on Engaging stakeholders) that provided a mechanism for projects to respond to key evaluation questions, offered a way of collating lessons learned and fed into the final synthesis. Written and video guides were produced to help projects use the toolkit.

 

We had some very positive feedback from projects about this approach: it appealed to people who preferred a visual route into the framework, and they also found it helpful that it linked to the themes they had written to in their project planning phases. We also provided an alternative to the flash-based SpicyNodes map for projects that could not use, or did not like, this software (see Themes Evaluation pages).

 

 

Interactive SpicyNodes map

 

We discuss the effectiveness of the Toolkit in the later section on Programme Issues.

 

Thematic Peer Review Clusters

We decided to group the projects into congruent pairs/groups related to their themes. This is illustrated in a projects summary page, which highlights pairings/groupings, project evaluation questions, themes and focus areas, and significant stakeholders. Themes and focus areas were also captured in a Wordle.

 

 

We used the term 'evaluation buddies' to signal to projects that the relationship should be based on sharing and support. We established an Evaluation Buddies page within the toolkit to highlight the potential of these relationships, using some good-practice exemplars from the previous phase.

 

Each project was encouraged to give feedback on their partner's evaluation plan and to provide external peer review of a selection of outputs at the end of the process. The evaluation buddy mechanism in phase two encountered similar challenges to the framework (mentioned above), so we adopted a different approach for phase three. We established and supported two online meetings for each grouping, before both the interim and final reporting stages. The meetings proved valuable for both the projects and the Synthesis & Evaluation Team, and were also attended by Programme Officers when possible. They allowed projects to begin to articulate lessons learned and to spot synergies with partner projects, and enabled us all to note where outcomes were similar or different. One project wrote an interesting blog post following their first meeting (http://deftoer3.wordpress.com/2012/05/18/buddying-up-with-orbit/), in which they write:

 

On Wednesday, we had a very productive meeting with Teresa Connelly and Bjoern Hassler from ORBIT (“Open Resource Bank for Interactive Teaching”), our “evaluation buddy” – the idea emerged from a joint phone call with Lou McGill where we started talking about synergies between the projects. Once again, the meeting demonstrated the value of a rather revolutionary concept known as actually talking to people face to face and exchanging ideas. The meeting also helped us to see that despite the fact that we cover different discipline areas (the majority of DeFT teachers are in English or media while ORBIT focuses on science subjects) we have much more in common than we initially thought.

 

They go on to talk about developing a strategy for sharing evaluation outputs. This highlights the real value that this approach can bring, even when projects are working to very tight timescales.

 


 

Back to  ukoer3 Final Synthesis Report contents page

Back to previous section Phase 3 Introduction and summary

Forward to  Lessons Learned: phase3CultureAndPractice