

Back to ukoer3 Final Synthesis Report contents page

Back to previous section: Phase 3 Introduction and summary

Forward to Lessons Learned: phase3CultureAndPractice

 

2. Synthesis and Evaluation approach to phase three

In summary, our approach to evaluation and synthesis in phase three is outlined in the paragraphs below.

 

Evaluation and synthesis have continued to be an iterative, two-way process. Collation of findings and evidence builds on the outcomes of evaluation and synthesis in both of the earlier phases, while taking into account emerging lessons from the final phase of the programme. It will also feed into the HEFCE OER Review, which will provide a cumulative evaluation and synthesis of the entire HEFCE-funded intervention in OER, including all phases of UKOER and the Open University SCORE activities.

 

We have worked with all participants in the programme to identify emerging approaches to OER release which are valuable and relevant to the UK HE and FE context, and to consider wider issues around open educational practice.


2.1 Developing an Evaluation Toolkit

In previous phases there were challenges in getting projects to engage with the evaluation and synthesis framework, which was organised into three levels: broad framework headings, mid-level focus areas and detailed aspects. The following extract, covering a single broad heading, illustrates this structure:

 

Broad framework heading: OER Release & Use

  Mid-level focus area: Collaboration
    • Effect of collaborations

  Mid-level focus area: OER release/publishing models
    • Making OER available
    • Organising OER
    • Ensuring pedagogic accessibility
    • Ensuring adaptability
    • Making OER discoverable
    • Addressing gaps
    • Drawing on existing OER collections
    • Influence of discipline or sector
    • Kinds of OER being used

  Mid-level focus area: Technical/hosting issues
    • Technical issues in collecting OER
    • Ensuring technical accessibility

 

For phase 3 we developed a new Evaluation Toolkit which attempted to overcome some of these issues. The toolkit is made up of three elements, described below.

 

We considered the different themes of this phase and provided an additional thematic route into our broad framework by highlighting which evaluation questions were most pertinent to each theme. Behind these visual pathways was a series of Google Forms (see the example form on Engaging stakeholders) that provided a mechanism for projects to respond to key evaluation questions, offered a way of collating lessons learned, and fed into the final synthesis. Written and video guides were produced to help projects use the toolkit.

 

We had some very positive feedback from projects about this approach: it appealed to people who preferred a visual route into the framework, and they found it helpful that the questions were linked to the themes they had addressed in their project planning phases. We also provided an alternative to the Flash-based SpicyNodes maps for projects that could not use, or did not like, this software (see the Themes Evaluation pages).

 

 

[Image: interactive SpicyNodes map]

 

We discuss the effectiveness of the Toolkit in the later section on Programme Issues.

 

2.2 Thematic Peer Review Clusters

We grouped the projects into congruent pairs or clusters relating to their themes. This is illustrated in a projects summary page, which highlights the pairings/groupings, project evaluation questions, themes and focus areas, and significant stakeholders. Themes and focus areas were also captured in a Wordle word cloud.

 

 

We used the term 'evaluation buddies' to signal to projects that the relationship should be based on sharing and support. We established an Evaluation Buddies page within the toolkit to highlight the potential of these relationships, using some good practice exemplars from the previous phase.

 

Each project was encouraged to give feedback on its partner's evaluation plan and to provide external peer review of a selection of outputs at the end of the process. The evaluation buddy mechanism in phase two had encountered similar challenges to the framework (mentioned above), so we adopted a different approach for phase three: we established and supported two online meetings for each grouping, before both the interim and final reporting stages. The meetings proved valuable for the projects and for the Synthesis & Evaluation Team, and were also attended by Programme Officers when possible. They allowed projects to begin to articulate lessons learned and to spot synergies with partner projects, and enabled us all to note where outcomes were similar or different. One project wrote an interesting blog post following their first meeting (http://deftoer3.wordpress.com/2012/05/18/buddying-up-with-orbit/), in which they write:

 

On Wednesday, we had a very productive meeting with Teresa Connelly and Bjoern Hassler from ORBIT (“Open Resource Bank for Interactive Teaching”), our “evaluation buddy” – the idea emerged from a joint phone call with Lou McGill where we started talking about synergies between the projects. Once again, the meeting demonstrated the value of a rather revolutionary concept known as actually talking to people face to face and exchanging ideas. The meeting also helped us to see that despite the fact that we cover different discipline areas (the majority of DeFT teachers are in English or media while ORBIT focuses on science subjects) we have much more in common than we initially thought.

 

They go on to talk about developing a strategy for sharing evaluation outputs. This highlights the real value that this approach can bring, even when projects are working to very tight timescales.

 


 
