OER Synthesis and Evaluation / Subject Strand Quality Issues
Subject Strand Quality Issues

Page history last edited by Lou McGill 13 years, 7 months ago

Summary

Academic quality checks were largely provided by existing institutional processes – typically 'all materials had passed through institutional and professional quality measures' (OERP) before being offered for release. However, these QA measures relate to whole modules, of which the released resources were only one aspect, and within which the resources would have been used in pedagogically specific ways. So it could not be assumed that the materials themselves would 'carry' the quality assurance into a more open context, in which they might be accessed by a wide range of different users with different requirements. A second line of quality assurance was therefore often provided in the form of peer review within the consortium. This review process took account of the fact that the materials were now intended for open release.

 

Projects were provided with resources to help them decide appropriate quality criteria, but in practice all used their own criteria and/or allowed individual reviewers to use their professional judgement. Peer review has a number of advantages, as recorded by the projects, but it is time consuming and not all reviewers are equally experienced in designing for re-use. It is therefore considered advisable to develop a general quality checklist for open educational resources, drawing on the work which some projects have done already, and focusing on the 'add-on' issues of legality, accessibility, technical interoperability, repurposability, metadata/discoverability, and accompanying information (including pedagogical information).

 

Question

What quality processes are appropriate for different communities?

Findings from projects

C-Change:

  • It is advisable to set up a peer review process for OERs which takes both a content and pedagogical perspective on quality
  • Cross-departmental peer review can have added advantage of raising awareness of OER agenda.

 

Humbox:

  • All resources have been used successfully with HE students; the emphasis in peer review is not only on successful 'closed' use but on the potential for repurposing and re-use in other contexts
  • Peer review has led to improvements in the resources and opportunities for academics to learn from each other's pedagogic and technical practice.

 

ADM-OER – institutional repository appointing a 'gatekeeper' to verify academic quality and rigour and to check for copyright violations.

 

TRUE: the specialist topics approach meant that subject leads acted as quality gateways to materials.

 

OERP: 'it is not for the project to dictate value other than it is a requirement that the materials are actually used in teaching. There are many teaching styles'

 

OER-CSAP: Peer review by those with expert understanding of how knowledge is produced in the relevant curriculum

 

Question

How do quality processes for OER relate to other existing quality processes? Are there tensions or barriers?

Findings from projects

E-learning quality issues may have to be separately verified, as materials often derive from face-to-face delivery contexts and there are additional considerations when delivering online or in a blended environment. There are also different accessibility issues with OERs than with closed teaching resources: OERP, for example, argues that compliance with the DDA should be a central element of QA for OERs.

 

A generic problem with preparing materials for open release is that end-users' needs cannot be fully known, nor their context of study supported. Criteria for 'quality' resources are therefore likely to be different, possibly broader, and more difficult to define.

 

A second issue is that users need some indicator of 'quality' that is simple and reliable enough to support rapid decision making while searching/browsing resources. The bioscience project survey found that trust is critical at this point. Academics may trust the quality of a resource because they recognise the individual, organisation or community that has provided a resource, or because they trust the quality processes of the repository or host site from which it is downloaded. They are unlikely to engage with complex quality indicators.

 

Most projects saw value in a comments and reviewing facility to provide ongoing checks on quality and opportunities for enhancement by users. However, TRUE removed this facility after evaluation found it was perceived negatively. In tension with the need to provide simple indicators of quality, many projects emphasised that quality in learning resources depends on relevance to the intended context of use, and can only really be decided by the end-user. It may be that both quick and rich quality indicators are needed to meet the needs of different users.

 

Biosciences finding/reflection: 'The quality of the resource – as an OER – should be higher, in that it has been built to a more robust and transferrable specification. This effect has been coined the "Sunlight Effect" by Cecilia d'Oliveira, executive director of MIT OpenCourseWare. We see evidence of it in our collection.'

 

Humbox: 'we have come to recognise that work in progress materials can also have value for repurposing': in a community repository the quality emphasis is less on polished outcomes and more on how resources support sharing and reuse.

 

ADM-OER finding: research repositories tend to have specific and rigorous quality processes in place; VLEs (where teaching materials have been located) do not. Open repositories for learning/teaching materials require a different approach to either of these.

 

OER-CSAP:

  • Releasing materials relating to a single module ensures pedagogic integrity and also carries the quality assurance of academic validation. However, much about the context is taken for granted in existing materials (and may need to be made explicit for 'integrity' in re-use).
  • Resources meet key criteria from the QAA Subject Benchmark Statements for Sociology, Anthropology, Politics and Criminology, approved by the subject professional associations.
  • Institutions have monitoring processes in place with respect to their repositories which relate to institutional reputation. They may therefore take a different view on quality and value to share-and-share-alike users (who, for example, may be very happy with unpolished resources).

 

Questions

Do open resources retain quality? Are they perceived to be of high quality? How can quality be monitored over time? Are there any sustainable quality strategies?

Findings from projects

Releasing simulations as OERs can potentially enhance quality by:

  • giving visibility to the materials (the 'sunlight effect')
  • broadening availability and usability
  • making individual assets available for re-use as well as full simulations
  • making available full details of the use of the simulations
  • raising awareness of different types and uses of simulations
  • sharing practice around embedding and use

 

However, many academics are deeply concerned about poor quality repurposed copies of their resources appearing in the public domain: FETLAR, for example, found that academics were often more comfortable with the permissions granted by the CC by-nc-nd licence: 'having spent hundreds of hours on [quality assurance] I am concerned that when we open this up the potential for more problems will grow'. There are also concerns among some academics that reliance on external resources per se is an indicator of poor quality teaching.

 

MEDEV findings:

  • survey results clearly indicated that perceived 'quality' was an important consideration when choosing resources
  • students particularly valued identifiable/verifiable quality (materials from reputable sources), and were very strategic when searching
  • staff were more likely to seek materials from a wider range of sources, probably because they had greater ability to distinguish good from poor quality without having to cross check them.

 

Humbox interim finding: quality of resources was a key issue for participants: 'This is being managed through the peer review process which is approaching quality not as a value judgement but as an evaluation of how and in what contexts a resource may be useful. This is distinctly different from peer review in other areas of academia but is appropriate in this case as it was felt that being judged on content and pedagogy might discourage uploading in an environment in which many colleagues consider that their materials are not quite polished enough for sharing (i.e. don't have the look of commercially published material).'

 

Bioscience findings:

  • Reliance on too many external resources for learning and teaching material is not currently regarded as good academic practice: perceptions of quality may suffer
  • One originator was prepared to enable adaptations of the resource for local use but was unwilling to accept these back into the resource pool without equivalent quality control being applied (Solution: this resource carries a Creative Commons by-nc-sa licence with limited reservations)
  • Both these issues are concerning for future OER development.

 

SimShare Legal: quote from a legal professional body: 'the openness of OER could in itself be a potential issue if it is used or accessed by educators who are not themselves capable of evaluating the resources - but this has always been the case with more traditional resources.'

 

C-Change: Partners were concerned that if repurposing required significant removal of third party material (e.g. graphs using multi-author data) there might be a concomitant reduction in the pedagogic quality of the resource.

 

OERP: The quality of OERs is enhanced by releasing pedagogic and assessment guidance where possible.

 

Project outputs and evidence

Humbox:

 

PHORUS:

  • peer review through the HSaP system, using the criteria of Appropriateness, Usability, Clarity, and Link with the PH framework
  • feedback and user reviews

 

ICS: Quality Guidelines, Repurposing and Validation Guidelines, design/presentation principles.

 

OERP:

  • feedback on usability of OERP review criteria: Discoverable, Editable, Repurposable and Portable
  • use of 'light touch' protocols for materials that have already been through academic quality review.

 

CORE-Materials: review of released materials against criteria of openness, accessibility, and comprehensive coverage of the undergraduate curriculum (also a sustainability issue with respect to quality)

 

TRUE: evaluation or reflection on process in the specialist communities: specialist oversight, with peer review and feedback.

 

C-Change: evidence from user feedback, analysis of user experience, peer review outcomes using criteria relating to pedagogy and usability/accessibility

 

ADM-OER: outcome of small group trials to ascertain quality of resources according to criteria of accessibility, reusability, and appropriateness/pedagogic effectiveness

 

Bioscience OER:

  • resources checked for software version, content migration, accessibility, usability, branding and metadata tagging
  • pedagogic value checked by critical friends.
  • short guide on Evaluation and Quality Assurance

 

SimShare Legal: focus on usability and fitness for purpose with respect to simulations and associated resources, case studies, findings, and guidelines.

 

OER CSAP: collective review process, with materials evaluated against criteria: ease of sharing; ease of re-use; conformance to technical standards; potential scalability

 

OER MEDEV:


 

 
