Assessment through computers: Combining Computer Assisted Assessment with Paper-based Delivery for Improved Authoring, Marking, and Reporting Efficiency -- 55 -- Short (oral) Paper

13:40 - 14:40 on Tuesday, 11 September 2012 in 4.206

This paper reports on a case study exploring how computer-assisted assessment (CAA) technology and OMR scanning software can be combined to create, mark, and report on paper-based exams, based on an eight-month pilot project conducted at Harper Adams University College during 2010-2011. Funded by the institution's University Modernisation Fund (UMF), the pilot aimed to establish whether paper-based exams created using software editing tools could increase productivity and reduce the time spent creating and marking summative assessments. This required the development of new working procedures and a new installation of Questionmark Perception with its accompanying infrastructure.

An example of how multiple authors from varying disciplines can collaborate during assessment construction is provided, and the workflow management of a phased approach to development is discussed. The paper offers insight into both the successes and failures that can occur, reveals problems encountered (for example, with scanner hardware), and discusses whether this style of exam creation is more time-efficient than traditional methods, exploring the risk that workloads will be shifted within an organisation rather than eliminated.

The project used optical mark recognition (OMR) scanning of paper answer sheets to implement automatic marking.

This format has greater limitations than online assessment, but it still allows institutions to exploit CAA's advantages, including question banks that consolidate assessment resources for use across multiple authors and assessments, thus increasing economies of scale (Sclater & MacDonald, 2004, pp. 205-206).

OMR scanning is vulnerable to the same problems as online assessment, and simplistic objective questions must be avoided (Sim, Holifield, & Brown, 2004, p. 216).

Students submitted their answers by filling in circles on bubble sheets, which were scanned into a format Questionmark could use to mark the papers and calculate students' results automatically.
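The automatic marking step can be illustrated with a minimal sketch. This is a hypothetical example, not the project's actual Questionmark pipeline: it assumes the scanner has already reduced each sheet to the set of option letters detected per question, and scores those against an answer key, flagging blank or multiply-marked questions for human review.

```python
# Hypothetical illustration of OMR-style marking (not the project's actual
# Questionmark Perception workflow): score detected bubble marks against an
# answer key, flagging ambiguous responses for review.

def mark_sheet(detected_marks, answer_key):
    """Return (score, flagged).

    detected_marks: dict mapping question number -> set of filled option letters
    answer_key:     dict mapping question number -> correct option letter
    flagged:        questions left blank or with multiple marks, for human review
    """
    score = 0
    flagged = []
    for question, correct in answer_key.items():
        marks = detected_marks.get(question, set())
        if len(marks) != 1:
            flagged.append(question)  # blank or multiple marks
        elif marks == {correct}:
            score += 1
    return score, flagged

detected = {1: {"B"}, 2: {"A", "C"}, 3: set(), 4: {"D"}}
key = {1: "B", 2: "C", 3: "A", 4: "D"}
print(mark_sheet(detected, key))  # (2, [2, 3])
```

Separating scoring from mark detection in this way mirrors the pilot's division of labour: the scanner produces raw responses, while the assessment software applies the key and reports results.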

The creation of paper-based exams with multiple authors was challenging to coordinate, and the pilot demonstrated that procedural structures are essential. Establishing these from the beginning made it easier to track assessments' development, resulting in minimal deadline slippage.

The project successfully combined the efforts of five lecturers and five support staff members during phased development and delivery of summative assessments.