Association for Learning Technology Online Newsletter
Issue 9 July 2007   Friday, July 20, 2007

ISSN 1748-3603

Selecting an Electronic Voting System (EVS)
by Paul Burt

Many different aspects need to be considered when selecting an Electronic Voting System (EVS), also commonly referred to as a Personal Response System (PRS) or an Audience Response System (ARS). By sharing the process we used at the University of Surrey, we aim to help other institutions that may be considering an EVS purchase.

About EVS
Students using an EVS can respond to questions the lecturer asks by pressing buttons on their individual voting handsets, which transmit the responses back to the lecturer’s computer. Posing questions that students can answer individually and anonymously encourages engagement and interaction within the lecture, and when used well EVS questions can form the basis of active peer discussions and help challenge common misunderstandings. Within an EVS-enhanced session, a graph of the distribution of responses can be projected instantly.
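The mechanics described above – one anonymous response per handset, aggregated into a distribution for immediate display – can be sketched in a few lines. This is a hypothetical illustration of the data flow, not code from any actual EVS product:

```python
from collections import Counter

def tally_votes(votes):
    """Aggregate a stream of (handset_id, option) pairs into a
    distribution, keeping only each handset's most recent vote."""
    latest = {}  # handset_id -> last option chosen
    for handset_id, option in votes:
        latest[handset_id] = option
    return Counter(latest.values())

# Hypothetical voting stream: handset 102 changes its mind.
votes = [(101, "A"), (102, "B"), (103, "A"), (102, "C")]
distribution = tally_votes(votes)
print(distribution)  # Counter({'A': 2, 'C': 1})
```

In a real system the lecturer’s software would redraw the projected bar chart from this distribution as votes arrive.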

The E-Learning Unit of the University of Surrey had gained practical experience of supporting, both pedagogically and technologically, academic staff using EVS during a pilot that ran during 2004/05. The equipment used in this pilot had a number of shortcomings, knowledge of which helped inform the outline parameters for the procurement of a new, much larger-scale, voting system for use across the University.

One of the lessons learnt from the early pilot concerned the limitations of infrared (IR) based systems. IR is a near-perfect technology for a home television remote control but suffers from significant drawbacks when used in large-scale voting systems. Specifically, the three main drawbacks of IR systems are:

  • Limited range – IR systems struggle in larger lecture theatres unless multiple IR receivers are installed.
  • Requirement for line-of-sight and aim accuracy – users need to accurately aim their handset at the receiver.
  • Serial reception of votes – although each transmission is quick, the receiver can only accept one vote at a time, which sometimes requires a student to point their handset at the receiver for quite some time.

For these reasons IR-based EVS were ruled out for the University of Surrey and attention was focussed on radio frequency (RF) based systems.

The first stage of the selection process involved compiling a list of the RF systems available in the UK. This task was not as simple as it sounds: at the time of listing (the start of 2006) many of the manufacturers of IR systems were only just beginning to introduce RF systems, and many systems available in the USA were not certified for use in the UK. The list of RF EVS systems evaluated by the University of Surrey in early 2006 included:

For completeness the following EVS systems are listed, although Surrey was not aware of their existence in early 2006:

The manufacturer/importer/vendor of each of the systems identified by Surrey was contacted and asked to provide:

  • an evaluation copy of their voting system software
  • a completed questionnaire about their hardware

Hardware Evaluation
The questionnaire sent to all suppliers included the questions in Table 1:



RF transmission related:
  • Range – How far away from a receiver is a handset guaranteed to work (line-of-sight)?
  • Interference robustness – Is this a feature of the hardware?

For handsets with rechargeable batteries:
  • Battery life – How long will the unit last in typical usage?
  • Battery charge time – How long does it take to fully charge a depleted unit?
  • Battery max cycles – How many times can the battery be recharged before it needs replacement?
  • Battery technology – What type of battery is being used?
  • Battery replacement cost – What is the cost of a replacement battery? Can it be replaced by University technical support?
  • Ease of charging – Is charging achieved via a docking system/contact terminals or via cable and plug?
  • Battery status – Is there an indicator of remaining battery charge? If so, what type (e.g. LED or LCD)?

For handsets with changeable batteries:
  • Battery type – Please specify size and voltage.
  • Battery life – Number of hours in typical use.
  • Ease of battery replacement – Is a tool required? If so, please specify what type of tool. How easy is access?
  • Battery theft prevention – Has the handset been designed to prevent students 'swapping' batteries?

Handset design:
  • Robustness – What would happen if you dropped the handset onto a hard floor?
  • Accessibility of button design/layout – Are there tactile clues (e.g. two dots on the 5 button as per phones)?
  • Backlighting – Are the keys/display backlit?
  • Clarity and durability of legends – Is there sufficient contrast between button legends and backgrounds? Are the legends guaranteed not to wear off?
  • Confirmation of vote status – Is it obvious to the user that their vote has been accepted?
  • Display – Does the handset have a display (for example to associate the buttons with the options)?
  • Comfort in hand – Is it ergonomically designed? Is it primarily designed for right-handed operation?

Receiver units:
  • Ratio of receivers to handsets – What is the maximum number of handsets that can be used with one receiver?
  • External DC power – Does the receiver require an external power supply?
  • Cabling type – What connection type is used to connect the receiver to the lecturer's computer?

System functionality:
  • Number of voting options – Can you ask a question with 10 or more options? How many buttons are there, and can the user enter multi-digit responses (e.g. '1' then '4' for option 14)?
  • Vote cancellation – Can users cancel and re-vote?
  • Text entry – Is text entry and short-answer voting possible?
  • Predictive text – If text entry is possible, are there predictive options (e.g. T9)?
  • Portability – How portable is the whole system? Are custom-fit cases available?
  • Integration with existing devices – Can existing devices (e.g. PDAs) be used with the system?

Table 1. Hardware evaluation questionnaire

The questionnaire was designed to make comparison of the hardware easier. Manufacturers normally publish only the specifications that show their systems in a good light and often omit those that could be interpreted as negative. Once the hardware evaluation questionnaires were received back from the (majority of) suppliers, a matrix was prepared to cross-compare the responses and a suitability rating was attached to each system. This rating was only a rough guide to how well each system might meet the suspected needs of the University.
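A cross-comparison matrix of this kind amounts to a weighted scoring exercise. The sketch below uses invented criteria, weights and scores purely for illustration; the real matrix was built from the questionnaire responses:

```python
# Hypothetical weights reflecting how much each criterion matters (higher = more important).
weights = {"range": 3, "battery_life": 2, "robustness": 2, "portability": 1}

# Hypothetical questionnaire-derived scores on a 0-5 scale.
systems = {
    "System A": {"range": 5, "battery_life": 3, "robustness": 4, "portability": 2},
    "System B": {"range": 4, "battery_life": 5, "robustness": 3, "portability": 5},
}

def suitability(scores, weights):
    """Weighted sum of criterion scores: a rough suitability rating."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(systems, key=lambda name: suitability(systems[name], weights), reverse=True)
print(ranked)  # ['System B', 'System A']
```

Such a rating is, as noted above, only a rough guide: the weights encode judgements about institutional needs, not objective facts.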

Software evaluation
Of the eleven systems initially looked at, one was excluded from further evaluation because of a software limitation that prevented the use of more than 64 handsets. Of the remaining ten companies, eight provided evaluation copies of their software, which was installed and evaluated by a member of the E-Learning Unit against the criteria shown in Table 2.



General software requirements:
  • PowerPoint plug-in – Can a voting session be run from within Microsoft PowerPoint?
  • Standalone application – Can a voting session be run without Microsoft PowerPoint?
  • Backwards compatibility of client system requirements – Can it be run on an earlier version of the operating system or with an earlier version of PowerPoint (e.g. Office 2000)?
  • Non-Windows support – Is there a version for Apple Mac users?
  • Runs without admin rights – Can the software be run under a restricted-permissions user account (e.g. it does not need to write files to C:\)?
  • Product activation – Can the software be installed without activation, or does each installation require individual activation (either by web or phone)?
  • Licensing restrictions – Can the software be installed on an unrestricted number of machines, i.e. is the licence tied to the hardware rather than the software?

Training and support:
  • Training animations – Has the manufacturer produced training animations, and how clear are these?
  • Manual – Is the manual available as a PDF and, if so, does it have structured bookmarks?
  • Telephone support – Is telephone support available, and is it free?
  • Online support – Is there live online support or email support?

Questions and results:
  • Customisable results interface – Can the design of the results screen be customised?
  • Images – Can images be inserted into questions?
  • Rich media – Can rich media such as movies or animations be inserted into questions?
  • Export and import questions – Are users able to share their question sets?

Running the presentation:
  • Question timing – Can timings be applied to question polling, and can these be overridden?
  • On-the-fly question entry – Can last-minute questions be added?
  • Show countdown timer – Can the remaining polling time be displayed?
  • Respondent progress – Can it display a progress indicator of the proportion of votes received?
  • Dual monitor – Can the questions and a polling grid be displayed on different screens or different areas of the same screen?
  • Non-linear progression – Can conditional branching be specified based upon the responses?
  • Demonstration mode – Can a lecturer run the presentation and see how results data would be presented even when not connected to the hardware?

Data analysis:
  • Reporting – Can reports be run on the results?
  • Export to Microsoft Excel – Are reports exported to Microsoft Excel?

General impressions:
  • Any software malfunctions noticed?
  • How elegant/professional is the interface?
  • How intuitive is the system for a tutor to use?
  • Any unique selling points of the software?
  • What are the good points of the system?
  • What are the bad aspects of the system?

Table 2. Software evaluation criteria

By actually installing and using the software we could test first-hand how easy each package was to get up and running. The criteria were designed through experience gained with the earlier IR system; general software and support aspects were also considered important. As the final stage of this part of the evaluation, each piece of software was allocated a rating score.

On-Site Demonstrations
The suppliers of the four systems that scored highest in terms of both hardware and software were then invited to demonstrate their system to a mixed group of University staff.

Each presenter was given the same brief and parameters:

  • show how a lecturer would prepare a session
  • show how to setup the system
  • run a live demonstration for us to vote in
  • answer any questions from the audience
  • do not discuss the price of the system

The need to avoid discussion of the relative costs of the systems arose because the University’s intended EVS purchase was significant enough to require a formal tendering process. The aim of the selection process was to decide the specifications and attributes of the system we wanted; how it was finally acquired was left to the University’s procurement specialists.

Twenty-one academic, academic-related and technical staff attended the demonstrations and everyone was asked to answer specific questions about each system presented.

The questions asked included:

  • “Please identify some of the features of this system that you like (hardware and software)”
  • “Please comment on the design of the handset”

Other considerations
Experience gained during the earlier pilot with an IR system raised our awareness of potential battery life issues with EVS. A desire to find out real-life experiences of battery life and to speak with existing users of different EVS systems led to email contact with staff in many other institutions.

At around this time we established a JISCmail list to enable further discussion amongst UK EVS users.

Final Decisions
Results of all stages of the selection process were compiled and presented to a Project Executive Group established to oversee the selection, acquisition and establishment of the EVS service at the University.

A number of outstanding decisions were taken by this group:

  • Functionality – is text entry or other sophisticated functionality desirable? The decision was made that there is little evidence to support the use of this type of sophisticated functionality and that undesired functionality could confuse staff and form a barrier to use.
  • Quantity versus functionality – typically, sophisticated functionality adds cost and therefore we could obtain less equipment for any given budget. What is the priority? The decision made was to prioritise quantity over non-essential functionality.
  • Portability – how important is it to have easily portable equipment? The decision was made that lack of portability could become a significant barrier to use, and thus easy portability would be a significant criterion.
The University of Surrey opted to purchase the TurningPoint RF system and now holds 2,000 handsets. The equipment has been central to a year-long, cross-University implementation project to encourage pedagogically sound usage of the EVS. In alignment with current educational research, we are advocating using EVS to facilitate a shift to student-centred instruction. EVS can provide an opportunity to take a fresh look at the function of the traditional lecture format and to steer away from ‘coverage’ of material towards a space for students to think, engage and learn.

Paul Burt
