Research on the Impact of Using Computer Readers in Examinations

by Paul Nisbet

on Wed Mar 07, 2018

At a Crick 'Ask the Expert' webinar on 6/3/18, Dr Abi James drew attention to research undertaken at Runshaw College, a sixth form and FE College in England, on the use of technology in examinations. The research found that the majority of students preferred to use a computer reader rather than a human reader, and that use of the software had a positive impact on attainment. The report contains many useful insights into the use of technology to support candidates with additional support needs, and is well worth reading.

Evidence-based practice?

When we first investigated Digital Question Papers in 2004/5, our logic went something like this:

  1. The most common method of support used in examinations (apart from extra time and separate accommodation) was a human reader.
  2. If digital question papers were available that could be read out by a computer reader, then candidates might be able to use technology in exams and work independently, rather than relying on human readers.
  3. This would be a GOOD THING because:
    • Candidates would learn a useful life skill that would stand them in good stead when they left school.
    • Candidates would learn a skill that would enable them to access curriculum resources and textbooks when a human reader wasn't available, which should raise attainment in the exam.
    • It might be more comfortable and less embarrassing for the candidate and the human reader - if the candidate doesn't know the answer, only the computer hears about it (during the exam).
    • It would be cheaper in terms of staff and require fewer rooms, because several candidates could sit the exam in one room, rather than requiring separate rooms with readers and invigilators.

We conducted trials and piloted the digital question papers (you can read the reports on the CALL Digital Assessments website). The results seemed to support the hypothesis, and so SQA started offering Digital Question Papers in 2008.

However, just because something sounds sensible doesn't make it true, so this research report from Runshaw College is really helpful as it adds to the relatively sparse evidence base on the use of computer readers in examinations.


Following a successful pilot study of 17 students using reading software in exams, the College wanted to extend the scope to more students. Since the arrangements used in examinations should 'be the normal way of working', the project began by providing universal access to computer reading software (Orato, a free text reader), in conjunction with a programme of staff development and teaching for students, and support for students to access curriculum materials in accessible digital formats. Students were then offered the use of Orato as one of the options for access arrangements in the 2015 examinations. The team's Talking Technology Computer Reader Road Map provides a very helpful overview of this process.


The students who took part in the project had been assessed as eligible for the use of a human reader in examinations, i.e. the candidate has persistent and significant difficulties in accessing written text, is disabled under the terms of the Equality Act 2010, and achieves a below-average standardised score of 84 or less in relation to reading accuracy, comprehension or speed (Access Arrangements and Reasonable Adjustments, JCQ p.35).


  • Out of 478 students who sat the GCSE English Reading paper, 44 (9.2%) were assessed as having difficulties with reading.
  • Of these 44 students, 29 used a computer reader and 15 did not have reading help.
  • The authors report that the majority of the 29 students (85%) preferred to use a computer reader in the exam, rather than using a human reader.
  • The students liked the Orato text reader because it was very simple to use and therefore required little training.
  • At the end of the project, 100% of the 17 staff involved reported that they would feel confident incorporating Orato into their classroom practice.
  • Students who use a computer text reader in an exam should also be using it to access curriculum resources. Students therefore need textbooks and other learning materials in digital format. During the project, the number of registered users of the RNIB Bookshare database of accessible textbooks increased by 414%, and the number of downloaded textbooks by 390%.
  • The number of lessons where the computer reader was used increased from 8 to 97, and students reported that this improved their comprehension and proofreading.
  • The authors report that many students felt the software helped improve their attainment in the GCSE English Reading paper, and this appeared to be supported by the marks achieved in the 2015 examinations, although the small number of students involved means that these results should be viewed with some caution. In addition, it is not clear from the report whether any of the students used a human reader, and if they did, what attainment they achieved in the Reading paper.


The project team produced a really useful set of resources for anyone thinking of introducing computer readers into schools, particularly in the context of examinations.

Tags: computer reader, digital exams, sqa
