Tag Archives: online exams

BTL Surpass for online assessment in Computer Science

Over the last couple of years I have been leading the introduction of BTL’s Surpass online assessment platform for exams in Computer Science. I posted the requirements for an online exam system we agreed on a few months ago. I have now written up an evaluation case study: Use of BTL Surpass for online exams in Computer Science, an LTDI report (local copy). TL;DR: nothing is perfect, but Surpass did what we hoped, and we plan to continue and expand its use.

My colleague Hans-Wolfgang has also presented on our experiences of “Enhancing the Learning Experience on Programming-focused Courses via Electronic Assessment Tools” at the Trends in Functional Programming in Education Conference, Canterbury, 19-21. This paper includes work by Sanusi Usman on using Surpass for formative assessment.

A fill-the-blanks style question for online exams in computer science, showing a few lines of Java code with gaps for the student to complete. (Not from a real exam!)
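To make the question style concrete, here is a minimal sketch (in Python, and not Surpass code) of how a fill-the-blanks question over a Java snippet might be modelled and marked. The template, gap numbering, and accepted answers are all illustrative assumptions, not how Surpass represents items.

```python
# Illustrative sketch only: a gap-fill question modelled as a Java
# template with numbered gaps and a set of accepted answers per gap.

JAVA_TEMPLATE = """
for (int i = 0; i < __1__; i++) {
    total __2__ values[i];
}
"""

# Accepted answers per gap; several equivalent forms can be listed.
ACCEPTED = {
    1: {"values.length"},
    2: {"+=", "= total +"},
}

def mark_gap_fill(responses):
    """Return the number of gaps answered correctly.

    responses maps gap number to the student's entered text.
    """
    return sum(
        1 for gap, answer in responses.items()
        if answer.strip() in ACCEPTED.get(gap, set())
    )
```

A real platform would of course handle whitespace, alternate forms, and partial credit far more carefully; the point is only the shape of the data.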

Requirements for online exam system

Some time back we started looking for an online exam system for some of our computer science exams. Part of the process was to list a set of “acceptance criteria,” i.e. conditions that any system we looked at had to meet. One of my aims in writing these was to avoid chasing after some mythical ‘perfect’ system and to focus on finding one that would meet our needs. Although the headings below differ, as a system for high-stakes assessment the overarching requirements were security, reliability, and scalability, which are reflected below.

Having these criteria was useful in reaching a consensus decision when there was no ‘perfect’ system.


  • Only authorised staff (+ external examiners) to have access before exam time.
  • Only authorised staff and students to have access during exams.
  • Only authorised staff (+ external examiners) to have access to results.
  • Authorised staff and external examiners to have only the level of access they need, no more.
  • Software must be kept up to date and patched in a timely fashion.
  • Must track and report all access attempts.
  • Must not rely on security by obscurity.
  • Secure access must not depend on location.


  • Provide suitable access to internal checkers and external examiners.
  • Logging of changes to questions and exams would be desirable.
  • It must be possible to set a point after which exams cannot be changed (e.g. once they are passed by checkers).
  • Must be able to check marking (either exam setter or other individual), i.e. provide clear reports on how each question was answered by each candidate.
  • Must be possible to adjust marking/re-mark if an error is found after the exam (e.g. if a mistake was made in setting the correct option for an MCQ, or if a question was found to be ambiguous or too hard).
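The re-marking requirement above can be sketched quite simply: if responses are stored separately from the answer key, correcting the key and re-scoring is a pure recomputation. This is a hypothetical illustration, not how any particular platform stores responses.

```python
# Hypothetical sketch: re-marking multiple-choice responses after the
# answer key is corrected (e.g. the wrong option was set as correct).

def score(responses, key, points_per_question=1):
    """responses: {student: {question: chosen option}}; key: {question: correct option}."""
    return {
        student: sum(
            points_per_question
            for q, option in answers.items()
            if key.get(q) == option
        )
        for student, answers in responses.items()
    }

responses = {"s1": {"q1": "B", "q2": "C"}, "s2": {"q1": "A", "q2": "C"}}
original_key = {"q1": "A", "q2": "C"}    # q1's correct option was set wrongly
corrected_key = {"q1": "B", "q2": "C"}

before = score(responses, original_key)   # marks under the faulty key
after = score(responses, corrected_key)   # marks after the correction
```

The same separation is what makes the “clear reports on how each question was answered” requirement cheap to satisfy.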


  • Must be possible to reproduce content of previous CS electronic exams in a similar or better format [this one turned out not to be important].
  • Must be able to decide how many points to assign to each question
  • Desirable to have provision for alternate answers or insignificant differences in answers (e.g. y=a*b, y=b*a).
  • Desirable to reproduce the style of standard HW CS exam papers, i.e. four potentially multi-part questions, with the student able to choose which three to answer.
  • Desirable to provide access to past papers on a formative basis.
  • Desirable to support formative assessment with feedback to students
  • Must be able to remove access to past papers if necessary.
  • Students should be able to practise with the same (or a very similar) system prior to the exam.
  • Desirable to be able to open up access to a controlled list of websites and tools (cf. open-book exams).
  • Should be able to use mathematical symbols in questions and answers, including student entered text answers.
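One way to read the “insignificant difference in answers” criterion (y=a*b vs y=b*a) is as a check for numerical equivalence rather than textual equality. The sketch below is a deliberately naive illustration of that idea, under the assumption that answers are simple arithmetic expressions; `eval` on a restricted namespace is used only for brevity, and a real marking engine would need a proper expression parser.

```python
# Naive sketch: treat two entered expressions as equivalent if they
# agree numerically at several randomly chosen points.
import random

def equivalent(expr1, expr2, variables=("a", "b"), trials=20, tol=1e-9):
    """Compare two arithmetic expressions by random substitution."""
    for _ in range(trials):
        env = {v: random.uniform(1.0, 10.0) for v in variables}
        try:
            # Restricted eval for illustration only; not safe for real input.
            v1 = eval(expr1, {"__builtins__": {}}, dict(env))
            v2 = eval(expr2, {"__builtins__": {}}, dict(env))
        except Exception:
            return False
        if abs(v1 - v2) > tol * max(1.0, abs(v1)):
            return False
    return True
```

Random-point testing can in principle give false positives, so a production checker would combine it with symbolic normalisation; the sketch only shows why the requirement is tractable.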


  • Desirable to have programmatic transfer of staff information to assessment system (i.e. to know who has what role for each exam)
  • Must be able to transfer student information from student information system to assessment system (who sits which exam and at which campus).
  • Desirable to be able to transfer study requirements from student information system to assessment system (e.g. who gets extra time in exams)
  • Programmatic transfer of student results from assessment system to student record systems or VLE (one of these is required).
  • Desirable to support import/export of tests via QTI.
  • Integration with VLE for access to past papers, mock exams, formative assessment in general (e.g. IMS LTI)
  • Hardware & software requirements for test taking must be compatible with PCs we have (at all campuses and distance learning partners).
  • Set up requirements for labs in which assessments are taken must be within capabilities of available technical staff at relevant centre (at all campuses and distance learning partners).
  • Lab infrastructure* and servers must be able to operate under load of full class logging in simultaneously (* at all campuses and distance learning partners)
  • Must have adequate paper back up at all stages, at all locations
  • Provision must be made for study support exam arrangements (e.g. extra time for some students).
  • Need to know whether there is secure API access to responses.
  • API documentation must be open and response formats open and flexible.
  • Require support helpline / forum / community.
  • Timing of release of the encryption key.
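On the QTI import/export point, the general idea is that each question is serialised as a structured XML item that another conformant system can import. The sketch below builds an item loosely modelled on IMS QTI choice interactions; the element names and structure here are simplified illustrations, not a schema-conformant QTI document.

```python
# Sketch of serialising a multiple-choice item as XML, loosely modelled
# on IMS QTI choice interactions. Simplified and NOT schema-conformant.
import xml.etree.ElementTree as ET

def make_choice_item(identifier, prompt, options, correct):
    """Build a simplified choice item and return it as an XML string."""
    item = ET.Element("assessmentItem", identifier=identifier)
    ET.SubElement(item, "prompt").text = prompt
    interaction = ET.SubElement(item, "choiceInteraction", maxChoices="1")
    for ident, text in options.items():
        choice = ET.SubElement(interaction, "simpleChoice", identifier=ident)
        choice.text = text
    ET.SubElement(item, "correctResponse").text = correct
    return ET.tostring(item, encoding="unicode")

item_xml = make_choice_item(
    "q1",
    "Which keyword declares a constant in Java?",
    {"A": "static", "B": "final", "C": "const"},
    "B",
)
```

The value of requiring open, documented formats (QTI for content, an open API for responses) is exactly this: the exported XML remains readable and re-importable even if the assessment platform changes.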


  • Costs: clarify how many students would be involved and what this would cost.