Digital Assessment Research: Mode Comparability

Student performance across onscreen and paper-based exams

We believe that exams should be a fair and accurate reflection of students’ performance – regardless of whether an exam is taken onscreen or on paper.

That’s why our research focuses on many aspects of comparability – including student performance across paper and digital exams.

At a glance

  • We analysed the relationship between exam format (paper or onscreen) and student outcomes for our 2022, 2023 and 2024 International GCSE exams in subjects where both exam modes are available.
  • Our analysis found no evidence that students’ outcomes are affected by taking exams in different ways – supporting fairness and reliability across paper-based and onscreen exam modes.
  • We’ll continue our research to validate findings, broaden scope to include new areas for investigations as they emerge, and share our findings to support the evolution of digital assessment at Pearson and across the wider sector.

What was the research about?

This study builds on years of research and feedback from schools and colleges across the globe – including surveys and statistical analysis since we started offering onscreen International GCSE exams alongside paper-based formats in summer 2022.

Even though onscreen and paper assessments have the same content, the exam setup is slightly different – as students taking onscreen exams can use in-built planning and accessibility tools. Many schools and colleges choose onscreen exam options because students’ usual way of working includes devices and typing and/or the accessibility features suit students’ needs.

It’s crucial that exams are fair and consistent across all formats. Ensuring this not only means that we meet regulatory requirements, but also means we deliver on our commitment to students and can confidently offer a choice of exam formats that schools, colleges and the education community can trust.

That’s why our research study focused on:

  • analysing the relationship between the exam format (paper or onscreen) and learner outcomes – ‘mode effect’
  • determining if there was any significant mode effect that could impact student grades
  • contributing to sector-wide research on comparability and identifying opportunities to further support fairness across exam modes.

By doing so, we’re better able to understand and ensure comparability between assessment formats and gain practical insights that can continue to support dual-mode exam delivery.

How was the research conducted?

A crucial part of the research has been comparing mean scores for students who have taken onscreen or paper-based International GCSEs in:

  • English Language A (since summer 2022)
  • English Literature (since summer 2023)
  • Business (since summer 2024)
  • Economics (since summer 2024)
  • History (since summer 2024).

As we’ve added more subjects each year, our research now includes analysis of student performance in paper-based exams alongside over 5,500 onscreen exams taken in these five subjects, in the UK and across the globe.

We used ordinary least squares (OLS) linear regression to compare outcomes between students and modes of assessment, as this technique is well suited to the number of centres in the study.
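Pearson's exact model specification and data are not published here, so the following is only an illustrative sketch of how such a mode-effect regression can be set up: a mode dummy (paper vs onscreen) plus an ability × mode interaction term, fitted by OLS on fully synthetic data with no true mode effect. All variable names, sample sizes, and score scales are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # illustrative sample size, not the real cohort

# Synthetic data (NOT Pearson's exam data): "ability" stands in for a
# prior-attainment measure; mode is 1 for onscreen, 0 for paper.
ability = rng.normal(0.0, 1.0, n)
mode = rng.integers(0, 2, n).astype(float)

# Simulate scores with NO true mode effect: score depends on ability only.
score = 50.0 + 5.0 * ability + rng.normal(0.0, 2.0, n)

# OLS design matrix: intercept, ability, mode dummy, and the
# ability x mode interaction (does the mode effect vary with ability?).
X = np.column_stack([np.ones(n), ability, mode, ability * mode])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
intercept, b_ability, b_mode, b_interact = beta

# With no true mode effect, b_mode and b_interact should sit near zero
# while b_ability recovers the simulated ability coefficient.
print(f"mode effect: {b_mode:.3f}, interaction: {b_interact:.3f}")
```

In a real analysis the near-zero estimates would be judged against their standard errors (e.g. a t-test on the mode coefficient) rather than by eye; the sketch shows only the structure of the comparison.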

What were the key findings?

Our analysis to date has found no evidence that students’ outcomes are affected by taking exams in different ways, with:

  • no statistically significant mode effect between students taking the paper-based or onscreen exams
  • no significant interaction between ability and mode – suggesting that students with differing ability profiles were not affected by the assessment format.

What do these findings mean for Pearson and the wider sector?

The findings suggest that the mode of assessment (onscreen or paper) does not impact student performance; rather, students’ grades are influenced most by their knowledge and skills in a subject. As such, schools and colleges can focus on selecting the exam format that will most suit each student and enable them to best show what they know and can do.

Such findings reinforce the validity of onscreen assessments across exam formats – emphasising that learners are receiving fair results and are not advantaged or disadvantaged by taking exams on paper or onscreen.

This can build confidence in onscreen exam options from Pearson and across the education community as a whole, as we shape the assessments of the future in the UK and across the globe.

What further research is needed?

To date, we’ve been adding to the sample size, scope and findings with each exam series where both onscreen and paper-based formats are available. To bolster the robustness of the findings around mode effect, we need to continue such analysis in future.

Across the sector, it’s important we keep exploring the impact of digital assessments on larger and more diverse cohorts (see ‘What are Pearson’s next steps on this?’). This includes analysing different subjects and varying levels of student ability and skills to ensure the findings are generalisable. Additional factors such as access to technology and familiarity with devices could also be explored further.

As the number of candidates grows and shapes an adequate sample size, we will be able to gather evidence about learners who take onscreen exams for more than one subject and analyse any potential impacts as they become more familiar with the format.

What are Pearson’s next steps on this?

This research is part of a comprehensive series of studies considering various aspects of onscreen exams – from accessibility and inclusion through to marking consistency and feedback from teachers and students.

As such, we’ll not only be looking at this study in isolation but in relation to our wider research and evidence base as we create a full and informed picture of assessment opportunities.

Our next steps include (but are not limited to):

  • continuing to build and validate our findings by repeating the current study across future exam series, and doing so for other subjects in the International GCSE and International A level suites as they expand to include onscreen assessments alongside paper-based exams
  • exploring additional research areas such as any impact of taking more than one qualification onscreen
  • informing sector-wide conversations and recommendations – we will continue to share our research and use the insights gained to inform recommendations that can improve assessment practices and ensure fairness and comparability across all exam modes.

While the study can give the education community confidence in digital and paper exams, we’ll continue to research and refine assessment practices so we can help uphold rigorous standards and ensure that students can best show what they know and can do in exams and be recognised accordingly.

About the research

Date: November 2024

Authors: Kevin Mason (Senior Assessment Researcher, Pearson), Sebastian Nastuta (Data Scientist, Pearson)

Citation: Mason, K. and Nastuta, S. (2024) Digital Assessment Research: Mode Comparability – Student performance across onscreen and paper-based International GCSE exams. Available at: URL (Accessed: 22 Nov 2024)
