Evidence highlights
Explore the Assessment Efficacy Report to learn more.
- **50,000+** written responses scored both by human raters and automatically by PTE Academic
- **0.88** correlation between human and machine-generated scores for writing
- **400,000+** spoken responses scored both by human raters and automatically by PTE Academic
- **0.96** correlation between human and machine-generated scores for speaking
Researching PTE Academic
We assembled evidence from a range of sources to support the validity, reliability, and fairness of PTE Academic in assessing the English communication skills of international students in an academic environment.
The findings above on the close correlation between human raters and PTE Academic's automated scores are based on a 2019 study involving 10,000+ test-takers and 200 human raters, who scored responses during field testing of the product.
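For readers curious about what these figures measure, agreement studies of this kind typically report a correlation coefficient between the two sets of scores, with values near 1.0 indicating that the automated scores track the human ratings almost exactly. The sketch below is a minimal illustration using Pearson's r with invented scores; it is not the study's actual analysis pipeline, and the data shown are hypothetical.

```python
import numpy as np

# Hypothetical human and automated scores for the same set of responses.
# These values are invented for illustration; the real study covered
# 50,000+ written and 400,000+ spoken responses.
human_scores = np.array([61, 74, 58, 82, 67, 90, 45, 70])
machine_scores = np.array([63, 72, 60, 80, 65, 88, 48, 71])

# Pearson's r is the off-diagonal entry of the 2x2 correlation matrix.
# An r near 1.0 means the automated scores rank and scale responses
# almost exactly as the human raters do.
r = np.corrcoef(human_scores, machine_scores)[0, 1]
print(f"Human-machine score correlation: r = {r:.2f}")
```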
Assessment Efficacy Report
A summary of relevant research