
Lost in translation? Simple steps to follow when planning formative assessments

This blog highlights how the International Baccalaureate (IB) ensures students get the same exam experience across languages. It also looks at how teachers can apply simple writing rules when posing questions in the classroom and in formative assessments to help their students achieve their best results.

The IB noticed that students sometimes give a slightly different answer to the same exam question when it is asked in a different language. An expert Oxford University team was commissioned to investigate several IB Diploma Programme (DP) assessments and identify specific exam questions where answers differed across English, French, and Spanish. An executive summary of the Oxford study is available on the IB research webpage. 

What did the Oxford study find?  

The IB offers many exams in at least three languages (English, Spanish and French), and the different language versions are very close translations of each other.

The Oxford study looked at the DP exams in Biology, Chemistry and Physics because these exams are sat by a large number of students every session and are designed to assess exactly the same science content across languages. The Oxford research team identified specific exam questions where answers differed across English, French and Spanish and found that:

  • across all three subjects and all exams, the majority (80% or more) of exam questions showed no significant difference between the languages
  • differences were somewhat more common for fairly difficult closed-response (multiple-choice) questions, which could mean that students responding in one language may be guessing more often
  • no language was systematically advantaged or disadvantaged. 

How does the IB ensure translation quality and that no student is disadvantaged? 

The IB’s exam production team has extensive processes and checks in place to ensure exams are of comparable difficulty across language versions within one exam session. However, during marking, examiners sometimes notice that students answering in one language write slightly different answers than students answering in another.

While translation issues in exams can occasionally occur, the differences in responses aren’t always caused by translation. In some cases a word or term, while correctly translated, can mean something slightly different across languages, leading students to interpret the question slightly differently. Usually, these response differences are taken into account by examiners when marking, so that students are not disadvantaged in the grades they receive.

Following the outcomes of the Oxford study, a group of bilingual subject experts reviewed a sample of 225 questions and their translations. The questions were rated on features known to affect perceived difficulty, including how translation may have affected the length, clarity or layout of the questions, and whether translation had introduced any errors into the question.

The expert review identified that: 

  • translations at times led to questions that looked slightly different across language versions, but overall the quality of the translation was often not to blame, and many of the observed differences were unavoidable
  • some translation and linguistic differences were only found at the very end of the IB exam creation process and could not be addressed in the English-language source exam questions.
Exam question example
The image above shows how the correct answer in the English multiple-choice version includes a callback to a word in the question (the word “atria” links back to the question), while the translated correct answer does not use the same callback.

These findings led the research team to recommend changing the assessment development process to include a more multilingual approach to test question development and exam design.  

How can this translate into practice – three simple rules for your classroom 

If you are an IB educator developing formative assessments and classroom tests, you will want your students’ answers to demonstrate what they know and understand. You will also want your feedback to be as effective as possible to help them achieve their best results.

If questions are too long, or include more formal language than you would naturally use, meaning can be lost. This can make differences in student answers more likely. Try applying the following rules to questions and feedback: 

  • use familiar words and correct sentence structures, spelling and grammar, while making sure students know the subject-specific terms and terminology
  • use shorter sentences and avoid complex or unfamiliar grammatical structures
  • use only essential adjectives and adverbs, and avoid complex punctuation and multiple clauses.



Rebecca Hamer is the research lead of the IB Assessment Research and Design (ARD) team. This team supports IB curriculum development and the design of IB summative assessments in the DP, CP and MYP. To make sure the IB’s assessment is state of the art and evidence-informed, the team carries out and commissions a range of studies on advances in assessment methods and practices, sharing outcomes through academic presentations, papers, reports, blogs and their webpages.