30th Anniversary

30@30 Specialists Recount Adventures in International English Language Assessment

Specialists Nathan Carr and Christine Coombe

November 18, 2021

After a record-breaking 2020 – which saw a 3,000% increase in our virtual programming – the Specialist Program is celebrating its 30th anniversary throughout 2021. Since 1991, over 800 English Language Specialists – representing the best of America’s educators from all 50 states – have encouraged critical thought and erudition, celebrated their cultural diversity, and showcased American values and civic engagement strategies to millions of educators and students in 130 countries.

In January we introduced our 30@30 – a group of 30 alumni who have had a profound impact on the Specialist Program as well as on the field of English language education. In addition, upon returning to their home states, these leaders have brought immense benefit to their local economies, communities, and institutions.

Our November 30@30 story features two Specialists – Nathan Carr and Christine Coombe – who share their experiences as groundbreaking assessment developers on assignment.

Nathan Carr – From the Bunker to Vietnam

Dr. Nathan Carr distinctly remembers the secure test development environment known as the Bunker in Baku, Azerbaijan, where he was sequestered for five days with a small group of English language test writers from that country’s State Examination Center (SEC). His 2019 English Language Specialist assignment: assist in revising the English language portion of the country’s national 9th and 11th grade school-leaving examinations. Treated with all the stealth of a top-secret mission, life in the Bunker – ironically located on the top floor of an office building – included thorough bag searches to ensure no electronics or outside teaching materials were on the premises, sealed light switches and outlets to prevent possible installation of cameras or microphones, and cell phone jamming equipment to disrupt potential outside communication.

Nathan Carr in a teacher training session

Once “sealed in,” as Carr puts it, he and the other test writers were there for the duration – working, eating, and sleeping – with doors to the outside opening only after tests had been securely removed from the building and distributed nationwide, and students had taken the exam that Sunday. “Except for the TV in the dining area that showed Azerbaijani shows, we were completely cut off from the outside world,” says Carr, the first foreign educator allowed in the Bunker.

Carr did not enter the Bunker immediately upon arrival in Azerbaijan. First, he met with SEC staff members for an overview of what they did and what they required, the initial step in what would be a three-phase project to revamp and develop a complete set of national test specifications. While impressed by the technical quality of their statistical analyses of SEC test data and rigorous scoring procedures for test items, he also recognized a lack of consistent and coherent test parameters. “There were some guidelines, but nothing else. It was all very random,” he says. Indeed, uniformity was needed – for example, how many main idea, detail, and inference items should be included with each reading passage? – but that would be the focus of later project phases. Next on the schedule was the Bunker.
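Test specifications of the kind Carr’s project set out to develop can be thought of as a blueprint that fixes how many items of each type accompany a passage. The sketch below is purely illustrative – the categories, counts, and function names are invented for this article, not the SEC’s actual parameters.

```python
# Hypothetical test-specification blueprint: every reading passage must be
# accompanied by a fixed mix of item types. All values here are invented.
READING_PASSAGE_SPEC = {
    "items_per_passage": 8,
    "item_types": {"main_idea": 2, "detail": 4, "inference": 2},
}

def check_item_set(items):
    """Verify that a drafted item set matches the blueprint's counts."""
    counts = {}
    for item in items:
        counts[item["type"]] = counts.get(item["type"], 0) + 1
    return counts == READING_PASSAGE_SPEC["item_types"]

draft = (
    [{"type": "main_idea"}] * 2
    + [{"type": "detail"}] * 4
    + [{"type": "inference"}] * 2
)
print(check_item_set(draft))  # True: the draft matches the blueprint
```

A blueprint like this is what makes a national exam uniform from one administration to the next: item writers can vary the passages and questions freely, but the mix of skills tested stays constant.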

Life in the Bunker

Test writers for all major subjects, not just English language, were housed in the Bunker, but Carr worked exclusively with two English language item writers, tweaking or eliminating old questions and adding new ones. While that would have been challenge enough in such a short period of time, everything had to be accomplished without the use of computers or reference books. “It was an interesting exercise to see what I remembered without the ability to look anything up,” he recalls. “The challenge actually made the experience more fun.” In addition, Carr’s personal notes for future reference and all test changes had to be handwritten – typed changes were done by staff in an off-limits room in the building. Despite the absence of technology and contact with the outside world, as well as 12-hour workdays, Carr found his time in the Bunker “illuminating,” helping him clarify ideas for how to better train item writers in the future. “I realized that what I take for granted as common sense is not always obvious to those writing or revising items,” he says. “Our work in the Bunker helped me anticipate situations that could come up during training sessions.” And while he admits that living under such tight security was intense, he appreciates the SEC’s focus on maintaining the integrity of the test and testing process. “The objective was to make sure everything was very straightforward and on the up-and-up, with no hint of impropriety.”

Over the next 10 months, Carr returned to Azerbaijan to complete the second and third phases of the project. It was then that he was able to put what he had learned in the Bunker into action, working with item writers on skill-specific questions and with item reviewers on best practices for critiquing items. By the end of the project, Carr had also helped them develop a coherent, unified test specifications document for the school-leaving exams. “My overall goal was to train them to improve the usefulness of these tests, so they were more effective measures of their students’ ability to communicate in English,” says Carr.

Nathan Carr at American Spaces

Carr undertook a similar three-phase Specialist assignment in Vietnam in 2016 and 2017 with the Center for Language Testing and Assessment at the University of Languages and International Studies, part of Vietnam National University. Here, objectives were highly specific. The country already had an established national assessment for English language proficiency – the Vietnam Standardized Test of English Proficiency (VSTEP) – but test developers wanted particular help with such assessment components as calculating reliability for speaking and listening, analyzing test data, estimating task difficulty, interpreting results, and setting cut scores. According to Carr, “Ultimately their goal was to improve the test, so there would be positive washback on the curriculum. They were well aware of the impact an assessment has on teaching and learning – if it’s not on the test, it won’t become part of the curriculum. They had a lot of foresight: improving the English language test leads to an improved curriculum, and that leads to improved English language proficiency, which gives the country a competitive advantage.”
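Two of the computations the VSTEP developers asked about – task difficulty and reliability – can be illustrated in miniature with classical test theory. The sketch below is a generic textbook approach, not the VSTEP team’s actual procedure, and the score matrix is invented sample data.

```python
# Classical item difficulty (proportion correct) and Cronbach's alpha as an
# internal-consistency reliability estimate, on a small invented data set.

def item_difficulty(scores):
    """Proportion of test takers answering each item correctly (0/1 scores)."""
    n = len(scores)
    return [sum(row[i] for row in scores) / n for i in range(len(scores[0]))]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))"""
    k = len(scores[0])
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Rows = test takers, columns = items (1 = correct, 0 = incorrect).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(item_difficulty(scores))  # [1.0, 0.75, 0.5, 0.25]
```

Numbers like these feed directly into the decisions Carr describes: an item everyone answers correctly tells developers little, and a low reliability estimate signals that a section needs more or better items before cut scores can be set on it.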

Nathan Carr with teachers in Vietnam

As often happens with Specialists during repeat assignments in a country, a common theme beyond the task at hand emerged for Carr, namely, a growing admiration for Vietnamese food and coffee. “I like spicy dishes and strong coffee, and Vietnam excels in both,” says Carr. His Vietnamese colleagues regularly took him out to restaurants, where he sampled Vietnamese wraps, DIY barbecue, egg coffee, and more. Even during the workday, lunch in the cafeteria often took center stage. If Carr attempted to skip the midday meal, his Vietnamese coworkers would not allow it. “The young lecturers insisted on mothering me, and I would gladly eat whatever ‘light meal’ they placed before me – typically a lunch tray laden with a wide variety of Vietnamese dishes,” he recalls. During those hour-plus lunches, Carr and his coworkers would bond over life’s ordinary moments. “‘How bad is the traffic?’ ‘Do your children do that?’ ‘Why is it so hard to keep electronics charged?’ All those common international threads of conversation,” Carr recollects. “That’s one of the reasons I feel so fortunate to be part of this program – it’s enriching both professionally and personally.”

Christine Coombe – Assessment Across Time Zones

When Dr. Christine Coombe starts talking about language assessment, her commitment to the field is palpable. No surprise, given that testing has been the focus of her career since the first semester of her doctoral program in Foreign and Second Language Education. After she received a perfect score on the midterm of her first-ever assessment course, her professor encouraged her to write her dissertation on the subject. “He told me I had a talent for refining and honing knowledge about assessment,” recalls Coombe. “In short, I grasped it immediately.” Since then, she has amassed an impressive roster of credentials in the field, including awards for academic excellence, leadership roles in professional associations, editorial positions at a wide range of educational journals, and authorship of numerous assessment literacy textbooks and papers. But among this list of accomplishments, her assessment assignments as an English Language Specialist – in Africa, Asia, Europe, and South America – hold a unique place, led by one tour-de-force assessment project, the likes of which Coombe had never before experienced.

Christine Coombe in Russia

From 2003 to 2017, Coombe took part in a massive undertaking in Russia, working with educators across the country to conceptualize and design the English language portion of the country’s first Unified State Exam (known in Russia as the EGE). “I thought they were crazy to attempt this,” says Coombe. “How can a country with a 10-hour time difference between east and west, with all those vastly different regions in between, develop a nationalized test? The United States doesn’t even have one!” But that was the task at hand, and, working in conjunction with the chairperson of Russia’s Federal Commission on the EGE, Coombe tackled the assignment with her usual zeal for assessment, exhilarated by the prospect of working on a project of such scope. “They had this idea, and I said, ‘Let’s get started.’” While the EGE was officially launched in Russia in 2009 – and continues as a work in progress – Coombe returned regularly over the years, seven visits in total, working with educators there to fine-tune the exam, revising, adding, and eliminating items when necessary. “It’s rare in the field of testing to have the opportunity to be involved with such a huge, impactful initiative for that long,” notes Coombe.

Training in Russia

The long journey to the EGE rollout started with her initial assignments in Russia, during which Coombe not only helped participants conceptualize the English portion of the assessment but also addressed the fundamentals, reviewing the cornerstones of good testing and guidelines for writing test items. “Testing is one of those fields where every educator can use a review,” says Coombe. “If assessment developers don’t apply those basic guiding principles, nothing produced will be worth much.” Participants consisted of a core trio of educators from universities and the Ministry of Education, and a rotating group of about 20 test item writers – all teachers trained in assessment – from regions across Russia. “I insisted that those involved be teachers, from experienced to novice,” says Coombe. “After all, they’re the ones in the trenches who know what students – the most important stakeholders in this process – need.”

Teachers in Christine Coombe's assessment workshop

During subsequent visits, it was time to undertake the business of writing the assessment. In its first incarnations, the EGE was to focus on reading, writing, vocabulary, and grammar, a tall order even without the listening and speaking components that were to be added in future years. To accomplish this monumental task, Coombe and her assessment team participated over the next decade in intensive week-long summer institutes held in locations throughout Russia – a mountain retreat in Chelyabinsk, a sanatorium in Kazan, an inn in the country town of Suzdal, a vocational college in Vladivostok – where everyone involved developed a deep sense of camaraderie. “Participants were so excited to be part of creating their country’s first national exam,” says Coombe. “They brought a unique energy and focus to those workshops – I’ve never worked with such motivated individuals.”

During those programs, Coombe typically spent the first few hours of each day exploring various aspects of assessment literacy with participants. For the remainder of the day, they worked in groups to peer review EGE test items, focusing on potential questions to add to the EGE question pool. Coombe recalls that participants were so intent on getting those items right, they’d often reconvene after dinner to work. “But even when they’d take a spa break or a walk in the countryside, they’d still talk about multiple choice questions,” she says. The final session of each institute typically consisted of one-on-one consultations with Coombe, who provided feedback on test items. “My message to them was always, test what you teach and how you teach it! That’s the most important aspect of assessment development. The content of a test should not be a surprise to anyone involved – the focus has to be on transparency.”

Christine Coombe with teachers

Development of the English language assessment for the EGE was not without its frustrations for Coombe. Typically, assessment reform follows curricular changes, but in this situation it happened the other way around, in what Coombe saw as a classic case of the tail wagging the dog. But she is hopeful that this massive change to the assessment process will galvanize Russia to update its English language curriculum and, given the scope of the project, she considers it a minor issue. “The EGE stands as one of the most successful, home-grown national tests out there,” says Coombe. “I’m so proud that I was there to witness such an impressive achievement.”
