Acing the Test
Psychometrician Francis O’Donnell '19 PhD demystifies educational assessments.
Tests are ingrained in every aspect of modern life, whether you are taking a personality quiz on BuzzFeed or answering questions in a job interview.
"I personally feel much safer on the road knowing that everyone has passed their driver’s test," says Francis O'Donnell '19 PhD, a graduate of the Research, Educational Measurement, and Psychometrics Program in the College of Education.
Standardized tests in K–12 and postsecondary education, on the other hand, have a contentious reputation in the public eye. Cases of biased content have exposed systemic educational inequities. "Teaching to the test" has negatively influenced curriculum design. In the past five years alone, the SAT and ACT have been called into question, with colleges and universities adopting "test optional" admissions policies.
The field of psychometrics aims to right the wrongs associated with testing.
"Tests become bad when they are being misused, when their policies aren't culturally responsive, and when they are developed without stakeholder input," says O’Donnell. "But, strictly speaking, tests are just information. We want to know what a student is learning."
O'Donnell is a psychometrician for the National Board of Medical Examiners (NBME) in Philadelphia. She focuses on medical education assessments—in other words, the tests that aspiring doctors take at the end of their courses and clinical rotations. NBME tests are widely used in programs across the United States. One of the main reasons, O'Donnell says, is that it is very difficult to write a good test question.
"Medical students also want to know how they are doing in relation to a national cohort," she says. "We do a lot of analysis to provide that information."
"It’s important to help students see a way forward. Education benefits from good data."
The guiding principle of O'Donnell's work is making test results more meaningful. Students want to know the broader context of their performance, as in the case of national comparisons. Score reports should also provide other key details, O'Donnell says, including an overview of weaknesses and steps to take for improvement.
O'Donnell devotes her time to figuring out how to convey all of that information clearly. Periodically, she conducts focus groups and interviews with students and instructors about their perceptions of NBME assessments. Pairing those findings with rigorous data analysis, she discovers opportunities to cultivate equity. She is also a co-chair of the National Council on Measurement in Education's Mission Fund Committee, which awards grants to researchers working to improve the fairness of assessments.
At UMass Amherst, O'Donnell developed her passion for research. Being accepted to the psychometrics program was a "defining moment" in her life, she said. Working with the likes of Stephen Sireci, Craig Wells, Lisa Keller, and Jennifer Randall was a major motivator.
Her dissertation examined strategies for communicating test results to K–12 students in ways that were both clear and encouraging.
"It was a dream to work on a dissertation that combined psychometrics and linguistics," she says. Growing up in Novo Hamburgo, Brazil, O'Donnell spoke Portuguese as her native language and began learning English as a child. Linguistics remains important to her to this day, as language is one aspect to consider when making test results meaningful to students.
O'Donnell credits the extensive experience she gained at UMass with preparing her for her role at NBME.
"Providing data that helps students thrive and educators do their job—those things are very meaningful to me," she says.