To the Test
The UMass School of Education Center for Educational Assessment conducts research critical to ensuring that Massachusetts Comprehensive Assessment System (MCAS) results and testing statistics are accurate.
However contentious the national conversation surrounding education reform has become since the legislation’s passing, most agree that assessment is a necessary part of charting student progress and identifying problem areas within the educational system. While the debate over how to use the results from these assessments continues, the Center for Educational Assessment remains focused on ensuring tests are fair, equitable, and relevant. To accomplish this, the Center specializes in developing efficient ways to quantify unseen personal attributes—a cross between psychology and measurement referred to as psychometrics. The Center has garnered an international reputation for paving the psychometric path, and UMass Amherst is home to one of the largest psychometrics programs in the country.
“Because we’re seen as people who have some talent in doing research in this area, but are neutral and stand for the promotion of fair testing practices, we get a lot of work,” says Center Co-Director and Professor Stephen Sireci (above: front row, second from left).
The Center’s founding Co-Director and Distinguished University Professor Ronald Hambleton (above: front row, third from left) played a critical role in making the reputable program what it is today—Sireci says that it was Hambleton’s work that drew him to the university in the early 1990s. When Hambleton began his career more than 40 years ago, the “mathematical modeling of psychological phenomena” was just beginning to receive recognition. Since then, Hambleton and Sireci, two leading psychometricians in the field, have provided consultation to countless test-makers and organizations around the globe.
Impacts of the research are as far-reaching as Indonesia and Malawi, yet the work has also made a huge splash at home. Since 2004, Sireci and colleagues have worked with the Massachusetts Department of Education’s Adult Basic Education office to implement computerized adaptive testing for adult learners. The testing software, which the team developed specifically for in-state adult education programs in collaboration with software specialist David Hart and the UMass Amherst Center for Educational Software Development, enables testing across a broad range of students. As adult programs provide instruction to struggling students and English language learners in the same setting, the research team faced the challenge of designing tests that can assess a broad continuum of knowledge and skills. The Massachusetts Adult Proficiency Tests are adaptive tests delivered over the Internet that “adapt” the difficulty of the exam to each student. Sireci explains that flexibility is necessary to gain a true sense of a student’s abilities—language barriers and learning disabilities can be huge variables.
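The core idea behind an adaptive test can be sketched in a few lines of code. The sketch below is purely illustrative and is not the actual algorithm behind the Massachusetts Adult Proficiency Tests: it assumes a hypothetical item bank with difficulty values on an arbitrary scale, picks each question to match the current ability estimate, and nudges that estimate up or down after each response (real adaptive tests use item response theory scoring rather than a fixed step).

```python
# Illustrative sketch of adaptive item selection; not the actual MAPT algorithm.
# Ability and difficulty share an arbitrary logit-like scale.

def next_item(items, ability, administered):
    """Pick the unadministered item whose difficulty is closest to the ability estimate."""
    candidates = [item for item in items if item not in administered]
    return min(candidates, key=lambda item: abs(item["difficulty"] - ability))

def run_adaptive_test(items, answer, n_questions=3, step=0.5):
    """Administer items one at a time, adjusting the ability estimate after each response."""
    ability = 0.0          # start at the middle of the scale
    administered = []
    for _ in range(n_questions):
        item = next_item(items, ability, administered)
        administered.append(item)
        correct = answer(item)
        ability += step if correct else -step   # crude update; real CATs use IRT scoring
    return ability

# Hypothetical five-item bank spanning easy (-1.0) to hard (+1.0).
item_bank = [{"id": k, "difficulty": d}
             for k, d in enumerate([-1.0, -0.5, 0.0, 0.5, 1.0])]

# A hypothetical examinee who answers correctly whenever difficulty is below 0.4.
final_ability = run_adaptive_test(item_bank, lambda item: item["difficulty"] < 0.4)
```

Because each question is chosen near the examinee’s estimated level, the same short test can measure students across the broad continuum the article describes, from struggling learners to advanced ones.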
“There’s always this question of trying to provide access without providing an unfair advantage,” says Sireci.
And with ongoing funding from Measured Progress, the state’s testing contractor, Sireci and the team are conducting research critical to MCAS technical analysis and reporting. Each year, the Center replicates the process of placing new tests on the same scale as the prior year’s tests, a complex statistical process called “equating.” If the Center’s results do not match those of Measured Progress, the state will not release the figures until the discrepancies are resolved. In this system, the Center serves as an objective third party, verifying that the state’s testing statistics are accurate.
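To give a flavor of what equating involves, the sketch below shows linear equating, one of the simplest classical methods: scores from a new test form are mapped onto the old form’s scale by matching the two score distributions’ means and standard deviations. The score samples are invented for illustration, and operational MCAS equating is considerably more involved than this.

```python
import statistics

def linear_equate(x_scores, y_scores):
    """Return a function mapping scores from form X onto form Y's scale
    by matching the means and standard deviations of the two distributions."""
    mx, my = statistics.mean(x_scores), statistics.mean(y_scores)
    sx, sy = statistics.pstdev(x_scores), statistics.pstdev(y_scores)
    return lambda x: my + (sy / sx) * (x - mx)

# Hypothetical score samples: last year's form (Y) and this year's form (X),
# where this year's form happens to run 5 points harder across the board.
last_year = [40, 50, 60, 70, 80]
this_year = [35, 45, 55, 65, 75]

to_last_years_scale = linear_equate(this_year, last_year)
```

With these toy numbers the spreads match, so the mapping simply adds five points; two independent teams running the same procedure on the same data should land on the same line, which is exactly the check the Center performs against Measured Progress.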
The MCAS recently began to incorporate a “student growth percentile” into its reports, an inclusion that prompted Sireci, a long-time MCAS supporter, to voice public apprehension. He says that there have not been any validity studies on these statistics and that he has already found enough inconsistency to warrant removing the data from the report. The data have emerged amid collective-bargaining controversy, and Sireci worries that the measurement of student growth may be both unfounded and unfairly attributed to teachers.
“There’s some great opportunity for better work in this area and we’d like to be part of that,” says Sireci.
Amanda Drane '12