Karen Addesso received her B.S. in Mathematics at Southern Connecticut State University and her M.Ed. in Educational Policy Studies at UMass - Amherst. She became a full-time REMP student in the fall of 2004. Prior to enrolling at UMass - Amherst, Karen worked as a Research Associate for the Perinatal Epidemiology Unit at the Yale School of Medicine for ten years, managing several research assistants on different projects related to maternal and infant health. Before working at Yale, she was employed as a high school mathematics teacher for four years. Karen's current professional interests include measuring academic growth, measurement issues in implementing the No Child Left Behind Act, and equity issues in K-12 large-scale assessments. Her career goal is to work at a state or federal department of education.
Nina Deng received her B.A. in English from Shanghai International Studies University (SISU), China, in 2003, and her M.A. with a focus on English teaching and testing from SISU in 2006. Prior to joining REMP, she taught part-time and organized and rated communicative language tests in schools in Shanghai. Her master's thesis examined the effects of oral task types on the validity of oral tests. During her first year in REMP, in 2006, she was involved in the IRT analysis of the Massachusetts Comprehensive Assessment System (MCAS) science tests and in an equating project. She is now conducting a dimensionality analysis study in the context of multi-stage testing. Her interests include item response theory and its various applications in large-scale testing.
Dean Goodman received an M.A. in Educational Psychology from the University of Victoria and a B.A. in Psychology and Education from Simon Fraser University. Prior to enrolling in the program in 2001, Dean worked for the British Columbia Ministry of Education, where he was responsible for provincial, national, and international assessments and for developing assessment-related resources. He has worked on projects related to test equating, differential item functioning, and score reporting, and recently drafted a plan of action for a national assessment program. His interests include applications of item response theory, methods for communicating assessment results, and issues related to the implementation of large-scale assessment programs.
Stephen Jirka received a B.S. in Psychology from Texas A&M University and an M.A. in Psychology from St. Mary's University. He was then employed in the psychometrics department of Harcourt Educational Measurement in San Antonio, TX for over six years, leaving in 2003 to begin his doctoral studies at the University of Massachusetts at Amherst. While at HEM, he participated in a variety of projects dealing with large-scale assessment and data analysis. His current research interests include external validation of test scores and standard setting methods.
Leah Kaira received a Bachelor's degree in Education from the University of Malawi in 1995. Leah then taught biology and physical science in secondary school for six years before returning to the University of Malawi for an honors degree in education with a focus on media and technology. In 2002, she obtained her M.Ed. (testing and measurement) from the University of Massachusetts Amherst, and in 2003 joined the Malawi National Examinations Board as a Research and Test Development Officer. In this position, Leah's duties included training item writers for all national examinations, moderating tests, selecting multiple-choice items for possible inclusion in national examinations, and conducting aptitude tests for clients, among others. Leah is interested in large-scale assessment issues as well as teacher-made tests, validity, and equating.
Ana Karantonis received her B.A. in African American Studies at Yale University and her M.Ed. in Educational Research, Measurement and Evaluation at Boston College. Prior to joining REMP, she worked as a Project Manager and Research Analyst at Systemic Research, Inc., where she designed and administered achievement tests and attitude surveys, coordinated data collection from schools and universities, conducted workshops for data managers and project directors, and wrote technical and educational reports. Her interests include policy implications of large-scale assessments and equity issues in educational testing.
Heather Klesch received her B.A. in English from Florida State University in 1997. In 1998 she was employed by National Evaluation Systems (now the Evaluation Systems group of Pearson) in Amherst, MA, where she is presently a Senior Area Director. In 2004 she enrolled as a doctoral student with REMP at the University of Massachusetts at Amherst, and she expects to complete her doctorate in 2009. Her current interests include computer adaptive testing, standard setting methods (especially the bookmark method), trends and issues in score reporting, and the effects of the No Child Left Behind Act on state testing programs and policies.
Wendy Lam received a B.Sc. in Statistics from the University of Calgary in 2002. Prior to joining REMP, she worked as a Statistical Analyst in the psychometrics department at Harcourt Educational Measurement for almost four years. She has experience with various types of data analysis and has participated in technical meetings for large-scale assessment programs. Her current interests include applications of item response theory both within and outside of large-scale educational testing, standard setting methods, diagnostic score reporting, vertical scaling, and response time models.
Tie Liang began her coursework in the Research and Evaluation Methods Program at the University of Massachusetts at Amherst in the fall of 2005. She completed internships at Harcourt Assessment in 2006, at the Center for Educational Assessment in 2007, and at ACT in 2008. After three years of study, she has established her research interest in IRT model fit. She has also conducted projects on computerized adaptive testing, differential item functioning, and test equating.
Yu Meng received a B.A. in English from North China Institute of Technology (NCIT) and an M.A. in Educational Research Methods from the University of York (U.K.). During his first year in REMP, he was involved in projects on College Board CAT algorithm development and the Massachusetts Adult Proficiency Tests (MAPT). He conducts quantitative data analysis for the Adult Transitions Longitudinal Study (ATLAS) project. His current interests include computer-based testing, large-scale assessments, and equity issues in educational testing.
Tim O'Neil earned a B.A. in psychology (1995) from the University of Connecticut and an M.Ed. in psychometric methods (2000) from the University of Massachusetts, Amherst. His background is in behavioral research on stress and coping behavior pertaining to alcohol use and abuse. As a REMP student, his interests have come to include test score equating, diagnostic score reporting, applied validity theory, and the application of item response theory both within and outside of large-scale educational testing. Currently he is working at Pearson Educational Measurement. Having finished his doctoral coursework, he plans to complete his dissertation over the next year.
Polly Parker entered the University of Massachusetts Research and Evaluation Methods Program in the fall of 2005 while working on her M.Ed. in Educational Policy. Before entering the program, Polly worked as an educational consultant for several local agencies and schools. She also consulted for the Commonwealth Corporation's Diploma Plus program as an MCAS teacher trainer and for the Western Massachusetts DYS program as a focus group facilitator. Polly was trained to function as an organizational change agent by CS2 (Communities and Schools for Success), and for the past four years has served at a vocational high school as a Curriculum Director and CS2 Entrepreneur. Prior to this, Polly worked for the Corporation for Public Management (CPM), an affiliate of Partners For Community (PFC), as a program director assisting socio-economically disadvantaged people. She graduated cum laude from the University of Massachusetts in 1995 with a B.F.A. in Painting.
Jeffrey Patton earned a B.S. in Psychology (2006) from the University of Illinois at Urbana-Champaign, where he became interested in educational and psychological measurement and psychometrics. He joined REMP in the fall of 2007, and is also working toward an M.S. degree in statistics. His current interests include diagnostic testing, performance assessment, differential item functioning, and applications of item response theory.
Christine Shea received a B.A. in Psychology and Education from Mount Holyoke College and an M.Ed. in Administration, Planning and Social Policy at the Harvard Graduate School of Education. Prior to joining REMP, she worked for five years as a Research and Evaluation Manager at the University of Massachusetts Donahue Institute, where she conducted applied research and evaluations of educational and human service programs. Before that, she was a public school principal and teacher for ten years. Currently, her interests include equity issues that arise when measuring student achievement and school success, and validity theory and its application to accountability policy at the local, state, and federal levels.
Zach Smith received a B.S. in Psychology with a minor in Mathematics from the University of Massachusetts Amherst in 2005. In the fall of 2006, he continued his studies at UMass by enrolling in the Research and Evaluation Methods Program. He plans to further his knowledge in the field of education by working toward an M.Ed. in Policy Studies while on track to receive an Ed.D. in 2009. His current interests include applications of item response theory, standard setting methods, and the effects of assessment on policy.
Tia Sukin received her M.A. in Measurement, Statistics, & Evaluation from the University of Maryland (UMD) - College Park in 2004. While at UMD, she taught a course in classroom assessment and, at the Center for the Study of Assessment Validity and Evaluation, worked on several projects related to the assessment of English language learners. Prior to beginning her doctoral studies with REMP, she worked for the Charleston County School District in South Carolina as a Program Evaluator for Special Programs, where she coordinated the administration of testing programs, trained school test coordinators on standardized testing practices and security, and conducted program evaluations for the Early Childhood Education, English for Speakers of Other Languages, and Gifted and Talented departments. Her current interests include equating methods, applications of IRT, and accessibility of score reports.
Hanwook (Henry) Yoo earned a B.A. in English from Korea University and studied industrial and organizational psychology at Sungkyunkwan University in Korea, where he gained a background in statistical and psychometric methods. Prior to joining REMP, he worked as a research assistant for the Korea Institute of Curriculum and Evaluation (KICE). He is now especially eager to focus on item response theory and its applications to topics including computer-based testing, differential item functioning, and large-scale assessments in education.