James Madison University

Assessment and Measurement Students Complete Summer Internships with Pearson Education, Inc.

PHOTO: JMU students

James Koepfler


The Pearson summer internship has been enriching and deeply rewarding. I was privileged to work with a fantastic group of research scientists and research associates in the PRS office in Tulsa, Oklahoma. I must especially thank Dr. Stephen Murphy, Dr. Josh Goodman, and Dr. Shelley Ragland, who provided me with an invaluable experience and treated me as one of their own colleagues.

I had the unique opportunity to spend the first two weeks of the internship in Oklahoma City, supporting standard setting for the OK Core Curriculum Tests, the OK Modified Alternative Assessment Program, and the OK Alternate Assessment Program (OAAP). I shadowed PRS facilitators and worked as a data analyst for the OAAP standard setting. Additionally, I had the opportunity to meet and work closely with PRS employees from each of the PRS sites, as well as the OK program team, the OK content team, and several employees from the OK SDE. This was a great experience: I got to see the larger process involved in conducting standard settings at the state level and expanded my knowledge of the bookmark and body-of-work standard setting methods.

The remaining six weeks were spent in the PRS Tulsa office, where my primary project was to learn Oklahoma's test construction process and ultimately transition test construction from the TC tool to Pearson's Item Tracker/Test Builder (ITTB) system. For those not familiar with ITTB, it is a configurable, web- and local-client-based service that integrates item bank management, test construction, and test map services. This project gave me a deeper knowledge of test construction and of the many factors that must be considered when building a test, and it allowed me to become proficient with the ins and outs of ITTB. The final product was a step-by-step guide to configuring and prepping ITTB for test construction. I believe this guide can benefit other programs that may be considering moving test construction into the ITTB environment (I highly recommend it).

Overall, through this internship I've expanded my practical and technical knowledge in the areas of equating, SAS programming, test construction, standard setting, and the logistics of large-scale testing. My favorite aspect of the internship was the day-to-day problem solving that comes with working in a large-scale testing environment, along with the opportunity to work with and learn from a variety of professionals who have different areas of expertise. I've thoroughly enjoyed the past two months working at Pearson and look forward to continued collaboration.

Anna Zilberberg


As I look back at the past eight weeks spent at Pearson's Austin office, a single word describing my internship keeps surfacing in my mind: diversity. Through Pearson's Psychometric Fellowship, I've had the privilege to experience the full spectrum of professional life at an applied testing company. My activities ranged from working alongside others in day-to-day operations to attending staff meetings to conducting independent research, and they were always varied, challenging, and rewarding.

Over the last two months I have been involved in several projects related to assessments designed for students in special education. Through these projects, I learned about a new form of longitudinal growth analysis (the transition table approach) and practiced large-scale data management and programming in SAS. In addition to being an educational experience, working on these projects gave me a sense of having contributed to the work underway at Pearson. Involvement in other projects, such as verifying item-level statistics and conducting a participation analysis, allowed me to explore the item bank database and follow the steps outlined in the assessments' technical specifications.

Not only did I get to dabble in the operational work at Pearson, I was also fortunate enough to participate in a research project driven by practical testing concerns. Under the guidance of my mentor, Ha Phan, and with the support of Leslie Keng and Jadie Kong, I explored an area of research unfamiliar to me prior to this summer: the Mantel-Haenszel technique for detecting differential item functioning (DIF). Together, we designed a simulation study investigating the effects of matching type and sample size on DIF detection. Reviewing prior research on the effects of small sample size was especially interesting given the recent increase in racial diversity in Texas and the need to include groups of mixed ethnicities in the analysis. I hope to continue to collaborate on this project with the members of the research team formed this summer.
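For readers unfamiliar with the Mantel-Haenszel technique mentioned above: it compares reference- and focal-group performance on an item within strata of examinees matched on total score, pooling the 2x2 tables into a common odds ratio. The sketch below is a minimal illustration of that calculation (hypothetical helper names, not Pearson's code or the study's actual analysis):

```python
import math

def mh_odds_ratio(strata):
    """Mantel-Haenszel common odds ratio across matched score strata.

    Each stratum is a 2x2 table given as a tuple (a, b, c, d):
      a = reference-group examinees answering the item correctly
      b = reference-group examinees answering incorrectly
      c = focal-group examinees answering correctly
      d = focal-group examinees answering incorrectly
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

def mh_delta(alpha):
    """Transform the odds ratio to the ETS delta scale.

    A value of 0 indicates no DIF; larger absolute values indicate
    greater DIF against one group.
    """
    return -2.35 * math.log(alpha)

# Two score strata; identical odds in both groups would give alpha = 1.
tables = [(40, 10, 35, 15), (20, 20, 18, 22)]
alpha = mh_odds_ratio(tables)
delta = mh_delta(alpha)
```

Small focal-group samples make these per-stratum tables sparse, which is exactly why sample size matters for the stability of the pooled estimate, as the simulation study described above investigates.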

Acquiring the hands-on experience missing from my graduate studies never meant sacrificing academic rigor. On the contrary, weekly webinars presented by Pearson's psychometricians introduced interns to an array of topics integral to test development, including overviews of growth analyses, comparability studies, equating, and building SAS macros. These are valuable new tools that I will surely use in the future.

Through more informal interactions with my colleagues, I learned about the hot-button issues that are sure to occupy many academics and applied testing practitioners in the next few years, such as policy issues surrounding common core assessments, value-added models, teacher effectiveness, and growth analyses. I'm especially grateful to my mentor, Ha Phan, who always made time for me despite her busy schedule and looming deadlines. Her work ethic, dedication, and patience are unsurpassed. I hope to have made lasting friendships and professional connections with many at Pearson; the internship would not have been as rich or as edifying were it not for everyone's support. The summer at Pearson left me striving to be brave, imaginative, and decent, and to always keep learning.