ETS Internship, Fellowship and Visiting Scholar Programs in Research

Collaborate with ETS researchers to carry out innovative and impactful R&D projects.

Learn more about available internships and how to apply.

2022 Summer Research and Measurement Sciences (RMS) Internship Program for Graduate Students

The application period for the Summer 2022 internship has closed.

If you are a creative and innovative individual who wants to help shape the future of learning and assessment, we encourage you to apply for the 2022 Summer Research and Measurement Sciences (RMS) Internship program.

RMS conducts rigorous foundational and applied research on the most critical issues facing education and the workforce. Central to ETS’s legacy of global leadership in learning and assessment, RMS is dedicated to advancing the science and practice of measurement, driving innovation in digital assessment, learning and teaching, and advancing equity of opportunity for all learners.

As an intern in RMS, you’ll work with experts who are nationally and internationally known as thought leaders, trusted advisors and go-to collaborators for their high-impact work addressing significant educational and societal goals.

If you’re accepted into the program, you’ll collaborate with scientists on projects related to these topics and will participate in data analysis, writing and other research tasks. Upon the completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Tasks

You’ll perform any number of tasks associated with a research project, such as:

  • conducting a literature review
  • collecting data
  • running various analyses
  • preparing a conference proposal
  • drafting a research report and documenting the study

If you’re matched to a project involving specialized procedures or software, you may also learn the relevant technology.

Doctoral students who have completed at least 2 years in one of these areas or a related field are encouraged to apply.

  • psychology
  • education
  • psychometrics
  • measurement
  • statistics
  • cognitive or learning sciences
  • data science

Eligibility requirements

  • Current full-time enrollment in a relevant doctoral program
  • Completion of at least 2 years of coursework toward the doctorate prior to the program start date

Selection

The main criteria for selection will be scholarship and the match of applicant interests and experience with the research projects.

We value team members who bring a diversity of interests and lived experiences to RMS. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

Complete the electronic application form. On the application form:

  1. Choose up to two research areas in which you are interested and provide written statements about your interest in the area(s) of research and how your experience aligns with the project.
  2. Attach a copy of your curriculum vitae (preferably as a PDF).
  3. Attach a copy of your graduate transcripts (unofficial copies are acceptable).
  4. Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. 
    • Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the area of interest, or an individual with whom you have worked on a closely aligned project.
    • ETS will accept only two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2022.

Dates and location

  • Deadline: The application period is currently closed.
  • Decisions: You’ll be notified of selection decisions by March 31, 2022.
  • Duration: 8 weeks: June 6, 2022–July 29, 2022
  • Location: On campus or remote

Compensation

  • $8,000
  • For interns participating on-campus:
    • Transportation allowance for relocating to and from the Princeton area
    • Housing will be provided for interns living more than 50 miles from campus

Examples of projects interns have worked on in recent years include:

  • Joint Modeling of NAEP Process Data and Response Data. This project involved developing methods to study the relationships between a test taker's sequences of actions on interactive NAEP Science items and their performance on the assessment. The study investigated the use of different modeling strategies including Markov chain models and clustering methods to label test takers as "efficient" or "inefficient" in terms of their sequence of actions and examined how this efficiency was related to performance.
  • Methods for Evaluating Fairness of Machine Scores. Artificial intelligence (AI) and machine learning algorithms have been found to produce biased predictions in some settings. It is important to make sure scoring procedures, such as automated or machine scoring algorithms, are fair and do not produce such biased predictions. This project examined methods based on confirmatory factor analysis and structural equation modeling to investigate whether the relationships among predictors and item scores were invariant across demographic subgroups in order to ensure the scores we produce are fair.
  • Holistic Admissions in Graduate School Applications and Impact on Admissions Outcomes. In this project, we examined how graduate programs approach holistic admissions, considering a range of evaluation criteria when determining applicants' fit for their programs. We examined different approaches to holistic admissions and documented the training that staff received to apply holistic admissions. We also researched outcomes of holistic admissions in terms of diversity and academic rigor of the admitted class. This is an ongoing project.
  • Evaluating Measurement Invariance of Translated Tests Across Language Groups. The intern project investigated whether two (or more) language versions of the same test affect item-level and test-level performance, whether score equating is necessary, and how equating should be conducted. We conducted multiple group item response theory (MIRT) concurrent calibrations to select common items that are statistically invariant to serve as an internal anchor between the different language forms (e.g., an English form and a Spanish form). We found that text-heavy subjects (e.g., reading) are more affected by translation in terms of form equivalency; for math, the two language forms were very similar.
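To make the first project above concrete: a first-order Markov chain model of process data starts by estimating transition probabilities between actions from test takers' logged action sequences. The sketch below, with hypothetical action labels (the actual NAEP action codes and modeling pipeline are not described here), shows the core estimation step:

```python
from collections import defaultdict

def transition_matrix(sequences):
    """Estimate first-order Markov transition probabilities
    from a list of action sequences (lists of action labels)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        # Count each observed transition from action a to action b.
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize counts into conditional probabilities P(b | a).
    probs = {}
    for a, outs in counts.items():
        total = sum(outs.values())
        probs[a] = {b: n / total for b, n in outs.items()}
    return probs

# Hypothetical action logs from two test takers on an interactive item.
seqs = [
    ["open_item", "run_sim", "answer"],
    ["open_item", "run_sim", "run_sim", "answer"],
]
P = transition_matrix(seqs)
# P["run_sim"]["answer"] = 2/3: two of the three observed actions
# following "run_sim" were "answer".
```

Sequence features derived from such a matrix (e.g., likelihoods of each test taker's path) could then feed the clustering step that labels action sequences as "efficient" or "inefficient."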

Contact