PARCC states are working closely with two sets of contractors to develop the items and tasks for the PARCC mid-year, performance-based, and end-of-year assessments.
Test development has two phases:
Item and Text Reviews
State experts, local educators, postsecondary faculty, and community members from across the PARCC states are conducting rigorous reviews of every item and passage being developed for the PARCC assessment system. These reviews ensure that all test items are of the highest quality, aligned to the Common Core State Standards, and fair for all student populations. PARCC's process allows for an unprecedented collaboration among states and their K-12 and higher education communities, providing state-led quality assurance and oversight of the tests' development. All PARCC item reviewers are nominated by their state education agency.
The purpose of the PARCC educator committee meetings is to gather feedback on the quality, accuracy, alignment, and appropriateness of the test items developed annually for the PARCC ELA/L and mathematics assessments. These committees are composed of educators, university professors, and other community members selected by PARCC states. Below is a list of the PARCC educator committees that review test items for appropriateness for inclusion on the PARCC assessments.
State Text Review Committee
Participants will independently review and edit the passages through an electronic display, particularly for multimedia passages; the grade-level group will then discuss content and bias concerns.
State Content Item Review Committee
During the State Content reviews, the committees will review and edit test items for adherence to the PARCC foundational documents, basic Universal Design principles, the PARCC Accessibility Guidelines, selected metadata fields, and the PARCC Style Guide.
State Bias and Sensitivity Item Committee
Educators and community members will be asked to review items and tasks to confirm the absence of bias or sensitivity issues that would interfere with a student's ability to demonstrate his or her best performance. The objective is to provide items and tasks that do not unfairly advantage or disadvantage one student or group over another. Once items are approved by the State Content Item Review Committee, they will be prepared for external bias and sensitivity review.
Editorial Review Committee
Prior to each editorial review meeting, Pearson will work with the Partnership Manager to select up to 10 percent of the items and tasks for this review. The PARCC editorial review committee participants will conduct their review in Pearson's item bank system. As with the other reviews, the committee members will view the items as the student would and will be able to vote and record their comments in the system. However, this is a copy-edit review, not a content review.
Data Review Committee
Educators will be asked to participate in the Data Review Meeting to evaluate item-level statistics from field-tested items. Participants decide whether each item should move forward to the operational assessments or be revised and field-tested again.
Test Construction Committee
Educators and bias committee members will be asked to participate in the Test Construction Meeting to build operational core forms that meet the PARCC assessment blueprints for the PBA and EOY components of the summative assessment, scheduled to be administered during the school year.