ERIC Identifier: ED328603 
Publication Date: 1990-12-00 
Author: Mullis, Ina V. S. 
Source: ERIC Clearinghouse on Tests, Measurement, and Evaluation, Washington, DC; American Institutes for Research, Washington, DC. 

The National Assessment of Educational Progress (NAEP). ERIC Digest. 

Since 1969, the National Assessment of Educational Progress (NAEP) has been assessing what public and private school students know and can do in a variety of curriculum areas, including mathematics, reading, science, writing, U.S. history, and geography. In 1988, Congress added a new dimension to NAEP by authorizing, on a trial basis, voluntary participation in state-level assessments in 1990 and 1992. With the President's Summit on Education, the resultant education goals, and the addition of the state assessment program, NAEP is playing an increasingly visible role in measuring student achievement. 

This digest describes how NAEP is organized and what is included in a typical NAEP assessment. It also looks at how NAEP selects students for its assessments and how the results from an assessment are used. 

HOW IS NAEP ORGANIZED AND MANAGED?

NAEP is a congressionally mandated project of the National Center for Education Statistics (NCES), U.S. Department of Education. NCES carries out the NAEP project through competitive awards to qualified organizations. NCES awarded the operational contract for conducting the 1990 and 1992 assessments to Educational Testing Service (ETS), which is responsible for printing, open-ended scoring, and scanning, and to its subcontractor, Westat, Inc., which is responsible for data collection. In addition to coordinating operational activities, ETS develops the assessment instruments, analyzes the assessment results, and works with NCES staff to prepare the reports on student achievement. 

The National Assessment Governing Board (NAGB) formulates policy guidelines for NAEP. NAGB's composition is specified by law, and its 24 members include teachers, curriculum specialists, state legislators, governors, measurement experts, chief state school officers, state and local school board members, school superintendents, principals, and representatives from business and the general public. 

NAGB selects the subject areas to be assessed, in addition to those specified by Congress; develops assessment objectives and specifications; ensures that all NAEP items are free from racial, gender, or cultural bias; and identifies appropriate achievement goals for students. NAGB also awards contracts for technical advice. For example, NAGB awarded a contract to the Council of Chief State School Officers (CCSSO) to manage the consensus process to develop NAEP's 1992 reading objectives and item specifications. 

WHAT IS COVERED IN A TYPICAL NAEP ASSESSMENT?

The NAEP objectives underlying each assessment typically take the form of frameworks or matrices delineating the important content and process areas to be assessed. For example: 

o The mathematics framework is a five-by-three matrix specifying five content areas--Numbers and Operations; Measurement; Geometry; Data Analysis, Statistics, and Probability; and Algebra and Functions--and three process or ability areas--conceptual understanding, procedural knowledge, and problem solving. 

o The reading framework includes reading for three primary purposes--for literary experience, for information, and for performing a task. The process dimension includes fluency, constructing meaning (forming an initial understanding of the text and developing an interpretation of it), and elaborating and responding critically (reflecting on and responding to the text as well as demonstrating a critical stance). 

As part of the legislatively mandated consensus process used to regularly update NAEP objectives and item specifications, the contractors hired by NAEP indicate the percentages of assessment items that should be devoted to measuring various aspects of the frameworks. For more recent assessments, recommendations have emphasized measuring higher-order skills and understandings. 

The assessment instruments, which are generally administered in group settings, include a variety of multiple-choice and open-ended items. Here is a sample of the tasks the students perform: 

o During the mathematics assessment, students are allowed to use calculators and protractors or rulers, and the instrument asks for open-ended responses to complex problems. 

o The reading assessment will break new ground by presenting longer, naturally occurring passages and by increasing the proportion of open-ended questions to nearly half of the assessment. Using the Integrated Reading Performance Record, NAEP will also assess fluency in oral reading and will conduct a portfolio study based on interviews with individual fourth-grade students. 

o Based entirely on student writing samples, the writing assessment includes a variety of prompts addressing different purposes for writing. Students' responses are evaluated for task accomplishment, overall fluency, and mechanical correctness. 

o The science assessment includes a variety of open-ended questions, some of which ask students to describe their conceptions of scientific inquiry and to draw conclusions about scientific phenomena and events. 

HOW ARE STUDENTS SELECTED FOR PARTICIPATION IN NAEP?

In 1990, approximately 87,000 students participated in the national assessment, and another 100,000 participated in the state assessments of eighth-grade mathematics. Considering the planned expansion of the state assessment program, NAEP anticipates that the 1992 assessments will involve approximately 419,000 students in 12,000 schools. 

The sampling procedures for the national and state assessments differ in the following respects: 

o For the national assessment, NAEP uses a four-stage sampling design: (1) primary sampling units are identified; (2) schools are enumerated within the primary sampling units and randomly selected; (3) students are randomly selected from those schools; (4) those students are assigned to assessments in different subject areas. 

o For the state assessments, the schools in each state are enumerated, stratified, and randomly selected; then students are listed, randomly selected, and assigned to assessment sessions (see the sketch following this list). 
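
To make the stage structure concrete, here is a minimal Python sketch of both designs. The sampling frame, stage counts, and names are hypothetical, and the sketch omits the stratification, probability-proportional-to-size selection, and sampling weights used in the actual NAEP designs.

    import random

    random.seed(42)  # fixed seed so the illustration is reproducible

    # Hypothetical frame: primary sampling units (PSUs) -> schools -> students.
    frame = {
        f"psu{i}": {
            f"psu{i}_school{j}": [f"psu{i}_school{j}_student{k}" for k in range(30)]
            for j in range(5)
        }
        for i in range(20)
    }
    subjects = ["mathematics", "reading", "science"]

    # National design: four stages.
    national = {}
    for psu in random.sample(list(frame), 4):                      # (1) select PSUs
        for school in random.sample(list(frame[psu]), 2):          # (2) select schools
            for student in random.sample(frame[psu][school], 10):  # (3) select students
                national[student] = random.choice(subjects)        # (4) assign a subject

    # State design: no PSU stage; all schools in the state are enumerated
    # directly (stratification is omitted here), then students are listed,
    # selected, and assigned to assessment sessions.
    all_schools = {school: students
                   for psu in frame.values()
                   for school, students in psu.items()}
    state = {}
    for school in random.sample(list(all_schools), 6):
        for student in random.sample(all_schools[school], 10):
            state[student] = "mathematics"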

HOW DOES NAEP REDUCE THE BURDEN ON PARTICIPATING SCHOOLS AND STUDENTS?

All NAEP data are collected by trained administrators. For the national assessments, Westat, Inc. trains its own field staff to collect the data, thus reducing the burden on participating schools. However, according to the NAEP legislation, each participating state must collect data for the trial state assessments. Westat achieves uniformity of procedures across states through training and quality-control monitoring. 

NAEP uses matrix sampling to reduce the burden for participating students. In matrix sampling, the total pool of assessment questions is divided, and portions are given to different but equivalent samples of students. Thus, not all students are asked to answer all questions. This system provides broad coverage of each curriculum area being assessed, while each student invests only about an hour in the assessment. 
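
The following Python sketch illustrates the basic mechanics of matrix sampling, assuming a hypothetical pool of 60 questions divided into six booklets of 10 items each; operational NAEP booklet designs (such as balanced incomplete block spiraling) are considerably more elaborate.

    import random

    random.seed(0)  # fixed seed so the illustration is reproducible

    questions = [f"q{n}" for n in range(60)]        # hypothetical item pool
    students = [f"student{n}" for n in range(300)]  # hypothetical student sample

    # Divide the pool into booklets short enough to answer in about an hour.
    booklet_size = 10
    booklets = [questions[i:i + booklet_size]
                for i in range(0, len(questions), booklet_size)]

    # Spiral the booklets across a shuffled student list so each booklet
    # reaches an equivalent random sample of roughly equal size.
    random.shuffle(students)
    assignment = {s: booklets[i % len(booklets)] for i, s in enumerate(students)}

    # No student answers every question, but each question is answered by
    # about len(students) / len(booklets) students, so the full pool, and
    # thus the full curriculum area, is covered.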

HOW ARE NAEP RESULTS ANALYZED AND REPORTED?

NAEP results are known as The Nation's Report Card. They are published in a series of widely disseminated reports that summarize achievement across items and describe relationships between achievement and a variety of background characteristics. In addition, NAEP provides information about the percentage of students who give acceptable responses to each item. 

NAEP objectives, reports, technical documentation, and complete publications lists are available from Educational Testing Service, P.O. Box 6710, Princeton, NJ 08541. 

ADDITIONAL READING

Johnson, Eugene G., and Rebecca Zwick. The NAEP 1988 Technical Report. Princeton, NJ: Educational Testing Service, National Assessment of Educational Progress, 1990. 

Mullis, Ina V. S. The NAEP Guide: A Description of the Content and Methods of the 1990 and 1992 Assessments. Princeton, NJ: Educational Testing Service, National Assessment of Educational Progress, 1990. 

