ERIC Identifier: ED458289
Publication Date: 2001-11-00
Author: Stansfield, Charles - Rivera, Charlene
ERIC Clearinghouse on Assessment and Evaluation College Park MD.
Test Accommodations for LEP Students. ERIC Digest.
It is widely believed that school achievement will improve if education
systems identify what is to be learned, then assess student mastery of that
material to determine the effectiveness of instruction. In recent years, there
has been much discussion about how best to assess the school achievement of
students with limited English proficiency (LEP), also known as English language
learners (ELLs). Those charged with setting inclusion and accommodation policies
for state assessment programs face two problems: the lack of research on the
effects of accommodations generally, and the lack of research on how specific
accommodations address the linguistic needs of ELLs.
This Digest presents an overview of LEP student accommodation and inclusion
practices on statewide assessments, with special emphasis on the accommodation
known as linguistic simplification.
LEP PARTICIPATION RATES IN STATE ASSESSMENTS
The participation of LEP students in statewide testing programs over the last
decade has been uneven. In the mid-1990s, 44 of the 48 states with state
assessment programs in place permitted ELLs to be excused from one or more
state assessments. In 27 of
place permitted ELLs to be excused from one or more state assessments. In 27 of
the 44 states, ELLs as a group were routinely exempted from participation in the
state assessment program (Rivera and Vincent, 1997).
When the federal Elementary and Secondary Education Act was re-authorized in
1994 as the Improving America's Schools Act (IASA), it mandated the annual
testing of
LEP students in Title I programs and required that states create final
assessment systems that are inclusive of all students by the 2001-02 school
year. A study of state inclusion and accommodation policies for ELLs in the
1998-99 school year showed states were generally trying out various
accommodations for ELLs. However, most states appeared to be using
accommodations designed for students with disabilities rather than
accommodations designed with the linguistic needs of ELLs in mind (Rivera,
Stansfield, Scialdone, and Sharkey, 2000).
APPROPRIATE ACCOMMODATIONS FOR LEP STUDENTS ON STATE ASSESSMENTS
Appropriate test accommodations level the playing field and help
ensure the validity of the test for all students by eliminating irrelevant
obstacles that affect test performance and test scores. Yet accommodations
should not give a demonstrable advantage to students who receive them over
students who do not. Some accommodations for LEP students are:
1. Offering extra time,
2. Providing bilingual dictionaries and glossaries, and
3. Allowing the teacher to clarify the meaning of words on the test (when
they do not relate to the content being tested).
Some accommodations can be problematic. A glossary plus extra time was found
to raise performance for both LEP and non-LEP students, which raises concerns
about validity (Abedi, Lord, Hofstetter, & Baker, 2000). Access to English
dictionaries or native language dictionaries can unfairly advantage LEP students
by giving them access to content-related terms. A customized dictionary that
does not contain words that assist students with test content appears to be a
promising accommodation (Abedi, 2001).
Some accommodations address environmental conditions that help students feel
more comfortable, such as allowing the student to take the test in a familiar
setting, with a familiar teacher, alongside other students receiving similar
accommodations, or permitting a flexible schedule that includes shorter test
sessions or more breaks. Administrative accommodations can include allowing the
teacher to read directions aloud, repeating directions, and simplifying or
clarifying directions (Rivera & Stansfield, 1998).
Several professional groups within the education and measurement communities
have issued recent calls for research to identify appropriate, valid, and
reliable accommodations for ELLs, including the American Educational Research
Association, American Psychological Association, and National Council on
Measurement in Education (1999), the American Educational Research Association
(2000), Teachers of English to Speakers of Other Languages (2000), and the U.S.
Department of Education's Office for Civil Rights (2000). Although research on
accommodations for ELLs has begun to be reported at conferences and to appear
in the literature, studies involving accommodations seldom involve an
experimental research design, making it difficult to determine the effects of
accommodations on reliability, validity, and score comparability.
MEASURING THE EFFECTS OF LINGUISTIC SIMPLIFICATION
In evaluating the effectiveness of an accommodation, there are two issues to be
determined. First, there is a need to understand whether the accommodation
provides an unfair advantage to an examinee who receives it but does not need
it. For example, would an English-speaking student who took a test in which the
language had been simplified to aid comprehension get an improved score?
Second, if there is no such advantage, there is a need to understand whether
the accommodation actually improves the performance of those who have special
needs, for example, the English language learner. Would
an LEP student demonstrate improved performance on a test that had been
linguistically simplified over a test with standard wording?
One way to determine if an accommodation offers an unfair advantage, or
whether it meaningfully assists students with special needs, is through an
experimental design whereby students with and without the necessary condition
are randomly assigned to treatments, with some students receiving the treatment
and others not getting it. Two recent experimental studies that have explored
the effects of linguistic simplification as an accommodation illustrate the
complexity of the issue. One study (Abedi, Lord, & Hofstetter, 1998)
involved mathematics items used in the National Assessment of Educational
Progress (NAEP). In the study, test booklets containing either a Spanish
version, a simplified English version, or original NAEP math items (in
un-simplified English) were randomly administered to 1,400 LEP and non-LEP
eighth-graders in southern California middle schools. Only Hispanic students
received the Spanish version. The simplified items were rewritten by content
experts in linguistics and mathematics at the National Center for Research on
Evaluation, Standards, and Student Testing. The analyses indicated that both LEP
and non-LEP students (that is, fully English proficient students) performed best
on the simplified version, and worst on the Spanish version. While LEP and
non-LEP students performed significantly better on the simplified items,
significant differences in item difficulty were obtained on only 34 percent of
the simplified items, leading the researchers to suggest that linguistic
clarification of math items might be beneficial to all students. They also noted
that other factors, such as length of time in the United States, English
proficiency, reading competency, and prior math instruction had significant
effects on scores.
Rivera and Stansfield (2001) examined the effects of linguistic
simplification on fourth- and sixth-grade science test items used in the
Delaware Student Testing Program. At each grade level, four parallel 10-item
testlets were included on an operational statewide assessment. Items differed
only in that on one testlet, they were linguistically simplified by experts at
The George Washington University Center for Equity and Excellence in Education,
while on the other, the standard wording was used. A total of 11,306 non-LEP
students and 109 LEP students took one of the eight forms of the test. Because
the LEP students were split among the eight forms, the number of LEP students
taking each form was small, ranging from 6 to 23 students. While
the researchers caution that due to the limited sample size, nothing can be
generalized about linguistic simplification as an aid to LEP students, the
findings for the large non-LEP sample are quite clear. The linguistic
simplification was not helpful to non-LEP students who received it. This
provides evidence that linguistic simplification is not a threat to score
comparability.
The result of the process of linguistic simplification must be to make the
item accessible to ELLs without altering the difficulty of the content. However,
at times, language and content interact, and in these cases, it is not possible
to linguistically simplify items without simplifying the content. Further
studies are necessary to address the usefulness of linguistic simplification for
LEP students taking formal and high-stakes assessments. If experimental studies
involving large samples of LEP students who are randomly assigned to treatments
show that those LEP students who receive simplified items perform statistically
and meaningfully better than those who receive the regular, un-simplified
version of such items, then the utility of linguistic simplification in meeting
the needs of LEP test-takers will be established. At the moment, the research
shows that when properly carried out, linguistic simplification need not be
considered a threat to score comparability.
REFERENCES
Abedi, J. (2001). Validity of Accommodations for
English Language Learners. Paper presented at the annual meeting of the American
Educational Research Association, Seattle, WA.
Abedi, J., Lord, C., & Hofstetter, C. (1998). Impact of Selected
Background Variables on Students' NAEP Math Performance. Los Angeles: UCLA
Center for the Study of Evaluation/National Center for Research on Evaluation,
Standards and Student Testing.
Abedi, J., Lord, C., Hofstetter, C., & Baker, E. (2000). Impact of
accommodation strategies on English language learners' test performance.
Educational Measurement: Issues and Practice, 19 (3): 16-26.
American Educational Research Association (2000). Position statement of the
American Educational Research Association concerning high-stakes testing in
pre-K-12 education. Educational Researcher, 29 (8): 24-25.
American Educational Research Association, American Psychological
Association, and National Council on Measurement in Education (1999). Standards
for Educational and Psychological Testing. Washington, D.C.: American
Educational Research Association.
Rivera, C., & Stansfield, C.W. (2001). The Effects of Linguistic
Simplification of Science Test Items on Performance of Limited English
Proficient and Monolingual English-Speaking Students. Paper presented at the
annual meeting of the American Educational Research Association, Seattle, WA.
Rivera, C., & Stansfield, C.W. (1998). Leveling the playing field for
English language learners: Increasing participation in state and local
assessments through accommodations. In R. Brandt, ed., Assessing Student
Learning: New Rules, New Realities (pp. 65-92). Arlington, VA: Educational
Research Service.
Rivera, C., Stansfield, C.W., Scialdone, L., & Sharkey, M. (2000). An
Analysis of State Policies for the Inclusion and Accommodation of English
Language Learners in State Assessment Programs During 1998-99. Arlington, VA:
George Washington University, Center for Equity and Excellence in Education.
Rivera, C., & Vincent, C. (1997). High school graduation testing:
Policies and practices in the assessment of English language learners.
Educational Assessment, 4 (4): 335-55.
Teachers of English to Speakers of Other Languages, Elementary and Secondary
Education Act Reauthorization Task Force (2000). Board endorses position papers
for ESEA re-authorization effort. TESOL Matters, 11 (1): 1, 4.
U.S. Department of Education, Office for Civil Rights (2000). The Use of
Tests As Part of High-Stakes Decision-Making for Students: A Resource Guide for
Educators and Policy-Makers. Washington, D.C.: U.S. Department of Education.
[Available online at http://www.ed.gov/offices/OCR/testing/index.html].