ERIC Identifier: ED291441
Publication Date: 1988-01-00
Author: Mabry, Theo N.
Source: ERIC Clearinghouse for Junior Colleges, Los Angeles, CA.
Program Review. ERIC Digest.
"Declining funds, declining test scores, declining standards, declining
quality of products, declining resolve of legislators, state and local boards to
fully fund programs without clear-cut objectives and documented outcomes"
(Smith, 1984) have created a need for institutional assessment that will
continue unabated into the 1990's. Thoughtful, well-planned and systematic
reviews of both instructional and non-instructional programs provide community
colleges with a way of determining whether programs are meeting stated
objectives and what standards of performance should be maintained. Programs that
need improvement or should be eliminated can be identified, and fiscal
accountability can be achieved.
APPROACHES TO PROGRAM REVIEW
Program reviews generally have
qualitative and quantitative components. For the most part, the quantitative
component utilizes the types of numerical data that are collected and reported
to state agencies (e.g., student enrollment, weekly student contact hours,
percentage of students completing the program, numbers of degrees granted, and
numbers of students transferring to four-year institutions). Other categories
for which data may be gathered include weekly student contact hours per
full-time faculty equivalent, percentage of students obtaining jobs in their
field of study, number of job openings in the service region, and full-time to
part-time faculty ratio.
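The arithmetic behind these indicators is straightforward. As a concrete illustration, the short Python sketch below (hypothetical figures and function names, none drawn from the digest) computes three of them:

    def program_metrics(enrolled, completers, weekly_contact_hours,
                        fte_faculty, full_time, part_time):
        # Ratio-based indicators named above, computed for one program
        # from hypothetical counts.
        return {
            # weekly student contact hours per full-time faculty equivalent
            "wsch_per_fte": weekly_contact_hours / fte_faculty,
            # percentage of students completing the program
            "completion_rate": 100.0 * completers / enrolled,
            # full-time to part-time faculty ratio
            "ft_pt_ratio": full_time / part_time,
        }

    print(program_metrics(enrolled=240, completers=96,
                          weekly_contact_hours=5100, fte_faculty=12.5,
                          full_time=9, part_time=6))
    # {'wsch_per_fte': 408.0, 'completion_rate': 40.0, 'ft_pt_ratio': 1.5}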
In gathering qualitative data, students, faculty, advisory committee members,
and other members of the college community who have knowledge of or experience
with a program are asked to share their perceptions and judgments. Questions may
be open-ended, requiring the respondent to assess aspects of the program in his
or her own words. However, most surveys depend heavily on ordinal or rank-order
measures. Respondents are asked to rate aspects of a program's effectiveness as
poor, below expectations, acceptable, good, or excellent--or some variation
thereof. The evaluations typically focus on goals and objectives of the program,
processes used in program implementation, and resources available for the
program.
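As an illustration, the ordinal ratings just described might be tabulated with a short Python sketch like this (the scale labels match the text; the responses and function are invented):

    from collections import Counter

    SCALE = ["poor", "below expectations", "acceptable", "good", "excellent"]

    def tabulate(responses):
        # Frequency at each scale point plus the median rating; the median,
        # not the mean, is the appropriate summary for ordinal data.
        counts = Counter(responses)
        ranked = sorted(responses, key=SCALE.index)
        median = ranked[len(ranked) // 2]
        return {point: counts.get(point, 0) for point in SCALE}, median

    counts, median = tabulate(["good", "acceptable", "excellent", "good", "poor"])
    print(counts)    # one tally per scale point, in scale order
    print(median)    # "good"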
Qualitative and quantitative components are integrated into an institution's
overall plan for program review in various ways, ranging from heavily
qualitative to heavily quantitative.
THE QUALITATIVE APPROACH
Voluntary self-studies conducted
by internal review committees, augmented through validation studies by external
review teams, lie at the qualitative end of the continuum. One example,
Michigan's Program Review in Occupational Education (PROE), which was developed
by a steering committee of local and state community college professionals and
sponsored by the Michigan State Department of Education, includes:
o selection of a self-study coordinator and committee, and
orientation meetings and announcements.
o completion of self-study instruments to determine faculty
members' perceptions of program standards, strengths,
and weaknesses; students' assessments of how well the
program meets their needs; and advisory committee members'
opinions of the occupational preparation of program graduates.
o organization and presentation of tabulated and correlated
responses into an understandable format.
o preparation of a written report summarizing the program's
strengths and weaknesses to assist decision making
about the need to modify or redirect a program.
o preparation of an action plan for occupational program
improvement (Michigan Community Colleges, 1980).
The colleges arrange for validation of the self-study by an outside team from
other two-year colleges, industry, and the community. The team members, selected
on the basis of their expertise and trained in applying the evaluation system,
develop a consensus profile of the program, identifying where and why they agree
or disagree with the self-study profile.
These validation studies play an important role in lending credibility to
voluntary, qualitative self-assessments, which are sometimes criticized as
overly intuitive and self-serving (Rasor, 1983). In Michigan, concern for
objectivity and credibility is addressed by the other components of the state's
comprehensive evaluation system. These components--the Michigan Student
Information System, Financial Information System, and Management Plan--provide
quantitative data to supplement the qualitative findings.
THE QUANTITATIVE APPROACH
Maryland's program review process
stands on the quantitative side of the qualitative/quantitative continuum. Under
the direction of the Maryland State Board for Community Colleges, the colleges
conduct annual reviews of both transfer and occupational programs. The Maryland
program evaluation process begins with a quantitative review by the State Board
of program data reported by the community colleges in the following categories:
o Name, status and code numbers of the program
o Enrollment and awards data for the last six years
o Enrollment and awards in similar programs at other colleges
o Graduate follow-up data
o Annual job openings in the Baltimore area and statewide
o Discipline cost analysis data
Each May, student, cost, and manpower information from the Program Data
Monitoring (PDM) system is distributed to academic and occupational deans and
institutional research directors for verification. The State Board then
identifies programs at each college that appear to be in difficulty and should
be evaluated qualitatively. The identification process is assisted by "flags" in
the PDM, indicating that a program has experienced declining enrollment or
awards, low job placement or transfer rates, higher than average costs, or low
student satisfaction levels. On the basis of these quantitative criteria checks,
the Board develops questions about the identified problems, which are forwarded
to the college president for response. The questions are designed to determine
why a program is in trouble and ways in which problems can be identified and
addressed. The resulting program validation process may involve mail or
telephone surveys of students, analyses of student transcripts, and assistance
by the program advisory committee. The college must submit a written response to
the State Board. On the basis of the program review, the Board may suggest the
discontinuation or inactivation of a program.
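The digest does not spell out the PDM's flagging rules, but their general shape can be sketched in Python as follows (thresholds, field names, and figures are invented for illustration):

    def flag_program(p, college_avg_cost):
        # Returns PDM-style "flags" for one program; the enrollment and
        # awards lists run from the earliest year to the most recent.
        flags = []
        if p["enrollment"][-1] < p["enrollment"][0]:
            flags.append("declining enrollment")
        if p["awards"][-1] < p["awards"][0]:
            flags.append("declining awards")
        if p["placement_rate"] < 0.50:              # assumed cutoff
            flags.append("low job placement")
        if p["cost_per_fte"] > college_avg_cost:
            flags.append("higher than average cost")
        return flags

    program = {"enrollment": [180, 150, 120], "awards": [40, 35, 22],
               "placement_rate": 0.42, "cost_per_fte": 4100}
    print(flag_program(program, college_avg_cost=3600))
    # ['declining enrollment', 'declining awards', 'low job placement',
    #  'higher than average cost']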
A study conducted at Pasadena City College (Carvell Education Management
Planning, 1982) represents a further step on the continuum toward quantification
of the program review data. This program review model is based on six criteria:
o trends in weekly student contact hours (WSCH)
o faculty loads (WSCH/full-time equivalent faculty)
o class size
o student grade petitions
o student retention
o assigned cost of instruction per annual contact hour
Programs are assessed in terms of how well they are performing currently in
comparison with how well they performed in the past. Data for individual program
performance are also compared with college-wide averages. The final aspect of
program review at PCC is a qualitative assessment conducted by outside
consultants based on personal interviews, a tour of the facilities, a review of
records, and a written survey of faculty and administrators.
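In Python, the two comparisons the PCC model makes for each criterion, current performance against past performance and the program against the college-wide average, might look like this (criterion name and figures are invented):

    def compare(criterion, current, past, college_avg):
        # Percent change relative to the program's own history, and
        # the program's standing relative to the college-wide average.
        trend_pct = 100.0 * (current - past) / past
        vs_college = current / college_avg
        return {"criterion": criterion,
                "trend_pct": round(trend_pct, 1),
                "vs_college_avg": round(vs_college, 2)}

    print(compare("WSCH", current=4800, past=5400, college_avg=5100))
    # {'criterion': 'WSCH', 'trend_pct': -11.1, 'vs_college_avg': 0.94}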
Another model of quantitative program evaluation is provided by Foothill
College in Los Altos, California. The two-part model was designed to examine
instructional programs with respect to program effectiveness, cost
effectiveness, and relation to the college mission (Lowe, 1983). Program and cost effectiveness
are measured in terms of quantifiable, weighted criteria. For example, in
assessments of program effectiveness, the number of students enrolled in
relation to enrollment capacity is assigned a weight of 17 (the highest), while
future employment outlook is given a weight of 6 (the lowest). In assessing cost
effectiveness, average daily attendance has a weight of 25.5 (the highest),
while the ratio of part-time to full-time instructors has a weight of 8 (the
lowest).
Through a formula based on percentile scores and the weights assigned to each
criterion for both program and cost effectiveness, a single score is obtained for
each program evaluated. Programs are then ranked according to their overall
scores. Finally, the programs are reviewed by the college president and the
president's cabinet to judge their relation to the overall mission of the
college. From this evaluation, decisions are made regarding the status of each
program. During 1981-82, six of the ten lowest-ranked programs were eliminated
from the curriculum.
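The digest does not reproduce the Foothill formula itself, but a weighted sum of percentile scores in the spirit described might look like this in Python (only the weights of 17 and 6 come from the text; the criteria set, percentile scores, and program names are invented):

    def composite_score(percentiles, weights):
        # One overall score per program: each criterion's percentile
        # score (0-100) multiplied by its weight, then summed.
        return sum(percentiles[c] * weights[c] for c in weights)

    weights   = {"enrollment_vs_capacity": 17, "employment_outlook": 6}
    program_a = {"enrollment_vs_capacity": 80, "employment_outlook": 45}
    program_b = {"enrollment_vs_capacity": 55, "employment_outlook": 90}

    scores = {name: composite_score(p, weights)
              for name, p in [("A", program_a), ("B", program_b)]}
    print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
    # [('A', 1630), ('B', 1475)]

Under such a scheme the heavily weighted criterion dominates the ranking: Program B's strong employment outlook cannot offset Program A's lead on enrollment relative to capacity.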
Program reviews that rely heavily on quantitative analysis can be extremely
threatening to staff members, particularly if declining enrollments or severe
fiscal constraints mandate the elimination of programs and therefore reductions
in teaching and support staff (Rasor, 1983). Both the Maryland State program
review process and the Pasadena City College model provide recourse for those
involved in programs that do not rank well on quantitative measures. Through
their qualitative components, these two systems address such issues as the need
to provide general education courses for transfer students, the importance of
responding to specific but not necessarily quantifiable local community needs,
and the need to provide a comprehensive curriculum.
The Foothill College model, which is heavily dependent upon quantitative
data, provides almost no channel of response for the faculty and support staff
in poorly rated programs, since the assessment of the programs' intangible and
unquantifiable value rests solely with the president and members of the cabinet.
Reviews of instructional programs require a balance between qualitative and
quantitative components to ensure both accountability and fairness.
FOR FURTHER INFORMATION
Carvell Education Management Planning, Inc. "A Comprehensive Review of Credit
Instructional Programs Offered by Pasadena City College." Los Angeles: Carvell
Education Management Planning, 1982. 116 pp. (ED 237 126)
Lowe, Irel D. "Program Evaluation at Foothill College." Los Altos, CA: Foothill
College, 1983. 30 pp. (ED 231 406)
Maryland State Board for Community Colleges. "Maryland Community Colleges
Instructional Program Manual." Annapolis: Maryland State Board for Community
Colleges, 1983. 52 pp. (ED 238 475)
Michigan Community Colleges. "PROE: Program Review in Occupational Education."
1980. 72 pp. (ED 234 828)
Rasor, Richard A., and others. "A Community College Instructional Program
Evaluation Model Using a Mini-Accreditation Approach." Sacramento: American
River College, 1983. 30 pp. (ED 227 916)
Smith, Ronald C. "Why Program Review?" Anchorage, AK: Anchorage Community
College, 1984. 28 pp. (ED 246 937)