1. OECD – IMHE General Conference
Paris – September, 17-19, 2012
Assessing learning outcomes in Higher Education:
the Brazilian experience
Renato H. L. Pedrosa
Dept. of Science and Technology Policy / Geosciences Institute
Group of Studies in Higher Education/Center for Advanced Studies
University of Campinas - Unicamp
renato.pedrosa@reitoria.unicamp.br
2. Assessing Learning Outcomes in HE
1. Scope: academic
a. General education (?)
b. Subject area (general subject area/specific subject area)
2. Scope: institutional, national, multinational (LA, EU, Asia), global
3. Purposes?
a. Accreditation: regulation
b. Evaluation: feedback to policy makers, stakeholders, students
c. Comparative studies
4. Validity: more in a moment
5. Feasibility: practical issues
4. Assessing Learning Outcomes in HE
John Sexton this morning:
“Are we just measuring the measurable?”
5. The Brazilian Assessment System
SINAES/ENADE
• The Brazilian System of Higher Education Evaluation (SINAES) was established by
national law in 2004.
• A national exam for most undergraduate programs had already existed since 1996.
• SINAES includes as one of its components the National Student Performance Exam
(ENADE), which is mandatory for all students in their last year of studies.
• SINAES also evaluates various institutional aspects, such as faculty and
infrastructure, and each program is assigned a score, the Preliminary Program Score
(CPC), on a 1-5 scale.
• The program score is norm-referenced, meaning that scores are assigned by splitting
the programs into score bands, without reference to proficiency levels (see the sketch
after this list). Thus one cannot actually say at what level students graduating from
a program that scores 4 (next to highest) perform.
• But the most publicized and debated aspect of the system is ENADE itself, since the
results for each program are widely publicized and used by the media.
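A minimal sketch of what norm-referenced banding means in practice. The equal-width percentile cutoffs below are hypothetical, chosen only to illustrate the idea; the official INEP cut points are not given here.

```python
# Illustrative sketch of norm-referenced banding: a program's 1-5 score
# depends on its rank among all programs, not on any absolute proficiency
# criterion. The equal-width percentile cutoffs are hypothetical.
def norm_referenced_band(score, all_scores):
    """Map a continuous program score to a 1-5 band by percentile rank."""
    rank = sorted(all_scores).index(score) / len(all_scores)  # in [0, 1)
    return min(5, int(rank * 5) + 1)  # bottom fifth -> 1, ..., top fifth -> 5

program_scores = [41.2, 55.0, 48.3, 62.7, 38.9, 70.1, 51.5, 45.0, 58.8, 66.4]
print([norm_referenced_band(s, program_scores) for s in program_scores])
# Two programs land in each band here, regardless of what the scores mean
# in absolute terms: a band of 4 says nothing about proficiency levels.
```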
6. The test - ENADE
• It is composed of two components
• A “general education” (GE) component: 10 items, 8 multiple choice, 2 open
• A subject area (SA) component: 30 items, 27 multiple choice, 3 open
• The final score is weighted 25% for the GE and 75% for the SA component
• The score is used in two ways
• As the score for the graduating class
• To compute a “value added” score (with the High School Exit Test score)
• Those two scores, plus the institutional and program evaluation, feed into the
Preliminary Program Score (CPC)
• 40% graduating-class ENADE score, 30% “value added” score, 30% institutional
score (see the sketch below)
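A minimal sketch of the two weighted combinations above. The weights (25/75 and 40/30/30) are from the slide; the score values, scale and function names are illustrative, not the official INEP formulas.

```python
# Sketch of the score composition described on this slide.

def enade_score(ge, sa):
    """Final ENADE score: 25% general education, 75% subject area."""
    return 0.25 * ge + 0.75 * sa

def cpc_score(enade, value_added, institutional):
    """Preliminary Program Score (CPC): 40% graduating ENADE score,
    30% 'value added' score, 30% institutional score."""
    return 0.40 * enade + 0.30 * value_added + 0.30 * institutional

# Example with hypothetical component scores on a common scale:
e = enade_score(ge=55.0, sa=62.0)                          # -> 60.25
print(cpc_score(e, value_added=50.0, institutional=58.0))  # -> 56.5
```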
7. Validity of educational assessment systems
Evidence presented to justify the intended
interpretations and uses of results
of exams/tests employed by an assessment system
Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 17–
64). Westport, CT: ACE/Praeger Publishers.
9. Validity aspects of ENADE and SINAES
1. “General education” component
a. Assessment of contemporary social and cultural information
b. Some analytical, writing and reasoning skills assessed (not much)
c. How is that related to curriculum or any specific university/college activities?
2. Subject area component
a. Coverage of all subareas?
b. Number of items?
3. “Value added” score
a. Some evidence that it does not actually measure value added, but is instead
highly correlated with the graduating-class score (see the sketch after this list)
4. Preliminary Program Score (CPC)
a. Norm-referenced
b. No indication regarding what the scores really mean
5. Big issue: student involvement
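A hedged sketch of the concern in item 3. The naive difference-based “value added” measure below is illustrative, not the official method, and the data are made up; the point is the kind of check involved: if the value-added score correlates almost perfectly with the graduating score, it mostly restates it rather than measuring learning gain.

```python
# Illustrative check: naive value added (exit minus entry score) versus
# the graduating score itself. Data and formula are hypothetical.
from statistics import correlation  # Python 3.10+

entry_scores = [45.0, 52.0, 48.0, 60.0, 41.0, 66.0]  # e.g., High School Exit Test
exit_scores  = [50.0, 61.0, 55.0, 71.0, 44.0, 78.0]  # e.g., graduating ENADE

value_added = [ex - en for en, ex in zip(entry_scores, exit_scores)]

# A value near 1 would suggest the "value added" score adds little
# information beyond the graduating-class score.
print(correlation(value_added, exit_scores))
```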
10. Comments
• ENADE/CPC are not valid, in their present form, as
assessments of individual programs or institutions regarding
undergraduate education. The evidence is clear: programs well
regarded by specialists and by the market get very low scores
(student boycotts of the exam being the main reason).
• Certainly the “general education” component is very fragile
as an indicator of what a college education brings to the student
in terms of general skills. That is evident both from analysing
the content of the items and from the performance of first- and
last-year students.
11. Comments
• In spite of that, the system has some validity for assessing
subgroups of institutions: for example, the public versus the
private system, or the university system versus the
small-colleges system.
• But that depends heavily on the subject area. Analyses
done at Unicamp for the recent exams indicate that in
engineering and the basic sciences the exams seem appropriate
as assessment tools (though much further study is needed). On
the other hand, there is much concern about the humanities and
social sciences.
13. OECD – IMHE General Conference
Paris – September, 17-19, 2012
Assessing learning outcomes in Higher Education:
the Brazilian experience
Thank you
Renato H. L. Pedrosa
Dept. of Science and Technology Policy / Geosciences Institute
Group of Studies in Higher Education/Center for Advanced Studies
University of Campinas - Unicamp
renato.pedrosa@reitoria.unicamp.br