Trudy Banta, AAGLO Forum, Melbourne, May 2012
1. Trudy W. Banta
Professor of Higher Education
and
Senior Advisor to the Chancellor for
Academic Planning and Evaluation
Indiana University-Purdue University Indianapolis
© TWBANTA-IUPUI
2. Discipline-Based Assessment
to Provide Convincing Evidence of
Graduate Learning Outcomes
Presented in
Australia
May 2012
by
Trudy W. Banta
Professor of Higher Education
and
Senior Advisor to the Chancellor for
Academic Planning and Evaluation
Indiana University-Purdue University Indianapolis
355 N. Lansing St., AO 140
Indianapolis, Indiana 46202-2896
tbanta@iupui.edu
http://www.planning.iupui.edu
3. My History
• Educational psychology
• Program evaluation & measurement
• Performance funding in Tennessee
• 1990 USDOE effort to build a national test
• 1992 Initiated evidence-based culture at
IUPUI
4. ASSESSMENT
Is like a dancer’s mirror.
It improves one’s ability to see and
improve one’s performance.
Alexander Astin
1993
5. ASSESSMENT OF INDIVIDUAL
STUDENT DEVELOPMENT
•Assessment of basic skills for use in advising
•Placement
•Counseling
•Periodic review of performance with detailed
feedback
•End-of-program certification of competence
•Licensing exams
•External examiners
6. KEY RESULTS OF INDIVIDUAL
ASSESSMENT
•Faculty can assign grades
•Students learn their own
strengths and weaknesses
•Students become self-assessors
7. A SECOND LOOK
• Across students
•Across sections
•Across courses
8. •Where is learning satisfactory?
•What needs to be retaught?
•Which approaches produce the most
learning for which students?
9. GROUP ASSESSMENT ACTIVITIES
•Classroom assignments, tests, projects
•Questionnaires for students,
graduates, employers
•Interviews, focus groups
•Program completion and placement
•Awards/recognition for graduates
•Monitoring of success in graduate
school
•Monitoring of success on the job
10. ASSESSMENT . . .
“a rich conversation
about student learning
informed by data.”
-- Ted Marchese --
AAHE
11. USE OF RESULTS OF GROUP
ASSESSMENT
•Program improvement
•Institutional and / or state peer
review
•Regional and / or national
accreditation
13. GROUP ASSESSMENT REQUIRES
COLLABORATION
In setting expected program outcomes
In developing sequence of learning
experiences (curriculum)
In choosing measures
In interpreting assessment findings
In making responsive improvements
14. BARRIERS TO COLLABORATION
IN THE ACADEMY
1. Graduate schools prepare specialists
2. Departments hire specialists
3. Much of our scholarship is
conducted alone
4. Promotion and tenure favor
individual achievements --
interdisciplinary work is harder to
evaluate
15. TO FOSTER COLLABORATION
•Name interdisciplinary committees
•Read and discuss current literature on
learning/assessment
•Attend conferences together
•Bring experts to campus
•Share good practices
•Work together on learning communities
16. MOST FACULTY ARE NOT TRAINED AS
TEACHERS
Faculty Development
Can Help Instructors:
•Write clear objectives (outcomes) for student
learning in courses and curricula
•Connect learning outcomes to assignments in
courses.
•Develop assessment tools that test higher order
intellectual skills
17. Taxonomy of Educational Objectives
(Bloom and Others, 1956)
Cognitive domain categories and sample verbs for outcomes:
Knowledge: Identifies, defines, describes
Comprehension: Explains, summarizes, classifies
Application: Demonstrates, computes, solves
Analysis: Differentiates, diagrams, estimates
Synthesis: Creates, formulates, revises
Evaluation: Criticizes, compares, concludes
18. SOME GENERIC LEARNING OBJECTIVES
•Differentiate between fact and opinion
•Gather, analyze, and interpret data
•Apply ethical principles to local,
national, global issues
•Communicate ideas in writing effectively
19. PROFESSIONAL PROGRAM
OBJECTIVES
Program Graduates will Demonstrate
1. Professional commitment
2. Communication skills
3. Administrative and managerial skills
4. Information technology competence
5. Research and analytic competence
20. To Ensure That Concepts Are
Taught
Time management
21. ALVERNO COLLEGE 8 ABILITIES
Communication
Analysis
Problem Solving
Valuing in Decision-Making
Interacting
Global Perspectives
Effective Citizenship
Aesthetic Responsiveness
22. PRINCIPLES OF UNDERGRADUATE
LEARNING (PULs)
1. Core communication and quantitative
skills
2. Critical thinking
3. Integration and application of knowledge
4. Intellectual depth, breadth, and
adaptiveness
5. Understanding society and culture
6. Values and ethics
Approved by IUPUI Faculty Council
May 1998
23. PUL #1
CORE COMMUNICATION & QUANTITATIVE
SKILLS
Demonstrated by student’s ability to:
•Express ideas and facts to others effectively in a variety
of formats, particularly written, oral, and visual formats
•Communicate effectively in a range of settings
•Identify and propose solutions for problems using
quantitative tools and reasoning
•Make effective use of information resources and
technology
24. PRINCIPLES OF UNDERGRADUATE
LEARNING
•A distinctive feature of education at IUPUI
•Permeate the entire undergraduate
curriculum
•Are enacted differently in each discipline
25. PUL HISTORY AT IUPUI
1990 – Study group of faculty and staff
1992-98 – Series of task forces
1998 – Adoption by Faculty Council
2007 – Adoption of revised version
27. IN USING STANDARDIZED TESTS
• Match test with curriculum
•Set expected scores on subscales
•Discuss results
•Determine what is missing
28. Limitations
of standardized tests of generic skills
•Cannot cover all a student knows
•Narrow coverage; need to supplement
•Difficult to motivate students to take them!
•What are they actually measuring?
29. VOLUNTARY SYSTEM OF ACCOUNTABILITY
Report Scores in
critical thinking, written communication,
analytic reasoning
using
•Collegiate Assessment of Academic
Proficiency (CAAP)
•Measuring Academic Proficiency and
Progress (MAPP)
•Collegiate Learning Assessment (CLA)
30. TN = MOST PRESCRIPTIVE (5.45% OF
BUDGET FOR INSTRUCTION)
1. Accredit all accreditable programs (25)
2. Test all seniors in general education (25)
3. Test seniors in 20% of majors (20)
4. Give an alumni survey (15)
5. Demonstrate use of data to improve (15)
___
100
31. AT THE UNIVERSITY OF TENNESSEE
CAAP
Academic Profile (now MAPP)
COMP (like CLA and withdrawn
by 1990)
College BASE
32. IN TN WE LEARNED
1. No test measured 30% of gen ed skills
2. Tests of generic skills measure primarily
prior learning
3. Reliability of value added = .1
4. Test scores give few clues to guide
improvement actions
33. AN INCONVENIENT TRUTH
.9 = the correlation between SAT
and CLA scores of institutions
thus
81% of the variance in institutions’
scores is due to prior learning
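The arithmetic behind this slide is the coefficient of determination: squaring the institution-level correlation between SAT and CLA scores gives the share of variance attributable to prior learning.

```latex
% Share of variance in institutions' CLA scores explained by prior learning (SAT)
r = 0.9 \quad\Longrightarrow\quad r^2 = (0.9)^2 = 0.81 \approx 81\%
```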
34. HOW MUCH OF THE VARIANCE IN SENIOR
SCORES IS DUE TO COLLEGE IMPACT?
• Student motivation to attend that institution
(mission differences)
• Student mix based on
• age, gender
• socioeconomic status
• race/ethnicity
• transfer status
• college major
35. HOW MUCH OF THE VARIANCE IN SENIOR
SCORES IS DUE TO COLLEGE IMPACT?
(CONTINUED)
•Student motivation to do well
•Sampling error
•Measurement error
•Test anxiety
•College effects
______
19 %
36. STUDENT MOTIVATION
• Samples of students are being tested
• Extrinsic motivators (cash, prizes) are used
We have learned:
• Only a requirement and intrinsic motivation
will bring seniors in to do their best
37. CONCERNS ABOUT VALUE ADDED
•Student attrition
•Proportion of transfer students
•Different methods of calculating
•Unreliability
•Confounding effects of maturation
38. Recent University of Texas Experience
30 – 40% of seniors at flagships earn
highest CLA score (ceiling effect)
flagship campuses have lowest value
added scores
39. WORD FROM MEASUREMENT EXPERTS
Given the complexity of
educational settings, we may never be
satisfied that value added models can be
used to appropriately partition the causal
effects of teacher, school, and student on
measured changes in standardized test
scores.
- Henry Braun & Howard Wainer
Handbook of Statistics, Vol. 26: Psychometrics
Elsevier 2007
40. Employing currently available
standardized tests of generic
skills to compare the quality
of institutions is not a valid use of
those tests.
41. OECD’S AHELO
COMPARING HEIs ACROSS NATIONS
1. Generic skills (CLA)
2. Disciplines (Engineering and Economics)
3. Value added
4. Contextual information indicators
42. 2012
K-12 standardized test scores are used to:
•Evaluate and compare schools
•Assign grades to schools
•Take over failing schools
•Evaluate, compare, and fail teachers
Yet NAEP scores have stagnated
43. IN FINLAND AND SINGAPORE
•No annual testing of students
•No high-stakes accountability measures for
teachers/schools
•Scholarships for best and brightest
•Starting pay like a doctor
•Must complete master’s degree
•Teachers are respected professionals
44. SHORT-TERM PERSPECTIVE
•Limit degrees to 120 semester credit hours (SCH)
•Penalize students who go beyond a SCH
cap
•Reward graduation in 4 years
•Consider earning potential in setting tuition
45. DE-PROFESSIONALIZATION –
IMMEDIATE PAYOFF
•Teacher education is first
•Industry certifications
•Partnerships to fill employers’ needs
Does apprenticeship model prepare us for
global leadership in the future?
46. BETTER WAYS TO DEMONSTRATE
ACCOUNTABILITY
Performance Indicators
1.Access (to promote social mobility)
2.Engaging student experience
3.Workforce development
4.Economic development
5.Civic contribution of students, faculty,
staff, graduates
47. IF WE MUST MEASURE LEARNING
LET’S USE:
1. Standardized tests in major fields
licensure and certification tests
ETS Major Field Tests
2. Internship performance
3. Senior projects
4. Study abroad performance
5. Electronic portfolios
6. External examiners
48. START WITH MEASURES YOU
HAVE
•Assignments in courses
•Course exams
•Work performance
•Records of progress through the
curriculum
49. METHODS OF ASSESSMENT
Paper and pencil tests
Individual or group projects
Portfolios
Observation of practice
Observation of simulated practice
Analysis of case studies
Attitude or belief inventories
Interviews and focus groups
Surveys
50. Direct Measures of Learning
Assignments, exams, projects, papers
Indirect Measures
Questionnaires, inventories, interviews
- Did the course cover these objectives?
- How much did your knowledge increase?
- Did the teaching method(s) help you
learn?
- Did the assignments help you learn?
GOOD ASSESSMENT INCLUDES BOTH
51. NILOA SURVEY: PROGRAM-LEVEL
APPROACHES
1. Portfolios (80% in at least 1 area)
2. Performance assessments
3. Rubrics
4. External judges
5. Student interviews
6. Employer surveys
52. STUDENT ELECTRONIC PORTFOLIO
•Students take responsibility for
demonstrating core skills
•Unique individual skills and achievements
can be emphasized
•Multi-media opportunities extend
possibilities
•Metacognitive thinking is enhanced
through reflection on contents
- Sharon J. Hamilton
IUPUI
53. More use of RUBRICS
locally developed
VALUE from AAC&U
55. ACCOUNTABILITY REPORT
•85% achieve Outstanding ratings in writing
as defined . . .
•78% are Outstanding in applying knowledge
and skills in internships
•75% are Outstanding in delivering an oral
presentation
57. E-PORT CHALLENGES
•Reliability of rubrics
•Student motivation if used for assessment
(Barrett, 2009)
•Differences in topics for products to be
evaluated
(Sekolsky & Wentland, 2010)
58. OBSTACLES TO USING
PERFORMANCE-BASED MEASURES
•Defining domains and constructs
•Obtaining agreement on what to measure
and definitions
•Defining reliability and validity
•Creating good measures
- Tom Zane
WGU
59. WILL IT TAKE 80 YEARS . . . ?
3 Promising Alternatives
E-portfolios
Rubrics
Assessment
communities
- Banta, Griffin, Flateby,
Kahn
NILOA Paper #2 (2009)
60. TEAGLE ASSESSMENT SCHOLARS
•study assessment data
•visit campuses
•talk with 3-4 groups of students
•talk with faculty about their campus
assessment data
- Charles Blaich
Wabash College
61. NATIONAL SURVEY OF STUDENT ENGAGEMENT
AT ~ HOPE COLLEGE ~
[Chart: percentage of freshmen and seniors studying less than 10 hours/week; 2003: 38% and 39%; 2010: 21% and 28%]
62. HOPE COLLEGE
•Considered data over supper
•Proposed solutions
•Conducted student focus groups
•Shared all data with all faculty
•Departments dedicated a meeting to prepare
strategies to increase rigor
63. NATIONAL INSTITUTE FOR
LEARNING OUTCOMES ASSESSMENT
•Surveys
2009 CAOs
2011 Departments
•Occasional Papers
•Website review, standards
•Quick comments (monthly)
•Calendar of events
65. NEW LEADERSHIP ALLIANCE
FOR STUDENT LEARNING AND ACCOUNTABILITY
- Presidents’ Alliance
- Certification Process
Set ambitious goals for learning
Gather evidence of learning
Use evidence to improve learning
Report evidence and results
66. BUILD ASSESSMENT INTO VALUED
PROCESSES
1. Assessment of learning
2. Curriculum review and revision
3. Survey research
4. Program review
5. Scholarship of Teaching & Learning
6. Evaluation of initiatives
7. Faculty development
8. Promotion & tenure
9. Rewards and recognition