Excellence in teaching
1. Excellence in Teaching: the road towards AHELO
Prof. Dr. Dirk Van Damme, Head of the Centre for Educational Research and Innovation, OECD/EDU
2. Outline
- How to assess the teaching function: approaches and proxies
- Analysis of how teaching is assessed and how it impacts on ranking in the THEWUR
- AHELO: an attempt to measure learning outcomes
- Some 'political' conclusions
4. How to assess the teaching function of universities?
Students and graduates:
- Participation rates in the age cohort
- Participation of specific groups
- Success and failure rates within programmes
- Graduation rates: in the age cohort; within a certain time perspective
5. Growth in university-level qualifications: approximated by the percentage of the population that has attained tertiary-type A education in the age groups 25-34, 35-44, 45-54 and 55-64 years (2007)
6. How to assess the teaching function of universities?
Labour market success of graduates:
- Effective transition to the labour market
- Unemployment after a certain time
- Employment and career prospects after 5 years
Economic return on investment:
- Higher return = higher reward of degrees on the labour market
- Private and public net value of a degree
7. How successful are students in moving from education to work?
Proportion of 25-29 year-olds with a tertiary degree working in low-skill occupations: fewer than ¼ of tertiary graduates do not find a job that matches their educational level (EAG 2010, C3.7)
8. How to assess the teaching function of universities?
Academic quality, staff/student ratio:
- Despite some attempts, reports of quality assurance agencies provide little basis for measurement
- The Times Higher Education WUR uses the staff/student ratio as a proxy for quality: the higher the ratio, the better the teaching/learning environment
9. How to assess the teaching function of universities?
Academic quality, staff/student ratio. Other proxies used in the THEWUR 2010:
- Ratio of PhD students to undergraduate students (more PhDs = more research-intensive teaching)
- Ratio of PhD students or PhDs to academic staff
- Quality of educational infrastructure, measured by the ratio of income to staff
10. How to assess the teaching function of universities?
Academic reputation:
- Reputation is a very important driver in the dynamics of higher education systems: the 'reputation race' (Frans van Vught)
- Measured as a component of rankings. E.g. the Times Higher Education WUR Academic Reputation Survey, a poll of some 14,000 scholars for the 2010 THEWUR ranking, responsible for 50% of the Teaching score
11. How to assess the teaching function of universities?
Academic reputation is a very questionable but extremely influential and powerful indicator:
- Outcomes are very skewed: 'the winner takes all', as in sports or pop music
- Time-lag
- Perceived popularity is different from quality
- Distorted by reputation in research, and often limited to certain disciplines
12. How to assess the teaching function of universities?
Consumer satisfaction approaches: the CHE approach, for example, relies heavily on student questionnaires covering various aspects of the educational experience, but also other parts of the student experience
14. Analysis of the THEWUR 2010
The outcomes of the Times Higher Education World University Rankings provide one of the best available sources for analysing the teaching function via the proxies used:
- 50% reputation survey
- 15% undergraduate students to staff ratio
- 7.5% PhD degrees to undergraduate degrees ratio
- 20% PhD degrees to staff ratio
- 7.5% income to staff ratio
The 100% teaching score counts for 30% in the overall ranking.
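The weighting scheme above can be made concrete with a short sketch. Only the weights come from the slide; the component names and sample scores below are invented for illustration.

```python
# THEWUR 2010 teaching-score weights, as listed on the slide.
# Component keys and the example values are hypothetical.
TEACHING_WEIGHTS = {
    "reputation_survey": 0.50,
    "staff_student_ratio": 0.15,
    "phd_to_undergrad_ratio": 0.075,
    "phd_to_staff_ratio": 0.20,
    "income_to_staff_ratio": 0.075,
}

def teaching_score(components):
    """Weighted sum of normalised component scores (each on a 0-100 scale)."""
    return sum(TEACHING_WEIGHTS[name] * value for name, value in components.items())

# Hypothetical university with component scores on a 0-100 scale
example = {
    "reputation_survey": 80.0,
    "staff_student_ratio": 60.0,
    "phd_to_undergrad_ratio": 70.0,
    "phd_to_staff_ratio": 65.0,
    "income_to_staff_ratio": 55.0,
}

t = teaching_score(example)       # weighted teaching score for this example
overall_contribution = 0.30 * t   # teaching counts for 30% of the overall score
```

Note how the 50% reputation weight dominates: a university's reputation survey result moves the teaching score more than all other components combined.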
15. The structure of the top 200 according to the THEWUR 2010: Overall Score ('Peaks and Plateaux')
17. The structure of the top 200 according to the THEWUR 2010: Overall Score (?)
18. The structure of the top 200: Teaching, Research and Citations scores (standard deviations shown: 16.99, 16.29, 14.63)
19. Analysis of the THEWUR 2010
- The teaching score has the flattest profile and the lowest variation in scores; in fact, in teaching (as measured in the THEWUR) most universities are not so different
- The research function has more variation; the citation score has the highest variation
- Are these artefacts of the measurement methodology, or of the reality?
21. Analysis of the THEWUR 2010
- High correlation (.86) between research and teaching as measured in the THEWUR: good research universities are in general also good teaching universities, but with important exceptions
- But low correlation between research and citations and between teaching and citations: both .28
22. Analysis of the THEWUR 2010
Function coherence (as measured by the absolute difference between teaching and research scores):
- Is rather high over the whole ranking list
- Is higher in North America than in Europe or Asia
This suggests that in the upper part of the global HE system, excellence in teaching goes hand in hand with excellence in research: binding the two functions is still at the heart of the academic mission and identity. But that is probably also a consequence of the choice of indicators used.
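The coherence measure defined above is straightforward to compute. A minimal sketch, with invented university names and scores:

```python
# "Function coherence": the absolute gap between a university's teaching
# and research scores, as defined on the slide. A smaller gap means a
# more coherent profile. All names and scores below are hypothetical.

def function_coherence(teaching_score: float, research_score: float) -> float:
    """Absolute difference between the teaching and research scores."""
    return abs(teaching_score - research_score)

universities = {
    "Univ A": (90.0, 88.0),  # teaching and research excellence go together
    "Univ B": (45.0, 75.0),  # research-dominated profile
}

for name, (teaching, research) in universities.items():
    print(f"{name}: gap = {function_coherence(teaching, research):.1f}")
```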
24. Analysis of the THEWUR 2010
But closer analysis reveals some interesting findings: ranked on the teaching dimension, the capacity to translate research into citation output increases as you move down the ranking, meaning that less teaching-oriented universities have a slightly higher efficiency in research, though with enormous variation.
26. Preliminary conclusions
- We definitely need much better indicators to understand and measure the teaching function of universities
- Resisting the development of sound measurement of teaching implicitly confirms the research dominance in rankings and reputation
- Indicators need to go to the heart of the teaching-learning interaction and be output-oriented, not input- or process-oriented
30. Provides faculties, students and government agencies with a more balanced assessment of HE quality – not just research-driven rankings!
31. No sacrifice of HEIs' missions or autonomy in their subsequent efforts to improve performance
32. The feasibility study at a glance
- Goal? To evaluate whether reliable cross-national assessments of HE learning outcomes are scientifically possible and whether their implementation is feasible.
- What? Not a pilot, but rather a research approach to provide a proof of concept and proof of practicality.
- Why? The outcomes will be used to assist countries in deciding on the next steps.
- When? Phase 1, development of tools: August 2010 to April 2011. Phase 2, implementation: August 2011 to December 2012.
- Who? Data will be collected from a targeted population of students who are near, but before, the end of their first 3-4 year degree.
- How? OECD's role is to establish broad frameworks that guide international expert committees charged with instrument development in the assessment areas.
37. But these reflect cumulative learning outcomes and are less relevant to the subject-matter competencies that are familiar to HEIs, departments or faculties
40. Feedback to HEIs: performance profiles and contextual data, with their own results and those of other HEIs (anonymously)
41. AHELO: 4 strands of work
- Generic skills strand: international pilot test of the US Collegiate Learning Assessment (CLA), to assess the extent to which problem-solving or critical thinking can be validly measured across different cultural, linguistic and institutional contexts; plus contextual data
- Discipline strand in Engineering: initial work on defining expected learning outcomes through the 'Tuning' approach; plus contextual data
- Discipline strand in Economics: initial work on defining expected learning outcomes through the 'Tuning' approach; plus contextual data
- Research-based "value-added" or "learning gain" measurement strand: several perspectives to explore the issue of value-added (conceptual, psychometric), building on recent OECD work at school level
42. AHELO test instruments
- 3 assessment instruments: generic skills; discipline-specific skills in Engineering and in Economics
- 2 contextual surveys (contextual indicators and indirect proxies of quality): student survey; faculty survey
43. Work to be undertaken in 2 phases
- Phase 1, initial proof of concept: development of the frameworks (generic skills, Economics, Engineering); instrument development and small-scale validation (generic skills, Economics and Engineering instruments, plus contextual dimension surveys)
- Phase 2, scientific feasibility and proof of practicality: implementation (project management, survey operations and analyses of results)
55. Practices in teaching and learning such as students’ perceptions of academic challenge, clear sense of direction, quality of effort, student-faculty relationship,…
63. Feedback to HEIs: performance profiles and contextual data, with their own results and those of other HEIs (anonymously)
64. A study with great potential…
- Diagnosis is the basis of any improvement. Better information on student learning outcomes is the first step to improving teaching and learning for all:
  - Provide evidence for national and institutional policy and practice
  - Equip institutions with the methods and tools to improve teaching
- Shaping the future of higher education to address key challenges:
  - Equity: build fairer higher education systems, promoting success for all
  - Responsiveness: better connect higher education and society
  - Effectiveness: help students make informed choices to ensure success for all
  - Impact: foster international transparency and mobility
65. Some 'political' conclusions
- We need to balance the measurement of universities' qualities by acknowledging teaching
- Indicators and measurements are never neutral; they become benchmarks and policy goals
- We need to reward institutions for the added value they create in teaching and learning: no doubt, the added value in terms of knowledge and skills is the essence
- The indicators and measurement systems in higher education have become inferior to those for school education, so progress is urgently needed
66. Thank you!
dirk.vandamme@oecd.org
www.oecd.org/edu/ceri
www.oecd.org/edu/ahelo