1. Assessing and Assuring
Graduate Learning Outcomes
National Fora
Brisbane Melbourne Sydney Adelaide
Perth
audio link to this presentation
2. AAGLO National Fora
2012
WELCOME
Forum program
● 9:15 Registration
● 9:30 Opening and brief overview of AAGLO Project
● 9:45 Keynote address
Professor Trudy W. Banta
● 10:45 Questions
● 11:10 Morning tea
● 11:30 Presentation of AAGLO interview findings
● 12:00 Workshop
Response to issues raised (15 minutes - Trudy Banta)
● 1:30pm - 2:15pm Lunch
3. Forum Objectives
For participants to engage with colleagues
in:
● discussion of practice and issues in the assessment and assurance
of graduate learning outcomes in the Australian higher education
context
● developing informed opinion to contribute to institutional decision-
making at various levels
● forming collaborations for further investigation and innovation in this
area.
4. The AAGLO PROJECT
● Funded in 2010 under the ALTC Strategic Priority Project Scheme to investigate:
● the types of assessment tasks most likely to provide convincing evidence of student achievement of, or progress towards, graduate learning outcomes; and
● the processes that best assure the quality of assessment of graduate learning outcomes.
5. Project team:
● Simon Barrie (The University of Sydney)
● Clair Hughes (The University of Queensland)
● Geoffrey Crisp (RMIT)
● Anne Bennison – Project Manager (The University of Queensland)
● Timeline: Jan 2011 – August 2012
● International reference group
● Broad in scope and range of activities
Project website http://www.itl.usyd.edu.au/projects/aaglo/
6. Project activities and outcomes to date
● Situational analysis – “Related projects” identified and documented; communication with project and institutional leaders
● Literature review, consultation with reference group, visits to international centres of excellence, and conference roundtables – Endnote library and summary papers:
1: The ALTC AAGLO project and the international standards agenda
2: Assurance of graduate learning outcomes through external review
3: Challenges of assessing Graduate Learning Outcomes (GLOs) in work-based contexts
4: Standardised testing of Graduate Learning Outcomes in Higher Education
5: Approaches to the assurance of assessment quality
6: Assessment policy issues in the effective assessment and assurance of GLOs
● Participation in national debates – response to government discussion paper on the Assessment of Generic Skills; co-authorship of “Mapping learning and teaching standards in Australian Higher education: An issues and options paper”
● Interviews – findings
7. Keynote
Trudy Banta - pioneer in outcomes
assessment
● Professor in Higher Education
● Senior Advisor to the Chancellor for
Academic Planning and Evaluation at
Indiana University - Purdue University
(IUPUI)
● founding editor of “Assessment Update”
● numerous publications on outcomes
assessment.
http://www.planning.iupui.edu/103.html
10. AAGLO Interviews
● Ethical approval
● Telephone interviews
● Participants selected through LTAS project and in consultation with LTAS
scholars
● 84 invitations to academics across 7 disciplines (Accounting/Business; Chemistry; Drama and Performance; Engineering; History; Law; Veterinary Science) representing LTAS demonstration clusters and a range of university types and locations throughout Australia
● 48 interviews conducted of approximately one hour (2 partial)
● broad coverage of assessment and assurance practice and issues
● Nvivo software for analysis and storage of data.
12. We interviewed …
● 30 male and 18 female academics
● academics from 26 institutions
● 15 Deans /Associate Deans
● 12 with program-level responsibilities
● 36 with single course responsibilities
● 41 who taught in one or more courses
● 17 involved in disciplinary initiatives around assessment and standards such
as LTAS project
● 10 involved in other national projects
● 3 LTAS Discipline Scholars
● 4 Quality Verification System (QVS) and 2 other external reviewers
● 4 past or current members of disciplinary accreditation panels
● several academics who had published in this area
17. Other task features
Task relationship patterns within a course
● Cumulative – 9 (a series of related tasks combined as a single product)
● Linked – 15 (successful completion of a task indicated likelihood of success in following tasks)
● Repetitive – 3 (same task repeated several times to develop expertise)
● Independent – 16 (different tasks assessed different components of a course)
Active student role
18. Effective task characteristics
● Multiple, related stages
● Aligned with course learning objectives – incorporation of TLOs such as self-organisation, management and lifelong learning; reflecting on social, cultural and ethical issues; applying local and international perspectives; planning ongoing personal and professional development
● Blurred distinction between learning and assessment activities
● Activities and text types characteristic of the profession
● Authentic contexts, roles and audiences
● 12 real-life
● 25 lifelike (definitional range)
● Careful group task design, management and grading
● Active role that developed student capacity for self-assessment and self-directed learning
20. Task quality assurance practice
Pre-implementation
● Assessment policy
● Other related policy (e.g. Quality Assurance)
● Mapping of program curriculum inputs (25) or program assessment (5)
● Formal approval processes for new and revised assessment by variously titled committees:
● Course level (3)
● Program level (14)
● Faculty or school level (26)
● Institutional level (8)
● Multiple levels (15)
● Some approval for examinations only
● Informal only (5)
Post-implementation
● Formal evaluation processes (24) incorporating:
● review of student satisfaction surveys
● monitoring by boards of examiners or other committees
● audits and reviews
● documentation and reporting of responsive action by course and program coordinators and sometimes individual teaching staff
● Student representation on faculty TL Committees (6)
● Response to student complaints (1)
● Informal only (6)
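The mapping of program curriculum inputs mentioned above can be sketched as a simple inverted index from graduate learning outcomes to the courses whose assessment is mapped against them. This is an illustrative sketch only, not the project's actual instrument; the course codes and GLO names are invented for the example.

```python
# Illustrative curriculum-input map: course -> GLOs its assessment tasks
# are mapped against. All names here are invented assumptions.
from collections import defaultdict

course_map = {
    "LAWS1001": ["communication", "ethical practice"],
    "LAWS2002": ["communication", "research"],
    "LAWS3003": ["research", "ethical practice", "communication", "lifelong learning"],
}

def glo_coverage(course_map):
    """Invert the map: GLO -> list of courses whose assessment addresses it."""
    coverage = defaultdict(list)
    for course, glos in course_map.items():
        for glo in glos:
            coverage[glo].append(course)
    return dict(coverage)

coverage = glo_coverage(course_map)
# Flag GLOs assessed in fewer than two courses as thin coverage.
thin = [g for g, courses in coverage.items() if len(courses) < 2]
print(coverage)
print("Thin coverage:", thin)
```

A map like this only shows where GLOs are *claimed* to be assessed; as slide 27 notes, coverage is at best a proxy measure of student progress.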
21. Assuring task quality
● Approval from a whole-of-program perspective
● Approval for significant change as well as for new
assessment tasks
● Effort spent prior to implementation to save effort after
implementation
● Where multiple approval is required at least one level
provides feedback beyond policy compliance
● Consequential review and evaluation procedures –
action required and reported
● Institutional data collection and reporting support the
evaluation process
● Inclusive – all have some level of responsibility for
assessment quality
22. HOW IS THE QUALITY OF TASK
JUDGEMENTS ASSURED?
23. The basis of judgements
● Course LOs based on institutional
graduate attributes (28), personal
experience (17) and accreditation
requirements (12)
● Common practice to provide criteria with marks, criteria-and-standards rubrics, or marking guidelines
● Links between the wording of course LOs and assessment criteria often unclear
24. Assuring standards
Pre-judgement (calibration) – examples:
● Workshop for staff to induct them into the standard expected for the award of different grades
● Project work is required at each level of the program, with about 70 academics involved in the assessment process. As part of their induction they are provided with a training session during which everyone marks particular group reports from previous years and displays their mark on yellow paper on the reports around the room, to enable them to compare their standards with those of others
● Preliminary marking of selected papers, and discussion of the application of criteria and standards, prior to marking of the remainder of papers
● Much marking is undertaken by sessional staff. They are gathered together and the criteria are explained. The unit team pick out a small number of assignments randomly to mark and discuss.
Post-judgement (consensus moderation):
● No moderation rare, and usually only if a single marker
● Moderation could be informal. The teams marking the assignment often sit in the same room to mark; they don’t have to, but normally do so as this is another opportunity for informal moderation.
● Consensus moderation most common approach (85 comments), e.g.:
● discussion to reach agreement
● double marking
● random checks by coordinator
● Some instances (5) of normal distribution requirement, with rescaling of ‘outliers’ or justification required
Sadler, D. R. (2012). Assuring comparability of achievement standards in higher education: From consensus moderation to calibration. Manuscript submitted for publication.
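The “normal distribution requirement with rescaling” that a few interviewees reported can be sketched as a linear transformation of raw marks toward a target mean and spread. The target parameters below are illustrative assumptions, not values from the interviews; as the Sadler reference suggests, this kind of statistical adjustment is a contested alternative to calibration and moderation.

```python
# Minimal sketch of rescaling raw marks to a target mean and standard
# deviation (z-score rescaling). Target values are assumed for illustration.
from statistics import mean, stdev

def rescale(marks, target_mean=65.0, target_sd=10.0):
    """Linearly rescale marks so they have the target mean and SD."""
    m, s = mean(marks), stdev(marks)
    return [round(target_mean + (x - m) / s * target_sd, 1) for x in marks]

raw = [48, 55, 60, 62, 70, 81]
print(rescale(raw))  # same ranking, adjusted distribution
```

Note that rescaling preserves the rank order of students but says nothing about whether the underlying judgements met any standard, which is why the interviews treat it as cheaper but less effective than calibration (slide 25).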
25. Assuring judgement quality
● Shared standards at program and course level
● Effort spent to establish standards prior to
judgements to save effort after judgements
have been made
● Criteria and standards basis for both
assessment judgements and moderation
● Inclusive – all have some level of
responsibility for assessment judgements
including casual staff
● Resourcing to support effective calibration and
moderation processes – rescaling cheaper but
less effective as professional development
26. HOW IS STUDENT PROGRESS
REPORTED ACROSS THE YEARS OF A
PROGRAM?
27. Recording student GLO
progress through a
program
● Few examples of progressive recording of student GLO development
● Most common was aggregation of course grades in summary numerical
forms such as those required for progressive GPA calculation
● Some year level (horizontal) approaches
● Mapping of inputs on assumption that coverage of GLOs in combination
with aligned assessment a logical proxy measure of progress. Challenged in
institutions with standardised grade cut-offs such as 50% “Pass” grades.
● 3 reports of informal approaches with small student cohorts (e.g. team
meetings)
● Reservations about the effectiveness of ePortfolios, as practice is inconsistent
● Most reported monitoring student progress as a current priority – a wait-and-see attitude to possible TEQSA requirements
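The aggregation of course grades into a progressive GPA, the most commonly reported summary form above, can be sketched as a credit-weighted running average. The 7-point grade scale and the course results here are illustrative assumptions, not data from the project.

```python
# Minimal sketch of progressive (cumulative) GPA aggregation.
# Grade-point values assume a common Australian 7-point scale.
GRADE_POINTS = {"HD": 7, "D": 6, "C": 5, "P": 4, "F": 0}

def progressive_gpa(results):
    """Yield the credit-weighted GPA after each completed course.

    results: iterable of (grade, credit_points) tuples in completion order.
    """
    total_points = 0.0
    total_credits = 0
    for grade, credits in results:
        total_points += GRADE_POINTS[grade] * credits
        total_credits += credits
        yield round(total_points / total_credits, 2)

# Example: three courses completed in sequence.
print(list(progressive_gpa([("D", 2), ("P", 2), ("HD", 4)])))  # → [6.0, 5.0, 6.0]
```

A single running number like this illustrates the limitation noted above: it records overall progression but carries no information about which GLOs the underlying grades reflect.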
28. CAN YOU IDENTIFY EXAMPLES OF
QUALITY IMPROVEMENT RESULTING
FROM QUALITY ASSURANCE
PRACTICES?
29. Quality improvement
Examples
● Nomination of task role and audience for report
after participation in ‘Achievement Matters’ project
● Lecturer feedback more challenging after
discussion and observation of feedback provided
by colleagues
● Tutor provision of annotated samples of work to
students to facilitate understanding of criteria and
standards
All examples were attributed to quality assurance processes that encouraged and facilitated dialogue with colleagues.