National Institute for Learning Outcomes Assessment
October 2011

Assessing Learning in Online Education:

The Role of Technology in Improving Student Outcomes
Matthew Prineas and Marie Cini
Foreword by Peter Ewell


Occasional Paper #12
learningoutcomesassessment.org

About the Authors
Matthew Prineas
Matthew Prineas is the Academic Director for English, Literature, and
Humanities in the School of Undergraduate Studies at University of Maryland University College. He received his PhD in English language and literature from the University of Rochester and an M.A. in English from the
University of Michigan, Ann Arbor. Previously, he worked as an administrator at the National Endowment for the Humanities, where he coordinated
curriculum development for EDSITEment, an online resource for K-12
teachers across the country. From 1995 to 2000, he was a faculty member in
the English department at Idaho State University. He has published articles
on Milton, Henry Vaughan, Ben Jonson, and other seventeenth-century
authors.
Marie Cini
Marie Cini is the Vice Provost and Dean of the Undergraduate School at
University of Maryland University College, where she oversees 32 baccalaureate programs serving 67,000 students worldwide. She earned a PhD
and a master’s degree in social psychology from the University of Pittsburgh,
as well as a master’s degree in counselor education from Pennsylvania State
University. Before UMUC, she was an associate vice president for Academic
Affairs and interim dean of the School of Management at City University
of Seattle. During her tenure there, she reorganized and led the school’s
e-campus initiative, while also establishing common curriculum standards,
enhancing the Prior Learning Assessment process and linking academic
affairs staff in the U.S. more closely with those in the international arena.
Dr. Cini has published and presented extensively on the topics of authentic
assessment and academic integrity, leadership development in adult learners,
retention and adult learners, group newcomers and innovation, and program
development across cultures.

Contents
Abstract
Foreword
Assessing Learning in Online Education
Learning Analytics: Designing Data-Driven Interventions to Support Student Learning
Online Mastery Learning: Closing the Gap Between Assessment and Learning
Courses and Programs That Learn
Changing Roles for Faculty
Conclusion
References
NILOA National Advisory Panel
About NILOA
NILOA Staff
NILOA Sponsors


The ideas and information contained in
this publication are those of the authors
and do not necessarily reflect the views of
Carnegie Corporation of New York, Lumina
Foundation for Education, or The Teagle
Foundation.


Abstract

Assessing Learning in Online Education
The national learning outcomes assessment (LOA) movement and
online learning in higher education emerged during roughly the same
period. What has not yet developed is a sophisticated understanding
of the power of online learning and its concomitant technologies to
change how we view, design, and administer LOA programs. This
paper considers how emerging techniques, such as data mining and
learning analytics, allow the use of performance and behavioral data
to improve student learning not just for future iterations of a program
but in real time for current students. Also considered are powerful
learning methodologies that predate online learning but have found
renewed utility when coupled with new technologies for assessing and
assisting student learners. In this paper, we postulate that technology
will enable educators to design courses and programs that learn in the
same way that individual students learn, and we offer some conditions
that we believe are important to further this goal. We conclude with a
consideration of how the faculty role will necessarily change as a result
of these advances in our understanding of using technology to improve
learning outcomes.


Foreword

Although online education for college and university study has experienced remarkable
growth over the past two decades, the links of online education to the assessment
of student learning outcomes have been largely overlooked. While the ongoing “no
significant difference” debate—which deploys assessment techniques to determine
whether computer-mediated approaches are “as good as” face-to-face classroom
delivery—may seem an exception, in my view, that debate has been fruitless. In this
compelling new NILOA Occasional Paper, Matthew Prineas and Marie Cini argue
persuasively not only that the connections between online education and learning
outcomes assessment are deep but also that the mediated settings provided by online
education have the potential to significantly improve assessment and its capacity to
improve teaching and learning.
Online education and assessment practice in higher education evolved
contemporaneously, as Prineas and Cini observe. Assessment’s beginning is usually
dated to the mid-1980s, when “distance education” consisted largely of paper-and-pencil correspondence courses and what now seem relatively primitive one-way video
communications. Now, of course, online education is multifaceted, sophisticated,
and almost ubiquitous. Assessment, meanwhile, has developed from the ad hoc use
of available standardized tests like the GRE and the ACT to the deployment of a
powerful array of instruments, portfolios, and rubrics—many of which are featured on
the NILOA web page. Yet while online education and assessment in higher education
developed in parallel, these two “movements” intersected little during that time. Now
they are poised to do so, Prineas and Cini emphasize, and both will benefit.
The authors offer three broad observations to support this claim. The first centers on
the ability of online learning management systems to harvest data not just on what
students learn but also on students’ every learning activity. Historically, one of the major
impediments to using assessment results to improve instruction has been the lack of data
about student learning behaviors and the inability to connect that behavioral evidence
analytically to evidence of learning. Instruments like the National Survey of
Student Engagement (NSSE) were designed to gather such data in conventional learning
environments. In online learning environments, however, these data are collected
automatically in learning management systems that obtain an electronic record of
everything the student does online. While making sense of all these data will certainly
be a challenge, data mining techniques and what Prineas and Cini term “learning analytics”
are already guiding improvement.
The authors’ second broad observation concerns the growing dominance of the asynchronous
mastery learning approach in the design of online education programs. Learning
assessment is integral to the design of these programs because students advance from
module to module only after demonstrating what they know and can do. In contrast
to most assessment programs in traditional academic programs, this means that instead
of assessing just a small sample of students, these programs assess every student against
established learning objectives. What is more, these programs require no additional data
collection, no “add-on” testing, and no application of rubrics to rescore already-graded
student work, as is common in portfolio-based assessment approaches.



The authors’ third broad observation is about the emerging best practices for building
effective learning environments online—practices that are analogous to the famous Seven
Principles of Good Practice for Undergraduate Education, articulated by Arthur Chickering
and Zelda Gamson some 25 years ago, but that are being applied in the new milieu of
virtual learning. Like the original principles, these practices are intended to guide good
pedagogy, but they also require an approach to curriculum design that is thoroughly
intentional and that incorporates learning outcomes from the outset. Consistent with
the mastery learning approach, this is a far different situation from that of conventional
assessment—in which learning outcomes are frequently developed after the fact and
applied to evaluate the effectiveness of courses never explicitly designed to fit together.
Another common feature of online education that is favorable to the development
of effective assessment, the authors note, is disaggregated faculty functions. While
instructional design, content delivery, advising and mentoring, and the assessment of
student work are all performed by the same individual in the conventional classroom,
in online settings these functions are frequently undertaken separately. Additionally
favorable to the development of effective assessment is the fact that, in online education,
assessment is a distinct faculty activity, providing faculty opportunities to undertake
assessment more deeply, more thoroughly, and more professionally than their
conventional grading activities allow.
Because I have seen much of what Prineas and Cini describe through the lens of a
particular institution—Western Governors University (WGU), which my organization
helped design and where I continue to serve as a council member—I am delighted to
introduce this paper. WGU is based entirely on the mastery learning approach and
has exactly the kind of “unbundled” faculty functions the authors describe. While
WGU is only just beginning to harness “learning analytics,” its remarkable growth and
development owe much to the fact that its structure and operations are based almost
entirely on the principles these authors articulate here.
Peter T. Ewell
Vice President, National Center for Higher Education Management Systems
(NCHEMS)
Senior Scholar, NILOA


Assessing Learning in Online Education:
The Role of Technology in Improving Student Outcomes
Matthew Prineas and Marie Cini
Online education, defined as a platform for delivering educational content and
facilitating instructor-student interaction over a computer network (Shelton &
Saltsman, 2005, pp. 3–4), came of age in the 1990s and grew rapidly over the
next decade (Allen & Seaman, 2010; U.S. Department of Education, 2003;
U.S. General Accounting Office, 2002). During roughly the same period,
increasing calls for accountability in higher education led to the development
of measures to establish the value of higher education in general, through the
mechanism known as “learning outcomes assessment” (LOA). Besides historical proximity, these movements—online education and LOA—shared important features: both represented the introduction of disruptive concepts into the
traditional face-to-face, faculty-centric classroom, and both raised questions
about the efficacy of traditional models of teaching and learning measurement
that had remained essentially unchanged for centuries. Yet, for the most part,
the two movements progressed independently. Early LOA efforts were focused
on the traditional or face-to-face classroom, with online courses assessed only
to determine whether the learning outcomes matched the face-to-face “standard.” Even today, LOA efforts tend to use the same approach, asking the same
questions about face-to-face classrooms as about online classrooms. What has
not emerged is a more sophisticated understanding of the power of online
learning to change how we view, design, and administer LOA programs.


Online education exists because technology made it possible. Technology is
also making possible an increasing ability to track, assess, and respond to the
behaviors and mastery levels of students in online courses with far greater
depth and rapidity than ever before. Researchers are focusing on the large
amounts of student data that can be gathered and archived in online courses
and programs in order to “mine” the data for patterns that can assist educators to improve learning outcomes. Referred to as “learning analytics” in
the higher education setting, these approaches enable faculty and course
designers to make rapid changes in instructional practices and curriculum,
and they empower students to make informed decisions about their learning
behaviors and course choices. Emerging technologies are also reinvigorating
powerful instructional methodologies such as mastery learning that in some
cases predate the emergence of online education by decades (Scrima, 2009).
These technological developments have the potential to re-energize LOA
efforts for both traditional and online education. Although LOA efforts have
grown substantially over the past decade, most institutions have, unfortunately, used
assessment results simply to satisfy accreditation pressures rather than to improve
student learning (Kuh & Ikenberry, 2009). Pressures
from the public and from the federal government to be more accountable for
the outcomes of higher education no longer allow us to pay only lip service to
student learning. The promise of learning outcomes assessment is that through
continuous improvement of curriculum and instruction, learning achievement
for all students should increase. Online education and its concomitant technologies promise better ways to help all our students reach their full potential.

In this paper we describe the power of learning analytics to improve student
learning not just for future iterations of a program but in real time for current
students. We also explore several powerful instructional methodologies that
have found renewed utility in the online environment. We postulate that technology will enable educators to design courses and programs that learn in the
same way individual students learn, and we offer some conditions that we
believe are important to further this goal. Finally, we present some concluding
thoughts about the ways in which the faculty role will necessarily change as a
result of these advances in our understanding of using technology to improve
learning outcomes.

Learning Analytics: Designing Data-Driven Interventions
to Support Student Learning
Colleges and universities typically offer online classes through course management system (CMS) software, which provides the virtual classroom space for
faculty and students to interact over the course of a semester (Watson & Watson, 2007).
These interactions are tracked and stored, making CMS an important potential source of data related to student learning. Using current CMS software—
which delivers curricular content in various formats including text, audio, and
video—students and faculty can conduct synchronous or asynchronous discussions, faculty can administer quizzes and exams, students can submit papers
and assignments, and faculty can provide grades and other forms of feedback.
Within a single online course section, then, an individual faculty member has
access to a wealth of quantitative and qualitative data about student engagement: for example, the number of words posted, the total number of posts, the average
length of posts, and the text of every student comment or question from the
first week’s discussion to the last. The sophistication and reach of CMS storage
and tracking features continue to grow. Additional kinds of data gathered by
CMS software may include how frequently students log in to their courses; the
length of time they spend on particular tasks such as reading a content page,
taking self-directed quizzes, and so on; and customized statistical analyses of
individual or group responses to quizzes. All such transactions are archived,
during and after the semester, for every online section of every course offered
at an institution. Thus, institutions with online enrollments in the thousands
or tens of thousands come to possess a vast repository of data with potential
applications in the assessment of student learning.
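To make the kind of data involved concrete, consider a minimal sketch in Python that aggregates per-student engagement metrics of the sort just described from a generic CMS event log. The field names and event types here are illustrative assumptions, not the schema of any particular CMS product.

```python
from collections import defaultdict

# Illustrative sketch only: field names ("student_id", "event_type",
# "word_count") are assumed, not drawn from any particular CMS schema.
def engagement_summary(events):
    """Aggregate per-student engagement metrics from a CMS event log.

    events: iterable of dicts such as
        {"student_id": "s01", "event_type": "post", "word_count": 120}
        {"student_id": "s01", "event_type": "login"}
    """
    stats = defaultdict(lambda: {"posts": 0, "words": 0, "logins": 0})
    for event in events:
        record = stats[event["student_id"]]
        if event["event_type"] == "post":
            record["posts"] += 1
            record["words"] += event.get("word_count", 0)
        elif event["event_type"] == "login":
            record["logins"] += 1
    # Derive the average post length discussed above.
    for record in stats.values():
        record["avg_post_length"] = (
            record["words"] / record["posts"] if record["posts"] else 0.0
        )
    return dict(stats)
```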


Higher education is developing an understanding of how to use these data
to improve student outcomes. Over the last several years, a movement has
emerged that attempts to uncover patterns in data stored on the CMS and
to design interventions accordingly. Because these applications are so new,
the terminology of the field is still in flux; in recently published papers and
conference presentations, it is common to see the terms “data mining,” “data
analytics,” “academic analytics,” and “learning analytics” used to describe a
family of related techniques and technologies for finding patterns in large data
sets (for example, see the annual conference program of the Western Cooperative for Educational Telecommunications [WCET] at http://wcetconference.wiche.edu/). Such techniques have been in existence for years in the business
world, typically under the rubric “data analytics” or “business intelligence,”
and have become more sophisticated than ever in profiling customer behavior
(as anyone who receives recommendations from Google, Amazon, or Netflix
knows). A recent and useful effort to define terms appropriate for the higher
education setting has distinguished between academic analytics, in which
data are examined based on a particular hypothesis or research question, and
data mining, which describes a family of techniques, not necessarily driven
by a hypothesis or question, for detecting patterns in large bodies of data.
Both approaches have applications in higher education (Baepler & Murdoch,
2010). For the purposes of this paper, we refer to all of these techniques as
“learning analytics.”

Learning analytics have recently received a great deal of attention in their role
of helping educators focus efforts and resources when designing interventions
for student success. For example, researchers have used learning analytics to discern
which types of students are most likely to be at risk of failure at various stages
in their academic journey. Once institutions identify these student groups,
they can design interventions such as supportive services, enhancements to
curricula, or improved instructional practices. A simple form of intervention,
based on patterns identifying at-risk students in the online classroom, is the
use of automatic emails sent to students to flag key problems such as a student’s
a) not posting in a discussion thread by a certain date, b) not submitting an
assignment, and/or c) not engaging frequently enough in online class discussion. This basic form of tracking can be done independently of the instructor;
the message can be automatically sent to a student, noting the missed work
or interaction, along with the reason why it is important for the student to
engage or re-engage with the course.
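A minimal sketch of such rule-based flagging might look as follows. The thresholds, field names, and message wording are assumptions for illustration, not a description of any production early-warning system.

```python
from datetime import date

# Assumed thresholds and field names, for illustration only.
DISCUSSION_DEADLINE = date(2011, 9, 15)
MIN_WEEKLY_POSTS = 2

def at_risk_flags(student, today):
    """Check the three warning signs described above for one student.

    student: dict such as {"first_post_date": None,
                           "assignment_submitted": False,
                           "posts_this_week": 1}
    """
    flags = []
    if student["first_post_date"] is None and today > DISCUSSION_DEADLINE:
        flags.append("has not posted in the discussion thread")
    if not student["assignment_submitted"]:
        flags.append("has not submitted the assignment")
    if student["posts_this_week"] < MIN_WEEKLY_POSTS:
        flags.append("is engaging infrequently in class discussion")
    return flags

def draft_reminder(student, today):
    """Compose an automatic message, independently of the instructor."""
    flags = at_risk_flags(student, today)
    if not flags:
        return None
    return ("Our records show that you " + "; ".join(flags) +
            ". Students who re-engage at this point are far more likely "
            "to complete the course successfully.")
```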
In sophisticated course management systems these messages can even share
the knowledge that a certain percentage of students who do not engage in
these targeted ways are “likely to fail,” thus motivating students to either
engage or seek help. A pioneering example of such an early system designed
and used by Purdue University—Signals—draws from 20 data points. The
Signals algorithm synthesizes quiz and test data, but it can also incorporate
time spent on task and other behavioral measures. These data are spliced
together and presented back to the student as well as to the faculty member
in the intuitive format of traffic signals: red, yellow, or green lights reflecting
the level of the student’s performance (Arnold, 2010). Similar to the Signals
dashboard, another approach assisting student learning has been developed
by researchers at the University of Maryland, Baltimore County (UMBC), who began
with the observation that students earning a D or F used the university’s CMS
39% less frequently than students earning a grade of C or higher. Reasoning
that these students might make better choices about their own engagement
in online discussions if they had access to real-time data about the effects of
their classroom behaviors, the researchers designed a tool that allows students
to follow in real time their levels of activity in the online classroom as well as
their grades compared to an anonymous summary of their peers’ performance
(Fritz, 2011). All this information is stored in most CMS software already; the
UMBC tool simply made it accessible and understandable for each student.
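In the spirit of these two systems, the sketch below maps a handful of performance and behavioral measures to a traffic-light indicator. The weights, cut points, and choice of inputs are invented for illustration; the actual Signals algorithm draws on some 20 data points and is not reproduced here.

```python
# Weights and cut points below are invented for illustration; they do
# not reproduce the actual Signals algorithm.
def traffic_light(quiz_avg, on_time_rate, cms_activity_vs_peers):
    """Return "green", "yellow", or "red" for one student.

    quiz_avg, on_time_rate: normalized to the range 0.0-1.0.
    cms_activity_vs_peers: the student's CMS activity divided by the
        class average (capped at 1.0 when above average).
    """
    score = (0.5 * quiz_avg
             + 0.3 * on_time_rate
             + 0.2 * min(cms_activity_vs_peers, 1.0))
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "yellow"
    return "red"

# Example: a student whose CMS activity is 39% below the class average.
print(traffic_light(0.55, 0.60, 0.61))  # -> "yellow"
```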


Learning analytics are being used not just in the classroom, but in a variety
of other operational functions supporting online programs. For example,
instructional designers can now use the CMS to improve courses through “built-in”
feedback mechanisms that continually (not just at the end of a course)
gather user feedback on the relative usefulness of course features. In the past,
such improvements required the labor-intensive analysis of individual student
evaluations after the semester. Increasingly, mechanisms for user feedback can
be built into the online course, so that as students in multiple sections of a
course use and respond to various course features, course designers receive
large amounts of data in real time about which features students are using or
not using, which features are problematic, and so on (“Using data,” 2011).
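A minimal sketch of how such embedded feedback might be aggregated for course designers follows; the rating vocabulary and the reporting rule are assumptions for illustration.

```python
from collections import defaultdict

# The rating vocabulary ("useful", "not_useful", "broken") and the
# reporting rule are assumptions for illustration.
def problem_features(feedback_events):
    """Surface course features that students report as problematic.

    feedback_events: iterable of (feature_name, rating) pairs gathered
    continually from widgets embedded in many sections of a course.
    """
    tallies = defaultdict(lambda: {"useful": 0, "not_useful": 0, "broken": 0})
    for feature, rating in feedback_events:
        tallies[feature][rating] += 1
    # Report any feature whose negative ratings outweigh its positives.
    return {feature: counts for feature, counts in tallies.items()
            if counts["broken"] + counts["not_useful"] > counts["useful"]}
```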

Online Mastery Learning: Closing the Gap Between
Assessment and Learning
Another set of emerging tools for assessing student outcomes may be
grouped under the heading online mastery learning, which combines
the decades-old instructional methodology of mastery learning with
the technology of online education. First developed in the 1960s and
early 1970s, mastery learning represents an instructional approach
that radically closes the gap between assessing student achievement
and intervening to assist and advance student learning (Scrima, 2009).
In mastery learning, students must demonstrate proficiency in a particular
learning objective before they are allowed to progress to a subsequent stage.
Assessment and feedback alternate in frequent cycles as the student progresses
through each stage of the curriculum. Time to completion becomes fluid.
Students begin with differing sets of prior knowledge, progress at different
rates, and master the course curriculum within different time frames. Clearly
and precisely defined learning outcomes—absolutely crucial for this instructional methodology—must adequately define the criteria of mastery toward
which students are working. Of equal importance are the precise definition
and accurate alignment of incremental learning objectives that delineate
the intermediate stages in the pathway toward mastery. While research into
mastery learning has supported its efficacy, it has also highlighted the substantial investment of time and faculty involvement required
to implement an effective curriculum using this approach (Kulik, Kulik, &
Bangert-Drowns, 1990).
Recent advances in online learning are breathing new life into this instructional
methodology, however. In a fully online curriculum, a variety of assessment
instruments—and the technical means for instant feedback—can be built into
every instructional activity. Previously static aspects of an online course (for
example, content areas containing background reading) can be designed as
interactive, with embedded assessments first testing and then assisting students as they
advance through the curriculum, whether on their own, in collaboration with
other students, or with the help of an instructor. Particularly important, the
process of learning for each individual can be tracked, monitored, and assisted.
Underlying all the elements of such a course are data collection “engines”
capable of gathering and quickly processing a large amount of information on
student performance. This marriage of mastery learning and technology allows
instructors to monitor students’ learning and to intervene when students need
assistance.
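The gating logic at the heart of this approach is simple to state. The sketch below, with an assumed module sequence and an assumed 80% mastery threshold, advances a student only after the embedded assessment for the current module has been passed.

```python
# Module names and the 80% mastery threshold are assumptions for
# illustration; real curricula define these per learning objective.
MASTERY_THRESHOLD = 0.8
MODULE_SEQUENCE = ["fractions", "decimals", "percentages"]

def next_module(best_scores):
    """Return the first module not yet mastered, or None when finished.

    best_scores: dict mapping module name to the student's best score
    (0.0-1.0) on that module's embedded assessment.
    """
    for module in MODULE_SEQUENCE:
        if best_scores.get(module, 0.0) < MASTERY_THRESHOLD:
            # The student keeps working here, with feedback, until the
            # threshold is met; time to completion is therefore fluid.
            return module
    return None

print(next_module({"fractions": 0.9, "decimals": 0.65}))  # -> "decimals"
```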


Computer-mediated approaches to mastery learning are being developed by
a smattering of researchers and for-profit companies. Current leaders in this
effort include Carnegie Mellon’s Open Learning Initiative (OLI) and the
Khan Academy.
With open learning courses in fields ranging from engineering to French, OLI
is working to develop multiple feedback loops for rapidly collecting, disseminating, analyzing, and acting on student learning data:
•	 Feedback to students: Built into the mastery model of OLI courses
are mechanisms that provide students with feedback even during
the process of problem-solving, allowing for self-paced, incremental
progress toward learning objectives.
•	 Feedback to instructors: OLI researchers are developing ways to
harness the large amounts of assessment data generated by the model
and to present these data to instructors in a manner that empowers
them to intervene in timely and effective ways (Bajzek et al., 2008).
•	 Feedback to course designers: The OLI model provides course
designers with rapid and frequent feedback to improve online
courses in real time. In addition, the OLI model looks not only
at patterns of how students use particular course features but also
compares those usage patterns with learning assessment results.
Much in the news of late, the Khan Academy (http://www.khanacademy.
org/) is known for its collection of short instructional videos on topics
ranging from arithmetic to finance, narrated and posted on YouTube by
Salman Khan, a former hedge fund manager who aimed, initially, to tutor
his cousins in math. The Khan Academy approach, now augmenting videos
with technology and mirroring online mastery learning, is being used in K–12
settings to complement traditional classroom instruction (Rasicot, 2011).
Like OLI, the Khan Academy approach reverses the usual relationship of
homework and lecture or classroom discussion. At home, students learn
content material at their own pace by working through incremental, interactive lessons and tutorials designed on the mastery model of progression.
Data are collected that instructors can use to help students learn concepts they
may be struggling with in the computer-mediated lessons. The classroom then
becomes a place of active work, with students using class time to complete
projects and assignments, either individually or in groups, with an instructor
present to monitor, coach, and assist their efforts.
Underlying the Carnegie Mellon OLI and Khan Academy models of mastery
learning is an assessment methodology that also predates online learning but
that in recent years has been energized by the growing capacity to track, store,
and analyze student learning data. Adaptive achievement testing, according
to Wainer (1990), bases test questions on students’ level of prior knowledge
as shown in their responses to previous questions. Adaptive testing is more
efficient in that fewer questions can be used to assess a student’s level of knowledge than with traditional linear testing, which requires a student to answer
all low-level questions before moving to more difficult material. With the use
of adaptive testing, instructors can quickly determine students’ skill level and
provide them with just-in-time learning, making optimal use of their prior
knowledge. This assessment method is also made simpler through course
management systems that connect new learning materials to the questions the
students answer incorrectly.
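Production adaptive tests rest on item response theory (Wainer, 1990); as a rough illustration of the underlying idea, the staircase sketch below raises the difficulty of the next question after a correct answer and lowers it after an incorrect one, so a student's level can be located with relatively few items.

```python
import random

# A simplified staircase stand-in for item-response-theory-based
# adaptive testing; the five difficulty levels are an assumption.
def run_adaptive_quiz(item_bank, ask, n_items=5):
    """Estimate a student's level with a short adaptive item sequence.

    item_bank: dict mapping difficulty level (1..5) to a non-empty
        list of items at that level.
    ask: callback that presents an item and returns True when the
        student answers correctly.
    """
    level = 3  # start in the middle of the difficulty range
    for _ in range(n_items):
        item = random.choice(item_bank[level])
        if ask(item):
            level = min(5, level + 1)  # step up after a correct answer
        else:
            level = max(1, level - 1)  # step down, and remediate here
    return level  # the final level approximates the student's skill
```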

Courses and Programs That Learn


Just as individual learners need feedback that is timely, targeted, and likely to
be acted upon, courses and programs also need feedback loops that efficiently
and quickly direct the results of assessment to improve student learning. The
emerging tools for online assessment examined previously are primarily being
used to assist individual student learning in vivo, for current students at the
time of greatest need. As a national movement, however, learning outcomes
assessment is concerned mostly with adjustments to curricula and instruction at the program level, generally applying these adjustments to assist future
students to achieve at greater levels. To be truly revolutionary, student learning
data generated in online technologies must be applied in a systematic way at
the program level and in real time, so that students can benefit from ongoing
adjustments at the program level—changes in curricula, course sequences,
academic requirements, resource allocation, and so on. Online technologies
can provide this real-time, program-level feedback loop in ways that traditional classrooms simply cannot.
Two preconditions seem necessary for effectively scaling emerging technologies for assessment in the online classroom to the needs of program-level, real-time LOA efforts. First, as the example of online mastery learning suggests,
educational technology will be most effective when coupled with a conceptual learning model—for example, a set of best practices about the kinds of
interventions that have the greatest impact on student performance in the
classroom. Without a learning model to guide course-level uses of online
assessment technology, ad hoc applications by individual instructors will be
difficult to scale up to instructional changes across multiple sections of the
same course or across a program. Second, to ensure that assessment data are
actionable at the course and program level, there needs to be a tight integration of learning outcomes throughout the curriculum, including assignments, discussion, course material, and instructional practices. Both of these
preconditions—a guiding learning model and tight curricular integration—are,
of course, essential for LOA in any program, online or face-to-face. With the
ever-increasing volume of assessment data available from emerging technologies, however, such structural underpinnings are becoming even more important. Without a model to guide this work and to serve as a framework for
collecting and interpreting this great amount of data, we will find ourselves
simply unable to use it effectively.
Fortunately, the foundations already exist for a learning model appropriate for the
online environment. The learner-centered tenets, based on a broad review of the
learning literature, articulated in the influential Seven Principles for Good Practice
in Undergraduate Education by Chickering and Gamson (1987), are particularly
applicable to the task of identifying interventions that impact student performance: faculty engagement, student collaboration, active learning, frequent and
prompt feedback, time on task, high expectations, and respect for diversity (see
also Chickering & Ehrmann, 1996). Although initially developed for the face-to-face environment, the Chickering/Gamson model has been utilized widely in
the online environment. Palloff and Pratt (2009), in Assessing the Online Learner:
Resources and Strategies for Faculty, provide numerous examples and models for
designing effective assessments in the online modality as well as effective interventions based on outcomes data. These educators offer institutions both a theoretical framework based on the learner-centered approach and a practical guide
to online assessment practices leading to improvements in instruction, courses,
and programs. Similar approaches—emphasizing interactivity, instructor feedback, critically engaging assignments, and total time on task—are reflected as
well in a growing number of published best practices for online education.
Among the first, if not the first, of such sets of published guidelines was the
1995 document produced by the Western Cooperative for Educational Telecommunications (WCET) in cooperation with the eight regional accrediting bodies,
Principles of Good Practice for Electronically Offered Degree and Certificate
Programs. A more recent best-practices document produced by WCET (2009),
Best Practice Strategies to Promote Academic Integrity in Online Education, Version
2.0, emphasizes assessment practices in the online classroom. The practices in
these documents are supported as well by reviews and meta-analyses of comparative studies on online and face-to-face education published over the past several
years (see Bernard et al., 2004; Tallent-Runnels et al., 2006; U.S. Department
of Education, 2010; Zhao, Lei, Yan, Lai, & Tan, 2005).


The second precondition for effectively scaling up course-level assessment data to
changes at the program level is the integration of learning outcomes throughout
all aspects of the curriculum, including assignments, discussion, course material,
and other instructional features through a “backward design” process beginning
with robust program outcomes. Many universities, however, using a more traditional approach to curriculum design, have responded to the growing market for
online education by quickly cobbling together degree programs that combine
existing face-to-face elements with new courses developed in the online format
(Shelton & Saltsman, 2005). Program-level LOA in this context becomes a Rube
Goldberg operation of retrofitting curricula with objectives derived after the
fact from existing courses developed by individual faculty—with inevitable gaps
and contradictions. Learning objectives in individual courses may be presented
out of sequence. Some program-level outcomes may not be supported by sufficient learning experiences for students. In addition, course-level assignments
and assessments originally designed for the face-to-face environment may not
be ideal for an online classroom, making it more difficult to identify which
program-level adjustments should be made in response to assessment data.
A more efficient and effective scenario would be a curriculum in which outcomes
and assessments are built in from the beginning. Competencies could then be
sequentially developed and reinforced within courses and across the student
experience, with assessments and feedback deeply embedded and intertwined in
each stage of learning. One example of this approach to curriculum redesign is the Supporting Educational Goals for Undergraduate Excellence (SEGUE) project, recently undertaken by the Undergraduate School at the University of Maryland University College. The SEGUE process transformed the undergraduate curriculum, making it more relevant, sequenced, and focused on learning
outcomes (Stiehl & Lewchuk, 2008). Applying this process to connect individual courses to programs of study, the undergraduate faculty created seamless
learning pathways in 33 programs of study by answering this question: What do
we expect students to be able to do “out there” as a result of the learning experiences we offer them “in here” in our curriculum in the Undergraduate School?
Program outcomes were focused not only on the workplace but also on the
community, the family, and global society. As a result of SEGUE, faculty have a
better understanding of the program in which they teach, its intended outcomes,
and what students in their particular course must master to reach intended
learning outcomes. Also as a result of SEGUE, students are better able to see the
connections between their courses and the contributions of each course to their
overall education, their career, and their role in society.
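One way to picture the artifact such a backward-design process produces is an explicit curriculum map linking each program outcome to the courses and embedded assessments that develop it. The outcome, course, and assessment names below are invented; the check mirrors the gaps described above, flagging outcomes without a full pathway to mastery.

```python
# All names below are invented for illustration; a real curriculum map
# would be built, outcome by outcome, through backward design.
CURRICULUM_MAP = {
    "Communicate effectively in professional settings": [
        ("WRTG 101", "persuasive essay", "introduced"),
        ("COMM 300", "team presentation", "reinforced"),
        ("WRTG 494", "capstone portfolio", "mastered"),
    ],
    "Apply quantitative reasoning to real-world problems": [
        ("MATH 105", "modeling project", "introduced"),
    ],
}

def unsupported_outcomes(curriculum_map, min_experiences=3):
    """Flag program outcomes lacking a full pathway to mastery, one of
    the gaps that retrofitted curricula tend to exhibit."""
    return [outcome for outcome, pathway in curriculum_map.items()
            if len(pathway) < min_experiences
            or pathway[-1][2] != "mastered"]

print(unsupported_outcomes(CURRICULUM_MAP))
# -> ["Apply quantitative reasoning to real-world problems"]
```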

Changing Roles for Faculty
If our assertions here are correct, the distinction between online courses and
face-to-face courses will continue to blur. We foresee the day when no course
will be entirely face-to-face without at least some online practice and assessments. At many institutions, face-to-face courses already use the same CMS software to
complement traditional class sessions; at others, such as the University
of Maryland University College, the process has gone further, with all previously
face-to-face courses now being taught only as “hybrid” or “blended” format
courses (a hybrid or blended course is one in which online course activities
replace a portion of the course’s official “seat time”).
As this blurring continues, the key group with a significant impact on institutional LOA efforts will be faculty. Without faculty support and engagement,
these changes simply cannot occur. To be effectively deployed on a large scale,
these promising approaches will require a profound transformation in how
faculty members interact with students as well as how faculty create and improve
curricula and programs. Perhaps the greatest conceptual shift for faculty to
absorb will be the new reality that the effective teaching of each student “takes a
village” and that the individual instructor in isolation can no longer achieve this.
The “classroom walls” of the online course will become increasingly porous, with
library staff, course designers, and various other support staff creating, maintaining, and continually improving the learning experience for students (Neely
& Tucker, 2010). The new technologies and approaches to online assessment
only accelerate a process that has marked online learning programs from the
beginning, particularly those programs with large enrollments and students and
faculty dispersed across the nation. While demands of scale and cost dictate the
sharing of responsibility for course content and instruction for larger online
programs, creating and upgrading the learning environment already rests with
a network of staff and systems. What has changed with the emergence of new
technologies and approaches such as learning analytics is that we are now able
to make informed interventions on students’ behalf during the semester, not just
before the next one.


Conclusion
Redesign efforts to create “programs that learn” will push us further away still from
the comfortable and thus-far dominant model of the instructor who has mastered
a discipline and who imparts his or her knowledge either through lecture or class
discussion. In this new world of higher education, rather than delivering prepared
lectures to all students regardless of their foundational knowledge, the instructor’s role will be to monitor students’ progress and to intervene appropriately
when students are unable to advance to the next stage in the curriculum. Instead
of the instructor transferring knowledge to students, the instructor (along with
learning experts) will develop learning environments with content designed for
students in ways that align with best practices. As students work through material delivered online, the role of the instructor will not be to teach all topics to
all students but, rather, to monitor which students are having trouble mastering
which concepts, so that specific help can be provided to those students at the right
time. Faculty roles in pre-semester preparation and post-semester adjustments
are changing as well. In outcomes-based program design, course materials and
assignments will be delivered based on learning research, not simply on how an
instructor wishes to deliver the content based on his or her own past experience.
Course curricula, assignments, and infrastructure will be designed ahead of
time, with a tight integration of learning outcomes in all aspects of the course.
The difficulty and rigor of assignments will also be set and refined by collective
efforts and a shared model of how students learn, rather than by each individual instructor’s idiosyncratic practice.
Emerging technologies and approaches in online education enable all instructors to practice what the very best teachers have always known: that students
are enabled or limited by their prior knowledge (Ambrose, Bridges, DiPietro,
Lovett, & Norman, 2010) and that the fine art of teaching involves discovering the multiple pathways that will move students from where they are to
the desired learning outcomes. If a student comes to the course knowing a
great deal already, the online instructor will be able to assist that student to
accelerate the time to completion in a course. If another student needs more
help and time to acquire needed prior knowledge and skills, the instructor can
augment the student’s learning to fill in some of the basic knowledge required
before moving forward. While these features could, of course, be integrated in
an online course designed by an individual instructor, the truth is that most
faculty simply do not have the time, resources, or expertise to recreate what
can be integrated far more simply into course management systems for online
education.
There will be some who find the seeming loss of faculty autonomy in this
new world of higher education profoundly disturbing. Such voices will likely
be widespread and, particularly for traditional institutions without extensive experience with online courses and programs, may present a barrier to
adopting the new technologies and approaches examined in this paper. Other
faculty members, however, will welcome a work environment in which their
time is not consumed by preparing learning materials and grading student
work. They will embrace the opportunity to meet students where they are
intellectually and to provide individualized assistance benefitting the at-risk
student as well as the more advanced student held back by a one-size-fits-all
approach.


References
Allen, I. E., & Seaman, J. (2010, November). Class differences: Online education in the United States, 2010. Wellesley, MA:
Babson Survey Research Group and The Sloan Consortium. Retrieved from http://sloanconsortium.org/publications/
survey/class_differences
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven
research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.
Arnold, K. E. (2010, November 1). Signals: Applying academic analytics. Educause Quarterly, 33(1). Retrieved
from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/
SignalsApplyingAcademicAnalyti/199385
Baepler, P., & Murdoch, C. J. (2010, July). Academic analytics and data mining in higher education. International Journal
for the Scholarship of Teaching and Learning. 4(2), 1–9.
Bajzek, D., Brooks, J., Jerome, W., Lovett, M., Rinderle, J., Rule, G., & Thille, C. (2008). Assessment and instruction: Two
sides of the same coin. In G. Richards (Ed.), Proceedings of World Conference on E-Learning in Corporate, Government,
Healthcare, and Higher Education 2008 (pp. 560–565). Chesapeake, VA: AACE. Retrieved from http://oli.web.cmu.edu/
openlearning/files/theinitiative/publications/eLearn08-TwoSides.pdf
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., . . . & Huang, B. (2004). How does
distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational
Research, 74(3), 379–439. doi: 10.3102/00346543074003379
Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as a lever. American Association of
Higher Education Bulletin, 49(2), 3–6. Retrieved from http://www.aahea.org/bulletins/articles/sevenprinciples.htm
Chickering, A. W., & Gamson, Z. (1987). Seven principles for good practice in undergraduate education. American
Association of Higher Education Bulletin, 39(7) 3–7. Retrieved from http://www.aahea.org/bulletins/articles/
sevenprinciples1987.htm
Fritz, J. (2011, March). Classroom walls that talk: Using online course activity data of successful students to raise selfawareness of underperforming peers. The Internet and Higher Education, 14(2), 89–97.
Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: learning outcomes assessment in American
higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes
Assessment. Retrieved from http://www.learningoutcomeassessment.org/documents/fullreportrevised-L.pdf
Kulik, C.-L. C., Kulik, J. A., & Bangert-Drowns, R. L. (1990, summer). Effectiveness of mastery learning programs: A
meta-analysis. Review of Educational Research, 60(2), 265–299. doi: 10.3102/00346543060002265
Neely, P. W., & Tucker, J. P. (2010). Unbundling faculty roles in online distance education programs. Contemporary Issues in
Education Research, 3(6), 17–24. Retrieved from http://journals.cluteonline.com/index.php/CIER/article/viewArticle/211
Palloff, R. M., & Pratt, K. (2009, December). Assessing the online learner: Resources and strategies for faculty. San Francisco,
CA: Jossey-Bass.
Rasicot, J. (2011, August 4). Education review: Web site offering free math lessons catches on ‘like wildfire.’ The Washington
Post. Retrieved from http://www.washingtonpost.com/lifestyle/magazine/web-site-offering-free-online-math-lessonscatches-on-like-wildfire/2011/07/15/gIQAtL5KuI_story.html
Scrima, A. E. (2009). Implementing a mastery model through self quizzing in an online learning environment (Doctoral
dissertation). Retrieved from Proquest Dissertations & Theses, AAT No. 3364685.
Shelton, K., & Saltsman, G. (2005). An administrator’s guide to online education. Charlotte, NC: IAP-Information Age.
Stiehl, R., & Lewchuk, L. (2005). The assessment primer: Creating a flow of learning evidence. Corvallis, OR: The Learning
Organization.
Stiehl, R., & Lewchuk, L. (2008). The outcomes primer: Reconstructing the college curriculum (3rd ed.). Corvallis, OR: The
Learning Organization.
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S. , Ahern, T. C., Shaw, S. M., & Liu, X. (2006,
spring). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93–135. doi:
10.3102/00346543076001093
National Institute for Learning Outcomes Assessment  |  14    
U.S. Department of Education, National Center for Education Statistics. (2003). Distance education at degree-granting postsecondary education institutions, 2000–2001 (NCES 2003-017). Washington, DC: Author. Retrieved from http://nces.ed.gov/pubs2003/2003017.pdf
U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010, September). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies (ID: CSD5650). Washington, DC: Author. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
U.S. General Accounting Office. (2002). Distance education: Growth in distance education programs and implications
for federal education policy (Statement of Cornelia M. Ashby, Director, Education, Workforce, and Income Security
Issues. Testimony before the Committee on Health, Education, Labor, and Pensions, U.S. Senate [GAO-02-1125]).
Washington, DC: Author. Retrieved from http://www.gao.gov/new.items/d021125t.pdf
Using data to improve online courses. (2011, May 15). Distance Education Report. Retrieved from http://www.magnapubs.com/newsletter/story/5718/
Wainer, H. (1990). Computerized adaptive testing: A primer. Hillsdale, NJ: Lawrence Erlbaum.
Watson, W. R., & Watson, S. L. (2007). An argument for clarity: What are learning management systems, what are they not, and what should they become? TechTrends, 51(2), 28–34. Retrieved from http://www.springerlink.com/content/r1ur45g5t3225553/
Western Cooperative for Educational Telecommunications (WCET). (1995). Principles of good practice for electronically offered degree and certificate programs. Retrieved from http://wcet.wiche.edu/wcet/docs/publications/PrinciplesofGoodPractice1995.pdf
Western Cooperative for Educational Telecommunications (WCET), UT TeleCampus, & Instructional Technology Council. (2009). Best practice strategies to promote academic integrity in online education, Version 2.0. Retrieved from http://wcet.wiche.edu/wcet/docs/cigs/studentauthentication/BestPractices.pdf
Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884. Retrieved from http://www.tcrecord.org/Content.asp?ContentId=12098

NILOA National Advisory Panel
Joseph Alutto
Provost
The Ohio State University
Trudy W. Banta
Professor
Indiana University-Purdue University
Indianapolis
Douglas C. Bennett
President
Earlham College
Wallace Boston
President and CEO
American Public University System

Charlene Nunley
Program Director
Doctoral Program in Community College
Policy and Administration
University of Maryland University College
Randy Swing
Executive Director
Association for Institutional Research
Carol Geary Schneider
President
Association of American Colleges and
Universities

Molly Corbett Broad
President
American Council on Education

Michael Tanner
Chief Academic Officer/Vice President
Association of Public and Land-grant
Universities

Judith Eaton
President
Council for Higher Education Accreditation

Belle Wheelan
President
Southern Association of Colleges and Schools

Richard Ekman
President
Council of Independent Colleges
Mildred Garcia
President
California State University Dominguez Hills

Ralph Wolff
President
Western Association of Schools and Colleges

Ex-Officio Members

Susan Johnston
Executive Vice President
Association of Governing Boards

Peter Ewell
Vice President
National Center for Higher Education
Management Systems

Steve Jordan
President
Metropolitan State University - Denver

Stanley Ikenberry
President Emeritus and Regent Professor
University of Illinois

Paul Lingenfelter
President
State Higher Education Executive Officers

George Kuh
Director, National Institute for Learning
Outcomes Assessment
Adjunct Professor, University of Illinois
Indiana University Chancellor’s Professor
Emeritus

George Mehaffy
Vice President
Academic Leadership and Change
American Association of State Colleges and
Universities

NILOA Mission
NILOA’s primary objective is to
discover and disseminate ways that
academic programs and institutions
can productively use assessment data
internally to inform and strengthen
undergraduate education, and externally to communicate with policy makers, families, and other stakeholders.

NILOA Occasional Paper
Series
NILOA Occasional Papers
are commissioned to examine
contemporary issues that will inform
the academic community of the
current state of the art of assessing
learning outcomes in American higher
education. The authors are asked to
write for a general audience in order
to provide comprehensive, accurate
information about how institutions and
other organizations can become more
proficient at assessing and reporting
student learning outcomes for the
purposes of improving student learning
and responsibly fulfilling expectations
for transparency and accountability
to policy makers and other external
audiences.

Comments and questions about this
paper should be sent to
sprovez2@illinois.edu.

About NILOA
•	 The National Institute for Learning Outcomes Assessment (NILOA)
was established in December 2008.
•	 NILOA is co-located at the University of Illinois and Indiana
University.
•	 The NILOA web site went live on February 11, 2009.

www.learningoutcomesassessment.org

•	 The NILOA research team has scanned institutional websites,
surveyed chief academic officers, and commissioned a series of occasional papers.
•	 One of the co-principal NILOA investigators, George Kuh, founded
the National Survey of Student Engagement (NSSE).
•	 The other co-principal investigator for NILOA, Stanley Ikenberry,
was president of the University of Illinois from 1979 to 1995 and
of the American Council on Education from 1996 to 2001. He is
currently serving as Interim President of the University of Illinois.
•	 Peter Ewell joined NILOA as a senior scholar in November 2009.

NILOA Staff
National Institute for Learning Outcomes Assessment

Stanley Ikenberry, Co-Principal Investigator
George Kuh, Co-Principal Investigator and Director
Peter Ewell, Senior Scholar
Staci Provezis, Project Manager and Research Analyst
Jillian Kinzie, Associate Research Scientist
Natasha Jankowski, Associate Project Manager and Research Analyst
Gloria Shenoy, Research Analyst
Gianina Baker, Research Analyst

NILOA Sponsors
Carnegie Corporation of New York
Lumina Foundation for Education
The Teagle Foundation

Permission is hereby granted to reproduce and distribute copies of this work for nonprofit educational purposes, provided that copies are
distributed at or below cost, and that the author, source and copyright notice are included on each copy. © 2011 NILOA
Produced by Creative Services | Public Affairs at the University of Illinois for NILOA. 10.032


National Institute for Learning Outcomes Assessment
For more information, please contact:
National Institute for Learning Outcomes Assessment (NILOA)
University of Illinois at Urbana-Champaign
340 Education Building
Champaign, IL 61820
learningoutcomesassessment.org
njankow2@illinois.edu
Fax: 217.244.3378
Phone: 217.244.2155

  • 2. About the Authors Matthew Prineas Matthew Prineas is the Academic Director for English, Literature, and Humanities in the School of Undergraduate Studies at University of Maryland University College. He received his PhD in English language and literature from the University of Rochester and an M.A. in English from the University of Michigan, Ann Arbor. Previously, he worked as an administrator at the National Endowment for the Humanities, where he coordinated curriculum development for EDSITEment, an online resource for K-12 teachers across the country. From 1995-2000, he was a faculty member in the English department at Idaho State University, and has published articles on Milton, Henry Vaughan, Ben Jonson, and other seventeenth-century authors. Marie Cini Marie Cini is the Vice Provost and Dean of the Undergraduate School at University of Maryland University College, where she oversees 32 baccalaureate programs serving 67,000 students worldwide. She earned a PhD and a master’s degree in social psychology from the University of Pittsburgh, as well as a master’s degree in counselor education from Pennsylvania State University. Before UMUC, she was an associate vice president for Academic Affairs and interim dean of the School of Management at City University of Seattle. During her tenure there, she reorganized and led the school’s e-campus initiative, while also establishing common curriculum standards, enhancing the Prior Learning Assessment process and linking academic affairs staff in the U.S. more closely with those in the international arena. Dr. Cini has published and presented extensively on the topics of authentic assessment and academic integrity, leadership development in adult learners, retention and adult learners, group newcomers and innovation, and program development across cultures. Contents Abstract . . . 3 Foreword . . . 4 Assessing Learning in Online Education . . . 6 Learning Analytics: Designing DataDriven Interventions to Support Student Learning . . . 7 Online Mastery Learning: Closing the Gap Between Assessment and Learning . . . 8 Courses and Programs that Learn . . .9 Changing Roles for Faculty . . . 12 Conclusion . . . 12 References . . . 14 NILOA National Advisory Panel . . . 16 About NILOA . . . 17 NILOA Staff . . . 17 NILOA Sponsors . . . 17 In this compelling new NILOA Occasional Paper, Matthew Prineas and Marie Cini argue persuasively not only that the connections between online education and learning outcomes assessment are deep but also that the mediated settings provided by online education have the potential to significantly improve assessment and its capacity to improve teaching and learning. Peter T. Ewell Vice President, NCHEMS Senior Scholar, NILOA The ideas and information contained in this publication are those of the authors and do not necessarily reflect the views of Carnegie Corporation of New York, Lumina Foundation for Education, or The Teagle Foundation. National Institute for Learning Outcomes Assessment  |  2    
  • 3. intellect curiosity challenge create achievement connection self-reflection educate action under stand communicate listen learn access qualit innovation success ingenuity intellect curiosity challenge knowledge accountability connection under stand communicate listen learn access qualit innovation success ingenuity self-reflection educate action under stand intellect knowledge accountability connection self-reflection educate actio under stand communicate curiosity challenge create achievement connection self-reflection curiosity challenge create achievement connection sel reflection knowledge accountability connection self-reflection educate action under stand communicate listen learn access quality innovation succes ingenuity intellect curiosity challenge educate innovation success ingenuity intellect curiosity challenge create achievement knowledge accountabilit connection self-reflection educate action under stand communicate curiosity challenge create achievement connection self-reflection under stan communicate listen learn access quality action educate action under stand communicate listen learn action under stand communicate listen learn acces Abstract Assessing Learning in Online Education The national learning outcomes assessment (LOA) movement and online learning in higher education emerged during roughly the same period. What has not yet developed is a sophisticated understanding of the power of online learning and its concomitant technologies to change how we view, design, and administer LOA programs. This paper considers how emerging techniques, such as data mining and learning analytics, allow the use of performance and behavioral data to improve student learning not just for future iterations of a program but in real time for current students. Also considered are powerful learning methodologies which predate online learning but have found renewed utility when coupled with new technologies for assessing and assisting student learners. In this paper, we postulate that technology will enable educators to design courses and programs that learn in the same way that individual students learn, and we offer some conditions that we believe are important to further this goal. We conclude with a consideration of how the faculty role will necessarily change as a result of these advances in our understanding of using technology to improve learning outcomes. National Institute for Learning Outcomes Assessment  |  3    
  • 4. intellect curiosity challenge create achievement connection self-reflection educate action under stand communicate listen learn access qualit innovation success ingenuity intellect curiosity challenge knowledge accountability connection under stand communicate listen learn access qualit innovation success ingenuity self-reflection educate action under stand intellect knowledge accountability connection self-reflection educate actio under stand communicate curiosity challenge create achievement connection self-reflection curiosity challenge create achievement connection sel reflection knowledge accountability connection self-reflection educate action under stand communicate listen learn access quality innovation succes ingenuity intellect curiosity challenge educate innovation success ingenuity intellect curiosity challenge create achievement knowledge accountabilit connection self-reflection educate action under stand communicate curiosity challenge create achievement connection self-reflection under stan communicate listen learn access quality action educate action under stand communicate listen learn action under stand communicate listen learn acces Foreword Although online education for college and university study has experienced remarkable growth over the past two decades, the links of online education to the assessment of student learning outcomes have been largely overlooked. While the ongoing “no significant difference” debate—which deploys assessment techniques to determine whether computer-mediated approaches are “as good as” face-to-face classroom delivery—may seem an exception, in my view, that debate has been fruitless. In this compelling new NILOA Occasional Paper, Matthew Prineas and Marie Cini argue persuasively not only that the connections between online education and learning outcomes assessment are deep but also that the mediated settings provided by online education have the potential to significantly improve assessment and its capacity to improve teaching and learning. Online education and assessment practice in higher education evolved contemporaneously, as Prineas and Cini observe. Assessment’s beginning is usually dated to the mid-1980s, when “distance education” consisted largely of paper-andpencil correspondence courses and what now seem relatively primitive one-way video communications. Now, of course, online education is multifaceted, sophisticated, and almost ubiquitous. Assessment, meanwhile, has developed from the ad hoc use of available standardized tests like the GRE and the ACT to the deployment of a powerful array of instruments, portfolios, and rubrics—many of which are featured on the NILOA web page. Yet while online education and assessment in higher education developed in parallel, these two “movements” intersected little during that time. Now they are poised to do so, Prineas and Cini emphasize, and both will benefit. The authors offer three broad observations to support this claim. The first centers on the ability of online learning management systems to harvest data not just on what students learn but also on students’ every learning activity. Historically, one of the major impediments in using assessment results to improve instruction has been the lack of data about student learning behaviors and the inability to connect analytically the behavioral evidence for further educational development. Instruments like the National Survey of Student Engagement (NSSE) were designed to gather such data in conventional learning environments. 
In online learning environments, however, these data are collected automatically in learning management systems that obtain an electronic record of everything the student does online. While making sense of all these data will certainly be a challenge, data mining techniques and what Prineas and Cini term “data analytics” are already guiding improvement. The authors’ second broad observation is about the growing dominance in the design of online education programs of the asynchronous mastery learning approach. Learning assessment is integral to the design of these programs because students advance from module to module only after demonstrating what they know and can do. In contrast to most assessment programs in traditional academic programs, this means that instead of assessing just a small sample of students, these programs assess every student against established learning objectives. What is more, these programs require no additional data collection, no “add-on” testing, nor the application of rubrics to rescore already graded student work, as is common in portfolio-based assessment approaches. National Institute for Learning Outcomes Assessment  |  4    
  • 5. intellect curiosity challenge create achievement connection self-reflection educate action under stand communicate listen learn access qualit innovation success ingenuity intellect curiosity challenge knowledge accountability connection under stand communicate listen learn access qualit innovation success ingenuity self-reflection educate action under stand intellect knowledge accountability connection self-reflection educate actio under stand communicate curiosity challenge create achievement connection self-reflection curiosity challenge create achievement connection sel reflection knowledge accountability connection self-reflection educate action under stand communicate listen learn access quality innovation succes ingenuity intellect curiosity challenge educate innovation success ingenuity intellect curiosity challenge create achievement knowledge accountabilit connection self-reflection educate action under stand communicate curiosity challenge create achievement connection self-reflection under stan communicate listen learn access quality action educate action under stand communicate listen learn action under stand communicate listen learn acces Foreword continued The authors’ third broad observation is about the emerging best practices for building effective learning environments online—practices that are analogous to the famous Seven Principles of Good Practice for Undergraduate Education, articulated by Arthur Chickering and Zelda Gamson some 25 years ago, but that are being applied in the new milieu of virtual learning. Like the original principles, these practices are intended to guide good pedagogy, but they also require an approach to curriculum design that is thoroughly intentional and that incorporates learning outcomes from the outset. Consistent with the mastery learning approach, this is a far different situation from that of conventional assessment—in which learning outcomes are frequently developed after the fact and applied to evaluate the effectiveness of courses never explicitly designed to fit together. Another common feature of online education that is favorable to the development of effective assessment, the authors note, is disaggregated faculty functions. While instructional design, content delivery, advising and mentoring, and the assessment of student work are all performed by the same individual in the conventional classroom, in online settings these functions are frequently undertaken separately. Additionally favorable to the development of effective assessment is the fact that, in online education, assessment is a distinct activity of faculty, providing them opportunities to undertake assessment activities more deeply, more thoroughly, and professionally than in their conventional grading activities. Because I have seen much of what Prineas and Cini describe through the lens of a particular institution—Western Governors University (WGU), which my organization helped design and where I continue to serve as a council member—I am delighted to introduce this paper. WGU is based entirely on the mastery learning approach and has exactly the kind of “unbundled” faculty functions the authors describe. While WGU is only just beginning to harness “learning analytics,” its remarkable growth and development owe much to the fact that its structure and operations are based almost entirely on the principles these authors articulate here. Peter T. 
Ewell Vice President, National Center for Higher Education Management Systems (NCHEMS) Senior Scholar, NILOA National Institute for Learning Outcomes Assessment  |  5    
  • 6. communicate listen learn access quality action educate action under stand communicate listen learn action under stand communicate listen learn acces quality innovation success ingenuity intellect curiosity challenge knowledge accountability connection access quality self-reflection curiosity challeng create achievement learn access quality innovation success ingenuity self-reflection educate action under stand intellect knowledge accountabili connection self-reflection educate action under stand knowledge accountability connection self-reflection educate action under stand communicat listen learn access quality innovation success ingenuity intellect curiosity challenge connection knowledge accountability connection self-reflectio educate action under stand communicate listen learn access quality innovation success ingenuity challenge create achievement connection sel reflection educate action under stand communicate listen learn achievement connection self-reflection educate action under stand communicate liste learn access quality innovation success ingenuity intellect access quality innovation success self-reflection curiosity challenge create achievemen Assessing Learning in Online Education: T h e R o l e o f Te c h n o l o g y i n I m p r o v i n g Student Outcomes Matthew Prineas and Marie Cini Online education, defined as a platform for delivering educational content and facilitating instructor-student interaction over a computer network (Shelton & Saltsman, 2005, pp. 3–4), came of age in the 1990s and grew rapidly over the next decade (Allen & Seaman, 2010; U.S. Department of Education, 2003; U.S. General Accounting Office, 2002). During roughly the same period, increasing calls for accountability in higher education led to the development of measures to establish the value of higher education in general, through the mechanism known as “learning outcomes assessment” (LOA). Besides historical proximity, these movements—online education and LOA—shared important features: both represented the introduction of disruptive concepts into the traditional face-to-face, faculty-centric classroom, and both raised questions about the efficacy of traditional models of teaching and learning measurement that had remained essentially unchanged for centuries. Yet, for the most part, the two movements progressed independently. Early LOA efforts were focused on the traditional or face-to-face classroom, with online courses assessed only to determine whether the learning outcomes matched the face-to-face “standard.” Even today, LOA efforts tend to use the same approach, asking the same questions about face-to-face classrooms as about online classrooms. What has not emerged is a more sophisticated understanding of the power of online learning to change how we view, design, and administer LOA programs. Technological developments have the potential to re-energize LOA efforts for both traditional and online education. Online education exists because technology made it possible. Technology is also making possible an increasing ability to track, assess, and respond to the behaviors and mastery levels of students in online courses with far greater depth and rapidity than ever before. Researchers are focusing on the large amounts of student data that can be gathered and archived in online courses and programs in order to “mine” the data for patterns that can assist educators to improve learning outcomes. 
Referred to as "learning analytics" in the higher education setting, these approaches enable faculty and course designers to make rapid changes in instructional practices and curriculum, and they empower students to make informed decisions about their learning behaviors and course choices. Emerging technologies are also reinvigorating powerful instructional methodologies, such as mastery learning, that in some cases predate the emergence of online education by decades (Scrima, 2009). These technological developments have the potential to re-energize LOA efforts for both traditional and online education. Although LOA efforts have grown substantially over the past decade, most institutions have, unfortunately, used assessment results simply to satisfy accreditation pressures rather than to improve student learning (Kuh & Ikenberry, 2009). Pressures from the public and from the federal government to be more accountable for the outcomes of higher education no longer allow us to pay only lip service to student learning. The promise of learning outcomes assessment is that through continuous improvement of curriculum and instruction, learning achievement for all students should increase. Online education and its concomitant technologies promise better ways to help all our students reach their full potential.
In this paper we describe the power of learning analytics to improve student learning not just for future iterations of a program but in real time for current students. We also explore several powerful instructional methodologies that have found renewed utility in the online environment. We postulate that technology will enable educators to design courses and programs that learn in the same way individual students learn, and we offer some conditions that we believe are important to further this goal. Finally, we present some concluding thoughts about the ways in which the faculty role will necessarily change as a result of these advances in our understanding of using technology to improve learning outcomes.

Learning Analytics: Designing Data-Driven Interventions to Support Student Learning

Colleges and universities typically offer online classes through course management system (CMS) software, which provides the virtual classroom space for faculty and students to interact over the course of a semester (Watson & Watson, 2007). These interactions are tracked and stored, making the CMS an important potential source of data related to student learning. Using current CMS software—which delivers curricular content in various formats including text, audio, and video—students and faculty can conduct synchronous or asynchronous discussions, faculty can administer quizzes and exams, students can submit papers and assignments, and faculty can provide grades and other forms of feedback. Within a single online course section, then, an individual faculty member has access to a wealth of quantitative and qualitative data about student engagement: for example, the number of words posted, the total number of posts, the average length of posts, and the text of every student comment or question from the first week's discussion to the last.

The sophistication and reach of CMS storage and tracking features continue to grow. Additional kinds of data gathered by CMS software may include how frequently students log in to their courses; the length of time they spend on particular tasks such as reading a content page or taking self-directed quizzes; and customized statistical analyses of individual or group responses to quizzes. All such transactions are archived, during and after the semester, for every online section of every course offered at an institution. Thus, institutions with online enrollments in the thousands or tens of thousands come to possess a vast repository of data with potential applications in the assessment of student learning.

An understanding is growing in higher education of how to use these data to improve student outcomes. Over the last several years, a movement has emerged that attempts to uncover patterns in data stored on the CMS and to design interventions accordingly. Because these applications are so new, the terminology of the field is still in flux; in recently published papers and conference presentations, it is common to see the terms "data mining," "data analytics," "academic analytics," and "learning analytics" used to describe a family of related techniques and technologies for finding patterns in large data sets (for example, see the annual conference program of the Western Cooperative for Educational Telecommunications [WCET] at http://wcetconference.wiche.edu/).
Such techniques have existed for years in the business world, typically under the rubric "data analytics" or "business intelligence," and have become more sophisticated than ever at profiling customer behavior (as anyone who receives recommendations from Google, Amazon, or Netflix knows). A recent and useful effort to define terms appropriate for the higher education setting distinguishes between academic analytics, in which data are examined based on a particular hypothesis or research question, and data mining, which describes a family of techniques, not necessarily driven by a hypothesis or question, for detecting patterns in large bodies of data. Both approaches have applications in higher education (Baepler & Murdoch, 2010). For the purposes of this paper, we refer to all of these techniques as "learning analytics."
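To make the raw material of learning analytics concrete, the following is a minimal sketch, in Python, of how the kinds of CMS transaction data described above (logins, posts, time on task) might be aggregated into per-student engagement features. The event log, its column names, and its values are illustrative assumptions rather than the schema of any real CMS.

```python
# Minimal sketch: turning a hypothetical CMS event log into per-student
# engagement features. Column names and values are assumptions.
import pandas as pd

# Each row is one logged event: a login, a discussion post, and so on.
events = pd.DataFrame({
    "student":         ["ana", "ana", "ben", "ben", "ben", "cho"],
    "event":           ["login", "post", "login", "post", "post", "login"],
    "word_count":      [0, 120, 0, 45, 80, 0],
    "minutes_on_task": [5, 22, 3, 10, 15, 1],
})

features = events.groupby("student").agg(
    logins=("event", lambda e: int((e == "login").sum())),
    posts=("event", lambda e: int((e == "post").sum())),
    avg_post_words=("word_count", lambda w: w[w > 0].mean()),
    total_minutes=("minutes_on_task", "sum"),
)
print(features)
# Instructors, or an automated early-alert rule, can now scan these
# features for students whose engagement is falling behind their peers.
```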
Learning analytics have recently received a great deal of attention for their role in helping educators focus efforts and resources when designing interventions for student success. For example, researchers have used data analytics to discern which types of students are most likely to be at risk of failure at various stages in their academic journey. Once institutions identify these student groups, they can design interventions such as supportive services, enhancements to curricula, or improved instructional practices. A simple form of intervention, based on patterns identifying at-risk students in the online classroom, is the use of automatic emails sent to students to flag key problems such as a student's a) not posting in a discussion thread by a certain date, b) not submitting an assignment, and/or c) not engaging frequently enough in online class discussion. This basic form of tracking can be done independently of the instructor; the message can be sent to a student automatically, noting the missed work or interaction along with the reason why it is important for the student to engage or re-engage with the course. In sophisticated course management systems, these messages can even share the knowledge that a certain percentage of students who do not engage in these targeted ways are "likely to fail," thus motivating students either to engage or to seek help. A pioneering example of such a system, designed and used by Purdue University, is Signals, which draws from 20 data points. The Signals algorithm synthesizes quiz and test data, but it can also incorporate time spent on task and other behavioral measures. These data are combined and presented back to the student, as well as to the faculty member, in the intuitive format of traffic signals: red, yellow, or green lights reflecting the level of the student's performance (Arnold, 2010).

Similar to the Signals dashboard, another approach to assisting student learning has been developed by researchers at the University of Maryland, Baltimore County (UMBC), who began with the observation that students earning a D or F used the university's CMS 39% less frequently than students earning a grade of C or higher. Reasoning that these students might make better choices about their own engagement in online discussions if they had access to real-time data about the effects of their classroom behaviors, the researchers designed a tool that allows students to follow in real time their levels of activity in the online classroom, as well as their grades, compared to an anonymous summary of their peers' performance (Fritz, 2011). All this information is already stored in most CMS software; the UMBC tool simply made it accessible and understandable for each student. A rule-based alert of the kind described above might look like the sketch that follows.
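This is a minimal sketch of a rule-based early alert with a traffic-light summary, in the spirit of, but not reproducing, Purdue's Signals algorithm. The thresholds, the flag counting, and the send_email stand-in are illustrative assumptions.

```python
# Minimal sketch of a rule-based early alert with a red/yellow/green
# summary. Thresholds and the notification hook are assumptions, not
# any CMS's actual API or Purdue's actual algorithm.
from dataclasses import dataclass

@dataclass
class StudentActivity:
    name: str
    posted_this_week: bool
    assignments_missing: int
    logins_last_7_days: int

def risk_light(s: StudentActivity) -> str:
    """Combine simple engagement flags into a traffic-light signal."""
    flags = 0
    if not s.posted_this_week:
        flags += 1
    if s.assignments_missing > 0:
        flags += 1
    if s.logins_last_7_days < 2:
        flags += 1
    return ["green", "yellow", "yellow", "red"][flags]

def send_email(student: str, light: str) -> None:
    # Stand-in for a real notification hook (an assumption).
    print(f"To {student}: your engagement signal is {light}; "
          f"students who re-engage now are far more likely to pass.")

for s in [StudentActivity("ana", True, 0, 5),
          StudentActivity("ben", False, 2, 1)]:
    light = risk_light(s)
    if light != "green":
        send_email(s.name, light)
```

A production system would draw these fields from the CMS rather than from hard-coded records and would calibrate its thresholds against historical pass/fail data, as Signals does with its 20 data points.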
Learning analytics are being used not just in the classroom but in a variety of other operational functions supporting online programs. For example, instructional designers can now use the CMS to improve courses through built-in feedback mechanisms that continually (not just at the end of a course) gather user feedback on the relative usefulness of course features. In the past, such improvements required the labor-intensive analysis of individual student evaluations after the semester. Increasingly, mechanisms for user feedback can be built into the online course, so that as students in multiple sections of a course use and respond to various course features, course designers receive large amounts of data in real time about which features students are using or not using, which features are problematic, and so on ("Using data," 2011).

Online Mastery Learning: Closing the Gap Between Assessment and Learning

Another set of emerging tools for assessing student outcomes may be grouped under the heading of online mastery learning, which combines the decades-old instructional methodology of mastery learning with the technology of online education. First developed in the 1960s and early 1970s, mastery learning is an instructional approach that radically closes the gap between assessing student achievement and intervening to assist and advance student learning (Scrima, 2009).
In mastery learning, students must demonstrate proficiency in a particular learning objective before they are allowed to progress to a subsequent stage. Assessment and feedback alternate in frequent cycles as the student progresses through each stage of the curriculum. Time to completion becomes fluid: students begin with differing sets of prior knowledge, progress at different rates, and master the course curriculum within different time frames. Clearly and precisely defined learning outcomes—absolutely crucial for this instructional methodology—must adequately define the criteria of mastery toward which students are working. Of equal importance are the precise definition and accurate alignment of the incremental learning objectives that delineate the intermediate stages on the pathway toward mastery. While research into mastery learning has supported its efficacy, it has also highlighted the substantial investment of time and faculty involvement required to implement an effective curriculum using this approach (Kulik, Kulik, & Bangert-Drowns, 1990).

Recent advances in online learning are breathing new life into this instructional methodology, however. In a fully online curriculum, a variety of assessment instruments—and the technical means for instant feedback—can be built into every instructional activity. Previously static aspects of an online course (for example, content areas containing background reading) can be designed as interactive, with embedded assessments testing and then assisting students as they advance through the curriculum, whether on their own, in collaboration with other students, or with the help of an instructor. Particularly important, the process of learning for each individual can be tracked, monitored, and assisted. Underlying all the elements of such a course are data collection "engines" capable of gathering and quickly processing a large amount of information on student performance. This marriage of mastery learning and technology allows instructors to monitor students' learning and to intervene when students need assistance.

Computer-mediated approaches to mastery learning are being developed by a smattering of researchers and for-profit companies. Current leaders in this effort include Carnegie Mellon's Open Learning Initiative (OLI) and the Khan Academy. With open learning courses in fields ranging from engineering to French, OLI is working to develop multiple feedback loops for rapidly collecting, disseminating, analyzing, and acting on student learning data:

• Feedback to students: Built into the mastery model of OLI courses are mechanisms that provide students with feedback even during the process of problem solving, allowing for self-paced, incremental progress toward learning objectives.

• Feedback to instructors: OLI researchers are developing ways to harness the large amounts of assessment data generated by the model and to present these data to instructors in a manner that empowers them to intervene in timely and effective ways (Bajzek et al., 2008).

• Feedback to course designers: The OLI model provides course designers with rapid and frequent feedback to improve online courses in real time. In addition, the OLI model looks not only at patterns of how students use particular course features but also compares those usage patterns with learning assessment results.
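Before turning to the Khan Academy, the mastery-gate logic at the heart of this section can be made concrete. The sketch below is a simplification under assumed unit names and an assumed 80% threshold: a student advances only after the embedded assessment for the current unit has been passed.

```python
# Minimal sketch of a mastery gate: a student advances only after scoring
# at or above a mastery threshold on the current unit's embedded
# assessment. Unit names and the 0.8 threshold are illustrative assumptions.
from typing import Optional

MASTERY_THRESHOLD = 0.8
UNITS = ["fractions", "decimals", "percentages"]  # fixed curricular sequence

def next_unit(scores: dict) -> Optional[str]:
    """Return the first unit not yet mastered, or None once all are mastered."""
    for unit in UNITS:
        if scores.get(unit, 0.0) < MASTERY_THRESHOLD:
            return unit
    return None

# Time to completion is fluid, but the mastery criterion is not: this
# student is routed back to decimals no matter how long the unit takes.
print(next_unit({"fractions": 0.9, "decimals": 0.6}))  # -> decimals
```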
Much in the news of late, the Khan Academy (http://www.khanacademy.org/) is known for its collection of short instructional videos on topics ranging from arithmetic to finance, narrated and posted on YouTube by Salman Khan, a former hedge fund manager who initially aimed to tutor his cousins in math. The Khan Academy approach, which now augments the videos with technology mirroring online mastery learning, is being used in K–12 settings to complement traditional classroom instruction (Rasicot, 2011).
Like OLI, the Khan Academy approach reverses the usual relationship of homework and lecture or classroom discussion. At home, students learn content material at their own pace by working through incremental, interactive lessons and tutorials designed on the mastery model of progression. Data are collected that instructors can use to help students learn concepts they may be struggling with in the computer-mediated lessons. The classroom then becomes a place of active work, with students using class time to complete projects and assignments, either individually or in groups, with an instructor present to monitor, coach, and assist their efforts.

Underlying the Carnegie Mellon OLI and Khan Academy models of mastery learning is an assessment methodology that also predates online learning but that in recent years has been energized by the growing capacity to track, store, and analyze student learning data. Adaptive achievement testing, according to Wainer (1990), bases test questions on students' level of prior knowledge as shown in their responses to previous questions. Adaptive testing is more efficient in that fewer questions are needed to assess a student's level of knowledge than with traditional linear testing, which requires a student to answer all low-level questions before moving to more difficult material. With the use of adaptive testing, instructors can quickly determine students' skill levels and provide them with just-in-time learning, making optimal use of their prior knowledge. This assessment method is also made simpler through course management systems that connect new learning materials to the questions the students answer incorrectly.
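To ground this, here is a minimal sketch of adaptive item selection: each response moves the running difficulty estimate up or down, so the test converges on the student's level with fewer questions than a fixed linear sequence would require. The five-item bank and the one-step up/down rule are illustrative assumptions, far simpler than the item-response-theory models used in real computerized adaptive tests.

```python
# Minimal sketch of adaptive item selection. The item bank and the
# step-up/step-down rule are illustrative assumptions, not a
# psychometric model.
ITEM_BANK = {1: "add fractions", 2: "solve linear equation",
             3: "factor quadratic", 4: "differentiate polynomial",
             5: "evaluate integral"}

def run_adaptive_test(answers_correctly, n_items=4, start=3):
    level = start
    for _ in range(n_items):
        question = ITEM_BANK[level]
        correct = answers_correctly(question)
        # Step up after a correct answer, down after an incorrect one,
        # staying within the bounds of the item bank.
        level = min(5, level + 1) if correct else max(1, level - 1)
    return level  # the estimated skill level after n_items questions

# A student who can handle everything up to quadratics but no calculus:
estimate = run_adaptive_test(lambda q: q not in
                             ("differentiate polynomial", "evaluate integral"))
print(estimate)  # oscillates around the student's true boundary (level 3)
```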
Courses and Programs That Learn

Just as individual learners need feedback that is timely, targeted, and likely to be acted upon, courses and programs also need feedback loops that efficiently and quickly direct the results of assessment toward improving student learning. The emerging tools for online assessment examined previously are primarily being used to assist individual student learning in vivo, for current students at the time of greatest need. As a national movement, however, learning outcomes assessment is concerned mostly with adjustments to curricula and instruction at the program level, generally applying these adjustments to help future students achieve at greater levels. To be truly revolutionary, student learning data generated in online technologies must be applied in a systematic way at the program level and in real time, so that current students can benefit from ongoing adjustments—changes in curricula, course sequences, academic requirements, resource allocation, and so on. Online technologies can provide this real-time, program-level feedback loop in ways that traditional classrooms simply cannot.

Two preconditions seem necessary for effectively scaling emerging technologies for assessment in the online classroom to the needs of program-level, real-time LOA efforts. First, as the example of online mastery learning suggests, educational technology will be most effective when coupled with a conceptual learning model—for example, a set of best practices about the kinds of interventions that have the greatest impact on student performance in the classroom. Without a learning model to guide course-level uses of online assessment technology, ad hoc applications by individual instructors will be difficult to scale up to instructional changes across multiple sections of the same course or across a program. Second, to ensure that assessment data are actionable at the course and program level, there must be a tight integration of learning outcomes throughout the curriculum, including assignments, discussion, course material, and instructional practices. Both of these preconditions—a guiding learning model and tight curricular integration—are, of course, essential for LOA in any program, online or face-to-face. With the ever-increasing volume of assessment data available from emerging technologies, however, such structural underpinnings are becoming even more important. Without a model to guide this work and to serve as a framework for collecting and interpreting this great amount of data, we will find ourselves simply unable to use it effectively.
Fortunately, the foundations already exist for a learning model appropriate for the online environment. The learner-centered tenets articulated in the influential Seven Principles for Good Practice in Undergraduate Education by Chickering and Gamson (1987), based on a broad review of the learning literature, are particularly applicable to the task of identifying interventions that affect student performance: faculty engagement, student collaboration, active learning, frequent and prompt feedback, time on task, high expectations, and respect for diversity (see also Chickering & Ehrmann, 1996). Although initially developed for the face-to-face environment, the Chickering/Gamson model has been widely used in the online environment. Palloff and Pratt (2009), in Assessing the Online Learner: Resources and Strategies for Faculty, provide numerous examples and models for designing effective assessments in the online modality, as well as effective interventions based on outcomes data. These educators offer institutions both a theoretical framework based on the learner-centered approach and a practical guide to online assessment practices leading to improvements in instruction, courses, and programs.

Similar approaches—emphasizing interactivity, instructor feedback, critically engaging assignments, and total time on task—are reflected as well in a growing number of published best practices for online education. Among the first, if not the first, of such sets of published guidelines was the 1995 document produced by the Western Cooperative for Educational Telecommunications (WCET) in cooperation with the eight regional accrediting bodies, Principles of Good Practice for Electronically Offered Degree and Certificate Programs. A more recent best-practices document produced by WCET (2009), Best Practice Strategies to Promote Academic Integrity in Online Education, Version 2.0, emphasizes assessment practices in the online classroom. The practices in these documents are supported as well by reviews and meta-analyses of comparative studies on online and face-to-face education published over the past several years (see Bernard et al., 2004; Tallent-Runnels et al., 2006; U.S. Department of Education, 2010; Zhao, Lei, Yan, Lai, & Tan, 2005).

The second precondition for effectively scaling up course-level assessment data to changes at the program level is the integration of learning outcomes throughout all aspects of the curriculum, including assignments, discussion, course material, and other instructional features, through a "backward design" process beginning with robust program outcomes. Many universities, however, using a more traditional approach to curriculum design, have responded to the growing market for online education by quickly cobbling together degree programs that combine existing face-to-face elements with new courses developed in the online format (Shelton & Saltsman, 2005). Program-level LOA in this context becomes a Rube Goldberg operation of retrofitting curricula with objectives derived after the fact from existing courses developed by individual faculty—with inevitable gaps and contradictions.
Learning objectives in individual courses may be presented out of sequence. Some program-level outcomes may not be supported by sufficient learning experiences for students. In addition, course-level assignments and assessments originally designed for the face-to-face environment may not be ideal for an online classroom, making it more difficult to identify which program-level adjustments should be made in response to assessment data. A more efficient and effective scenario would be a curriculum in which outcomes and assessments are built in from the beginning. Competencies could then be sequentially developed and reinforced within courses and across the student experience, with assessments and feedback deeply embedded and intertwined in each stage of learning. A simple coverage check of the kind sketched below can expose the gaps that retrofitted curricula tend to hide.
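As a sketch of how "built in from the beginning" might be verified mechanically, the following checks whether every program outcome is evidenced by at least one course-level assessment. The outcome names, course identifiers, and the assessment-to-outcome mapping are all illustrative assumptions.

```python
# Minimal sketch of a backward-design coverage check: flag any program
# outcome with no supporting course-level assessment. Outcomes, courses,
# and the mapping are illustrative assumptions.
PROGRAM_OUTCOMES = {"written communication", "quantitative reasoning",
                    "ethical judgment"}

assessment_map = {
    "ENGL101 essay":      {"written communication"},
    "MATH106 project":    {"quantitative reasoning"},
    "MGMT310 case study": {"written communication"},
}

covered = set().union(*assessment_map.values())
gaps = PROGRAM_OUTCOMES - covered
print("Unsupported program outcomes:", sorted(gaps) or "none")
# -> Unsupported program outcomes: ['ethical judgment']
```

Run against a full curriculum map, the same idea surfaces both unsupported outcomes and assessments that evidence no program outcome at all.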
An example of this approach to curriculum redesign was recently undertaken by the Undergraduate School at the University of Maryland University College in its Supporting Educational Goals for Undergraduate Excellence (SEGUE) project. The SEGUE process transformed the undergraduate curriculum, making it more relevant, sequenced, and focused on learning outcomes (Stiehl & Lewchuk, 2008). Applying this process to connect individual courses to programs of study, the undergraduate faculty created seamless learning pathways in 33 programs of study by answering this question: What do we expect students to be able to do "out there" as a result of the learning experiences we offer them "in here" in our curriculum in the Undergraduate School? Program outcomes were focused not only on the workplace but also on the community, the family, and global society. As a result of SEGUE, faculty have a better understanding of the programs in which they teach, the intended outcomes of those programs, and what students in their particular courses must master to reach the intended learning outcomes. Also as a result of SEGUE, students are better able to see the connections between their courses and the contributions of each course to their overall education, their careers, and their roles in society.

Changing Roles for Faculty

If our assertions here are correct, the distinction between online courses and face-to-face courses will continue to blur. We foresee a day when no course will be entirely face-to-face without at least some online practice and assessments. At many institutions, face-to-face courses already use the same CMS software to complement traditional class sessions; at others, such as the University of Maryland University College, the process has gone further, with all previously face-to-face courses now taught only in a "hybrid" or "blended" format (a hybrid or blended course is one in which online course activities replace a portion of the course's official "seat time"). As this blurring continues, the key group with a significant impact on institutional LOA efforts will be faculty. Without faculty support and engagement, these changes simply cannot occur.

To be effectively deployed on a large scale, these promising approaches will require a profound transformation in how faculty members interact with students as well as in how faculty create and improve curricula and programs. Perhaps the greatest conceptual shift for faculty to absorb will be the new reality that the effective teaching of each student "takes a village" and that the individual instructor, in isolation, can no longer achieve it. The "classroom walls" of the online course will become increasingly porous, with library staff, course designers, and various other support staff creating, maintaining, and continually improving the learning experience for students (Neely & Tucker, 2010). The new technologies and approaches to online assessment only accelerate a process that has marked online learning programs from the beginning, particularly those with large enrollments and with students and faculty dispersed across the nation. While the demands of scale and cost dictate the sharing of responsibility for course content and instruction in larger online programs, creating and upgrading the learning environment already rests with a network of staff and systems. What has changed with the emergence of new technologies and approaches such as learning analytics is that we are now able to make informed interventions on students' behalf during the semester, not just before the next one.

Conclusion

Redesign efforts to create "programs that learn" will push us still further away from the comfortable and thus-far dominant model of the instructor who has mastered a discipline and who imparts his or her knowledge through lecture or class discussion.
In this new world of higher education, rather than delivering prepared lectures to all students regardless of their foundational knowledge, the instructor's role will be to monitor students' progress and to intervene appropriately when students are unable to advance to the next stage in the curriculum. Instead of the instructor transferring knowledge to students, the instructor (along with learning experts) will develop learning environments with content designed for students in ways that align with best practices. As students work through material delivered online, the role of the instructor will not be to teach all topics to all students but, rather, to monitor which students are having trouble mastering which concepts, so that specific help can be provided to those students at the right time. Faculty roles in pre-semester preparation and post-semester adjustment are changing as well. In outcomes-based program design, course materials and assignments will be delivered based on learning research, not simply on how an instructor wishes to deliver the content based on his or her own past experience.
Course curricula, assignments, and infrastructure will be designed ahead of time, with a tight integration of learning outcomes in all aspects of the course. The difficulty and rigor of assignments will also be set and refined by collective effort and a shared model of how students learn, rather than by each individual instructor's idiosyncratic practice. Emerging technologies and approaches in online education enable all instructors to practice what the very best teachers have always known: that students are enabled or limited by their prior knowledge (Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010) and that the fine art of teaching involves discovering the multiple pathways that will move students from where they are to the desired learning outcomes. If a student comes to the course already knowing a great deal, the online instructor will be able to help that student accelerate the time to completion in a course. If another student needs more help and time to acquire needed prior knowledge and skills, the instructor can augment the student's learning to fill in some of the basic knowledge required before moving forward. While these features could, of course, be integrated into an online course designed by an individual instructor, the truth is that most faculty simply do not have the time, resources, or expertise to recreate what can be integrated far more simply into course management systems for online education.

There will be some who find the seeming loss of faculty autonomy in this new world of higher education profoundly disturbing. Such voices will likely be widespread and, particularly at traditional institutions without extensive experience with online courses and programs, may present a barrier to adopting the new technologies and approaches examined in this paper. Other faculty members, however, will welcome a work environment in which their time is not consumed by preparing learning materials and grading student work. They will embrace the opportunity to meet students where they are intellectually and to provide individualized assistance benefitting the at-risk student as well as the more advanced student held back by a one-size-fits-all approach.
References

Allen, I. E., & Seaman, J. (2010, November). Class differences: Online education in the United States, 2010. Wellesley, MA: Babson Survey Research Group and The Sloan Consortium. Retrieved from http://sloanconsortium.org/publications/survey/class_differences

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Arnold, K. E. (2010, November 1). Signals: Applying academic analytics. Educause Quarterly, 33(1). Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/SignalsApplyingAcademicAnalyti/199385

Baepler, P., & Murdoch, C. J. (2010, July). Academic analytics and data mining in higher education. International Journal for the Scholarship of Teaching and Learning, 4(2), 1–9.

Bajzek, D., Brooks, J., Jerome, W., Lovett, M., Rinderle, J., Rule, G., & Thille, C. (2008). Assessment and instruction: Two sides of the same coin. In G. Richards (Ed.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2008 (pp. 560–565). Chesapeake, VA: AACE. Retrieved from http://oli.web.cmu.edu/openlearning/files/theinitiative/publications/eLearn08-TwoSides.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., . . . Huang, B. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379–439. doi:10.3102/00346543074003379

Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as a lever. American Association of Higher Education Bulletin, 49(2), 3–6. Retrieved from http://www.aahea.org/bulletins/articles/sevenprinciples.htm

Chickering, A. W., & Gamson, Z. (1987). Seven principles for good practice in undergraduate education. American Association of Higher Education Bulletin, 39(7), 3–7. Retrieved from http://www.aahea.org/bulletins/articles/sevenprinciples1987.htm

Fritz, J. (2011, March). Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers. The Internet and Higher Education, 14(2), 89–97.

Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment. Retrieved from http://www.learningoutcomeassessment.org/documents/fullreportrevised-L.pdf

Kulik, C.-L. C., Kulik, J. A., & Bangert-Drowns, R. L. (1990, Summer). Effectiveness of mastery learning programs: A meta-analysis. Review of Educational Research, 60(2), 265–299. doi:10.3102/00346543060002265

Neely, P. W., & Tucker, J. P. (2010). Unbundling faculty roles in online distance education programs. Contemporary Issues in Education Research, 3(6), 17–24. Retrieved from http://journals.cluteonline.com/index.php/CIER/article/viewArticle/211

Palloff, R. M., & Pratt, K. (2009, December). Assessing the online learner: Resources and strategies for faculty. San Francisco, CA: Jossey-Bass.

Rasicot, J. (2011, August 4). Education review: Web site offering free math lessons catches on 'like wildfire.' The Washington Post. Retrieved from http://www.washingtonpost.com/lifestyle/magazine/web-site-offering-free-online-math-lessons-catches-on-like-wildfire/2011/07/15/gIQAtL5KuI_story.html
Scrima, A. E. (2009). Implementing a mastery model through self-quizzing in an online learning environment (Doctoral dissertation). Retrieved from ProQuest Dissertations & Theses. (AAT No. 3364685)

Shelton, K., & Saltsman, G. (2005). An administrator's guide to online education. Charlotte, NC: IAP-Information Age.

Stiehl, R., & Lewchuk, L. (2005). The assessment primer: Creating a flow of learning evidence. Corvallis, OR: The Learning Organization.

Stiehl, R., & Lewchuk, L. (2008). The outcomes primer: Reconstructing the college curriculum (3rd ed.). Corvallis, OR: The Learning Organization.

Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., & Liu, X. (2006, Spring). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93–135. doi:10.3102/00346543076001093
U.S. Department of Education, National Center for Education Statistics. (2003). Distance education at degree-granting postsecondary education institutions, 2000–2001 (NCES 2003-017). Washington, DC: Author. Retrieved from http://nces.ed.gov/pubs2003/2003017.pdf

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010, September). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies (ID: CSD5650). Washington, DC: Author. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

U.S. General Accounting Office. (2002). Distance education: Growth in distance education programs and implications for federal education policy (Statement of Cornelia M. Ashby, Director, Education, Workforce, and Income Security Issues. Testimony before the Committee on Health, Education, Labor, and Pensions, U.S. Senate [GAO-02-1125]). Washington, DC: Author. Retrieved from http://www.gao.gov/new.items/d021125t.pdf

Using data to improve online courses. (2011, May 15). Distance Education Report. Retrieved from http://www.magnapubs.com/newsletter/story/5718/

Wainer, H. (1990). Computerized adaptive testing: A primer. Hillsdale, NJ: Lawrence Erlbaum.

Watson, W. R., & Watson, S. L. (2007). An argument for clarity: What are learning management systems, what are they not, and what should they become? TechTrends, 51(2), 28–34. Retrieved from http://www.springerlink.com/content/r1ur45g5t3225553/

Western Cooperative for Educational Telecommunications (WCET). (1995). Principles of good practice for electronically offered degree and certificate programs. Retrieved from http://wcet.wiche.edu/wcet/docs/publications/PrinciplesofGoodPractice1995.pdf

Western Cooperative for Educational Telecommunications (WCET), UT TeleCampus, & Instructional Technology Council. (2009). Best practice strategies to promote academic integrity in online education, Version 2.0. Retrieved from http://wcet.wiche.edu/wcet/docs/cigs/studentauthentication/BestPractices.pdf

Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884. Retrieved from http://www.tcrecord.org/Content.asp?ContentId=12098
NILOA National Advisory Panel

Joe Alutto, Provost, The Ohio State University
Trudy W. Banta, Professor, Indiana University-Purdue University Indianapolis
Douglas C. Bennett, President, Earlham College
Wallace Boston, President and CEO, American Public University System
Charlene Nunley, Program Director, Doctoral Program in Community College Policy and Administration, University of Maryland University College
Randy Swing, Executive Director, Association for Institutional Research
Carol Geary Schneider, President, Association of American Colleges and Universities
Molly Corbett Broad, President, American Council on Education
Michael Tanner, Chief Academic Officer/Vice President, Association of Public and Land-grant Universities
Judith Eaton, President, Council for Higher Education Accreditation
Belle Wheelan, President, Southern Association of Colleges and Schools
Richard Ekman, President, Council of Independent Colleges
Mildred Garcia, President, California State University Dominguez Hills
Ralph Wolff, President, Western Association of Schools and Colleges

Ex-Officio Members
Susan Johnston, Executive Vice President, Association of Governing Boards
Peter Ewell, Vice President, National Center for Higher Education Management Systems
Steve Jordan, President, Metropolitan State University - Denver
Stanley Ikenberry, President Emeritus and Regent Professor, University of Illinois
Paul Lingenfelter, President, State Higher Education Executive Officers
George Kuh, Director, National Institute for Learning Outcomes Assessment; Adjunct Professor, University of Illinois; Indiana University Chancellor's Professor Emeritus
George Mehaffy, Vice President, Academic Leadership and Change, American Association of State Colleges and Universities

NILOA Mission
NILOA's primary objective is to discover and disseminate ways that academic programs and institutions can productively use assessment data internally to inform and strengthen undergraduate education, and externally to communicate with policy makers, families, and other stakeholders.

NILOA Occasional Paper Series
NILOA Occasional Papers are commissioned to examine contemporary issues that will inform the academic community of the current state of the art of assessing learning outcomes in American higher education. The authors are asked to write for a general audience in order to provide comprehensive, accurate information about how institutions and other organizations can become more proficient at assessing and reporting student learning outcomes for the purposes of improving student learning and responsibly fulfilling expectations for transparency and accountability to policy makers and other external audiences. Comments and questions about this paper should be sent to sprovez2@illinois.edu.
About NILOA

• The National Institute for Learning Outcomes Assessment (NILOA) was established in December 2008.
• NILOA is co-located at the University of Illinois and Indiana University.
• The NILOA web site went live on February 11, 2009: www.learningoutcomesassessment.org
• The NILOA research team has scanned institutional websites, surveyed chief academic officers, and commissioned a series of occasional papers.
• One of the co-principal NILOA investigators, George Kuh, founded the National Survey of Student Engagement (NSSE).
• The other co-principal investigator for NILOA, Stanley Ikenberry, was president of the University of Illinois from 1979 to 1995 and of the American Council on Education from 1996 to 2001. He is currently serving as Interim President of the University of Illinois.
• Peter Ewell joined NILOA as a senior scholar in November 2009.

NILOA Staff
Stanley Ikenberry, Co-Principal Investigator
George Kuh, Co-Principal Investigator and Director
Peter Ewell, Senior Scholar
Staci Provezis, Project Manager and Research Analyst
Jillian Kinzie, Associate Research Scientist
Natasha Jankowski, Associate Project Manager and Research Analyst
Gloria Shenoy, Research Analyst
Gianina Baker, Research Analyst

NILOA Sponsors
Carnegie Corporation of New York
Lumina Foundation for Education
The Teagle Foundation

Permission is hereby granted to reproduce and distribute copies of this work for nonprofit educational purposes, provided that copies are distributed at or below cost, and that the author, source, and copyright notice are included on each copy. © 2011 NILOA

Produced by Creative Services | Public Affairs at the University of Illinois for NILOA.
For more information, please contact:

National Institute for Learning Outcomes Assessment (NILOA)
University of Illinois at Urbana-Champaign
340 Education Building
Champaign, IL 61820
learningoutcomesassessment.org
njankow2@illinois.edu
Fax: 217.244.3378
Phone: 217.244.2155