(1) Jisc provides shared services and support for UK universities and colleges. Its learning analytics project aims to improve student retention, achievement, employability and learning through data.
(2) Jisc developed an open architecture storing learning data in the cloud with APIs and tools. Early adopter institutions are piloting dashboards and predictive models, but significant process changes take time.
(3) For success, dedicated project roles are needed, including a senior manager, a project manager, data contacts and academic representatives. Tools must match existing workflows and offer descriptive analytics before predictive uses. Standards allow multiple vendors to work from shared data.
2. About Jisc
● Non-profit education technology provider for the UK's 600+ universities, colleges and other post-16 education providers
● We provide shared services, purchasing support, and help and advice
About Me
● Director of Technology and Analytics at Jisc
● Leading development of new services in analytics and data
3. Aims of learning analytics for our members
– improve retention
– improve achievement
– improve employability
– improve learning design
4. Jisc’s Learning Analytics project
Three core strands:
● Learning analytics architecture and service
● Toolkit
● Community
5. Our open architecture:
» A cloud-based, multi-tenanted, standards-based store for learning analytics data
» Plug-ins, connectors and tools to allow institutions to submit data to the store
» APIs to allow vendors to extract data from the store
» A predictive modelling service
» A staff dashboard and student app
11. Community and tools include:
● Networking events held four times a year, with around 100 attendees per event
● A code of practice covering legal and ethical issues
● A mailing list with around 600 members and a blog: https://analytics.jiscinvolve.org
● A procurement framework agreement
● Documentation and guides
12. Our Service in Numbers
● 30+ institutions
● 500,000,000 xAPI learning activity statements
● Institutions at early stages of full rollout
13. Lessons:
» Lesson 1: The team needs a number of core roles in order to succeed
» Lesson 2: Do not expect process change to occur quickly.
» Lesson 3: The tools should be developed with users and match their terminology and
processes.
» Lesson 4: Applying standards to data really does work
» Lesson 5: Do not underestimate legal and contractual complexity
» Lesson 6: Users want to understand predictive models (and that is hard)
» Lesson 7: Consider the innovation chasm
14. 1: The team needs a number of core roles in order to succeed
The teams that made the best progress had the following characteristics:
● A single senior manager with clear responsibility for the project. It does not seem to matter which part of the organisation this role comes from.
● A dedicated project manager.
● A named contact for each department/service responsible for delivering data.
● A number of named academic staff representatives.
15. 2: Do not expect process change to occur quickly.
Two years into the project, institutions are only now reaching the rollout stage. They are currently using learning analytics to support existing processes (see lesson 3!).
Why? Learning analytics is as much a process-change project as a technology or data project.
16. 3: The tools should be developed with users and match their terminology, needs and processes
● See lesson 2 - we expected institutions to adapt to the new things they could do (e.g. act on early alerts)...
● ...so our original tools did not obviously fit into users' existing workflows and processes.
● We overestimated what tutors already had:
o They wanted lots of fairly basic supporting descriptive analytics.
● The language used, particularly around risk, did not match users' views:
o They were interested in success, not risk.
17. [Figure: sector analytics maturity model]
● Organisational stages: Awareness → Experimentation → Organisation support → Organisational transformation → Sector transformation
● Analytics maturity: Data → Descriptive analytics (what happened? How do I compare?) → Diagnostic analytics (why did it happen?) → Predictive analytics (what will happen?) → Prescriptive analytics (what should I do?) → Automated (it's done)
● Data maturity: Data → Ordered data → Standardised data
● Supporting tools: Data connectors → Data warehouses and data stores → Dashboards and benchmarking → Data exploration tools and processes → Predictive models and intervention management → Recommendation engines → Adaptive learning
● Analytics with a national approach
18. To address this, a new tool was developed fitting around existing roles and processes. The initial roles are:
● Personal tutors: people who meet with students several times a year and review progress.
● Module/course leaders: people who want to understand how the teaching of their module/course is going.
19. 4: Applying standards to data really does work
Our approach:
» Activity data: xAPI
» Student data: based on the UK standard from the Higher Education Statistics Agency (HESA)
20. 4: Applying standards to data really does work
● Multiple vendor solutions run on the same data set.
● Dashboards etc. are system-agnostic.
{
"actor": {
"name": "Michael Webb",
"mbox": "mailto:michael.webb@jisc.ac.uk"
},
"verb": {
"id": "http://adlnet.gov/expapi/verbs/experienced",
"display": { "en-US": "experienced" }
},
"object": {
"id": "http://lak.com/activities/lak18",
"definition": {
"name": { "en-US": "LAK 18" }
}
}
}
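Statements like the one above could be produced programmatically by a data connector. As a rough sketch (Python; the helper function is invented for illustration and is not part of the Jisc tooling), a plug-in might build a minimal actor/verb/object statement like this before submitting it to the store's xAPI endpoint:

```python
import json

def make_statement(actor_name, actor_email, verb_id, verb_name,
                   activity_id, activity_name):
    """Build a minimal xAPI statement dict (actor, verb, object only)."""
    return {
        "actor": {
            "name": actor_name,
            # xAPI identifies agents by an IFI such as an mbox mailto IRI
            "mbox": f"mailto:{actor_email}",
        },
        "verb": {
            "id": verb_id,
            "display": {"en-US": verb_name},
        },
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = make_statement(
    "Michael Webb", "michael.webb@jisc.ac.uk",
    "http://adlnet.gov/expapi/verbs/experienced", "experienced",
    "http://lak.com/activities/lak18", "LAK 18",
)
print(json.dumps(stmt, indent=2))
```

Because every connector emits the same statement shape, any vendor dashboard that speaks xAPI can consume the result - which is the point of lesson 4.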
22. 5: Do not underestimate legal and contractual complexity
» Time to sign contracts ranged from 8 days to 183 days, with a mean of 43 days for the first 18 institutions.
23. 6: Users want to understand predictive models (and that is hard)
» Jisc's Learning Analytics Code of Practice (Sclater and Bailey, 2015):
» “All algorithms and metrics used for predictive analytics or interventions should be understood, validated, reviewed and improved by appropriately qualified staff”
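One way to work towards that requirement is to favour models whose output can be decomposed into per-feature contributions that a tutor can inspect. A minimal sketch (pure Python; the features and weights are invented for illustration and are not Jisc's actual predictive model) of a logistic model that reports why it gave its answer:

```python
import math

# Invented example weights - for illustration only, not Jisc's model.
WEIGHTS = {
    "vle_logins_per_week": 0.8,
    "attendance_rate": 1.5,
    "assignments_submitted": 1.2,
}
BIAS = -2.0

def predict_success(features):
    """Return P(success) plus each feature's additive contribution to the
    score, so staff can see which factors drove the prediction."""
    contributions = {name: WEIGHTS[name] * value
                     for name, value in features.items()}
    score = BIAS + sum(contributions.values())
    probability = 1 / (1 + math.exp(-score))  # logistic function
    return probability, contributions

prob, why = predict_success({
    "vle_logins_per_week": 3.0,
    "attendance_rate": 0.9,
    "assignments_submitted": 1.0,
})
print(f"P(success) = {prob:.2f}")
for name, contrib in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {contrib:+.2f}")
```

Note the framing as probability of *success* rather than risk, matching the language finding in lesson 3. More opaque models (ensembles, neural networks) would need separate explanation tooling to satisfy the code of practice.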
26. » By definition, the institutions taking part in the first phase of the pilots were early adopters, and behaved in a way consistent with Moore's description:
» “They want to start out with a pilot project, which makes sense because they are ‘going where no man has gone before’ and you are going with them.”