Slide 1
Dr. Hendrik Drachsler, UHD, Pre-Con. #OWD2015 09.11.2015
Learning
Analytics
Picture by Tom Raftery: http://www.flickr.com/photos/traftery/4773457853
Slide 2
WhoAmI
• Hendrik Drachsler, Associate Professor, Learning Technologies
• Research topics: Personalization, Recommender Systems, Learning Analytics, Mobile devices
• Application domains: Schools, HEI, Medical education
2006 - 2009
@HDrachsler
Slide 5
More ICT = More DATA
These data are valuable and informative.
Slide 6
The Big Data Economy
Example: Google Flu Trends technology … can now also be applied to education.
Learning Analytics = Data Science for education
Slide 9
DATA from Learning Profiles
DATA from LMS and MOOCs
DATA from Learning Resources (Apps, Games, ..)
DATA from assessments
What is educational data?
For the first time, we can follow the learner through the learning process.
- On Demand Learning Measures -
Slide 10
Reinhardt, W., Meier, C., Drachsler, H., & Sloep, P. B. (2011). Analyzing 5 years of EC-TEL proceedings. In C. D. Kloos, D. Gillet, R. M. Crespo García, F. Wild, & M. Wolpers (Eds.), Towards Ubiquitous Learning: 6th European Conference on Technology Enhanced Learning, EC-TEL 2011 (pp. 531-536). September 20-23, 2011, Palermo, Italy. LNCS 6964. Heidelberg/Berlin: Springer.
New insights
Slide 11
New insights
Dawson, S., Bakharia, A., & Heathcote, E. (2010, May). SNAPP: Realising the
affordances of real-time SNA within networked learning environments. In
Proceedings of the 7th International Conference on Networked Learning (pp. 125-133).
Denmark, Aalborg.
Slide 12
New insights
[Graph: Learning Activities vs. Studytime in days]
Graph by Rob Koper, "Data science voor de realisatie van online activerend onderwijs", presentation given at Dag van het Onderwijs (5 November 2015), Heerlen, The Netherlands.
Slide 13
Slide 16
Greller, W. & Drachsler, H. (2012). Turning Learning into Numbers. Toward a Generic
Framework for Learning Analytics. Journal of Educational Technology & Society.
http://ifets.info/journals/15_3/4.pdf
Visualization
Slide 17
Stakeholders: data subjects and data clients
Greller, W. & Drachsler, H. (2012). Turning Learning into Numbers. Toward a Generic
Framework for Learning Analytics. Journal of Educational Technology & Society.
http://ifets.info/journals/15_3/4.pdf
Slide 18
Stakeholders
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning
analytics dashboard applications. American Behavioral Scientist.
Slide 20
Educational Data
Drachsler, H., et al. (2010). Issues and Considerations regarding Sharable Data Sets for Recommender Systems in Technology Enhanced Learning. 1st Workshop on Recommender Systems in Technology Enhanced Learning (RecSysTEL @ EC-TEL 2010), September 28, 2010, Barcelona, Spain.
Verbert, K., Manouselis, N., Drachsler, H., & Duval, E. (2012). Dataset-driven Research to Support Learning and Knowledge Analytics. Journal of Educational Technology & Society. www.ifets.info/journals/15_3/10.pdf
Slide 21
Edu. Data Storage
Berg, A., Scheffel, M., Ternier, S., Drachsler, H., and Specht, M. (submitted). Dutch
cooking with xAPI recipes and the flavour of various Learning Record Stores.
Learning Analytics and Knowledge conference 2016, Edinburgh, UK.
• Various heterogeneous data sources
• No metadata standards
• No proper description of data fields
• No unique user ID in the different systems
• Not intended for evaluation and educational interventions
• No comparison of effective methods
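The interoperability problems above are what the xAPI (Experience API) specification, mentioned in the Berg et al. citation, is meant to address: every learning event is stored in a Learning Record Store as an actor-verb-object statement with globally unique identifiers. A minimal sketch in Python; the learner address, course, and activity IDs are invented for illustration, only the verb URI is a standard ADL identifier:

```python
import json

# A minimal xAPI statement: an actor-verb-object triple with
# URI-based identifiers, so records from different systems can be
# joined on the same learner and the same activity.
statement = {
    "actor": {
        "objectType": "Agent",
        "mbox": "mailto:student@example.org",  # hypothetical unique learner ID
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # standard ADL verb
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/courses/la101/quiz-3",   # hypothetical activity
        "definition": {"name": {"en-US": "Quiz 3"}},
    },
}

print(json.dumps(statement, indent=2))
```

Because every field is self-describing, a Learning Record Store can accept such statements from an LMS, a mobile app, or a game without a shared database schema.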
Slide 22
Edu. Data Storage
Slide 27
Constraints
• $100 million investment
• Aim: personalized learning in public schools, through data & technology standards
• 9 US states participated; by 2013, data about millions of children had been stored
Slide 32
Competences
Drachsler, H., Stoyanov, S., d'Aquin, M., Herder, E., Dietze, S., & Guy, M. (2014, 16-19
September). An Evaluation Framework for Data Competitions in TEL. 9th European
Conference on Technology-Enhanced Learning (EC-TEL 2014), Graz, Austria.
Slide 35
Sophistication model
Siemens, G., Dawson, S., & Lynch, G. (2014). Improving the Quality and Productivity of
the Higher Education Sector – Policy and Strategy for Systems-Level Deployment of
Learning Analytics. Canberra, Australia: Office of Learning and Teaching, Australian
Government. Retrieved from http://solaresearch.org/Policy_Strategy_Analytics.pdf
LA Sophistication Model
Slide 36
@HDrachsler, #LASI_NL, Zeist, Netherlands
Slide 50 / 29 June 2014
Suggestions to do your own LA:
• Creative data sourcing, with the necessary IT support
• Question-driven, not data- or IT-driven
• Participatory design of analytics tools
• One-size-fits-all does not work in LA and is not innovative
What is Learning Analytics?
The measurement, collection, analysis, and reporting of data about learners and their contexts.
Why LA?
To understand and optimize learning and the environments in which it occurs.
For example, when one competition asked teams to predict whether a student would drop out during the next ten days, based on student interactions with resources in an online course, there were many possible factors to consider. Teams might have looked at how late students turned in their problem sets, or whether they spent any time looking at lecture notes. But instead, MIT News reports, the two most important indicators turned out to be how far ahead of a deadline the student began working on their problem set, and how much time the student spent on the course website. These statistics weren't directly collected by MIT's online learning platform, but they could be inferred from the available data.
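A sketch of how such indicators can be derived from raw click logs rather than logged directly. The event format, the thresholds, and the simple rule standing in for the competition's actual predictive model are all invented for illustration:

```python
from datetime import datetime

# Hypothetical click-log events per student: (timestamp, event_type).
# Neither the data format nor the thresholds come from the MIT study;
# they only illustrate deriving features instead of logging them.
def derived_features(events, deadline):
    """Return (days the student started ahead of the deadline,
    approximate minutes spent on the course site)."""
    pset_starts = [t for t, kind in events if kind == "pset_open"]
    head_start = (deadline - min(pset_starts)).days if pset_starts else 0
    # Crude proxy: one minute of site time per page view.
    site_minutes = sum(1 for _, kind in events if kind == "page_view")
    return head_start, site_minutes

def likely_dropout(events, deadline, min_head_start=2, min_minutes=30):
    """Flag a student as at risk when both derived indicators are low."""
    head_start, minutes = derived_features(events, deadline)
    return head_start < min_head_start and minutes < min_minutes

deadline = datetime(2015, 11, 20)
events = [(datetime(2015, 11, 19), "pset_open"),
          (datetime(2015, 11, 19), "page_view")]
print(likely_dropout(events, deadline))  # started 1 day early, barely on site
```

A real system would feed these derived features into a trained classifier instead of fixed thresholds, but the point stands: the predictive signal lives in features computed from the logs, not in the logs themselves.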
Participatory design of analytics tools for non-statistics experts; develop capabilities to exploit (big) data
Multidisciplinary teams
Learning from successful examples
The framework mentioned above was used to structure the questionnaire, both to avoid bias toward a single perspective on learning analytics (e.g. data technologies) and to get a balanced overview of the field as a whole. The questionnaire focused on concrete aspects as follows:
• Stakeholders: the expected beneficiaries
• Objectives: the preference between reflective use of analytics and prediction, and the development areas where benefits are most likely or expected
• Data: stances on sharing of and access to datasets
• Methods: trust in technology and algorithmic approaches
• Constraints: observations on ethical and privacy limitations (so-called soft barriers)
• Competences: confidence in exploiting the results of analytics in beneficial ways
• Limited response rate from Romance countries (France, Iberia, Latin America)
• High return from Anglo-Saxon countries
• Lack of responses from Russia, China, or India
Effects of LA on ...
The LinkedUp evaluation framework will be created and refined during each stage of the LinkedUp challenge to take into account both the increasing requirements of the competition and recent progress with respect to technological developments. The framework will persist and be maintained beyond the project duration, enabling the long-term evaluation and take-up of data-driven technologies according to a well-defined set of evaluation criteria, thresholds, and assessment methods. It will be validated through its ability to support the evaluation of applications participating in the LinkedUp Challenge, as well as through its uptake in the evaluation of other applications, in other competitions, situations, and domains.
point map of the 103 contributed quality indicators
(outcome of the multidimensional scaling analysis)
The multidimensional scaling analysis assigns each statement a bridging value between 0 and 1. Statements with low bridging values that are grouped together are coherent (statements 98, 52, 75, 99); statements with higher bridging values are grouped together but lie further apart (statements 95, 23, 50, 61). Statements close together on the map are close in meaning, i.e. they were clustered together by many participants.
There are three ways to label a cluster:
1. The system automatically suggests a list of labels per cluster; the top suggestion is the label of the participant's cluster whose centroid is closest to the centroid of the cluster formed by the aggregated cluster data from all participants.
2. Look at the bridging values: the lower the bridging values, the better those statements define the cluster.
3. Find the overarching theme of the cluster.
We combined all three methods to define the labels of the 8-cluster solution.
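The first method is essentially a nearest-centroid lookup. A small sketch; the coordinates and labels below are invented stand-ins for the 2-D positions produced by the scaling analysis:

```python
import math

def centroid(points):
    """Mean position of a set of 2-D points."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def suggest_label(aggregated_cluster, participant_clusters):
    """Return the participant label whose cluster centroid lies closest
    to the centroid of the aggregated cluster (labeling method 1)."""
    tx, ty = centroid(aggregated_cluster)
    def dist(item):
        cx, cy = centroid(item[1])
        return math.hypot(cx - tx, cy - ty)
    return min(participant_clusters.items(), key=dist)[0]

# Hypothetical 2-D map coordinates, purely for illustration.
agg = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25)]
labels = {
    "data quality": [(0.12, 0.18), (0.20, 0.20)],
    "usability":    [(0.80, 0.90), (0.70, 0.85)],
}
print(suggest_label(agg, labels))  # → data quality
```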
The tool automatically applies the experts' ratings to the cluster map. The system divides the ratings into five equivalent layers based on the average ratings provided by the participants for the rating maps: one layer indicates a low rating, five layers a high rating of the respective aspect. The ladder graph offers a visualisation well suited to comparing the clusters' ratings on importance and feasibility; the rating values are based on each cluster's average rating.
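The slides do not spell out the tool's exact banding rule; one plausible reading, assuming five equal-width bands between the lowest and highest average rating, can be sketched as:

```python
def rating_layer(avg, lo, hi, layers=5):
    """Map a cluster's average rating into one of `layers` equal-width
    bands between the observed minimum (lo) and maximum (hi);
    band 1 = lowest rating, band `layers` = highest.
    NOTE: the equal-width rule is an assumption, not the documented
    behaviour of the concept-mapping tool."""
    if hi == lo:
        return layers
    band = int((avg - lo) / (hi - lo) * layers) + 1
    return min(band, layers)

# Hypothetical per-cluster average ratings, for illustration only.
averages = {"learning belt": 4.4, "tech": 3.1, "human": 2.0}
lo, hi = min(averages.values()), max(averages.values())
print({name: rating_layer(v, lo, hi) for name, v in averages.items()})
```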
The Pearson product-moment correlation coefficient (r = 0.65) indicates a strong positive relationship between the two aspects of importance and feasibility.
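The coefficient itself is the standard Pearson r over the clusters' average ratings. A self-contained computation; the rating vectors below are invented, not the study data behind the r = 0.65 reported above:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-cluster average ratings (importance, feasibility).
importance  = [4.4, 3.9, 3.1, 2.8, 4.1, 3.5, 2.5, 3.8]
feasibility = [3.9, 3.2, 3.4, 2.6, 3.8, 3.0, 2.9, 3.3]
print(round(pearson_r(importance, feasibility), 2))
```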
The go-zone graph maps each statement into one of four quadrants, split at the mean values of the two rating aspects (importance on the x-axis, feasibility on the y-axis):
• lower left: neither important nor feasible
• upper left: feasible but not important
• lower right: important but not so feasible
• upper right: important and also feasible
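The quadrant assignment is just a comparison against the two mean ratings; a sketch, with invented example values:

```python
def go_zone_quadrant(importance, feasibility, mean_imp, mean_feas):
    """Assign a statement to a go-zone quadrant relative to the mean
    importance (x-axis) and mean feasibility (y-axis) ratings."""
    if importance >= mean_imp and feasibility >= mean_feas:
        return "go-zone: important and feasible"    # upper right
    if importance >= mean_imp:
        return "important but not so feasible"      # lower right
    if feasibility >= mean_feas:
        return "feasible but not important"         # upper left
    return "neither important nor feasible"         # lower left

# Hypothetical statement ratings and means, for illustration only.
print(go_zone_quadrant(4.2, 3.9, mean_imp=3.4, mean_feas=3.2))
```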
Importance map (left):
• the learning-related belt is deemed highly important
• all Eastern clusters also receive five layers of importance
• the focus of importance is on the learning-process-related clusters
Feasibility map (right):
• the technically-oriented clusters in the North (green circle) are deemed most feasible
• followed by the learning-related belt in the middle (yellow circle)
• concluded by the human-related clusters in the South (red circle)
Suitable indicators are collected from the go-zone graphs.