
Public PhD defense

Nowadays, we constantly interact with computers, mobile phones and wearable devices. These interactions leave behind the digital footprint of the user. In the so-called Big Data field, this data is used for different goals, such as predicting customer behaviour in marketing and health research. Learning Analytics tackles this challenge in the Technology Enhanced Learning field.

George Siemens defines Learning Analytics as the measurement, collection, analysis and reporting of data to understand and optimise learning. In this context, we find a variety of studies that process the data differently. Some studies implement complex algorithms and display the outcome to the user. Others rely on simpler approaches to process the data, but enable the user to explore it through understandable, comprehensive and usable visualisations. Users can draw conclusions on their own and, with this information, steer their own learning process. This thesis is situated in the latter line of work and intends to help students become autonomous and lead their own educational process.

This dissertation presents the work in the scope of four research questions: RQ1) What characteristics of learning activities can be visualised usefully for learners? RQ2) What characteristics of learning activities can be visualised usefully for teachers? RQ3) What are the affordances of and user problems with tracking data automatically and manually? RQ4) What are the key components of a simple and flexible architecture to collect, store and manage learning activity?

The exploration of these research questions includes: 1) three learning dashboard designs deployed in real courses, with 128 students participating in the evaluations; 2) the analysis of two Massive Open Online Courses (MOOCs) with 56,876 enrolled students; and 3) the deployment of an architecture in two real case studies, including a European project with more than 15 scheduled pilots.

Manual and automatic trackers each have benefits and drawbacks. For example, manual trackers respect user privacy in blended learning courses, but the data provided by students is not trusted by their fellow students. Automatic trackers are more accurate, but they do not capture activity away from the computer and therefore do not provide the complete picture that students demand.

This research also identifies three components of a simple and flexible architecture to collect data in open learning environments: 1) a set of simple services to push and pull the learning traces; 2) a simple data schema to ensure completeness and findability of the data; and 3) independent components to collect the learning activity.
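To make these three components concrete, below is a minimal sketch of such an architecture, assuming a JSON trace record and hypothetical endpoint and field names; the actual schema and services are the ones described in the thesis text (page 22), not this code.

```python
# Minimal sketch of the three components; endpoint and field names are
# assumptions for illustration, not the thesis schema.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
traces = []  # in-memory store; a real deployment would use a database

# 2) A simple data schema: who did what, where and when. Every tracker
#    maps its events onto these fields so traces stay findable.
REQUIRED_FIELDS = {"user", "verb", "object", "context", "timestamp"}

# 1) Simple services to push and pull learning traces.
@app.post("/traces")
def push_trace():
    trace = request.get_json(force=True)
    missing = REQUIRED_FIELDS - trace.keys()
    if missing:
        return jsonify({"error": f"missing fields: {sorted(missing)}"}), 400
    trace["stored_at"] = datetime.now(timezone.utc).isoformat()
    traces.append(trace)
    return jsonify(trace), 201

@app.get("/traces")
def pull_traces():
    # e.g. GET /traces?user=alice&context=course-1
    user = request.args.get("user")
    context = request.args.get("context")
    return jsonify([t for t in traces
                    if (user is None or t["user"] == user)
                    and (context is None or t["context"] == context)])

# 3) Independent collectors (a blog tracker, a Twitter tracker, ...) run
#    separately and POST their events to /traces.
```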



  1. Exploring Learning Analytics and Learning Dashboards from a HCI Perspective. Jose Luis Santos. PUBLIC PHD DEFENSE. http://bit.do/santos_scholar http://www.slideshare.net/jlsantoso
  2. DEFINITION (page 1 of the thesis text): "Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" - George Siemens [1]. [1] G. Siemens. "Learning analytics: envisioning a research discipline and a domain of practice". Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 4-8.
  3. IMPACT: MACRO-LEVEL
  4. IMPACT: MESO-LEVEL. https://www.flickr.com/photos/jaysantiago/7523644862
  5. IMPACT: MICRO-LEVEL. https://www.flickr.com/photos/fortworthpubliclibrary/5202801554
  6. IMPACT: MICRO-LEVEL. https://www.flickr.com/photos/fortworthpubliclibrary/5202801554 https://www.flickr.com/photos/mklapper/5812224468
  7. IMPACT: MICRO-LEVEL. https://www.flickr.com/photos/fortworthpubliclibrary/5202801554 https://www.flickr.com/photos/mklapper/5812224468 https://www.flickr.com/people/francisco_osorio/
  8. SCOPE: Classrooms and learning communities (see background section, pg. 3, of the thesis text)
  9.-13. (image-only slides)
  14. LEARNING DASHBOARDS
  15. SCOPE: Classrooms and learning communities; Learning Dashboards as Personal Informatics Tools (see background section, pg. 3, of the thesis text)
  16. The Iceberg (Abelardo Pardo's metaphor): the LMS above the surface, The Open below. https://www.flickr.com/photos/pere/523019984
  17. LD in the Open
  18. (image-only slide)
  19. LD in the Open
  20. LD in the Open
  21. SCOPE: Classrooms and learning communities; Learning Dashboards as Personal Informatics Tools; Open Learning Environments (see background section, pg. 3, of the thesis text)
  22. MOTIVATION: BJ Fogg's behaviour model, plotting motivation against ability; triggers fail on one side of the curve and succeed on the other. http://www.behaviormodel.org/
  23. SCOPE: Classrooms and learning communities; Learning Dashboards as Personal Informatics Tools; Open Learning Environments; Motivation (see background section, pg. 3, of the thesis text)
  24.-27. EVALUATION INSTRUMENTS (built up over four slides): Google Analytics, own tracking systems, SUS questionnaire, customised questionnaires, interviews. SUS scoring is sketched below.
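Among these instruments, the SUS questionnaire has a fixed scoring rule (Brooke, 1996) that is easy to get wrong; the sketch below computes the standard 0-100 score for one respondent, using hypothetical example data.

```python
def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale score (0-100) for one respondent.

    `responses` holds the ten 1-5 Likert answers, in questionnaire order.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # odd items positive, even negative
        for i, r in enumerate(responses)    # (i is 0-based)
    ]
    return sum(contributions) * 2.5

# Example respondent (hypothetical data):
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```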
  28.-31. METHODOLOGY: design-based research. Iterative, in real environments, with no controlled variables; leads to knowledge that can be used by practitioners.
  32. CASE STUDIES
  33.-39. OPEN LEARNING - CASE STUDY 1
  40. What did we learn?
  41. What did we learn? What came next?
  42.-44. OPEN LEARNING - CASE STUDY 2
  45.-46. What did we learn?
  47. What did we learn? What came next?
  48. (image-only slide)
  49. OPEN LEARNING - CASE STUDY 3
  50. What did we learn?
  51.-52. MOOCs. https://coma.uned.es/
  53. EBL - Enquiry Based Learning
  54.-55. RESEARCH QUESTIONS. RQ1: What characteristics of learning activities can be visualised usefully for learners? RQ2: What characteristics of learning activities can be visualised usefully for teachers? RQ3: What are the affordances of and user problems with tracking data automatically and manually? RQ4: What are the key components of a simple and flexible architecture to collect, store and manage learning activity?
  56. Approach: 3 learning analytics dashboards, 128 students, 5 case studies, real settings; usability and perceived usefulness.
  57. Publications:
  - Santos et al. 2012. "Goal-oriented visualizations of activity tracking: a case study with engineering students". In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK '12). ACM, New York, NY, USA, 143-152.
  - Santos et al. 2013a. "Addressing learner issues with StepUp!: an evaluation". In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13). ACM, New York, NY, USA, 14-22.
  - Santos et al. 2013b. "Evaluating the use of open badges in an open learning environment". In Proceedings of the Eighth European Conference on Technology Enhanced Learning, Scaling up Learning for Sustained Impact. Springer Berlin Heidelberg, Berlin, Germany, 314-327.
  - Santos et al. 2014. "Success, activity and drop-outs in MOOCs, an exploratory study on the UNED COMA courses". In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (LAK '14). ACM, New York, NY, USA, 98-102.
  - Santos et al. 2015. "Tracking Data in Open Learning Environments". Journal of Universal Computer Science, Vol. 21, No. 7, pp. 976-996.
  58. Problem (RQ1 problem statement; see page 14 of the thesis text): "As described in section 1.1, learning dashboards visualise learning traces, actions that students perform while they learn. In this context, RQ1 explores the usefulness of such traces in five different open learning courses."
  59. Approach (page 15 of the thesis text): characteristics visualised across Ch. 2, Ch. 3 and Ch. 4: time spent, artefacts produced, social interaction, resource use, exercise/test results [3,4]. [3] K. Verbert, E. Duval, J. Klerkx, S. Govaerts, and J. L. Santos. "Learning Analytics Dashboard Applications". American Behavioral Scientist 57.10 (2013), pp. 1500-1509. [4] K. Verbert, S. Govaerts, E. Duval, J. L. Santos, F. Van Assche, G. Parra, and J. Klerkx. "Learning dashboards: an overview and future research opportunities". Personal and Ubiquitous Computing 18.6 (2014), pp. 1499-1514.
  60.-62. Outcomes (page 16 of the thesis text): the same table of characteristics (time spent, artefacts produced, social interaction, resource use, exercise/test results) per chapter, annotated with comparison (slide 60), social activity (slide 61) and individual vs. group work (slide 62). A sketch of deriving the time-spent characteristic from raw traces follows.
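As an illustration of how such a characteristic is computed before it can be visualised on a dashboard, the sketch below aggregates raw traces into time spent per student per day; the field names and data are hypothetical, not the thesis schema.

```python
from collections import defaultdict
from datetime import datetime

def time_spent_per_day(traces):
    """Sum tracked durations (in seconds) per (user, day) pair."""
    totals = defaultdict(float)
    for t in traces:
        day = datetime.fromisoformat(t["timestamp"]).date().isoformat()
        totals[(t["user"], day)] += t["duration_seconds"]
    return dict(totals)

# Hypothetical traces for one student on one day:
traces = [
    {"user": "s1", "timestamp": "2013-03-04T10:00:00", "duration_seconds": 1800},
    {"user": "s1", "timestamp": "2013-03-04T14:00:00", "duration_seconds": 600},
]
print(time_spent_per_day(traces))  # {('s1', '2013-03-04'): 2400.0}
```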
  63. RESEARCH QUESTIONS (repeated; see slides 54-55)
  64. Approach: exploratory study, 2 language learning MOOCs, 56,876 students enrolled.
  65. Publications: as on slide 57 (Santos et al. 2012, 2013a, 2013b, 2014, 2015).
  66. Problem (RQ2 problem statement; see page 17 of the thesis text): "Results of our analyses [94, 95] report that dashboards for teachers are designed to raise awareness of the activities taking place in the course, analyse activity and plan interventions, among others. Related to activity analysis, we explored what teachers can actually learn from visualisations."
  67. Approach (table with data from page 17 and chapter 5 of the thesis text): drop-outs [5,6,7], language use [8] and social interaction [9] mapped against time spent, artefacts produced, social interaction, resource use and exercise/test results. [5] C. Alario-Hoyos et al. "Analysing the Impact of Built-In and External Social Tools in a MOOC on Educational Technologies". ECTEL'13, Vol. 8095, LNCS. Springer, 2013, pp. 5-18. [6] D. Clow. "MOOCs and the funnel of participation". Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13). ACM, 2013, pp. 185-189. [7] H. Spoelstra et al. "Team formation instruments to enhance learner interactions in open learning environments". Computers in Human Behavior 45 (2015), pp. 11-20. [8] P. Levy. "Technology-Supported Design for Inquiry-Based Learning". Exploring Learning & Teaching in Higher Education. Springer, 2015, pp. 289-304. [9] N. Michinov et al. "Procrastination, participation, and performance in online learning environments". Computers & Education 56.1 (2011), pp. 243-252.
  68.-70. Outcome (table with data from page 18 of the thesis text): the same table, annotated with attention to the first units of the lessons (slide 68), open questions marked "?" (slide 69) and thresholds (slide 70).
  71. RESEARCH QUESTIONS (repeated; see slides 54-55)
  72. Problem (RQ3 problem statement; see page 19 of the thesis text): "Therefore, we consider it relevant to evaluate how students perceived automatic and manual trackers." Pros and cons: manual trackers, privacy (pro) and tracking fatigue [10] (con); automatic trackers, no tracking fatigue (pro) and privacy (con). [10] E. K. Choe, N. B. Lee, B. Lee, W. Pratt, and J. A. Kientz. "Understanding quantified-selfers' practices in collecting and exploring personal data". Proceedings of the 32nd annual ACM conference on Human factors in computing systems. ACM, 2014, pp. 1143-1152.
  73. Approach (table with data from page 20 of the thesis text): lab sessions, with no learning activity outside of the classroom, used automatic trackers (Rabbit Eclipse plug-in*); blended learning courses, with a big part of the learning activity outside of the classroom, used manual trackers. * https://marketplace.eclipse.org/content/rabbit
  74. Outcome: lab sessions (automatic trackers): no privacy concerns, no tracking fatigue; blended learning courses (manual trackers): lack of tracking, over-reporting.
  75. RESEARCH QUESTIONS (repeated; see slides 54-55)
  76. Publications: as on slide 57 (Santos et al. 2012, 2013a, 2013b, 2014, 2015).
  77. Approach: experience report, two environments (image at page 21 of the thesis text).
  78. Outcome (three elements described on page 22 of the thesis text): trackers (Rabbit Eclipse plugin, RescueTime, Wordpress API, Blogspot API, Medium / RSS, Twitter, Toggl) feed REST services, which serve a dashboard and a badge system; hosted in the cloud on Google App Engine. The three elements: 1) a common data schema; 2) REST services; 3) trackers. A collector-side sketch follows.
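To illustrate how an independent collector talks to these REST services, here is a small sketch of one tracker pushing a single trace; the host, path and fields are hypothetical stand-ins for the deployed services.

```python
import requests

# Hypothetical endpoint; the real services were hosted on Google App Engine.
SERVICE = "https://example-tracker.appspot.com/traces"

trace = {  # one event mapped onto the common data schema (assumed fields)
    "user": "student42",
    "verb": "posted",
    "object": "http://example.blogspot.com/2013/03/my-post.html",
    "context": "open-learning-course-1",
    "timestamp": "2013-03-04T10:15:00Z",
}

resp = requests.post(SERVICE, json=trace, timeout=10)
resp.raise_for_status()  # fail loudly if the service rejects the trace
print("stored:", resp.json())
```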
  79. Wrap-up
  80. 3 learning analytics dashboards; 128 students actually used the learning dashboards; 56,876 students enrolled in the MOOC courses; the architecture was deployed in more than 10 case studies (image at page 137 of the thesis text).
  81. Publications and RQs (C = conference, J = journal). RQ1: Chapter 2, Santos et al. 2012 (C); Chapter 3, Santos et al. 2013a (C); Chapter 4, Santos et al. 2013b (C). RQ2: Chapter 5, Santos et al. 2014 (C). RQ3: Chapter 2, Santos et al. 2012 (C); Chapter 3, Santos et al. 2013a (C). RQ4: Chapter 6, Santos et al. 2015 (J).
  82. Citations (external / own): chapter 2: 28 / 7; chapter 3: 14 / 5; chapter 4: 9 / 4; chapter 5: 10 / 0; chapter 6: 0 / 0; other co-authored publications: 199 / 48.
  83.-87. (image-only slides)
  88. http://jlsantoso.blogspot.be/2013/05/reveal-it-applied-in-educational-context.html
  89. 8 May 2013 | @svencharleer | svencharleer.com. http://ceur-ws.org/Vol-1103/paper5.pdf
  90. http://ceur-ws.org/Vol-1103/paper5.pdf
  91. "What is research but a blind date with knowledge?" - Will Harvey
  92. Thank you for your attention! Looking forward to answering your questions.
