Learning Analytics 101

Learning analytics is an emerging topic of interest at all levels of education, focused on how to harness the power of data mining, interpretation, and modeling.

However, several similar terms (academic analytics, predictive analytics, business intelligence, etc.) can confuse educators and administrators alike. In this session, we will unpack this new area of interest and discuss how institutions can begin to leverage available products and open source communities to use analytics to improve understanding of teaching and learning and to tailor education more effectively.

We will briefly present an overview of the learning analytics field, drawing from popular examples such as the Signals project at Purdue U. and the Check My Activity tool at U. Maryland, Baltimore County. We will also review the structure of Sakai CLE and OAE user-level metrics and briefly discuss projects to design and implement tools to utilize these metrics in meaningful ways.

Published in: Education, Technology

Learning Analytics 101

  1. Steve Lonn, University of Michigan; Josh Baron, Marist College. June 10-15, 2012. Growing Community; Growing Possibilities.
  2. (1) What is Learning Analytics (LA)? (2) Current LA work in Higher Education. (3) Data available in Sakai CLE & OAE. (4) Big Questions to Ponder. (5) Q&A. Slides available: slideshare.net/stevelonn/
  3. "...datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze." Manyika et al. (2011)
  4. [Image-only slide]
  5. Analytics: An overarching concept that is defined as data-driven decision making. (van Barneveld, Arnold, & Campbell, 2012; adapted from Ravishanker)
  6. [Image-only slide]
  7. Business / Academic Analytics: A process for providing higher education institutions with the data necessary to support operational and financial decision making. (van Barneveld, Arnold, & Campbell, 2012; adapted from Goldstein and Katz)
  8. Educational Data Mining and Learning Analytics (Bienkowski, Feng, & Means, 2012; SRI International). evidenceframework.org/big-data/
  9. Generally emphasizes reduction into small, easily analyzable components, which can then be adapted to the student by software (Siemens & Baker, 2012).
      ◦ Predicting future learning behavior
      ◦ Domain models for content / sequences
      ◦ Software-provided pedagogical supports
      ◦ Computational models that incorporate student, domain, and pedagogy
  10. Example: Cognitive Tutors (Pittsburgh Advanced Cognitive Tutor Center, Carnegie Mellon University). http://ctat.pact.cs.cmu.edu
  11. Educational Data Mining: A process for analyzing data collected during teaching and learning to test learning theories and inform educational practice. (Bienkowski, Feng, & Means, 2012)
  12. Understand entire systems and support human decision making.
      ◦ Applies known methods & models to answer questions about learning and organizational learning systems
      ◦ Tailored responses: adapted instructional content, specific interventions, providing specific feedback
  13. Learning Analytics: The use of analytic techniques to help target instructional, curricular, and support resources to support the achievement of specific learning goals through applications that directly influence educational practice. (van Barneveld, Arnold, & Campbell, 2012; adapted from Bach)
  14. Predictive Analytics: uncovers relationships and patterns that can be used to predict behavior and events. Visual Data Analytics: discovering and understanding patterns in large datasets via visual interpretation.
  15. Term / Definition / Level of Focus:
      ◦ Analytics: An overarching concept that is defined as data-driven decision making. (All levels)
      ◦ Academic Analytics: A process for providing higher education institutions with the data necessary to support operational and financial decision making. (Institution)
      ◦ Educational Data Mining: A process for analyzing data collected during teaching and learning to test learning theories and inform educational practice. (Department / Instructor / Learner)
      ◦ Learning Analytics: The use of analytic techniques to help target instructional, curricular, and support resources to support the achievement of specific learning goals through applications that directly influence educational practice. (Department / Instructor / Learner)
  16. Who's been working in this space in Higher Education?
  17. Purdue University's Course Signals (college-wide learning analytics approach); University of Michigan's E2Coach (course-specific learning analytics approach); UMBC's "Check My Activity" tool (student-centered learning analytics approach).
  18. Built a predictive model using data from the LMS (events such as logins, content views, and discussions, plus the gradebook) and the SIS (aptitude data such as SAT/ACT and GPA, plus demographic data). The model is leveraged to create an early-alert system: identify students at risk of not completing the course, then deploy interventions to increase their chances of success. The system automates the intervention process: students get a "traffic light" alert in the LMS, and messages posted to the student suggest corrective action (e.g., practice tests).
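To make the early-alert mechanics concrete, here is a minimal sketch of the kind of risk score and traffic-light mapping such a system could compute from LMS and SIS features. The feature names, weights, and thresholds are illustrative assumptions, not Purdue's actual Course Signals algorithm.

```python
# Sketch of an early-alert risk model in the spirit of Course Signals.
# Features, weights, and thresholds are illustrative assumptions only.
import math
from dataclasses import dataclass

@dataclass
class StudentRecord:
    logins_last_2_weeks: int   # LMS event data
    gradebook_pct: float       # current gradebook score, 0-100
    gpa: float                 # SIS aptitude data
    sat_percentile: float      # SIS aptitude data

def risk_score(s: StudentRecord) -> float:
    """Return a 0-1 risk estimate from a hand-weighted linear model."""
    z = (2.0
         - 0.15 * s.logins_last_2_weeks
         - 0.04 * s.gradebook_pct
         - 0.50 * s.gpa
         - 0.01 * s.sat_percentile)
    return 1.0 / (1.0 + math.exp(-z))   # logistic squashing to 0-1

def traffic_light(risk: float) -> str:
    """Map risk to the green/yellow/red 'signal' a student would see."""
    if risk < 0.3:
        return "green"
    if risk < 0.6:
        return "yellow"
    return "red"

if __name__ == "__main__":
    student = StudentRecord(logins_last_2_weeks=3, gradebook_pct=62.0,
                            gpa=2.4, sat_percentile=45.0)
    r = risk_score(student)
    print(f"risk={r:.2f}, signal={traffic_light(r)}")
```

In a real deployment the weights would be fit to historical outcomes rather than set by hand, but the overall shape (features in, risk out, risk mapped to an alert) is the point of the sketch.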
  19. Impact on course grades and retention: students in courses using Course Signals scored up to 26% more A or B grades, up to 12% fewer Cs, and up to 17% fewer Ds and Fs (Arnold & Pistilli, 2012, LAK). Course Signals is an Ellucian product that integrates with Blackboard; the Open Academic Analytics Initiative (OAAI) is creating a similar Sakai-based open source solution.
  20. Focused specifically on introductory physics. Uses data from a pre-course survey (academic info, learner's goals, psycho-social factors) and from performance measures (exams, web homework, Sakai). Built on the Michigan Tailoring System (MTS), an open source tool designed for highly customized messaging and used in the health sciences for behavior change; messaging is based on input from many sources, "…to say to each what we would say if we could sit down with them for a personal chat."
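As a rough illustration of the tailored-messaging idea, the sketch below combines hypothetical survey and performance fields into a personalized message using simple rules. It is not the actual Michigan Tailoring System; every field name, rule, and message here is an assumption.

```python
# Illustrative rule-based message tailoring, loosely in the spirit of E2Coach / MTS.
# Field names, rules, and message text are assumptions for this sketch only.
def tailor_message(profile: dict) -> str:
    parts = [f"Hi {profile.get('name', 'there')},"]
    if profile.get("exam1_pct", 100) < 70:
        parts.append("Your first exam score suggests reviewing the practice problem sets.")
    if profile.get("web_hw_completion", 1.0) < 0.8:
        parts.append("Completing the remaining web homework is the quickest way to recover points.")
    if profile.get("goal") == "pre-med":
        parts.append("Students with your goals typically aim for a B+ or better in this course.")
    return " ".join(parts)

print(tailor_message({"name": "Alex", "exam1_pct": 64,
                      "web_hw_completion": 0.7, "goal": "pre-med"}))
```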
  21. [Image-only slide]
  22. UMBC found that students earning Ds and Fs used Blackboard 39% less than higher-grade achievers. This is not a claim of cause and effect; the goal is to model higher achievers' behavior. The tool provides data directly to the student, who can compare their LMS use to class averages and can also compare average usage data to grade outcomes. Feedback has been positive.
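A minimal sketch of the "compare my activity to the class average" calculation behind a Check My Activity-style display follows; the data shapes are assumptions, not UMBC's implementation.

```python
# Minimal sketch of a Check-My-Activity-style comparison: one student's LMS
# activity as a ratio of the class average. Data values are made up.
from statistics import mean

def activity_vs_class(student_events: int, all_students_events: list[int]) -> float:
    """Return the student's activity as a fraction of the class average (1.0 = average)."""
    class_avg = mean(all_students_events)
    return student_events / class_avg if class_avg else 0.0

class_counts = [120, 95, 210, 60, 180, 75, 140]   # LMS event counts per student
ratio = activity_vs_class(student_events=60, all_students_events=class_counts)
print(f"This student's LMS activity is {ratio:.0%} of the class average.")
```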
  23. Student Success Plan (Sinclair Community College): holistic case-management system connecting faculty, advisors, counselors, & students; a Jasig Incubation Project. STAR Academic Journey (University of Hawaii): online advising and degree attainment system. SNAPP (UBC/Wollongong): visualizes networks of interaction resulting from discussion forum posts and replies.
  24. ◦ Papers and articles on Purdue's Course Signals: http://www.itap.purdue.edu/learning/research/
      ◦ Michigan's Expert Electronic Coaching: http://sitemaker.umich.edu/ecoach/home
      ◦ UMBC's Check My Activity tool: http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/VideoDemoofUMBCsCheckMyActivit/219113
      ◦ Student Success Plan: http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/TheStudentSuccessPlanCaseManag/242785
      ◦ STAR Academic Journey: http://net.educause.edu/ir/library/pdf/pub7203cs7.pdf
      ◦ SNAPP: http://research.uow.edu.au/learningnetworks/seeing/snapp
  25. What can we know in the CLE and OAE products?
  26. User-level data is stored as "events":
      ◦ sakai_event: EVENT_ID, EVENT_DATE, EVENT, REF, SESSION_ID, EVENT_CODE, CONTEXT
      ◦ sakai_session: SESSION_ID, SESSION_USER, SESSION_IP, SESSION_USER_AGENT, SESSION_START, SESSION_END, SESSION_SERVER, SESSION_ACTIVE, SESSION_HOSTNAME
      A list of events is available on Confluence (search for "event table description").
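For example, a query along these lines joins the two tables to count events per user within one course site. It is a sketch only: the parameter placeholder style and exact column types vary by database driver and Sakai version.

```python
# Sketch: count Sakai CLE events per user for a given course site by joining
# SAKAI_EVENT to SAKAI_SESSION. Placeholder style ('?') follows the DB-API
# qmark convention; adjust for your driver. Columns follow the tables above.
EVENTS_SQL = """
SELECT s.SESSION_USER AS user_id,
       e.EVENT        AS event_type,
       COUNT(*)       AS event_count
FROM   SAKAI_EVENT e
JOIN   SAKAI_SESSION s ON s.SESSION_ID = e.SESSION_ID
WHERE  e.CONTEXT = ?
GROUP BY s.SESSION_USER, e.EVENT
ORDER BY event_count DESC
"""

def events_per_user(conn, site_id: str):
    """Run the sketch query on an open DB-API connection."""
    cur = conn.cursor()
    cur.execute(EVENTS_SQL, (site_id,))
    return cur.fetchall()
```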
  27. Site-level data is stored in separate tables:
      ◦ sakai_site: SITE_ID, TITLE, TYPE, SHORT_DESC, DESCRIPTION, ICON_URL, INFO_URL, SKIN, PUBLISHED, JOINABLE, PUBVIEW, JOIN_ROLE, CREATEDBY, MODIFIEDBY, CREATEDON, MODIFIEDON, CUSTOM_PAGE_ORDERED, IS_SPECIAL, IS_USER
      ◦ sakai_realm: REALM_KEY, REALM_ID, PROVIDER_ID, MAINTAIN_ROLE, CREATEDBY, MODIFIEDBY, CREATEDON, MODIFIEDON
      The two are related via realm_id like '/site/' || site_id.
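Building on the realm naming convention above (REALM_ID = '/site/' || SITE_ID), a sketch query relating published course sites to their realms might look like the following. Treat it as illustrative rather than production SQL; concatenation syntax and available columns differ across databases and Sakai versions.

```python
# Sketch: relate published course sites to their realms using the
# REALM_ID = '/site/' || SITE_ID convention noted above. Illustrative only.
SITE_REALM_SQL = """
SELECT st.SITE_ID,
       st.TITLE,
       st.TYPE,
       r.PROVIDER_ID          -- roster/provider attached to the site realm
FROM   SAKAI_SITE  st
JOIN   SAKAI_REALM r
  ON   r.REALM_ID = '/site/' || st.SITE_ID
WHERE  st.PUBLISHED = 1
  AND  st.IS_SPECIAL = 0
  AND  st.IS_USER = 0
"""

def course_site_realms(conn):
    """Run the sketch query on an open DB-API connection."""
    cur = conn.cursor()
    cur.execute(SITE_REALM_SQL)
    return cur.fetchall()
```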
  28. [Chart: counts of project sites, course sites, and maximum users per term, Fall 2004 (F04) through Winter 2012 (W12). Thanks to John Leasia.]
  29. [Chart: distribution of CLE tool activity across Presence, Web Content, Resources, Attachments, Test Center, Assignments, Syllabus, Forums, Gradebook, Drop Box, and Evaluations.]
  30. [Chart: percentages (0-100%) by school/college: Social Work, Architecture, Engineering, Business, LS&A, Education, Public Health, Art & Design, Law, Nursing, Music, Medicine, Dentistry, Pharmacy.]
  31. [Chart: percentages (0-100%) by instructor title: Clinical Assoc Prof, Clinical Lecturer, Clinical Professor, Clinical Asst Prof, Asst Professor, Professor, Assoc Professor, Adjunct Clin Asst Professor, Adjunct Clinical Lecturer, Adjunct Clin Assoc Prof.]
  32. [Charts: sum of revisions by subject (BIT, ENGR, NURS, IOE, SI, ENGLISH, RCHUMS, AAAS, EECS, RCLANG) and count of course sites by subject (SI, ENGLISH, BIT, EECS, PSYCH, COMP, MODGREEK, NURS, NRE).]
  33. Summary information about site visits, tool activity, and resource activity.
  34. [Image-only slide]
  35. In OAE, user-level data is available via "activity feeds", which follow a "push and publication" model rather than the CLE's "store and query" model. Activity is both highly specific (individual interactions between users, content, and contexts) and more general (user interaction everywhere, rather than only within a single course context). What new questions will we ask? Interesting activity can also happen in external capabilities (CLE tools, LTI tools, widgets); how will we ensure this data is captured? Many thanks to Nate Angell for the OAE slides.
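To illustrate the push-and-publish idea in contrast to store-and-query, here is a hypothetical sketch of a tiny activity bus: producers (CLE tools, LTI tools, widgets) publish activity as it happens, and consumers subscribe in order to capture or react to it. The activity shape, verbs, and class names are assumptions, since the OAE designs were still in draft.

```python
# Hypothetical sketch of a push-style activity feed, contrasting with the
# CLE's store-and-query model. Activity shape and verb names are assumptions.
from collections import defaultdict
from typing import Callable

class ActivityBus:
    """Tiny publish/subscribe hub: producers push activities, consumers react."""
    def __init__(self):
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, verb: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[verb].append(handler)

    def publish(self, activity: dict) -> None:
        for handler in self._subscribers[activity["verb"]]:
            handler(activity)

bus = ActivityBus()
bus.subscribe("content.viewed",
              lambda a: print(f"{a['actor']} viewed {a['object']} in {a['context']}"))

# A producer (LMS tool, LTI tool, widget, ...) pushes activity as it happens:
bus.publish({"verb": "content.viewed", "actor": "student42",
             "object": "syllabus.pdf", "context": "PHYSICS-140"})
```

The analytics question raised on the slide then becomes one of making sure such published activities are also persisted somewhere analyzable, not just displayed in a feed.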
  36. FYI: Designs are still in draft form.
  37. FYI: Designs are still in draft form.
  38. FYI: Designs are still in draft form.
  39. Activity (OAE) & Grades (CLE): Week 1. Developed by the Kaleidoscope Project in collaboration with rSmart.
  40. Activity (OAE) & Grades (CLE): Week 7. Developed by the Kaleidoscope Project in collaboration with rSmart.
  41. Activity (OAE) & Grades (CLE): Animation. Developed by the Kaleidoscope Project in collaboration with rSmart.
  42. Tools / services to support analytics initiatives: ways to connect different silos of data; methods to connect back to the CLE / OAE (LTI? Web services? Others?). OAE improvements over the CLE approach to user data: What data is most relevant for analytics? What displays and/or data are most useful to help learners?
  43. Josh Baron, Marist College
  44. Data Mining vs. Learning Science approaches: Do we build predictive models from large data sets or from our understanding of the learning sciences? Is "both" the right answer? How does that work?
      ◦ Challenges of scaling LA across higher education: Does each institution have to build its own model? How "portable" are predictive models? Do we need an open standard for LA? Could LIS and LTI play a role?
      ◦ How can LA be used to assist ALL students? Michigan's E2Coach system is a good example.
  45. "The obligation of knowing" (John Campbell): If we have the data and tools to improve student success, are we obligated to use them? Consider this: if a student has a 13% chance of passing a course, should they be dropped? What about 3%?
      ◦ Who owns the data: the student or the institution? Should students be allowed to "opt out"? Consider this: is it fair to the other students if opting out reduces the predictive model's power?
      ◦ What do we reveal to students? To instructors? Consider this: if we tell a student in week three that they have a 9% chance of passing, what will they do? Will instructors begin to "profile" students?
  46. Connect with Learning Analytics communities.
  47. http://www.solaresearch.org/: Learning Analytics & Knowledge (LAK) conferences; STORM, an initiative to help fund research projects; FLARE, a regional practitioner conference (Purdue University, Oct 1-3, 2012).
  48. Symposium on Learning Analytics at Michigan: http://sitemaker.umich.edu/slam/. 15 speakers (12 UM, 3 external); videos & slides available from all speakers.
  49. ◦ Analytics in Higher Education: Establishing a Common Language (van Barneveld, Arnold, & Campbell, 2012). http://www.educause.edu/Resources/AnalyticsinHigherEducationEsta/245405
      ◦ Analytics to Literacies: Emergent Learning Analytics to evaluate new literacies (Dawson, 2011). http://blogs.ubc.ca/newliteracies/files/2011/12/Dawson.pdf
      ◦ Learning Analytics: Definitions, Processes and Potential (Elias, 2011). http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
      ◦ The State of Learning Analytics in 2012: A Review and Future Challenges (Ferguson, 2012). http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf
      ◦ Academic analytics: A new tool for a new era (Campbell, DeBlois, & Oblinger, 2007). EDUCAUSE Review, 42(4), 40-57. http://net.educause.edu/ir/library/pdf/ERM0742.pdf
      ◦ Mining LMS data to develop an "early warning system" for educators: A proof of concept (Macfadyen & Dawson, 2010). Computers & Education, 54(2), 588-599.
      ◦ Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers (Fritz, 2011). Internet and Higher Education, 14(2), 89-97.
  50. Wednesday, 13 June:
      ◦ Learning Analytics: A Panel Debate on the Merits, Methodologies, and Related Issues (1:15pm)
      ◦ Learning Analytics at Michigan: Designing Displays for Advisors, Instructors, and Students (2:30pm)
      ◦ BOF for Learning Analytics: Current and Planned Projects and Tools (3:45pm)
      Thursday, 14 June:
      ◦ Creating an Open Ecosystem for Learner Analytics (10:15am): Open Academic Analytics Initiative (OAAI), https://confluence.sakaiproject.org/x/8aWCB
  51. Steve Lonn: slonn@umich.edu, @stevelonn. Josh Baron: Josh.baron@marist.edu, @joshbaron.
