
Momentum Predictive Analytics Jam Brief


MOMENTUM is a program of online, collaborative inquiry hosted by the Bill & Melinda Gates Foundation in collaboration with Knowledge in the Public Interest, to advance compelling strategies for student success in higher education.

An online Jam on Predictive Analytics, held on August 7, 2013, probed the challenges and opportunities that predictive analytics offers for boosting college student persistence and completion. This is the Jam Brief, a synopsis of the major themes that surfaced during the Jam.



  1. Momentum Brief: Predictive Analytics. Presented by Knowledge in the Public Interest
  2. Assume you are a lifeguard on a crowded beach. With no “data” or “predictive model,” you usually wait for someone to scream “HELP” before acting. With analytics, if you could see that someone’s heart rate was increasing, you might proactively swim out to help them. From a causation standpoint, you don’t know if they are having a heart attack, are caught in the undertow, or saw a shark. You just know there are signs of trouble. Once you get there, you use your training to determine the cause and try to solve the problem. That’s one way to think about using analytics with students who are at risk. - Mike
  3. Predictive analytics is a BIG topic, with implications for resource allocation by institutional and system leaders, for pedagogical change among faculty, for advising targets and methods, and for student and family choices. The Jam provides evidence of tremendous appetite and interest in the subject and uncovers widespread experimentation with data-driven models and actions in all of these areas and in all corners of higher education. The number one challenge offers no surprise. Given the criticality of cross-discipline collaboration (tech / institutional research / pedagogy / non-cognitive / policy, etc.) for both the generation and take-up of robust “data for change,” there is an essential requirement for leadership and incentives. When these are missing, the ubiquitous inward-looking and siloed nature of higher education wins out, questions go unanswered, and change is stymied.
  4. Academic freedom creates millions of experiments across the ecosystem, but we currently lack the data infrastructure required to capture and use the outcomes of those experiments to iterate on and improve our educational delivery. Our thesis is that the real promise of analytics is to show what kind of improvement is possible when you rigorously track and analyze intervention efficacy across schools, and then use those results to encourage additional rigor in deploying interventions. - Charles
     While we speak of predictive analytics, which at its essence is the identification of patterns in large volumes of data, most activists are operating in a “pre-pattern” zone. Their ambition is to predict but, by and large, their capability is to identify.
  5. If predictive analytics holds out the possibility of not only seeing the swimmer in distress before she cries “help,” but also predicting the actions or interventions that are likely to be of most help to her, then this may require more data and more analytic sophistication than most institutions can muster alone, and call for multi-institution/system collaborations that share the expense and benefits of working at scale. It is more likely that advances in personalized learning and non-cognitive methods will emerge from the work of data commons than from a dispersion of individual initiatives. The PAR Framework and Completion by Design, along with work by iData and Civitas, are compelling examples of the value of inter-institutional collaborations, particularly for data definitions and metrics.
  6. A change has occurred in WHO is thinking about data.
     Much of what is labeled predictive analytics are studies that most IR offices have been doing for years. But some are now driven by faculty or other non-IR/non-traditional data staff. - Randy
     Data teams are the future of the field. Actually, that was the “future” before we found big data. It is still true. - Russ
     This is important because good data ill-used is demoralizing. I have been involved in projects where we had good/actionable data but no one to give it to who could do something about it. Frustrating, but a realistic concern. Early alert systems often falter because the resources to respond to them are limited. - Russ
     Most reporting or data analysis projects do not fail because of the reporting tools or databases. They fail because of lack of trust in the data that was produced. This trust is only built through inclusive collaboration, engagement, and transparency in the process and results. - Brian
  7. The action is around improving teaching and learning.
     Advising individual students effectively at scale: How can we work with each student as an individual? First we have to “understand” who they are. Gathering rich data is a strong way to do this, but it is not enough by itself. No instructor can comprehend the details in the backgrounds of 400 students. This is where technology can help. If we can think of different kinds of students, with different backgrounds, goals, and affects, we can imagine what we would say to these students if we met them. Technology can then help us actually say it. - Tim
     Identifying students at risk of failing: In our experience, the intersection of student background and preparation with their behaviors at an institution offers the greatest predictive insights into student success. We have all spent enough time in education to know that the relatively short list of blunt demographic variables that are consistently tracked across schools doesn't generally result in strong enough linear correlations to student success to support recommendations or predictions. - Charles
     Purdue University’s Signals program uses demographics, predictive analytics, and performance to give a red/yellow/green light evaluating student status... (It) looks at student engagement with online resources for courses and compares the level of engagement with students who have been successful in the past, (and) I believe it also includes early academic assignments. So it addresses students who fall behind in their classes at the beginning of the semester, before midterms or major assignments. This is a common problem at large institutions. - Don
  8. Generating information that can guide instructor planning: In an action research project my research group did with CC instructors in bio, the faculty who set out to “help” students deal with a new type of teaching -- problem-based learning -- found, by looking at student engagement in the classroom (that's data too!) and student test results, that letting the students struggle helped most. The data informed change in their beliefs about their role vis-a-vis student learning. - Louise
     Using formative data to help students anticipate the consequences of meeting vs. not meeting course expectations: I like using… engagement analytics to foster the conversation with the student about how his/her behavior will influence course outcomes. Reminders are so critical, particularly coming from a teacher who can say, “Hey, when a student misses 3 homework assignments they tend to get a D on the mid-term. Do your homework!” - Louise
  9. What has been learned, and where are the red flags?
     Tools trump reports. They can stop data from overwhelming and make it “just in time.” Tools enable engagement with data (queries / what-ifs) that reports do not. A good learning analytics program is going to be layered with technical support and analytics expertise used to create tools which allow students, faculty, and administrators to ask and answer questions they find important… tools are more important than reports! - Don
     Instructional tools can both help a learner and also build the data supply. To learn HOW students learn, we need to get them using tools that expose the process of learning. - Diana
     A game being designed under the Gates Foundation - Refraction, from the Center for Game Science at the University of Washington - is interesting as an AI-type program which assesses a learner’s error and presents them with new individualized challenges to address that specific error/misconception. …This game yields masses of back-end data about user interaction and, arguably, thinking. - Bronwyn
     Data in the hands of learners can be empowering, but it can equally be demoralizing or overwhelming. Rather than an argument for not sharing, this is an admonition to scaffold the student experience. I think this is the main insight I'm getting from this discussion today: some big subset of “big data” in learner analytics needs to be in the service of the learner. We need to think about designs that foster confidence, persistence, and honest assessment of what learners know and what learners are still learning. - Louise
     An example of something we have done that is very prescriptive is that we plan out Academic Plans (MAP) with no placeholders. All spots have a default course in them, and we do not allow things like “take elective” or “take this or that.” We default to a known good value and present that to students at time of registration. We want them to have choice, guided by an advisor, but not to pick electives based on schedule openings at time of registration. To make this work we have brought chairs and advisors together to create “ideal” academic pathways through each program in FT & PT modes. - Russ
  10. Jammers want predictive analytics that identify good teaching and learning strategies and early intervention models. They want predictive analytics that:*
     - Predict student outcomes, thereby enabling interventions that promote persistence and completion. (215 people)
     - Link individual student needs to tailored advising services and personalized interventions. (70 people)
     - Examine student behaviors and needs and tie these to patterns of engagement with services, in order to develop “successful” and “at risk” profiles. (69 people)
     - Inform teaching and learning strategies to improve student outcomes. (44 people)
     - Enable early interventions and timely feedback so that interventions are proactive, not reactive. (36 people)
     *Analysis of responses to the open-ended question “What do you imagine are the top 1-3 opportunities for predictive analytics in your work?”
  11. If higher education could shorten the time lapse between collection (of good/clean data), analysis, and action, and make sure each cycle self-perpetuates and feeds the data back for further analysis, strides would be made in the experience and understanding of how to effectively support students through completion. The Jam exchange demonstrates interest and eagerness to engage. The opportunity lies in continued communication and transparency to minimize wheel reinvention and support innovation.
  12. A program of collaborative inquiry hosted by the Bill & Melinda Gates Foundation in collaboration with Knowledge in the Public Interest to advance compelling strategies for student success in higher education. kpublic.com | Momentum.edthemes.org | gatesfoundation.org
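The Signals-style red/yellow/green evaluation described on slide 7 can be sketched roughly as follows. This is a hypothetical illustration, not Purdue's actual model: the function name `risk_light`, the percentile thresholds, and the sample scores are all invented for the example, and the real system also folds in demographics and early academic performance.

```python
# Hypothetical sketch of a "traffic light" status: compare a student's
# engagement score to the distribution of engagement among students
# who succeeded in the past. Thresholds here are illustrative only.

def risk_light(engagement, successful_history):
    """Return 'green', 'yellow', or 'red' based on where the student's
    engagement falls relative to past successful students."""
    below = sum(1 for s in successful_history if s < engagement)
    percentile = below / len(successful_history)
    if percentile >= 0.50:
        return "green"   # at or above the median of past successes
    if percentile >= 0.25:
        return "yellow"  # lagging; worth a proactive check-in
    return "red"         # well below past successful students

history = [40, 55, 60, 70, 75, 80, 85, 90]  # illustrative scores
print(risk_light(82, history))  # green
print(risk_light(58, history))  # yellow
print(risk_light(30, history))  # red
```

Because the comparison is against students who were successful in the past, the light can turn yellow or red early in the term, before midterms or major assignments expose the problem.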
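The formative reminder Louise describes on slide 8 ("when a student misses 3 homework assignments they tend to get a D on the mid-term") is essentially a rule-based trigger, which a minimal sketch can make concrete. The threshold, function name, and message wording below are all illustrative assumptions, not an actual early-alert product.

```python
# Hypothetical early-alert nudge based on the quoted pattern that
# missing 3 homework assignments predicts a D on the midterm.

MISSED_THRESHOLD = 3  # illustrative; a real rule would be fit to data

def reminder(missed_assignments):
    """Return a nudge message once missed work crosses the threshold,
    or None if no reminder is warranted yet."""
    if missed_assignments >= MISSED_THRESHOLD:
        return ("You have missed {} homework assignments; students at "
                "this level have tended toward a D on the midterm. "
                "Do your homework!".format(missed_assignments))
    return None

print(reminder(3))
```

The point of the quote is less the rule itself than its framing: the message ties a present behavior to a likely course outcome while there is still time to change it.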
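Russ's "no placeholders" MAP example on slide 9 can also be sketched: every slot in the plan template holds a concrete default course, and advisor-guided swaps replace defaults rather than leaving vague entries like "take elective." The pathway structure, course codes, and function below are invented for illustration; they are not the actual MAP system.

```python
# Hypothetical sketch of defaulting every academic-plan slot to a
# known good course, with advisor-approved substitutions allowed.

DEFAULT_PATHWAY = {  # illustrative full-time program template
    "Term 1": ["ENG 101", "MATH 110", "BIO 101", "HIST 105"],
    "Term 2": ["ENG 102", "MATH 120", "BIO 102", "ART 115"],
}

def build_plan(pathway, overrides=None):
    """Copy the template and apply advisor-approved swaps; every slot
    stays a concrete course, never a placeholder like 'take elective'."""
    overrides = overrides or {}
    return {term: [overrides.get(c, c) for c in courses]
            for term, courses in pathway.items()}

# A student, with an advisor, swaps one default for another course.
plan = build_plan(DEFAULT_PATHWAY, overrides={"ART 115": "MUS 110"})
print(plan["Term 2"])
```

The design choice mirrors the quote: choice is preserved through guided overrides, but the presented default is always a known good value at registration time.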