Research analytics service - ARMA study tour
1. Scoping a Jisc Research Analytics Service
23 September 2019
ARMA Study Tour
Defining the problems and refining potential solutions
Christopher Brown
2. Data and Analytics
Business Intelligence
• Learning Analytics Service
www.jisc.ac.uk/learning-analytics
• Putting data to work to tackle strategic challenges in HE/FE
• Transform student learning experience
• Support student wellbeing
• Help students achieve more
• Analytics Lab / Community Dashboards
www.jisc.ac.uk/analytics-labs
• Proven, innovative and unique approach
• Co-created using the imagination and expertise of a wide
range of participants
• Identifies timely areas of wide applicability to UK education
(Student, Workforce, Staffing, Estates etc)
• Upskills the workforce / CPD (competency framework)
4. Analytics Lab
Topics explored by analytics labs so far include:
• Alternative providers
• Apprenticeships
• Athena Swan and race equality
• Business and community interaction
• Destinations of leavers
• Diversity
• Estates sector benchmarking
• Financial sustainability
• Finding comparable providers
• Grade improvement
• Higher education league tables
• Impact of Brexit
• International competitiveness
• Learning analytics
• Libraries
• Postgraduate admissions
• Postgraduate research
• Quality assurance
• REF2021 planning
• School competitor modelling tools
The first research-focused analytics lab (Feb – May 2019) looked at two key themes:
1. Downstream effects of research funding
2. Reproducibility
5. Research Analytics Service
• Jisc’s members are increasingly focused on the use of data to inform their planning, though mindful of
GDPR and competition law
• A Jisc research analytics service would build on existing work undertaken in the area of learning
analytics, and would extend the scope of the existing Jisc Analytics Labs and Heidi Plus Community
Dashboards initiative to meet urgent policy drivers such as the Industrial Strategy, REF and KEF
• A research analytics service offer from Jisc would not be limited to data from Jisc services; it might also draw on data from HEI services and/or third-party sources (sector and commercial organisations, etc.)
Scoping a research analytics service
6. Research Analytics Service
• Identify all significant Jisc activity that would be relevant to a coherent research analytics offer
• Define what a research analytics service from Jisc would look like
• What sources of data could be used to help solve the problems identified by our members and stakeholders (Jisc, HEI, third-party (sector and commercial organisations, etc.) and HESA sources)?
• Can we develop a research analytics service similar to the way we developed a learning analytics
service?
• Discovery phase - before developing such a service, work with our members and funders to explore
the options for a coherent Jisc offer
Research Analytics Service project
7. Research Analytics Service
Discovery phase
• Problem Definition
• Broad context
• Stakeholders / interviews
• 5 HEI case studies
• User stories / personas
• Problem dossier
• Refine solutions
• Wireframe prototypes
• Initial thoughts on business model, vision, competitive landscape, alignment with political and
international landscape, and technical approach
• Solution pitch to Investment Committee
Double Diamond model (Copyright Design Council 2014)
8. Stakeholders / Case studies
Consult stakeholders who might use research analytics in the course of their work
• Higher education institutions
• Research funders
• Membership organisations and representative bodies
• Other organisations
• Case Studies
9. Problem Definition
• Jisc - Engaging with UKRI, Research England, and the Forum For
Responsible Research Metrics (FFRRM)* members and institutional
leaders to:
• Help us define the problems in the research landscape where research
analytics could help
• Refine them down to smaller, more manageable problems that can act
as the starting point for possible solutions
• Understand other trends and developments in the landscape that may
be relevant to this work
• Clarify the role that Jisc might play in addressing these problems and
trends
Engaging
*A group of research funders, sector bodies, and infrastructure experts working in partnership with UUK to promote the responsible use of research metrics. https://www.universitiesuk.ac.uk/policy-and-analysis/research-policy/open-science/Pages/forum-for-responsible-research-metrics.aspx
10. Problem Definition
• Consultants - One-day visit to each institution in June
2019. 45-60 minute interviews with the following
individuals, or their equivalents:
• Pro Vice Chancellor for Research
• Director of Research Office
• Director of Planning
• Library Director, Associate Director or Head of Scholarly
Communication
• One or two senior academics with research leadership
responsibilities (e.g. Associate Dean for Research,
Faculty/School Director of Research)
• Research Information Officer or Analyst(s) and/or
Research IT leads
Interviews
11. Problem Definition
• “Definition of done” = problem dossier that includes:
•Broad context
•Stakeholders
•Problems for each stakeholder and evidence to back it up
•User stories and user personas
•Evidence for all of the above
• Presented to FFRRM:
•Validation of the approach undertaken in the work to date
•Views on emerging findings from this work; and
•Guidance and steer on next steps to ensure selected solution aligns with
recommendations from the Metric Tide report (Independent Review of the Role of Metrics in
Research Assessment and Management - https://responsiblemetrics.org/the-metric-tide/)
Problem Dossier
12. Research Analytics Sophistication Model
[Diagram: maturity of research analytics development (Immature → Emerging → Advanced; data capability from Limited to Integrated) plotted against impact level (PGR student, Researcher, Research group, Research organisation, Sector)]
Indicative capabilities at each stage:
• Aware – basic reports, log data
• Experimentation – drill-down reports, sample dashboards, PGR dashboards, researcher dashboards
• Organisational transformation – business intelligence reporting tools, cross-system data integration, predictive models, measured by impact and organisational strategy
• Sector transformation – data sharing capabilities, innovation, open data, sector-wide agility
13. Trends and overarching themes
Discussions with institutional research leaders and managers indicate the following priorities are
relevant to this work:
• Using data to identify opportunities and inform strategy
• Researcher careers and wellbeing (including EDI and mental health)
• Financial sustainability and the relationship between research and teaching
• Promoting knowledge exchange and demonstrating societal impact
• External accountability and evaluation, particularly REF and KEF
• Improving internal performance management of research
Did not figure prominently in discussions with institutions:
• Open infrastructure
• Avoiding vendor lock-in
• Research integrity and responsible metrics
Priorities
15. Assessing Research Analytics Capability – 5 case studies
• Research analytics remains immature in most
HEIs.
• ‘Research intelligence’ is primarily qualitative
in nature.
• Analytics expertise resides in Planning & MI
functions, and is focussed on teaching &
learning.
• Analytics maturity linked to size and data
availability, but also internal leadership.
• Most opportunities (>70%) rely on use of HEIs’
internal data rather than third party data alone.
[Chart: the five case-study HEIs (A–E) plotted by research analytics development (Immature / Emerging / Mature) against availability of data (Low / Medium / High)]
16. Application of research analytics
Thematic analysis of c.160 user stories derived from case studies, workshops and
interviews
Cross-cutting themes included infrastructure and technical capability, and resourcing and skills; a negligible number of user stories related to research integrity.
18. Identification of areas to take to solution phase
[Chart: number of user stories (0–40) per theme – strategy and planning; researcher development and careers; financial performance and sustainability; collaboration, impact and knowledge exchange; understanding and evaluating research performance; compliance with legal, regulatory or funder requirements; scholarly communication; research integrity – broken down by data required: HEI data only, HEI & external data, external data only]
19. Jisc’s role in research analytics
• Low awareness and uptake of research analytics at present, but clear indicators of emerging
interest
• Funder and HEI priorities differ, and there is no ‘one-size-fits-all’ solution
• Those who have engaged directly with Jisc’s current analytics offer perceive this positively, and see
Jisc as a potential trusted partner…
• …but the wider research and research management community are unclear or ambivalent about what
role Jisc might play
• Any Jisc solution will need to complement or co-exist with:
• Commercial research analytics services (e.g. Elsevier, Clarivate, Digital Science)
• Institutions’ own reporting and analytics tools (e.g. Tableau, Power BI, Cognos)
Key considerations
20. Discovery Process
• Candidate solutions filtered down, developed in more detail, and tested with relevant users.
• Solution prototype(s) (wireframe or slide deck) to be built and tested, with supporting information
• “Definition of done” is when at least one prototype has been tested with users and the solution pitch
is complete.
• Outline of how Jisc could pursue that solution - initial thoughts on business model, vision, competitive
landscape, alignment with political and international landscape, and technical approach.
• Set of assumptions for the solution, which subsequent development is to test.
• The solution requires buy-in from key Jisc and external stakeholders and the Forum for Responsible
Research Metrics, and should (ideally) demonstrate interest from at least 5 universities.
Solution refining
21. Design Sprint – How might we?
Reporting
Misc
Relationships
Design issues
EDI
Impact
Modelling &
Planning
Careers
Aggregation
Grant Applications
http://www.designkit.org/methods/3
22. Design Sprint
Solution Refining
• Identify a refined list of possible solutions to be developed
Grant applications
Career development
23. Design Sprint
Grant application ideas
• Automated analysis of grant submission. Tools to help with grant submission process.
• TDM service for all grant applications received for funders. Information to help refine future calls, insights.
• AI improves quality of grant applications. Could look at call information.
• Compare HEIs success rates. Grant information on dashboard to compare success rates.
• Breakdown of funding to HEIs from different sources – UK, EU, international (HESA?)
• Knowledge Graph – opportunities for innovation. Include research outputs. Current trends and
opportunities.
• News as a data source. Different journals, impact, opportunities for improving grant applications.
• Application analysis. Information on key topics and players. Help from grant submissions.
• Partner matching and consortium building using grant application data (an illustrative sketch follows after this list).
• First priority is supporting institutions and their research strategy.
• Helping universities plan their portfolio of research.
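As a purely illustrative example of the TDM and partner-matching ideas above, the sketch below ranks grant applications from other institutions by TF-IDF text similarity. The use of Python and scikit-learn is an assumption, and all records, field names and abstracts are invented; a real service would need agreed access to application data from HEIs and funders.

```python
# Purely illustrative sketch: suggest potential partners by comparing grant
# application abstracts with TF-IDF similarity. Records and field names are
# hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

applications = [
    {"institution": "HEI A", "pi": "Dr Smith",
     "abstract": "Machine learning methods for medical image analysis"},
    {"institution": "HEI B", "pi": "Prof Jones",
     "abstract": "Deep learning for diagnostic imaging in oncology"},
    {"institution": "HEI C", "pi": "Dr Patel",
     "abstract": "Coastal erosion modelling under climate change scenarios"},
]

# Represent each abstract as a TF-IDF vector and compare all pairs.
vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform([a["abstract"] for a in applications])
similarity = cosine_similarity(matrix)

# For one application, rank applications from other institutions as candidate
# partners for consortium building.
target = 0
candidates = sorted(
    (
        (similarity[target, i], applications[i])
        for i in range(len(applications))
        if applications[i]["institution"] != applications[target]["institution"]
    ),
    key=lambda pair: pair[0],
    reverse=True,
)
for score, app in candidates[:5]:
    print(f"{app['institution']} ({app['pi']}): similarity {score:.2f}")
```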
24. Design Sprint
• Identify a refined list of possible solutions to be developed
• List of solutions storyboarded with hypotheses and assumptions for testing, and plans for next steps
• Analytics Dashboards for HESA
•Selected for analytics lab work
• Cluster of user stories on measuring success rates of grant applications
•Further topics
• Scenario modelling (REF being one example)
• Testing the ORCID API (a minimal sketch follows at the end of this slide)
• Solution storyboarding – grant applications (including AI and TDM techniques)
• List hypotheses and assumptions for testing (what data sources are available, user stories, test)
• Plan for next steps (including user testing)
Outcomes
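One outcome listed above is testing the ORCID API. As a minimal, hedged starting point, the sketch below reads a public record from the ORCID public API using Python's requests library. The identifier is ORCID's documented example record; whether anonymous access suffices, or a /read-public token obtained via the OAuth client-credentials flow is required, should be checked against current ORCID documentation.

```python
# Minimal sketch of reading a public ORCID record. Assumes anonymous read
# access to the public API; some setups require a /read-public token instead.
import requests

ORCID_ID = "0000-0002-1825-0097"  # ORCID's documented example record
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/record"

response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()
record = response.json()

# Pull out a couple of fields to confirm the structure is usable for linking
# researchers to applications and outputs.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
works = record.get("activities-summary", {}).get("works", {}).get("group", [])
print(f"{len(works)} public work groups on this record")
```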
25. Grant Applications – Storyboard
Once upon a time (Issues)
• Grant applications are made (to Je-S for Research Councils and to other funding systems)
• These applications are made on topics of interest to the PI and/or in response to specific calls
Everyday
• Institutions have limited information on grant success rates. What has been funded and why?
• PVC-Rs want to know about success rates, reasons for application rejection, which
teams/departments/schools have the most success and why?
• What makes a successful bid?
• How does our institution compare with others of similar size and research areas?
• How can we improve grant applications’ structure to make it easier to put bids together?
• How can we identify key individuals from our institution and collaboration opportunities with other
institutions?
• Research support teams struggle to access this information, collate it and present it in a clear format
for the PVC-R
Part 1
26. Grant Applications – Storyboard
Until
• Jisc offers a research analytics solution (RAS).
• Access to HEIs' historic grant applications (successful and unsuccessful) and application data from funders.
• Aggregation of the data.
And then
• RAS provides analytics for the institution to
•analyse performance
•compare with other HEIs
•support strategies and intervention decisions
•improve grant application success rates (a minimal aggregation sketch follows at the end of this part)
•abstract features and traits of calls and awards in relevant topic areas
Part 2
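To make the "analyse performance" and "improve grant application success rates" steps concrete, here is a minimal, hypothetical sketch of the kind of aggregation the service might run once application outcomes are collected: success rates by department and funder using pandas. All column names and records are invented for illustration.

```python
# Hypothetical sketch: grant application success rates by department and
# funder, computed from aggregated application records. Values are invented.
import pandas as pd

applications = pd.DataFrame(
    [
        {"department": "Chemistry", "funder": "EPSRC", "awarded": True},
        {"department": "Chemistry", "funder": "EPSRC", "awarded": False},
        {"department": "History", "funder": "AHRC", "awarded": True},
        {"department": "History", "funder": "AHRC", "awarded": True},
        {"department": "History", "funder": "AHRC", "awarded": False},
    ]
)

# Success rate = share of applications that were awarded, per group; the same
# grouping could be run by school, call or comparator institution.
success_rates = (
    applications.groupby(["department", "funder"])["awarded"]
    .agg(applications="count", success_rate="mean")
    .reset_index()
)
print(success_rates)
```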
27. Grant Applications – Storyboard
27
Part 3
And then
Use AI and TDM techniques to assess previous applications:
•To look for relevant content, people, etc. to bring together to create better teams
•To look for partners to cover institutional gaps
•To assess the quality of bids
•To provide insight on where to direct effort (show strengths and weaknesses; a naive sketch follows at the end of this part)
•To improve efficiencies
Until finally
• HEI now submits better focussed/targeted applications with a greater chance of success
• Reduction in time spent on proposals
• More focussed support
• Provides PVC-R and research office with a better idea of application content and success rates
• Improved reputation for a focus on cutting-edge interdisciplinary research
• Increase in application success leading to increase in funding
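As one very naive illustration of the AI/TDM step above (assuming nothing about the actual techniques the project would choose), the sketch below compares word frequencies in funded versus unsuccessful applications to hint at strengths and weaknesses. The texts are invented; a real analysis would need proper text mining, stop-word handling and far more data.

```python
# Naive, hypothetical sketch: terms over-represented in funded applications
# compared with unsuccessful ones. Example texts are invented.
from collections import Counter
import re

funded = [
    "interdisciplinary team with strong track record and clear impact plan",
    "clear impact plan, realistic costing and named project partners",
]
unsuccessful = [
    "ambitious aims but limited detail on methodology and impact",
    "methodology unclear and no named partners",
]

def term_counts(texts):
    """Count lower-cased word occurrences across a list of texts."""
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts

funded_counts = term_counts(funded)
unsuccessful_counts = term_counts(unsuccessful)

# Rank terms by how much more often they appear in funded bids.
margin = {
    term: funded_counts[term] - unsuccessful_counts.get(term, 0)
    for term in funded_counts
}
for term, diff in sorted(margin.items(), key=lambda kv: kv[1], reverse=True)[:10]:
    print(term, diff)
```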
28. Solution Phase
• Analytics Lab – putting together details of lab work based on selected user stories
• Five case studies – delivered to HEIs
• Communicate problem definition results and plans for solution refining
• Grant Applications
• Desk research
• Storyboard
•Extract hypotheses and assumptions
•Test these with HEIs (expressions of interest welcome!)
• Identify any university willing to share grant applications’ data
• Solutions pitch for a Research Analytics Service
What next?
29. Analytics Session
• What sort of grant application data are you collecting and analysing?
• How do you use this information to improve your grant application success rates?
• What metrics would you want from an analytics service related to grant applications?
• Would you be interested in tools and techniques that could improve your grant applications?
Main questions