Software Analytics: Data Analytics for Software Engineering
1. Software Analytics:
Data Analytics for Software Engineering
Achievements and Opportunities
Tao Xie
Department of Computer Science
University of Illinois at Urbana-Champaign, USA
taoxie@illinois.edu
In Collaboration with Microsoft Research; Most Slides Adapted from Slides Co-Presented with
Dongmei Zhang (Microsoft Research, http://research.microsoft.com/en-us/groups/sa/)
5. How people use software is changing…
Individual → Social
Isolated → Collaborative
Not much data/content generation → Huge amount of data/artifacts generated anywhere, anytime
8. How software is built & operated is changing…
Code centric → Data pervasive
Long product cycle → Continuous release
Experience & gut-feeling → Informed decision making
In-lab testing → Debugging in the large
Centralized development → Distributed development
…
10. Software Analytics
Software analytics is to enable software practitioners to perform data exploration and analysis in order to obtain insightful and actionable information for data-driven tasks around software and services.
Dongmei Zhang, Yingnong Dang, Jian-Guang Lou, Shi Han, Haidong Zhang, and Tao Xie. Software Analytics as a Learning Case in Practice: Approaches and Experiences. In MALETS 2011.
http://research.microsoft.com/en-us/groups/sa/malets11-analytics.pdf
11. Software Analytics
http://research.microsoft.com/en-us/groups/sa/
http://research.microsoft.com/en-us/news/features/softwareanalytics-052013.aspx
12. Research topics – the trinity view
Software Users
Software Development Process
Software System
• Covering different areas of the software domain
• Throughout the entire development cycle
• Enabling practitioners to obtain insights
13. Data sources
Software system: runtime traces, program logs, system events, perf counters, …
Software users: usage logs, user surveys, online forum posts, blogs & Twitter, …
Development process: source code, bug history, check-in history, test cases, eye tracking, MRI/EMG, …
16. Target audience – software practitioners
Developer
Tester
Program Manager
Usability engineer
Designer
Support engineer
Management personnel
Operation engineer
17. Output – insightful information
• Conveys meaningful and useful understanding or knowledge towards completing the target task
• Not easily attainable by directly investigating raw data without the aid of analytics technologies
• Examples
– It is easy to count the number of re-opened bugs, but how do we find out the primary reasons for these re-opened bugs?
– When the availability of an online service drops below a threshold, how do we localize the problem?
18. Output – actionable information
• Enables software practitioners to come up with concrete solutions towards completing the target task
• Examples
– Why were bugs re-opened?
• A list of bug groups, each with the same re-opening reason
– Why did the availability of online services drop?
• A list of problematic areas with associated confidence values
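The first example above (turning a bug count into groups that share a re-opening reason) can be sketched as follows. This is a hedged illustration in Python, not the slides' actual analytics: the bug records, field layout, and keyword-to-reason table are all invented; a real system would mine reasons from bug-note text with far richer analysis.

```python
# Hypothetical sketch: from a raw list of re-opened bugs to actionable
# groups, each group sharing one re-opening reason. The keyword table
# and bug records below are invented for illustration only.
from collections import defaultdict

REASON_KEYWORDS = {
    "repro": "could not reproduce originally",
    "regression": "fix caused a regression",
    "incomplete": "incomplete fix",
}

def group_by_reason(bugs):
    """Group re-opened bugs by the first matching reason keyword."""
    groups = defaultdict(list)
    for bug_id, note in bugs:
        reason = "unclassified"
        for keyword, label in REASON_KEYWORDS.items():
            if keyword in note.lower():
                reason = label
                break
        groups[reason].append(bug_id)
    return dict(groups)

bugs = [
    (101, "Re-opened: no repro steps in original report"),
    (102, "Fix caused a regression in printing"),
    (103, "Incomplete fix, crash still occurs"),
]
print(group_by_reason(bugs))
```

Each resulting group is actionable in the slide's sense: it points at one concrete remediation (improve repro steps, revisit a regressing fix) rather than a bare count.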
19. Research topics & technology pillars
Vertical (research topics): Software Users, Software Development Process, Software System
Horizontal (technology pillars): Information Visualization, Data Analysis Algorithms, Large-scale Computing
20. Connection to practice
• Software Analytics is naturally tied with software development practice
• Getting real: Real Data, Real Problems, Real Users, Real Tools
22. Outline
• Software Engineering for Big Data
• Overview of Software Analytics
• Selected Projects
– StackMine: Performance debugging in the large
– XIAO: Scalable code clone analysis
– SAS: Incident management of online services
– Code Hunt/Pex4Fun: Teaching & learning via interactive gaming
23. StackMine
Performance debugging in the large
via mining millions of stack traces
http://research.microsoft.com/en-us/groups/sa/stackmine_icse2012.pdf
24. Performance issues in the real world
• One of the top user complaints
• Impacting large numbers of users every day
• High impact on usability and productivity
• Typical examples: high disk I/O, high CPU consumption
As modern software systems grow more and more complex, and given the limited time and resources before software release, development-site testing and debugging become increasingly insufficient to ensure satisfactory software performance.
29. Performance debugging in the large
Trace collection → Network → Trace Storage → Trace analysis → Bug filing → Bug Database
Trace analysis also populates a Problematic Pattern Repository; Pattern Matching against the repository drives Bug update in the Bug Database.
• Trace analysis: key to issue discovery, and the bottleneck of scalability
• Open questions: How many issues are still unknown? Which trace file should I investigate first?
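The Pattern Matching step in the pipeline above can be sketched as checking whether a newly collected call stack contains a known problematic pattern, here modeled as an ordered subsequence of frames. This is an illustrative Python sketch only: the frame names and repository entries are made up, and StackMine's real matching operates on mined costly-pattern clusters, not this toy rule.

```python
# Illustrative sketch of pattern matching against a "problematic
# pattern repository": a pattern matches a stack if its frames appear
# in the stack in order. Repository contents are invented examples.
def contains_pattern(stack, pattern):
    """True if `pattern` occurs in `stack` as an ordered subsequence."""
    it = iter(stack)
    return all(frame in it for frame in pattern)

repository = {
    "slow-registry-flush": ["ntoskrnl!NtFlushKey", "ntoskrnl!CmFlushKey"],
    "lock-contention": ["ntdll!RtlEnterCriticalSection", "app!LogWrite"],
}

def match_trace(stack):
    """Return the names of all known problematic patterns in a stack."""
    return [name for name, pat in repository.items()
            if contains_pattern(stack, pat)]

stack = ["app!SaveSettings", "ntoskrnl!NtFlushKey",
         "ntoskrnl!CmFlushKey", "hal!KeStallExecution"]
print(match_trace(stack))  # a hit routes the trace to bug filing/update
```

A matched trace maps to an existing bug (bug update); an unmatched one is a candidate for new-issue discovery, which is why trace analysis is the key step.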
36. Problem definition
Given OS traces collected from tens of thousands (potentially millions) of users, help domain experts identify impactful program execution patterns (those that cause the most impactful underlying performance problems) with limited time and resources.
37. Challenges
Combination of expertise
• Generic machine learning tools without domain-knowledge guidance do not work well
Highly complex analysis
• Numerous program runtime combinations triggering performance problems
• Multi-layer runtime components, from application to kernel, being intertwined
Large-scale trace data
• TBs of trace files, and increasing
• Millions of events in a single trace stream
38. Technical highlights
• Machine learning for the systems domain
– Formulate the discovery of problematic execution patterns as callstack mining & clustering
– Systematic mechanism to incorporate domain knowledge
• Interactive performance analysis system
– Parallel mining infrastructure based on HPC + MPI
– Visualization-aided interactive exploration
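The "callstack mining & clustering" formulation above can be sketched as grouping stacks whose frame sets are similar. This Python sketch only shows the clustering framing: StackMine's actual algorithm mines costly patterns with domain-knowledge guidance at far larger scale, and the similarity measure, threshold, and stacks here are all illustrative assumptions.

```python
# Minimal sketch: greedy single-pass clustering of call stacks by
# Jaccard similarity of their frame sets. Threshold and example
# stacks are invented; this is not StackMine's real algorithm.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_stacks(stacks, threshold=0.5):
    """Join the first cluster whose representative stack is similar
    enough; otherwise start a new cluster."""
    clusters = []  # each cluster: list of stacks; first one is the representative
    for stack in stacks:
        for cluster in clusters:
            if jaccard(stack, cluster[0]) >= threshold:
                cluster.append(stack)
                break
        else:
            clusters.append([stack])
    return clusters

stacks = [
    ["main", "readFile", "lockAcquire"],
    ["main", "readFile", "lockAcquire", "spinWait"],
    ["main", "renderUI", "paint"],
]
clusters = cluster_stacks(stacks)
print(len(clusters))  # → 2: a file/lock group and a rendering group
```

Each resulting cluster is a candidate problematic execution pattern that a domain expert can then inspect, which is where the visualization-aided exploration comes in.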
39. Impact
"We believe that the MSRA tool is highly valuable and much more efficient for mass trace (100+ traces) analysis. For 1000 traces, we believe the tool saves us 4-6 weeks of time to create new signatures, which is quite a significant productivity boost."
Highly effective new issue discovery on Windows mini-hang
Continuous impact on future Windows versions
41. XIAO: Code Clone Analysis
• Motivation
– Copy-and-paste is a common developer behavior
– A real tool widely adopted internally and externally
• XIAO enables code clone analysis with
– High tunability
– High scalability
– High compatibility
– High explorability
42. High tunability – what you tune is what you get
• Intuitive similarity metric: effective control of the degree of syntactical differences between two code snippets
for (i = 0; i < n; i++) {
  a++;
  b++;
  c = foo(a, b);
  d = bar(a, b, c);
  e = a + c;
}
for (i = 0; i < n; i++) {
  c = foo(a, b);
  a++;
  b++;
  d = bar(a, b, c);
  e = a + d;
  e++;
}
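One way such a tunable similarity metric can work is sketched below in Python on the two snippets above. This is not XIAO's actual metric (the slide does not define it); it is a generic token-sequence similarity, shown only to illustrate how a single threshold controls how much syntactic difference is tolerated.

```python
# Hedged sketch of a tunable clone-similarity check: tokenize two
# snippets and compare the token sequences with difflib's ratio.
# Not XIAO's real metric; just an illustration of tunability.
import difflib
import re

def tokens(code):
    # identifiers/keywords as one token; every other symbol on its own
    return re.findall(r"[A-Za-z_]\w*|[^\sA-Za-z_]", code)

def similarity(a, b):
    return difflib.SequenceMatcher(None, tokens(a), tokens(b)).ratio()

snippet_a = """
for (i = 0; i < n; i ++) {
  a ++; b ++;
  c = foo(a, b);
  d = bar(a, b, c);
  e = a + c; }
"""
snippet_b = """
for (i = 0; i < n; i ++) {
  c = foo(a, b);
  a ++; b ++;
  d = bar(a, b, c);
  e = a + d;
  e ++; }
"""
score = similarity(snippet_a, snippet_b)
print(round(score, 2))
# With a moderate threshold (e.g. 0.6) these reordered snippets still
# count as clones; pushing the threshold toward 1.0 demands
# near-identical code.
```

Tuning the threshold is exactly the "what you tune is what you get" knob: lower values surface more loosely related copies, higher values only near-exact ones.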
43. High scalability
• Four-step analysis process: Pre-processing → Coarse Matching → Pruning → Fine Matching
• Easily parallelizable based on source code partitioning
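The four-step flow can be sketched end-to-end on toy data. The step names follow the slide, but every concrete heuristic below (n-gram fingerprints for coarse matching, a shared-fingerprint cutoff for pruning, a token-ratio check for fine matching) is an invented stand-in for XIAO's real techniques, written in Python for brevity.

```python
# Illustrative four-step clone detection on toy data:
# 1) pre-process (tokenize), 2) coarse-match via token n-gram
# fingerprints, 3) prune pairs sharing too few fingerprints,
# 4) fine-match survivors with a sequence-similarity ratio.
import difflib
import re
from itertools import combinations

def preprocess(code):                       # step 1: pre-processing
    return re.findall(r"\w+|[^\s\w]", code)

def fingerprints(toks, n=3):                # step 2: coarse matching
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def detect_clones(files, min_shared=3, min_ratio=0.7):
    toks = {name: preprocess(src) for name, src in files.items()}
    fps = {name: fingerprints(t) for name, t in toks.items()}
    clones = []
    for a, b in combinations(files, 2):
        if len(fps[a] & fps[b]) < min_shared:   # step 3: pruning
            continue
        ratio = difflib.SequenceMatcher(None, toks[a], toks[b]).ratio()
        if ratio >= min_ratio:                  # step 4: fine matching
            clones.append((a, b))
    return clones

files = {
    "a.c": "int add(int x, int y) { return x + y; }",
    "b.c": "int add(int p, int q) { return p + q; }",
    "c.c": "void log(char *msg) { puts(msg); }",
}
print(detect_clones(files))
```

The parallelization claim follows the same shape: each source-code partition can run this pipeline independently, since pruning and matching only need the partition's own token data.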
44. High compatibility
• Compiler independent
• Light-weight built-in parsers for C/C++ and C#
• Open architecture for plug-in parsers to support different languages
• Easy adoption by product teams
– Works with different build environments
– Almost zero cost for trial
45. High explorability
Clone exploration view:
1. Clone navigation based on source tree hierarchy
2. Pivoting of folder-level statistics
3. Folder-level statistics
4. Clone function list in selected folder
5. Clone function filters
6. Sorting by bug or refactoring potential
7. Tagging
Clone comparison view:
1. Block correspondence
2. Block types
3. Block navigation
4. Copying
5. Bug filing
6. Tagging
46. Scenarios & Solutions
Quality gates at milestones
• Architecture refactoring
• Code clone clean-up
• Bug fixing
Post-release maintenance
• Security bug investigation
• Bug investigation for sustained engineering
Development and testing
• Checking for similar issues before check-in
• Reference info for code review
• Supporting tool for bug triage
Solutions: online code clone search; offline code clone analysis
48. More secure Microsoft products
Code Clone Search service integrated into
workflow of Microsoft Security Response Center
Over 590 million lines of code indexed across
multiple products
Real security issues proactively identified and
addressed
49. Example – MS Security Bulletin MS12-034
Combined Security Update for Microsoft Office, Windows, .NET Framework, and Silverlight, published Tuesday, May 08, 2012.
Three publicly disclosed vulnerabilities and seven privately reported ones were involved. Specifically, one is exploited by the Duqu malware to execute arbitrary code when a user opens a malicious Office document.
Insufficient bounds check within the font-parsing subsystem of win32k.sys; cloned copies in gdiplus.dll, ogl.dll (Office), Silverlight, and the Windows Journal viewer.
From the Microsoft Technet blog about this bulletin:
"However, we wanted to be sure to address the vulnerable code wherever it appeared across the Microsoft code base. To that end, we have been working with Microsoft Research to develop a 'Cloned Code Detection' system that we can run for every MSRC case to find any instance of the vulnerable code in any shipping product. This system is the one that found several of the copies of CVE-2011-3402 that we are now addressing with MS12-034."
51. Motivation
Incident Management (IcM) is a critical task to assure service quality
• Online services are increasingly popular & important
• High service quality is the key
52. Incident Management: Workflow
Detect a service issue → Alert On-Call Engineers (OCEs) → Investigate the problem → Restore the service → Fix root cause via postmortem analysis
55. SAS: Incident management of online services
SAS, developed and deployed to effectively reduce MTTR (Mean Time To Restore) via automatically analyzing monitoring data
Design principles of SAS:
• Automating analysis
• Handling heterogeneity
• Accumulating knowledge
• Supporting human-in-the-loop (HITL)
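Two of the principles above, accumulating knowledge and human-in-the-loop, can be sketched together: match a new incident's signature against past incidents, reuse a recorded healing action when a similar case exists, and otherwise fall back to an OCE and store the labeled case. All signatures, incidents, and actions in this Python sketch are invented; SAS's actual analysis of monitoring data is far richer.

```python
# Hedged sketch of "accumulating knowledge" + "human-in-the-loop":
# a tiny case base of (signature, healing action) pairs. Everything
# in this example (symptoms, actions, threshold) is illustrative.
def sig_similarity(sig_a, sig_b):
    """Jaccard similarity of two incident signatures (sets of symptoms)."""
    return len(sig_a & sig_b) / len(sig_a | sig_b)

class IncidentKnowledgeBase:
    def __init__(self, threshold=0.6):
        self.cases = []          # accumulated (signature, action) pairs
        self.threshold = threshold

    def suggest(self, signature, ask_oce):
        best = max(self.cases,
                   key=lambda c: sig_similarity(signature, c[0]),
                   default=None)
        if best and sig_similarity(signature, best[0]) >= self.threshold:
            return best[1]                     # reuse accumulated knowledge
        action = ask_oce(signature)            # human-in-the-loop fallback
        self.cases.append((signature, action)) # accumulate for next time
        return action

kb = IncidentKnowledgeBase()
oce = lambda sig: "recycle frontend pool"      # stand-in for a real OCE
first = kb.suggest({"http500", "cpu_spike", "pool_full"}, oce)
again = kb.suggest({"http500", "pool_full"}, oce)
print(first, "|", again)
```

The second, similar incident is resolved without paging a human, which is the mechanism by which such a system reduces MTTR over time.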
57. Industry Impact of SAS
Deployment
• SAS deployed to worldwide datacenters for Service X (serving hundreds of millions of users) since June 2011
• OCEs now heavily depend on SAS
Usage
• ~76% of the service incidents in which SAS assisted were successfully diagnosed with its help
58. Code Hunt/Pex4Fun
Teaching and learning via interactive gaming
http://research.microsoft.com/pubs/189240/icse13see-pex4fun.pdf
http://pex4fun.com/
https://www.codehunt.com/
61. Behind the Scene
Secret Implementation
class Secret {
  public static int Puzzle(int x) {
    if (x <= 0) return 1;
    return x * Puzzle(x - 1);
  }
}
Player Implementation
class Player {
  public static int Puzzle(int x) {
    return x;
  }
}
class Test {
  public static void Driver(int x) {
    if (Secret.Puzzle(x) != Player.Puzzle(x))
      throw new Exception("Mismatch");
  }
}
Winning condition: Secret Impl == Player Impl in behavior
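The duel mechanic above can be condensed into a runnable Python analogue (the slide's original is C#). Note the simplification: the real system uses Pex to generate inputs via dynamic symbolic execution, whereas this sketch merely enumerates a few inputs to find a counterexample.

```python
# Python analogue of the Secret/Player/Driver duel above. The hidden
# puzzle is factorial-like; the player's first attempt just echoes x.
def secret_puzzle(x):
    if x <= 0:
        return 1
    return x * secret_puzzle(x - 1)

def player_puzzle(x):
    return x          # the player's current (wrong) guess

def driver(inputs):
    """Return the first input where the implementations disagree,
    or None if they agree on all tried inputs (player 'wins')."""
    for x in inputs:
        if secret_puzzle(x) != player_puzzle(x):
            return x
    return None

print(driver(range(5)))  # → 0, since secret_puzzle(0) == 1 != 0
```

Each counterexample is the feedback the player iterates on; the duel ends only when no disagreeing input can be found.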
62. Coding Duels: Fun and Engaging
Iterative gameplay
Adaptive
Personalized
No cheating
Clear winning criterion
66. Example User Feedback
"It really got me *excited*. The part that got me most is about spreading interest in teaching CS: I do think that it's REALLY great for teaching | learning!"
"I used to love the first person shooters and the satisfaction of blowing away a whole team of Noobies playing Rainbow Six, but this is far more fun."
"I'm afraid I'll have to constrain myself to spend just an hour or so a day on this really exciting stuff, as I'm really stuffed with work."
Released since 2010
67. Usage Scenarios
Students: proceed through a sequence of puzzles to learn and practice
Educators: exercise different parts of a curriculum, and track students' progress
Recruiters: use contests to inspire communities and results to guide hiring
Researchers: mine extensive data in Azure to evaluate how people code and learn
ICSE 2015 JSEET: http://taoxie.cs.illinois.edu/publications/icse15jseet-codehunt.pdf
ICSE 2013 SEE: http://taoxie.cs.illinois.edu/publications/icse13see-pex4fun.pdf