Presentation at Americas Conference on Information Systems, 2017. Paper abstract:
End-user development (EUD) is a strategy that can reduce a considerable amount of business demand on
IT departments. Empowering end users in the context of software development is only possible
through technologies that allow them to manipulate data and information without the need for deep
programming knowledge. The successful selection of appropriate tools and technologies is highly
dependent on the context in which the end user is embedded, so end users should be central to any
software package evaluation, particularly in the end-user development context.
However, little research has empirically examined software package evaluation criteria and techniques in
general, and in the end-user development context in particular. This paper aims to provide a method for
technology evaluation in the context of end-user development and to present the evaluation of two
platforms. We conclude our study by proposing a set of suggestions for future research.
A Method for Evaluating End-User Development Technologies
1. Americas Conference on Information Systems, AMCIS 2017, Aug 10th
A METHOD FOR EVALUATING
END-USER DEVELOPMENT TECHNOLOGIES
Prof. Dr. Claudia Melo, Dep. of Computer Science
Jonathan Moraes, Fac. of Software Engineering
Marcelo Ferreira, Fac. of Software Engineering
Prof. Dr. Rejane Figueiredo, Fac. of Software Engineering
University of Brasília, Brazil
2. WHY ARE WE INVESTIGATING EUD TECHNOLOGY EVALUATION?
• # of end users > # of professional software developers
• > 50M end-user developers, USA only (Burnett and Myers, 2014)
• Little research has empirically examined software package evaluation criteria and techniques in general, and in the end-user development context in particular (Harnisch, 2014; Jadhav and Sonar, 2009; Jadhav and Sonar, 2011; Misra and Mohanty, 2003)
• Success and failure of end-user development within an organization ultimately depend on how effectively software packages are chosen and used
3. • This research study is part of a partnership with the Ministry of
Communications, Science, and Technology, Brazil
• Plenty of EUD tools are being offered in the market
• Organizations are still lost on how to take full advantage of EUD
(facing cultural, managerial, and technological challenges), but are
already buying solutions!
INDUSTRY CONTEXT
4. Three different areas have important, but partial, contributions to
our research purpose:
• Software package acquisition research;
• Software quality models & CSCW/HCI research, and
• Technology acceptance research.
LITERATURE REVIEW
5. WHAT WAS CONSIDERED AS A STARTING POINT?
The model should evaluate technologies that have:
• Essential qualities that enable the end-user developer to manipulate the
tool and produce useful results in a certain application domain
  • from software quality models and CSCW/HCI models;
• General qualities inherent to software packages
  • from software package acquisition models and technology acceptance models;
• Essential qualities for management and technological governance
  • from software package acquisition models;
• An evaluation method based on already-established and tested
techniques, even if they come from a different context
6. • Evaluation criteria, characteristics, sub-characteristics, and
attributes
• Developed questions and metrics from different points of view (PoV).
STRUCTURING THE EVALUATION MODEL
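The hierarchy this slide describes (characteristics → sub-characteristics → attributes, each carrying questions/metrics answered from different points of view) can be sketched as a small data structure. Everything below, the class layout and the example entries alike, is an illustrative assumption and not the model from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    point_of_view: str  # e.g. "end-user developer", "IT governance"

@dataclass
class Attribute:
    name: str
    questions: list[Question] = field(default_factory=list)

@dataclass
class SubCharacteristic:
    name: str
    attributes: list[Attribute] = field(default_factory=list)

@dataclass
class Characteristic:
    name: str
    sub_characteristics: list[SubCharacteristic] = field(default_factory=list)

# A hypothetical fragment of an evaluation model (names invented for illustration):
model = [
    Characteristic(
        name="Usability",
        sub_characteristics=[
            SubCharacteristic(
                name="Learnability",
                attributes=[
                    Attribute(
                        name="Guided tutorials",
                        questions=[
                            Question("Does the platform offer step-by-step tutorials?",
                                     "end-user developer"),
                        ],
                    )
                ],
            )
        ],
    )
]
```

A structure like this makes it straightforward to attach a score to each question and aggregate upward per characteristic.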
8. EVALUATING EUD TECHNOLOGIES - STEPS
1. Determining the need, including a high-level investigation of
software features and capabilities provided by vendors.
August/2016 to October/2016: literature review and contact
with leaders of public and private organizations to build a
general list of tools.
2. Shortlisting candidate packages and eliminating the
candidate packages that do not have the required features.
We shortlisted the most solid market offers (Forrester, 2016).
9. 3. Using the proposed evaluation technique to evaluate
remaining packages and obtain a score
4. Pilot testing the tool in an appropriate environment (parallel
to 3). Fundamental to refining the proposed model (removing,
rewriting, and adding questions/metrics).
The evaluation model and the platform evaluation results
presented in this work are already the result of a second
evaluation iteration.
EVALUATING EUD TECHNOLOGIES - RESEARCH STEPS
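The scoring in step 3 could, for example, roll per-question scores up into a single platform score using per-characteristic weights. The 0-5 scale, the weights, and the formula below are assumptions for illustration; the paper's actual scoring scheme may differ:

```python
# Hypothetical weighted aggregation: each question is scored on a 0-5 scale,
# averaged per characteristic, and combined via characteristic weights.

def platform_score(scores_by_characteristic, weights):
    """scores_by_characteristic: {"Usability": [4, 5, 3], ...} (0-5 per question)
    weights: {"Usability": 0.6, ...}, summing to 1.0"""
    total = 0.0
    for characteristic, scores in scores_by_characteristic.items():
        avg = sum(scores) / len(scores)           # mean question score
        total += weights[characteristic] * avg    # weighted contribution
    return total

scores = {"Usability": [4, 5, 3], "Governance": [2, 3]}
weights = {"Usability": 0.6, "Governance": 0.4}
print(round(platform_score(scores, weights), 2))  # → 3.4
```

A weighted mean keeps the score comparable across platforms even when characteristics carry different numbers of questions.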
10. EXECUTION DETAILS
The platforms:
● OutSystems
● Oracle Apex
● Salesforce Lightning¹
● Zoho Creator¹
1 - The Salesforce Lightning and Zoho Creator evaluations are not described in the paper.
11. Testing:
• 3 software engineering students as testers;
• Each student applied the entire model to every platform
using four default scenarios (Create, Read, Update, Delete),
  • either using the platform’s predefined templates or not.
• A researcher supervised the testers and validated the forms.
EXECUTION DETAILS
20. CONCLUSION
● The major original contributions of the paper are:
○ A detailed method for evaluating EUD technologies that comprises 11
characteristics, 20 sub-characteristics, 30 attributes, and 300
questions/metrics
○ Evaluations using the method against 4 leading EUD platforms on the market
(we described only 2 in the paper)
○ Next steps: validation with real-world scenarios (e.g. action research) &
investigation of automation opportunities