Phoenix Automated Test Bench
David P. Brown (sqa@compumonk.com)
What Is?
 What is test automation?
 Utilizing software to test software
 Why do we automate?
 Replace testers?
 Efficiency?
 So we can do more and focus limited
resources on new and critical functions
What Is?
 COTS Progression
 Scripted (unit-testing)
 Record-playback
 Proprietary scripting engines
 Framework-based scripting engines
 Open Source?
 Yes! Automated testing doesn’t have to
be expensive!
Gutter Automation
 Scenario
 Publishing engine using XML input to
produce PDF formatted documents
 Resulting documents are several
thousand pages
 Weekly updates to the engine to correct
formatting
 Formatting important, but content
critical
 How do we automate this?
Gutter Automation
 Scenario
 Found tool to export PDF to TXT
 Found tool to export individual PDF
pages to an image
 Compare prior output to new output
 Images will direct SMEs to which pages
to review for formatting
 TXT compare ensures that content was
not dropped as a result of the update
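The TXT-compare step above can be sketched in a few lines. This is a minimal illustration (the function and variable names are mine, and it assumes per-page text has already been exported by the PDF-to-TXT tool):

```python
def changed_pages(old_pages, new_pages):
    """Return 1-based page numbers whose extracted text differs.

    old_pages / new_pages: lists of per-page text strings, assumed
    already exported from the prior and new PDF output.
    """
    changed = [i for i, (old, new)
               in enumerate(zip(old_pages, new_pages), start=1)
               if old != new]
    # Pages added or removed by the update also need SME review.
    lo, hi = sorted((len(old_pages), len(new_pages)))
    changed.extend(range(lo + 1, hi + 1))
    return changed
```

The returned page numbers tell the SMEs exactly which pages to review, instead of eyeballing several thousand.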
Best Practices
 Best practices
 Automation provides an ROI when used
for regression testing
 Treated as a development project with
associated controls
 Test environments must be robust
 Framework based
 Data-driven
 Action-word driven?
Best Practices
 Framework Driven
 Library of functions that isolate common
operations from the script
 Scripts are no longer atomic
 Framework is updated when changes to
the product “break” scripts
Best Practices
 Data Driven
 Data used by the scripts is housed
separately
 Scripts accept data along with expected
results and therefore can be executed
multiple times with varying data
 Example: A test for a store locator can be generic enough to accept several hundred search scenarios
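The store-locator example can be sketched as a data-driven runner in miniature. All names here are hypothetical: `search_stub` stands in for the real application call, and the rows mimic an externally housed data set:

```python
def search_stub(zip_code):
    """Stand-in for the real store-locator search (illustrative only)."""
    stores = {"85001": ["Phoenix Central"], "10001": ["Midtown NYC"]}
    return stores.get(zip_code, [])

# Each row carries the input data plus the expected result,
# housed separately from the script itself.
TEST_DATA = [
    {"zip": "85001", "expected": ["Phoenix Central"]},
    {"zip": "10001", "expected": ["Midtown NYC"]},
    {"zip": "99999", "expected": []},   # negative case
]

def run_data_driven(script, rows):
    """Execute the same script once per data row; return pass/fail flags."""
    return [script(row["zip"]) == row["expected"] for row in rows]
```

One generic script, as many executions as there are data rows.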
Best Practices
 Action-word Driven
 Scripts are written using a set of defined
“Action Words” with parameters
 Testers do not have to be developers
 A middleware layer interprets the “Action Words” and translates them to framework calls
 Automated tests can be written at the
same time as manual tests!
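The middleware translation can be sketched like this. The action words and framework functions below are illustrative, not Phoenix ATB's actual vocabulary:

```python
# Framework-layer functions (illustrative stand-ins).
def fw_open_browser(url):
    return f"opened {url}"

def fw_click(target):
    return f"clicked {target}"

# The middleware's vocabulary maps each action word to a framework call.
ACTION_WORDS = {
    "OpenBrowser": fw_open_browser,
    "Click": fw_click,
}

def run_script(steps):
    """steps: list of (action_word, parameter) pairs, as a tester
    would author them; no coding required on the tester's side."""
    return [ACTION_WORDS[word](param) for word, param in steps]
```

A tester writes `("OpenBrowser", url)` rows; the middleware does the rest.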
Edge Scenario
 Nielsen PC-Meter
 Around 100 defined metering tests
 Must be executed against various
operating systems and browsers (2,300
scenarios)
 Must be executed using at least 30
different web sites (69,000 scenarios)
 Tests must be executed against pre-
release versions of operating systems
and browsers
 How do we automate this?
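The scenario counts above follow from straight multiplication (the 23 OS/browser combinations are inferred from 2,300 ÷ 100; the other figures are from the slide):

```python
tests = 100                 # defined metering tests
os_browser_combos = 23      # inferred from 2,300 / 100
sites = 30                  # distinct web sites

os_scenarios = tests * os_browser_combos      # 2,300
site_scenarios = os_scenarios * sites         # 69,000
```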
Edge Scenario
 An IDEA!
 Use open-source and off-the-shelf tools
 How about a portal where tests could be
configured to run against various
combinations of…
 Test machines
 Test sets
 Data sets
 Let’s make it action-word and data-
driven!
 How about “clean-slate” test environments?
Phoenix ATB
 Phoenix ATB
 Enter the Phoenix Automated Test
Bench
 Open-source customizable test
automation solution
Phoenix ATB
 Features
 Fully framework based
 Action-word driven (allows testers to write tests without coding)
 Data-driven (tests can be run multiple times for different
data points)
 Verbose results for later push to test tracking tool
(Gemini?), including evidence
 Platform agnostic (not targeted to a specific application)
 Client-server architecture (minimal footprint on target
test machine)
 Execute tests against multiple test machines in parallel
 Fully database driven
 Operate against VMware ESXi virtual machines
Phoenix ATB
 Tests
 Authored in Microsoft Excel and
imported into the portal
 Utilizes defined action-words and
parameters
 Parameter values can be defined to
come from a “Data Set” or “Base State”
 Grouped in “Test Sets”
Phoenix ATB
 Test Case – Excel Authoring
Phoenix ATB
 Test Case – Portal View
Phoenix ATB
 Test Data
 Authored in Microsoft Excel and
imported into the management portal
 Basic “named” value which matches up
to a test case parameter value name
 There can be multiple named values
 Married to the test case at the “Test Set”
level
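Resolving a test-case parameter from a Data Set row or Base State might look like the sketch below. The precedence order (data row first, then Base State) is my assumption, not documented Phoenix ATB behavior:

```python
def resolve_param(name, data_row, base_state):
    """Look up a named test-case parameter value: check the Data Set
    row first, then fall back to the run's global Base State."""
    if name in data_row:
        return data_row[name]
    return base_state[name]
```

A parameter like `login_user` would typically live in the Base State, while per-iteration values like `zip` come from the data row.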
Phoenix ATB
 Test Set
 Defined in the management portal
 One-to-many relationship to test cases
 Optionally assign a data-set
 Tests can be sequenced (one runs after
another) and will be run as a group
Phoenix ATB
 Base State
 Defined in the management portal
 Set of configuration parameters used
globally in a “Test Run”
 Examples are login parameters, location
of installers, etc.
Phoenix ATB
 Test Machines
 Defined in the management portal
 Individual machines where a test can execute
 Can be a physical configuration or a VMware client operating system
 Test Machine Groups
 A set of defined test machines
 E.g., I have 3 Windows 8.1 machines; I’d group them here
Phoenix ATB
 Test Run
 Defined in the management portal
 One-to-many relationship to Test Set,
Base State and Test Machine Group
 For example, you can define a Test Set
to execute against multiple operating
systems by simply assigning different
Test Machine Groups
Phoenix ATB
 Schedule
 Sets up a Run to execute immediately or
at a specified date/time
Phoenix ATB
 Journal
 Review results of the execution of a Run
Phoenix ATB
 Hierarchy
Test Case
Test Set
Test Machines
Base States
Test Data Sets
Test Run
Test Results
Phoenix ATB
 Controller
 Console application that can run as a
service
 Manages execution of the tests
 Manages assignment of test machines
 Resets virtual machines as necessary
 Launches Engine per test run
 Posts results of test execution to the
database
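The controller's machine-assignment bookkeeping can be modeled as below. This is a toy in-memory sketch (real Phoenix ATB state is fully database driven), with illustrative names:

```python
def assign_machines(queued_runs, machine_group, busy):
    """Give each queued run the first idle machine in its group.

    queued_runs: run identifiers awaiting execution
    machine_group: machines eligible for these runs
    busy: set of machines currently executing tests
    """
    assignments = {}
    for run in queued_runs:
        idle = [m for m in machine_group
                if m not in busy and m not in assignments.values()]
        if idle:
            assignments[run] = idle[0]  # no machine free: run waits
    return assignments
```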
Phoenix ATB
 Engine/Agent
 Engine runs per test or sequence of
tests and is assigned a test machine by
the controller
 Utilizes an XML input to determine the
tests, parameters and test machine to
run against
 Agent resides on the test machine and
accepts commands from the Engine
 Engine outputs results to an XML file and
then terminates
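A sketch of consuming such an XML input. The element and attribute names here are hypothetical, not the actual Phoenix ATB schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical shape of the Engine's XML input.
RUN_XML = """
<run machine="win81-vm-01">
  <test name="InstallMeter">
    <param name="installer">C:/installers/meter.msi</param>
  </test>
  <test name="BrowseSite">
    <param name="url">http://example.com</param>
  </test>
</run>
"""

def parse_run(xml_text):
    """Extract the target test machine and the (name, params) list."""
    root = ET.fromstring(xml_text)
    tests = [(t.get("name"),
              {p.get("name"): p.text for p in t.findall("param")})
             for t in root.findall("test")]
    return root.get("machine"), tests
```

The Engine would walk this list, send each step to the Agent on the named machine, and write a results XML before terminating.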
Phoenix ATB
[Architecture diagram: Portal WebSite, DB, Controller, Engine, Agent, VMware host with VMs/clients, Results & Evidence temp storage]
Phoenix ATB
 Base Components
 Microsoft SQL Server (MSDE)
 IIS Express (8.0)
 VMware ESXi (optional)
 Microsoft Excel 2000 or above (not free,
but pretty much everyone has it)
 Phoenix ATB (open-source)
Phoenix ATB
 VMware ESXi
 Yes, it is FREE!!
 Controller utilizes SSH commands to the VMware host
 You can go cheap and support Mac+Windows+Linux using Mac minis!
 Define a bunch of machines, and
configure the portal to only run a few at
a time
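The SSH-driven "clean slate" could look like the sketch below. `vim-cmd` is ESXi's built-in CLI; the VM id and snapshot id are placeholders, and the `snapshot.revert` argument order should be verified against your ESXi version:

```python
def clean_slate_commands(vmid, snapshot_id=1):
    """Commands a controller might send over SSH to an ESXi host to
    revert a test VM to a clean snapshot before a run.

    vmid / snapshot_id are placeholder inventory IDs.
    """
    return [
        f"vim-cmd vmsvc/power.off {vmid}",
        f"vim-cmd vmsvc/snapshot.revert {vmid} {snapshot_id} 0",
        f"vim-cmd vmsvc/power.on {vmid}",
    ]
```

Each command would be run on the host via an SSH client; every test then starts from the same known-good state.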
Phoenix ATB
 Moving forward
 Installation/configuration guide
 Ability to modify action-words and other list items via the portal (currently done via SQL statements)
 Generic Engine/Agent (Selenium-based)
 Ability to schedule a recurring test run
 Author tests via the portal vs. MS Excel
Questions & Answers
Questions?
David P. Brown (sqa@compumonk.com)

David P Brown - Phoenix ATB 2014-11-18
