
How to Release Rock-solid RESTful APIs and Ice the Testing BackBlob


REST APIs are a key enabling technology for the cloud: mobile applications, service-oriented architecture, and the Internet of Things all depend on reliable and usable REST APIs. Unlike browser, native, and mobile apps, REST APIs can be tested only with software that drives them. And unlike developer-centric, hand-coded unit testing, adequate testing of REST APIs is truly well-suited to advanced automated testing.

As most web service applications are developed following an Agile process, effective testing must also avoid the "testing backblob," in which work to maintain hand-coded BDD-style test suites exceeds available time after a few iterations.

This talk presents a methodology for developing and testing REST APIs using model-based automation that has the beneficial side-effect of shrinking the testing backblob.



  1. How to Release Rock-solid RESTful APIs and Ice the Testing BackBlob • Unicom Next Generation Testing Conference, Chicago, September 18, 2014 • Robert V. Binder, System Verification Associates – Enabling High Assurance • http://sysverif.com (slide footer throughout: 20140918, System Verification Associates © 2014)
  2. Overview • Background • Advanced API Verification • Dataflow Testing Model • Model-based Testing Demo • The Testing Twofer • Q&A
  3. BACKGROUND (section divider; process phases: Discovery, Analysis, Design, Verification, Support)
  4. You are here … (diagram: browsers, apps, and services connected through HTTP and SOAP clients, servers, and files, contrasting the SOAP API path with the REST API path)
  5. Programmable Web’s Growing Roster (chart)
  6. Google Trends: REST and SOAP (chart: monthly news headline occurrence for SOAP API and REST API)
  7. So many APIs, so little time … Why is this happening?
  8. Challenges • Usability: narrow developer focus; poor documentation; revenue prevention • Assurance fragmentation: functionality, security, performance • Low reliability • Ineffective testing: manual UI interaction; developer-centric, hand-coded unit testing • Wheel spinning: high QA expense; low quality. An all-aspect approach is needed.
  9. ADVANCED API VERIFICATION (section divider; process phases: Discovery, Analysis, Design, Verification, Support)
  10. Discovery Sprint • Survey and catalog: API documentation, open and closed issues, social media views, codebase, usage logs • Results: strategy, test environment spec, report card
  11. Analysis Sprint • Workflow: construct usage profile, scrutinize documentation, abstract data model • Results: doc issues, gap analysis, revised strategy
  12. Design Sprint • Workflow: configure virtual lab, behavior/data models, traffic capture/parsers, instantiate adapters • Results: stable test environment, all-aspect test model, revised strategy
  13. Verification Sprint • Workflow: model checking, generate/run test suites, collect traffic logs, analyze coverage • Results: all test artifacts, test coverage report, final report, briefing
  14. Support • As needed: incremental design review, usage monitoring, CI and regression testing • Results: continuity, protected investment, continuous improvement
  15. DATAFLOW TESTING MODEL (section divider)
  16. System Under Test (diagram: the REST API sits between an app’s HTTP client and the service’s HTTP server, among the surrounding browser, SOAP, and file paths)
  17. Test Configuration (diagram: a test model and generated test code drive the service’s REST API through an HTTP client, taking the place of the app)
  18. REST = Methods + Resources + Parameters
      • HTTP methods: GET, PUT, POST, DELETE, …
      • HTTP resource (URI): http://foo.com/titles
      • Query parameters: /?au=binder
      • Status codes: 200, 201, 400, 404
      • Returned payload, JSON format:
        {"firstName": "Bob",
         "lastName": "Binder",
         "books": [ {"title": "Testing Object-oriented"},
                    {"title": "Application Debugging"} ] }
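The methods + resources + parameters framing can be sketched in a few lines of Python (the deck's tooling is C#-based; this is only an illustration, using the example endpoint and payload from the slide):

```python
import json

# Assemble a request line from the three REST ingredients on the slide.
def request_line(method, resource, params=""):
    return f"{method} {resource}{params} HTTP/1.1"

print(request_line("GET", "http://foo.com/titles", "/?au=binder"))
# GET http://foo.com/titles/?au=binder HTTP/1.1

# Parse the JSON payload shown on the slide.
payload = """
{"firstName": "Bob", "lastName": "Binder",
 "books": [ {"title": "Testing Object-oriented"},
            {"title": "Application Debugging"} ] }
"""
author = json.loads(payload)
print(author["lastName"], len(author["books"]))   # Binder 2
```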
  19. REST Dataflow Model – Normal Paths (state machine: alpha -PUT/201-> Defined; Defined -PUT|POST/200-> Defined; Defined -GET/200-> Used; Used -GET/200-> Used; Used -PUT|POST/200-> Used; Defined -DELETE/200-> Gone; Used -DELETE/200-> Gone)
  20. REST Dataflow Model – Method Errors (in alpha: DELETE|GET -> 404; in Gone: DELETE|GET|PUT|POST -> 404)
  21. REST Dataflow Model – Parameter Errors (in Defined and Used: PUT|POST|GET|DELETE with ?garbage -> 400)
  22. REST Dataflow Model (states alpha, Defined, Used, Gone) • Test Pattern: Non-Modal Class
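The dataflow model above fits in a small lookup table. This Python sketch (the states and status codes come from the slides; the table layout and exact transition endpoints are one plausible reading of the diagram) maps each (state, method) pair to the next state and the expected status, with unlisted pairs falling through to the 404 method-error row and garbage parameters to the 400 row:

```python
# Normal-path transitions: (state, method) -> (next_state, expected_status).
NORMAL = {
    ("alpha",   "PUT"):    ("Defined", 201),
    ("Defined", "GET"):    ("Used",    200),
    ("Defined", "PUT"):    ("Defined", 200),
    ("Defined", "POST"):   ("Defined", 200),
    ("Defined", "DELETE"): ("Gone",    200),
    ("Used",    "GET"):    ("Used",    200),
    ("Used",    "PUT"):    ("Used",    200),
    ("Used",    "POST"):   ("Used",    200),
    ("Used",    "DELETE"): ("Gone",    200),
}

def expected(state, method, garbage_params=False):
    """Return (next_state, expected_status) for a request made in `state`."""
    if garbage_params:                 # parameter-error slide: ?garbage -> 400
        return (state, 400)
    if (state, method) in NORMAL:      # normal-path slide
        return NORMAL[(state, method)]
    return (state, 404)                # method-error slide: wrong sequence -> 404

assert expected("alpha", "PUT") == ("Defined", 201)
assert expected("alpha", "GET") == ("alpha", 404)
assert expected("Gone", "DELETE") == ("Gone", 404)
assert expected("Used", "GET", garbage_params=True) == ("Used", 400)
```

A test oracle built this way predicts the status code for any method sequence, which is what makes the "everything wrong at least once" strategy later in the deck mechanical rather than hand-written.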
  23. Input variation, all sequences • Nominal values • Boundary values • Operator mutants • Fuzzing, each/all • Domain model • Pairwise selection • Sequence randomization. Sounds like a lot of work!
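Pairwise selection, one of the bullets above, is easy to sketch: keep only the rows of the full cartesian product that cover some not-yet-seen pair of parameter values. The greedy picker below is a minimal illustration (the parameter names and values are hypothetical), not a production all-pairs tool:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy all-pairs picker: walk the full cartesian product and keep a
    row only if it covers at least one uncovered value pair."""
    names = list(params)
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(names), 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add((i, va, j, vb))
    suite = []
    for row in product(*params.values()):
        newly = {(i, row[i], j, row[j])
                 for i, j in combinations(range(len(names)), 2)} & uncovered
        if newly:
            suite.append(row)
            uncovered -= newly
    return suite

cases = pairwise_suite({
    "method":  ["GET", "PUT", "DELETE"],
    "payload": ["nominal", "boundary", "fuzz"],
    "auth":    ["valid", "expired"],
})
print(len(cases), "cases instead of the full", 3 * 3 * 2)
```

Every pair of values still appears in some test case, but the suite is smaller than the exhaustive product; the gap widens quickly as parameters are added.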
  24. Model-based Testing • Model-based testing tool • Microsoft Research, 2001 • Used to test 500 MSFT APIs, 2007–12 • Robust and stable • Visual Studio “power tool” • C# code, not cartoons • Generates a standalone executable test suite
  25. Demo • Synthetic Client • Model Program • Coordination File • Test Cases (architecture diagram: on the Test Host, Spex Rules and a Spex Cord file are explored to generate a Test Suite, which drives the Synthetic Client against the HTTP Server of the Service Under Test on the SUT Host, yielding Pass/Fail)
  26. Synthetic Client • The test model’s view of the SUT • Static class wrapper for the HTTP client • Public methods correspond to the SUT’s HTTP methods and resources • Manages server-side setup/cleanup • Message serialize/deserialize • Becomes part of the executable test code assembly • Example is a stub!
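A synthetic client in the slide's sense might look like the Python sketch below (the deck's generated clients are C#; the base URL and the /titles resource are hypothetical stand-ins). Public methods mirror the SUT's HTTP methods and resources, and request building is kept separate from sending so the model can inspect requests offline:

```python
import json
import urllib.request

class TitlesClient:
    """Static-style wrapper: the test model calls put_title()/get_titles()
    instead of assembling raw HTTP requests."""

    BASE = "http://localhost:8080"   # assumed test-lab server

    @classmethod
    def _request(cls, method, path, body=None):
        # Serialize the payload and build the request object.
        data = json.dumps(body).encode() if body is not None else None
        req = urllib.request.Request(cls.BASE + path, data=data, method=method)
        req.add_header("Content-Type", "application/json")
        return req

    @classmethod
    def _send(cls, req):
        # Deserialize the response into (status, payload) for the oracle.
        with urllib.request.urlopen(req) as resp:
            return resp.status, json.loads(resp.read() or b"null")

    @classmethod
    def put_title(cls, title_id, fields):
        return cls._send(cls._request("PUT", f"/titles/{title_id}", fields))

    @classmethod
    def get_titles(cls, author=None):
        query = f"?au={author}" if author else ""
        return cls._send(cls._request("GET", "/titles" + query))
```

As the slide warns, this is a stub: a real synthetic client would also handle server-side setup/cleanup and error mapping.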
  27. Model Program • A [Rule] method: • Determines when an action is called • Selects argument values for the action call • Computes expected results • Updates its model state as needed • Simulates the environment and/or the system under test
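The rule responsibilities listed above can be sketched in Python (Spec Explorer rules are C# [Rule] methods; this is an analogy, with states taken from the deck's dataflow model). Each rule checks its enabling condition, updates model state, and returns the expected status for the oracle, or None when disabled so the explorer skips it:

```python
class TitleModel:
    """Model program for one resource, tracking its dataflow state."""

    def __init__(self):
        self.state = "alpha"

    def rule_put(self):
        if self.state == "Gone":
            return None              # enabling condition fails: rule disabled
        if self.state == "alpha":
            self.state = "Defined"   # update model state
            return 201               # expected result: first PUT creates
        return 200                   # later PUTs update in place

    def rule_delete(self):
        if self.state in ("Defined", "Used"):
            self.state = "Gone"
            return 200
        return None                  # DELETE before define, or after Gone
```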
  28. Cord File • Defines all model actions (action = Synthetic Client public method) • A machine is any action sequence, similar to a regex, and may use other machines • Model any use case, scenario, slice, etc. • Many options
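The slide's "similar to regex" remark can be taken literally for illustration: a machine that admits the normal dataflow sequences (define once, then any mix of updates and reads, then delete) is just a regex over action names. Cord's real syntax is richer and composes machines from other machines; this Python sketch is only the analogy:

```python
import re

# Machine: PUT, then any number of GET/PUT/POST, then DELETE.
MACHINE = re.compile(r"PUT( (GET|PUT|POST))* DELETE$")

def allowed(actions):
    """True if the action sequence is one the machine generates."""
    return MACHINE.match(" ".join(actions)) is not None

print(allowed(["PUT", "GET", "POST", "DELETE"]))   # True
print(allowed(["GET", "DELETE"]))                  # False: use before define
```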
  29. What is Exploration? • Find all action sequences and data bindings that the model program Rules and a machine allow • Search loop: select a rule for a machine action; if its enabling condition is true, update model program state and return expected results • Stop when all selected inputs are used or the size limit is exceeded
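The search loop described above can be sketched as a breadth-first enumeration (a deliberately naive Python illustration; real tools also record data bindings and expected results, and prune equivalent states). The tiny counter model below is invented purely to exercise it:

```python
def explore(model_factory, rules, max_steps=3):
    """Enumerate action sequences whose enabling conditions held throughout."""
    frontier = [[]]
    sequences = []
    for _ in range(max_steps):
        next_frontier = []
        for trace in frontier:
            for name, rule in rules.items():
                model = model_factory()
                for step in trace:            # replay the prefix on fresh state
                    rules[step](model)
                if rule(model) is not None:   # enabling condition true
                    next_frontier.append(trace + [name])
        sequences.extend(next_frontier)
        frontier = next_frontier
    return sequences

# Illustrative model: "inc" enabled below a cap, "reset" when nonzero.
class Counter:
    def __init__(self):
        self.n = 0

def inc(m):
    if m.n < 2:
        m.n += 1
        return m.n
    return None

def reset(m):
    if m.n > 0:
        m.n = 0
        return 0
    return None

seqs = explore(Counter, {"inc": inc, "reset": reset}, max_steps=2)
print(seqs)   # [['inc'], ['inc', 'inc'], ['inc', 'reset']]
```

Note that ['reset'] never appears: its enabling condition fails in the initial state, which is exactly how exploration keeps illegal sequences out of the generated suite.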
  30. Machine Exploration • Shows all possible action sequences for a machine • No data bindings • Note the similarity to the normal-path dataflow
  31. Model Program Exploration • Rules + machine • Rules add data bindings and expected results • Many ways to choose data values
  32. Test Cases from an Exploration • Spex chooses exploration steps that end in an accepting state • Covers all states and steps at least once
  33. Generate Test Code • Standalone code – does not require the model • Run from VS Test Explorer or the command line
  34. (Architecture diagram repeated from slide 25: Spex Rules and Cord are explored to generate a Test Suite, which drives the Synthetic Client against the Service Under Test, yielding Pass/Fail)
  35. Test Strategy • Each resource path: interleave all DUG variants, accepting sequence, wrong sequence • Pairwise combination: parameters (path and value); mutants, nominal, edge • Security: interleave fuzz cases, abuse case model, all other HTTP methods • Performance: virtual users/test drivers, randomized combos
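The "interleave fuzz cases" bullet might look like the following in practice: derive hostile variants of each nominal parameter value and weave them into the generated sequences. This Python sketch is illustrative only; the specific mutators are not the deck's:

```python
import random

def fuzz_cases(nominal, rng=None):
    """Derive hostile variants of a nominal string parameter value."""
    rng = rng or random.Random(42)   # seeded for reproducible suites
    return [
        nominal,                                  # baseline
        "",                                       # empty value
        nominal * 1000,                           # oversized
        nominal + "'; DROP TABLE titles;--",      # injection-shaped
        "".join(chr(rng.randrange(1, 256)) for _ in range(16)),  # random chars
    ]

for case in fuzz_cases("binder"):
    pass  # each case would be sent as ?au=<case> and the status code checked
```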
  36. THE TESTING TWOFER (section divider)
  37. The Testing BackBlob (chart: over Sprints 1–4, the total number of developed test cases outgrows available test time, leaving manual test cases not executed and automated test cases not maintained)
  38. The Attack of the Testing BackBlob: coming soon … to a scrum near you
  39. Test Asset Size (chart comparing model, test code, and adapter sizes for model-based testing vs. behavior-driven development)
  40. Test Asset Maintenance Load (chart comparing model, test code, and adapter maintenance for model-based testing vs. behavior-driven development)
  41. The Testing Twofer • Rock-solid APIs: documentation scrutiny; fact-based evaluation; multi-dimensional testing; dataflow coverage; everything wrong at least once; fuzzing; repeat at scale • Icing the BackBlob: develop/maintain the model; regenerate test suites
  42. Q & A • rvbinder@sysverif.com • #MoreModelsLessTests • http://sysverif.com
  43. ETC. Say what you do, do what you say (section divider; process phases: Discovery, Analysis, Design, Verification, Support)
  44. Robert V. Binder • Robert Binder is a high-assurance entrepreneur. He has developed hundreds of application systems and advanced automated testing solutions. As test process architect for Microsoft’s Open Protocol Initiative, he led the application of model-based testing to all of Microsoft’s server-side APIs. He is the author of the definitive Testing Object-Oriented Systems: Models, Patterns, and Tools and two other books. He holds a US patent for model-based testing of mobile systems. • MS, EECS, University of Illinois at Chicago • MBA, University of Chicago • BA, University of Chicago
  45. System Verification Associates – Enabling High Assurance • Chicago-based consulting boutique • Clients are typically software development organizations for whom system failure is not an option. • We assist clients in achieving high reliability and effectiveness in their IT processes and systems. • Founded in 2009 and led by Robert V. Binder • http://sysverif.com • Advanced API Verification Datasheet • Supported Microsoft’s Open Protocols project with a team of experts; Robert Binder served as process architect, leading the technical work of over 300 staff located in Redmond, China, India, and Argentina. • Assessed and improved software process at several FDA-regulated product companies, balancing quality management system compliance and Agile practices. • Developed model-based testing solutions for high-frequency trading and aerospace applications. • Helped software service and product companies articulate unique high-value messaging for innovative services. • Conducted and published the Model-based Testing User Survey of 2012 and 2014 (forthcoming).
  46. Does My API Suck? • Your documentation is incomplete, wrong, misleading, or just plain incomprehensible. • Users complain that coding simple use cases is just too much hassle. • Users often rely on workarounds: they FTP files instead of using your API’s getFile. • Your API is unbalanced or incomplete: you can turn something on, but not off. • Your API’s service crashes or responds with garbage when messages are out of order or contain invalid data. • Version mismatches have unpredictable results. • No one is really sure what will happen with edge cases, and they don’t want to know. • Your API allows your service to be hacked with common attack vectors. • Your service supports several protocols (REST, SOAP, …) or formats (JSON, XML, …), but behavior and data aren’t consistent. • Your API doesn’t provide useful feedback: good and bad input all get the same response. • Your service is so awesome that it draws traffic spikes, but then your server chokes and dies. Buggy APIs are eating the world.
