Testing Process

1. ‘QA’ Process for Any Project, by Srikanth Chatradipalli
2. Project Process: Start → Requirements → Design → Development → B & I → Testing
3. ‘Project’ Structure (Product / Project)
   - PM (Project Management): client interaction, project plan, preparation of the requirements doc, project status updates
   - DEV (Development): dev plan, application / module / component development, bug fixes, unit testing
   - B & I (Build & Integration): B & I plan, integration of code, generation of new builds, release notes
   - QA (Quality Assurance): test plan, MTS preparation, flashing the new build, build acceptance, TE and reports
4. ‘QA’ Process: Test Suite Preparation (MTS, TS process) → Test Execution (TE process) → Reports → Mail
5. Sanity Mail
6. Bug Format…
   - Priority: how quickly the issue should be resolved – Immediate, High, Medium, Low
   - Severity: overall impact on the functionality – Critical, Major, Minor, Low
   Example bug (a structured sketch of these fields follows this slide):
   - Bug-Id: A_12_001
   - Summary: SA app crash observed while launching the app
   - Description: Build Details: V0.2.apk; HW Details: MotoG5sPlus; Precondition: V app should be installed on the device; Steps to reproduce: launch the V app; Expected Result: V app launches successfully; Actual Result: crash observed while launching the V app; Notes: logs and screenshot attached for reference
   - Priority: High; Severity: High; Reproducibility: 5/5 (100%)
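The same fields can be kept as structured data so reports stay consistent from bug to bug. A minimal sketch in Python; the class name, field names, and values are illustrative assumptions, not part of the slides:

    from dataclasses import dataclass, field

    @dataclass
    class BugReport:
        bug_id: str            # e.g. "A_12_001"
        summary: str
        description: str       # build details, precondition, steps, expected/actual
        priority: str          # Immediate / High / Medium / Low
        severity: str          # Critical / Major / Minor / Low
        reproducibility: str   # e.g. "5/5 (100%)"
        attachments: list = field(default_factory=list)  # logs, screenshots

    bug = BugReport(
        bug_id="A_12_001",
        summary="App crash observed while launching the app",
        description="Steps: launch the app. Expected: it launches. Actual: crash.",
        priority="High",
        severity="High",
        reproducibility="5/5 (100%)",
        attachments=["logcat.txt", "screenshot.png"],
    )
    print(bug.bug_id, bug.priority, bug.severity)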
7. Types of Issues
   - Crashes; ANR (Application Not Responding)
   - UI (user interface); cosmetic issues (font, color, spelling mistakes)
   - Functionality, stability, performance, usability
   - E2E issues: a complete end-to-end scenario not working
   - Links not working; web pages: data not displaying, overlapping text, text gaps, errors, search, scroll, checkboxes, toggle buttons
8. Release Notes Format
9. Test Report – summary of the execution results for Sanity / Component / Module / Regression / System runs (a small calculation sketch follows this slide)
   - Total no. of cases: 100
   - Executed: 80 (80.0%) – Executed = Pass + Fail
   - Pass: 70 (70.0%) – executed and matches the expected value
   - Fail: 10 (10.0%) – executed and does not match the expected value
   - Blocked (internal): 5 (5.0%) – blocked due to an issue
   - Blocked (external): 5 (5.0%) – blocked due to the setup
   - NE (Not Executed): 5 (5.0%) – valid REQ, still needs to be executed
   - NA (Not Applicable): 5 (5.0%) – invalid REQ for the project
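The report arithmetic is simple but easy to get wrong in a spreadsheet; a small Python sketch using the same numbers as the slide (the variable names are assumptions):

    total = 100
    passed, failed = 70, 10
    blocked_int, blocked_ext = 5, 5      # blocked by an issue / by the setup
    not_executed, not_applicable = 5, 5

    executed = passed + failed           # Executed = Pass + Fail -> 80

    def pct(n):
        return f"{n / total:.1%}"

    print("Executed:", executed, pct(executed))   # 80 80.0%
    print("Pass:", passed, pct(passed))           # 70 70.0%
    print("Fail:", failed, pct(failed))           # 10 10.0%
    print("Blocked:", blocked_int + blocked_ext, pct(blocked_int + blocked_ext))  # 10 10.0%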
10. What is Testing
    - A process of verifying and validating software
    - Meets the business requirements
    - Meets the technical requirements (dev-specific use cases)
    - Identifies bugs and works toward a bug-free project
    - Ensures the project handles all use cases
    Ultimately: QA is process, planning, execution, defect detection, and reporting
11. Major Objective
    - Before the 1990s there was no dedicated testing; it was done by the dev team and not handled properly, which led to huge losses and many failed projects
    - IMP: follow the PROCESS
    - Bug ‘detection’ and ‘prevention’
    - Quality gatekeepers: guard the quality of the product
    - Check not only functional but also non-functional aspects
    - Reduce the cost of bugs, i.e. try to raise bugs at an early stage
    - Continuously improve the process
12. ‘Requirements’ Process: Client / User → REQs. Do not assume; discuss with the TM/TL to get clarifications, and record all comments.
13. Testing ‘techniques’
    - Positive (+ve) testing: how the application behaves for a valid set of data / requirements
    - Negative (-ve) testing: verify the software does not fail when unexpected input is given (input that goes against / breaks the REQ)
    - Equivalence partitioning: a technique that divides the input data of the application under test (DUT) into partitions (ranges) of equivalent data from which test cases are derived; each partition is covered at least once
    - Boundary value analysis: test at the edges of a valid range, e.g. valid (8-12) and invalid (<8, >12) – see the sketch after this slide
    - Setup: install, uninstall
    - New feature / enhancement
    - Interaction, interrupt, recovery
    - Usability, compatibility
    - GUI / UI
    - Common use cases: irrespective of the requirement
    Goal: more coverage with optimization
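A minimal pytest sketch of the two data-driven techniques above, using the group-size limit from the next slide (1-250 members). The function can_add_members is a hypothetical stand-in for the feature under test, not code from the slides:

    import pytest

    def can_add_members(count: int) -> bool:
        """Stand-in for the real feature: a group may hold 1 to 250 members."""
        return 1 <= count <= 250

    # Equivalence partitioning: one representative value per partition.
    @pytest.mark.parametrize("count,expected", [
        (-5, False),    # invalid partition: below the range
        (125, True),    # valid partition: inside the range
        (300, False),   # invalid partition: above the range
    ])
    def test_equivalence_partitions(count, expected):
        assert can_add_members(count) is expected

    # Boundary value analysis: just inside and just outside the limits.
    @pytest.mark.parametrize("count,expected", [
        (0, False), (1, True), (250, True), (251, False),
    ])
    def test_boundary_values(count, expected):
        assert can_add_members(count) is expected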
14. …Continued, with examples
    - Positive / setup: install, uninstall, registration, send and receive messages, video/voice call, send a voice message
    - Negative: send more than 10 images at a time, send a file larger than the size limit, try to register an invalid number, try to delete a file while it is downloading
    - Equivalence partitioning: add up to the maximum of 250 people to a group; try groups of 10, 50, 100, 200
    - Boundary value analysis: add 1-250 people to a group; check values just inside and just outside the limit (e.g. 249, 250, 251)
    - Interaction: WA + attachments, WA video call + chat
    - Usability: WhatsApp is easy to use, all options are aligned properly, easy to understand
    - Compatibility: with different OS versions, different browsers, different app versions and screens
    - Interrupt: interrupt with an incoming call while a WA voice call is in progress, disconnect the network
    - Stability: iterations of 100/1000/2K/5K (install, launch); durations of 5/15/60/720 min/24 hrs of continuous use
    - Performance: WA launching time, chat message scrolling
    - Others: recovery (power cycle, reboot); common use cases (40+); new features (group chat, WhatsApp Web); battery drain, idle
15. What are some best practices for writing TCs? (a worked example follows this list)
    - Write test cases from the end-user's perspective
    - Write test steps simply so that anyone can easily follow them
    - Make the test cases reusable
    - Set a priority on each test case
    - Provide a test case description, test data, precondition, expected result, and postcondition
    - Minimize the variables
    - Minimize the number of steps, to create more modular tests and more isolated functionality verification
    - Write invalid (-ve) test cases along with valid test cases
    - Follow proper naming conventions
    - Review the test cases regularly and update or remove them when necessary
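A short sketch of one test case written along these lines: end-user wording, a stated precondition, minimal steps, an expected result, and a matching negative case. The login function is a hypothetical stand-in for the application under test, not something from the slides:

    def login(username: str, password: str) -> bool:
        """Stand-in for the application under test."""
        return username == "user@example.com" and password == "Secret#1"

    def test_login_with_valid_credentials():
        """TC_LOGIN_001 (Priority: High)
        Precondition: a registered account exists.
        Step 1: submit a valid username and password.
        Expected: the user is logged in.
        """
        assert login("user@example.com", "Secret#1") is True

    def test_login_with_invalid_password():
        """TC_LOGIN_002 (Priority: Medium) - the matching negative case."""
        assert login("user@example.com", "wrong-password") is False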
16. Types of SW Testing
    - Static testing (verification): docs, informal reviews, walkthroughs, etc.
    - Dynamic testing (validation):
      - White box (unit testing): branch, path, loop, statement, data flow
      - Black box:
        - Functional: sanity, regression, new feature, integration, system, UAT, E2E, interoperability, GUI, API
        - Non-functional: stability, performance, compatibility, security
17. Test Suite Preparation and Review Process: requirement → understanding → queries and clarification → capture in the doc → UC preparation → peer review → lead review → capture comments → incorporate → final review → MTS → mail to the team
    Note: follow all the testing techniques while preparing the MTS. MTS = Master Test Suite (all test suites, such as Sanity, Stability, Regression, etc., are derived from the MTS)
18. Testing Strategy
    - Smoke testing is done by B & I
    - Sanity testing is done by the QA team
    - Regression starts on a sanity-approved build
    - Stability starts after 30% of regression has passed
    - Performance starts after 50% of regression has passed
    - Usability starts after 90% of regression has passed
    - UAT starts after 95% of regression has passed (run against the requirements)
    - E2E: a method that validates the entire software flow from start to end
19. ‘adb’ commands – Android Debug Bridge
    adb devices / adb devices -l (list connected devices)
    adb install <path of the apk>
    adb uninstall <apk package name>
    adb get-state
    adb reboot
    adb kill-server / adb start-server
    adb push <folder path>/V2.apk /sdcard/
    adb pull /sdcard/sb.png <folder path>
    adb version
    adb shell getprop ro.build.version.release (Android version)
    adb shell (open a shell on the device)
    adb shell screencap /sdcard/sb.png (take a screenshot)
    adb shell pm list packages (all apps)
    adb shell pm list packages -s (only system apps)
    adb shell pm list packages -3 (only 3rd-party apps)
20. Continued… (a scripting sketch follows this slide)
    adb logcat
    adb logcat -c (clear the log buffer)
    adb logcat > C:\Users\S\Desktop\Selftraining\chksa1.txt
    adb bugreport > C:\Users\S\Desktop\Selftraining\chksa1.txt
    adb shell dumpsys window windows | grep -E "mCurrentFocus" (get the focused app's package name)
    adb shell am start -n com.android.settings/.Settings
    adb shell input text 'Android'
    adb shell screenrecord /sdcard/demo.mp4
    adb shell pwd
    adb shell wm size
    adb shell monkey -p your.package.name -v 500
    adb root
    adb remount
    adb reboot bootloader
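These commands are often scripted rather than typed by hand. A minimal Python sketch, assuming adb is installed, on PATH, and a device is connected; the helper name and the chosen commands are illustrative:

    import subprocess

    def adb(*args: str) -> str:
        """Run one adb command and return its stdout."""
        result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
        return result.stdout

    print(adb("devices", "-l"))                                 # list connected devices
    print(adb("shell", "getprop", "ro.build.version.release"))  # Android version
    adb("shell", "screencap", "/sdcard/sb.png")                 # take a screenshot on the device
    adb("pull", "/sdcard/sb.png", ".")                          # copy it to the host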
21. Bug Report
    - Bug fields: Bug-Id, Project, Summary, Raised By, Date, Component, Description (Build Details, HW Details, Precondition, Steps to reproduce, Expected Result, Actual Result, Notes), Priority, Severity, Reproducibility
    - Bug states: Total, New, Assign, Duplicate, Rejected, Deferred, Fix, Re-Open, Close
22. Browser Use Cases: Sanity
    - A web browser is a software application that enables a person to locate, retrieve, and display content such as web pages, images, and video, as well as other files, on the World Wide Web
    - Examples: Google Chrome, Apple Safari, Microsoft IE and Edge, Firefox, Opera
    - What and how to test in a browser / on the web – remember the key points:
      - 24-hour availability
      - Huge number of users
      - Response time, ease of use, accuracy – if these are lacking, users will move to a competitor's site
23. Browser “use cases”…
    1. Functionality
    2. Navigation: test that all links in your web pages work correctly and make sure there are no broken links (a sketch follows this list)
    3. Forms, page content, searches, popup windows, menus, buttons, images, firewalls
    4. Interface testing: application, web server, and database server
    5. Database testing: executing queries such as create, update, and delete
    6. Compatibility: different browsers and OS
    7. Performance: response time, load test at normal and peak load
    8. Security testing
    9. Content / data
    10. Test cookies
    11. Cosmetic: no spelling or grammatical errors
    12. Usability: user friendly
    13. E2E: business workflow
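A small sketch of the navigation check (item 2), written in Python with the requests and beautifulsoup4 packages; the URL is a placeholder and the function name is an assumption:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def find_broken_links(page_url):
        """Fetch a page, follow every <a href>, and return the links that fail."""
        page = requests.get(page_url, timeout=10)
        soup = BeautifulSoup(page.text, "html.parser")
        broken = []
        for a in soup.find_all("a", href=True):
            link = urljoin(page_url, a["href"])
            try:
                if requests.head(link, timeout=10, allow_redirects=True).status_code >= 400:
                    broken.append(link)
            except requests.RequestException:
                broken.append(link)
        return broken

    print(find_broken_links("https://example.com"))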
24. Browser “issues”…
    - Compatibility: cross-browser, cross-device
    - Firewall and port issues
    - User input validation
    - Very slow response
    - Content not displaying
    - Spelling, color, and font issues
    - Security
    - UI issues
25. Types of Web Testing…
    - Static sites: content won't change
    - Dynamic sites: content changes regularly
    - E-commerce: Amazon, Flipkart
    - Mobile: mobile browser
    Applied types: 1. Functionality testing 2. Usability testing 3. Interface testing 4. Compatibility testing 5. Performance testing 6. Security testing
26. Testing inputs (IP) and outputs (OP) at various stages
    - Before: QA process, test plan, MTS, all other formats
    - During: build, test data, environment / simulators, test matrix, test scripts
    - After: results, reports, QA release notes
27. QA Docs / Checklist • Requirements • Test plan • Query sheet • MTS • Test cases • Traceability matrix • Test results • Reports – test and bug • Comments • Mail
28. Project Plan (example: Android 12 ‘New Features’, start 5/23/2022, end 6/16/2022, 25 working days across 5 weekly sprints)
    - Project plan discussion: 1 day; KT: 2 days
    - Development: dev plan discussion (1 day); unit testing; splash-screen (2 days), rounded-corners (10 days), recents / url-sharing (10 days)
    - B & I: build plan discussion (1 day); integrate all modules / code (21 days); build cadence: weekly, 3 builds with release notes
    - QA: test plan discussion (1 day); MTS, test report, and bug report formats; MTS preparation (6 days); new feature testing – splash-screen (2), rounded-corners (8), recents / url-sharing (8); sanity testing (18); system testing (11); stability testing (3); bug fix testing (12)
29. Project Terminology: MOM – Minutes of Meeting; KT – Knowledge Transfer; MTS – Master Test Suite; TS – Test Suite; TE – Test Execution; TR – Test Report; BR – Bug Report; Scrum Master – the PM plays the scrum master role; POC – Point of Contact; DUT – Device Under Test
30. ‘QA’ Terminology • Verification • Validation • RCA – Root Cause Analysis • TM, TE, TS • Bug age • ED – Escaped Defect • Reminder mails • Follow-ups and escalations • Always respond to query mails and other mails
31. Tell me about yourself
    1. Myself: education and experience
    2. Project description
    3. Project / team structure
    4. QA process and QA team structure
    5. Technology
    6. Scrum calls: MOM, action items (AI), follow-ups
    7. Roles and responsibilities
