Software
     Software is a set of programs that take inputs and produce outputs. There are
two types: 1) Software Application 2) Software Product

1)   Software developed for a specific customer's requirements is called a Software
     Application.
2)   Software developed for the overall requirements of the market is called a
     Software Product. Interested customers purchase licenses of the Software
     Product.

Software Bidding :

       A proposal to develop new software is called Software Bidding. In Software
Application development, the proposal comes from a specific customer. In product
development, our organization takes up its own proposal.

Kick-off Meeting :

       A CEO-category person conducts a meeting with high-level management and
selects a Project Manager to handle the new software development process.

PIN (Project Initiation Note) Document :

        The selected Project Manager (PM) prepares this document to estimate the
required people, technologies, time and resources. He/She submits the report to the
CEO. The CEO conducts a review to give the green signal to the Project Manager.

SDLC (Software Development Life Cycle) : (Waterfall Model)

                                 Requirements Gathering
                                           ↓
                                 Analysis & Planning
                                           ↓
                                      Designing
                                           ↓
                                         Code
                                           ↓
                                        Testing
                                           ↓
                                Release & Maintenance
In the above SDLC process there is only a single stage of testing, and that testing is
conducted by the developers. For this reason, organizations concentrate on multiple
stages of testing and separate testing teams to achieve quality.

Software Quality :

       →    Meet Customer Requirements (Functionality)
       →    Meet Customer Expectations (Usability, Performance)
       →    Cost to Purchase License
       →    Time to Release

Software Quality Assurance (SQA) :

      Monitoring and measuring the strength of the development process is called
Software Quality Assurance / Verification.

Software Quality Control (SQC) :

      The validation of the product with respect to customer requirements is called
Software Quality Control / Validation / Testing.

“V” Model :

       ‘V’ stands for Verification & Validation. This model defines the development
process together with the testing stages. It is an extension of the SDLC model.


     Verification                                    Validation

      Requirements                                           User Acceptance Testing
Gathering & Review


  Analysis & Planning                                      System Testing
         With Review


       High Level Design                               Integration Testing
              & Review                                 (Programs Testing)

         Low Level Design
               & Review                             Unit Testing (Program Testing)



                                       Coding
In the above ‘V’ Model, the reviews are called Verification methods and the testing
levels are called Validations. In small and medium scale organizations, the
management maintains a separate testing team for System Testing only, to decrease
project cost, because System Testing is the bottleneck stage in the software
development process.

I) Reviews in Analysis :
       In general, the software development process starts with requirements
gathering: from the specific customer in application development, and from model
customers in product development. After gathering requirements, the responsible
Business Analyst prepares the BRS (Business Requirements Specification) document.
This document is also known as the User Requirements Specification or Customer
Requirements Specification.

       After gathering requirements, the Business Analyst sits with the Project Manager
to develop the SRS and Project Plan. The Software Requirements Specification consists
of the functional requirements to be developed and the system requirements to be used.

    Example :


                 BRS                                        SRS

                                               Functional Requirement :
                Addition                       2 Inputs, 1 Output, ‘+’ Operation

                                               System Requirement :
                                               ‘C’ Language


                  What?                                    How?

       After completion of the BRS & SRS preparation, the corresponding Business
Analyst conducts a review to estimate the completeness and correctness of the
documents.

→   Are they Correct Requirements?
→   Are they Complete Requirements?
→   Are they Achievable Requirements?
→   Are they Reasonable(Time) Requirements?
→   Are they Testable Requirements?
II) Reviews in Design :
       After completion of successful analysis and review, the design-category people
prepare the HLD and LLDs (High Level Design & Low Level Designs). The High Level
Design specifies the overall architecture of the software. It is also known as the System
Design or Architectural Design.
 Example :                                          Root
                                  LOGIN


       Mailing
                                                            Chatting



                                 LOGOUT
                                                  Leaf :


     The internal structure of every functionality or module is specified by the Low
Level Design documents. These are also known as Structural Designs or Component
Designs.
 Example :
                                   User

                                       User ID & Password
          Invalid
                                  LOGIN                     Data Base
                    Re-Login

                                       Valid


                                Next Window


       The HLD is a system-level design and an LLD is a component- or module-level
design, so one software design consists of one HLD and multiple LLDs.

      The corresponding designers conduct a review on these documents for
completeness and correctness.

→   Are they Understandable Designs?
→   Are they Correct Designs?
→   Are they Complete Designs?
→   Are they Followable Designs?
III) Unit Testing :
       After completion of successful designs and reviews, the corresponding
programmers start coding to construct the software physically. In this stage the
programmers write programs and test each program using White Box / Glass Box /
Open Box testing techniques.




                   }
                              →   Basic Paths Coverage
                              →   Control Structure Coverage
                              →   Program Technique Coverage
                              →   Mutation Coverage

   Programs

(A) Basic Paths Coverage :
       The programmers use this technique to estimate the execution of a program. In
this technique the programmer executes the program more than once, so that all areas
of the program are covered in execution, as in the sketch below.
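
A minimal sketch, assuming a hypothetical method with two decision points: each
run below drives a different independent path, so together the runs give basic path
coverage. (Run with `java -ea` so the asserts are checked.)

    public class DiscountPaths {
        // Hypothetical method: two decision points, three independent paths.
        static double rate(double amount) {
            if (amount <= 0) {
                throw new IllegalArgumentException("amount must be positive"); // path 1
            }
            if (amount >= 1000) {
                return 0.10;   // path 2: large purchase
            }
            return 0.0;        // path 3: small purchase
        }

        public static void main(String[] args) {
            try { rate(-5); } catch (IllegalArgumentException e) { /* path 1 covered */ }
            assert rate(1500) == 0.10;  // path 2 covered
            assert rate(100) == 0.0;    // path 3 covered
            System.out.println("all basic paths executed");
        }
    }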

(B) Control Structure Coverage :
        After completion of successful Basic path coverage the corresponding
programmer is concentrating on the Correctness of that program execution in terms of
Inputs, Process and Outputs.

(C) Program Technique Coverage :
        After successful Basic Paths & Control Structure Coverage, the corresponding
programmer measures the execution speed of the program. If the execution speed is
not acceptable, the programmer makes changes in the program structure without
disturbing its functionality.

       In this coverage the programmers use Monitors and Profilers, third-party
software tools that calculate the execution speed of a program.

Note :

Monitors are used in VB.net
Profilers are used in Java
(D) Mutation Coverage :
      Mutation means a change in a program. Programmers perform changes in a
program to estimate the completeness and correctness of that program's testing.
        Test                    Repeat Test                   Repeat Test
         ↓                           ↓                             ↓
      Program                    Change 1                      Change 2
         ↓                           ↓                             ↓
       Passed          Passed (Incomplete Testing)    Failed (Complete Testing)
       Basic Paths Coverage, Control Structure Coverage and Program Technique
Coverage are applied to a program to test it. Mutation Coverage is applied to the
program's testing, to estimate the completeness and correctness of that testing, as in
the sketch below.
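
A minimal mutation sketch, assuming a hypothetical price calculation: if a test
still passes after a deliberate change (a mutant), the testing was incomplete; if it
fails, the mutant is "killed" and the testing is more complete.

    public class MutationDemo {
        static int price(int qty, int unit)       { return qty * unit; } // original
        static int priceMutant(int qty, int unit) { return qty + unit; } // mutant: '*' -> '+'

        public static void main(String[] args) {
            // Weak test data: 2 * 2 == 2 + 2, so this test cannot kill the mutant
            // -> incomplete testing.
            System.out.println(price(2, 2) == priceMutant(2, 2)); // true: mutant survives

            // Stronger test data: 3 * 5 != 3 + 5, so the mutant is killed
            // -> more complete testing.
            System.out.println(price(3, 5) == priceMutant(3, 5)); // false: mutant killed
        }
    }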


V) Integration Testing :
        After completion of the dependent programs' development and Unit Testing, the
programmers interconnect them to form a complete system / software.
        This testing is also known as Interface Testing. There are four approaches to
integrating programs and testing them.

A) Top Down Approach :-

       In this approach the programmers interconnect the main program and some of
the sub-programs. In place of the remaining sub-programs, the programmers use
temporary programs called “Stubs”, as in the sketch below.



                                     Main



                                             STUB        (Under Construction)




                        Sub1                      Sub2
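
A minimal stub sketch with hypothetical names: Main and Sub1 are ready, Sub2 (a
tax lookup) is still under construction, so a temporary stub returning a dummy
value stands in for it.

    interface TaxService { double rate(String state); }   // contract Sub2 will fulfil

    class TaxStub implements TaxService {                  // temporary "Stub"
        public double rate(String state) { return 0.10; }  // fixed dummy value
    }

    public class MainProgram {
        public static void main(String[] args) {
            TaxService tax = new TaxStub();                // the real Sub2 plugs in later
            double total = 100 * (1 + tax.rate("KA"));
            System.out.println("total with stubbed tax: " + total);
        }
    }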
B) Bottom Up Approach :-
      In this approach the programmers interconnect the sub-programs from the bottom
up, while the main program is still under construction. In its place they use a temporary
calling program called a “Driver”.
                                    Main



                                    Driver      (Under Construction)



                                    Sub1




                                    Sub2
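
A minimal driver sketch with hypothetical names: Sub1 and Sub2 are ready but the
real Main is under construction, so a temporary driver calls them the way Main
eventually will.

    class Sub1 { static int square(int n) { return n * n; } }
    class Sub2 { static int twice(int n)  { return 2 * n; } }

    public class Driver {                                  // temporary "Driver"
        public static void main(String[] args) {
            // Exercise the integrated sub-programs bottom-up.
            System.out.println(Sub2.twice(Sub1.square(3))); // prints 18
        }
    }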


C) Hybrid Approach :-
      It is a combined approach of the Top Down & Bottom Up approaches. It is also
known as the Sandwich Approach.


                                    Main



                                    Driver      (Under Construction)



                                    Sub1



                                             Driver   (Under Construction)
                   Sub2


                                                      Sub3


D) System Approach :-

      The integration of programs after completion of 100% of the coding is called the
System Approach or Big Bang Approach.
V) System Testing :
       After completion of successful Integration Testing, the development team
releases a software build to the separate testing team in our organization. System
Testing is classified into three sub-stages.

       1.      Usability Testing
       2.      Functional Testing
       3.      Non-Functional Testing

1. Usability Testing :
       In general, test execution starts with Usability Testing. During this test the
testing team concentrates on the “user-friendliness of the software build”. There are
two sublevels in Usability Testing.

       a) User Interface Testing :

               → Ease of Use (Understandable Screens)
               → Look & Feel (Attractive Screens)
               → Speed in Interface (short Navigations in Screens)

       b) Manuals Support Testing :

            In this test the testing team verifies the Help of the software.

Case Study :

       Receive S/w Build from Developers after Integration Testing.

                                                        ↓
                                             User Interface Testing
                                                        ↓
                                               Functional Testing
                                                        ↓
  Usability Testing                          Non-Functional Testing
                                                        ↓
                                                Manuals Testing
2. Functional Testing :
       This is a mandatory testing level in System Testing. During this test the testing
team concentrates on the correctness of the customer requirements in the software
build. This testing is classified into the sub-tests below.

a) Control Flow Testing :-
      Verifying the changes in the properties of objects in an application / S/w build
      with respect to mouse and keyboard operations.

b) Error Handling Testing :-
      The prevention of wrong operations with meaningful error messages.

c) Input Domain Coverage :-
      Whether our S/w build is taking valid types and sizes of inputs or not.

d) Manipulations Coverage :-
     Whether our S/w build is providing the customer-expected outputs or not.

e) Database Testing :-
      The impact of front-end screen operations on the back-end database content.

f) Sanitation Testing :-
      Finding extra functionality with respect to the customer requirements.

Case Study :-

                        Software Build



          Screens
        (Front End)                           Data Base
                                             (Back End)



        Control Flow
       Error Handling                       Data Base
         I/p Domain                          Testing
       Manipulations
          Sanitation


                Functional / Black Box Testing
3. Non-Functional Testing :
       This is an optional level in System Testing. It is expensive and complex to
conduct. During this test the testing team concentrates on the extra characteristics of
the software.


a) Reliability Testing :-

        It is also known as Recovery Testing. During this test the testing team
validates whether our S/w build changes from an abnormal state back to a normal
state or not.


b) Compatibility Testing :-

       It is also known as Portability Testing. During this test the testing team
concentrates on whether our S/w build runs on the customer-expected platforms or
not.
       Platform means the operating system, browser, compilers and other system
software.

c) Configuration Testing :-

       It is also known as Hardware Compatibility Testing. During this test the
testing team concentrates on whether our S/w build supports hardware devices of
different technologies or not.

       Ex :- Different technology printers, networks … etc.


d) Inter System Testing :-

        It is also known as End-to-End Testing or Interoperability Testing. During this
test the testing team concentrates on whether our S/w build can co-exist with other
software applications to share common resources or not.

Case Study :-
     Compatibility Testing         S/w Build → Operating System
                                      S/w Build → H/w Device
     Configuration Testing
                                           Ex : Printers
      Inter System Testing         S/w Build → Other S/w Build
e) Data Volume Testing :-

        During this test the testing team inserts model data into our application build
to estimate the peak limit of data. This data-limit estimation is called Data Volume
Testing; see the sketch below.

   Ex :    1) MS Access technology software manages a 2 GB database as maximum,
           SQL Server manages 6-7 GB, and Oracle technology manages 10-12 GB.
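
A minimal sketch of pumping model data into a table (hypothetical JDBC URL,
credentials and table; a matching JDBC driver must be on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class VolumeTest {
        public static void main(String[] args) throws Exception {
            Connection con = DriverManager.getConnection(
                    "jdbc:mysql://localhost/testdb", "user", "password"); // assumed DB
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO readers(name, city) VALUES (?, ?)");
            for (int i = 0; i < 1_000_000; i++) {          // model data
                ps.setString(1, "reader" + i);
                ps.setString(2, "city" + (i % 100));
                ps.addBatch();
                // Flush in batches; watch database size and response times grow.
                if (i % 10_000 == 0) ps.executeBatch();
            }
            ps.executeBatch();
            con.close();
        }
    }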

f) Installation Testing :-


     S/w Build                           Customer expected configuration system
         +                   Install      Customer expected size of Ram, HDD,
   Supported S/w                                Processor, OS…. Etc.,



       →    Setup program execution to start Installation.
       →    Easy interface during Installation.
       →    Occupied disk space after Installation.

g) Load Testing :-

       Load means the number of concurrent users using our S/w build at a time.
During this test the testing team executes our S/w build under the customer-expected
configuration and the customer-expected load, to estimate the speed of processing,
i.e. performance; see the sketch after the diagram below.

                       Client 1    □                   Server
                       Client 2    □.                S/w Build
                              .                       Process
                              .
                       Client N    □
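
A minimal load sketch (hypothetical URL and user count): N concurrent client
threads send one request each and report the response time.

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class LoadTest {
        public static void main(String[] args) throws Exception {
            int clients = 50;                              // customer-expected load (assumed)
            Thread[] users = new Thread[clients];
            for (int i = 0; i < clients; i++) {
                users[i] = new Thread(() -> {
                    try {
                        long start = System.nanoTime();
                        HttpURLConnection c = (HttpURLConnection)
                                new URL("http://example.com/login").openConnection();
                        c.getResponseCode();               // one request per concurrent user
                        long ms = (System.nanoTime() - start) / 1_000_000;
                        System.out.println("response in " + ms + " ms");
                    } catch (Exception e) { e.printStackTrace(); }
                });
            }
            for (Thread t : users) t.start();              // apply the load concurrently
            for (Thread t : users) t.join();
        }
    }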
h) Stress Testing :-

       The execution of our S/w build under the customer-expected configuration and
more than the customer-expected load, to estimate the peak load limit, is called Stress
Testing.

i) Endurance Testing :-

       The execution of our S/w build under the customer-expected configuration and
the customer-expected load, to estimate continuity in processing, is called Endurance
Testing.
j) Security Testing :-
        It is also known as Penetration Testing. During this test the testing team
concentrates on three factors.

       Authorization : The S/w build allows valid users and prevents invalid users.

       Ex :    Login with password, PIN, Digital Signatures, Finger Prints, Eye Retina,
               Scratch Cards….etc.,

       Access Control : The permission of valid users to access functionality in Build.

       Ex : Admin, User

       Encryption / Decryption :   The code conversion between the client and server
                                   processes, as in the diagram and sketch below.
               Client                                  Server

        Request :  Encrypted → Cipher Text → Decrypted
        Response : Decrypted ← Cipher Text ← Encrypted
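
A minimal round-trip sketch using the standard javax.crypto API with AES (key
handling is simplified for illustration; a real system would also agree on mode,
padding and key exchange):

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;

    public class CryptoDemo {
        public static void main(String[] args) throws Exception {
            SecretKey key = KeyGenerator.getInstance("AES").generateKey();

            // Client side: encrypt the request into cipher text.
            Cipher enc = Cipher.getInstance("AES");
            enc.init(Cipher.ENCRYPT_MODE, key);
            byte[] cipherText = enc.doFinal("transfer 500".getBytes("UTF-8"));

            // Server side: decrypt the cipher text back to plain text.
            Cipher dec = Cipher.getInstance("AES");
            dec.init(Cipher.DECRYPT_MODE, key);
            System.out.println(new String(dec.doFinal(cipherText), "UTF-8"));
        }
    }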

k) Localization and Internationalization Testing :-

        This testing applies to multilingual software, which accepts characters from
multiple user languages. Ex : English, Spanish, French … etc.
        In Localization Testing the test engineer provides multiple languages'
characters as inputs to the S/w build. In Internationalization Testing the test engineer
provides a common language's characters (English) as input to the S/w; in this
scenario, third-party tools transfer the common language's characters into the other
languages' characters.

Note : Java, with its Unicode support, is a better technology for developing
multilingual software; see the sketch below.
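
A minimal internationalization sketch using the standard java.util.ResourceBundle
API (hypothetical bundle name and message key; bundles normally live in
.properties files, but inline classes keep the sketch self-contained):

    import java.util.ListResourceBundle;
    import java.util.Locale;
    import java.util.ResourceBundle;

    class Messages extends ListResourceBundle {            // default (English) bundle
        protected Object[][] getContents() { return new Object[][] {{"greet", "Hello"}}; }
    }
    class Messages_fr extends ListResourceBundle {         // French bundle
        protected Object[][] getContents() { return new Object[][] {{"greet", "Bonjour"}}; }
    }

    public class I18nDemo {
        public static void main(String[] args) {
            // The same message key renders in each user's language.
            for (Locale loc : new Locale[] {Locale.ENGLISH, Locale.FRENCH}) {
                ResourceBundle b = ResourceBundle.getBundle("Messages", loc);
                System.out.println(loc + ": " + b.getString("greet"));
            }
        }
    }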

l) Parallel Testing :-
       It is also known as Competitive / Comparative Testing. During this test the
testing team compares our S/w build with an old version of the same S/w, or with a
similar product in the market, to estimate its competitiveness.
VI) User Acceptance Testing :
       After completion of successful System Testing, the Project Manager
concentrates on UAT to collect feedback from real customers or model customers.
There are two ways of doing User Acceptance Testing.

         α (Alpha) Testing                          β (Beta) Testing
→ For S/w Applications                   → For S/w Products
→ By real customers, with the            → By model customers
  involvement of developers and testers
→ At the development site                → At the model customer's site

VII) Release Testing :
       After completion of UAT and the resulting modifications, the Project Manager
forms a release team or on-site team to release the application to the real customer, or
to release the product to license-purchasing customers. This release team or on-site
team consists of a few programmers, a few testers and a few hardware engineers with
a team lead. The team observes the factors below at the customer site.

   1)   Complete Installation
   2)   Overall Functionality
   3)   Input devices handling (Key Board, Mouse….etc.,)
   4)   Output devices handling (Monitor, Printer….etc.,)
   5)   Secondary storage devices handling (Floppy, Pen Drive…etc.,)
   6)   O/s error handling
   7)   Co-existence with other S/w in customer site.
      Checking the above factors at the customer site is also known as Port Testing /
Deployment Testing.
      After a successful release, the release team conducts training sessions for the
customer-site people and then returns to our organization.
VIII) Maintenance :
       During utilization of the software, the customer-site people send Software
Change Requests (SCRs) to our organization. These requests are received by a special
team in our organization called the Change Control Board (CCB). This team consists
of a few programmers, a few testers and a few hardware engineers along with the
Project Manager.

                              S/w Change Request

      Enhancement                                          Missed Defects


      Impact Analysis                                      Impact Analysis
             ↓                                                     ↓
        Perform S/w                                      Perform S/w Changes
          Changes                                                  ↓
             ↓                Conducted by CCB            Test S/w Changes
     Test S/w Changes                                              ↓
                                                           Improve Testing
                                                           Process & People
                                                              Capability
Case Study :-

  Testing Stage          Deliverable to be Tested  Responsibility        Testing Techniques
  Reviews in Analysis    BRS & SRS                 BA                    Walkthrough, Inspections
                                                                         & Peer Reviews
  Reviews in Design      HLD & LLDs                Designers             Walkthrough, Inspections
                                                                         & Peer Reviews
  Unit Testing           Programs                  Programmers           White Box Testing
                                                                         Techniques
  Integration Testing    Interfaces between        Programmers           Top Down, Bottom Up,
                         Programs                                        Hybrid, System
  System Testing         S/w Build                 Test Engineers /      Usability, Functional /
                                                   Quality Control       Black Box, Non-Functional
                                                   Engineers             Testing
  User Acceptance Test   S/w Build                 Real Customers /      α-Testing, β-Testing
                                                   Model Customers
  Release Testing        S/w Build                 Release Team          S/w Release Factors
                                                                         (7 factors in VII)
  Maintenance Testing    S/w Changes               CCB                   Regression Testing
Walk Through :- A study of a document to estimate its completeness and correctness.

Inspection :- Searching for issues in a document is called an Inspection.

Peer Review :- Comparing the document with other, similar documents.


                        Challenges in Software Testing
       In general, every testing team plans formal testing. Due to certain challenges in
testing, testing teams sometimes conduct Ad-hoc (informal) Testing instead. There are
five styles of Ad-hoc Testing.

a) Monkey / Chimpanzee Testing :-

       Due to lack of time, the testing team conducts testing only on the main activities
of the software. This style of testing is called Monkey Testing.

b) Buddy Testing :-

       Due to lack of time, the project management combines one programmer and
one tester as a buddy. These buddy teams conduct development and testing in parallel.

c) Exploratory Testing :-

      It is also known as Artistic Testing. Due to lack of documentation, the test
engineers depend on past experience, discussions with others, video conferences with
customer-site people, internet browsing and surfing of similar software to understand
the customer requirements. This style of testing is called Exploratory Testing.

d) Pair Testing :-

      Due to lack of knowledge, senior test engineers are grouped with junior test
engineers to share their knowledge. This style of testing is called Pair Testing.

e) Bebugging :-

        To estimate the efforts of the test engineers, the development people add
defects to the code. This informal practice is called Bebugging, or Defect Feeding /
Seeding.
System Testing Process

  Test              Test              Test               Test                   Test
Initiation        Planning           Design            Execution               Closure



                                                         Test
                                                       Reporting


             Development              Vs          System Testing

                                  S/w Bidding
                                       ↓
                                Kick-off Meeting
                                       ↓
                                 PIN Document
                                       ↓
                        Requirements Gathering (BRS)
                                       ↓
                    Analysis & Planning (SRS & Project Plan)



     S/w Design & Review (HLD, LLDs)                           System Test Initiation
                       ↓                                                 ↓
 Coding → Unit Testing (White Box Technique)                   System Test Planning
                       ↓                                                 ↓
       Integration → Integration Testing                            Test Design



                                  Initial Build
                                        ↓
                             System Test Execution
      Test                              ↓
    Reporting                 System Test closure
                                        ↓
                              User Acceptance Test
                                        ↓
                             Release & Maintenance
I) System Test Initiation :
        In general, the System Testing process starts with System Test Initiation by the
Project Manager or Test Manager, who develops the Test Strategy (or Test
Methodology) document. This document defines the reasonable tests to be applied in
the current project.



                    SRS           Test Initiation       Test Strategy
                     I/P                                     O/P




                           Project Manager / Test Manager


Components in Test Strategy :

      The Test Strategy document consists of the components below, which define the
test approach to be followed by the team in the current project.

1. Scope & Objective :-

          The Purpose of Testing in current project

2. Business Issues :-

          The Budget allocation for Testing in current project

   Ex :      100% → Project Cost


       64%                               36%
    Development                     System Testing
   & Maintenance


3. Roles & Responsibilities :-

          The names of the jobs in the testing team and the responsibility of each job in
the current project

4. Communication & Status Reporting :-

          The required negotiations between the various jobs in the testing team
5. Test Responsibility Matrix (TRM) :-

           The list of reasonable tests to be applied in the current project.
   Ex.

          Testing Topic              Yes/No                Comment
            UI Testing                 Yes                     -
          Manuals Testing              Yes                     -
         Functional Testing            Yes                     -
            Load Testing               No              Lack of resources
           Stress Testing              No              Lack of resources
         Endurance Testing             No              Lack of resources
        Compatibility Testing          Yes                     -
        Inter System Testing           No         No need with respect to
                                                       requirements
               ..etc.                 ..etc.               ..etc.

   6. Test Automation & Testing Tools :-

          The purpose of automation testing in current project and available testing tools in
   our organization.

   7. Defect Reporting & Tracking :-

          The required negotiation between the testing team and the development team to
   report & solve defects.

   8. Change & Configuration Management :-

           The maintenance of the testing deliverables for future reference.

   9. Risks & Assumptions :-

           The expected list of risks and the solutions to overcome them.

   10. Testing measurements & Metrics

           The list of measurements & Metrics to estimate test status.

   11. Training Plan :-

          The required number of training sessions for the testing team to understand the
   customer requirements.
II) Test Planning :
       After completion of the Test Strategy document preparation, the Test Lead
category people concentrate on Test Plan document preparation.

 SRS, HLD & LLDs                Testing Team Formation
                                Identify Risks
 Project Plan                   Prepare Detailed Test Plans      Test Plans
                                Review Plans
 Test Strategy

Testing Team Formation :

       In general, Test Planning starts with testing team formation. In this stage the
Test Lead depends on the factors below.

        →   Project size (number of function points)
        →   Number of testers available on the bench
        →   Test duration with respect to the project plan
        →   Available test environment resources (Ex. testing tools …)

Case Study :

         Type of Project                     Developers : Testers
         → ERP, Client / Server, Website     3:1
         → System S/w Application            1:1
         → Mission Critical                  1:7

Identify Risks :

       After completion of the testing team formation, the Test Lead concentrates on
team-level risk analysis.

Ex :-
        Risk 1 :   Lack of Time
        Risk 2 :   Lack of Resources
        Risk 3 :   Lack of Documentation
        Risk 4 :   Delays in Delivery
        Risk 5 :   Lack of Development Process Seriousness
        Risk 6 :   Lack of Communication
Prepare Detailed Test Plans :
       After completion of the testing team formation and the risk analysis, the Test
Lead concentrates on test plan document preparation in the IEEE 829 format (IEEE:
Institute of Electrical and Electronics Engineers).

          Format :

          1. Test Plan ID                : Unique number or name for future reference about
                                           project.
          2. Introduction                : About Project
          3. Test Items                  : The names of Modules or Functionalities in Project
 What
          4. Features to be Tested       : The names of functionalities to be tested.
to Test
          5. Features not to be Tested   : The names of already-tested modules, if available.
          6. Test Approach               : The List of selected tests by P.M.
          7. Test Environment            : The hardware & software required for testing.
          8. Entry Criteria              : Test Cases Designed, Test Environment Established,
                                           S/w Build received from Developers.
 How
to Test   9. Suspension Criteria         : → Test Environment Abounded
                                          → Shows stopper in build (Build not working)
                                          → Pending defects are more
          10. Exit Criteria              : → All modules in build covered
                                          → Test duration exceeded
                                          → All major defects solved
          11. Test Deliverables          : The list of testing documents to be prepared by test
                                           engineers in testing. (Test Scenarios, Test Cases,
                                           Automation Programs, Test Log, Defects reports and
                                           weekend reports)
          12. Staff and Training Needs   : The names of selected test engineers & required
Whom                                       training sessions to understand customer requirements.
to Test
          13. Responsibilities           : Work allocation to the above selected test engineers.
                                           (All responsible tests on specified modules, or
                                           specified tests on all modules.)
 When     14. Schedule                   : The dates & times to conduct testing
to Test
          15. Risks & Assumptions        : The previously analyzed risks and the solutions to
                                           overcome them.
          16. Approvals                  : The signature of Test Lead & Project Manager.
Review Test Plan :

       After completion of the test plan document preparation, the Test Lead conducts
a review meeting to estimate the completeness and correctness of the planned document.
       → Requirements / Modules / Features / Functionalities coverage
       → Testing topics coverage
       → Risk-oriented coverage

Note :
       After completion of Test Planning, and before starting Test Design, the Business
Analyst and Test Lead conduct training sessions for the selected test engineers on the
customer requirements in the project. Some organizations invite domain experts /
subject experts from outside for these training sessions.

III) Test Design :
        After completion of the required training sessions on the customer requirements,
the corresponding test engineers concentrate on Test Design to prepare Test Scenarios
and Test Cases.
        A Test Scenario specifies “what” to test. A Test Case specifies “how” to test,
including a detailed procedure; in other words, the Test Cases are derived from the
Test Scenarios. There are four methods in this Test Design.

Functional      1.    Functional Specification Based Test Case Design
 Testing        2.    Use Cases Based Test Case Design
 UT             3.    User Interface Based Test Case Design
 NFT            4.    Functional & System Specification Based Test Case Design

             1. Functional Specification Based Test Case Design :
                     Test engineers use this method to prepare Test Scenarios and Test Cases for
             Functional Testing. In this approach, the test engineers prepare scenarios and cases
             depending on the functional specifications in the SRS.

                 BRS
                   ↓                        Test Design
                 SRS (Functional                                 Test Scenarios
                       Specifications)                                 ↓
                   ↓
                                                                  Test Cases
                 HLD
                   ↓
                 LLDs
                   ↓
                                    System Test Execution
               S/w Build
Approach :
Step 1 :-
        Collect the functional specifications for the responsible areas.
Step 2 :-
        Take one specification and read it to gather the entry point, required inputs,
normal flow, expected outputs, alternative flows, exit point and exception rules.
Step 3 :-
        Prepare Test Scenarios depending on the information gathered above.
Step 4 :-
        Review those Test Scenarios and implement them as Test Cases.
Step 5 :-
        Go to Step 2 until all responsible functional specifications are studied.

Functional Specification – 1 :-

        A login process allows User ID & Password for authorized users. The User ID
object takes alphanumerics in lower case, 4 to 16 characters long. The Password
object takes alphabets in lower case, 4 to 8 characters long. Prepare the Test Scenarios.

Test Scenario 1 :- Verify User ID object

Boundary Value Analysis (BVA) (Size) :

Min = 4 Char. → Pass         Max = 16 Char. → Pass
Min-1 = 3 Char. → Fail       Max-1 = 15 Char. → Pass
Min+1 = 5 Char. → Pass       Max+1 = 17Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z, 0-9
Invalid : A-Z, Special Characters, Blank Field

Test Scenario 2 :- Verify Password Object

Boundary Value Analysis (BVA) (Size) :
Min = 4 Char. → Pass      Max = 8 Char. → Pass
Min-1 = 3 Char. → Fail    Max-1 = 7 Char. → Pass
Min+1 = 5 Char. → Pass    Max+1 = 9 Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z
Invalid : 0-9, A-Z, Special Characters, Blank Field
Test Scenario 3 :- Verify Login Operation

Decision Table :

        User ID        Password      Expected O/p
        Valid Value    Valid Value   Next Window
        Valid Value    Invalid       Error Message
        Invalid        Valid         Error Message
        Valid          Blank Field   Error Message
        Blank          Valid         Error Message

Note : Exhaustive testing is not possible; for this reason the testing team conducts
Optimal Testing using black box testing techniques like BVA, ECP, decision tables,
regular expressions … etc. The sketch below turns the analysis above into executable
checks.
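
A minimal sketch (hypothetical helper methods) expressing the specification above
as code: user ID is lower-case alphanumeric, 4-16 characters; password is lower-case
alphabets, 4-8 characters. (String.repeat needs Java 11+.)

    public class LoginValidationTest {
        static boolean validUserId(String s)   { return s != null && s.matches("[a-z0-9]{4,16}"); }
        static boolean validPassword(String s) { return s != null && s.matches("[a-z]{4,8}"); }

        public static void main(String[] args) {
            // BVA on user ID length: min-1, min, min+1, max-1, max, max+1
            int[] sizes = {3, 4, 5, 15, 16, 17};
            boolean[] expected = {false, true, true, true, true, false};
            for (int i = 0; i < sizes.length; i++) {
                String id = "a".repeat(sizes[i]);
                System.out.printf("user id of %2d chars -> %s (expected %s)%n",
                        sizes[i], validUserId(id), expected[i]);
            }
            // ECP on password type: one representative value per class
            System.out.println(validPassword("abcd"));  // valid class a-z   -> true
            System.out.println(validPassword("ABCD"));  // invalid class A-Z -> false
            System.out.println(validPassword("1234"));  // invalid class 0-9 -> false
            System.out.println(validPassword(""));      // blank field       -> false
        }
    }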

Functional Specification – 2 :-

        In an insurance application, users apply for different types of insurance
policies. If a user selects the Type-A insurance, the system asks for the user's age. The
age value should be greater than 16 years and less than 80 years. Prepare the Test
Scenarios.

Test Scenario 1 :- Verify Type-A selection

Test Scenario 2 :- Verify focus to Age when you selected Type-A Insurance

Test Scenario 3 :- Verify Age Value

Boundary Value Analysis (BVA) (Range) :

Min = 17 → Pass         Max = 79 → Pass
Min-1 = 16 → Fail       Max-1 = 78 → Pass
Min+1 = 18 → Pass       Max+1 = 80 → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9
Invalid : a-z, A-Z, Special Characters, Blank Field

Functional Specification – 3 :-

        In a shopping application, users place purchase orders for different types of
items. The purchase order allows the user to select an Item No. and to enter a Qty. of
up to 10. The purchase order returns the Total Amount along with the single-item
price. Prepare the Test Scenarios.
Test Scenario 1 :- Verify Item No. Selection

Test Scenario 2 :- Verify Qty. Value

Boundary Value Analysis (BVA) (Range) :

Min = 1 → Pass        Max = 10 → Pass
Min-1 = 0 → Fail      Max-1 = 9 → Pass
Min+1 = 2 → Pass      Max+1 = 11 → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9
Invalid : a-z, A-Z, Special Characters, Blank Field

Test Scenario 3 :- Verify Total Amount, given Qty. * Item Price

Functional Specification – 4 :-

       A door opens when a person comes in front of it, and the door closes when
that person goes inside. Prepare the Test Scenarios.

Test Scenario 1 :- Verify Door Open

             Person    Door      Criteria
             Present   Opened    Pass
             Present   Closed    Fail
             Absent    Opened    Fail
             Absent    Closed    Pass

Test Scenario 2 :- Verify Door Close

Person    Door      Criteria
Inside    Closed    Pass
Inside    Opened    Fail

Test Scenario 3 :- Verify Door operation when a person is standing at the middle of the
door.

Functional Specification – 5 :-

        In an e-banking application, customers connect to the bank server through a
login process. This login requires the customer to fill in the fields below.
Password  : 6-digit number
Prefix    : 3-digit number that does not start with 0 or 1
Suffix    : 6-digit alphanumeric
Area Code : 3-digit number, optional
Command   : Cheque Deposit, Money Transfer, Mini Statement and Bills Paid
Prepare the Test Scenarios.

Test Scenario 1 :- Verify Password Value

Boundary Value Analysis (BVA) (Size) :

Min = Max = 6 Digits → Pass    5 Digits → Fail     7 Digits → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9
Invalid : a-z, A-Z, Special Characters, Blank Field

Test Scenario 2 :- Verify Prefix

Boundary Value Analysis (BVA) (Size) :

Min = Max = 3 Digits → Pass    2 Digits → Fail     4 Digits → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [2-9][0-9][0-9]
Invalid : a-z, A-Z, Special Characters, Blank Field

Test Scenario 3 :- Verify Suffix

Boundary Value Analysis (BVA) (Size) :

Min = Max = 6 Digits → Pass    5 Digits → Fail 7 Digits → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9, a-z, A-Z
Invalid : Special Characters, Blank Field

Test Scenario 4 :- Verify Area Code

Boundary Value Analysis (BVA) (Size) :

Min = Max = 3 Digits → Pass        2 Digits → Fail 4 Digits → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9, Blank Field
Invalid : a-z, A-Z, Special Characters
Test Scenario 5 :- Verify command selection like Cheque Deposit, Money Transfer,
Mini Statement and Bills Paid.

Test Scenario 6 :- Verify the login operation to connect to the bank server (a regex
sketch follows the table below)

             Remaining Fields   Area Code         Expected O/p
               All are valid       Valid          Next Window
               All are valid    Blank Field       Next Window
               All are valid      Invalid         Error Message
              Any one Invalid   Valid/Blank       Error Message
            Any one Blank Field Valid/Blank       Error Message
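
A minimal sketch (hypothetical field-check methods) expressing the BVA/ECP rules
above as regular expressions, the way the earlier note suggests for optimal testing:

    public class EBankingFieldCheck {
        static boolean password(String s) { return s.matches("[0-9]{6}"); }        // 6 digits
        static boolean prefix(String s)   { return s.matches("[2-9][0-9]{2}"); }   // no leading 0/1
        static boolean suffix(String s)   { return s.matches("[a-zA-Z0-9]{6}"); }  // 6 alphanumeric
        static boolean areaCode(String s) { return s.isEmpty() || s.matches("[0-9]{3}"); } // optional

        public static void main(String[] args) {
            System.out.println(password("123456")); // true
            System.out.println(prefix("199"));      // false: starts with 1
            System.out.println(suffix("ab12cd"));   // true
            System.out.println(areaCode(""));       // true: optional field left blank
        }
    }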


Functional Specification – 6 :-

        In a library management system, readers apply for an Identity No. To get this
number, the reader fills in the fields below.

Reader Name   :   Alphabets in lower case with an initial capital, as a single word
House Name    :   Alphabets in lower case, as a single word
PIN Code      :   As per the India Postal Department
City Name     :   Alphabets in upper case, as a single word
Phone No.     :   As per Indian subscriber numbers, optional

Prepare Test Scenario

Test Scenario 1 :- Verify Reader Name

Boundary Value Analysis (BVA) (Size) :

Min = 1Char. → Pass         Max = 256Char. → Pass
Min-1 = 0Char. → Fail       Max-1 = 255Char. → Pass
Min+1 = 2Char. → Pass       Max+1 = 257Char. → Fail
(In most front-end technologies, the default maximum for a text field is 256 characters.)

Equivalence Class Partition (ECP) (Type) :
Valid   : [A-Z][a-z]*
Invalid : 0-9, Special Characters, Blank Field

Test Scenario 2 :- Verify House Name

Boundary Value Analysis (BVA) (Size) :
Min = 1Char. → Pass      Max = 256Char. → Pass
Min-1 = 0Char. → Fail     Max-1 = 255Char. → Pass
Min+1 = 2Char. → Pass     Max+1 = 257Char. → Fail
Equivalence Class Partition (ECP) (Type) :
Valid   : [a-z]*
Invalid : A-Z, 0-9, Special Characters, Blank Field

Test Scenario 3 :- Verify PIN Code

Boundary Value Analysis (BVA) (Size) :

Min = Max = 6 Digits → Pass     5 Digits → Fail 7 Digits → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [1-9][0-9][0-9][0-9][0-9][0-9]
Invalid : a-z, A-Z, Special Characters, Blank Field

Test Scenario 4 :- Verify City Name

Boundary Value Analysis (BVA) (Size) :

Min = 1Char. → Pass        Max = 256Char. → Pass
Min-1 = 0Char. → Fail      Max-1 = 255Char. → Pass
Min+1 = 2Char. → Pass      Max+1 = 257Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [A-Z]*
Invalid : a-z, 0-9, Special Characters, Blank Field

Test Scenario 5 :- Verify Phone Number

Boundary Value Analysis (BVA) (Size) :
Min = 10 Digits → Pass        Max = 12 Digits → Pass
Min-1 = 9 Digits → Fail       Max+1 = 13 Digits → Fail
Min+1 = 11 Digits → Pass

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9, Blank Field
Invalid : A-Z, a-z, Special Characters
Test Scenario 6 :- Verify Reader Registration

Decision Table :
Remaining Fields             Telephone Number              Expected O/p
All are valid                Valid                         Identity No.
All are valid                Blank Field                   Identity No.
All are valid                Invalid                       Error Msg.
Any one Invalid              Valid / Blank                 Error Msg.
Any one Blank Field          Valid / Blank                 Error Msg.

Functional Specification – 7 :-   A Computer Shut Down Operation

Test Scenario 1 :     Verify Shut Down option selection using Shut Down

Test Scenario 2 :     Verify Shut Down option selection using Alt+F4

Test Scenario 3 :     Verify Shut Down option selection using Ctrl+Alt+Del

Test Scenario 4 :     Verify Shut Down operation success

Test Scenario 5 :     Verify Shut Down operation using Run Command.

Test Scenario 6 :     Verify Shut Down operation when a process is running

Test Scenario 7 :     Verify Shut Down operation using Power Off Button

Functional Specification – 8 :-

       Money withdrawal from an ATM, with all rules and regulations

Test Scenario 1 :   Verify Card Insertion

Test Scenario 2 :   Verify Card Insertion in Wrong Angle

Test Scenario 3 :   Verify Cancel After Card Insertion

Test Scenario 4 :   Verify Language Selection

Test Scenario 5 :   Verify Cancel after selection of Language

Test Scenario 6 :   Verify PIN Entry

Test Scenario 7 :   Verify operation with wrong PIN

Test Scenario 8 :   Verify operation when you enter wrong PIN 3 times consecutively
Test Scenario 9 :   Verify Cancel after enter PIN

Test Scenario 10 : Verify Account type selection

Test Scenario 11 : Verify operation when the wrong account type is selected with
                   respect to the inserted card

Test Scenario 12 : Verify cancel after account type selection

Test Scenario 13 : Verify withdrawal option selection

Test Scenario 14 : Verify Cancel after selection of withdrawal

Test Scenario 15 : Verify amount entry

Test Scenario 16 : Verify operation with a wrong denomination in the amount

Test Scenario 17 : Verify withdrawal operation success (correct amount, right receipt,
                   able to take the card back)

Test Scenario 18 : Verify withdrawal operation with more than the available balance

Test Scenario 19 : Verify withdrawal operation with more than the day limit

Test Scenario 20 : Verify withdrawal operation with a network problem

Test Scenario 21 : Verify withdrawal operation with a lack of money in the ATM

Test Scenario 22 : Verify withdrawal operation with the number of transactions per
                   day exceeded

Test Scenario 23 : Verify withdrawal operation with another bank's card

Test Scenario 24 : Verify withdrawal operation with a stolen card
2. Use Cases Based Test Case Design :

        This is an alternative to Functional Specification Based Test Case Design. In
this method the test engineers depend on Use Cases instead of functional
specifications to prepare Test Scenarios and Test Cases.

    BRS
      ↓                         Use Cases
    SRS (Functional          BA + Test Lead
                                                      Test Scenarios
          Specifications)                                   ↓
      ↓
                                                       Test Cases
    HLD
      ↓
    LLDs
      ↓
                       System Test Execution
   Coding
  (UT & IT)
  S/w Build

       As the above diagram shows, the Business Analyst and Test Lead category people
develop use cases depending on the corresponding functional specifications in the SRS.
       Every use case is an implemented form of a functional specification.

Use Case Format :-

1. Use Case ID                 : Unique number or name for future reference
2. Use Case Description        : The summary of the corresponding functionality
3. Required Inputs             : The required Inputs for corresponding Functionality
4. Precondition                : The necessary Condition to follow before operating
                                 corresponding functionality
5. Events List                 :

                                    Events / Tasks Expected O/p or Out come



                                   (A Step by Step procedure with expected outputs)
6. Activity Flow Diagram       : A pictorial / diagrammatic representation of the
                                 corresponding functionality
7. Post Condition              : Necessary tasks to do after corresponding functionality
8. Alternative events list     : Alternative procedures to do this functionality if
                                 available
9. Proto Type                  : A screen shot related to corresponding functionality.
10. Related use cases          : The names of other Use Cases relation to corresponding
                                 functionality


Approach :
Step1 : Collect use cases of responsible areas
Step2 : Take one use case and study
Step3 : Identify the entry point, required inputs, normal flow, expected outputs, exit
        point, alternative flows and exception rules.
Step4 : Prepare Test Scenarios depending on the information identified above.
Step5 : Review those scenarios and implement them as Test Cases.
Step6 : Go to Step2 until all responsible use cases are studied.

Use Case 1 :

1. Use Case ID                 : UC_Login
2. Use Case Description        : Login operation for user authorization
3. Required Inputs             : User ID is alphabets in lower case, 4-16 characters
                                 long. The Password is alphanumeric in lower case,
                                 4-8 characters long.
4. Precondition                : New User Registration to get valid User ID & Password
5. Events List                 :

                                       Events / Tasks         Expected O/p or Out come

                                    Enter User ID and        Next window for a valid user;
                                    Password values and      invalid data error msg. for
                                    then click OK Button         an invalid user.
6. Activity Flow Diagram     :
                                 Example :
                                                                   User

                                                                      User ID & Password
                                      Error Msg.
                                                                  LOGIN            Data Base
                                                   Re-Login

                                                                      Valid


                                                                Next Window


7. Post Condition            : Log out operation is mandatory after successful Login
8. Alternative events list   : None
9. Proto Type                :




10. Related use cases        : UC_New User, UC_Logout
Test Scenario 1 :- Check User ID

Boundary Value Analysis (BVA) (Size) :

Min = 4Char. → Pass        Max = 16Char. → Pass
Min-1 = 3Char. → Fail      Max-1 = 15Char. → Pass
Min+1 = 5Char. → Pass      Max+1 = 17Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z
Invalid : A-Z, 0-9, Special Characters, Blank Field

Test Scenario 2 :- Check Password

Boundary Value Analysis (BVA) (Size) :

Min = 4Char. → Pass        Max = 8Char. → Pass
Min-1 = 3Char. → Fail      Max-1 = 7Char. → Pass
Min+1 = 5Char. → Pass      Max+1 = 9Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z, 0-9
Invalid : A-Z, Special Characters, Blank Field

Test Scenario 3 :- Check OK Button Click

  User ID       Password        Expected Output
   Valid         Valid           Next Window
   Valid         Invalid         Invalid Data Error Msg.
   Invalid       Valid           Invalid Data Error Msg.
   Valid         Blank Field     Invalid Data Error Msg.
   Blank Field   Valid           Invalid Data Error Msg.

Test Scenario 4 :- Check Cancel Button

             Event                 Expected Out Put
 Click Cancel after open login    Login Window Closed
Click Cancel after enter user ID Login Window Closed
Click cancel after enter Password Login Window Closed

Test Scenario 5 :- Check Minimize Icon
Test Scenario 6 :- Check Maximize Icon
Test Scenario 7 :- Check Close Icon
Use Case 2 :

1. Use Case ID               : UC_Book_Issue
2. Use Case Description      : Issue a Book for Valid User
3. Required Inputs           : User ID is in the format MM_YY_XXXX
                                 (month, year and a 4-digit serial number)
                                 Book ID is in the format BOOK_XXXX
4. Precondition              : New User Registration to get valid User ID
5. Events List               :

                                   Events / Tasks        Expected O/p or Out come

                                   Enter User ID       Focus to Book ID for Valid User
                                   and then click       and Invalid User error msg. for
                                    “Go” Button                  Invalid User

                                   Enter Book ID      Book issued message for available
                                   and click “Go”        book and unavailable book
                                       Button          message for unavailable book id


6. Activity Flow Diagram     :
                                  Example :
                                                                      User

                                                                         Valid User ID
                                       Invalid User                  BOOK
                                                                     ISSUE           Data Base
                                                      Re-Login
                                                                         Valid Book ID
                                       Unavailable                   BOOK
                                       Book                          ISSUE           Data Base
                                                      Re-Login
                                                                         Valid
                                                                  “Book Issued”


7. Post Condition            : Receive the issued book from the computer operator
8. Alternative events list   : None
9. Proto Type                    :

      Book Issue                                - □X


       User ID                       Go

       Book ID                       Go


10.      Related use cases   :       UC_New User, UC_Book Feeding

Test Scenario 1 :- Verify User ID

Boundary Value Analysis (BVA) (Size) :

Min = Max = 10 Position Value → Pass
          = 9 Position Value → Fail
          = 11 Position Value → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [0][1-9][_][0-9][0-9][_][0-9][0-9][0-9][0-9]
          [1][0-2][_][0-9][0-9][_][0-9][0-9][0-9][0-9]
Invalid : a-z, A-Z, Special Characters except _, Blank Field

Test Scenario 2 :- Verify “Go” button click

User ID                                       Expected O/p after click ‘Go’
Valid Value                                   Focus to Book ID
Invalid Value                                 “Invalid User” Error Message
Blank Field                                   “Invalid User” Error Message

Test Scenario 3 :- Verify Book ID
Boundary Value Analysis (BVA) (Size) :

Min = Max = 9 Position Value → Pass
          = 8 Position Value → Fail
          = 10 Position Value → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [B][O][O][K][_][0-9][0-9][0-9][0-9]
Invalid : a-z, A-Z except B, O, K, Special Characters except _, Blank Field
Test Scenario 4 :- Verify “Go” Click
Book ID                                      Expected O/p after click “Go”
Valid Book ID                                “Book issued” Msg.
Invalid Book ID                              “Unavailable Book” Message
Blank Field                                  “Unavailable Book” Message

Test Scenario 5 :- Verify Minimize Icon
Test Scenario 6 :- Verify Maximize Icon
Test Scenario 7 :- Verify Close Icon


3. User Interface Based Test Design :

        The functional specification based test design and the use case based test design
are used to prepare test scenarios and test cases for Functional Testing. The user
interface based test design is used by test engineers to prepare test scenarios and test
cases for “Usability Testing”.

    BRS
      ↓
    SRS (UI Requirements) ──────────→ Test Scenarios
      ↓                                     ↓
    HLD                                Test Cases
      ↓                                     ↓
    LLDs                                    ↓
      ↓                                     ↓
    Coding (UT & IT)                        ↓
      ↓                                     ↓
    S/w Build ←─────── System Test Execution

       In this method the test engineers depend on the user interface requirements in
the SRS.
       In general, test engineers write common test scenarios for Usability Testing,
which are applicable to any type of application.

Test Scenario 1 :- Verify spelling in every screen

Test Scenario 2 :- Verify error msg. meaning

Test Scenario 3 :- Verify initial capitalization (Int.Cap) of labels in every screen
                   (see the sketch after this list)

Test Scenario 4 :- Verify color uniqueness throughout the screens

Test Scenario 5 :- Verify font or style uniqueness throughout the screens

Test Scenario 6 :- Verify size uniqueness throughout the screens

Test Scenario 7 :- Verify alignment of objects in every screen

Test Scenario 8 :- Verify line spacing uniqueness throughout the screens

Test Scenario 9 :- Verify Tool Tips of icons in every screen.

Test Scenario 10 :- Verify default object in every screen.

Test Scenario 11 :- Verify Uniform Background colors of objects in every screen.

Test Scenario 12 :- Verify scroll bars when the screen size is greater than the desktop

Test Scenario 13 :- Verify keyboard accessing of every object in every screen

Test Scenario 14 :- Verify abbreviations & Short cuts in screens

Test Scenario 15 :- Verify Multiple Data Object positions in every screen.

                   Ex : List Box, Menu, Table … etc.,

Test Scenario 16 :- Verify Help Messages (Manual Support Testing)

Test Scenario 17 :- Verify Functionally Grouped Objects in every screen.

Test Scenario 18 :- Verify borders of functionally grouped objects in every screen

Test Scenario 19 :- Verify Labels of objects with respect to Functionality

Test Scenario 20 :- Verify Window Labels with respect to Functionality
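
Many of these scenarios are verified manually, but some can be partially automated.
The following is a hedged Python sketch for Test Scenario 3, assuming the labels of a
screen are available as plain strings (all names are illustrative, not a real UI API):

    def has_initial_caps(label):
        """True if every word of the label starts with an upper-case letter."""
        return all(word[0].isupper() for word in label.split() if word[0].isalpha())

    # Example labels collected from a hypothetical screen
    labels = ["User ID", "Book ID", "go"]
    violations = [lbl for lbl in labels if not has_initial_caps(lbl)]
    print(violations)  # ['go'] -> this label fails the Int.Cap usability scenario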


4. Functional and System Specification Based Test Design :

       After completion of test scenario selection for Functional and Usability Testing,
the test engineers concentrate on test scenario selection for Non-Functional Testing,
depending on the functional and system specifications in the SRS.

       Functional specifications describe the required functionalities of the software,
and system specifications describe the required environment in which it is to be used.
    BRS
      ↓
    SRS (Functional Specifications +
         System Specifications) ──────→ Test Scenarios
      ↓                                      ↓
    HLD & LLDs                          Test Cases
      ↓                                      ↓
    Coding (UT & IT)                         ↓
      ↓                                      ↓
    S/w Build ←─────── System Test Execution

Example Test Scenarios for Compatibility Testing :

Test Scenario 1 :    Verify Login in Win NT with Customer expected configuration
Test Scenario 2 :    Verify Login in Win 2000 with Customer expected configuration
Test Scenario 3 :    Verify Login in Win Vista with Customer expected configuration
And more…

Example Test Scenarios for Performance Testing :

Test Scenario 1 :    Verify Login Under Customer expected Load and Configuration
Test Scenario 2 :    Verify Login Under more than Customer expected configuration
And more….

Example Test Scenarios for Installation Testing :

Test Scenario 1 :    Verify Setup Program to Start Installation.
Test Scenario 2 :    Verify Interface easiness during Installation
Test Scenario 3 :    Verify occupied disk space after Installation
And more…

Test Case Format :

       After completion of test scenario selection for the responsible areas in terms of
Functional, Usability and Non-Functional Testing, the test engineers implement them as
test cases. Test engineers use the IEEE (Institute of Electrical and Electronics
Engineers) 829 test case format.

1. Test Case ID                  : Unique number / name for future reference
2. Test Case Name                : The corresponding test scenario
3. Feature to be Tested          : The name of the corresponding module or functionality
4. Test Suite ID                 : The unique number or name of a test batch; this case
                                   is a member of that batch
5. Priority                      : The importance of this test case (P0 priority for
                                   functional test cases, P1 priority for non-functional
                                   test cases and P2 priority for usability test cases)
6. Test Environment              : The required hardware and software to execute this
                                   test
7. Test Effort                   : Person-hours (e.g., 20 min is an average test
                                   execution time)
8. Test Duration                 : The date and time to execute this test
9. Test Setup                    : The necessary tasks to do before starting this test
                                   execution
10. Test Procedure / Data Matrix :

 Step    Action /      Required    Expected    Actual    Result    Defect    Comments
 No.     Task event    I/p         O/p         O/p                 Id

         (Columns up to Expected O/p are filled during Test Design;
          the remaining columns are filled during Test Execution.)

 Data Matrix :

                     ECP (Type)              BVA (Range / Size)
  I/p Object
                  Valid     Invalid           Min         Max
11. Test Case Pass / Fail Criteria :    The Final result of this Test Case after execution


Note 1 :     In general the test engineers are not interested in filling all fields in the
             test case format, due to lack of time and the similarity of field values
             across test cases.


Note 2 :     The test engineers use a test procedure for operational test cases and a
             data matrix for input-object test cases.
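
Where teams do capture the full format, the fields map naturally onto a simple record
so that repeated values can be defaulted once. A Python sketch (field names mirror the
IEEE 829 format above; the class itself is illustrative, not a standard API):

    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        case_id: str                     # 1. unique number / name
        name: str                        # 2. corresponding test scenario
        feature: str                     # 3. module or functionality
        suite_id: str                    # 4. test batch this case belongs to
        priority: str = "P0"             # 5. P0 functional, P1 non-functional, P2 usability
        environment: str = "default"     # 6. required hardware and software
        effort_minutes: int = 20         # 7. average execution time
        setup: str = ""                  # 9. tasks to do before execution
        steps: list = field(default_factory=list)  # 10. test procedure / data matrix

    tc = TestCase("TC_Lib_1", "Verify User ID", "Book Issue", "TS_BookIssue")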


Functional Specification :
         In a banking application, valid employees create fixed deposit operations with
depositor-provided information. In this fixed deposit operation, the employees fill in
the fields below.
Depositor Name : Alphabets in lower case with Int.Cap; allows multiple words in a name
Amount : 1500 to 1,00,000
Time : Up to 12 months
Interest : Numeric with one decimal
If the Time > 10 months, then the Interest > 10% as per bank rules
Prepare Test Scenarios and Test Cases :


Test Scenario 1 :    Verify Depositor Name
Test Scenario 2 :    Verify Amount
Test Scenario 3 :    Verify Time
Test Scenario 4 :    Verify Interest
Test Scenario 5 :    Verify Fixed Deposit Operation
Test Scenario 6 :    Verify Fixed Deposit Operation with Bank Rule


Test Case Documents :
Test Case 1 :-
1. Test Case ID               : TC_FD_Ravi_24th May_1
2. Test Case Name             : Verify Depositor Name
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Depositor Name object is ready to take inputs
6. Data Matrix                :

                          ECP (Type)                                BVA (Size)
 I/p Object
                  Valid            Invalid                       Min        Max

 Depositor Name   ([A-Z][a-z]*)*   0-9, Spl.Char, Blank Field    1 Char     256 Char


Test Case 2 :-
1. Test Case ID               : TC_FD_Ravi_24th May_2
2. Test Case Name             : Verify Amount
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Amount object is ready to take inputs
6. Data Matrix                :

                     ECP (Type)                                  BVA (Range)
 I/p Object
              Valid         Invalid                           Min        Max

 Amount       0-9           a-z, A-Z, Spl.Char, Blank Field   1500       100000


Test Case 3 :-
1. Test Case ID                 : TC_FD_Ravi_24th May_3
2. Test Case Name               : Verify Time
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Time object is ready to take inputs
6. Data Matrix                :

                     ECP (Type)                                  BVA (Range)
 I/p Object
              Valid         Invalid                           Min        Max

 Time         0-9           a-z, A-Z, Spl.Char, Blank Field   1 Month    12 Months


Test Case 4 :-
1. Test Case ID                 : TC_FD_Ravi_24th May_4
2. Test Case Name               : Verify Interest
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Interest object is ready to take inputs
6. Data Matrix                :

                          ECP (Type)                                        BVA (Range)
 I/p Object
              Valid                          Invalid                        Min     Max

 Interest     [0-9].[0-9] (numeric with     a-z, A-Z, Spl.Char,             0.1     100
              one decimal)                  Blank Field
Test Case 5 :-
1. Test Case ID                 : TC_FD_Ravi_24th May_5
2. Test Case Name               : Verify Fixed Deposit Operation
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Valid values are available at hand
6. Test Procedure             :

Step No.    Action                        Required I/p            Expected O/p

   1.       Connect to Bank Server        Valid Emp Id            Menu appears

   2.       Select “FD” Option            None                    Fixed Deposit form opened

   3.       Fill fields and click OK      All valid               Acknowledgement
                                          Any one invalid         Error msg.
                                          Any one blank field     Error msg.

Test Case 6 :-
1. Test Case ID                 : TC_FD_Ravi_24th May_6
2. Test Case Name               : Verify Fixed Deposit Operation with Bank Rule
3. Test Suite ID              : TS_FD
4. Priority                   : P0
5. Test Setup                 : Valid values are available at hand
6. Test Procedure             :

Step No.    Action                        Required I/p                       Expected O/p

   1.       Connect to Bank Server        Valid Emp Id                       Menu appears

   2.       Select “FD” Option            None                               Fixed Deposit form opened

   3.       Fill fields and click OK      Valid Name, Amount, Time > 10      Acknowledgement
                                          with Interest > 10

                                          Valid Name, Amount, Time > 10      Error msg.
                                          with Interest <= 10
As in the above example, the test engineers implement test scenarios as test cases.
Every test case is a combination of the corresponding test scenario and the details
required to apply that test on the s/w build.
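
The validation rules behind Test Cases 1-6 can be summarized in one routine. A minimal
Python sketch, assuming the field formats from the functional specification above
(function and message names are illustrative):

    import re

    NAME_PATTERN = re.compile(r"^[A-Z][a-z]*( [A-Z][a-z]*)*$")  # Int.Cap words

    def validate_fixed_deposit(name, amount, time_months, interest):
        if not NAME_PATTERN.match(name):
            return "Error Msg."              # invalid depositor name (TC 1)
        if not 1500 <= amount <= 100000:
            return "Error Msg."              # amount out of 1500..1,00,000 (TC 2)
        if not 1 <= time_months <= 12:
            return "Error Msg."              # time beyond 12 months (TC 3)
        if not 0.1 <= interest <= 100 or round(interest, 1) != interest:
            return "Error Msg."              # not numeric with one decimal (TC 4)
        if time_months > 10 and interest <= 10:
            return "Error Msg."              # bank rule (TC 6)
        return "Acknowledgement"             # all fields valid (TC 5)

    assert validate_fixed_deposit("Ravi Kumar", 5000, 11, 10.5) == "Acknowledgement"
    assert validate_fixed_deposit("Ravi Kumar", 5000, 11, 9.5) == "Error Msg."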


Test Cases Selection Review :
        After completion of test scenario and test case writing, the Test Lead and Test
Engineers conduct a review meeting to estimate the completeness and correctness of
those documents. In this review the testing team depends on the coverages below.
       □ Requirements Oriented Coverage (Modules)
       □ Testing Topic Oriented Coverage (UT, FT, NFT)


IV. Test Execution :-
       After completion of test design and review, the testing team concentrates on the
issues below.
       □ Formal meeting with developers
       □ Test Environment Establishment
       □ Levels of Test Execution


□ Formal Meeting :-
       In general the test execution process starts with a formal meeting between the
testing team and development team representatives. In this meeting the representatives
concentrate on build version control and defect tracking.
        Under the build version control concept, the development team modifies the s/w
build coding to resolve defects and releases the modified build with a unique version
number. This version numbering system enables the test engineers to distinguish the old
build from the modified build. For version controlling, the developers also use version
control tools. (Ex : VSS (Visual SourceSafe))
       To report mismatches to the development team, the test engineers first report
each mismatch to the Defect Tracking Team (DTT).
Test Lead + Project Manager + Project Lead + Business Analyst   →   DTT
□ Test Environment Establishment :-
       After completion of the formal meeting, the testing team concentrates on test
environment establishment with all required hardware and software.


                                      SERVER
                            (Configuration Repository)
                 TCP/IP ↙           TCP/IP ↓           TCP/IP ↘
                   FTP                FTP                 FTP

   Development Environment      Test Environment     Project Management
FTP : File Transfer Protocol (single location)
TCP/IP : Transmission Control Protocol / Internet Protocol (different location(s))


□ Levels of Test Execution:-
         Development                                         Testing

                           Initial Build
                                                           Level-0 (Sanity)
                           Stable Build

                          Defect Report                    Level-1 (Comprehensive)

   Defect                Modified Build
   Fixing                                                  Level-2 (Regression)

                                                           Level-3 (Final Regression)
Case Study :-
                                       Initial Build
                                             ↓
                                 Sanity Testing (Level-0)
                                             ↓
                                       Stable Build
                                             ↓
                                Comprehensive (Level-1)
                                             ↓
                                    Defect Detection
                                             ↓
                                     Modified Build
                                             ↓
                                Regression Test (Level-2)
                                             ↓
                                      Defect Closing
                                             ↓
                                       Master Build
                                             ↓
                                Final Regression (Level-3)
                                             ↓
                              Golden Build (Able to Release)

□ Levels of Test Execution Vs Test Cases :-

       Level -0 → Some P0 (Functional) Test Cases
       Level–1 → All P0,P1&P2 Test Cases
       Level-2 → Selected P0,P1&P2 Test Cases with respect to modification
       Level-3 → Selected P0,P1&P2 Test Cases with respect to Defect Density


□ Level-0 Sanity Testing :-
       After downloading the initial build from the configuration repository on the
server, the testing team concentrates on Level-0 sanity testing to estimate the
testability of that software. Testability means the build is understandable, operable,
observable, controllable, consistent, simple, maintainable and automatable.
       If the initial build is not stable, the testing team sends the build back to the
developers. If the build is stable, the test engineers concentrate on Level-1 test
execution to detect defects. This Level-0 testing is also known as Sanity Testing /
Smoke Testing / Testability Testing / Tester Acceptance Testing / Build Verification
Testing or Octangle Testing.
□ Level-1 Comprehensive / Real Testing :-
        In this Level-1 test execution, the test engineers execute all test cases as
batches. Every test batch consists of a set of dependent test cases; in these batches
the end state of one test case is the base state of the next. Test batches are also
known as a Test Suite, Test Set, Test Build or Test Chain.

  Receive stable build from developers
            ↓
  Make test cases as batches
            ↓
  Select a batch ←──────────────── (next batch)
            ↓
  Select a test case ←──────────── (next case)
            ↓
  Take a step in the case
            ↓
  Step Expected = Actual in Build? ── Yes → next step / next case / next batch
            │
            No
            ↓
  Defect Reporting
        Following the above diagram, the test engineers continue test execution batch
by batch, and case by case within every batch. If a test case step's expected value is
not equal to the actual value, the test engineer concentrates on defect reporting; if
possible, they continue test execution as well. A sketch of this loop follows.
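
A minimal Python sketch of that loop, assuming each test case is a list of steps that
expose an expected value and a callable producing the actual value (all names are
illustrative):

    def run_level1(batches, report_defect):
        """Run batches case by case; on a mismatch, report a defect and,
        where possible, continue with the remaining steps and cases."""
        log = []
        for batch in batches:
            for case in batch:                  # end state of one case is the
                result = "Passed"               # base state of the next
                for step in case["steps"]:
                    expected, actual = step["expected"], step["run"]()
                    if expected != actual:
                        report_defect(case["id"], step)
                        result = "Failed"
                log.append((case["id"], result))
        return log
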
       In this Level-1 test execution, the test engineers prepare a Test Log document
to record test results.
Test Log Document Format :-

 Test Case    Result           Defect    Executed    Executed
                                                                  Comments
    ID        (Pass / Fail)    ID        By          On


There are three types of test results.
→ Passed : all expected values are equal to the actual values
→ Failed : any one expected value is not equal to the actual value
→ Blocked : test execution postponed due to incorrect parent functionality
V. Defect Reporting & Tracking :-
       During Level-1 test execution, some test case expected values are not equal to
the actual values. These mismatches are called Defects / Issues / Bugs / Flaws.
Defect Report :-
1. Defect ID              :   Unique No. or Name
2. Description            :   Summary of the mismatch between the tester's expected
                              value and the build's actual value
3. Build Version ID       :   The version number of Current Build
                              (The Test Engineers detected this defect in that Build)
4. Feature                :   The Name of Module or Functionality
                              (Test Engineers detected this defect in that Module)
5. Test Case ID           :   The ID of failed test case
                              (Test Engineers detected this defect in that case
                              execution)
6. Reproducible           :   Yes → Defect appears every time in Test Execution
                              No → Defect appears rarely in Test Execution
7. If Yes, attach procedure :
8. If No, attach procedure and screen shots :
9. Severity                 : The seriousness of defect in terms of Functionality
                               High / Critical :- Not able to continue testing without
                               resolving.
                               Medium / Major :- Able to Continue Testing but
                               Compulsory / Mandatory to resolve
                               Low / Minor :- Able to continue, May or May Not to
                               resolve.
10. Priority                : The importance of defect to solve in-terms of customer
                               interest. (High / Medium / Low)
11. Detected By             : The name of the Test Engineer
12. Detected On             : The date of detection and submission
13. Status                  : New : Reporting first time
                               Re-Open : Re-Reporting
14. Assigned to             : Report to Tracking Team
15. Suggested Fix           : Suggestion to Solve that Defect.
    (Optional)
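
For illustration, the same fields can be captured as a record when reporting to the
DTT; a Python sketch with assumed (not standard) field names:

    from dataclasses import dataclass

    @dataclass
    class DefectReport:
        defect_id: str            # 1. unique number or name
        description: str          # 2. expected vs. actual mismatch summary
        build_version: str        # 3. build in which the defect was detected
        feature: str              # 4. module or functionality
        test_case_id: str         # 5. failed test case
        reproducible: bool        # 6. appears every time (True) or rarely (False)
        severity: str             # 9. High / Medium / Low (functional seriousness)
        priority: str             # 10. High / Medium / Low (customer interest)
        detected_by: str          # 11. test engineer name
        detected_on: str          # 12. date of detection and submission
        status: str = "New"       # 13. New on first report, Re-Open on re-report
        assigned_to: str = "DTT"  # 14. reported to the tracking team first
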
Defect Reporting Process :

  Test engineer reports defect to DTT as “New”
            ↓
  DTT analyzes the defect
            ↓
  Accepted? ── No → defect status changed to “Rejected”
            │
           Yes
            ↓
  Categorize the defect and change status to “Open”
            ↓
  Data related defect?          ── Yes → assigned to Testing Team
            │ No
            ↓
  Procedure related defect?     ── Yes → assigned to Testing Team
            │ No
            ↓
  H/w or infrastructure defect? ── Yes → assigned to H/w Team
            │ No
            ↓
  Code related defect is assigned to Development Team

Case Study :-

  Test Engineer ──report defect──→ Defect Tracking Team ──assigned──→ Project Lead + Programmers
                                   (code related defect)

  Test Engineer ──report defect──→ Defect Tracking Team ──assigned──→ BA + TL + TE
                                   (test case procedure & test data related defect)

  Test Engineer ──report defect──→ Defect Tracking Team ──assigned──→ H/w or Infrastructure Team
                                   (H/w or environment related defect)
Defect Life Cycle or Bug Life Cycle :

      New
       ↓
    Assigned               Reject            Deferred
       ↓
     Open
       ↓
     Fixed
                             Reopen
       ↓
      Closed



New        :   Reported for the first time
Assigned   :   Accepted by the DTT
Reject     :   Not accepted by the DTT
Deferred   :   Accepted, but not taken up for solving due to low severity and low priority
Open       :   The responsible team is ready to resolve
Fixed      :   The responsible team resolved the defect
Reopen     :   Defect not correctly solved (or) re-reported
Closed     :   Defect correctly solved and confirmed through Regression Testing
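
A sketch of the life cycle as a transition table in Python; the allowed moves mirror
the diagram above, and the placement of Deferred (branching after acceptance) is an
assumption:

    # Allowed status transitions in the defect life cycle above
    TRANSITIONS = {
        "New":      {"Assigned", "Rejected"},
        "Assigned": {"Open", "Deferred"},   # assumed: Deferred branches after acceptance
        "Open":     {"Fixed"},
        "Fixed":    {"Closed", "Reopen"},   # Reopen when the fix is not correct
        "Reopen":   {"Open"},
    }

    def move(status, new_status):
        if new_status not in TRANSITIONS.get(status, set()):
            raise ValueError("illegal transition %s -> %s" % (status, new_status))
        return new_status

    status = "New"
    for step in ("Assigned", "Open", "Fixed", "Closed"):
        status = move(status, step)   # New -> Assigned -> Open -> Fixed -> Closed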

Test Data Related Defect Fixing :
             If our reported defect is accepted by the Defect Tracking Team (DTT) and
they decide it is a test data related mismatch, the responsible testing team
concentrates on Correct Data Collection (CDC), without conceptual gaps, with the help
of the BA and TL. The test engineers then re-execute the previously failed test on the
same build with the correct test data. This test repetition is called Retesting or
Confirmation Testing.
  Test Case ──Testing──→ Build ──Failed──→ Defect Reporting
                                                  ↓
                                          Data Related Defect
                                                  ↓
  Build ←── Repeat test case with correct data ←── Collect Correct Data

                   Retesting / Confirmation Testing
Test Script or Procedure Related Defect Fixing :

           If our reported defect is accepted as a test procedure related defect by the
DTT, the responsible testing team prepares the correct procedure for that test case
with the help of the TL and BA.

  Test Case ──Testing──→ Build ──Failed──→ Report to DTT
                                                  ↓
                                        Procedure Related Defect
                                                  ↓
  Build ←── Repeat test case with ←── Correct test procedure
            correct procedure          prepared by Test Engineers

                   Retesting / Confirmation Testing

Infrastructure Related Defect Fixing :

            If our reported defect is accepted by the DTT as an environment,
infrastructure or hardware related defect, the responsible hardware team re-establishes
the correct test environment.
  Test Case ──Testing──→ Build ──Failed──→ Report to DTT
                                                  ↓
                                       Environment Related Defect
                                                  ↓
  Build ←── Repeat test case in ←── Re-establish test environment
            modified environment      by H/w Team

                   Retesting / Confirmation Testing
Code Related Defect Fixing :-
         If our reported defect is accepted as a code related defect, the responsible
programmers / developers perform changes in the build coding to resolve that defect.
  PL updates the defect status to “Open”
            ↓
  Impact analysis by programmers
            ↓
  Selected coding areas reviewed by PL
            ↓
  Changes required in documents? ── Yes → changes by the concerned person (BA / Design)
            │ No                              ↓
            │                  review of document changes by BA / Designers & Project Lead
            ↓                                 ↓
  Changes in coding by programmers ←──────────┘
            ↓
  Unit test & make modified build
            ↓
  Release modified build with unique version number and release note
            ↓
  PL changes defect status to “Fixed”



           After receiving the modified build from the development team, the testing
team concentrates on Retesting & Regression Testing.

  Test Cases ──→ Build
                   ├── Failed test ──report defect──→ DTT ──code related defect──→ Programmers
                   │                                                                    ↓
                   └── Related passed tests ──────────────────────→ Modified Build ──→ Passed
           From the above model, the test engineer re-executes the previously failed
test on the modified build to confirm the defect fixing; this is called Retesting or
Confirmation Testing. To identify side effects of the defect-fixing modifications in
the modified build, the test engineers re-execute previously passed related tests on
that modified build; this is called Regression Testing.

Level-2 Regression Testing :

                 Take modified build and release note
                               ↓
      Identify the severity of the fixed defect in that modified build
                               ↓
    High                      Medium                       Low

 All P0                    All P0                      Some P0
 All P1                    Carefully selected P1       Some P1
 Carefully selected        and some P2 test cases      Some P2 test cases
 P2 cases
                               ↓
      Run on that modified build to detect side effects in the build with
           respect to the modifications specified in the release note

Case 1 :-
            If the severity of the defect fixed by the development team is High, the
test engineers repeat all P0, all P1 and carefully selected P2 test cases on the
modified build w.r.t. the modifications specified in the release note.
Case 2 :-
            If the severity of the fixed defect is Medium, the test engineers repeat
all P0, carefully selected P1 and some P2 test cases on the modified build w.r.t. the
modifications specified in the release note.
Case 3 :-
            If the severity of the fixed defect is Low, the test engineers repeat some
P0, some P1 and some P2 test cases on the modified build w.r.t. the modifications
specified in the release note.
Case 4 :-
            If the development team releases a modified build w.r.t. changes in
customer requirements, the test engineers re-execute all P0, all P1 and carefully
selected P2 test cases on the modified build w.r.t. those changes. In this case the
test engineers also update the test scenarios and test cases w.r.t. the changes in
customer requirements.
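
A Python sketch of the Case 1-3 selection rules, assuming each test case carries a
priority tag and a flag marking whether it relates to the modification (here
"carefully selected" / "some" is approximated as "related to the change"):

    def select_regression_cases(cases, fixed_severity):
        """Pick Level-2 regression cases per Cases 1-3 above."""
        def related(c):
            return c["related_to_change"]

        if fixed_severity == "High":        # all P0, all P1, carefully selected P2
            return [c for c in cases
                    if c["priority"] in ("P0", "P1") or related(c)]
        if fixed_severity == "Medium":      # all P0, selected P1, some P2
            return [c for c in cases
                    if c["priority"] == "P0" or related(c)]
        return [c for c in cases if related(c)]   # Low: some P0, P1 and P2
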
VI. Test Closure :-
        After completion of all reasonable tests and the closing of detected defects,
the test lead conducts a review meeting to stop testing. In this review the TL analyzes
the factors below with the involvement of the test engineers.

1. Coverage Analysis :-
       → Requirements Oriented Coverage (Module)
       → Testing Topic Related Coverage (Usability, Functional, Non-Functional)
2. Defect Density Calculation : (see the sketch after this list)
       Ex :
            Modules / Requirement                    %
                        A                           20%
                        B                           20%
                        C                           40% (needs Regression Testing)
                        D                           20%
                      Total                        100%

3. Analysis of Deferred Defects :
       Are the deferred defects properly postponed or not?
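
A Python sketch of the defect density calculation referenced in item 2, assuming each
defect record is tagged with the module in which it was found (names are illustrative):

    from collections import Counter

    def defect_density(defects):
        """Percentage of total defects per module / requirement."""
        counts = Counter(d["module"] for d in defects)
        total = sum(counts.values())
        return {m: round(100 * n / total) for m, n in counts.items()}

    defects = [{"module": m} for m in "AABBCCCCDD"]
    print(defect_density(defects))  # {'A': 20, 'B': 20, 'C': 40, 'D': 20}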

Level-3 Final Regression Testing :

       After completion of a successful test closure review, the testing team
concentrates on Level-3 or Final Regression Testing.

  Identify high defect density module(s)
            ↓
  Effort estimation (person / hour)
            ↓
  Plan regression
            ↓
  Regression testing
            ↓
  Golden defect reporting, if required

VII. User Acceptance Testing (UAT) :
           After completion of Final Regression Testing, the project management
concentrates on User Acceptance Testing to collect feedback from real customers /
model customers.
           There are two ways of conducting User Acceptance Testing: Alpha Testing and
Beta Testing.

VIII. Sign Off :

           After completion of successful User Acceptance Testing and the corresponding
modifications, the Test Lead prepares the Final Test Summary Report and releases the
corresponding test engineers from the project. The Final Test Summary Report is a
combination of the documents below.
           → Test Strategy / Methodology
           → Test Plan(s)
           → Test Scenarios
           → Test Cases
           → Test Logs
           → Defect Reports
           → Requirements Traceability Matrix

 Requirement   Test Case   Result           Detected     Status                  Comments
 ID            ID          (Pass / Fail)    Defect ID    (Closed / Deferred)

It is a mapping between requirements and defects via test cases.
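
A Python sketch of building that mapping from test log rows, assuming each row carries
a requirement ID, a test case ID and any detected defect ID (all names are
illustrative):

    def build_rtm(rows):
        """Group test cases and defects by requirement."""
        rtm = {}
        for r in rows:
            entry = rtm.setdefault(r["requirement_id"],
                                   {"cases": set(), "defects": set()})
            entry["cases"].add(r["test_case_id"])
            if r.get("defect_id"):
                entry["defects"].add(r["defect_id"])
        return rtm

    rows = [
        {"requirement_id": "REQ_1", "test_case_id": "TC_1", "defect_id": "D_7"},
        {"requirement_id": "REQ_1", "test_case_id": "TC_2", "defect_id": None},
    ]
    print(build_rtm(rows))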

Case Study (5 Months of Testing Process) :-

        Deliverable                      Responsibility                      Duration
 Test Strategy                   PM / TM                                     4-5 days
 Test Planning                   Test Lead                                   4-5 days
 Requirements Training to
                                 BA + Domain / Subject Experts               5-10 days
 Test Engineers
 Test Scenarios & Review         Test Engineer                               5-10 days
 Test Cases Implementation       Test Engineer                               10-15 days
 Review Build + Level-0
                                 Test Engineer                               2-3 days
 (Sanity Testing)
 ** Test Automation              Test Engineer                               10-15 days
 Level-1 and Level-2
                                 Test Engineer                               30-40 days
 Test Execution
 Defect Reporting                Test Engineer                               Ongoing (same day)
 Status Reporting                Test Lead                                   Weekly, twice
 Test Closure & Level-3          Test Lead & Test Engineer                   5-10 days
 User Acceptance Testing         Real / Model Customers in front of          3-5 days
                                 Developers and Testers
 Sign Off                        Test Lead                                   1-2 days


                                        W-Model

  Development                 System Testing (Manual)       Test Automation

  Req. Analysis               N.F.T                         LoadRunner & JMeter
  S/w Design                  F.T                           WinRunner / QTP / Robot / Silk
  Coding + Unit Testing       Usability Testing             No tools in the market
  Integration Testing
        ↓
      Build                                  Note : Test Automation is optional

       From the above W-Model, testing tools are available for Functional Testing and
for some Non-Functional Testing, such as Endurance Testing and Data Volume Testing.
       The remaining Non-Functional tests and Usability Testing are conducted manually
by test engineers.
Win Runner 8.0 :

      Developed by Mercury Interactive and taken over by Hewlett-Packard (HP)
      Functional testing tool
      This version was released in January 2005
      Supports VB, .Net, Java, Power Builder, HTML, Delphi, VC++, D2K and Siebel
      technology software for Functional Testing
      To support SAP, PeopleSoft, XML, Multimedia and Oracle Applications
      (“ERPs”), in addition to the above technologies, test teams use Quick Test
      Professional (QTP)
      Win Runner runs on Windows only
      X-Runner is used for Unix / Linux

Win Runner Test Process :

               Receive Stable Build From Developers after Sanity Testing
                                            ↓
      Identify Functional Test Cases (Priority P0) to Automate (English + Manual)
                                            ↓
           Create Automation Programs (TSL) for that Functional Test Cases
                                            ↓
                     Run the programs on the S/w Build to detect defects
                                            ↓
                               Test Reporting if required

       Following the above approach, the test engineers convert manual functional
test cases into Test Script Language (TSL) programs.
       TSL is a “C”-like language.

Add-in Manager :
        This window lists all the technologies Win Runner supports under the current
license. Test engineers select the current project's technology from that list.

Welcome Screen :

      After a successful Win Runner launch, the Welcome Screen appears on the
desktop. The screen consists of three options:
   → Create a New Test
   → Open an Existing Test
   → A Quick Preview of Win Runner
Win Runner Icons :
        Start Recording
 ↓      Run From Top
 →      Run From Arrow
        Stop Recording
        Pause (Stop Run)


Win Runner Test Automation Frame Works :
       Win Runner 8.0 allows you to convert manual functional test cases into Test
Script Language (TSL) programs in 4 ways:

   →   Record and Playback Frame Work
   →   Data Driven Frame Work
   →   Keyword Driven Frame Work
   →   Hybrid Frame Work

I. Record & Playback Frame Work :

      In this frame work the test engineers convert manual test cases into automation
programs using a two-step procedure.

   A. Recording Operations
   B. Inserting Check Points

A. Recording Operations :-

       While creating a test automation program, the test engineers record s/w build
operations. There are two recording modes: Context Sensitive mode and Analog mode.
       In Context Sensitive mode, the tool records mouse and keyboard operations with
respect to the objects and windows in the build. To select this mode the test engineers
use the options below.
using below options.

       Click “Start Recording” icon Once
       Test Menu → Record Context Sensitive Option.

       To record mouse pointer movements with respect to desktop co-ordinates, test
engineers use Analog mode in Win Runner. To select this mode we can use the options
below.

          Click the “Start Recording” icon twice
          Test Menu → Record Analog
Ex :-
          Digital signatures, graph drawing and image movements.

“F2” is the shortcut key to switch from one mode to the other.

Note :-
       In Analog mode, Win Runner records mouse pointer movements with respect to
desktop co-ordinates. For this reason, the test engineers must not change the
corresponding window position or the monitor resolution between recording and playback.

B. Inserting Check Points :

       After recording build operations, the test engineers insert check points with
respect to their expectations. Every check point compares a test-engineer-given
expected value with the build's actual value. There are four check points in Win Runner.

          GUI (Graphical User Interface) Check Point
          Bitmap Check Point
          Database Check Point
          Text Check Point

   GUI (Graphical User Interface) Check Point :

       To verify properties of Objects, we can use this check point. It consists of 3 sub
options.
     i. For Single Property
    ii. For Object / Window
   iii. For Multiple Objects

i. For Single Property :-
        To verify one property of one object we can use this option.

Ex.-1 :
Test Procedure :-
Step No.    Action                              Required I/p        Expected O/p

   1        Open an order in the Flight         Valid Order No.     Delete Order button
            Reservation window                                      “enabled”
Build :- Flight Reservation Window

Automation Program :-

        set_window (“Flight Reservation”, 1);               # activate the Flight Reservation window
        menu_select_item (“File;Open Order...”);            # select File > Open Order
        set_window (“Open Order”, 1);                       # activate the Open Order dialog
        button_set (“Order No.”, ON);                       # select the Order No. radio button
        edit_set (“Edit”, “1”);                             # enter order number 1
        button_press (“OK”);                                # confirm; the order opens
        set_window (“Flight Reservation”, 1);               # focus back on the main window
        button_check_info (“Delete Order”, “enabled”, 1);   # check point: Delete Order enabled = 1

Ex.-2 :
Test Procedure :-
Step No.    Action                              Required I/p        Expected O/p

   1        Open an order in the Flight         Valid Order No.     Insert Order button
            Reservation window                                      “disabled”

Build :- Flight Reservation Window

Automation Program :-

        set_window (“Flight Reservation”, 1);               # activate the Flight Reservation window
        menu_select_item (“File;Open Order...”);            # select File > Open Order
        set_window (“Open Order”, 1);                       # activate the Open Order dialog
        button_set (“Order No.”, ON);                       # select the Order No. radio button
        edit_set (“Edit”, “1”);                             # enter order number 1
        button_press (“OK”);                                # confirm; the order opens
        set_window (“Flight Reservation”, 1);               # focus back on the main window
        button_check_info (“Insert Order”, “enabled”, 0);   # check point: Insert Order enabled = 0 (disabled)

Note :- TSL is a case-sensitive language and it uses the # symbol for comments.
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods
Software Development Lifecycle (SDLC) Stages and Testing Methods

Mais conteúdo relacionado

Mais procurados

Software testing principles
Software testing principlesSoftware testing principles
Software testing principlesDonato Di Pierro
 
Software Quality Assurance in software engineering
Software Quality Assurance in software engineeringSoftware Quality Assurance in software engineering
Software Quality Assurance in software engineeringMuhammadTalha436
 
Chapter 13 software testing strategies
Chapter 13 software testing strategiesChapter 13 software testing strategies
Chapter 13 software testing strategiesSHREEHARI WADAWADAGI
 
Software Testing
Software TestingSoftware Testing
Software TestingSengu Msc
 
Object oriented testing
Object oriented testingObject oriented testing
Object oriented testingHaris Jamil
 
verification and validation
verification and validationverification and validation
verification and validationDinesh Pasi
 
Software Quality Assurance
Software Quality AssuranceSoftware Quality Assurance
Software Quality AssuranceSaqib Raza
 
Hm system programming class 1
Hm system programming class 1Hm system programming class 1
Hm system programming class 1Hitesh Mohapatra
 
Mini project in java swing
Mini project in java swingMini project in java swing
Mini project in java swingvarun arora
 
Niyati_Manual_Testing_ISTQB_Certified_Resume
Niyati_Manual_Testing_ISTQB_Certified_ResumeNiyati_Manual_Testing_ISTQB_Certified_Resume
Niyati_Manual_Testing_ISTQB_Certified_ResumeNiyati Madad
 
Software testing ppt
Software testing pptSoftware testing ppt
Software testing pptSavyasachi14
 
Cocomo model
Cocomo modelCocomo model
Cocomo modelMZ5512
 
Bug reporting and tracking
Bug reporting and trackingBug reporting and tracking
Bug reporting and trackingVadym Muliavka
 
Software testing and process
Software testing and processSoftware testing and process
Software testing and processgouravkalbalia
 
Software myths | Software Engineering Notes
Software myths | Software Engineering NotesSoftware myths | Software Engineering Notes
Software myths | Software Engineering NotesNavjyotsinh Jadeja
 

Mais procurados (20)

Software testing principles
Software testing principlesSoftware testing principles
Software testing principles
 
Unit testing
Unit testing Unit testing
Unit testing
 
Software Quality Assurance in software engineering
Software Quality Assurance in software engineeringSoftware Quality Assurance in software engineering
Software Quality Assurance in software engineering
 
Chapter 13 software testing strategies
Chapter 13 software testing strategiesChapter 13 software testing strategies
Chapter 13 software testing strategies
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
Software testing
Software testingSoftware testing
Software testing
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
Object oriented testing
Object oriented testingObject oriented testing
Object oriented testing
 
verification and validation
verification and validationverification and validation
verification and validation
 
Software Quality Assurance
Software Quality AssuranceSoftware Quality Assurance
Software Quality Assurance
 
Hm system programming class 1
Hm system programming class 1Hm system programming class 1
Hm system programming class 1
 
Mini project in java swing
Mini project in java swingMini project in java swing
Mini project in java swing
 
Srs
SrsSrs
Srs
 
Niyati_Manual_Testing_ISTQB_Certified_Resume
Niyati_Manual_Testing_ISTQB_Certified_ResumeNiyati_Manual_Testing_ISTQB_Certified_Resume
Niyati_Manual_Testing_ISTQB_Certified_Resume
 
Software testing ppt
Software testing pptSoftware testing ppt
Software testing ppt
 
Cocomo model
Cocomo modelCocomo model
Cocomo model
 
Bug reporting and tracking
Bug reporting and trackingBug reporting and tracking
Bug reporting and tracking
 
software quality
software qualitysoftware quality
software quality
 
Software testing and process
Software testing and processSoftware testing and process
Software testing and process
 
Software myths | Software Engineering Notes
Software myths | Software Engineering NotesSoftware myths | Software Engineering Notes
Software myths | Software Engineering Notes
 

Semelhante a Software Development Lifecycle (SDLC) Stages and Testing Methods

Manual Testing Notes
Manual Testing NotesManual Testing Notes
Manual Testing Notesguest208aa1
 
Bridge Process Model
Bridge Process ModelBridge Process Model
Bridge Process ModelStephen Raj
 
Testing material (1).docx
Testing material (1).docxTesting material (1).docx
Testing material (1).docxKVamshiKrishna5
 
16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-pptatish90
 
Manual testing by reddy
Manual testing by reddyManual testing by reddy
Manual testing by reddyKrishna Gurjar
 
CIPL Application Development Process
CIPL Application Development ProcessCIPL Application Development Process
CIPL Application Development Processreetamclassic
 
21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)ssuser7f90ae
 
STLC & SDLC-ppt-1.pptx
STLC & SDLC-ppt-1.pptxSTLC & SDLC-ppt-1.pptx
STLC & SDLC-ppt-1.pptxssusere4c6aa
 
Software Engineering (Short & Long Questions)
Software Engineering (Short & Long Questions)Software Engineering (Short & Long Questions)
Software Engineering (Short & Long Questions)MuhammadTalha436
 
Software Engineering Solved Past Paper 2020
Software Engineering Solved Past Paper 2020 Software Engineering Solved Past Paper 2020
Software Engineering Solved Past Paper 2020 MuhammadTalha436
 
Software Engineering Overview
Software Engineering OverviewSoftware Engineering Overview
Software Engineering OverviewPrachi Sasankar
 
Software Engineering-Part 1
Software Engineering-Part 1Software Engineering-Part 1
Software Engineering-Part 1Shrija Madhu
 
Software Development Life Cycle (SDLC )
Software Development Life Cycle (SDLC )Software Development Life Cycle (SDLC )
Software Development Life Cycle (SDLC )eshtiyak
 

Semelhante a Software Development Lifecycle (SDLC) Stages and Testing Methods (20)

Manual Testing Notes
Manual Testing NotesManual Testing Notes
Manual Testing Notes
 
Software_Testing.pptx
Software_Testing.pptxSoftware_Testing.pptx
Software_Testing.pptx
 
Manual testing
Manual testingManual testing
Manual testing
 
2
22
2
 
2
22
2
 
Bridge Process Model
Bridge Process ModelBridge Process Model
Bridge Process Model
 
Testing material (1).docx
Testing material (1).docxTesting material (1).docx
Testing material (1).docx
 
16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-ppt
 
Manual testing by reddy
Manual testing by reddyManual testing by reddy
Manual testing by reddy
 
CIPL Application Development Process
CIPL Application Development ProcessCIPL Application Development Process
CIPL Application Development Process
 
21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)21UCAE65 Software Testing.pdf(MTNC)(BCA)
21UCAE65 Software Testing.pdf(MTNC)(BCA)
 
STLC & SDLC-ppt-1.pptx
STLC & SDLC-ppt-1.pptxSTLC & SDLC-ppt-1.pptx
STLC & SDLC-ppt-1.pptx
 
Software Engineering (Short & Long Questions)
Software Engineering (Short & Long Questions)Software Engineering (Short & Long Questions)
Software Engineering (Short & Long Questions)
 
Software Engineering Solved Past Paper 2020
Software Engineering Solved Past Paper 2020 Software Engineering Solved Past Paper 2020
Software Engineering Solved Past Paper 2020
 
Software Engineering Overview
Software Engineering OverviewSoftware Engineering Overview
Software Engineering Overview
 
Software Engineering-Part 1
Software Engineering-Part 1Software Engineering-Part 1
Software Engineering-Part 1
 
Software engineer
Software engineerSoftware engineer
Software engineer
 
Software Development Life Cycle (SDLC )
Software Development Life Cycle (SDLC )Software Development Life Cycle (SDLC )
Software Development Life Cycle (SDLC )
 
Software Engineering
Software EngineeringSoftware Engineering
Software Engineering
 
Software Engineering
Software EngineeringSoftware Engineering
Software Engineering
 

Último

Grade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptxGrade Three -ELLNA-REVIEWER-ENGLISH.pptx
Grade Three -ELLNA-REVIEWER-ENGLISH.pptxkarenfajardo43
 
Congestive Cardiac Failure..presentation
Congestive Cardiac Failure..presentationCongestive Cardiac Failure..presentation
Congestive Cardiac Failure..presentationdeepaannamalai16
 
Measures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped dataMeasures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped dataBabyAnnMotar
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxlancelewisportillo
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Seán Kennedy
 
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...DhatriParmar
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptxmary850239
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseCeline George
 
Q-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITWQ-Factor General Quiz-7th April 2024, Quiz Club NITW
Q-Factor General Quiz-7th April 2024, Quiz Club NITWQuiz Club NITW
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Software Development Lifecycle (SDLC) Stages and Testing Methods

In the above ‘V’ Model, the reviews are called Verification methods and the testing levels are called Validations. In small and medium scale organizations, management maintains a separate Testing Team for System Testing only, to decrease project cost, because System Testing is the bottleneck stage in the software development process.

I) Reviews in Analysis :

In general, the software development process starts with requirements gathering: from the specific customer in application development, and from model customers in product development. After gathering requirements, the responsible Business Analyst prepares the BRS (Business Requirements Specification) document. This document is also known as the User Requirements Specification or Customer Requirements Specification. The Business Analyst then sits with the Project Manager to develop the SRS and the Project Plan. The SRS (Software Requirements Specification) consists of the functional requirements to be developed (What?) and the system requirements to be used (How?).

Example :
BRS → SRS
Functional Requirement : 2 inputs, 1 output, ‘+’ is the addition operation (What?)
System Requirement : ‘C’ language (How?)

After completion of the BRS & SRS preparation, the corresponding Business Analyst conducts a review to estimate the completeness and correctness of the documents.

→ Are they Correct requirements?
→ Are they Complete requirements?
→ Are they Achievable requirements?
→ Are they Reasonable (time) requirements?
→ Are they Testable requirements?
II) Reviews in Design :

After completion of successful analysis and review, the design category people prepare the HLD and LLDs (High Level Design & Low Level Designs).

The High Level Design specifies the overall architecture of the software. It is also known as the System Design or Architectural Design.

Example :

Root
 ├─ LOGIN
 ├─ Mailing
 ├─ Chatting
 └─ LOGOUT     (Leaf : every functionality or module)

The internal structure of every functionality or module is specified by the Low Level Design documents. These are also known as Structural Designs or Component Designs.

Example :

User → User ID & Password → LOGIN ↔ Database
        Invalid : Re-Login
        Valid   : Next Window

The HLD is a system-level design and an LLD is a component (module) level design, so one software design consists of one HLD and multiple LLDs. The corresponding designers conduct a review of these documents for completeness and correctness.

→ Are they Understandable designs?
→ Are they Correct designs?
→ Are they Complete designs?
→ Are they Followable designs?
III) Unit Testing :

After completion of successful designs and reviews, the corresponding programmers start coding to construct the software physically. In this stage the programmers write programs and test each program using White Box / Glass Box / Open Box testing techniques:

→ Basic Paths Coverage
→ Control Structure Coverage
→ Program Technique Coverage
→ Mutation Coverage

(A) Basic Paths Coverage : The programmers use this technique to estimate the execution of a program. In this technique the programmer executes a program more than one time, to cover all areas of that program in execution.

(B) Control Structure Coverage : After successful Basic Paths Coverage, the corresponding programmer concentrates on the correctness of that program's execution in terms of inputs, process and outputs.

(C) Program Technique Coverage : After successful Basic Paths & Control Structure Coverage, the corresponding programmer calculates the execution speed of that program. If the program's execution speed is not acceptable, the programmer performs changes in the program structure without disturbing its functionality. In this coverage the programmers use third-party software such as monitors and profilers to calculate the execution speed of the program.

Note : Monitors are used in VB.NET; profilers are used in Java.
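The notes above are language-neutral; as a minimal sketch in Python (the grade() function and its boundaries are hypothetical), Basic Paths Coverage means executing the program once per independent path, so that every branch is exercised at least once:

def grade(score):
    # Program under unit test: three independent paths.
    if score < 0 or score > 100:
        raise ValueError("score out of range")   # path 1: invalid input
    if score >= 50:
        return "pass"                            # path 2: passing score
    return "fail"                                # path 3: failing score

def test_basic_paths():
    # One execution per independent path.
    assert grade(75) == "pass"       # covers path 2
    assert grade(30) == "fail"       # covers path 3
    try:
        grade(101)                   # covers path 1
        assert False, "expected ValueError"
    except ValueError:
        pass

test_basic_paths()
print("all independent paths executed")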
(D) Mutation Coverage : Mutation means a change in a program. The programmers perform changes in a program to estimate the completeness and correctness of that program's testing :

Test → Passed; change the program → repeat the test → Passed again : the testing is incomplete (it did not notice the change).
Test → Passed; change the program → repeat the test → Failed : the testing is complete (it caught the change).

Basic Paths Coverage, Control Structure Coverage and Program Technique Coverage are applied on a program to test it. Mutation Coverage is applied on the program's testing, to estimate the completeness and correctness of that testing.

IV) Integration Testing :

After completion of the dependent programs' development and unit testing, the programmers interconnect them to form a complete system / software. This testing is also known as Interface Testing. There are four approaches to integrate programs and test them.

A) Top Down Approach :- In this approach the programmers interconnect the main program and some of the sub-programs. In the place of the remaining sub-programs, the programmers use temporary programs called "Stubs".

Main
 ├─ Sub1
 └─ STUB (Sub2 under construction)
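A minimal sketch of the Top Down approach in Python (all names are hypothetical): the main program is integrated with the finished Sub1, while a temporary stub returns a fixed, predictable value in place of the unfinished Sub2:

def sub1_discount(amount):
    # Finished sub-program: apply a 10% discount.
    return amount * 0.9

def sub2_tax_stub(amount):
    # STUB: the real Sub2 (tax calculation) is under construction,
    # so this temporary program just passes the value through.
    return amount

def main_billing(amount):
    # Main program, integrated with Sub1 and the temporary stub for Sub2.
    return sub2_tax_stub(sub1_discount(amount))

print(main_billing(100.0))   # 90.0 : Main + Sub1 are exercised; Sub2 is faked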
B) Bottom Up Approach :- In this approach the programmers interconnect the sub-programs without coming from the main program. A temporary program called a "Driver" stands in for the main program, which is still under construction.

Driver (Main under construction)
 ├─ Sub1
 └─ Sub2

C) Hybrid Approach :- It is a combined approach of the Top Down & Bottom Up approaches, using drivers and stubs together: a driver can stand in for the unfinished main program above Sub1 while another driver stands in for an unfinished intermediate program above Sub2 and Sub3. It is also known as the Sandwich approach.

D) System Approach :- The integration of the programs after completion of 100% of the coding is called the System approach, or Big Bang approach.
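And the mirror-image sketch for the Bottom Up approach (again with hypothetical names): the sub-programs are finished, so a temporary driver exercises them in the order the unfinished main program eventually will:

def sub1_discount(amount):
    return amount * 0.9

def sub2_tax(amount):
    return amount * 1.18

def driver():
    # DRIVER: stands in for the unfinished Main program and calls the
    # finished sub-programs in the order Main will eventually call them.
    for amount in (100.0, 250.0):
        print(amount, "->", round(sub2_tax(sub1_discount(amount)), 2))

driver()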
V) System Testing :

After completion of successful Integration Testing, the development team releases a software build to the separate testing team in the organization. System Testing is classified into three sub-stages.

1. Usability Testing
2. Functional Testing
3. Non-Functional Testing

1. Usability Testing :

In general, test execution starts with Usability Testing. During this test the testing team concentrates on the "user friendliness" of the software build. There are two sub-levels in Usability Testing.

a) User Interface Testing :
→ Ease of use (understandable screens)
→ Look & feel (attractive screens)
→ Speed in interface (short navigations in screens)

b) Manuals Support Testing : In this test the testing team verifies the Help of the software.

Case Study :

Receive the S/w build from the developers after Integration Testing
↓
User Interface Testing (Usability Testing)
↓
Functional Testing
↓
Non-Functional Testing
↓
Manuals Support Testing (Usability Testing)
2. Functional Testing :

It is a mandatory testing level in System Testing. During this test the testing team concentrates on the correctness of the customer requirements in the software build. This testing is classified into the sub-tests below.

a) Control Flow Testing :- The changes in the properties of objects in an application / software build with respect to mouse and keyboard operations.
b) Error Handling Testing :- The prevention of wrong operations with meaningful messages.
c) Input Domain Coverage :- Whether the software build takes the valid type and size of inputs or not (see the sketch after this case study).
d) Manipulations Coverage :- Whether the software build provides the customer-expected outputs or not.
e) Database Testing :- The impact of front-end screen operations on the back-end database content.
f) Sanitation Testing :- Finding extra functionality with respect to the customer requirements.

Case Study :-

A software build consists of screens (front end) and a database (back end). Control Flow Testing, Error Handling Testing, Input Domain Coverage and Manipulations Coverage apply to the front-end screens; Database Testing applies to the back-end database; together with Sanitation Testing, these sub-tests make up Functional / Black Box Testing.
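As a minimal sketch of Input Domain Coverage and Error Handling Testing together (Python; the quantity field and its 1-10 limit are hypothetical), the build should accept only the valid type and size of inputs and answer wrong operations with meaningful messages:

def enter_quantity(raw):
    # Input domain: digits only (type), value 1 to 10 (size/range).
    if not raw.isdigit():
        return "error: quantity must be a number"   # error handling
    qty = int(raw)
    if not 1 <= qty <= 10:
        return "error: quantity must be 1 to 10"    # error handling
    return "accepted: " + raw                       # valid input domain

for raw in ["5", "10", "0", "11", "abc", ""]:
    print(repr(raw), "->", enter_quantity(raw))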
3. Non-Functional Testing :

It is an optional level in System Testing, because it is expensive and complex to conduct. During this test the testing team concentrates on the extra characteristics of the software.

a) Reliability Testing :- It is also known as Recovery Testing. During this test the testing team validates whether the software build changes from an abnormal state back to its normal state or not.

b) Compatibility Testing :- It is also known as Portability Testing. During this test the testing team concentrates on whether the software build runs on the customer-expected platforms or not. Platform means the operating system, browser, compilers and other system software.

c) Configuration Testing :- It is also known as Hardware Compatibility Testing. During this test the testing team concentrates on whether the software build supports hardware devices of different technologies or not. Ex :- different technology printers, networks, etc.

d) Inter System Testing :- It is also known as End-to-End Testing or Interoperability Testing. During this test the testing team concentrates on whether the software build co-exists with other software applications to share common resources or not.

Case Study :-

Compatibility Testing : S/w Build → Operating System
Configuration Testing : S/w Build → H/w Device (Ex : printers)
Inter System Testing  : S/w Build → Other S/w Build
e) Data Volume Testing :- During this test the testing team inserts model data into the application build to estimate the peak limit of data it can manage. Ex : MS Access technology software manages a 2 GB database, SQL Server manages a 6-7 GB database and Oracle technology manages a 10-12 GB database as maximum.

f) Installation Testing :- The software build and supporting software are installed on a customer-expected configuration system (customer-expected size of RAM, HDD, processor, OS, etc.) and the following are verified :
→ Setup program execution to start installation
→ Easy interface during installation
→ Occupied disk space after installation

g) Load Testing :- Load means the number of concurrent users using the software build at a time. During this test the testing team executes the software build under the customer-expected configuration and the customer-expected load to estimate the speed of processing (performance). Client 1 … Client N connect to the server, which processes the software build. (A sketch follows below.)

h) Stress Testing :- The execution of the software build under the customer-expected configuration and more than the customer-expected load, to estimate the peak limit of load, is called Stress Testing.

i) Endurance Testing :- The execution of the software build under the customer-expected configuration and the customer-expected load, to estimate continuity in processing, is called Endurance Testing.
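A minimal load-testing sketch (Python; the 20-user load and the sleeping "server" are hypothetical stand-ins for a real operation): N concurrent clients call the same operation while each response time is measured:

import threading, time

def server_process(client_id):
    time.sleep(0.05)              # stand-in for real server-side work
    return "response to client %d" % client_id

def client(client_id, timings):
    start = time.perf_counter()
    server_process(client_id)
    timings[client_id] = time.perf_counter() - start

concurrent_users = 20             # customer-expected load
timings = {}
threads = [threading.Thread(target=client, args=(i, timings))
           for i in range(concurrent_users)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("max response time under load: %.3f s" % max(timings.values()))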
j) Security Testing :-

It is also known as Penetration Testing. During this test the testing team concentrates on three factors.

Authorization : The software build allows valid users and prevents invalid users. Ex : login with password, PIN, digital signatures, fingerprints, eye retina, scratch cards, etc.

Access Control : The permissions of valid users to access functionality in the build. Ex : Admin vs. User.

Encryption / Decryption : The code conversion between the client and server processes. The client's request is encrypted into cipher text and decrypted at the server; the server's response is likewise encrypted and decrypted at the client. (A toy sketch follows after this list.)

k) Localization and Internationalization Testing :- This testing is applicable for multi-language software. This type of software allows the characters of multiple user languages. Ex : English, Spanish, French, etc. In Localization Testing the test engineer provides multiple-language characters as inputs to the software build. In Internationalization Testing the test engineer provides common-language (English) characters as input to the software; third-party tools then transfer the common-language characters into the other languages' characters.

Note : Java Unicode is a better technology to develop multi-language software.

l) Parallel Testing :- It is also known as Competitive / Comparative Testing. During this test the testing team compares the software build with an old version of the same software, or with similar products in the market, to estimate its competitiveness.
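A toy sketch of the encryption / decryption factor (Python; a repeating-key XOR is used only for illustration and is not a real cipher): the request is unreadable cipher text on the wire and readable again only after decryption:

import os

def xor_cipher(data, key):
    # XOR each byte with the repeating key; applying it twice restores the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                     # shared secret between client and server

request = b"GET /balance?acc=1234"       # client side: plain request
cipher_text = xor_cipher(request, key)   # encrypted before it leaves the client

received = xor_cipher(cipher_text, key)  # server side: decrypt the cipher text
assert received == request

print(cipher_text)                       # unreadable on the wire
print(received.decode())                 # readable only after decryption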
VI) User Acceptance Testing :

After completion of successful System Testing, the Project Manager concentrates on UAT to collect feedback from real customers or model customers. There are two ways in User Acceptance Testing.

α Alpha Testing :
→ For S/w applications
→ By real customers, with the involvement of developers and testers
→ In the development site

β Beta Testing :
→ For S/w products
→ By model customers
→ In the model customer's site

VII) Release Testing :

After completion of UAT and the resulting modifications, the Project Manager forms a Release Team or On-Site Team to release the application to the real customer, or to release the product to the license-purchased customers. This release team / on-site team consists of a few programmers, a few testers and a few hardware engineers with a team lead. This team observes the factors below in the customer site.

1) Complete installation
2) Overall functionality
3) Input devices handling (keyboard, mouse, etc.)
4) Output devices handling (monitor, printer, etc.)
5) Secondary storage devices handling (floppy, pen drive, etc.)
6) OS error handling
7) Co-existence with other software in the customer site

Checking the above factors in the customer site is also known as Port Testing / Deployment Testing. After a successful release, the release team conducts training sessions for the customer-site people and then comes back to the organization.
VIII) Maintenance :

During the utilization of the software, the customer-site people send Software Change Requests (SCRs) to the organization. These requests are received by a special team in the organization called the Change Control Board (CCB). This team consists of a few programmers, a few testers and a few hardware engineers, along with the Project Manager.

An SCR is either an Enhancement or a Missed Defect, and both are handled by the CCB :

Enhancement   : impact analysis → perform the S/w changes → test the S/w changes.
Missed Defect : impact analysis → perform the S/w changes → test the S/w changes → improve the testing process & people's capability (because the defect escaped earlier testing).

Case Study :-

Testing Stage             | Deliverable to be Tested    | Responsibility                              | Testing Techniques
Reviews in Analysis       | BRS & SRS                   | BA                                          | Walk Through, Inspections & Peer Reviews
Reviews in Design         | HLD & LLDs                  | Designers                                   | Walk Through, Inspections & Peer Reviews
Unit Testing              | Programs                    | Programmers                                 | White Box Testing techniques
Integration Testing       | Interfaces between programs | Programmers                                 | Top Down, Bottom Up, Hybrid, System
System Testing            | S/w Build                   | Test Engineers / Quality Control Engineers  | Usability, Functional / Black Box, Non-Functional Testing
User Acceptance Testing   | S/w Build                   | Real customers / model customers            | α-Testing, β-Testing
Release Testing           | S/w Build                   | Release Team                                | S/w release factors (7 factors in VII)
Maintenance Level Testing | S/w Changes                 | CCB                                         | Regression Testing
Walk Through :- A document study to estimate its completeness and correctness.
Inspection :- Searching for issues in a document is called Inspection.
Peer Review :- Comparing a document with another similar document.

Challenges in Software Testing :

In general, every testing team plans formal testing to conduct. Due to some challenges in testing, testing teams sometimes conduct Ad-hoc (informal) Testing instead. There are five styles of Ad-hoc Testing.

a) Monkey / Chimpanzee Testing :- Due to lack of time, the testing team conducts testing on the main activities of the software only. This style of testing is called Monkey Testing.

b) Buddy Testing :- Due to lack of time, project management combines one programmer and one tester as a "buddy" team. These teams conduct development and testing in parallel.

c) Exploratory Testing :- It is also known as Artistic Testing. Due to lack of documentation, the test engineers depend on past experience, discussions with others, video conferences with customer-site people, Internet browsing and surfing of similar software to understand the customer requirements. This type of testing is called Exploratory Testing.

d) Pair Testing :- Due to lack of knowledge, senior test engineers group with junior test engineers to share their knowledge. This style of testing is called Pair Testing.

e) Bebugging :- To estimate the defect-detection capability of the test engineers, the development people add defects to the code. This informal way is called Bebugging or Defect Feeding / Seeding.
System Testing Process :

Test Initiation → Test Planning → Test Design → Test Execution → Test Closure
(Test Reporting runs alongside Test Execution.)

Development Vs System Testing :

S/w Bidding
↓
Kick-off Meeting
↓
PIN Document
↓
Requirements Gathering (BRS)
↓
Analysis & Planning (SRS & Project Plan)
↓
S/w Design & Review (HLD, LLDs) ............... System Test Initiation
↓                                                        ↓
Coding → Unit Testing (White Box techniques) .. System Test Planning
↓                                                        ↓
Integration → Integration Testing ............. Test Design
↓ (Initial Build released to the testers)               ↓
............................................... System Test Execution ↔ Test Reporting
                                                         ↓
............................................... System Test Closure
↓
User Acceptance Testing
↓
Release & Maintenance
I) System Test Initiation :

In general, the System Testing process starts with System Test Initiation by the Project Manager or Test Manager. They develop the Test Strategy (Test Methodology) document. This document defines the reasonable tests to be applied in the current project.

SRS (I/p) → Test Initiation by the Project Manager / Test Manager → Test Strategy (O/p)

Components in the Test Strategy :

The Test Strategy document consists of the components below to define the test approach to be followed by the team in the current project.

1. Scope & Objective :- The purpose of testing in the current project.

2. Business Issues :- The budget allocation for testing in the current project.
Ex : Of 100% project cost, 64% goes to development & maintenance and 36% to System Testing.

3. Roles & Responsibilities :- The names of the jobs in the testing team and the responsibility of each job in the current project.

4. Communication & Status Reporting :- The required negotiations between the various jobs in the testing team.
5. Test Responsibility Matrix (TRM) :- The list of reasonable tests to be applied in the current project.

Ex :
Testing Topic         | Yes/No | Comment
UI Testing            | Yes    | -
Manuals Testing       | Yes    | -
Functional Testing    | Yes    | -
Load Testing          | No     | Lack of resources
Stress Testing        | No     | Lack of resources
Endurance Testing     | No     | Lack of resources
Compatibility Testing | Yes    | -
Inter System Testing  | No     | No need with respect to the requirements
etc.                  | etc.   | etc.

6. Test Automation & Testing Tools :- The purpose of automation testing in the current project and the testing tools available in the organization.

7. Defect Reporting & Tracking :- The required negotiation between the testing team and the development team to report & solve defects.

8. Change & Configuration Management :- The maintenance of testing deliverables for future reference.

9. Risks & Assumptions :- The expected list of risks and the solutions to overcome them.

10. Testing Measurements & Metrics :- The list of measurements & metrics to estimate the test status.

11. Training Plan :- The required number of training sessions for the testing team to understand the customer requirements.
II) Test Planning :

After completion of the Test Strategy document preparation, the Test Lead category people concentrate on Test Plan document preparation.

Inputs : SRS, HLD & LLDs, Project Plan and Test Strategy
Steps  : Testing Team Formation → Identify Risks → Prepare Detailed Test Plans → Review Plans
Output : Test Plans

Testing Team Formation :

In general, Test Planning starts with testing team formation. In this stage the Test Lead depends on the factors below.

→ Project size (number of functional points)
→ Number of testers available on the bench
→ Test duration with respect to the Project Plan
→ Available test environment resources (Ex. testing tools)

Case Study :

Type of Project             | Developers : Testers
ERP, Client/Server, Website | 3 : 1
System S/w Application      | 1 : 1
Machine Critical            | 1 : 7

Identify Risks :

After completion of testing team formation, the Test Lead concentrates on team-level risk analysis. Ex :-

Risk 1 : Lack of time
Risk 2 : Lack of resources
Risk 3 : Lack of documentation
Risk 4 : Delays in delivery
Risk 5 : Lack of development process seriousness
Risk 6 : Lack of communication
Prepare Detailed Test Plans :

After completion of testing team formation and risk analysis, the Test Lead concentrates on Test Plan document preparation in the IEEE 829 format (IEEE : Institute of Electrical and Electronics Engineers).

Format :

1. Test Plan ID : A unique number or name for future reference about the project.
2. Introduction : About the project.
3. Test Items : The names of the modules or functionalities in the project.
4. Features to be Tested : The names of the functionalities to be tested.
5. Features not to be Tested : The names of already-tested modules, if available.
6. Test Approach : The list of tests selected by the Project Manager.
7. Test Environment : The required hardware & software to be used in testing.
8. Entry Criteria : Test cases designed, test environment established, and the S/w build received from the developers.
9. Suspension Criteria :
→ Test environment abandoned
→ Show-stopper in the build (build not working)
→ Too many pending defects
10. Exit Criteria :
→ All modules in the build covered
→ Test duration exceeded
→ All major defects solved
11. Test Deliverables : The list of testing documents to be prepared by the test engineers during testing (test scenarios, test cases, automation programs, test logs, defect reports and week-end reports).
12. Staff and Training Needs : The names of the selected test engineers & the required training sessions to understand the customer requirements.
13. Responsibilities : Work allocation to the above selected test engineers (all responsible tests on specified modules, or specified tests on all modules).
14. Schedule : The dates & times to conduct testing.
15. Risks & Assumptions : The previously analyzed risks and the solutions to overcome them.
16. Approvals : The signatures of the Test Lead & Project Manager.

(Items 3-5 define What to test, items 6-11 How to test, items 12-13 Whom to test with, and item 14 When to test.)
Review Test Plan :

After completion of the Test Plan document preparation, the Test Lead conducts a review meeting to estimate the completeness and correctness of the planned document.

→ Requirements / modules / features / functionalities coverage
→ Testing topics coverage
→ Risk-oriented coverage

Note : After completion of Test Planning and before starting Test Design, the Business Analyst and Test Lead conduct training sessions for the selected test engineers on the customer requirements in the project. Some organizations invite domain experts / subject experts from outside for those training sessions.

III) Test Design :

After completion of the required training sessions on the customer requirements, the corresponding test engineers concentrate on Test Design to prepare test scenarios and test cases. A test scenario specifies "what" to test; a test case specifies "how" to test, including a detailed procedure. It follows that test cases are derived from test scenarios. There are four methods in Test Design.

1. Functional Specification Based Test Case Design (for Functional Testing)
2. Use Cases Based Test Case Design (for Functional Testing)
3. User Interface Based Test Case Design (for Usability Testing)
4. Functional & System Specification Based Test Case Design (for Non-Functional Testing)

1. Functional Specification Based Test Case Design :

The test engineers use this method to prepare test scenarios and cases for Functional Testing. In this approach, the test engineers prepare scenarios and cases depending on the functional specifications in the SRS :

BRS → SRS (functional specifications) → HLD → LLDs → S/w Build
The functional specifications feed Test Design (Test Scenarios → Test Cases), and the test cases feed System Test Execution on the S/w build.
Approach :

Step 1 :- Collect the functional specifications related to the responsible areas.
Step 2 :- Take one specification and read it to gather the entry point, required inputs, normal flow, expected outputs, alternative flows, exit point and exception rules.
Step 3 :- Prepare test scenarios depending on the above gathered information.
Step 4 :- Review those test scenarios and implement them as test cases.
Step 5 :- Go to Step 2 until all responsible functional specifications are studied.

Functional Specification – 1 :- A login process allows a User ID & Password for authorized users. The User ID object takes alphanumerics in lower case, from 4 to 16 characters long. The Password object takes alphabets in lower case, from 4 to 8 characters long. Prepare test scenarios.

Test Scenario 1 :- Verify the User ID object

Boundary Value Analysis (BVA) (Size) :
Min   = 4 Char.  → Pass        Max   = 16 Char. → Pass
Min-1 = 3 Char.  → Fail        Max-1 = 15 Char. → Pass
Min+1 = 5 Char.  → Pass        Max+1 = 17 Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z, 0-9
Invalid : A-Z, special characters, blank field

Test Scenario 2 :- Verify the Password object

Boundary Value Analysis (BVA) (Size) :
Min   = 4 Char.  → Pass        Max   = 8 Char.  → Pass
Min-1 = 3 Char.  → Fail        Max-1 = 7 Char.  → Pass
Min+1 = 5 Char.  → Pass        Max+1 = 9 Char.  → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z
Invalid : 0-9, A-Z, special characters, blank field
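These BVA and ECP classes translate directly into a table-driven test. A minimal sketch in Python (the validator and its sample values are hypothetical):

import re

def is_valid_user_id(value):
    # Hypothetical User ID field: lower-case alphanumerics, 4-16 characters.
    return re.fullmatch(r"[a-z0-9]{4,16}", value) is not None

# BVA (size) and ECP (type) rows: (input, expected result)
cases = [
    ("abcd", True),              # Min = 4      -> Pass
    ("a" * 16, True),            # Max = 16     -> Pass
    ("abc", False),              # Min-1 = 3    -> Fail
    ("a" * 17, False),           # Max+1 = 17   -> Fail
    ("user123", True),           # ECP valid: a-z, 0-9
    ("USER123", False),          # ECP invalid: upper case
    ("user_12", False),          # ECP invalid: special character
    ("", False),                 # ECP invalid: blank field
]

for value, expected in cases:
    actual = is_valid_user_id(value)
    print(f"{value!r:12} expected={expected} actual={actual} "
          f"{'Pass' if actual == expected else 'Fail'}")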
Test Scenario 3 :- Verify the login operation

Decision Table :
User ID     | Password    | Expected O/p
Valid value | Valid value | Next window
Valid value | Invalid     | Error message
Invalid     | Valid       | Error message
Valid       | Blank field | Error message
Blank field | Valid       | Error message

Note : Exhaustive testing is not possible; due to this reason, the testing team conducts optimal testing using Black Box testing techniques like BVA, ECP, decision tables, regular expressions, etc.

Functional Specification – 2 :- In an insurance application, users apply for different types of insurance policies. If a user selects a Type-A insurance, the system asks for the age of that user. The age value should be greater than 16 years and less than 80 years. Prepare test scenarios.

Test Scenario 1 :- Verify Type-A selection
Test Scenario 2 :- Verify the focus to Age when Type-A insurance is selected
Test Scenario 3 :- Verify the Age value

Boundary Value Analysis (BVA) (Range) :
Min   = 17 → Pass        Max   = 79 → Pass
Min-1 = 16 → Fail        Max-1 = 78 → Pass
Min+1 = 18 → Pass        Max+1 = 80 → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9
Invalid : a-z, A-Z, special characters, blank field

Functional Specification – 3 :- In a shopping application, users place purchase orders for different types of items. The purchase order allows the user to select an Item No. and to enter a Qty. up to 10. The purchase order returns the Total Amount along with the one-item price. Prepare test scenarios.
Test Scenario 1 :- Verify the Item No. selection
Test Scenario 2 :- Verify the Qty. value

Boundary Value Analysis (BVA) (Range) :
Min   = 1 → Pass        Max   = 10 → Pass
Min-1 = 0 → Fail        Max-1 = 9  → Pass
Min+1 = 2 → Pass        Max+1 = 11 → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9
Invalid : a-z, A-Z, special characters, blank field

Test Scenario 3 :- Verify the Total Amount, given Total Amount = Qty. × one-item price

Functional Specification – 4 :- A door opens when a person comes in front of the door, and the door closes when that person goes inside. Prepare test scenarios.

Test Scenario 1 :- Verify door open
Person  | Door   | Criteria
Present | Opened | Pass
Present | Closed | Fail
Absent  | Opened | Fail
Absent  | Closed | Pass

Test Scenario 2 :- Verify door close
Person | Door   | Criteria
Inside | Closed | Pass
Inside | Opened | Fail

Test Scenario 3 :- Verify the door operation when a person is standing in the middle of the doorway.

Functional Specification – 5 :- In an e-banking application, the customers connect to the bank server through a login process. This login allows the customer to fill the fields below.

Password : 6-digit number
Prefix : 3-digit number that does not start with 0 or 1
Suffix : 6-digit alphanumeric
Area Code : 3-digit number, but optional
Command : Cheque Deposit, Money Transfer, Mini Statement and Bills Paid

Prepare test scenarios.

Test Scenario 1 :- Verify the Password value
BVA (Size) : Min = Max = 6 digits → Pass; 5 digits → Fail; 7 digits → Fail
ECP (Type) : Valid : 0-9 | Invalid : a-z, A-Z, special characters, blank field

Test Scenario 2 :- Verify the Prefix
BVA (Size) : Min = Max = 3 digits → Pass; 2 digits → Fail; 4 digits → Fail
ECP (Type) : Valid : [2-9][0-9][0-9] | Invalid : a-z, A-Z, special characters, blank field

Test Scenario 3 :- Verify the Suffix
BVA (Size) : Min = Max = 6 characters → Pass; 5 characters → Fail; 7 characters → Fail
ECP (Type) : Valid : 0-9, a-z, A-Z | Invalid : special characters, blank field

Test Scenario 4 :- Verify the Area Code
BVA (Size) : Min = Max = 3 digits → Pass; 2 digits → Fail; 4 digits → Fail
ECP (Type) : Valid : 0-9, blank field (the field is optional) | Invalid : a-z, A-Z, special characters
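The ECP classes written as regular expressions can be checked mechanically. A minimal sketch in Python (the validators are hypothetical):

import re

def valid_prefix(value):
    # Prefix: 3 digits, must not start with 0 or 1 -> [2-9][0-9][0-9]
    return re.fullmatch(r"[2-9][0-9]{2}", value) is not None

def valid_area_code(value):
    # Area code: 3 digits, but optional -> a blank field is also valid.
    return value == "" or re.fullmatch(r"[0-9]{3}", value) is not None

print(valid_prefix("234"))      # True  (valid class)
print(valid_prefix("123"))      # False (starts with 1)
print(valid_prefix("23"))       # False (BVA: 2 digits)
print(valid_area_code(""))      # True  (optional field)
print(valid_area_code("0123"))  # False (BVA: 4 digits)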
Test Scenario 5 :- Verify the command selection (Cheque Deposit, Money Transfer, Mini Statement and Bills Paid)

Test Scenario 6 :- Verify the login operation to connect to the bank server
Remaining Fields    | Area Code     | Expected O/p
All are valid       | Valid         | Next window
All are valid       | Blank field   | Next window
All are valid       | Invalid       | Error message
Any one invalid     | Valid / blank | Error message
Any one blank field | Valid / blank | Error message

Functional Specification – 6 :- In a library management system, the readers apply for an Identity No. To get this number, a reader fills the fields below.

Reader Name : alphabets in lower case with an initial capital, as a single word
House Name : alphabets in lower case, as a single word
PIN Code : related to the India Postal Department
City Name : alphabets in upper case, as a single word
Phone No. : related to India subscribers, and optional

Prepare test scenarios.

Test Scenario 1 :- Verify the Reader Name

Boundary Value Analysis (BVA) (Size) :
Min   = 1 Char.   → Pass        Max   = 256 Char. → Pass
Min-1 = 0 Char.   → Fail        Max-1 = 255 Char. → Pass
Min+1 = 2 Char.   → Pass        Max+1 = 257 Char. → Fail
(In any front-end development tool, the default maximum length of a text field is 256 characters.)

Equivalence Class Partition (ECP) (Type) :
Valid   : [A-Z][a-z]*
Invalid : 0-9, special characters, blank field

Test Scenario 2 :- Verify the House Name

Boundary Value Analysis (BVA) (Size) : the same 1-256 character analysis as the Reader Name.
Equivalence Class Partition (ECP) (Type) :
Valid   : [a-z]*
Invalid : A-Z, 0-9, special characters, blank field

Test Scenario 3 :- Verify the PIN Code
BVA (Size) : Min = Max = 6 digits → Pass; 5 digits → Fail; 7 digits → Fail
ECP (Type) : Valid : [1-9][0-9][0-9][0-9][0-9][0-9] | Invalid : a-z, A-Z, special characters, blank field

Test Scenario 4 :- Verify the City Name

Boundary Value Analysis (BVA) (Size) :
Min   = 1 Char.   → Pass        Max   = 256 Char. → Pass
Min-1 = 0 Char.   → Fail        Max-1 = 255 Char. → Pass
Min+1 = 2 Char.   → Pass        Max+1 = 257 Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [A-Z]*
Invalid : a-z, 0-9, special characters, blank field

Test Scenario 5 :- Verify the Phone Number

Boundary Value Analysis (BVA) (Size) :
Min   = 10 digits → Pass        Max   = 12 digits → Pass
Min-1 = 9 digits  → Fail        Max+1 = 13 digits → Fail
Min+1 = 11 digits → Pass

Equivalence Class Partition (ECP) (Type) :
Valid   : 0-9, blank field (the field is optional)
Invalid : A-Z, a-z, special characters
Test Scenario 6 :- Verify the reader registration

Decision Table :
Remaining Fields    | Telephone Number | Expected O/p
All are valid       | Valid            | Identity No.
All are valid       | Blank field      | Identity No.
All are valid       | Invalid          | Error msg.
Any one invalid     | Valid / blank    | Error msg.
Any one blank field | Valid / blank    | Error msg.

Functional Specification – 7 :- A computer shut-down operation. Prepare test scenarios.

Test Scenario 1 : Verify shut-down option selection using the Shut Down menu option
Test Scenario 2 : Verify shut-down option selection using Alt+F4
Test Scenario 3 : Verify shut-down option selection using Ctrl+Alt+Del
Test Scenario 4 : Verify shut-down operation success
Test Scenario 5 : Verify the shut-down operation using the Run command
Test Scenario 6 : Verify the shut-down operation when a process is running
Test Scenario 7 : Verify the shut-down operation using the power-off button

Functional Specification – 8 :- Money withdrawal from an ATM, with all rules and regulations. Prepare test scenarios.

Test Scenario 1 : Verify card insertion
Test Scenario 2 : Verify card insertion at a wrong angle
Test Scenario 3 : Verify cancel after card insertion
Test Scenario 4 : Verify language selection
Test Scenario 5 : Verify cancel after selection of language
Test Scenario 6 : Verify PIN entry
Test Scenario 7 : Verify the operation with a wrong PIN
Test Scenario 8 : Verify the operation when a wrong PIN is entered 3 times consecutively
Test Scenario 9 : Verify cancel after PIN entry
Test Scenario 10 : Verify account type selection
Test Scenario 11 : Verify the operation when a wrong account type is selected with respect to the inserted card
Test Scenario 12 : Verify cancel after account type selection
Test Scenario 13 : Verify withdrawal option selection
Test Scenario 14 : Verify cancel after selection of withdrawal
Test Scenario 15 : Verify amount entry
Test Scenario 16 : Verify the operation with a wrong denomination in the amount
Test Scenario 17 : Verify withdrawal operation success (correct amount, right receipt, able to take the card back)
Test Scenario 18 : Verify the withdrawal operation with an amount greater than the possible balance
Test Scenario 19 : Verify the withdrawal operation with an amount greater than the day limit
Test Scenario 20 : Verify the withdrawal operation with a network problem
Test Scenario 21 : Verify the withdrawal operation with a lack of amount in the ATM
Test Scenario 22 : Verify the withdrawal operation with the number of transactions per day exceeded
Test Scenario 23 : Verify the withdrawal operation with another bank's card
Test Scenario 24 : Verify the withdrawal operation with a stolen card
2. Use Cases Based Test Case Design :

It is an alternative method to Functional Specification Based Test Case Design. In this method the test engineers depend on use cases, instead of functional specifications, to prepare test scenarios and test cases :

BRS → SRS (functional specifications) → use cases (developed by BA + Test Lead) → Test Scenarios → Test Cases → System Test Execution on the S/w build (built via HLD, LLDs and coding with UT & IT).

The Business Analyst and Test Lead category people develop the use cases from the corresponding functional specifications in the SRS. Every use case is an implemented form of a functional specification.

Use Case Format :-

1. Use Case ID : A unique number or name for future reference
2. Use Case Description : The summary of the corresponding functionality
3. Required Inputs : The required inputs for the corresponding functionality
4. Precondition : The necessary condition to follow before operating the corresponding functionality
5. Events List : A step-by-step procedure of events / tasks with the expected output or outcome of each
6. Activity Flow Diagram : A pictorial / diagrammatic representation of the corresponding functionality
7. Post Condition : The necessary tasks to do after the corresponding functionality
8. Alternative Events List : Alternative procedures for this functionality, if available
9. Prototype : A screenshot related to the corresponding functionality
10. Related Use Cases : The names of the other use cases related to the corresponding functionality

Approach :

Step 1 : Collect the use cases of the responsible areas
Step 2 : Take one use case and study it
Step 3 : Identify the entry point, required inputs, normal flow, expected outputs, exit point, alternative flows and exception rules
Step 4 : Prepare test scenarios depending on the above identified information
Step 5 : Review those scenarios and implement them as test cases
Step 6 : Go to Step 2 until all responsible use cases are studied

Use Case 1 :

1. Use Case ID : UC_Login
2. Use Case Description : The login operation is an authorization
3. Required Inputs : The User ID is alphabets in lower case, 4-16 characters long. The Password is alphanumeric in lower case, 4-8 characters long.
4. Precondition : New user registration, to get a valid User ID & Password
5. Events List :
Events / Tasks                                       | Expected O/p or Outcome
Enter the User ID and Password values, then click OK | Next window for a valid user; "invalid data" error msg. for an invalid user
6. Activity Flow Diagram :

User → User ID & Password → LOGIN ↔ Database
        Invalid : Error msg. / Re-Login
        Valid   : Next Window

7. Post Condition : The logout operation is mandatory after a successful login
8. Alternative Events List : None
9. Prototype : (a screenshot of the login window)
10. Related Use Cases : UC_New User, UC_Logout
Test Scenario 1 :- Check the User ID

Boundary Value Analysis (BVA) (Size) :
Min   = 4 Char.  → Pass        Max   = 16 Char. → Pass
Min-1 = 3 Char.  → Fail        Max-1 = 15 Char. → Pass
Min+1 = 5 Char.  → Pass        Max+1 = 17 Char. → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z
Invalid : A-Z, 0-9, special characters, blank field

Test Scenario 2 :- Check the Password

Boundary Value Analysis (BVA) (Size) :
Min   = 4 Char.  → Pass        Max   = 8 Char.  → Pass
Min-1 = 3 Char.  → Fail        Max-1 = 7 Char.  → Pass
Min+1 = 5 Char.  → Pass        Max+1 = 9 Char.  → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : a-z, 0-9
Invalid : A-Z, special characters, blank field

Test Scenario 3 :- Check the OK button click
User ID     | Password    | Expected Output
Valid       | Valid       | Next window
Valid       | Invalid     | "Invalid data" error msg.
Invalid     | Valid       | "Invalid data" error msg.
Value       | Blank field | "Invalid data" error msg.
Blank field | Value       | "Invalid data" error msg.

Test Scenario 4 :- Check the Cancel button
Event                                    | Expected Output
Click Cancel after opening login         | Login window closed
Click Cancel after entering the User ID  | Login window closed
Click Cancel after entering the Password | Login window closed

Test Scenario 5 :- Check the minimize icon
Test Scenario 6 :- Check the maximize icon
Test Scenario 7 :- Check the close icon
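The decision table of Test Scenario 3 can be executed row by row. A minimal sketch in Python (the login() function and its one valid pair are hypothetical stand-ins for the build under test):

def login(user_id, password):
    # Stand-in for the build: one valid pair opens the next window.
    if user_id == "tester01" and password == "secret9":
        return "next window"
    return "invalid data error msg."

# Decision table from Test Scenario 3: every row is one test case.
decision_table = [
    ("tester01", "secret9", "next window"),             # valid, valid
    ("tester01", "WRONG",   "invalid data error msg."), # valid, invalid
    ("nobody",   "secret9", "invalid data error msg."), # invalid, valid
    ("tester01", "",        "invalid data error msg."), # value, blank
    ("",         "secret9", "invalid data error msg."), # blank, value
]

for user_id, password, expected in decision_table:
    actual = login(user_id, password)
    result = "Pass" if actual == expected else "Fail"
    print(f"({user_id!r}, {password!r}) -> {result}")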
Use Case 2 :

1. Use Case ID : UC_Book_Issue
2. Use Case Description : Issue a book to a valid user
3. Required Inputs :
The User ID is in the format mm_yy_xxxx (month, year, then 4 digits).
The Book ID is in the format BOOK_xxxx.
4. Precondition : New user registration, to get a valid User ID
5. Events List :
Events / Tasks                                   | Expected O/p or Outcome
Enter the User ID and then click the "Go" button | Focus moves to Book ID for a valid user; "invalid user" error msg. for an invalid user
Enter the Book ID and click the "Go" button      | "Book issued" message for an available book; "unavailable book" message for an unavailable Book ID

6. Activity Flow Diagram :

User → Valid User ID → BOOK ISSUE ↔ Database
         Invalid user : Re-Login
         Valid → Book ID → BOOK ISSUE ↔ Database
                   Unavailable book : Re-entry
                   Valid : "Book Issued"

7. Post Condition : Receive the issued book from the computer operator
8. Alternative Events List : None
9. Prototype :

Book Issue                   [-][□][X]
User ID  [__________]  [Go]
Book ID  [__________]  [Go]

10. Related Use Cases : UC_New User, UC_Book Feeding

Test Scenario 1 :- Verify the User ID

Boundary Value Analysis (BVA) (Size) :
Min = Max = 10-position value → Pass
            9-position value  → Fail
            11-position value → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [0][1-9][_][0-9][0-9][_][0-9][0-9][0-9][0-9]
          [1][0-2][_][0-9][0-9][_][0-9][0-9][0-9][0-9]
Invalid : a-z, A-Z, special characters except _, blank field

Test Scenario 2 :- Verify the "Go" button click for the User ID
User ID       | Expected O/p after clicking "Go"
Valid value   | Focus to Book ID
Invalid value | "Invalid user" error message
Blank field   | "Invalid user" error message

Test Scenario 3 :- Verify the Book ID

Boundary Value Analysis (BVA) (Size) :
Min = Max = 9-position value → Pass
            8-position value  → Fail
            10-position value → Fail

Equivalence Class Partition (ECP) (Type) :
Valid   : [B][O][O][K][_][0-9][0-9][0-9][0-9]
Invalid : a-z, upper-case letters outside the BOOK_ prefix, special characters except _, blank field
Test Scenario 4 :- Verify the "Go" button click for the Book ID
Book ID         | Expected O/p after clicking "Go"
Valid Book ID   | "Book issued" msg.
Invalid Book ID | "Unavailable book" message
Blank field     | "Unavailable book" message

Test Scenario 5 :- Verify the minimize icon
Test Scenario 6 :- Verify the maximize icon
Test Scenario 7 :- Verify the close icon

3. User Interface Based Test Design :

The Functional Specification Based and Use Cases Based Test Designs are used to prepare test scenarios and cases for Functional Testing. User Interface Based Test Design is used by the test engineers to prepare test scenarios and cases for Usability Testing. In this method the test engineers depend on the user interface requirements in the SRS (BRS → SRS (UI requirements) → Test Scenarios → Test Cases → System Test Execution on the S/w build). In general the test engineers write common test scenarios for Usability Testing, which are applicable to any type of application.

Test Scenario 1 :- Verify the spelling in every screen
Test Scenario 2 :- Verify the meaning of every error message
Test Scenario 3 :- Verify the initial capitalization of the labels in every screen
Test Scenario 4 :- Verify color uniqueness throughout the screens
Test Scenario 5 :- Verify font / style uniqueness throughout the screens
Test Scenario 6 :- Verify size uniqueness throughout the screens
Test Scenario 7 :- Verify the alignment of the objects in every screen
Test Scenario 8 :- Verify line-spacing uniqueness throughout the screens
Test Scenario 9 :- Verify the tool tips of the icons in every screen
Test Scenario 10 :- Verify the default object in every screen
Test Scenario 11 :- Verify the uniform background colors of the objects in every screen
Test Scenario 12 :- Verify the scroll bars when the screen size is greater than the desktop
Test Scenario 13 :- Verify keyboard access to every object in every screen
Test Scenario 14 :- Verify the abbreviations & shortcuts in the screens
Test Scenario 15 :- Verify the positions of multiple-data objects in every screen (Ex : list box, menu, table, etc.)
Test Scenario 16 :- Verify the help messages (Manuals Support Testing)
Test Scenario 17 :- Verify the functionally grouped objects in every screen
Test Scenario 18 :- Verify the borders of the functionally grouped objects in every screen
Test Scenario 19 :- Verify the labels of the objects with respect to functionality
Test Scenario 20 :- Verify the window labels with respect to functionality

4. Functional and System Specification Based Test Design :

After completion of test scenario selection for Functional and Usability Testing, the test engineers concentrate on test scenario selection for Non-Functional Testing, depending on the functional and system specifications in the SRS. The functional specifications describe the required functionalities in the software, and the system specifications describe the required environment to be used.
BRS → SRS (functional specifications + system specifications) → HLD & LLDs → S/w Build; the resulting test scenarios and test cases feed System Test Execution on the S/w build.

Example Test Scenarios for Compatibility Testing :
Test Scenario 1 : Verify login on Windows NT with the customer-expected configuration
Test Scenario 2 : Verify login on Windows 2000 with the customer-expected configuration
Test Scenario 3 : Verify login on Windows Vista with the customer-expected configuration
And more…

Example Test Scenarios for Performance Testing :
Test Scenario 1 : Verify login under the customer-expected load and configuration
Test Scenario 2 : Verify login under more than the customer-expected load
And more…

Example Test Scenarios for Installation Testing :
Test Scenario 1 : Verify the setup program to start installation
Test Scenario 2 : Verify interface easiness during installation
Test Scenario 3 : Verify occupied disk space after installation
And more…

Test Case Format :

After completion of test scenario selection for the responsible areas in terms of Functional, Usability and Non-Functional Testing, the test engineers implement them as test cases, using the IEEE 829 test case format (IEEE : Institute of Electrical & Electronics Engineers).

1. Test Case ID : A unique number / name for future reference
2. Test Case Name : The corresponding test scenario
3. Feature to be Tested : The name of the corresponding module or functionality
4. Test Suite ID : The unique number or name of a test batch; this case is a member of that batch
5. Priority : The importance of this test case (P0 priority for functional test cases, P1 priority for non-functional test cases and P2 priority for usability test cases)
6. Test Environment : The required hardware and software to execute this test
7. Test Effort : Person-hours (Ex. 20 min is an average test execution time)
8. Test Duration : The date and time to execute this test
9. Test Setup : The necessary tasks to do before starting this test execution
10. Test Procedure / Data Matrix :

Test Procedure (for operational test cases) :
Step No. | Action / Task | Required event I/p | Expected O/p || Actual O/p | Result | Defect ID | Comments
(the first four columns are filled during test design; the last four during test execution)

Data Matrix (for input-object test cases) :
I/p Object | ECP (Type) Valid | ECP (Type) Invalid | BVA (Range / Size) Min | BVA (Range / Size) Max

11. Test Case Pass / Fail Criteria : The final result of this test case after execution

Note 1 : In general, the test engineers are not interested in filling all the fields in the test case format, due to lack of time and the similarity in the field values of test cases.
Note 2 : The test engineers use a test procedure for operational test cases and a data matrix for input-object test cases.

Functional Specification : In a banking application, the valid employees create fixed deposits with the depositor-provided information. In this fixed deposit operation, the employees fill the fields below.

Depositor Name : alphabets in lower case with initial capitals; multiple words are allowed in a name
Amount : 1500 to 1,00,000
Time : up to 12 months
Interest : numeric with one decimal
If the Time > 10 months, then the Interest > 10% (from the bank rules)

Prepare test scenarios and test cases.

Test Scenario 1 : Verify the Depositor Name
Test Scenario 2 : Verify the Amount
Test Scenario 3 : Verify the Time
Test Scenario 4 : Verify the Interest
Test Scenario 5 : Verify the fixed deposit operation
Test Scenario 6 : Verify the fixed deposit operation with the bank rule

Test Case Documents :

Test Case 1 :-
1. Test Case ID : TC_FD_Ravi_24thMay_1
2. Test Case Name : Verify the Depositor Name
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : The Depositor Name object is taking inputs
6. Data Matrix :
I/p Object     | ECP Valid      | ECP Invalid                  | BVA Min | BVA Max
Depositor Name | ([A-Z][a-z]*)* | 0-9, spl. char., blank field | 1 Char  | 256 Char

Test Case 2 :-
1. Test Case ID : TC_FD_Ravi_24thMay_2
2. Test Case Name : Verify the Amount
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : The Amount object is taking inputs
6. Data Matrix :
I/p Object | ECP Valid | ECP Invalid                       | BVA Min | BVA Max
Amount     | 0-9       | a-z, A-Z, spl. char., blank field | 1500    | 100000

Test Case 3 :-
1. Test Case ID : TC_FD_Ravi_24thMay_3
2. Test Case Name : Verify the Time
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : The Time object is taking inputs
6. Data Matrix :
I/p Object | ECP Valid | ECP Invalid                       | BVA Min | BVA Max
Time       | 0-9       | a-z, A-Z, spl. char., blank field | 1 month | 12 months

Test Case 4 :-
1. Test Case ID : TC_FD_Ravi_24thMay_4
2. Test Case Name : Verify the Interest
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : The Interest object is taking inputs
6. Data Matrix :
I/p Object | ECP Valid                        | ECP Invalid                       | BVA Min | BVA Max
Interest   | 0-9 . 0-9 (numeric, one decimal) | a-z, A-Z, spl. char., blank field | 0.1     | 100
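The data matrices of Test Cases 1-3 can drive the test execution directly. A minimal sketch in Python (the validators and sample values are hypothetical):

import re

# Hypothetical validators for the fixed-deposit input objects.
validators = {
    "Depositor Name": lambda v: re.fullmatch(r"([A-Z][a-z]*)( [A-Z][a-z]*)*", v) is not None,
    "Amount":         lambda v: v.isdigit() and 1500 <= int(v) <= 100000,
    "Time":           lambda v: v.isdigit() and 1 <= int(v) <= 12,
}

# One data-matrix row per input object: ECP valid and invalid samples,
# with the BVA bounds used as the valid samples.
data_matrix = {
    "Depositor Name": {"valid": ["Ravi", "Ravi Kumar"], "invalid": ["ravi", "R4vi", ""]},
    "Amount":         {"valid": ["1500", "100000"],     "invalid": ["1499", "100001", "abc", ""]},
    "Time":           {"valid": ["1", "12"],            "invalid": ["0", "13", ""]},
}

for obj, rows in data_matrix.items():
    check = validators[obj]
    for value in rows["valid"]:
        print(f"{obj}: {value!r:14} {'Pass' if check(value) else 'Fail'} (expected valid)")
    for value in rows["invalid"]:
        print(f"{obj}: {value!r:14} {'Pass' if not check(value) else 'Fail'} (expected invalid)")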
Test Case 5 :-
1. Test Case ID : TC_FD_Ravi_24thMay_5
2. Test Case Name : Verify the fixed deposit operation
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : Valid values are available in hand
6. Test Procedure :
Step No. | Action                       | Required I/p        | Expected O/p
1        | Connect to the bank server   | Valid Emp ID        | Menu appears
2        | Select the "FD" option       | None                | Fixed deposit form opened
3        | Fill the fields and click OK | All are valid       | Acknowledgement
         |                              | Any one invalid     | Error msg.
         |                              | Any one blank field | Error msg.

Test Case 6 :-
1. Test Case ID : TC_FD_Ravi_24thMay_6
2. Test Case Name : Verify the fixed deposit operation with the bank rule
3. Test Suite ID : TS_FD
4. Priority : P0
5. Test Setup : Valid values are available in hand
6. Test Procedure :
Step No. | Action                       | Required I/p                                       | Expected O/p
1        | Connect to the bank server   | Valid Emp ID                                       | Menu appears
2        | Select the "FD" option       | None                                               | Fixed deposit form opened
3        | Fill the fields and click OK | Valid Name & Amount, Time > 10 with Interest > 10  | Acknowledgement
         |                              | Valid Name & Amount, Time > 10 with Interest <= 10 | Error msg.
Like the above example, the test engineers implement test scenarios as test cases. Every test case is a combination of the corresponding test scenario and the required details to apply that test on the software build.

Test Cases Selection Review :

After completion of test scenario and test case writing, the Test Lead & test engineers conduct a review meeting to estimate the completeness and correctness of those documents. In this review the testing team depends on the coverages below.

□ Requirements-oriented coverage (modules)
□ Testing-topic-oriented coverage (UT, FT, NFT)

IV) Test Execution :-

After completion of Test Design and its review, the testing team concentrates on the issues below.

□ Formal meeting with the developers
□ Test environment establishment
□ Levels of test execution

□ Formal Meeting :-

In general, the test execution process starts with a formal meeting between the testing team and development team representatives. In this meeting the corresponding representatives concentrate on build version control and defect tracking.

Under the build version control concept, the development team modifies the software build's coding to resolve defects and releases the modified build with a unique version number. This version numbering system is understandable to the test engineers, so they can distinguish the old build from the modified build. For this version controlling, the developers also use version control tools (Ex :- VSS (Visual SourceSafe)).

To report mismatches to the development team, the test engineers first report each mismatch to the Defect Tracking Team (DTT) :

Test Lead + Project Manager + Project Lead + Business Analyst → DTT
□ Test Environment Establishment :-

After completion of the formal meeting, the testing team concentrates on test environment establishment, with all the required hardware and software.

A central server holds the configuration repository; the development environment, project management and test environment connect to it via FTP (in a single location) or TCP/IP (in different locations).

FTP : File Transfer Protocol (single location)
TCP/IP : Transmission Control Protocol / Internet Protocol (different locations)

□ Levels of Test Execution :-

Development                             Testing
Initial Build                  →        Level-0 (Sanity)
Stable Build                   →        Level-1 (Comprehensive) → Defect Report
Defect Fixing → Modified Build →        Level-2 (Regression)
                                        Level-3 (Final Regression)
Case Study :-

Initial Build
↓
Sanity Testing (Level-0)
↓
Stable Build
↓
Comprehensive Testing (Level-1)
↓
Defect Detection
↓
Modified Build
↓
Regression Testing (Level-2)
↓
Defect Closing
↓
Master Build
↓
Final Regression (Level-3)
↓
Golden Build (able to release)

□ Levels of Test Execution Vs Test Cases :-

Level-0 → Some P0 (functional) test cases
Level-1 → All P0, P1 & P2 test cases
Level-2 → Selected P0, P1 & P2 test cases, with respect to the modifications
Level-3 → Selected P0, P1 & P2 test cases, with respect to the defect density

□ Level-0 (Sanity Testing) :-

After downloading the initial build from the configuration repository in the server, the testing team concentrates on Level-0 sanity testing, to estimate the testability of the software. Testability means understandable, operatable, observable, controllable, consistent, simple, maintainable and automatable. If the initial build is not stable, the testing team sends the build back to the developers. If the build is stable, the test engineers concentrate on Level-1 test execution to detect defects. Level-0 testing is also known as Sanity Testing / Smoke Testing / Testability Testing / Tester Acceptance Testing / Build Verification Testing / Octangle Testing.
□ Level-1 (Comprehensive / Real Testing) :-

In Level-1 test execution, the test engineers execute all test cases as batches. Every test batch consists of a set of dependent test cases; in these batches the end state of one test is the base state for the next test. Test batches are also known as test suites, test sets, test builds or test chains.

Receive the stable build from the developers → make the test cases into batches → select a batch → select a test case → take a step in that case → if the step's expected value equals the build's actual value, go to the next step, next case, or next batch; if not, report a defect.

The test engineers continue test execution batch by batch, and case by case within every batch. If a test case step's expected value is not equal to the actual value, the test engineer concentrates on defect reporting; if possible, they also continue the test execution. In Level-1 test execution the test engineers prepare a Test Log document to record the test results.

Test Log Document Format :-

Test Case ID | Result (Pass / Fail) | Defect ID | Executed By | Executed On | Comments

There are three types of test results :
→ Passed : all expected values are equal to the actual values
→ Failed : any one expected value is not equal to the actual value
→ Blocked : test execution postponed due to an incorrect parent functionality
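A minimal sketch of Level-1 batch execution producing a test log (Python; the batch, the failing case and the names are hypothetical):

from datetime import date

def run_case(case_id):
    # Stand-in for real test execution; here TC_03 is made to fail.
    passed = case_id != "TC_03"
    return passed, (None if passed else "DF_101")

batch = ["TC_01", "TC_02", "TC_03", "TC_04"]   # one test batch (test suite)
test_log = []

for case_id in batch:
    passed, defect_id = run_case(case_id)
    test_log.append({
        "Test Case ID": case_id,
        "Result": "Pass" if passed else "Fail",
        "Defect ID": defect_id or "-",
        "Executed By": "Ravi",
        "Executed On": date.today().isoformat(),
    })

for row in test_log:
    print(row)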
• 47. V. Defect Reporting & Tracking :-
During Level-1 Test Execution, some Test Case expected values are not equal to the actual values. These mismatches are called Defects / Issues / Bugs / Flaws.

Defect Report :-
1. Defect ID : unique number or name
2. Description : summary of the mismatch between the Tester expected value and the Build actual value
3. Build Version ID : the version number of the current Build (the Test Engineer detected this defect in that Build)
4. Feature : the name of the Module or Functionality (the Test Engineer detected this defect in that Module)
5. Test Case ID : the ID of the failed Test Case (the Test Engineer detected this defect in that case execution)
6. Reproducible :
   Yes → defect appears every time in Test Execution
   No → defect appears rarely in Test Execution
7. If Yes, attach procedure
8. If No, attach procedure and screen shots
9. Severity : the seriousness of the defect in terms of functionality
   High / Critical :- not able to continue testing without resolving it
   Medium / Major :- able to continue testing, but compulsory / mandatory to resolve
   Low / Minor :- able to continue testing; may or may not be resolved
10. Priority : the importance of resolving the defect in terms of customer interest (High / Medium / Low)
11. Detected By : the name of the Test Engineer
12. Detected On : the date of detection and submission
13. Status : New (reporting the first time) or Re-Open (re-reporting)
14. Assigned To : reported to the Defect Tracking Team
15. Suggested Fix : suggestion to solve that defect (optional)
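The fifteen fields above map naturally onto a record type. A minimal Python sketch, with illustrative field names only (the document does not prescribe a data format):

from dataclasses import dataclass
from typing import Optional

@dataclass
class DefectReport:
    defect_id: str                       # 1. unique number or name
    description: str                     # 2. summary of the mismatch
    build_version_id: str                # 3. build the defect was found in
    feature: str                         # 4. module / functionality
    test_case_id: str                    # 5. failed test case
    reproducible: bool                   # 6. appears every time (or rarely)
    severity: str                        # 9. "High" / "Medium" / "Low"
    priority: str                        # 10. "High" / "Medium" / "Low"
    detected_by: str                     # 11. test engineer name
    detected_on: str                     # 12. date of detection & submission
    status: str = "New"                  # 13. "New" or "Re-Open"
    assigned_to: str = "DTT"             # 14. reported to the tracking team
    suggested_fix: Optional[str] = None  # 15. optional suggestion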
• 48. Defect Reporting Process :-
→ The Test Engineer reports the defect to the DTT with status "New".
→ The DTT analyzes that defect.
→ If the defect is not accepted, its status is changed to "Rejected".
→ If the defect is accepted, the DTT categorizes it and changes its status to "Open":
   • Test Data related defect → assigned to the Testing Team
   • Test Procedure related defect → assigned to the Testing Team
• 49.
   • H/W or Infrastructure related defect → assigned to the H/W Team
   • Otherwise, the defect is code related → assigned to the Development Team

Case Study :-
→ Code related defect : the Test Engineer reports the defect → the Defect Tracking Team assigns it to the Project Lead + Programmers.
→ Test Case procedure & test data related defect : the Test Engineer reports the defect → the Defect Tracking Team assigns it to the BA + TL + TE.
→ H/W or environment related defect : the Test Engineer reports the defect → the Defect Tracking Team assigns it to the H/W or Infrastructure Team.
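The accept-and-route decision above can be sketched as a single function; the category names and the dict shape are illustrative assumptions.

# Minimal sketch of the DTT routing decision above.
def route_defect(defect):
    """Return where an analyzed defect goes."""
    if not defect["accepted"]:
        return "Rejected"
    category = defect["category"]          # set by the DTT ("Open" status)
    if category in ("test data", "test procedure"):
        return "Testing Team (BA + TL + TE)"
    if category in ("hardware", "infrastructure", "environment"):
        return "H/W Team"
    return "Development Team (PL + Programmers)"   # code related

print(route_defect({"accepted": True, "category": "test data"}))
# Testing Team (BA + TL + TE)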
• 50. Defect Life Cycle or Bug Life Cycle :-

New
↓
Assigned (or Rejected / Deferred)
↓
Open
↓
Fixed (or Reopen → Open again)
↓
Closed

New : reported for the first time
Assigned : accepted by the DTT
Rejected : not accepted by the DTT
Deferred : accepted, but not taken up for solving due to low severity and low priority
Open : the responsible team is ready to resolve it
Fixed : the responsible team has resolved the defect
Reopen : the defect was not correctly solved, so it is re-reported
Closed : the defect is correctly solved and confirmed through Regression Testing
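A minimal sketch of the life cycle above as a table of allowed status transitions; the table is illustrative and simply follows the diagram.

# Defect life cycle as allowed transitions between statuses.
TRANSITIONS = {
    "New":      {"Assigned", "Rejected", "Deferred"},
    "Assigned": {"Open"},
    "Open":     {"Fixed"},
    "Fixed":    {"Closed", "Reopen"},   # confirmed, or re-reported
    "Reopen":   {"Open"},               # responsible team solves it again
    "Rejected": set(),
    "Deferred": set(),
    "Closed":   set(),
}

def move(status, new_status):
    """Validate one status change against the life cycle."""
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"illegal transition: {status} -> {new_status}")
    return new_status

print(move("Fixed", "Closed"))  # Closed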
Test Data Related Defect Fixing :-
If our reported defect is accepted by the Defect Tracking Team (DTT) and they decide that it is a test data related mismatch, the responsible Testing Team concentrates on Correct Data Collection (CDC), without conceptual gaps, with the help of the BA and TL. The Test Engineers then re-execute the previously failed test on the same Build with the correct test data. This test repetition is called Retesting or Confirmation Testing.

[Flow: test the Build → Test Case fails → Defect Reporting → data related defect → collect the correct data → repeat the Test Case on the same Build with the correct data → Retesting / Confirmation Testing.]

• 51. Test Script or Procedure Related Defect Fixing :-
If our reported defect is accepted as a test procedure related defect by the DTT, the responsible Testing Team prepares the correct procedure for that Test Case with the help of the TL and BA.

[Flow: test the Build → Test Case fails due to an incorrect test procedure → report to the DTT → procedure related defect → the Test Engineers prepare the correct test procedure → repeat the Test Case on the same Build → Retesting / Confirmation Testing.]

Infrastructure Related Defect Fixing :-
If our reported defect is accepted by the DTT as an environment / infrastructure / hardware related defect, the responsible Hardware Team re-establishes the correct test environment.

[Flow: test the Build → Test Case fails → report to the DTT → environment related defect → the H/W Team re-establishes the test environment → repeat the Test Case on the Build in the modified environment → Retesting / Confirmation Testing.]
• 52. Code Related Defect Fixing :-
If our reported defect is accepted as a code related defect, then the responsible Programmers / Developers perform changes in the Build coding to resolve that defect.

[Flow: the PL updates the defect status to "Open" → impact analysis by the Programmers → the selected coding areas are reviewed by the PL → if changes are required in documents, those changes are made by the BA / Designers and reviewed by the concerned person (BA / Design) and the Project Lead → the Programmers make the changes in the coding → the modified Build is unit tested → the PL changes the defect status to "Fixed" → the modified Build is released with a unique version number and a Release Note.]

After receiving the modified Build from the Development Team, the Testing Team concentrates on Retesting & Regression Testing.

[Flow: on the modified Build, the Test Engineer repeats the previously failed test (the code related defect reported to the DTT and fixed by the Programmers) and the related previously passed tests; each repeated test either passes or fails and is reported again.]

From the above model, the Test Engineer re-executes the previously failed test on the modified Build to confirm the defect fixing; this is called Retesting or Confirmation Testing.
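Before the regression level begins, two sets of cases are picked out of the Test Log: the previously failed case (for retesting) and the previously passed cases related to the modification (for regression). A minimal Python sketch, with illustrative names and an assumed (case, result, module) log shape:

def pick_repeats(test_log, modified_modules, fixed_case_id):
    """Split the repeats into a retest set and a regression set."""
    retest = [cid for cid, result, module in test_log
              if cid == fixed_case_id]            # confirm the fix
    regression = [cid for cid, result, module in test_log
                  if result == "Passed" and module in modified_modules]
    return retest, regression

log = [("TC_01", "Failed", "Orders"), ("TC_02", "Passed", "Orders"),
       ("TC_03", "Passed", "Reports")]
print(pick_repeats(log, {"Orders"}, "TC_01"))
# (['TC_01'], ['TC_02'])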
• 53. To identify side effects of the defect-fixing modifications in the modified Build, the Test Engineers re-execute the previously passed related tests on that modified Build; this is called Regression Testing.

Level-2 Regression Testing :-
Take the modified Build and the Release Note, and identify the severity of the defect fixed in that modified Build.

High severity   → All P0, All P1 and carefully selected P2 Test Cases
Medium severity → All P0, carefully selected P1 and some P2 Test Cases
Low severity    → Some P0, some P1 and some P2 Test Cases

These cases are run on the modified Build to detect side effects with respect to the modifications specified in the Release Note.

Case 1 :- If the severity of the defect fixed by the Development Team is High, the Test Engineers repeat All P0, All P1 and carefully selected P2 Test Cases on that modified Build w.r.t. the modifications specified in the Release Note.
Case 2 :- If the severity of the defect fixed by the Development Team is Medium, the Test Engineers repeat All P0, carefully selected P1 and some P2 Test Cases on that modified Build w.r.t. the modifications specified in the Release Note.
Case 3 :- If the severity of the defect fixed by the Development Team is Low, the Test Engineers repeat some P0, some P1 and some P2 Test Cases on that modified Build w.r.t. the modifications specified in the Release Note.
Case 4 :- If the Development Team releases a modified Build w.r.t. changes in the Customer Requirements, the Test Engineers re-execute All P0, All P1 and carefully selected P2 Test Cases on that modified Build w.r.t. those changes. In this case the Test Engineers also perform changes in the Test Scenarios and Test Cases w.r.t. the changes in the Customer Requirements.
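Cases 1-4 above reduce to a small decision rule. A minimal Python sketch; "carefully selected" and "some" are left to the Test Engineer and are represented here only as labels:

def regression_breadth(reason, severity=None):
    """How widely to regress on a modified build (Cases 1-4 above)."""
    if reason == "requirement change" or severity == "High":
        return {"P0": "all", "P1": "all", "P2": "carefully selected"}
    if severity == "Medium":
        return {"P0": "all", "P1": "carefully selected", "P2": "some"}
    return {"P0": "some", "P1": "some", "P2": "some"}   # Low severity

print(regression_breadth("defect fix", "High"))
# {'P0': 'all', 'P1': 'all', 'P2': 'carefully selected'}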
VI. Test Closure :-

• 54. After completion of all reasonable tests and the closing of the detected defects, the Test Lead conducts a review meeting to stop testing. In this review the TL analyzes the below factors with the involvement of the Test Engineers.

1. Coverage Analysis :-
→ Requirements Oriented Coverage (Modules)
→ Testing Topic Oriented Coverage (Usability, Functional, Non-Functional)

2. Defect Density Calculation :- (see the sketch after this slide)
Ex :
Module / Requirement     % of Defects
A                        20%
B                        20%
C                        40%  (needs Regression Testing)
D                        20%
Total                    100%

3. Analysis of Deferred Defects :-
Whether the deferred defects are reasonable to postpone or not.

Level-3 Final Regression Testing :-
After completion of a successful Test Closure review, the Testing Team concentrates on Level-3 or Final Regression Testing.

[Flow: identify the high defect density modules → effort estimation (person / hour) → plan the regression → Final Regression Testing → defect reporting, if required → Golden Build.]
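The defect density example above, and the Level-3 focus on high-density modules, can be computed as below. A minimal Python sketch; the 40% flagging threshold is an assumption taken from the example, not a fixed rule.

def defect_density(defects_per_module):
    """Percentage share of defects per module / requirement."""
    total = sum(defects_per_module.values())
    return {m: 100.0 * n / total for m, n in defects_per_module.items()}

density = defect_density({"A": 5, "B": 5, "C": 10, "D": 5})
high_density = [m for m, pct in density.items() if pct >= 40]
print(density)       # {'A': 20.0, 'B': 20.0, 'C': 40.0, 'D': 20.0}
print(high_density)  # ['C'] -> concentrate Final Regression here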
• 55. VII. User Acceptance Testing (UAT) :-
After completion of Final Regression Testing, the Project Management concentrates on User Acceptance Testing to collect feedback from real customers / model customers. There are two ways of conducting User Acceptance Testing: Alpha Testing and Beta Testing.

VIII. Sign Off :-
After completion of successful User Acceptance Testing and the resulting modifications, the Test Lead prepares the Final Test Summary Report and relieves the corresponding Test Engineers from the project. The Final Test Summary Report is a combination of the below documents.
→ Test Strategy / Methodology
→ Test Plan(s)
→ Test Scenarios
→ Test Cases
→ Test Logs
→ Defect Reports
→ Requirements Traceability Matrix (RTM)

RTM Format :-

Requirement ID | Test Case ID | Result (Pass / Fail) | Detected Defect ID | Status (Closed / Deferred) | Comments

It is a mapping between requirements and defects via test cases.
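A minimal Python sketch of the RTM format above, and of the requirement-to-defect mapping it provides; the row values are invented examples.

rtm = [
    # (requirement, test case, result, detected defect, status)
    ("REQ-01", "TC_01", "Fail", "D-101", "Closed"),
    ("REQ-01", "TC_02", "Pass", None,    None),
    ("REQ-02", "TC_07", "Fail", "D-102", "Deferred"),
]

# The mapping between requirements and defects, via test cases:
defects_by_req = {}
for req, _case, _result, defect, _status in rtm:
    if defect:
        defects_by_req.setdefault(req, []).append(defect)
print(defects_by_req)  # {'REQ-01': ['D-101'], 'REQ-02': ['D-102']}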
• 56. Case Study (5 Months of Testing Process) :-

Deliverable                               Responsibility                   Duration
Test Strategy                             PM / TM                          4-5 days
Test Planning                             Test Lead                        4-5 days
Requirements Training to Test Engineers   BA + Domain / Subject Experts    5-10 days
Test Scenarios & Review                   Test Engineer                    5-10 days
Test Cases Implementation & Review        Test Engineer                    10-15 days
Build + Level-0 (Sanity Testing)          Test Engineer                    2-3 days
** Test Automation                        Test Engineer                    10-15 days
Level-1 and Level-2 Test Execution        Test Engineer                    30-40 days
Defect Reporting                          Test Engineer                    On Going (same day)
Status Reporting                          Test Lead                        Weekly twice
Test Closure & Level-3                    Test Lead & Test Engineer        5-10 days
User Acceptance Testing                   Real / Model Customers, in front of Developers and Testers   3-5 days
Sign Off                                  Test Lead                        1-2 days

W-Model :-

[Diagram: the W-Model pairs each development stage (Requirements Analysis, S/w Design, Coding + Unit Testing, Integration Testing, Build) with a testing activity; System Testing is performed both manually and with Test Automation tools:
→ Non-Functional Testing (N.F.T) : Load Runner & JMeter
→ Functional Testing (F.T) : Win Runner / QTP / Robot / Silk
→ Usability Testing : no tools in the market
Note : Test Automation is optional.]

From the above W-Model, Testing Tools are available for Functional Testing and for some Non-Functional Testing, such as Load Testing, Endurance Testing and Data Volume Testing. The remaining Non-Functional Tests and Usability Testing are conducted by the Test Engineers manually.
• 57. Win Runner 8.0 :-
→ Developed by Mercury Interactive and taken over by Hewlett-Packard (HP)
→ Functional Testing Tool
→ This version was released in January 2005
→ Supports VB, .Net, Java, Power Builder, HTML, Delphi, VC++, D2K and Siebel technology software for Functional Testing
→ To support SAP, PeopleSoft, XML, Multimedia and Oracle Applications ("ERPs") in addition to the above technologies, Test Teams use Quick Test Professional (QTP)
→ Win Runner runs on Windows only; X-Runner is used for Unix / Linux

Win Runner Test Process :-

Receive Stable Build from Developers after Sanity Testing
↓
Identify Functional Test Cases (Priority P0) to automate (English + Manual)
↓
Create automation programs (TSL) for those Functional Test Cases
↓
Run the programs on the S/w Build to detect defects
↓
Test Reporting, if required

Following the above approach, the Test Engineers convert Manual Functional Test Cases into Test Script Language (TSL) programs. TSL is a "C"-like language.

Add-in Manager :-
This window lists all Win Runner supported technologies with respect to the license. The Test Engineers select the current project technology in that list.

Welcome Screen :-
After a successful Win Runner launch, the Welcome Screen comes up on the desktop. The screen consists of 3 options:
→ Create a New Test
→ Open an Existing Test
→ A Quick Preview of Win Runner
• 58. Win Runner Icons :-
→ Start / Stop Recording
→ Run From Top
→ Run From Arrow
→ Pause (Stop Run)

Win Runner Test Automation Frameworks :-
Win Runner 8.0 allows you to convert Manual Functional Test Cases into Test Script Language (TSL) programs in 4 ways:
→ Record and Playback Framework
→ Data Driven Framework
→ Keyword Driven Framework
→ Hybrid Framework

I. Record & Playback Framework :-
In this framework the Test Engineers convert manual Test Cases into automation programs with a two-step procedure:
A. Recording Operations
B. Inserting Check Points

A. Recording Operations :-
In Test Automation program creation, the Test Engineers record S/w Build operations. There are two recording modes: Context Sensitive Mode and Analog Mode.
In Context Sensitive Mode, the tool records mouse and keyboard operations with respect to the objects and windows in the Build. To select this mode, the Test Engineers use the below options:
→ Click the "Start Recording" icon once
→ Test Menu → Record – Context Sensitive
To record mouse pointer movements with respect to desktop co-ordinates, the Test Engineers use Analog Mode in Win Runner. To select this mode, we can use the below options:
• 59.
→ Click the "Start Recording" icon twice
→ Test Menu → Record – Analog
Ex :- Digital signatures, graph drawing and image movements.
"F2" is the shortcut key to change from one mode to the other.

Note :- In Analog Mode, Win Runner records mouse pointer movements with respect to desktop co-ordinates. For this reason, the Test Engineers do not change the corresponding window position or the monitor resolution after recording.

B. Inserting Check Points :-
After recording Build operations, the Test Engineers insert check points with respect to their expectations. Every check point compares a Test Engineer given expected value with the Build actual value. There are four check points in Win Runner:
→ GUI (Graphical User Interface) Check Point
→ Bitmap Check Point
→ Database Check Point
→ Text Check Point

GUI (Graphical User Interface) Check Point :-
To verify the properties of objects, we can use this check point. It consists of 3 sub-options:
i. For Single Property
ii. For Object / Window
iii. For Multiple Objects

i. For Single Property :-
To verify one property of one object, we can use this option.

Ex-1 : Test Procedure :-

Step No. | Action                                     | Required I/p       | Expected O/p
1        | Open an order in Flight Reservation window | Order No. as valid | Delete Order button "enabled"
• 60. Build :- Flight Reservation Window

Automation Program :-

# activate the Flight Reservation window
set_window ("Flight Reservation", 1);
# open an existing order: File -> Open Order...
menu_select_item ("File;Open Order...");
set_window ("Open Order", 1);
button_set ("Order No.", ON);   # choose the "Order No." radio button
edit_set ("Edit", "1");         # enter a valid order number
button_press ("OK");
set_window ("Flight Reservation", 1);
# single-property check point: "enabled" property of Delete Order = 1 (true)
button_check_info ("Delete Order", "enabled", 1);

Ex-2 : Test Procedure :-

Step No. | Action                                     | Required I/p       | Expected O/p
1        | Open an order in Flight Reservation window | Order No. as valid | Insert Order button "disabled"

Build :- Flight Reservation Window

Automation Program :-

# same flow as Ex-1; the check point now expects "enabled" = 0 (disabled)
set_window ("Flight Reservation", 1);
menu_select_item ("File;Open Order...");
set_window ("Open Order", 1);
button_set ("Order No.", ON);
edit_set ("Edit", "1");
button_press ("OK");
set_window ("Flight Reservation", 1);
button_check_info ("Insert Order", "enabled", 0);

Note :- TSL is a case sensitive language, and it uses the # symbol for comments.