2. Md. Shaiful Islam
B.Sc. in Computer Science
University of Chennai, India.
Training: Certified Software Test Professional (CSTP)
Experience: 4 years in Software Testing
3. • Software Lifecycle Models
• SLC Process in Industry
• Quality Control
• Q&A
4. • Waterfall or Linear Sequential
• Rapid Application Development
• Incremental
• Prototyping
• Spiral
• Joint Application Development
• Rational Unified Process
• Agile Software Development
5. Waterfall / Linear Sequential model (diagram):
• System Feasibility
• System Planning & Requirement Analysis
• System Design
• Coding
• Integration
• Implementation
• Operation & Maintenance
6. Rapid Application Development model (diagram): Team #1 and Team #2 each work through the same phases in parallel:
• Business Modelling
• Data Modelling
• Process Modelling
• Application Module Generation
• Testing & Delivery
8. • Requirements are developed in a series of iterations
• Discussion with client regarding overall system objectives
• Identify requirements
• Conceptual or Gross Design
• Develop Prototype
• Obtain feedback
• Refine requirements
9. • More of a project management strategy
and approach;
• Development team, customer management and user group work together;
• Any lifecycle model is used;
• Lifecycle activities may differ;
10. SLC Process for Software Development (diagram), two tracks:
Bespoke Software Development:
• System Analysis
• System Design
• Coding/Construction
• Testing
• Implementation
Enhancement of Existing Software (Large/Medium):
• Prepare System Specification
• Design
• Coding/Construction
• Testing
• Implementation & Acceptance
Followed by Maintenance (During Warranty Period).
11. • Template followed, either SSADM or OOAD;
• 4 types of requirements are considered
– Customer Requirements;
– Product Requirements;
– Interface Requirements;
– Implementation Requirements.
• Input and output requirements are identified
• Identify acceptance criteria with reference to
– Scope
– Functionality
– Performance
– Security
12. • Draft Implementation Plan
• Identify hardware, software and infrastructure requirements
• Review
• Approve and baseline
• Update RTM
13. • Template followed, either SSADM or OOAD
• Choose effective design methodology and standards
• Prepare Both HLD and LLD
• Product or product component requirements are designed
• Design PI activities;
• Decide on product acquisition type
15. • Coding is started as per Schedule;
• Done as per SDD and other required input formats;
• Coding standard is followed;
• Code elements are brought under Configuration Management at each milestone;
• Conduct code review;
• Update RTM;
• Assemble product components as per design;
16. • Documentation is done as follows
– Team member prepares manuals;
– Work hand in hand with other members throughout the development lifecycle;
– Documents are reviewed and approved;
– Documents are configuration controlled;
– Final version is verified.
17. • Done as per QC Plan
• The Review and Testing Process and the Guidelines for Review and Testing are followed
• PM keeps record of identified defects
• Test results are reviewed and approved
• Delivery package is prepared
18. Software Testing & Quality Control
• Testing is a process of evaluating a system by manual or automated means to verify that it satisfies specified requirements, or to identify differences between expected and actual results.
• Quality provides customer satisfaction the first time and every time. It is a factor affecting an organization's long-term performance and improves productivity and competitiveness.
19. Why Testing?
• Software testing is important: if it is not done properly, defects can cause mission failure and impact operational performance and reliability.
• To deliver quality software products that satisfy user requirements, needs and expectations.
• Uncovering defects before the product is installed in production can save a huge loss.
20. Participants in Testing
• Software Customer
• Software User
• Software Developer
• Tester
• Information Service Management
• Senior Organization Management
21. Recent Major Computer System Failures
According to news reports in April 2004, a software bug was determined to be a major contributor to the 2003 Northeast blackout, the worst power system failure in North American history. The failure involved loss of electrical power to 50 million customers, forced shutdown of 100 power plants, and economic losses estimated at $6 billion. The bug was reportedly in one utility company's vendor-supplied power monitoring and management system, which was unable to correctly handle and report on an unusual confluence of initially localized events. The error was found and corrected after examining millions of lines of code.
22. Software Development Life Cycle
• Requirement – SRS (Software Requirement Specification), SRAS (Software Requirement & Analysis Specification), FS (Functional Specification)
• Design – HLD (High Level Design), LLD (Low Level Design)
• Coding – according to code format
• Testing
• Implementation
• Maintenance
23. W-Model (diagram)
Each development step has a matching test activity: test requirements are written and tests are designed alongside Requirement, Logical Design and Physical Design; then Unit Test follows Code, Integration Test follows Build Software, System Test follows Build System, and Acceptance Test follows Install.
24. Testing Economics & Cost
Development Cycle | Traditional Test: Accumulated Test Cost | Traditional Test: Accumulated Error Remaining | Continuous Test: Accumulated Error Remaining | Continuous Test: Accumulated Test Cost
Requirement | 0 | 20 | 10 | $10
Design | 0 | 40 | 15 | $25
Code | 0 | 60 | 18 | $42
Testing | $480 | 12 | 4 | $182
Production | $1690 | 0 | 0 | $582
25. Testing Type
• Static (Review)
• Dynamic (Execution)
Static:
• Only review, not execution, of the program
Dynamic:
• Structural (logic, white box testing, developer)
• Functional (no logic, black box testing, tester)
26. Structural Testing
• Concerned with testing the implementation of the
program
• Focus on the internal structure of the program
• The intention of structural testing is not to exercise all the different I/P or O/P conditions, but to exercise the different programming structures and the data structures of the program
27. Functional Testing
• Structure of the program is not considered
• Test cases are decided based on the requirements or specification of the program or module
• Hence it is called "Black Box" testing
28. Some Definitions
What is Test Plan?
• Road map for the entire testing activity
What are Test Cases?
• Set of procedures which we execute in our system to find
defects
What is a Defect?
• A defect is a variance from a desired product attribute
• Variance from customer/user expectation
29. Primary Role of Software Testing
• Determine whether the system meets specification
(Producer View)
• Determine whether the system meets business and user
needs (Customer View)
Role of Tester: Find defects, not correct them
Classification of Defects:
• Wrong (ER != AR)
• Missing (a required point is missing)
• Extra (an extra point is present)
30. Testing Levels
• Unit Testing
• Integration Testing
• System Testing &
• Application Under Test (AUT) or User Acceptance Test (UAT)
31. Unit Testing
• LLD
• Module Testing
• Individual testing
• White Box Testing
• Developer job
• Test each module individually
• Follow White Box Testing (logic of the program)
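To make this concrete, here is a minimal unit-test sketch (not from the slides) using Python's unittest; the module function apply_credit and its behaviour are assumed purely for illustration:

    import unittest

    def apply_credit(balance, amount):
        # Hypothetical module-level function under test
        if amount < 0:
            raise ValueError("amount must be non-negative")
        return balance + amount

    class ApplyCreditTest(unittest.TestCase):
        # Each test exercises the module in isolation, checking its internal logic
        def test_adds_amount_to_balance(self):
            self.assertEqual(apply_credit(100, 25), 125)

        def test_rejects_negative_amount(self):
            with self.assertRaises(ValueError):
                apply_credit(100, -5)

    if __name__ == "__main__":
        unittest.main()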
32. Integration Testing
• LLD + HLD (Developer + Tester)
• Communication + Data Flow
• WB + BB = Gray Box
• Integrate two or more modules, i.e. communication between modules
• Follow White Box Testing (testing the code)
33. System Testing
• Confirms that the system as a whole
delivers the functionality originally
required.
• Follow Black Box Testing
• Functionality Testing, Tester job
34. User Acceptance Testing
• Building the confidence of the client and
users is the role of the acceptance testing
phase
• It depends on the business scenario
• Red Box Testing (crucial)
36. White Box Testing
• Statement Coverage: Execute all statements
at least once
• Decision Coverage: Execute each decision direction at least once
• Condition Coverage: Execute each condition in a decision with all possible outcomes at least once
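An illustrative sketch (assumed example, not from the slides) of how the three coverage criteria differ on one small Python function:

    def classify(credit_ok, limit_ok):
        # One decision built from two conditions
        if credit_ok and limit_ok:
            return "approve"
        return "reject"

    # Statement coverage: these two calls execute every statement at least once.
    assert classify(True, True) == "approve"
    assert classify(False, False) == "reject"

    # Decision coverage: the if-decision above has already been taken both ways.

    # Condition coverage: each individual condition must also take True and False,
    # so the mixed cases are added as well.
    assert classify(True, False) == "reject"
    assert classify(False, True) == "reject"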
38. Equivalence Partitioning
• A subset of data that is representative of a larger class
• For example, a program which edits credit limits within a given range ($10000-$15000) would have 3 equivalence classes:
a. Less than $10000 (Invalid)
b. Between $10000 and $15000 (Valid)
c. Greater than $15000 (Invalid)
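A sketch of the credit-limit example in Python (the validation function itself is assumed); one representative value per equivalence class is enough to cover the class:

    def credit_limit_is_valid(limit):
        # Assumed rule from the slide: valid between $10000 and $15000 inclusive
        return 10000 <= limit <= 15000

    assert credit_limit_is_valid(9000) is False    # class a: below the range (invalid)
    assert credit_limit_is_valid(12000) is True    # class b: inside the range (valid)
    assert credit_limit_is_valid(16000) is False   # class c: above the range (invalid)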
39. Boundary Analysis
• A technique that consists of developing test cases and data
that focus on the input and output boundaries of a given
function
• In the same credit limits example, boundary analysis would test:
• Lower Boundary: plus or minus 1 ($9999 and $10001)
• On the Boundary: $10000 and $15000
• Upper Boundary: plus or minus 1 ($14999 and $15001)
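Continuing the same assumed credit-limit function, a boundary-value sketch that checks each boundary and its neighbours:

    def credit_limit_is_valid(limit):
        # Same assumed rule: valid between $10000 and $15000 inclusive
        return 10000 <= limit <= 15000

    cases = [
        (9999, False), (10000, True), (10001, True),    # around the lower boundary
        (14999, True), (15000, True), (15001, False),   # around the upper boundary
    ]
    for value, expected in cases:
        assert credit_limit_is_valid(value) is expected, value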
40. Error Guessing
• Based on the theory that test cases can be developed from the experience of the test engineer
• For example, where one of the inputs is a date, a test engineer might try February 29, 2001
• One more example: according to newspaper reports, Microsoft lost $100 million because its programmers lacked geographical knowledge.
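A sketch of the date example (the parser name and input format are assumed): 2001 is not a leap year, so an experienced tester would guess that February 29 must be rejected:

    from datetime import date

    def parse_entry_date(text):
        # Hypothetical input-parsing routine under test (YYYY-MM-DD)
        year, month, day = (int(part) for part in text.split("-"))
        return date(year, month, day)

    try:
        parse_entry_date("2001-02-29")      # invalid: 2001 is not a leap year
        raise AssertionError("expected the invalid date to be rejected")
    except ValueError:
        pass                                # rejected as expected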
41. Incremental Testing
Top-down and bottom-up. A disciplined method of testing the interfaces between unit-tested programs as well as between system components.
Types: Top-down, that is HLD; Bottom-up, that is LLD.
Top-down: Begins testing from the top of the module hierarchy and works down to the bottom, using interim stubs to simulate lower interfacing modules or programs.
Bottom-up:
• Begins testing from the bottom of the hierarchy and works up to the top
• Bottom-up testing requires the development of driver modules which provide the test I/P, call the module or program being tested, and display the test O/P
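A minimal sketch (module names and values assumed) of the stub and driver roles described above:

    # Top-down: a stub stands in for a lower-level module that is not ready yet.
    def fetch_tax_rate_stub(region):
        return 0.25                         # fixed, known value

    def compute_invoice_total(net_amount, fetch_tax_rate=fetch_tax_rate_stub):
        # Higher-level module under test; the real fetch_tax_rate is plugged in later.
        return net_amount * (1 + fetch_tax_rate("default"))

    assert compute_invoice_total(100.0) == 125.0

    # Bottom-up: a driver supplies test input, calls the lower-level module,
    # and displays the output, replacing the missing higher-level caller.
    def tax_for(net_amount, rate):
        return net_amount * rate

    def driver():
        for net_amount in (50.0, 100.0, 250.0):
            print(net_amount, "->", tax_for(net_amount, 0.25))

    driver()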
42. Thread Testing
• A technique often used during early
integration testing
• Demonstrate key functional capabilities by
testing a string of units that accomplish a
specific function in the application
43. Criteria of Testing Levels
Level               | Entry Criteria                | Exit Criteria
Unit Testing        | Base code                     | Logic 100% complete
Integration Testing | Unit testing complete         | Communication & data flow
System Testing      | Integration testing complete  | ER = AR
UAT                 | Implemented software          | Delivered to customer
Regression          | Any time a defect is found    | Defect is solved
44. Special Test Types
Performance:
• The time taken to complete a task
• How is performance measured?
a. Process speed
b. Response time
c. Efficiency
45. Performance Test
• Load (number of users)
• Stress (response time when resources are pulled down)
• It is designed to test the run-time performance of software
• It occurs throughout all steps in the testing process (test levels)
46. Load
• The maximum number of users a system can support is called "Load"
• How large can the database grow before performance degrades?
• At what point will more storage space be required?
47. Stress
• Running the software under constrained conditions
• Low memory, low disk space and so on
• Limiting resources to their bare minimum
• Pulling down resources
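A rough sketch (the operation and request count are assumed) of measuring response time under repeated load in Python:

    import time

    def handle_request(payload):
        # Stand-in for the operation whose run-time performance is being measured
        return sum(payload)

    def measure_response_time(n_requests=1000):
        # Simulated load: issue many requests and report average and worst response time
        payload = list(range(100))
        timings = []
        for _ in range(n_requests):
            start = time.perf_counter()
            handle_request(payload)
            timings.append(time.perf_counter() - start)
        return sum(timings) / len(timings), max(timings)

    average, worst = measure_response_time()
    print(f"average {average * 1000:.3f} ms, worst {worst * 1000:.3f} ms")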
48. Benefits Realization Test
• It is a test or analysis conducted after an application is moved into production
• To determine whether the application is likely to deliver the original benefits
• This is conducted by the user or client group who requested the project
49. Configuration, Compatibility, Recovery & Regression Test
• Configuration: testing performed to find the various supported combinations of hardware & software
• Compatibility: nothing but combinations of the software itself
• Recovery: features built into the application for handling interruption and returning to the actual point/page in the application
• Regression: Tester -> 1000 test cases -> 100 defects -> developer -> tester
50. Roles and Responsibilities
Test Manager: Manages the entire testing activity (approves)
Test Leader: Prepares the test plan, reviews test cases, monitors defect tracking, and provides resources.
Test Engineer: Prepares test case designs (test risks & reports)
51. Test Environment
• Software Requirements
• Hardware Requirements
• Tools that are needed
52. Test Case Design (Manual)
• Test Case ID: Sys_xyz_01 (unique)
• Test Case Description: A short note about the testing
• Test Case Procedure: Each and every step has to be mentioned in the test case procedure
• Test I/P or Test Data: Input data
• Expected Result: The expected outcome of the test case (as per requirements)
• Test Execution: What we received after execution, i.e. the actual result
• If Expected Result = Actual Result, the test passes; otherwise it fails
• Test Log: How many test cases were executed and how many passed or failed (result)
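A sketch of the same fields as a record in Python; the ID, steps and data values are invented for illustration:

    # Field names mirror the slide; the concrete values are made up.
    test_case = {
        "test_case_id": "Sys_xyz_01",
        "description": "Verify a credit limit inside the valid range is accepted",
        "procedure": ["Open the credit screen", "Enter the limit", "Press OK"],
        "test_data": {"limit": 12000},
        "expected_result": "Limit accepted",
    }

    def execute(case):
        # Stand-in for running the procedure against the system under test
        actual_result = "Limit accepted"
        status = "Pass" if actual_result == case["expected_result"] else "Fail"
        return {"id": case["test_case_id"], "actual_result": actual_result, "status": status}

    test_log = [execute(test_case)]         # the test log collects results of executed cases
    print(test_log)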
53. Bug Life Cycle
New (Tester) -> Open (Developer) -> Fixed (Developer) -> Closed (Tester) or Reopen (Developer)
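A small sketch of the cycle as a state table in Python, assuming a reopened defect goes back to be fixed again; states are the ones named on the slide, the transition set is an assumption:

    TRANSITIONS = {
        "New": {"Open"},
        "Open": {"Fixed"},
        "Fixed": {"Closed", "Reopen"},
        "Reopen": {"Fixed"},
        "Closed": set(),
    }

    def advance(current, next_state):
        # Reject any transition not allowed by the table above
        if next_state not in TRANSITIONS[current]:
            raise ValueError(f"illegal transition {current} -> {next_state}")
        return next_state

    state = "New"
    for step in ("Open", "Fixed", "Reopen", "Fixed", "Closed"):
        state = advance(state, step)
    print(state)                            # Closed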
54. Defect Tracking
All failed test executions/defects come under defect tracking
Defect Tracking Contains:
• Defect ID: Sys_Def_xyz_01
• Test Case ID: Sys_xyz_01
• Defect Description
• Status of Defect
• Reproducible
• Detected by whom
• Assigned to whom
Reproduce: If the developer asks to reproduce a defect, the tester should reproduce it
55. Test Cases
ID | Description          | Test Cases                                 | ER                  | AR           | Status
01 | Testing "OK" button  | Agent Name > 4 chars & Password "mercury"  | It should accept    | Accepted     | Pass
02 | Testing "OK" button  | Agent Name > 4 chars & Password "mercury"  | It should accept    | Not accepted | Fail
03 | Testing "OK" button  | Agent Name < 4 chars & Password "mercury"  | It shouldn't accept | Accepted     | Fail
04 | Testing "OK" button  | Agent Name < 4 chars & Password "mercury"  | It shouldn't accept | Not accepted | Pass
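The acceptance rule implied by the ER column can be sketched as follows (the function name is assumed, and the behaviour for a name of exactly 4 characters is not specified in the table):

    def ok_button_accepts(agent_name, password):
        # Inferred rule: accept only names longer than 4 chars with password "mercury"
        return len(agent_name) > 4 and password == "mercury"

    assert ok_button_accepts("Nadia", "mercury") is True     # > 4 chars, correct password
    assert ok_button_accepts("Bob", "mercury") is False      # < 4 chars
    assert ok_button_accepts("Nadia", "secret") is False     # wrong password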
56. • Conduct audit before final delivery
• Audit activities are carried out as per QA Process
• Install software in operation environment
– Installation certificate is obtained
– Installation statistics are recorded
• Acceptance testing is carried out
• User training is provided (if within scope)
• Obtain acceptance certificate
– Acceptance note;
– Acceptance over phone/fax/email