by
QA Department
<dd-Mmm-yy>
Agenda
About Software Products
Addressing Defects
Role of Software Testing
The Goal of Software Testing
A good software product is expected to meet the following criteria:
Fulfill all customer and end-user requirements, such as:
Functional requirements
Addressing Defects
Process of exercising or evaluating a system or system component by manual or automated means to verify that it satisfies the specified requirements (IEEE 83a)
A structured process that uncovers the defects in a software product (Myers)
Destructive in nature: dismantling the wishful assumption that the code is bug-free (Boris Beizer)
Testing is a process of executing a program with the intent of finding errors (Myers)
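Myers' definition, "executing a program with the intent of finding errors," can be illustrated with a minimal sketch. The `is_leap_year` function and the chosen inputs are hypothetical; the point is that test inputs are picked deliberately to expose likely defects, not to confirm success:

```python
def is_leap_year(year: int) -> bool:
    """Return True if year is a leap year under Gregorian rules."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# A destructive mindset picks inputs most likely to expose defects:
# century years are the classic trap for naive implementations.
assert is_leap_year(2024) is True    # ordinary leap year
assert is_leap_year(1900) is False   # divisible by 100 but not by 400
assert is_leap_year(2000) is True    # divisible by 400
```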
Primary role
Verification: are we doing things right? (process)
Validation: are we doing the right things? (requirements)
Secondary role
Confidence to ship the product
Insight into the software process
Improve the software testing process
Users' Expectations
Reliable software:
Should not crash
Should not cause loss of life, money, or property
Should not cause loss or alteration of data
If it fails, should do so gracefully
If it fails, should recover from the failure easily
Attractive software:
Pleasant to use and attractive to the prospective user
Attractive and consistent visual design
Compatible software:
Should run on all intended hardware platforms
Should run on all intended software configurations
Should not interfere with the operation of other software
Should work with earlier versions of the same software
Efficient software:
Should meet the end users' expectations
Fit for use by design and by nature
Should not monopolize hardware or software resources
Usable software:
Should not annoy its intended user
Consistent in its design and not confusing
Not overly complex to learn
Supporting documentation, such as online help
Always indicates to the user what is happening
Should not take control away from the user without any indication
Installable software:
Should install quickly and easily
Should not interfere with any other software
Follows the principles of good software
Additional information available for unsuccessful installations
Uninstalls in a clean fashion
Usability
Likeability
Maintainability
Interoperability
Reliability
Testers' Thinking
Are the requirements complete?
Is the design scalable?
Will the software be maintainable?
How will the user experience be?
Are the builds stable?
Is the product ready?
Is the number of bugs acceptable?
Optimism
Effective test process
Characteristics
Test approach
Collaborative approach
Concept of STLC
Testing tasks
Test deliverables
Test audit
Step #2: Defining and Managing the Test Objectives Identification of:
Usage scenarios
Step #3: Test Progress Monitoring
Activities
Accurate time-accounting information
Product and test quality tracking activities
Collaborative Approach
Roles of Developers and Testers are as follows:
Developer briefs the tester on:
How the product/module works
The intent of each feature
Features or areas that are more error/risk prone
The schedule for delivering the remaining feature sets
Concerns about hidden/abstract activities
Tester briefs the developer on:
Test coverage, including the types and levels of testing
The feature(s) as understood by the testers
Information required to make the feature(s) more granular
Testing phases and schedule
What testers expect from the development team
Methods/details of status reporting and defect tracking
Concept of STLC
STLC Tasks
STLC is a systematic approach:
To standardize the testing process
To eliminate redundancy in functional testing
To enforce defect-prevention activities
To incorporate configuration control
STLC Phases
Preparing the test plan involves the following activities:
Analyze the scope of the project
Product/test requirement document
Develop risk-assessment criteria
This involves the following activities:
Set up the test environment
Design test cases: requirement-based and code-based
Decide whether any set of test cases is to be automated
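Before automation, a requirement-based test case is usually captured as structured data. The sketch below is illustrative only; the field names, the requirement ID, and the screening rule are hypothetical, not part of any standard:

```python
# Hypothetical requirement-based test case, recorded before automation.
test_case = {
    "id": "TC-001",
    "requirement": "REQ-LOGIN-01: lock the account after 3 failed attempts",
    "preconditions": ["user 'alice' exists", "account is unlocked"],
    "steps": [
        "Submit a wrong password 3 times",
        "Attempt login with the correct password",
    ],
    "expected": "Account is locked; the correct password is rejected",
    "automate": True,   # flagged during the 'decide what to automate' step
}

def is_automation_candidate(tc: dict) -> bool:
    # Simple screening rule: the case is flagged for automation
    # and has a clear, checkable expected result.
    return tc["automate"] and bool(tc["expected"])

assert is_automation_candidate(test_case)
```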
Test execution involves:
Running initial test cycles, bug fixes, and re-testing
Final testing and implementation
Setting up a database to track components of the automated testing system, i.e., reusable modules
The tasks involved are:
Identify the bugs/defects as per the test cases
Report the bugs/defects through the bug-reporting system
The activities are:
Maintaining the configuration of related work products
Final review of testing
Metrics to measure improvements
Replication of the product
Product delivery records
Evaluating test effectiveness
Software installation
Provide support during acceptance testing
Analyze and address errors/defects/bugs
The test process is complex and time-consuming, so the testing life cycle should start at the same time as the development life cycle; starting it later may lead to:
Testing Tasks
Test strategy
Ongoing Activities
Configuration management and change control
Independent verification and validation (IV&V)
Interim reviews
Milestone reviews
Group reviews
Peer reviews
Quality process compliance checks
Configuration audits
Process audits
Technical audits
Test Deliverables
Test scripts/cases/scenarios
Post-test validation form
Test summary report
Benchmarking report
Records on the test process
Frozen configuration items in the configuration management tool
Problem reports listing open bugs
Release sign-off
Test metrics
Test library for future use
Test Audit
To verify adherence to the test process in a project
To ensure tests are practical, understood, and followed
To identify adherence to plans
To ensure the integrity of status reports
To ensure test deliverables
To ensure review records
To ensure risk mitigation
Test Strategy
Defines the scope and general direction for testing of the project
High-level, prepared together with the project plan
What are the trade-offs?
Testing Techniques
Specification-based (black-box/functional testing):
Equivalence partitioning
Cause-effect graphing
Boundary value analysis
Category partition
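As a sketch of the first two black-box techniques, consider a hypothetical eligibility rule (valid ages 18 to 65 inclusive); the function name and range are assumptions for illustration. Equivalence partitioning picks one representative per partition, while boundary value analysis targets the edges, where off-by-one defects cluster:

```python
def is_eligible(age: int) -> bool:
    """Hypothetical rule: ages 18..65 inclusive are eligible."""
    return 18 <= age <= 65

# Equivalence partitions: below range, in range, above range.
# One representative value per partition is often enough.
assert is_eligible(10) is False   # partition: age < 18
assert is_eligible(40) is True    # partition: 18 <= age <= 65
assert is_eligible(70) is False   # partition: age > 65

# Boundary value analysis: test at and adjacent to each boundary.
for age, expected in [(17, False), (18, True), (19, True),
                      (64, True), (65, True), (66, False)]:
    assert is_eligible(age) is expected, f"age={age}"
```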
Formal specification based
Control-flow-based criteria
Data-flow-based criteria
Condition coverage
Path coverage Cyclomatic complexity
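A white-box sketch of these structural criteria, using a hypothetical function: it contains two decision points, so its cyclomatic complexity is 3 (number of decisions + 1), and three tests suffice to exercise all independent paths:

```python
def classify(x: int, y: int) -> str:
    # Two decision points -> cyclomatic complexity = 2 + 1 = 3.
    if x > 0:
        if y > 0:
            return "both positive"
        return "x positive only"
    return "x non-positive"

# Three test cases cover the three independent paths,
# driving each condition both true and false at least once.
assert classify(1, 1) == "both positive"
assert classify(1, -1) == "x positive only"
assert classify(-1, 5) == "x non-positive"
```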
Acceptance testing
Interface testing Regression testing Special testing
Unit Testing
Purpose
Typos
Basic logic problems
Syntax errors
Assumptions/pre-conditions
Developer should test basic functionality and normal processing paths
Expectations
Every path/line of code, new or modified, should be executed and tested
Code inspection should verify functionality
Possible values should be tested for data-entry fields
Error cases should be verified and should end gracefully
Return values
Data accuracy
Decimal places
Help screens
Performance tests
Memory leaks
Specialized hardware
Affected documentation
Interface
Local data structures
Boundary conditions
Independent paths
Error-handling paths
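The unit-level expectations above can be sketched with a small `unittest` suite; the `safe_divide` helper is hypothetical. "Ending gracefully" here means raising a well-defined exception on the error path rather than crashing:

```python
import unittest

def safe_divide(a: float, b: float) -> float:
    """Divide a by b, raising ValueError on a zero divisor."""
    if b == 0:
        raise ValueError("divisor must be non-zero")
    return a / b

class SafeDivideTest(unittest.TestCase):
    def test_normal_path(self):
        # Normal processing path.
        self.assertEqual(safe_divide(10, 2), 5)

    def test_boundary_condition(self):
        # A non-terminating decimal still yields an accurate result.
        self.assertAlmostEqual(safe_divide(1, 3), 0.3333333, places=6)

    def test_error_handling_path(self):
        # The error case ends gracefully with a defined exception.
        with self.assertRaises(ValueError):
            safe_divide(1, 0)
```

Run the suite with `python -m unittest <filename>`; each independent path through `safe_divide` is exercised by at least one test.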
Functional Testing
Purpose
Small groups of modules that are functionally related
Expectations
Passing parameters
Functional outputs or module exit values
Error cases handled gracefully
Report formats
Help screens
Performance tests
Clean up
Memory leaks Documentation
Integration Testing
Purpose
Test all functional groups and areas
Testable requirements
Hardware/software specifications
External interfaces
Performance tests
Documentation
System Testing
Purpose
Test the entire system as a whole
Assumptions
Completed:
Unit testing
Functional testing
Integration testing
Expectations
Verification of the system
Software requirements
Business workflow perspective
Final verification of requirements and design
Interface Testing
Purpose
Interfaces with external systems
Assumptions
Completed unit, functional, and integration tests
All critical errors fixed
Both normal cases and exceptions should be tested on both sides of the interface (if both sides exchange data). The interface should be tested for handling the normal amount and flow of data as well as peak processing.
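A minimal sketch of testing both sides of an interface. The interface itself is hypothetical: side A exports an order as JSON and side B imports it; the tests cover a normal round trip and an exception case with a malformed payload:

```python
import json

def export_order(order_id: int, qty: int) -> str:
    """Side A: serialize an order for transfer across the interface."""
    return json.dumps({"order_id": order_id, "qty": qty})

def import_order(payload: str) -> dict:
    """Side B: parse and validate an incoming order payload."""
    data = json.loads(payload)
    if "order_id" not in data or "qty" not in data:
        raise ValueError("missing required field")
    return data

# Normal case: data survives a round trip across the interface.
assert import_order(export_order(42, 3)) == {"order_id": 42, "qty": 3}

# Exception case: a malformed payload is rejected, not silently accepted.
try:
    import_order('{"order_id": 42}')
    assert False, "expected ValueError"
except ValueError:
    pass
```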
Performance Testing
Purpose
Load Test
Stress Test
Volume Test
Test Data
Response time
End-to-end tests and workflows should be performed
Tracking tool for comparison of results across runs
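A crude response-time check can be sketched with the standard library alone; this is not a substitute for a load-testing tool, and the `transaction` stand-in and the latency budget are assumptions for illustration:

```python
import time

def transaction() -> None:
    # Hypothetical operation under test; replace with a real workflow step.
    sum(range(10_000))

# Minimal response-time check: run the operation repeatedly and
# compare the worst observed latency against an agreed budget.
BUDGET_SECONDS = 0.5
worst = 0.0
for _ in range(100):
    start = time.perf_counter()
    transaction()
    worst = max(worst, time.perf_counter() - start)

assert worst < BUDGET_SECONDS, f"slowest run {worst:.4f}s exceeded budget"
```

Recording `worst` (or a percentile) per build in a tracking tool gives the comparison baseline the slide refers to.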
Regression Testing
Purpose
Expectations
End-to-end
A few targeted test cases
Typical tests include:
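Targeted regression cases are often kept as a small table of input/expected pairs, each pinning down a previously fixed defect, and re-run after every change. The `normalize_phone` function and the bug numbers below are hypothetical:

```python
def normalize_phone(raw: str) -> str:
    """Strip separators from a phone number; keep a leading '+' if present."""
    prefix = "+" if raw.startswith("+") else ""
    return prefix + "".join(ch for ch in raw if ch.isdigit())

REGRESSION_SUITE = [
    # (input, expected) pairs; each row guards a previously fixed bug.
    ("020-7946-0958", "02079460958"),
    ("+44 20 7946 0958", "+442079460958"),   # bug #123: '+' was dropped
    ("", ""),                                # bug #140: crashed on empty input
]

# Re-run the whole suite after every fix to catch reintroduced defects.
for raw, expected in REGRESSION_SUITE:
    assert normalize_phone(raw) == expected, f"regression on {raw!r}"
```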
Acceptance Testing
Purpose
The purpose of acceptance testing is to verify the system from the user's perspective
Assumptions/pre-conditions
System and regression testing completed
Configuration manager
Test data
Final versions of all documents are ready Overview of the testing procedures
Exit decision
Specific procedures Acceptance criteria must be documented
Expectations
Verification from the user's perspective
Performance testing should be conducted again
Extra time
Field Testing
Purpose
The purpose of field testing is to verify the system in the actual user environment
Assumptions/pre-conditions
System and acceptance testing successfully completed
Expectations
Verification that the system works in the actual user environment
Pilot test with the final product
The pilot system should remain operable when problems occur
Thank You
Q&A