Error: A human action that produces an incorrect result.

Fault: A manifestation of an error in software. A fault, if encountered, may cause a failure.

Failure: Deviation of the software from its expected delivery or service.
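The error → fault → failure chain can be illustrated with a small sketch. The function below is an assumed example (not from any real codebase): the programmer's mistake of writing `or` instead of `and` is the error, the resulting defect in the code is the fault, and the wrong output observed for certain inputs is the failure.

```python
# Hedged sketch: the error -> fault -> failure chain, using an
# assumed example function.

def is_teen_faulty(age):
    # Fault: the condition should be 'age >= 13 and age <= 19'.
    # The programmer's slip (or instead of and) was the error.
    return age >= 13 or age <= 19

# The fault does not always cause a failure -- some inputs still
# give the right answer by coincidence:
assert is_teen_faulty(15) is True   # correct result

# ...but for other inputs the deviation from expected behaviour
# (the failure) becomes observable: 50 is not a teen, yet:
assert is_teen_faulty(50) is True   # expected False -- a failure
```

Note that the fault exists in the code whether or not it is ever encountered; the failure only appears when an input exercises it.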

 General Testing Principles

Testing shows presence of defects

Exhaustive testing is impossible

Early Testing

Defect Clustering

Pesticide paradox

Testing is context dependent

Absence of errors fallacy

 Fundamental Test Process

Test Planning & Control

Test Analysis & Design

Test Implementation & Execution

Evaluating Exit Criteria & Reporting

Test Closure Activities


A Developer’s attributes:

Highly valued within the company

Industry standard qualifications

Seen as being creative

Poor communicators

Skills in a very specific area


 A Tester’s attributes:

Rarely valued within a company

No industry standard qualifications

Seen as being destructive

Very good communicators





 When a fault is found:


When fault is found: Prior to testing

Additional work: Faults found at this stage are generally documentation-based. Faults in test specifications and design documents can largely be eliminated by an effective review process.

Potential effect: If the faults are not found at this stage, development goes ahead on a flawed basis, which can itself introduce additional faults.


When fault is found: Just prior to the product's release

Additional work: A fault found at this stage is most likely a software fault. There can be many reasons for this, but effective testing leading up to this stage should prevent it from occurring.

Potential effect: When the fault is eventually found, software re-work, re-design and additional testing are required, wasting a considerable amount of time and possibly delaying the product's release.


When fault is found: Found by a customer

Additional work: If the customer finds the fault, additional manpower is required to resolve the problem, involving further development work and probably resulting in a 'patch' being created.

Potential effect: If the fault occurs after the product has been released, the potential cost of the fault can be devastating.



Software Development Models







V & V - Verification and Validation.


Verification: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled [BS7925-1]


Validation: confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use have been fulfilled [BS7925-1]



Test Levels

Component Testing

Integration Testing

Systems Testing

Acceptance Testing



Test Target Types

Functional Testing

Non-Functional Testing

Structural Testing




Testing Related to Change



Confirmation Testing (Re-testing)

“Whenever a fault is detected and fixed then the software should be re-tested to ensure that the original fault has been successfully removed.”


Regression Test

“Regression testing attempts to verify that modifications have not caused unintended adverse side effects in the unchanged software (regression faults) and that the modified system still meets its requirements.”
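The distinction between the two activities can be sketched in a few lines. The `slugify` helper below and its reported fault (runs of whitespace not collapsed) are assumed purely for illustration: after the fix, the previously failing case is re-run (confirmation testing) alongside the cases that already worked (regression testing).

```python
# Hedged sketch: confirmation vs regression testing around a bug fix
# in an assumed 'slugify' helper.

def slugify(title):
    # Fixed version: str.split() with no argument collapses runs of
    # whitespace, which was the reported fault in the old version.
    return "-".join(title.lower().split())

# Confirmation test: the originally reported fault is gone.
assert slugify("Hello   World") == "hello-world"

# Regression tests: behaviour that worked before still works.
assert slugify("Testing") == "testing"
assert slugify("Foundation Level") == "foundation-level"
```

In practice the regression cases would be the existing automated suite, re-run in full or in a risk-based selection after every change.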



Maintenance Testing

Maintenance Testing involves testing updates to existing software.



Review Participants








Review Process

Entry Criteria






Exit Criteria



Review Types



Technical review

Informal review



Static Analysis can detect:

Unreachable code

Uncalled functions

Undeclared variables

Parameter type mismatches

Possible array bound violations
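One of the fault types above, unreachable code, can be detected with a very small amount of tooling. The sketch below is an assumed toy checker (real static analysis tools are far more thorough) that uses Python's standard `ast` module to flag any statement that appears after a `return` in the same block.

```python
# Hedged sketch: a toy static check that flags statements appearing
# after a 'return' in the same block -- one form of unreachable code.
import ast

SOURCE = """
def f(x):
    return x * 2
    print("never runs")
"""

def unreachable_lines(source):
    """Return line numbers of statements that follow a return."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        body = getattr(node, "body", None)
        if not isinstance(body, list):
            continue
        # Pair each statement with the one that follows it.
        for stmt, following in zip(body, body[1:]):
            if isinstance(stmt, ast.Return):
                findings.append(following.lineno)
    return findings

print(unreachable_lines(SOURCE))  # -> [4]: the print() is unreachable
```

The key property is that no test case is executed: the fault is found by examining the code's structure alone, which is what distinguishes static analysis from dynamic testing.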



Specification-based Techniques

Equivalence Partitioning

Boundary Value Analysis

Decision Table Testing

Use Case Testing

State Transition Testing
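The first two techniques in the list can be shown concretely. The age-based ticket rule below is assumed purely for illustration: equivalence partitioning picks one representative value from each partition, while boundary value analysis adds the values on and adjacent to each partition boundary, where faults tend to cluster.

```python
# Hedged sketch: equivalence partitioning and boundary value analysis
# applied to an assumed ticket-classification rule:
#   age < 0 invalid, 0-15 child, 16-64 adult, 65+ senior.

def ticket_type(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    if age <= 15:
        return "child"
    if age <= 64:
        return "adult"
    return "senior"

# Equivalence partitioning: one representative value per partition.
assert ticket_type(8) == "child"
assert ticket_type(30) == "adult"
assert ticket_type(70) == "senior"

# Boundary value analysis: test on and either side of each boundary,
# since off-by-one faults live at the edges of partitions.
assert ticket_type(0) == "child"
assert ticket_type(15) == "child"
assert ticket_type(16) == "adult"
assert ticket_type(64) == "adult"
assert ticket_type(65) == "senior"
```

Both techniques derive their test cases from the specification alone, without looking at the implementation, which is why they are grouped as specification-based (black-box) techniques.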





White-box Testing (structural)

Statement Testing

Branch/Decision Testing

Dataflow Testing

Branch Condition Testing

Modified Condition/Decision Testing

Branch Condition Combination Testing

Linear Code Sequence & Jump (LCSAJ)
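The difference between the first two coverage levels in the list is worth seeing in code. The toy function below is assumed for illustration: one test can execute every statement, yet still leave a branch untested.

```python
# Hedged sketch: 100% statement coverage does not imply 100% branch
# coverage, shown on an assumed toy function.

def safe_divide(a, b):
    result = 0
    if b != 0:
        result = a / b
    return result

# This single test executes every statement (the 'if' body runs),
# achieving full statement coverage...
assert safe_divide(10, 2) == 5

# ...but the False outcome of 'if b != 0' was never exercised.
# Branch/decision coverage additionally requires a test where b == 0:
assert safe_divide(10, 0) == 0
```

The stronger techniques in the list (branch condition, MC/DC, branch condition combination) tighten this further by requiring individual conditions within a decision, and combinations of them, to be exercised.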



Choosing Test Techniques


If we are testing something new, then the following list contains points to consider when choosing a technique:


Are there any regulatory standards involved?

Is there a level of risk involved?

What is the test objective?

What documentation is available to us?

What is the Tester's level of knowledge?

How much time is available?

Do we have any previous experience testing a similar product?

Are there any customer requirements involved?



Test Team Members


Project Manager


Business Analyst

Systems Analyst

Technical Designer


Test Leader




The 16 clauses of the IEEE 829-1998 Test Plan:
1. Test Plan identifier.
2. Introduction.
3. Test items.
4. Features to be tested.
5. Features not to be tested.
6. Approach.
7. Item pass/fail criteria.
8. Suspension criteria and resumption requirements.
9. Test deliverables.
10. Testing tasks.
11. Environmental needs.
12. Responsibilities.
13. Staffing and training needs.
14. Schedule.
15. Risks and contingencies.
16. Approvals.





Exit Criteria


  • Ensuring sufficient coverage of code

  • Ensuring sufficient coverage of functionality

  • Testing all of the high-risk areas

  • The reliability of the product

  • The number and severity of acceptable faults

  • The deadline for completing testing




Test Approaches


  • Model-based approach
  • Dynamic approach
  • Methodical approach
  • Consultative approach
  • Analytical approach




Test Progress Monitoring & Control


  • Percentage of work done in test preparation
  • Number of test cases run
  • Number of test cases failed
  • Number of test cases passed
  • Number of test cases remaining
  • Tester's confidence in the product
  • Time taken so far
  • Testing milestones/deadlines
  • Number of faults found
  • Severity of faults found




Configuration Management


  • Configuration Identification
  • Configuration Control
  • Status Accounting
  • Configuration Auditing






Risk Prioritisation Factors

  • Probability of a failure

  • Severity of possible failure

  • Customer requirement

  • Simplicity of testing

  • Feature history

  • Cost of testing







Incident Report

Software ID

Tester’s name




Steps to reproduce




Tool Support for Management of Testing and Tests

Test Management Tools:

Requirements Management Tools:

Incident Management Tools:

Configuration Management Tools:



Tool Support for Static Testing

Review Process Support Tools:

Static Analysis Tools:

Modeling Tools:



Tool Support for Test Specification

Test Design Tools:

Test Data Preparation Tools:



Tool Support for Test Execution and Logging

Test Harnesses and Drivers:

Test Execution Tools:

Test Comparators:

Security Tools:

Coverage Measurement:


Tool Support for Performance and Monitoring

Dynamic Analysis Tools:

Performance Test Tools:

Monitoring Tools:

Tool Support for Specific Application Areas:



Introducing a Tool into an Organization


1) Create a list of potential tools that may be suitable

2) Arrange for a demonstration or free trial

3) Test the product using a typical scenario (pilot project)

4) Organise a review of the tool

 Exhaustive Testing - Testing every possible combination of inputs and preconditions; impossible in practice for all but trivial cases.


 Successful Test - A successful test is one that finds a fault.