Errors

Error: A human action that produces an incorrect result.

Fault: A manifestation of an error in software. A fault, if encountered, may cause a failure.

Failure: Deviation of the software from its expected delivery or service.
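
A minimal sketch of how the three terms relate, using a hypothetical discount function (the function name and figures are invented for illustration):

    def discounted_price(price: int) -> int:
        # The programmer's mistake (error) was adding the 10% instead of
        # subtracting it; the wrong '+' is the resulting fault in the code.
        return price + price // 10   # fault: should be price - price // 10

    # The fault causes a failure only when the code runs and its output
    # deviates from the expected result.
    print(discounted_price(100))     # failure: prints 110, expected 90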

 General Testing Principles

Testing shows presence of defects

Exhaustive testing is impossible

Early Testing

Defect Clustering

Pesticide paradox

Testing is context dependent

Absence of errors fallacy

 Fundamental Test Process

Test Planning & Control

Test Analysis & design

Test Implementation & Execution

Evaluating Exit Criteria & Reporting

Test Closure Activities

 

A Developer's attributes:

Highly valued within the company

Industry standard qualifications

Seen as being creative

Poor communicators

Skills in a very specific area

 

A Tester's attributes:

Rarely valued within a company

No industry standard qualifications

Seen as being destructive

Very good communicators

Multi-talented

 

 

 

When a fault is found:

Stage: Prior to testing
Additional work: Faults found at this stage are generally documentation-based. Faults in test specifications and design documents can largely be eliminated by an effective review process.
Potential effect: If the faults are not found at this stage, development would go ahead regardless, which could in itself create additional faults.

Stage: Just prior to the product's release
Additional work: A fault found at this stage would probably be a software fault. There could be many reasons for this, but effective testing leading up to this stage should prevent it from occurring.
Potential effect: When the fault is eventually found, software re-work, re-design and additional testing would be required, wasting a considerable amount of time and possibly delaying the product's release.

Stage: Found by a customer
Additional work: If the customer finds the fault, additional manpower would be required to resolve the problem, involving further development work and probably resulting in a 'patch' being created.
Potential effect: If the fault occurs after the product has been released, the potential cost of the fault could be devastating.

 

 

Software Development Models

VVT

Waterfall

V-Model

Spiral/RAD

 

 

V & V - Verification and Validation.

 

Verification: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled [BS7925-1]

 

Validation: confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use have been fulfilled [BS7925-1]

 

 

Test Levels

Component Testing

Integration Testing

Systems Testing

Acceptance Testing

 

 

Test Target Types

Functional Testing

Non-Functional Testing

Structural Testing

 

 

 

Testing Related to Change

 

Re-test

Whenever a fault is detected and fixed, the software should be re-tested to ensure that the original fault has been successfully removed.

 

Regression Test

Regression testing attempts to verify that modifications have not caused unintended adverse side effects in the unchanged software (regression faults) and that the modified system still meets its requirements.
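
A short sketch of how the two relate in practice, assuming a pytest-style suite and a hypothetical parse_day function whose fault (mishandling single-digit days) has just been fixed:

    def parse_day(date_str: str) -> int:
        # Hypothetical fixed function: the day is the first '/'-separated part.
        return int(date_str.split("/")[0])

    def test_retest_original_fault():
        # Re-test: the input that originally exposed the fault now passes.
        assert parse_day("1/3/2024") == 1

    def test_regression_unchanged_behaviour():
        # Regression tests: previously working behaviour still meets
        # its requirements after the fix.
        assert parse_day("15/3/2024") == 15
        assert parse_day("31/12/2024") == 31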

 

 

Maintenance Testing

Maintenance Testing involves testing updates to existing software.

 

 

Review Participants

Moderator

Author

Reader

Recorder

Inspector

 

 

Review Process

Entry Criteria

Planning

Preparation

Meeting

Results

Follow-up

Exit Criteria

 

 

Review Types

Walkthrough

Inspection

Technical review

Informal review

 

 

Static Analysis can detect:

Unreachable code

Uncalled functions

Undeclared variables

Parameter type mismatches

Possible array bound violations
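
A sketch in Python of the five fault classes listed above; static analysis tools such as pylint, mypy or vulture can flag most of these without ever running the program (array bound checking depends on the tool):

    def total(values: list[int]) -> int:
        return sum(values)
        print("done")                # unreachable code: follows the return

    def never_used() -> None:        # uncalled function: no caller anywhere
        pass

    def report() -> None:
        print(totl)                  # undeclared variable: typo of 'total'

    def mismatched() -> int:
        return total("12")           # parameter type mismatch: str passed,
                                     # list[int] expected

    def out_of_bounds() -> int:
        return [1, 2, 3][5]          # possible array bound violation:
                                     # index 5 of a 3-element list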

 

 

Specification-based Techniques

Equivalence Partitioning

Boundary Value Analysis

Decision Table Testing

Use Case Testing

State Transition Testing
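
A brief sketch of the first two techniques, assuming a hypothetical rule that ages 18 to 65 inclusive are valid: equivalence partitioning picks one representative value per partition, while boundary value analysis tests at and around each boundary.

    def is_valid_age(age: int) -> bool:
        # Hypothetical requirement: ages 18 to 65 inclusive are valid.
        return 18 <= age <= 65

    # Equivalence partitioning: one representative value per partition.
    assert not is_valid_age(10)   # partition: below the valid range
    assert is_valid_age(40)       # partition: within the valid range
    assert not is_valid_age(70)   # partition: above the valid range

    # Boundary value analysis: test at and either side of each boundary.
    for age, expected in [(17, False), (18, True), (19, True),
                          (64, True), (65, True), (66, False)]:
        assert is_valid_age(age) == expected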

 

 

 

 

White-box Testing (structural)

Statement Testing

Branch/Decision Testing

Data Flow Testing

Branch Condition Testing

Modified Condition/Decision Testing

Branch Condition Combination Testing

Linear Code Sequence and Jump (LCSAJ) Testing
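
A small sketch of the difference between the first two techniques, using a hypothetical clamping function: a single test can execute every statement, but branch (decision) coverage also requires a test where the condition is false.

    def clamp_positive(x: int) -> int:
        result = x
        if x < 0:
            result = 0
        return result

    # Statement coverage: this one test executes every statement
    # (the body of the 'if' runs), yet the False branch stays untested.
    assert clamp_positive(-5) == 0

    # Branch/decision coverage additionally needs the condition to be False.
    assert clamp_positive(7) == 7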

 

 

Choosing Test Techniques

 

If we are testing something new, then the following list contains points to consider when choosing a technique:

 

Are there any regulatory standards involved?

Is there a level of risk involved?

What is the test objective?

What documentation is available to us?

What is the Tester's level of knowledge?

How much time is available?

Do we have any previous experience testing a similar product?

Are there any customer requirements involved?

 

 

Test Team Members

Client

Project Manager

User

Business Analyst

Systems Analyst

Technical Designer

Developer

Test Leader

Tester

 

 

The 16 clauses of the IEEE 829-1998 Test Plan:
1. Test Plan identifier.
2. Introduction.
3. Test items.
4. Features to be tested.
5. Features not to be tested.
6. Approach.
7. Item pass/fail criteria.
8. Suspension criteria and resumption requirements.
9. Test deliverables.
10. Testing tasks.
11. Environmental needs.
12. Responsibilities.
13. Staffing and training needs.
14. Schedule.
15. Risks and contingencies.
16. Approvals.

 

 

 

 

Exit Criteria

 

  Ensuring sufficient coverage of code

  Ensuring sufficient coverage of functionality

  Testing all of the high risk areas

  The reliability of the product

  The amount and severity of acceptable faults

  The deadline for completing testing

 

 

 

Test Approaches

 

  • Model-based approach
  • Dynamic approach
  • Methodical approach
  • Consultative approach
  • Analytical approach

 

 

 

Test Progress Monitoring & Control

 

  • Percentage of work done in test preparation
  • Number of test cases run
  • Number of test cases failed
  • Number of test cases passed
  • Number of test cases remaining
  • Testers' confidence in the product
  • Time taken so far
  • Testing milestones/deadlines
  • Number of faults found
  • Severity of faults found
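
Several of these measures fall out of a simple tally; a minimal sketch with invented figures:

    # Invented figures for illustration only.
    planned, run, passed, failed = 200, 150, 135, 15

    executed_pct = run / planned * 100    # proportion of planned tests run
    pass_rate = passed / run * 100        # proportion of run tests passing
    remaining = planned - run             # test cases still to execute

    print(f"executed {executed_pct:.0f}%, pass rate {pass_rate:.0f}%, "
          f"{remaining} test cases remaining")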

 

 

 

Configuration Management

 

  • Configuration Identification
  • Configuration Control
  • Status Accounting
  • Configuration Auditing

 

 

 

Prioritization

 

  Probability of a failure

  Severity of possible failure

  Customer requirement

  Testing simplicity

  Feature history

  Cost of testing
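
One common way to combine the first two factors is a simple risk score (probability multiplied by severity); a sketch with hypothetical feature data, both factors on a 1 to 5 scale:

    # Hypothetical feature data: (name, failure probability, severity).
    features = [
        ("login", 4, 5),
        ("report export", 2, 3),
        ("payment", 3, 5),
    ]

    # Test the highest-risk features first.
    for name, prob, sev in sorted(features, key=lambda f: f[1] * f[2],
                                  reverse=True):
        print(f"{name}: risk score {prob * sev}")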

 

 

 

 

 

 

Incident Report

Software ID

Tester's name

Severity

Scope

Priority

Steps to reproduce
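
The fields above map naturally onto a record structure; a minimal sketch (field names and example values are assumptions):

    from dataclasses import dataclass, field

    @dataclass
    class IncidentReport:
        software_id: str
        tester_name: str
        severity: str                  # impact of the fault, e.g. "major"
        scope: str                     # how widely the fault is felt
        priority: str                  # urgency of the fix
        steps_to_reproduce: list[str] = field(default_factory=list)

    report = IncidentReport(
        software_id="INV-APP-2.3",
        tester_name="J. Smith",
        severity="major",
        scope="all invoice screens",
        priority="high",
        steps_to_reproduce=["Open an invoice", "Click 'Print'", "Observe crash"],
    )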

 

 

 

Tool Support for Management of Testing and Tests

Test Management Tools:

Requirements Management Tools:

Incident Management Tools:

Configuration Management Tools:

 

 

Tool Support for Static Testing

Review Process Support Tools:

Static Analysis Tools:

Modeling Tools:

 

 

Tool Support for Test Specification

Test Design Tools:

Test Data Preparation Tools:

 

 

Tool Support for Test Execution and Logging

Test Harnesses and Drivers:

Test Execution Tools:

Test Comparators:

Security Tools:

Coverage Measurement:

 

Tool Support for Performance and Monitoring

Dynamic Analysis Tools:

Performance Test Tools:

Monitoring Tools:

Tool Support for Specific Application Areas:

 

 

Introducing a Tool into an Organization

 

1) Create a list of potential tools that may be suitable

2) Arrange for a demonstration or free trial

3) Test the product using a typical scenario (pilot project)

4) Organise a review of the tool

Exhaustive Testing - Testing everything: all combinations of inputs and preconditions.

 

 Successful Test - A successful test is one that finds a fault.
