Agile Software Test Strategy

Summary

Use this section to briefly describe the document's purpose, its scope, and the overall approach to testing the software under test.


Contents

Summary

Purpose

Guiding Principles

Quality and Test Objectives

Test Scope

In Scope

Out of Scope

Test Approach

Test Types

Test Preparation

Acceptance Test Driven Development

Environments

Test Execution

Test Data Management

Defect Management

Classifying Defects

Defect Lifecycle

References


Purpose

The purpose of a Test Strategy is to create a shared understanding of the overall approach, tools, targets and timing of test activities. Use this section to describe the context and the approach to assessing the quality of the developed software.

Guiding Principles

Describe the principles underlying the testing approach. Consider the following basic principles as a starting point:

  • Shared Responsibility: everyone is responsible for testing and quality.
  • Test Automation: all types of tests (unit, integration, acceptance, regression, performance) should be automated; manual testing is used only for exploratory testing.
  • Data Management: production data must be obfuscated before being used for testing (a masking sketch follows this list).
  • Test Management: test cases, code, documents and data are treated with the same importance as production code.
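
To make the Data Management principle concrete, the sketch below masks sensitive fields while copying production data into a test copy. It is a minimal illustration only: the CSV layout and the name/email field names are assumptions, not part of this strategy, and a real obfuscation step would cover every sensitive field and system of record.

    # obfuscate_customers.py - illustrative masking of production data before test use
    import csv
    import hashlib

    def mask(value, salt="test-env"):
        """Replace a sensitive value with a stable, irreversible token."""
        return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:12]

    def obfuscate_customers(src_path, dst_path):
        with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                # Hypothetical sensitive columns: replace with stable tokens
                row["name"] = mask(row["name"])
                row["email"] = mask(row["email"]) + "@example.test"
                writer.writerow(row)

    if __name__ == "__main__":
        obfuscate_customers("customers_prod.csv", "customers_test.csv")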

Quality and Test Objectives

The following quality attributes have been identified as relevant and are used as the basis for the test approach. Each attribute is listed below with its description, its measures and targets, and its priority.

Correctness (Must Have)
Features and functions work as intended.
Measures and targets:
  • 100% completion of agreed features
  • Severity 1 defects = 0
  • Severity 2 defects = 0
  • Severity 3 defects < 5
  • Severity 4 defects < 10

Integrity (Must Have)
Ability to prevent unauthorised access, prevent information loss, protect against virus infection, and protect the privacy of data entered.
Measures and targets:
  • All access is via HTTPS over a secured connection.
  • User passwords and session tokens are encrypted.

Maintainability (Must Have)
How easy it is to add features, correct defects or release changes to the system.
Measures and targets (a sketch of a method-length check follows these attributes):
  • Code Duplication < 5%
  • Code Complexity < 8
  • Unit Test Coverage > 80%
  • Method Length < 20 lines

Availability (Should Have)
Percentage of planned up-time for which the system is required to be operational.
Measures and targets:
  • The system is available for 99.99% of scheduled up-time, measured through system logs.

Interoperability (Must Have)
Ease with which the system can exchange information with other systems.
Measures and targets: the user interface renders and functions properly on the following browser versions and later:
  • IE version = 8.0
  • Firefox version = 3.5
  • Safari version = 5.0
  • Chrome version = 11.0

Performance (Should Have)
Responsiveness of the system under a given load and the ability to scale to meet growing demand.
Measures and targets (an Apdex calculation sketch follows these attributes):
  • Apdex Score > 0.9
  • Response Time < 200 ms
  • Throughput > 100 rpm
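
Two of the targets above lend themselves to simple automated checks. The first sketch computes an Apdex score from a sample of response times using the standard formula (satisfied plus half of tolerating, divided by the total number of samples), with the 200 ms response-time target as the threshold; the sample data is illustrative only.

    # apdex.py - illustrative Apdex calculation against the 200 ms target
    def apdex(response_times_ms, threshold_ms=200):
        """Apdex = (satisfied + tolerating / 2) / total samples.

        satisfied: at or below the threshold; tolerating: at or below 4x the threshold.
        """
        if not response_times_ms:
            return None
        satisfied = sum(1 for t in response_times_ms if t <= threshold_ms)
        tolerating = sum(1 for t in response_times_ms if threshold_ms < t <= 4 * threshold_ms)
        return (satisfied + tolerating / 2) / len(response_times_ms)

    # 7 satisfied, 2 tolerating, 1 frustrated sample -> 0.8, below the 0.9 target
    print(apdex([120, 150, 90, 180, 130, 160, 110, 450, 600, 1200]))

The second sketch uses Python's ast module to flag functions that breach the 20-line method-length target. It is a minimal stand-in for a proper static analysis tool, not prescribed tooling; the files passed on the command line are whatever modules the team chooses to check.

    # method_length_check.py - illustrative check for the "Method Length < 20 lines" target
    import ast
    import sys

    MAX_LINES = 20  # a method must stay under this length

    def long_functions(path):
        source = open(path, encoding="utf-8").read()
        for node in ast.walk(ast.parse(source, filename=path)):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                length = node.end_lineno - node.lineno + 1
                if length >= MAX_LINES:
                    yield node.name, node.lineno, length

    if __name__ == "__main__":
        failed = False
        for path in sys.argv[1:]:
            for name, lineno, length in long_functions(path):
                failed = True
                print(f"{path}:{lineno}: {name} is {length} lines (limit {MAX_LINES})")
        sys.exit(1 if failed else 0)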

Test Scope

Describe the systems and features that are in scope and out of scope, under the following headings:

In Scope

Out of Scope

Test Approach

Test Types

Use this section to identify the test types being used, the tools needed to support their execution, and the timing for when each test type will be executed. Record the test tools and execution timing against each type. A standard list of test types is included below:

  • Unit: testing that verifies the implementation of software elements in isolation (a unit test sketch follows this list).
  • Integration: testing in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated.
  • Functional: testing of an integrated hardware and software system to verify that the system meets its required functionality.
  • Acceptance: testing based on acceptance criteria to enable the customer to determine whether or not to accept the system.
  • Performance: testing to confirm that the system meets its required performance targets.
  • Data Conversion: testing performed to verify the correctness of automated or manual conversions and/or loads of data in preparation for implementing the new system.
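
As an illustration of the first type in the list, the sketch below shows an automated unit test exercising a small function in isolation, assuming pytest as the unit test tool. The pricing function and test names are hypothetical examples, not part of the system under test.

    # test_pricing.py - hypothetical unit test run in isolation with pytest
    import pytest

    def apply_discount(price, percent):
        """Return the price reduced by the given percentage."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount_reduces_price():
        assert apply_discount(200.0, 10) == 180.0

    def test_apply_discount_rejects_invalid_percentage():
        with pytest.raises(ValueError):
            apply_discount(200.0, 150)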

Test Preparation

In agile projects, user stories are typically used to capture requirements. Ideally, each user story should include acceptance criteria that can be used as the basis for developing test cases. Use this section to describe the approach used to prepare and execute test cases. Below is an example based on ATDD practices.

Acceptance Test Driven Development

Step 1: Discuss
Run a discussion or workshop to create a shared understanding of the required feature and its user stories. Identify real-life scenarios and examples that have realistic context.
Roles: customer, analyst, developer, tester.

Step 2: Distill
Distill the required feature into an executable specification based upon the user stories, examples and acceptance criteria. Specifications are kept in a human-readable form using the following format (a step-definition sketch follows step 4 below):

Feature/Story: Title
In order to [value]
As a [role]
I want [feature]

Scenario Outline: Title
Given [context]
And [more context]
When [event]
Then [outcome]
And [another outcome]

Roles: analyst, developer, tester.

Step 3: Develop
Develop the required feature using automated test-first practices. Automated acceptance tests are built around the identified scenarios. Automated unit and integration tests support the implementation using the Red, Green, Refactor pattern.
Roles: developer, tester.

Step 4: Demonstrate
Demonstrate the implementation by running the acceptance tests and performing manual exploratory tests.
Roles: customer, analyst, tester.
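
As an illustration of how the distilled format in step 2 becomes executable, the sketch below shows matching step definitions, assuming a Python toolchain with the behave library. The password-reset feature, step wording and file paths are hypothetical examples, not requirements of this strategy.

    # features/password_reset.feature (hypothetical):
    #   Feature: Password reset
    #     In order to regain access to my account
    #     As a registered user
    #     I want to reset my password
    #
    #     Scenario: Request a reset link
    #       Given a registered user with the email dana@example.test
    #       When the user requests a password reset for dana@example.test
    #       Then a reset link is issued for dana@example.test

    # features/steps/password_reset_steps.py - step definitions bound to the text above
    from behave import given, when, then

    @given("a registered user with the email {email}")
    def given_registered_user(context, email):
        context.users = {email: {"reset_requested": False}}

    @when("the user requests a password reset for {email}")
    def when_request_reset(context, email):
        context.users[email]["reset_requested"] = True

    @then("a reset link is issued for {email}")
    def then_reset_issued(context, email):
        assert context.users[email]["reset_requested"] is True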

Environments

This section is used to describe the environments used and the test types performed in each. The process for migrating/promoting versions of the system between environments should also be included.

Development
This environment is local and specific to each developer/tester machine. It is based on the version/branch of source code being developed. Integration points are typically stubbed or impersonated.
Data setup: data and configuration are populated through setup scripts (an illustrative seeding script follows these environment descriptions).
Test usage: unit, functional and acceptance tests.

Integration
This environment supports continuous integration of code changes and execution of unit, functional and acceptance tests. Static code analysis is also completed in this environment.
Data setup: data and configuration are populated through setup scripts.
Test usage: unit, functional and acceptance tests; static code analysis.

Staging
This environment supports exploratory testing.
Data setup: populated with obfuscated production data.
Test usage: exploratory testing.

Production
Live environment.
Data setup: new instances will contain standard project reference data; existing instances will have current data migrated into the environment.
Test usage: production verification testing.
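
As an illustration of the setup scripts mentioned for the development and integration environments, the sketch below seeds a small piece of reference data. SQLite, the table name and the country rows are assumptions chosen to keep the example self-contained; the real scripts would target the project's actual datastore and reference data.

    # seed_reference_data.py - illustrative environment data setup script
    import sqlite3

    REFERENCE_COUNTRIES = [
        ("AU", "Australia"),
        ("NZ", "New Zealand"),
        ("GB", "United Kingdom"),
    ]

    def seed(db_path="test.db"):
        """Create the reference table if needed and load the standard rows."""
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS countries (code TEXT PRIMARY KEY, name TEXT)"
            )
            conn.executemany(
                "INSERT OR REPLACE INTO countries (code, name) VALUES (?, ?)",
                REFERENCE_COUNTRIES,
            )

    if __name__ == "__main__":
        seed()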

Test Execution

Use this section to describe the steps for executing tests in preparation for a deployment, release or upgrade of the software, listing the key execution steps.

Test Data Management

Use this section to describe the approach for identifying and managing test data, along with any guidelines to be followed.

Defect Management

Ideally, defects are raised and recorded only when they are not going to be fixed immediately. In that case, the conditions under which the defect occurs and its severity need to be recorded accurately so that the defect can be easily reproduced and then fixed. Use this section to describe the approach to defect management.

Classifying Defects

  • Severity 1 (Critical): the defect causes a critical loss of business functionality, or a complete loss of service has occurred.
  • Severity 2 (Major): the defect causes a major impact to business functionality and no interim workaround is available.
  • Severity 3 (Minor): the defect causes a minor impact to business functionality and an interim workaround is available.
  • Severity 4 (Trivial): the defect is cosmetic only and usability is not impacted.
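
The severity classification above feeds directly into the Correctness targets in the Quality and Test Objectives section (severity 1 and 2 defects = 0, severity 3 < 5, severity 4 < 10). The sketch below turns those targets into a simple release-gate check; the counts are assumed to come from the defect tracking system, and the function name is illustrative.

    # release_gate.py - illustrative check of open defect counts against the targets
    SEVERITY_LIMITS = {1: 0, 2: 0, 3: 4, 4: 9}  # maximum open defects allowed per severity

    def release_gate(open_defects):
        """Return True when every severity count is within its allowed limit."""
        return all(open_defects.get(sev, 0) <= limit for sev, limit in SEVERITY_LIMITS.items())

    print(release_gate({1: 0, 2: 0, 3: 2, 4: 7}))  # True: within all targets
    print(release_gate({1: 1, 2: 0, 3: 0, 4: 0}))  # False: a critical defect is open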

Defect Lifecycle

Use this section to describe how defects are to be recorded, prioritised, resolved and closed. An example process is outlined below:

  1. Identify Defect - Ensure the defect can be reproduced and raise it in the defect tracking system.
  2. Prioritise Defect - Based on severity, the defect is prioritised in the team backlog.
  3. Analyse Defect - Analyse the acceptance criteria and implementation details.
  4. Resolve Defect - Implement changes and/or remediate failing tests.
  5. Verify Resolution - Execute tests to verify the defect is resolved and nothing has regressed.
  6. Close Defect - Independently execute the tests again to confirm the defect is resolved with no regression, then close it.
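
The six steps above can also be read as a simple state model. The sketch below records which transitions are allowed; the state names map one-to-one to the steps, and the reopen path from verification back to analysis is an assumption for illustration, not a field layout mandated by any particular defect tracking tool.

    # defect_lifecycle.py - illustrative state model of the lifecycle above
    ALLOWED_TRANSITIONS = {
        "identified": {"prioritised"},
        "prioritised": {"in_analysis"},
        "in_analysis": {"resolved"},
        "resolved": {"verified"},
        "verified": {"closed", "in_analysis"},  # assumed reopen path if verification fails
        "closed": set(),
    }

    def can_transition(current, target):
        """Return True when the lifecycle allows moving from current to target."""
        return target in ALLOWED_TRANSITIONS.get(current, set())

    assert can_transition("resolved", "verified")
    assert not can_transition("identified", "closed")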

References

General Agile Testing

  1. http://en.wikipedia.org/wiki/Agile_testing
  2. http://testobsessed.com/wp-content/uploads/2011/04/AgileTestingOverview.pdf
  3. http://www.io.com/~wazmo/papers/agile_testing_20021015.pdf

Requirements as Stories

  1. http://dannorth.net/whats-in-a-story/
  2. http://www.agileforall.com/2009/05/14/new-to-agile-invest-in-good-user-stories/

Test Execution

  1. http://en.wikipedia.org/wiki/Continuous_integration
  2. http://www.satisfice.com/articles/et-article.pdf

Ruby and Rails Testing

  1. http://www.bootspring.com/2010/08/09/current-state-of-rails-testing/
  2. http://www.ibm.com/developerworks/web/library/wa-rails4/
  3. http://lostechies.com/jimmybogard/2010/08/25/an-effective-testing-strategy/