Importance of Testing From Developer's Point of View: Part II - The Process Flow

Before reading this article, I highly recommend reading the previous parts of this series.

Testing Analysis

Test process analysis ensures that an overall test process and strategy are in place and are modified, if necessary, to allow the successful introduction of automated testing. Test goals, objectives, and strategies must be defined, and the test process must be documented and communicated to the test team. Test process analysis is the fourth phase of the Automated Testing Life-cycle Methodology (ATLM).

Testing Review

Test engineers need to analyze the existing development and test processes. During this analysis, the test engineer should determine whether the current testing process meets the following prerequisites:
  • Testing goals and objectives have been defined.
  • Testing strategies have been defined.
  • A testing methodology has been defined.
  • The testing process has been implemented.
  • The testing process is being measured.
  • Testing is involved from the beginning of the SDLC.
  • Testing is conducted in parallel with the SDLC.
  • The schedule allows for process implementation.
  • The budget allows for process implementation.

Test Hierarchy

The hierarchy of the testing process consists of the following actions:

  • Plan and design tests
  • Define and create test data
  • Capture and store scripts
  • Automate test execution
  • Analyze results
  • Track defects
  • Emulate multi-user load
  • Make the application production-ready
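
Several of the steps above (defining test data, automating execution, and analyzing results) can be sketched with the standard library's `unittest` module. The `add` function here is a hypothetical unit under test, and the data values are illustrative:

```python
import unittest

def add(a, b):
    """Hypothetical unit under test (illustrative only)."""
    return a + b

class AddTests(unittest.TestCase):
    # Define and create test data: (input_a, input_b, expected) triples.
    test_data = [(1, 2, 3), (0, 0, 0), (-1, 1, 0)]

    def test_add(self):
        # Automate test execution over the prepared data.
        for a, b, expected in self.test_data:
            with self.subTest(a=a, b=b):
                self.assertEqual(add(a, b), expected)

# Analyze results: run the suite and inspect the outcome programmatically.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(AddTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Running the suite through a `TextTestRunner` rather than `unittest.main()` keeps the result object available, so pass/fail counts can feed defect tracking or reporting.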

Test Plan

Why Test Plan

A test plan prescribes the scope, approach, resources, and schedule of the testing activities. It identifies the items to be tested, the features to be tested, the testing tasks to be done, the personnel responsible for each task, and the risks associated with the plan. A test plan should have the following structure.

Test Plan Identifier

Specify the unique identifier assigned to this test plan. The identifier should follow the document naming conventions and is used to cross-reference this document with other documents.

Introduction

Summarize the software items and software features to be tested.

Test Items

Identify the test items, including their version/revision level. Specify the characteristics of their transmittal media that impact hardware requirements or indicate the need for logical or physical transformations before testing can begin, and supply the following item documentation if it exists:

  • Requirements Specification
  • Design Specification
  • User's Guide
  • Operations Guide
  • Installation Guide

Features to be tested

Identify all software features and combinations of software features to be tested.

Features not to be tested

Identify all the software features that will not be tested.

Test Plan Approach

Describe the overall approach to testing. For each major group of features or combinations of features, specify the approach that will ensure these feature groups are adequately tested.

The test approach should be described in sufficient detail to permit identification of the major testing tasks and estimation of the time required for each. Specify the minimum degree of comprehensiveness desired, and identify significant constraints on testing such as test-item availability, testing-resource availability, and deadlines.

Test Plan Objective

Item Pass/Fail Criteria

Specify the criteria to be used to determine whether each test item has passed or failed testing.
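
Pass/fail criteria are most useful when they are machine-checkable rather than left to judgment. The sketch below encodes two illustrative criteria (the field names and the thresholds are assumptions, not fixed rules):

```python
# A sketch of machine-checkable pass/fail criteria for a test item.
# The names and thresholds here are illustrative assumptions.

def item_passes(results):
    """results: dict with 'failed_cases', 'total_cases', 'critical_defects'."""
    failure_rate = results["failed_cases"] / results["total_cases"]
    # Example criterion: no open critical defects and under 5% case failures.
    return results["critical_defects"] == 0 and failure_rate < 0.05

verdict = item_passes(
    {"failed_cases": 1, "total_cases": 100, "critical_defects": 0}
)
```

Expressing the criteria as a function means the same rule can be applied consistently across builds and recorded in the test summary report.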

Suspension criteria and Resumption criteria

Specify the criteria used to suspend all or a portion of the testing activity on the test items associated with this plan. Specify the testing activity that must be repeated when testing is resumed.

Test Plan Deliverable

Identify the deliverable documents. The following documents should be included:

  • Test Plan
  • Test Design
  • Test Case
  • Test Procedure
  • Test Item transmittal reports
  • Test logs
  • Test Incident reports
  • Test summary report
  • Test input & output data

Testing Tasks

Identify the set of tasks necessary to prepare for and perform the testing. Identify all inter-task dependencies and any special skills required.

Environment Needs

Specify both the necessary and the desired properties of the test environment, covering both hardware and software. Identify any special tools needed, and identify the source for any needs that are not currently available to the test group.
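
Environment needs can likewise be written down as checkable requirements so a run fails fast when the environment is wrong. A minimal sketch, assuming a hypothetical minimum Python version as the only requirement:

```python
import sys

# Sketch: encode minimum environment needs as machine-checkable requirements.
# The specific requirement below is an illustrative assumption.
requirements = {
    "min_python": (3, 8),
}

def environment_ok():
    """Return True when the current interpreter meets the stated needs."""
    return sys.version_info[:2] >= requirements["min_python"]

env_ready = environment_ok()
```

Real suites would extend the `requirements` mapping with hardware, operating system, and tooling checks drawn from this section of the plan.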

Test Plan Responsibilities

Identify the groups responsible for managing, designing, preparing, executing, witnessing, checking, and resolving. These can include the developers, testers, operations staff, user representatives, technical support staff, data administration staff, and quality support staff.

Staffing

Specify the test staffing needs by skill level, and identify training options for providing the necessary skills.

Schedule

Include the necessary test milestones, estimate the time required for each task, and specify the schedule for each testing task and milestone.

Test Plan Risks and Contingencies

Identify the high-risk assumptions of the test plan, and specify a contingency plan for each.

Test Plan Approvals

Specify the names and titles of all persons who must approve this plan, and provide space for signatures and dates.

Test Design

A test design specifies the refinements of the test approach and identifies the features to be tested by the design and its associated tests. A test design should cover the following structure.

Test Design Specification Identifier

Specify the unique identifier assigned to this design specification.

Features to be tested

Identify the test items and describe the features and combinations of features that are the object of this design specification.

Approach Refinements

Specify refinements to the approach described in the test plan, include the test techniques to be used, and identify the method of analyzing test results.

Test Identification

List the identifier and a brief description of each test case associated with the test design.

Feature Pass/Fail criteria

Specify the criteria to be used to determine whether the feature or feature combination has passed or failed.

Test Case

A test case specification defines a test case identified by a test design specification. A test case specification should have the following structure.

Test Case Specification Identifier

Specify the unique identifier assigned to this test case specification.

Test Items

Identify and briefly describe the items and features to be exercised by this test case. For each item, consider the following documents:

  • Requirement specification
  • Design specification
  • User's guide
  • Operations guide
  • Installation guide

Input Specifications

Specify each input required to execute the test case, and identify all the appropriate databases, files, terminal messages, memory-resident areas, and values passed by the operating system. Also specify all the required relationships among inputs.

Output Specifications

Specify all the outputs and features required of the test items, and provide the exact value for each required output or feature.
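
One common way to record input and output specifications together is a data-driven table pairing each input with its exact required output. The `normalize` function below is a hypothetical unit under test, and the rows are illustrative:

```python
# Data-driven sketch: each row pairs an input specification with the
# exact expected output, for a hypothetical string-normalizing unit.

def normalize(text):
    """Illustrative unit under test: trims and lower-cases a string."""
    return text.strip().lower()

# Input specification -> exact required output, as this section prescribes.
io_table = [
    ("  Hello ", "hello"),
    ("WORLD", "world"),
    ("", ""),
]

actual = [(given, normalize(given)) for given, _ in io_table]
all_match = all(normalize(given) == expected for given, expected in io_table)
```

Keeping inputs and exact outputs side by side makes the test case reviewable on paper before any execution, which is the point of writing the specification.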

Environmental needs

    Hardware

    Specify the characteristics and configurations of the hardware required to execute this test case.

    Software

    Specify the system and application software required to execute this test case. This may include system software such as operating systems, compilers, simulators and test tools.

Special procedural requirements

Describe any special constraints on the test procedures that execute the test case. These constraints may involve special setup, operator intervention, output-determination procedures, and special wrap-up.

Intercase Dependencies

List the identifiers of the test cases that must be executed prior to this test case. Summarize the nature of the dependencies.
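
These dependency lists can be made executable: if each case records the identifiers of its prerequisites, a small scheduler can order the run so prerequisites always come first. A minimal sketch with illustrative case identifiers:

```python
# Sketch of intercase dependencies: each test case lists the identifiers
# of the cases that must execute before it, and a tiny scheduler derives
# a valid execution order. Case identifiers are illustrative.

cases = {
    "TC-003": {"depends_on": ["TC-001", "TC-002"]},
    "TC-002": {"depends_on": ["TC-001"]},
    "TC-001": {"depends_on": []},
}

def execution_order(cases):
    """Topologically sort the cases so prerequisites always run first."""
    order, done = [], set()

    def visit(case_id):
        if case_id in done:
            return
        for dep in cases[case_id]["depends_on"]:
            visit(dep)  # schedule every prerequisite before this case
        done.add(case_id)
        order.append(case_id)

    for case_id in sorted(cases):
        visit(case_id)
    return order

order = execution_order(cases)
```

Recording dependencies as data, rather than relying on the order cases happen to appear in a suite, keeps the summary of "the nature of the dependencies" in one auditable place.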