Software Testing Guide Part V

i) Test any modifications to the system to ensure that no new problems are introduced and that operational performance is not degraded by the changes.

ii) Any change made to the system after the completion of any phase of testing, or after the final testing of the system, must be subjected to a thorough regression test. This is to ensure that the effects of the change are transparent to other areas of the system and to other systems that interface with it.
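As a sketch, the regression principle above can be automated by re-running a baseline of previously passing cases after every change; the `calculate_discount` function and its cases below are purely hypothetical, not part of the guide:

```python
# Hypothetical regression check: re-run cases that passed before a change
# to confirm the modification has not degraded existing behaviour.
def calculate_discount(order_total):
    """Hypothetical business function under regression test."""
    if order_total >= 1000:
        return order_total * 0.10
    if order_total >= 500:
        return order_total * 0.05
    return 0.0

# Baseline cases captured before the modification; every one must still pass.
REGRESSION_CASES = [
    (1200, 120.0),
    (500, 25.0),
    (100, 0.0),
]

def run_regression_suite():
    """Return the cases whose behaviour changed; empty means the change
    is transparent to these areas of the system."""
    return [(inp, expected, calculate_discount(inp))
            for inp, expected in REGRESSION_CASES
            if calculate_discount(inp) != expected]

print(run_regression_suite())  # -> []
```

If a later modification alters any baseline result, the suite surfaces the exact case, which is the "transparency" the text asks for.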

iii) The project team must create test data based on predefined specifications. The original test data should come from the other levels of testing and should then be modified along with the test cases.

14.8.3 Acceptance Testing

Acceptance testing is performed on a collection of business functions in a production environment after the completion of functional testing. It is the final stage in the testing process before the system is accepted for operational use. It involves testing the system with data supplied by the customer or the site visitor, rather than with the simulated data developed as part of the testing process.

Acceptance testing often reveals errors and omissions in the system requirements definition: the requirements may not reflect the actual facilities and performance required by the users, and acceptance testing may demonstrate that the system does not exhibit the anticipated performance and functionality. A successful acceptance test confirms that the system is ready for production.

Running a pilot for a select set of customers helps in acceptance testing of an e-commerce site. A survey is conducted among these site visitors on different aspects of the Web site, such as user friendliness, convenience, visual appeal, relevance, and responsiveness.

A sample of test plan and test cases on Web Testing has been added in Appendix – 3.

15 Guidelines to Prepare a Test Plan

15.1 Preparing Test Strategy

The test strategy should be:

· Risk based – the amount and rigour of testing at each stage corresponds to the risk of failure due to errors.

· Layered and staged – aligned with the development process, testing aims to detect different errors at each stage.

· Prepared early – to allow testing to be planned, resourced, and scheduled, and to identify any tool requirements.

· Capable of automation – throughout the life cycle there are opportunities for automated support, particularly in the areas of requirements testing, test execution, and test management.

Since the strategy is aimed at addressing the risks of a client/server development, knowledge of the risks of these projects is required.

15.2 Standard Sections of a Test Plan

A standard test plan contains the following sections, each explained below:

1. Introduction

Set goals and expectations of the testing effort. Summarise the software items and software features to be tested. The purpose of each item and its history may be included.

References to the following documents, if they exist, are required in the highest-level test plan:

· Project authorisation

· Project plan

· Relevant policies

· Relevant standards

In multilevel test plans, each lower level plan must reference the next higher level plan.

1.1 Purpose

Describe the purpose of the test plan. Multiple components can be incorporated into one test plan.

1.2 System Overview

This section provides an overview of the project and identifies critical and high-risk functions of the system.

1.3 Test Team Resources

The composition of the ABC Project test team is outlined within the test team profile depicted in Table 9.1. This table identifies the test team positions on the project together with the names of the personnel who will fill these positions. The duties to be performed by each person are described, and the skills of the individuals filling the positions are documented. The last two columns reflect the years of experience for each test team member with regard to total test program experience as well as years of experience with the designated test management tool for the project.

Table 9.1 Test Team Profile

Position | Name | Duties/Skills | Test Experience (years) | Test Tool Experience (years)

2. Test Environment

2.1 Test Environment

The test environment mirrors the production environment. This section describes the hardware and software configurations that compose the system test environment. The hardware must be sufficient to ensure complete functionality of the software. Also, it should support performance analysis aimed at demonstrating field performance.

2.2 Automated Tools

List any testing tools you may want to use.

2.3 Test Data

Working in conjunction with the database group, the test team will create the test database. The test database will be populated with unclassified production data.
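One hedged sketch of how production data might be kept unclassified when populating the test database: sensitive fields are replaced with deterministic one-way hashes. The field names and the masking rule are illustrative assumptions:

```python
import hashlib

def mask_record(record, sensitive_fields):
    """Copy a production record, replacing sensitive values with truncated
    SHA-256 digests. Hashing is deterministic, so equal production values
    stay equal in the test data and referential integrity is preserved."""
    masked = dict(record)
    for field in sensitive_fields:
        if field in masked:
            digest = hashlib.sha256(str(masked[field]).encode()).hexdigest()
            masked[field] = digest[:12]
    return masked

# Hypothetical production row; only the owner's name is classified here.
prod_row = {"account": "ACC-1001", "owner_name": "Jane Doe", "balance": 250.75}
test_row = mask_record(prod_row, sensitive_fields=["owner_name"])
```

Non-sensitive fields pass through untouched, so the test database keeps realistic volumes and distributions without exposing classified values.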

3. Test Program

3.1 Scope of testing

List the features to be tested, including particular fields and their expected values. Identify the software features to be tested and the test-design specification associated with each feature.

3.2 Areas beyond the scope

Identify all features and significant combinations of features that will not be tested, and the reasons why.

3.3 Test Plan Identifier

Specify a unique identifier to be used when referring to this document.

Naming Convention for Test Identifier

ABCXXYYnn

ABC - First 3 letters of the project name

XX - Type of testing code, e.g.:

System Testing - ST
Integration Testing - IT
Unit Testing - UT
Functional Testing - FT

YY - Document code/abbreviation within the type of testing, e.g.:

Test Plan - TP
Test Case - TC
Test Result - TR
Test Specifications - TS
Defect Reports - DR

nn - Version serial number

Identifier | Document | Remark
ARCSTTP1.0 | System Test Plan | MS-Word document
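The ABCXXYYnn convention above can be validated mechanically. The parser below is an illustrative sketch (the function name and error handling are assumptions) that accepts identifiers such as ARCSTTP1.0:

```python
import re

# Codes taken from the naming convention in the text.
TEST_TYPES = {"ST": "System Testing", "IT": "Integration Testing",
              "UT": "Unit Testing", "FT": "Functional Testing"}
DOC_TYPES = {"TP": "Test Plan", "TC": "Test Case", "TR": "Test Result",
             "TS": "Test Specifications", "DR": "Defect Reports"}

# ABC = 3-letter project code, XX = testing type, YY = document code,
# nn = version serial number (digits and dots).
_ID_PATTERN = re.compile(r"^([A-Z]{3})([A-Z]{2})([A-Z]{2})([\d.]+)$")

def parse_test_identifier(identifier):
    """Split an ABCXXYYnn identifier into its named parts."""
    match = _ID_PATTERN.match(identifier)
    if not match:
        raise ValueError(f"Not a valid test identifier: {identifier!r}")
    project, test_code, doc_code, version = match.groups()
    return {"project": project,
            "test_type": TEST_TYPES.get(test_code, test_code),
            "document": DOC_TYPES.get(doc_code, doc_code),
            "version": version}

info = parse_test_identifier("ARCSTTP1.0")
# info["test_type"] == "System Testing", info["document"] == "Test Plan"
```

Checking identifiers this way at document check-in keeps the multilevel plans traceable to each other.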

3.4 Test Item

Identify the test items including their version/revision level. Also specify characteristics of their transmittal media which impact hardware requirements or indicate the need for logical or physical transformations before testing can begin. (An example would be that the code must be placed on a production server or migrated to a special testing environment separate from the development environment.)

Supply references to the following item documentation, if it exists:

· Requirements specification

· Design specification

· User guide

· Operations guide

· Installation guide

· Any incident reports relating to the test items

Items that are to be specifically excluded from testing may be identified.

3.5 Test Schedule

Include test milestones identified in the software project schedule as well as all item transmittal events. Define any additional test milestones needed. Estimate the time required for each testing task. Specify the schedule for each testing task and test milestone. For each testing resource (that is, facilities, tools, and staff), specify its periods of use.

3.6 Test Approach

Describe the general approach to testing software features and how this approach will ensure that these features are adequately tested. Specify the major activities, techniques, and tools that will be used to test the described features. The description should include such detail as identification of the major testing tasks and an estimate of the time required to do each one.

3.6.1 Test Coverage

Determine the adequacy of the test plan: indicate whether branch or multiple-condition coverage is required, and whether all the conditions are covered by the test cases.
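To make the coverage idea concrete, the hypothetical function below has two decision points; branch coverage is adequate only when the test cases drive each condition both true and false:

```python
# Illustrative function with two decision points (not from the guide).
def classify(age, member):
    if age < 18:          # branch 1: true / false
        category = "minor"
    elif member:          # branch 2: true / false
        category = "member"
    else:
        category = "guest"
    return category

# One case per branch outcome; together they cover every branch once.
branch_cases = [
    ((10, False), "minor"),   # branch 1 true
    ((30, True),  "member"),  # branch 1 false, branch 2 true
    ((30, False), "guest"),   # branch 1 false, branch 2 false
]
results = [classify(*args) == expected for args, expected in branch_cases]
```

Dropping any one of the three cases would leave a branch untested even though every line of the function might still execute, which is why the plan should name the coverage criterion explicitly.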

3.6.2 Fixes and Regression Testing

Describe how defect fixes will be verified and the scope of regression testing to be performed after each fix.

3.6.3 Preparation of test Specification

This contains the criteria and references used to develop the test specification. The type of testing (black box or white box) is to be mentioned.
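A small sketch of the black-box style mentioned here: the cases are derived from the specification alone (the Gregorian leap-year rule is used as a stand-in item under test), while a white-box specification would instead target the code's internal branches:

```python
# Hypothetical item under test; a black-box tester sees only its spec:
# "a year is a leap year if divisible by 4, except century years not
# divisible by 400".
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Black-box cases taken straight from the specification's rules,
# without reading the implementation above.
spec_cases = {2024: True, 1900: False, 2000: True, 2023: False}
verdicts = {y: is_leap_year(y) == expected for y, expected in spec_cases.items()}
```

The test specification would record which of the two styles produced each case, since that determines what coverage claims can be made.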

3.6.4 Pass/Fail Criteria

Specify the criteria to be used to determine whether each test item has passed or failed testing. If no basis exists for passing or failing test items, explain how such a basis could be created and what steps will be taken to do so.
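As an illustration, a pass/fail criterion can be made executable; the result-match rule and the response-time threshold below are assumed values for the sketch, not prescribed by the guide:

```python
# Assumed performance threshold agreed in the test plan.
MAX_RESPONSE_SECONDS = 2.0

def verdict(expected, actual, elapsed_seconds):
    """Apply the pass/fail criteria: the item passes only when the actual
    result matches the expected result AND the response time is within
    the agreed threshold."""
    if actual != expected:
        return "FAIL: wrong result"
    if elapsed_seconds > MAX_RESPONSE_SECONDS:
        return "FAIL: too slow"
    return "PASS"

print(verdict("OK", "OK", 0.4))   # PASS
print(verdict("OK", "OK", 3.5))   # FAIL: too slow
```

Writing the criteria as code removes ambiguity about borderline outcomes, which is exactly what this section of the plan is meant to settle.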

3.6.5 Suspension Criteria and Resumption Requirements

Specify the criteria used to suspend all or a portion of the testing activity on the test items associated with this plan. Specify the testing activities that must be repeated when testing resumes.

3.6.6 Defect Tracking

To track defects, a defect workflow process has to be implemented.
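One possible defect workflow, sketched as a small state machine; the state names and allowed transitions are assumptions for illustration, not a prescribed process:

```python
# Legal transitions of the assumed defect workflow; anything else is rejected.
ALLOWED = {
    "NEW": {"OPEN", "REJECTED"},
    "OPEN": {"FIXED"},
    "FIXED": {"VERIFIED", "REOPENED"},
    "REOPENED": {"FIXED"},
    "VERIFIED": {"CLOSED"},
    "REJECTED": set(),
    "CLOSED": set(),
}

class Defect:
    def __init__(self, defect_id):
        self.defect_id = defect_id
        self.state = "NEW"
        self.history = ["NEW"]  # audit trail for defect reports

    def move_to(self, new_state):
        """Advance the defect, refusing transitions the workflow forbids."""
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"{self.state} -> {new_state} is not allowed")
        self.state = new_state
        self.history.append(new_state)

d = Defect("ARCSTDR01")
for state in ("OPEN", "FIXED", "VERIFIED", "CLOSED"):
    d.move_to(state)
```

Enforcing the transitions in the tracking tool guarantees, for example, that a fix cannot be closed without independent verification.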

3.6.7 Constraints

Identify significant constraints on testing such as test item availability, testing resource availability, and deadlines.

3.6.8 Entry and Exit Criteria

Specify the conditions that must be met before testing can begin and the conditions under which testing is considered complete.

3.6.9 Test Deliverables

Identify all documents relating to the testing effort. These should include the following documents: Test Plan, Test-Design Specifications, Test-Case Specifications, Test-Procedure Specifications, Test-Item Transmittal Reports, Test Logs, Test-Incident Reports, and Test-Summary Reports.

Also identify test input/output data, test drivers, testing tools, etc.

3.6.10 Dependencies and Risk

Identify the high-risk assumptions of the test plan. Specify contingency plans for each.

3.6.11 Approvals

Specify the names and titles of all persons who must approve this plan. Provide space for the signatures and dates.

16 Amendment History

V1.0 – First Release

17 Guideline for Test Specifications

An overall plan for integration of the software and a description of specific tests are documented in the Test Specification. This document contains a test plan and a test procedure, is a work product of the software process, and becomes part of the software configuration. A history of actual test results, problems, and peculiarities is recorded in this document.

Sr. | Test Items | Test Condition | Expected Results | Observed Results | Remarks

References

1. Kaner, Cem (1997). Improving the Maintainability of Automated Test Suites. www.kaner.com/lawst1.htm

2. Myers, Glenford J. (1979). The Art of Software Testing. John Wiley & Sons.

3. Pressman, Roger S. Software Engineering: A Practitioner's Approach, 5th ed. McGraw-Hill.

4. Beizer, Boris (1995). Black-Box Testing: Techniques for Functional Testing of Software and Systems. John Wiley & Sons.

5. Dustin, Elfriede; Rashka, Jeff; Paul, John (1999). Automated Software Testing: Introduction, Management, and Performance. Addison-Wesley.

Appendix – 1

List of Testing Tools:

Business Analysis Phase

· Business Modeling Tools – Allow for recording definitions of user needs and automating the rapid construction of flexible, graphical, client-server applications. Examples: Oracle Designer 2000, Rational Rose.

· Configuration Management Tools – Allow for baselining important data repositories. Examples: Rational ClearCase, PVCS.

· Defect Tracking Tools – Manage system life-cycle defects. Examples: TestTrack, Census, PVCS Tracker, Spyder.

· Technical Review Management Tools – Facilitate communication while automating the technical review/inspection process. Example: ReviewPro (Software Development Technologies).

· Documentation Generators – Automate document generation. Example: Rational SoDA.

Requirements Definition Phase

· Requirements Management Tools – Manage and organize requirements; allow for test procedure design and test progress reporting. Examples: Rational RequisitePro, QSS DOORS.

· Requirements Verifiers – Verify syntax, semantics, and testability. Example: Aonix Validator/Req.

· Use Case Generators – Allow for creation of use cases. Example: Rational Rose.

Analysis and Design Phase

· Database Design Tools – Provide a solution for developing second-generation enterprise client-server systems. Examples: Oracle Developer 2000, Erwin, Popkin's, Terrain by Cayenne.

· Application Design Tools – Help define software architecture; allow for object-oriented analysis, modeling, design, and construction. Examples: Rational Rose, Oracle Developer 2000, Popkin's, Platinum, ObjectTeam by Cayenne.

· Structure Charts, Flowcharts, and Sequence Diagrams – Help manage processes. Example: Micrografx FlowCharter 7.

· Test Procedure Generators – Generate test procedures from requirements, design, or data and object models. Examples: Aonix Validator, StP/T from IDE, Rational TestStudio.

Programming Phase

· Syntax Checkers/Debuggers – Allow for syntax checking and debugging; usually come built into the programming language compiler. Examples: miscellaneous language compilers (C, C++, VB, PowerBuilder).

· Memory Leak and Runtime Error Detection Tools – Detect runtime errors and memory leaks. Example: Rational Purify.

· Source Code Testing Tools – Verify maintainability, portability, complexity, cyclomatic complexity, and standards compliance. Examples: CodeCheck from Abraxas Software, Visual Quality from McCabe & Associates.

· Static and Dynamic Analyzers – Depict the quality and structure of code. Examples: LDRA Testbed, Discover.

· Various Code Implementation Tools – Depending on the application, support code generation, among other things. Examples: PowerJ, JBuilder, SilverStream, Symantec Café.

· Unit Test Tools – Automate the unit testing process. Example: MTE from IntegriSoft.

Metrics Tools

· Code (Test) Coverage Analyzers / Code Instrumentors – Identify untested code and support dynamic testing. Examples: STW/Coverage, Software Research TCAT, Rational PureCoverage, IntegriSoft Hindsight, EZCover.

· Usability Measurement Tools – Provide usability testing as conducted in usability labs. Example: ErgoLight.

Other Testing Life-Cycle Support Tools

· Test Data Generators – Generate test data. Examples: TestBytes, Rational Performance Studio.

· Prototyping Tools – Allow for prototyping of applications, using programming languages like Visual Basic or tools like Access 97. Examples: VB, PowerBuilder.

· File Compare Utilities – Allow for searching for discrepancies between files that should be identical in content. Often part of capture/playback tools such as Rational TeamTest, GMR Technologies' D2K/PLUS, and Software Research's EXDIFF.

· Simulation Tools – Simulate an application to measure scalability, among other tasks. Example: OPNET.

Testing Phase

· Test Management Tools – Allow for test management. Examples: Rational Suite TestStudio, TestDirector from Mercury Interactive.

· Network Testing Tools – Allow for monitoring, measuring, testing, and diagnosis of performance across the entire network. Examples: NETClarity, Applied and Computer Technology ITF.

· GUI Testing Tools (Capture/Playback) – Allow for automated GUI tests; capture/playback tools record interactions with online systems so they may be replayed automatically. Examples: Rational Suite TestStudio, Visual Test, Mercury Interactive's WinRunner, Segue's Silk, STW/Regression from Software Research, AutoScriptor Inferno, Automated Test Facility from Softbridge, QARun from Compuware.

· Non-GUI Test Drivers – Allow for automated execution of tests for products without a graphical user interface.

· Load/Performance Testing Tools – Allow for load/performance and stress testing. Example: Rational Performance Studio.

· Web Testing Tools – Allow for testing of Web applications. Examples: Segue's Silk, ParaSoft's Jtest (Java).

· Environment Testing Tools – Various testing tools are on the market for various testing environments. Examples: Mercury Interactive's XRunner, Rational's PreVue-X.
