Software Testing Guide Part VI

Appendix - 2

Sample system test plan

1.0 Introduction

1.1 Purpose

This test plan for ABC version 1.0 should support the following objectives.

1. To detail the activities required to prepare for and conduct the system test.

2. To communicate to all responsible parties the tasks they are to perform and the schedule to be followed in performing the tests.

3. To define the sources of the information used to prepare the plan.

4. To define the test tools and environment needed to conduct the system test.

1.2 Background

ABC is an integrated set of software tools developed to extract raw information and data flow information from C programs and then identify objects, patterns, and finite state machines.

1.3 Test Team Resources

The composition of the ABC Project test team is outlined within the test team profile depicted in Table 2A. This table identifies the test team positions on the project together with the names of the personnel who will fill these positions. The duties to be performed by each person are described, and the skills of the individuals filling the positions are documented. The last two columns reflect the years of experience for each test team member with regard to total test program experience as well as years of experience with the designated test management tool for the project.

Table 2A Test Team Profile

| Position | Name | Duties/Skills | Test Experience (years) | Test Tool Experience (years) |
| --- | --- | --- | --- | --- |
| Test manager | Mr. X | Responsible for test program, customer interface, recruiting, test tool introduction, and staff supervision. Skills: MS Project, C, test tool experience. | 12 | 1 |
| Test lead | Miss A | Performs staff supervision, cost/progress status reporting, and test planning/design/development and execution. Skills: TeamTest, SQL, SQA Basic, UNIX, MS Access, C/C++, SQL Server. | 5 | 3 |
| Test engineer | Mr. D | Performs test planning/design/development and execution. Skills: test tool experience, C. | 2 | 5 |
| Test engineer | Miss T | Responsible for test tool environment, network and middleware testing. Performs all other test activities. Skills: CNE, UNIX, C/C++. | 1 | – |
| Junior test engineer | Miss J | Performs test planning/design/development and execution. Skills: C/C++. | – | – |

2.0 Test Environment

2.1 Hardware & Software

Hardware

The tests will be conducted on one of the machines licensed to run REFINE/C.

Software

REFINE/C, ABC modules, etc.

2.2 Automated Tools

One tool we will use is DGL, a test-case generator. We will build a C grammar and use it to generate random C code. While the generated code will not be syntactically correct in all cases, it will be useful for stress testing. The main purpose of DGL, therefore, is to generate arbitrary code that gives the test team ideas for tests that might otherwise not be considered.
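DGL's own grammar syntax is not reproduced here; the following Python sketch only illustrates the underlying idea of grammar-driven random code generation. The toy `GRAMMAR` table and `generate` function are assumptions for illustration, not DGL's actual input format or API.

```python
import random

# A toy fragment of a C grammar. Nonterminals are upper-case keys;
# each maps to a list of alternative expansions. Terminals (tokens
# not in the table) are emitted as-is.
GRAMMAR = {
    "STMT": [
        ["EXPR", ";"],
        ["if", "(", "EXPR", ")", "STMT"],
        ["while", "(", "EXPR", ")", "STMT"],
        ["{", "STMT", "STMT", "}"],
        ["return", "EXPR", ";"],
    ],
    "EXPR": [
        ["ID"],
        ["ID", "=", "EXPR"],
        ["EXPR", "+", "EXPR"],
    ],
    "ID": [["x"], ["y"], ["n"]],
}

def generate(symbol, depth=0, max_depth=6):
    """Recursively expand a nonterminal into a random token list."""
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal token
    choices = GRAMMAR[symbol]
    # Past the depth limit, prefer the shortest alternative so the
    # expansion terminates instead of recursing forever.
    rule = (min(choices, key=len) if depth >= max_depth
            else random.choice(choices))
    tokens = []
    for part in rule:
        tokens.extend(generate(part, depth + 1, max_depth))
    return tokens

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate("STMT")))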

2.3 Test Data

Test data will be generated, and a test database created, in discussion with the development team. The customer may also be contacted.

3.0 Test Program

3.1 Scope of Testing

This test plan covers a complete "black box" or functional test of the associated program modules (below). We assume the correctness of the REFINE/C grammar and parser. We also assume that "white box" or "glass box" testing will be done prior to these tests. We will not explicitly test the interfaces between modules.

3.2 Area Beyond the Scope

There is no need to test security, recovery, or performance for this system.

3.3 System Test Plan Identifier

| Identifier | Document | Remark |
| --- | --- | --- |
| ABC-STTP1.0 | System Test Plan | MS-Word Document |

3.4 Test Items

Program Modules

The program modules are detailed below. The design documents and the references mentioned below will provide the basis for defining correct operation.

Design Documents

These are links to the program module design documents. The indentation shows the dependencies between modules: modules at the same level do not depend upon each other, and an inner (more deeply indented) module depends upon its outer levels. All modules depend either directly or indirectly on "Interface to REFINE and REFINE/C". For example, Control Dependency Graphs and Reaching Definitions both depend upon Control Flow Graphs but are independent of each other, and so on. A sketch that derives a test order from these dependencies follows the list.

Interface to REFINE and REFINE/C

    Control Flow Graphs

        Control Dependency Graphs

        Reaching Definitions

            Data Dependency Graphs

    Canonicalize Variables

        Variable Dependency Graphs

            Cohesion Processing

    Slice Module
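The nesting shown above is a reconstruction: only the dependencies stated in the prose (everything on the interface module; Control Dependency Graphs and Reaching Definitions on Control Flow Graphs) are confirmed, and the remaining edges are assumptions. Given such a dependency map, a bottom-up order for running the module test suites can be derived with a topological sort, as in this Python sketch:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each module maps to the set of modules it depends on, mirroring
# the indented list above. Edges beyond those the prose states are
# illustrative assumptions.
deps = {
    "Control Flow Graphs":        {"Interface to REFINE and REFINE/C"},
    "Control Dependency Graphs":  {"Control Flow Graphs"},
    "Reaching Definitions":       {"Control Flow Graphs"},
    "Data Dependency Graphs":     {"Reaching Definitions"},
    "Canonicalize Variables":     {"Interface to REFINE and REFINE/C"},
    "Variable Dependency Graphs": {"Canonicalize Variables"},
    "Cohesion Processing":        {"Variable Dependency Graphs"},
    "Slice Module":               {"Interface to REFINE and REFINE/C"},
}

# static_order() yields modules with their dependencies first, which
# is the order in which their test suites should be run.
for module in TopologicalSorter(deps).static_order():
    print(module)
```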

Test Documents

This set of links[1] points to the root of each individual module's test document tree.

· Interface to REFINE and REFINE/C

· Control Flow Graphs

· Control Dependency Graphs

· Reaching Definitions

· Data Dependency Graphs

· Canonicalize Variables

· Variable Dependency Graphs

· Cohesion Processing

· Slice Module

3.5 Test Schedule

A detailed test schedule (portion of schedule) is given below:

| Task ID | Task Description | Duration | Start | Finish | Responsibility |
| --- | --- | --- | --- | --- | --- |
| 1 | Develop test responsibilities | 1d | 11/25 | 11/25 | PM |
| 2 | Develop review and reporting methods | 1d | 11/26 | 11/26 | PL |
| 3 | Develop management of test sessions | 1d | 11/27 | 11/27 | PM/PL |
| 4 | Verify change-control activities | 1d | 11/27 | 11/27 | PL |
| 5 | Develop issue/problem reporting standards | 1d | 11/30 | 11/30 | PL |
| 6 | Develop test procedures | 59d | 12/12 | 2/12 | PL |
| 7 | Develop functional/usability test procedures | 55d | 12/12 | 2/8 | PL |
| 8 | Develop security test procedures | 15d | 12/22 | 1/7 | PL |
| 9 | Develop stress/volume test procedures | 16d | 1/7 | 1/23 | PL |
| 10 | Develop performance test procedures | 14d | 1/23 | 1/27 | PL |

3.6 Test Approach

The test personnel will use the design document references in conjunction with the ANSI C grammar by Jutta Degener to devise a comprehensive set of test cases. The aim will be to have a representative sample of any possible construct that the module should handle. For example, in testing the Control Flow Graph module, we would want cases containing various combinations of iteration-statements, jump-statements, and selection-statements; a sketch of such systematic combination follows.
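As an illustration of that combinatorial approach, this Python sketch enumerates pairwise nestings of iteration-, selection-, and jump-statements to produce C fragments for the Control Flow Graph tests. The statement templates are assumptions chosen for illustration:

```python
from itertools import product

# Assumed statement templates for the three construct classes named
# above; "{body}" marks where the inner construct is nested.
iteration = ["while (x) {{ {body} }}",
             "for (i = 0; i < n; i++) {{ {body} }}"]
selection = ["if (x) {{ {body} }}",
             "if (x) {{ {body} }} else {{ y = 0; }}"]
jump      = ["break;", "continue;", "goto done;"]

# Every iteration/selection pair wrapped around every jump statement:
# a representative (not exhaustive) sample of nested constructs.
# Each fragment still needs wrapping in a function (and a "done:"
# label for the goto case) before it will compile.
for i, (outer, inner, leaf) in enumerate(product(iteration, selection, jump), 1):
    fragment = outer.format(body=inner.format(body=leaf))
    print(f"/* case {i:2d} */ {fragment}")
```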

3.6.1 Fixes and Regression Testing

The complete set of test cases developed for a particular module will be rerun after program changes to correct errors found in that module during the course of testing.

3.6.2 Comprehensiveness

Using the C grammar as a basis for generating the test cases should result in a comprehensive set of test cases. We will not necessarily try to exhaustively cover all permutations of the grammar, but will strive for a representative sample of the permutations.

3.6.3 Pass/Fail Criteria

The initial run of tests on any given module will be verified by one of the test team personnel. After these tests are verified as correct, they will be archived and used as an oracle for automatic verification for additional or regression testing. As an example, when testing the Control Flow Graph module, an output is deemed correct if the module outputs the correct set of nodes and edges for a particular input C program fragment.
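A minimal sketch of this archive-then-compare scheme, assuming a hypothetical tests/cfg directory of .c inputs with one .expected oracle file per case and a hypothetical run-cfg-module driver script (both names are placeholders, not part of the actual project):

```python
import subprocess
from pathlib import Path

CASES = Path("tests/cfg")            # assumed directory layout
MODULE_CMD = ["./run-cfg-module"]    # assumed driver for the module

def run_case(case: Path) -> str:
    """Run the module under test on one input and capture its output."""
    result = subprocess.run(MODULE_CMD + [str(case)],
                            capture_output=True, text=True, check=True)
    return result.stdout

failures = []
for case in sorted(CASES.glob("*.c")):
    expected = case.with_suffix(".expected")
    actual = run_case(case)
    if expected.exists():
        # Regression mode: compare against the archived oracle.
        if actual != expected.read_text():
            failures.append(case.name)
    else:
        # First run: archive the output for manual verification.
        expected.write_text(actual)
        print(f"{case.name}: archived; verify by hand")

print(f"{len(failures)} regression failure(s): {failures}")
```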

3.6.4 Suspension Criteria and Resumption Requirements

N/A

3.6.5 Defect Tracking

Defects will be tracked using <Name of the defect-tracking tool>. The status of a bug, recorded in the tool, will follow the tool's standard status identification terms and methodology throughout its lifecycle. Status could be any of the following:

Example:

1. OpenForDev: Bug identified and reported to the development team.

2. OpenForQA: Bug fixed and sent back to QA for verification.

3. Fixed: Bug fix verified by QA.
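As a minimal sketch, this lifecycle can be modelled as a small state machine in Python. The transition from OpenForQA back to OpenForDev on a failed verification is an assumption not stated in the list above:

```python
from enum import Enum

class BugStatus(Enum):
    OPEN_FOR_DEV = "OpenForDev"   # bug reported to development
    OPEN_FOR_QA  = "OpenForQA"    # fixed, awaiting QA verification
    FIXED        = "Fixed"        # fix verified by QA

# Legal transitions; a failed verification sends the bug back to
# development (assumed, see lead-in).
TRANSITIONS = {
    BugStatus.OPEN_FOR_DEV: {BugStatus.OPEN_FOR_QA},
    BugStatus.OPEN_FOR_QA:  {BugStatus.FIXED, BugStatus.OPEN_FOR_DEV},
    BugStatus.FIXED:        set(),
}

def move(current: BugStatus, new: BugStatus) -> BugStatus:
    """Apply a status change, rejecting transitions the lifecycle forbids."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current} -> {new}")
    return new
```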

3.6.6 Constraints

We need to plug in our deadline(s) here. Testing deadlines are not firm at this time.

3.6.7 Test Deliverables

These will be detailed at the module test plan level.

We have decided that because the vast majority of the test cases will be submitted by the test team personnel, there is no need for a Test Item Transmittal Report. If there are any test cases submitted by outside parties, we will handle these as if they were a change request. This means that the test team must approve of the reason for the test and the specific test before it will be placed into the appropriate module's test suite.

3.6.8 Dependencies and Risk

| Risk | Effect | Resolution | Chance of Occurrence |
| --- | --- | --- | --- |
| Attrition rate | | | Very High/High/Medium/Low/Very Low |
| Hardware sharing with other projects | | | Very High/High/Medium/Low/Very Low |
| Non-availability of H/W & S/W in time | | | Very High/High/Medium/Low/Very Low |

3.6.9 Approvals

Test Manager/Date

Development Project Manager/Date

Quality Assurance Manager/Date

Appendix - 3

Sample Test Plan for Web Testing:

PRE-DELIVERY TEST PLAN

Plan ID: TPA100

Date: 28-Dec-00

Project : ABC 1.00

AMENDMENT HISTORY

| Amendment ID | Version | Date | Amendment | Author |
| --- | --- | --- | --- | --- |
| 01 | 1.00 | 28-Dec-2000 | Document created | Sanjay |

Brief description of the product:

This project is an e-commerce order-placement site. The client is a US-based organisation, "ABCCables", which deals in various types of cable. Cables are available in standard sizes, and the length of a cable can also be customised. The site is restricted to users within the US only.

< Name of the Company > has developed a dynamic site that displays the product range from the database and allows a user to select items and place an order over the Internet.

The testing is being planned on special request from the Project Manager.

Test Objectives

The objective is to conduct black box testing of the system from user perspective.

Test Scope

1. Testing of all the features submitted by the Project Manager is to be done thoroughly. All these documents are available in the baseline library under the SRS folder of ABC Cables.

2. Since the project has been developed and used with Internet Explorer, the testing should be done under Netscape. The test cases should cover:

· Verification of the maximum number of items (150) selected by all the available methods.

· Normal (below 150 pounds total weight of selected items), boundary (exactly 150 pounds), and extreme (above 150 pounds) shopping conditions should be tested on the live site; a sketch of these boundary cases follows this section.

· Testing should be done on the local site for extreme conditions: a large quantity (9999) of an item, a large invoice value (9999999999.99), a large number of items (100) in the shopping cart, and a large number of operations (approx. 50, other than adding items) on the shopping cart.

3. Coverage of the System (Based on Working Model & Specification document):

· Menu options – 100%.

· Functionality – 100%, based on the specification document submitted by the Project Manager.

· User interface – 75% (mainly covering the general look and feel, screen appearance, and pop-up menu of each type of page).

· Navigation – 30% (mainly covering switching from one page to another through search (15% of items) and links (15% of items), and movement within the page).

· Security – 75%, covering in detail login and logout for registered users (at least one of each type) and some invalid conditions.
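The weight-boundary cases above lend themselves to a small parameterised test. A minimal pytest sketch follows; `cart_accepts` is a hypothetical stand-in for the real cart rule, and the assumption that orders above 150 pounds are rejected must be confirmed against the specification:

```python
import pytest

WEIGHT_LIMIT = 150  # pounds, per the boundary stated above

def cart_accepts(total_weight: float) -> bool:
    """Hypothetical stand-in for the real shopping-cart rule:
    assumes orders up to and including the limit are accepted."""
    return total_weight <= WEIGHT_LIMIT

@pytest.mark.parametrize("weight,expected", [
    (149.9, True),    # normal: below the limit
    (150.0, True),    # boundary: exactly at the limit
    (150.1, False),   # extreme: just above the limit (assumed rejected)
])
def test_weight_boundary(weight, expected):
    assert cart_accepts(weight) == expected
```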

Test Environment

· Pentium-based PC and 486 DX.

· Netscape < Ver > and Internet Explorer < Ver >

· Modem – < Detail of the Modem >; for example, US Robotics 28,800 fax modem with V.34 & V.32 bis.

· Internet site http://000.00.000.00

· Local site http://ABC_ntserver/Abc.com

Stop Criteria

· All the test cases are tested at least once.

· All the critical defects are addressed and verified.

· All the unresolved defects are analysed and marked with the necessary comments/status.

· In the last iteration of testing, there are no critical defects reported.

Test Process

· Testing team will prepare Test Case List & Test Cases based on the documents provided by the development team.

· Testing shall be done based on these test cases and Test Report will be prepared.

· The bugs encountered shall be reported using <Name of the Defect tracking System > simultaneously.

· The decision for Project acceptance or rejection will be based on the feedback from the Project Manager.

· The verification of the fixed defects shall be done after the release of fresh software (if required).

· In case of a defect that does not allow a test case to be tested completely, no further testing will be done on that test case. During verification of the fixed defect, complete testing of the test case will be repeated.

· Testing team will maintain the status and criticality of each reported defect.

· The process of defect finding and verification shall be iterated until the stop criteria are satisfied.

Human Resources

All the team members of the Testing Group < Name of the Team members > will be involved in testing. However, depending on other tasks and resource availability, reallocation may be done.

Reporting

· After the completion of each test cycle, the Testing Head will submit the defect report and state whether the software is rejected or not.

Training Requirement

· Domain Area / Application knowledge

The Project Manager has provided the necessary training.

· Operational

Since this is an Internet site, operational knowledge is acquired by working on the site.

Sample Test cases For Login Page

Topic: Login

Functionality: Login

Reference(s): Nil.

Data set should cover the normal, boundary and extreme cases of data for each field in the screen concerned.

1. The testing should be done for the following valid conditions at least once:

· Login as a privileged user (with all 7 types), add 5 randomly selected items, and verify the cost of each item against the percentage of discount allowed to that category of user.

· Verify that after a successful login, control goes to the screen displaying the catalogue of ultra spec cables.

· Search at least 2 items with no further sub-levels (after successful login).

· Search at least 3 items with further sub-levels (after successful login).

· Clicking on an item category displays that category in detail, i.e. shows the contents or items available under that category.

2. The testing should be done for the following invalid conditions:

· Try to log in with a non-existing login name.

· Try to log in without entering a login name (blank).

3. Testing for the user interface issues of the screen should be done covering the following points:

· Make sure that control(s), caption(s), text, etc. are clearly visible and look fine.

· Make sure that Alt+Tab works for switching between different open applications.

· Make sure that the pop-up menu is context sensitive.

· Make sure that the heading, sub-heading, and normal text are clearly identifiable by font size, attribute, and colour.

· Make sure that the company's logo is clearly visible.
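The valid and invalid login conditions above lend themselves to automation. A modern, hedged sketch using Selenium follows; the element IDs, the use of the plan's local-site URL, and the "catalogue in the page title" success check are all assumptions to be replaced with the page's real locators and behaviour:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE = "http://ABC_ntserver/Abc.com"   # local test site from the plan

def try_login(driver, user, password):
    """Submit the login form; return True if the catalogue page loads.
    Element IDs and the title check are assumptions."""
    driver.get(BASE)
    driver.find_element(By.ID, "login").send_keys(user)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "submit").click()
    return "catalogue" in driver.title.lower()

driver = webdriver.Firefox()
try:
    # Valid: a registered, privileged user lands on the catalogue screen.
    assert try_login(driver, "priv_user1", "secret")
    # Invalid: a non-existing name and a blank name must both be refused.
    assert not try_login(driver, "no_such_user", "x")
    assert not try_login(driver, "", "")
finally:
    driver.quit()
```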

Testing Information

Environment:

Iteration #:

Start Date:

End Date:

Tester Name:

Status:

Sample Test cases for First page

Topic: General

Functionality: All

Reference(s): Nil.

Data set should cover the normal, boundary and extreme cases of data for each field in the screen concerned.

1. The testing should be done for the following valid conditions at least once:

1.1. About Us:

· Make sure that there are no spelling mistakes.

1.2. Ordering Information:

· Make sure that there are no spelling mistakes.

· Make sure that adequate information is given.

1.3. Terms and Conditions:

· Make sure that there are no spelling mistakes.

· Make sure that adequate information is given.

· Make sure that all 16 hypertext links are functioning properly.

2. The testing should be done for the following invalid conditions:

· Try to edit any information directly.

3. User interface:

· Make sure that controls, text, and captions are clearly visible.

· Make sure that Alt+Tab works fine.

· Make sure that headings, sub-headings, and normal text are clearly identified by font size.

· Make sure that the 'catch word' is clearly identified by attribute and/or colour.

· Make sure that hypertext links are clearly identified by font colour as visited or unvisited.

· Make sure that the logo is clearly visible and looks fine.

· Make sure that the pop-up menu is context sensitive.

Testing Information

Environment:

Iteration #:

Start Date:

End Date:

Tester Name:

Status:

Sample Test Case for User Registration Page

Topic: Register Me

Functionality: Register

Reference(s): Nil

Data set should cover the normal, boundary and extreme cases of data for each field in the screen concerned.

1. The testing should be done for the following valid conditions:

1.1. Prepare 8 test data sets fulfilling at least the following criteria:

· with only mandatory fields

· with optional fields

· with maximum-length data in all fields

· with minimum-length data in all fields

· with at least 8 states selected from the combo with their respective zip codes: NJ (08817-20), NY (11100-01, 11104), AL (12200-10), CT (06030-34), OH (43803-04)

· for an address in the US

· with 'ship to' same as 'bill to'

· with 'ship to' different from 'bill to'

· at least 1 entry with all fields

· register with at least 6 different combinations

· moving to Home at least 2 times, making sure that the home page opens

· searching at least 2 different part IDs

2. The testing should be done for the following invalid conditions (a data-generation sketch follows the user-interface checklist below):

· Try to enter an invalid e-mail ID (without @ in the address)

· Try to exceed the maximum allowable value in all fields

· Try to go below the minimum allowable value in all fields

· Try to enter mandatory fields with zero length

· Try to enter an invalid time

· Try to edit the state field

· Try to enter values in 'ship to' fields when 'same as bill to' is selected

· Try to enter values for an address outside the US

· Try to search a non-existing/invalid part ID

3. Testing for the user interface issues of the screen should be done covering the following points:

· As soon as the screen is opened, make sure that the cursor is positioned at the first enterable field.

· Unrelated options should be disabled.

· Alt+Down should list the items from a combo box.

· All message boxes should have relevant and correct messages.

· Make sure that buttons are enabled/disabled or displayed according to the screen functionality.

· Make sure that controls, captions, text, etc. are clearly visible and look fine.

· Make sure that Alt+Tab works for switching between different open applications.

· Make sure that the pop-up menu is context sensitive.

· Cut, copy, and paste with shortcut keys (like Ctrl+C, Ctrl+V, etc.) work properly on every input screen as per Windows norms.

· Pasting text into a date field should not be allowed.

· The look and feel of all controls (text boxes, date boxes, etc.) should be normal.

· Check that a scroll bar appears when long text is entered in an editable control.

· Make sure that the screens can be invoked through all their available options.
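A minimal Python sketch of generating the boundary-length data sets and the single e-mail check called out above; the field names and length limits are hypothetical and should be taken from the specification document:

```python
import re

# Hypothetical (min_len, max_len) limits per registration field;
# substitute the real limits from the specification.
FIELDS = {"name": (1, 40), "address": (1, 60), "zip": (5, 10)}

def boundary_values(min_len, max_len):
    """Valid min/max lengths plus the invalid lengths just outside them."""
    return {
        "min": "a" * min_len,
        "max": "a" * max_len,
        "too_short": "a" * (min_len - 1),   # zero-length for mandatory fields
        "too_long": "a" * (max_len + 1),
    }

def looks_like_email(value):
    # Matches only the check named above: an address without an "@"
    # must be rejected. Real validation would be stricter.
    return re.fullmatch(r"[^@\s]+@[^@\s]+", value) is not None

data_sets = {field: boundary_values(lo, hi) for field, (lo, hi) in FIELDS.items()}
assert not looks_like_email("user.example.com")   # no "@": invalid
assert looks_like_email("user@example.com")
```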

Testing Information

Environment:

Iteration #:

Start Date:

End Date:

Tester Name:

Status:

Sample Test Case for Search Functionality of the Page

Topic: Search

Functionality: Search

Reference(s): Nil.

Data set should cover the normal, boundary and extreme cases of data for each field in the screen concerned.

1. The testing should be done for the following valid conditions at least once:

· Make sure that the search option opens the adequate page (the page containing the searched item) and that the cursor is positioned on the quantity field.

· Search at least 40 items covering each group.

· Try to search at least 20 items from different pages.

· Try to search at least 10 items without logging in (as a casual user).

2. The testing should be done for the following invalid conditions:

· Try to search a non-existing/invalid part_id.

· Try to search an empty part_id.
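A hedged Selenium sketch of the valid and invalid search conditions above; the form locators, the sample part IDs, and the "not found" message text are all assumptions for illustration:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE = "http://ABC_ntserver/Abc.com"   # local test site from the plan
driver = webdriver.Firefox()

def search_part(part_id):
    """Drive the search form from the home page; locator names are assumed."""
    driver.get(BASE)
    driver.find_element(By.NAME, "search").send_keys(part_id)
    driver.find_element(By.NAME, "go").click()
    return driver.page_source

try:
    # Valid: each searched part ID should appear on the opened page
    # (part IDs here are hypothetical placeholders).
    for part_id in ("CBL-001", "CBL-002"):
        assert part_id in search_part(part_id)
    # Invalid: non-existing and empty part IDs should report no match.
    for bad in ("NO-SUCH-PART", ""):
        assert "not found" in search_part(bad).lower()
finally:
    driver.quit()
```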

Testing Information

Environment:

Iteration #:

Start Date:

End Date:

Tester Name:

Status:

GLOSSARY:

The Tester – the test engineer who manually tests, or the Test Supervisor who starts the automatic test scripts.

The Test Bed – this comprises the hardware, software, test scripts, test plan, test cases, etc.

Module or Unit testing: It is the verification effort on the smallest unit of software design – the software component or module.

Integration testing: It is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing.

System testing: It is a series of different tests whose primary purpose is to fully exercise the computer-based system to verify that system elements have been properly integrated and perform allocated functions.

Regression testing: It is the re-execution of some subset of tests that have already been conducted to ensure that changes have not propagated unintended side effects.

Acceptance testing: This test is conducted to enable the customer to validate all requirements.

Black & white box testing: White box testing is a test-case design method that uses the control structure of the procedural design to derive test cases that:

§ Exercise all independent paths within a module at least once.

§ Exercise all logical decisions on their true and false sides

§ Execute all loops at their boundaries and within their operational bounds

§ Exercise internal data structures to ensure their validity.

Black box testing derives sets of input conditions that will fully exercise all functional requirements for a program.
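To make the white-box goals concrete, here is a toy Python function and a test set chosen per structural goal (both sides of the decision, the loop executed zero, one, and many times, and both outcomes of the final min) rather than per input permutation:

```python
def clamp_sum(values, limit):
    """Toy function used only to illustrate the white-box goals above."""
    total = 0
    for v in values:          # loop: exercise 0, 1, and many iterations
        if v < 0:             # decision: exercise true and false sides
            continue
        total += v
    return min(total, limit)

# One case per structural goal:
assert clamp_sum([], 10) == 0            # loop executes zero times
assert clamp_sum([5], 10) == 5           # loop once; v < 0 is false
assert clamp_sum([-1, 4, 4], 10) == 8    # many iterations; true branch taken
assert clamp_sum([6, 6], 10) == 10       # min() takes the limit side
```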

[1] The links mentioned above are for redirecting the reader to some pre-existing test documents. Since this is a sample test plan, it is beyond the scope of this document to include actual test documents; the links only illustrate that one can refer to an existing test document with a hyperlink.
