Thursday, January 31, 2013

Unit Testing using Junit


As we move to distributed, multi-tiered, and heterogeneous computing, the Java™ 2 Platform, Enterprise Edition (J2EE™) technology has become the most popular choice for developing component-based, multi-tiered, distributed enterprise applications. The J2EE technology integrates application clients and applets, Web components (JSPs and servlets), and Enterprise JavaBeans™ (EJB™) components. Web and EJB components run on an application server, such as the IBM® WebSphere® Application Server software. As people use J2EE technology to develop large, complex enterprise applications, it is inevitable that different components are assembled cohesively to produce one integrated application.
Before this assembly takes place, it is necessary and critical to properly unit test each component independently. Unit testing each component in isolation significantly reduces bugs and helps to ensure high-quality software. Some developers or testers might argue that unit testing J2EE components is too time-consuming, labor intensive, and error prone. Although it is beyond the scope of this article to delve into the details of unit testing every kind of J2EE component, it will show you how easy and effective it is to unit test EJB components with the JUnit and JUnitEE test frameworks.
A unit test is a piece of code written by a developer that exercises a very small, specific area of functionality of the code being tested. Usually, a unit test exercises a particular method in a particular context; thus, it falls into the broader category of white box testing. Since the need to unit test J2EE components has become apparent, developers have been exploiting the benefits of the JUnit test framework to perform these tests. The JUnit test framework, originally written by Erich Gamma and Kent Beck, is a framework for unit testing client-side Java applications. It offers several benefits:
  • Simple framework for writing automated, self-verifying tests in Java
  • Support for test assertions
  • Test suite development
  • Immediate test reporting
JUnit provides a text-based command line, as well as AWT-based and Swing-based graphical test reporting mechanisms. The IBM Rational Application Developer integrated development environment (IDE) includes JUnit.
However, for developers who want to unit test within the application server's containers and display results in HTML or XML, the JUnit framework is limited and ineffective. That is why JUnitEE, an open source development community, has extended the JUnit capabilities by implementing the JUnitEE test framework that enables you to run JUnit unit tests within the application server's containers.
This series of two articles explains how you can use IBM tools and technologies to unit test Java applications, EJBs, and Web services by using the JUnit and JUnitEE test frameworks.
  • Part 1 (this article) starts with an overview of JUnit and JUnitEE, and then demonstrates the use of JUnit and JUnitEE within the Rational Application Developer IDE to unit test a simple Java application and a stateless session bean. First, we present a brief overview of the structure of JUnit and JUnitEE, followed by how to develop, configure, and set up unit tests in the JUnit and JUnitEE test frameworks. Finally, we demonstrate how to deploy and execute our unit tests in an application server environment.
  • Part 2 discusses unit testing Web services by using the JUnit and JUnitEE test frameworks within Rational Application Developer platform, as well as how to deploy and execute unit tests in the IBM WebSphere Application Server Version 6 environment.
To benefit from this article, you need a basic understanding of Web services and of Rational Application Developer. The IBM® developerWorks® Web site and the Resources at the end of this article provide introductory materials.
This tutorial was developed and tested using the final release of JUnitEE Version 1.10 and Rational Application Developer Version 6.0.2. You can download the trial version of Rational Application Developer and the binary files for this example (see Resources).
JUnit is now the unofficial standard for unit testing Java-based applications. Although the JUnit.org Web site provides more comprehensive information and tutorials (see Resources), this section gives you an overview of the JUnit test framework. The principal purpose of the set of APIs that comprise JUnit is to make writing Java unit test cases fast and easy. At minimum, a JUnit test case has the common structure shown in Listing 1.

Listing 1. JUnit common structure
 
1. import junit.framework.TestCase;
2. 
3. public class AddJavaTest extends TestCase {
4. 
5.     protected void setUp() throws Exception {
6.         // create the object under test
7.     }
8. 
9.     protected void tearDown() throws Exception {
10.        // release any resource that was
11.        // created in setUp()
12.    }
13. 
14.    public AddJavaTest(String name) {
15.        super(name);
16.    }
17.    public void testSimpleAddition() {
18.        assertTrue(expected == actual);
19.    }
}

As Listing 1 shows in line 3, all JUnit test cases must extend junit.framework.TestCase, which is the core class of JUnit. In line 5, TestCase.setUp() is overridden to initialize or instantiate the object under test. Conversely, in line 9, TestCase.tearDown() is overridden to release any allocated resources. In line 14, a test case must have a single-String-parameter constructor that passes the argument to its parent class (TestCase) for the purpose of displaying the test case name in a log.
A test method must be declared public void with no formal parameters. In addition, it is desirable to prefix the test method name with "test" so that the test runner will find and execute all test methods automatically. Lastly, in line 18 an assertion statement is issued to determine the success or failure of a test case. The assert methods compare an expected value to an actual value for the particular test scenario. You can use the fail() method to force a test case to fail, for example if you want to force a timeout of an operation. Table 1 shows a sampling of the different assert and fail method signatures that JUnit provides to determine the success or failure of a test case.

Table 1. Assert methods
static void assertEquals(boolean expected, boolean actual)
    Asserts that two booleans are equal.
static void assertFalse(boolean condition)
    Asserts that a condition is false.
static void assertNotNull(java.lang.Object object)
    Asserts that an object isn't null.
static void assertNotSame(java.lang.Object expected, java.lang.Object actual)
    Asserts that two objects do not refer to the same object.
static void assertNull(java.lang.Object object)
    Asserts that an object is null.
static void assertSame(java.lang.Object expected, java.lang.Object actual)
    Asserts that two objects refer to the same object.
static void assertTrue(boolean condition)
    Asserts that a condition is true.
static void fail(java.lang.String message)
    Fails a test with the given message.
static void failNotEquals(java.lang.String message, java.lang.Object expected, java.lang.Object actual)
    Fails a test, reporting that the expected and actual values differ.
private static void failNotSame(java.lang.String message, java.lang.Object expected, java.lang.Object actual)
    Fails a test, reporting that the two objects do not refer to the same object.
private static void failSame(java.lang.String message)
    Fails a test, reporting that the two objects refer to the same object.
(package private) static java.lang.String format(java.lang.String message, java.lang.Object expected, java.lang.Object actual)
    Formats a failure message from the given message, expected value, and actual value.

Note: The Assert class contains many different overloaded methods. For a complete list of all overloaded assert methods, see this page on JUnit.org: http://www.junit.org/junit/javadoc/3.8.1/.
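To see how the assert and fail methods interact, the sketch below illustrates the mechanism. This is not the real junit.framework.Assert source (the stand-in AssertionFailedError class is an assumption made to keep the sketch self-contained); it only shows that a failed assertion delegates to fail(), which throws an error that the test runner catches and records, so later tests still run.

```java
// Illustrative sketch only: mimics how a JUnit-style assert signals failure.
public class AssertSketch {

    // Stand-in for junit.framework.AssertionFailedError (assumed name here).
    public static class AssertionFailedError extends Error {
        public AssertionFailedError(String message) {
            super(message);
        }
    }

    // A passing assertion returns silently; a failing one delegates to fail().
    public static void assertTrue(String message, boolean condition) {
        if (!condition) {
            fail(message);
        }
    }

    // fail() throws, and the test runner catches the error and reports it.
    public static void fail(String message) {
        throw new AssertionFailedError(message);
    }
}
```

The key design point is that failures are reported by throwing, not by return codes, which is why one failing test case does not stop the rest of the suite.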
JUnit provides TestRunner classes to execute test cases, with results displayed as graphics or text. For the more popular graphical reports, use junit.swingui.TestRunner or junit.awtui.TestRunner; for text-based reports, use junit.textui.TestRunner instead.
Additionally, JUnit test cases may be run from the Rational Application Developer IDE or automated in an Ant build script.
JUnit also provides a way to group tests into a suite by using the junit.framework.TestSuite class. A test suite is a composite of related tests that you can run during the same session. There are two convenient ways to build a test suite:
  • With the first way, you pass the test class to the TestSuite constructor:
    TestSuite suite = new TestSuite(AddJavaTest.class);
    In this case, the TestRunner extracts all methods that have the test prefix, and then executes each test case automatically.
  • The alternative way is to add each test case by using the TestSuite.addTest method:
    TestSuite suite = new TestSuite();
    suite.addTest(new AddJavaTest("testSimpleAddition"));
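The automatic extraction of test methods in the first approach relies on reflection. The following self-contained sketch (a simplification, not JUnit's actual implementation; the inner AddJavaTest class is a hypothetical example) shows how a runner can discover every public, parameterless void method whose name starts with "test":

```java
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of JUnit-style reflection-based test discovery.
public class ReflectiveDiscoverySketch {

    // Hypothetical test class used only for this illustration.
    public static class AddJavaTest {
        public void testSimpleAddition() { }
        public void testDesignedToFail() { }
        public void helperMethod()       { } // not collected: no "test" prefix
    }

    // Collect every public void method with no parameters whose name
    // begins with "test" -- the same rule the TestRunner applies.
    public static List<String> discoverTests(Class<?> testClass) {
        List<String> names = new ArrayList<>();
        for (Method m : testClass.getMethods()) {
            if (m.getName().startsWith("test")
                    && m.getParameterCount() == 0
                    && m.getReturnType() == void.class) {
                names.add(m.getName());
            }
        }
        return names;
    }
}
```

Passing AddJavaTest.class to discoverTests collects testSimpleAddition and testDesignedToFail but skips helperMethod, which is essentially what new TestSuite(AddJavaTest.class) does before running each discovered method.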
JUnit provides an effective and easy way to unit test client-side Java applications, but it has some limitations; in particular, testing inside an application server container becomes a tedious process. The IBM Rational Application Developer platform features a Web-based Universal Test Client (UTC) that provides a seamless and integrated mechanism for unit testing Enterprise JavaBeans (EJBs). However, the Rational Application Developer UTC is an interactive unit-testing mechanism, so it falls short when it comes to automating unit tests.
The JUnitEE test framework addresses these limitations, as well as the tedious process. This framework extends the standard JUnit so that it can execute unit tests in an application server container. It is configured in the J2EE Web module of a unit test application, and it uses a TestRunner to output HTML or XML test results. It also includes a TestServlet as an entry point to the JUnit test cases. As a result, according to JUnitEE.org, building your test harness as a standard J2EE Web module offers several benefits:
  • Tests are packaged in a J2EE Web module (in a WAR file), which is easy to deploy and execute.
  • Test cases look just like production code, and they can use the same Java beans that you use as a facade for your EJBs.
  • Tests can be automated by using an Ant script.
This section presents an example of how you can take advantage of the JUnit and JUnitEE test frameworks to unit test a simple stateless session EJB that was developed in the Rational Application Developer IDE. The EJB simply adds two numbers and returns the sum of those numbers. Although this is a trivial function, it includes everything necessary to illustrate this example.
For simplicity, this example is a unit test of a stateless session bean. However, you can unit test other types of EJBs by using the JUnit and JUnitEE test frameworks, too. The simplified approach here includes these steps:
  1. Create a simple Java bean calculator.
  2. Create an EJB for a basic calculator.
  3. Generate an EJB client.
  4. Develop JUnit test cases to test both the Java application and the calculator EJB.
  5. Create and configure a JUnitEE test module.
  6. Deploy and execute test cases within the WebSphere Application Server environment.
You can download the source code for all examples in this article (see Resources).
The first example is a simple Java application that adds two numbers and returns their sum. Code Listing 2 shows the class implementation.

Listing 2. Implementation of the Basic Calculator Java Bean
 

package calc;

public class BasicCalculator {

   public double addTwoNumbers(double first, double second) {
      return first + second;
   }
}


Use Rational Application Developer to generate a stateless session bean called BasicCalculator.
Listing 3 contains a sample of what the EJB should look like.
The BasicCalculatorBean EJB class contains one business method, addTwoNumbers, for the calculator function. This business method is added to the remote interface to give clients access to it. EJBs can be remote, distributed objects, and the Rational Application Developer IDE generates client artifacts to access remote objects easily. Because the test application may be in a separate Enterprise Archive (EAR), we use remote rather than local interfaces to the EJB.

Listing 3. Implementation of the Basic Calculator EJB 
 

package ejbs;
/**
 * Bean implementation class for Enterprise Bean: BasicCalculator
 */
public class BasicCalculatorBean implements javax.ejb.SessionBean {
 private javax.ejb.SessionContext mySessionCtx;

 public javax.ejb.SessionContext getSessionContext() {
  return mySessionCtx;
 }
 public void setSessionContext(javax.ejb.SessionContext ctx) {
  mySessionCtx = ctx;
 }
 public void ejbCreate() throws javax.ejb.CreateException {
 }
 public void ejbActivate() {
 }
 public void ejbPassivate() {
 }
 public void ejbRemove() {
 }
 
 public double addTwoNumbers(double first, double second) {
  return first + second;
 }
 
}

In this step, follow the Rational Application Developer smart guides to generate the client artifacts.
After you have implemented the BasicCalculator Java bean, the BasicCalculator EJB, and the EJB client artifacts, you're ready to write JUnit test cases.
Unit testing a Java application with JUnit
Here, you take advantage of the JUnit test structure outlined in Listing 1 to write three test methods to unit test the BasicCalculator Java bean and EJB.
The unit tests in Listing 4 illustrate various assert methods that perform positive and negative unit testing. The test class has the following test methods:
  • The testSimpleAddition() method contains two assertions. The first one asserts that the calculator instance object is not null, i.e. it exists. The second one verifies that the calculator is correctly adding 2 plus 2 by comparing the result to the expected result, 4.
  • The testSimpleAdditionNotSame() method demonstrates negative testing to ensure that two distinct values are not the same. That is, the sum of two numbers (2+2) does not equal 5.
  • The testDesignedToFail() method demonstrates what a failed test case looks like in the JUnitEE framework. In this test case, two arguments, 1 and 1 are summed by the BasicCalculator, and then the result is compared to 3. This test fails. The output shows you exactly where and why it failed.

Listing 4. JUnit Test for Calc Java Bean
package calc.test;

import calc.BasicCalculator;
import junit.framework.TestCase;


public class BasicCalculatorTest extends TestCase {

 BasicCalculator aBasicCalculator = null;

 /*
  * Setup before each test case
  */
 protected void setUp() throws Exception {
  super.setUp();
  aBasicCalculator = new BasicCalculator();
 }
 public void testSimpleAdditionNotSame() throws Exception {

  double result = aBasicCalculator.addTwoNumbers(2, 2);
  assertNotSame("2+2 does not = 5", new Double(result), new Double(5));
 }

 public void testDesignedToFail() throws Exception {
  double result = aBasicCalculator.addTwoNumbers(1, 1);
  assertTrue("1 + 1 = 3", result == 3);

 }
 
 public void testSimpleAddition() throws Exception {

  assertTrue("Calculator instance is not null", aBasicCalculator != null);

  double result = aBasicCalculator.addTwoNumbers(2, 2);
  assertTrue("2+2=4", result == 4);

 }
}




Listing 5. JUnit source code to test Basic Calculator EJB
package calculator.test;

import java.rmi.RemoteException;
import com.ibm.etools.service.locator.ServiceLocatorManager;

import ejbs.BasicCalculator;
import ejbs.BasicCalculatorHome;
import junit.framework.TestCase;

public class BasicCalculatorTest extends TestCase {

    BasicCalculator aBasicCalculator = null;

       /*
        * Setup before each test case
        */
 protected void setUp() throws Exception {
  super.setUp();
  aBasicCalculator = createBasicCalculator();
 }

 protected void tearDown() throws Exception {
  aBasicCalculator = null;
 }

 public void testSimpleAddition() throws Exception {

  assertTrue("Calculator instance is not null", aBasicCalculator != null);

  double result = aBasicCalculator.addTwoNumbers(2, 2);
  assertTrue("2+2=4", result == 4);

 }

 public void testSimpleAdditionNotSame() throws Exception {

  double result = aBasicCalculator.addTwoNumbers(2, 2);
  assertNotSame("2+2 does not = 5", new Double(result), new Double(5));
 }

 public void testDesignedToFail() throws Exception {

  double result = aBasicCalculator.addTwoNumbers(1, 1);
  assertTrue("1 + 1 = 3", result == 3);

 }

 /*
  * Rational generated code snippet to access EJB
  */

 private BasicCalculator createBasicCalculator() {

  BasicCalculatorHome aBasicCalculatorHome = 
(BasicCalculatorHome) ServiceLocatorManager
    .getRemoteHome(STATIC_BasicCalculatorHome_REF_NAME,
      STATIC_BasicCalculatorHome_CLASS);
  try {
   if (aBasicCalculatorHome != null) {
    return aBasicCalculatorHome.create();
   }
  } catch (javax.ejb.CreateException ce) {
   ce.printStackTrace();
  } catch (RemoteException re) {
   re.printStackTrace();
  }
  return null;
 }

 private final static String STATIC_BasicCalculatorHome_REF_NAME = "ejb/BasicCalculator";

 private final static Class STATIC_BasicCalculatorHome_CLASS = BasicCalculatorHome.class;
}


Unit testing EJBs with JUnit
Because EJBs are managed by their containers, they must be looked up in a JNDI directory, so unit testing them can be a tedious and difficult process. Fortunately, Rational Application Developer provides code snippets that perform the task of creating an EJB instance for the developer.
The calculator method of the stateless session EJB is the same as the Java bean implementation. Therefore, you can reuse the underlying JUnit test cases for your EJB unit tests, but you must make a few changes in the setUp() method to locate and instantiate the EJB object. Recall that the purpose of the setUp() method is to initialize the object under test. In this method, a call to createBasicCalculator() is made, which returns an EJB instance of the calculator. The code in this method was generated by Rational Application Developer, and it is shown in Listing 6.

Listing 6. Locating and accessing the EJB home
private BasicCalculator createBasicCalculator() {

  BasicCalculatorHome aBasicCalculatorHome = 
(BasicCalculatorHome) ServiceLocatorManager
    .getRemoteHome(STATIC_BasicCalculatorHome_REF_NAME,
      STATIC_BasicCalculatorHome_CLASS);
  try {
   if (aBasicCalculatorHome != null) {
    return aBasicCalculatorHome.create();
   }
  } catch (javax.ejb.CreateException ce) {
   ce.printStackTrace();
  } catch (RemoteException re) {
   re.printStackTrace();
  }
  return null;
 }

 private final static String STATIC_BasicCalculatorHome_REF_NAME = "ejb/BasicCalculator";

 private final static Class STATIC_BasicCalculatorHome_CLASS = BasicCalculatorHome.class;
}

To execute your JUnit test cases within the WebSphere Application Server software, you can take advantage of the JUnitEE framework. As mentioned previously, JUnitEE is an extension of the JUnit framework that executes tests in an application server and uses a TestRunner interface to display test results in either HTML or XML format. This section shows how to use both test frameworks together to create a single deployable Web archive (WAR) file that contains your tests and a JUnitEE test servlet to run your Java and EJB unit test cases.
Creating and configuring a JUnitEE test module in Rational Application Developer software is an intuitive process.
  1. First, create a dynamic Web project in Rational Application Developer, named BasicAddJUnitEEWeb.
  2. Add the junit.jar and junitee.jar files to the project classpath by placing them in the WEB-INF/lib directory of the Web project.
  3. Create a JAR file, named BasicAddUnitTest.jar, that contains the two test case classes, and copy it into the WEB-INF/lib directory.
  4. Include the XML code shown in Listing 7 in the web.xml deployment descriptor.

Listing 7. JUnitEE test servlet mapping in the web.xml Web deployment descriptor
1.  <servlet>
2.    <servlet-name>JUnitEEServlet</servlet-name>
3.    <display-name>JUnitEEServlet</display-name>
4.    <servlet-class>org.junitee.servlet.JUnitEEServlet
5.    </servlet-class>
6.    <init-param>
7.      <param-name>searchResources</param-name>
8.      <param-value>BasicAddUnitTest.jar</param-value>
9.    </init-param>
10. </servlet>
11. <servlet-mapping>
12.   <servlet-name>JUnitEEServlet</servlet-name>
13.   <url-pattern>/TestServlet/*</url-pattern>
14. </servlet-mapping>

JUnitEE test cases can be executed in either smart or basic mode within the application server environment. In smart mode, the JUnitEE TestRunner automatically locates all test classes whose names end with "Test" or "Tests" and executes them. Smart mode is configured by specifying the name of the JAR file in the deployment descriptor, as outlined in Listing 7, lines 6-8. If there is more than one JAR file, delimit the names with commas. During servlet initialization, the container activates the servlet by calling its init method once, which initializes a set of parameters for the servlet instance. The parameter name searchResources is declared in line 7, and in line 8 the value BasicAddUnitTest.jar is assigned to this parameter. As a result, whenever the container activates the JUnitEEServlet, JUnitEE looks for all test classes in the BasicAddUnitTest.jar file.
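The naming rule behind smart mode can be sketched in a few lines. The filter below is an illustration of the convention only, not JUnitEE's actual implementation: given the class entries of a test JAR, it keeps the ones whose names end with Test or Tests and converts each entry path into a fully qualified class name.

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of the smart-mode naming convention (not JUnitEE's real code).
public class SmartModeFilterSketch {

    // Keep JAR entries for classes ending in "Test" or "Tests", and turn
    // the entry path (calc/test/FooTest.class) into a class name
    // (calc.test.FooTest).
    public static List<String> selectTestClasses(List<String> jarEntries) {
        List<String> testClasses = new ArrayList<>();
        for (String entry : jarEntries) {
            if (entry.endsWith("Test.class") || entry.endsWith("Tests.class")) {
                String className = entry
                        .substring(0, entry.length() - ".class".length())
                        .replace('/', '.');
                testClasses.add(className);
            }
        }
        return testClasses;
    }
}
```

For a JAR containing calc/test/BasicCalculatorTest.class and calc/BasicCalculator.class, only calc.test.BasicCalculatorTest is selected, which matches the behavior described above for BasicAddUnitTest.jar.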
An alternative to smart mode is basic mode. In basic mode, the TestRunner executes only the test classes that are listed in the WEB-INF/testCase.txt file, a simple text file that contains one JUnit test class name per line, as Listing 8 shows.

Listing 8. Text file with JUnit test package and class names
                
com.ibm.junitee.sample.operation.test.BasicAddJavaTest
com.ibm.junitee.sample.operation.test.BasicAddEJBTest

In the final step, before deploying, make sure that the build path and all JAR dependencies are in place. In addition, make sure that the EAR file contains all of the necessary utility JAR files, as well as the JUnitEE Web module, packaged in a WAR file. When this is complete, proceed to the next section to deploy your tests in the WebSphere Application Server environment.
Thus far, you have completed all of the groundwork, and you have an EAR file that contains your configured and packaged JUnitEE test module (as a WAR file).
After you have deployed the EAR file on the application server, launch your browser and point it to the JUnitEE servlet URL. The URL format is http://<hostname>:<portnumber>/BasicAddJUnitEEWeb/TestServlet, where BasicAddJUnitEEWeb is the Web context root (the name of the dynamic Web project) and TestServlet is the JUnitEE TestRunner servlet. This example uses http://localhost:9080/BasicAddJUnitEEWeb/TestServlet. Figure 1 shows a conceptual view of how information is exchanged between the J2EE components.
Figure 1. Illustration of component interaction
A successful run displays all of the tests in the JUnitEE TestRunner view, as depicted in Figure 2. The screen shows different ways to execute your unit tests. Select the test option that you prefer, or enter the name of your test suite in the field provided, to start running test cases.

Figure 2. JUnitEE TestRunner screen
(Screen capture of the JUnitEE test results for a failed test)
The examples in this article demonstrated how you can quickly and easily develop, configure, and execute unit tests of Java and EJB applications by using IBM technologies and the open source JUnit and JUnitEE test frameworks to deliver accurate, higher-quality J2EE applications.

SW Design Specification


Software Design Specification


1.       Introduction

This section provides an overview of this entire document.

1.1.     Project Overview
Describe the client, the problem to be solved, and the intended users.  Explain the context in which your software will be used, i.e. the big picture.

1.2.     Project Scope
Mention the most important features of the system, inputs, data stores, and outputs.  Do not discuss implementation details.  Note any major constraints.

1.3.     Document Preview
Describe the purpose, scope of this document, and intended audience of this document.  Mention the major sections that follow.  Provide references to companion documents.


2.       Architectural Design

This section provides an overview and rationale for the program's data and architectural design decisions.

2.1.     Section Overview
Provide a summary of the contents of this section. 

2.2.     General Constraints
Describe global limitations or constraints that have a significant impact on your system design.  Examples include hardware and software environments, interface requirements, external data representations, performance requirements, network requirements, etc. 

2.3.     Data Design
Describe the structure of any databases, external files, and internal data structures.  You may wish to include references to appendices containing ER diagrams, data, or file formats.

2.4.     Program Structure
Describe the architectural model chosen and the major components.  Include a pictorial representation (or reference to an appendix block or class diagram) of the major components.

2.5.     Alternatives Considered
Discuss the alternative architectural models considered and justify your choice for your architectural design.


3.       Detailed Design

This section represents the meat of your document.  Be as detailed as time allows.

3.1.     Section Overview
Provide a summary of the contents of this section

3.2.     Component n Detail (include a sub-section for each component)
A structured description usually works.  For example, if your components are classes you may wish to include the following subsections
3.2.1.  Description
3.2.2.  Data Members (include type, visibility, and description)
3.2.3.  Methods  (include English or pseudocode descriptions for each one)



4.       User Interface Design

4.1.     Section Overview
Provide a summary of the contents of this section.

4.2.     Interface Design Rules
Describe and justify the conventions and standards used to design your interface.  You may be able to re-use some of the material prepared for CS 480 documents in this section.

4.3.     GUI Components
Note the GUI components or API's provided in the development environment that you plan on using.

4.4.     Detailed Description
Provide a detailed description of the user interface including screen images.  You may prefer to reference an appendix containing the screen snapshots.


5.       Conclusion
Provide an ending to this document with a mention of implementation and testing strategies resulting from this design


6.       Appendices (a list of possibilities)
6.1.     Database Entity-Relationship Diagram
6.2.     Architectural Design Block Diagram(s)
6.3.     Class Diagram(s)
6.4.     Class Sequence Diagram(s)
6.5.     User Interface Screen Snaps

Wednesday, January 30, 2013

Test Plan Format


TEST PLAN FORMAT



1.         TEST PLAN.

1.1       Introduction. This section should summarize the software items and software features to be tested. A justification of the need for testing may be included.

1.2       Test items. This section should identify the test items. References to other software documents should be supplied to provide information about what the test items are supposed to do, how they work, and how they are operated. Test items should be grouped according to release number when delivery is incremental.

1.3       Features to be tested. This section should identify all the features and combinations of features that are to be tested. This may be done by referencing sections of requirements or design documents. References should be precise yet economical, for example:
·                     'the acceptance tests will cover all requirements in the User Requirements Document except those identified in Section 1.4';

·                     'the unit tests will cover all modules specified in the Detailed Design Document except those modules listed in Section 1.4'.

1.4       Features not to be tested. This section should identify all the features and significant combinations of features that are not to be tested, and why.

1.5       Approach. This section should specify the major activities, methods (e.g. structured testing) and tools that are to be used to test the designated groups of features. Activities should be described in sufficient detail to allow identification of the major testing tasks and estimation of the resources and time needed for the tests. The coverage required should be specified.

1.6       Item pass/fail criteria. This section should specify the criteria to be used to decide whether each test item has passed or failed testing.

1.7       Suspension criteria and resumption requirements. This section should specify the criteria used to suspend all, or a part of, the testing activities on the test items associated with the plan. This section should specify the testing activities that must be repeated when testing is resumed.

1.8       Test deliverables. This section should identify the items that must be delivered before testing begins and should identify the items that must be delivered when testing is finished.

            1.8.1.  Items to be delivered before testing.

·         test plan;
·         test designs;
·         test cases;
·         test procedures;
·         test input data;
·         test tools.

1.8.2.  Items to be delivered after testing.

·         test reports;
·         test output data;
·         problem reports.

1.9       Testing tasks. This section should identify the set of tasks necessary to prepare for and perform testing. This section should identify all inter-task dependencies and any special skills required. Testing tasks should be grouped according to release number when delivery is incremental.

1.10    Environmental needs. This section should specify both the necessary and desired properties of the test environment, including:

·         physical characteristics of the facilities including hardware;
·         communications software;
·         system software;
·         mode of use (i.e. standalone, networked);
·         security;
·         test tools.

1.11    Responsibilities. This section should identify the groups responsible for managing, designing, preparing, executing, witnessing, and checking tests. Groups may include developers, operations staff, user representatives, technical support staff, data administration staff, independent verification and validation personnel and quality assurance staff.

1.12    Staffing and training needs. This section should specify staffing needs according to skill. Identify training options for providing necessary skills.

1.13    Schedule. This section should include test milestones identified in the software project schedule and all item delivery events, such as programmer delivers unit for integration testing, developers deliver system for independent verification. This section should specify:

·         any additional test milestones and state the time required for each testing task;
·         the schedule for each testing task and test milestone;
·         the period of use for all test resources (e.g. facilities, tools, staff).

1.14    Risks and contingencies. This section should identify the high-risk assumptions of the test plan. It should specify contingency plans for each.

1.15    Approvals. This section should specify the names and titles of all persons who must approve this plan. Alternatively, approvals may be shown on the title page of the plan.


2.         TEST DESIGNS. (For each plan, there can be 1 to n test designs.)

2.n.1. Test Design identifier. The title of this section should specify the test design uniquely. The content of this section should briefly describe the test design.

2.n.2.  Features to be tested. This section should identify the test items and describe the features, and combinations of features, that are to be tested. For each feature or feature combination, a reference to its associated requirements in the item requirement specification (URD, SRD) or design description (ADD, DDD) should be included.

2.n.3. Approach refinements. This section should describe the results of applying the methods described in the approach section of the test plan. It should provide the rationale for test-case selection and for the packaging of test cases into procedures, identify the method for analyzing test results (e.g. compare with expected output, compare with old results, proof of consistency etc), and identify the tools required to support testing. Specifically it may define the:

·         module assembly sequence (for unit testing);
·         paths through the module logic (for unit testing);
·         component integration sequence (for integration testing);
·         paths through the control flow (for integration testing);
·         types of test (e.g. white-box, black-box, performance, stress etc).
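
For unit testing, "paths through the module logic" simply means choosing one test case per branch of the code under test. The sketch below illustrates this with a hypothetical `PriceCalculator` module (the class, method, and values are invented for illustration, not taken from any standard):

```java
// Hypothetical module under unit test: a discount rule with two logic paths.
class PriceCalculator {
    // Path A: totals of 100.0 or more get a 10% discount; path B: no discount.
    static double applyDiscount(double total) {
        if (total >= 100.0) {
            return total * 0.9;
        }
        return total;
    }
}

public class PathCoverageSketch {
    public static void main(String[] args) {
        // One test case per path through the module logic (white-box selection).
        assertClose(90.0, PriceCalculator.applyDiscount(100.0)); // path A
        assertClose(50.0, PriceCalculator.applyDiscount(50.0));  // path B
        System.out.println("both paths exercised");
    }

    // Minimal stand-in for JUnit's Assert.assertEquals(double, double, delta).
    static void assertClose(double expected, double actual) {
        if (Math.abs(expected - actual) > 1e-9) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }
}
```

In a real project each path would be a separate JUnit test method; the plain `main` and home-grown assertion here just keep the sketch self-contained.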

2.n.4.  Test case identification. This section should list the test cases associated with the design and give a brief description of each.

2.n.5.  Feature pass/fail criteria.  This section should specify the criteria to be used to decide whether the feature or feature combination has passed or failed.
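
One common criterion, expressed executably here as a sketch (not mandated by the template), is that a feature passes only if every associated test case passes. The class name and test-case bodies below are hypothetical:

```java
import java.util.List;
import java.util.function.BooleanSupplier;

// A strict feature pass/fail criterion: pass only if all test cases pass.
public class FeatureCriterion {
    static boolean featurePasses(List<BooleanSupplier> testCases) {
        return testCases.stream().allMatch(BooleanSupplier::getAsBoolean);
    }

    public static void main(String[] args) {
        // Two hypothetical test cases associated with one feature.
        boolean pass = featurePasses(List.of(
                () -> "abc".length() == 3,            // TC-01
                () -> Integer.parseInt("42") == 42)); // TC-02
        System.out.println(pass ? "FEATURE PASS" : "FEATURE FAIL");
    }
}
```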


3.         TEST CASE SPECIFICATION.

3.n.1   Test Case identifier. The title of this section should specify the test case uniquely. The content of this section should briefly describe the test case.

3.n.2   Test items. This section should identify the test items. References to other software documents should be supplied to help understand the purpose of the test items, how they work and how they are operated.

3.n.3   Input specifications. This section should specify the inputs required to execute the test case. File names, parameter values and user responses are possible types of input specification. This section should not duplicate information held elsewhere (e.g. in test data files).

3.n.4   Output specifications. This section should specify the outputs expected from executing the test case relevant to deciding upon pass or failure. File names and system messages are possible types of output specification. This section should not duplicate information held elsewhere (e.g. in log files).
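
Keeping the input and output specifications side by side makes the pass/fail decision mechanical: compare actual output against the table. A minimal sketch, with a made-up unit under test (`square`) standing in for the real test item:

```java
// Input and expected-output pairs for one test case, kept together so the
// pass/fail decision is a mechanical comparison against the table.
public class InputOutputSpec {
    // Hypothetical unit under test (illustration only).
    static int square(int x) { return x * x; }

    public static void main(String[] args) {
        int[][] table = { {0, 0}, {3, 9}, {-4, 16} }; // {input, expected output}
        for (int[] row : table) {
            int actual = square(row[0]);
            if (actual != row[1]) {
                throw new AssertionError(
                    "input " + row[0] + ": expected " + row[1] + ", got " + actual);
            }
        }
        System.out.println("all input/output pairs matched");
    }
}
```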

3.n.5   Environmental needs.

3.n.5.1            Hardware. This section should specify the characteristics and configurations of the hardware required to execute this test case.

3.n.5.2 Software. This section should specify the system and application software required to execute this test case.

3.n.5.3            Other. This section should specify any other requirements such as special equipment or specially trained personnel.

3.n.6   Special procedural requirements. This section should describe any special constraints on the test procedures that execute this test case.

3.n.7   Inter-case dependencies. This section should list the identifiers of test cases that must be executed before this test case. The nature of the dependencies should be summarised.
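
One way to honour such dependencies in an automated harness is to run cases in a fixed order and skip (and fail) any case whose prerequisite failed. The sketch below uses invented case names and trivial stand-in bodies:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Inter-case dependencies made explicit: cases run in insertion order, and a
// dependent case is not executed when its prerequisite failed.
public class DependentCases {
    static Map<String, Boolean> runAll() {
        Map<String, Boolean> results = new LinkedHashMap<>();
        results.put("TC-SETUP", runSetup());
        // TC-QUERY depends on TC-SETUP: record a failure without executing it
        // if the prerequisite did not pass.
        results.put("TC-QUERY", results.get("TC-SETUP") ? runQuery() : false);
        return results;
    }

    static boolean runSetup() { return true; }  // stand-in for a real test case
    static boolean runQuery() { return true; }  // stand-in for a real test case

    public static void main(String[] args) {
        System.out.println(runAll());
    }
}
```

Note that JUnit itself deliberately runs tests independently; where dependencies are unavoidable, they should be summarised in this section and enforced by the harness as above.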



4.         TEST PROCEDURES.

4.n.1.  Test Procedure identifier. The title of this section should specify the test procedure uniquely. This section should reference the related test design.

4.n.2. Purpose. This section should describe the purpose of this procedure. A reference for each test case the test procedure uses should be given.

4.n.3.  Special requirements. This section should identify any special requirements for the execution of this procedure.

4.n.4.  Procedure steps. This section should include the steps described in the subsections below as applicable.

4.n.4.1            Log. This section should describe any special methods or formats for logging the results of test execution, the incidents observed, and any other events pertinent to the test.

4.n.4.2            Set up. This section should describe the sequence of actions necessary to prepare for execution of the procedure.

4.n.4.3            Start. This section should describe the actions necessary to begin execution of the procedure.

4.n.4.4            Proceed. This section should describe the actions necessary during the execution of the procedure.

4.n.4.5            Measure. This section should describe how the test measurements will be made.

4.n.4.6            Shut down. This section should describe the actions necessary to suspend testing when interruption is forced by unscheduled events.

4.n.4.7            Restart. This section should identify any procedural restart points and describe the actions necessary to restart the procedure at each of these points.

4.n.4.8            Stop. This section should describe the actions necessary to bring execution to an orderly halt.

4.n.4.9            Wrap up. This section should describe the actions necessary to terminate testing.

4.n.4.10 Contingencies. This section should describe the actions necessary to deal with anomalous events that may occur during execution.
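
The procedure steps above map naturally onto a try/catch/finally skeleton: set-up precedes the try, the measured run sits inside it, contingencies land in the catch, and wrap-up always executes in the finally. A sketch, with hypothetical step bodies (the comments cite the subsections each line corresponds to):

```java
// Skeleton of a test procedure run, keyed to the subsections above.
public class ProcedureSkeleton {
    static long runProcedure() {
        setUp();                                  // 4.n.4.2 Set up
        long start = System.nanoTime();           // 4.n.4.5 Measure (timing)
        try {
            execute();                            // 4.n.4.3 Start / 4.n.4.4 Proceed
        } catch (RuntimeException anomaly) {
            log("contingency: " + anomaly);       // 4.n.4.10 Contingencies
        } finally {
            wrapUp();                             // 4.n.4.9 Wrap up
        }
        return System.nanoTime() - start;         // elapsed time for the log
    }

    static void setUp()   { log("environment prepared"); }
    static void execute() { log("procedure executed"); }
    static void wrapUp()  { log("resources released"); }
    static void log(String msg) { System.out.println(msg); } // 4.n.4.1 Log

    public static void main(String[] args) {
        log("elapsed ns: " + runProcedure());
    }
}
```

In JUnit the same shape is provided by the framework: set-up and wrap-up become @Before/@After (or @BeforeEach/@AfterEach) methods rather than hand-written calls.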



5.         TEST REPORTS.

5.n.1   Test Report identifier. The title of this section should specify the test report uniquely.

5.n.2   Description. This section should identify the items being tested including their version numbers. The attributes of the environment in which testing was conducted should be identified.

5.n.3   Activity and event entries. This section should record the start and end time of each activity or event and identify the author. One or more of the descriptions in the following subsections should be included.

5.n.3.1            Execution description. This section should identify the test procedure being executed and supply a reference to its specification. The people who witnessed each event should be identified.

5.n.3.2            Procedure results. For each execution, this section should record the visually observable results (e.g. error messages generated, aborts and requests for operator action). The location of any output, and the result of the test, should be recorded.

5.n.3.3            Environmental information. This section should record any environmental conditions specific for this entry, particularly deviations from the normal.
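
As a closing illustration, the activity-and-event entries of 5.n.3 can be held in a small record type: each entry carries the activity identifier, start and end times, the author, and the observed result. All names and values below are hypothetical:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// A minimal activity-and-event log for a test report (per 5.n.3).
public class TestReportLog {
    record Entry(String activity, Instant start, Instant end,
                 String author, String result) {}

    // Render one entry in a compact, human-readable form.
    static String format(Entry e) {
        return e.activity() + " " + e.result() + " (" + e.author() + ")";
    }

    public static void main(String[] args) {
        List<Entry> entries = new ArrayList<>();
        Instant t0 = Instant.now();
        String result = (2 + 2 == 4) ? "PASS" : "FAIL";  // stand-in execution
        entries.add(new Entry("TC-ADD-01", t0, Instant.now(), "j.doe", result));
        entries.forEach(e -> System.out.println(format(e)));
    }
}
```

In practice a JUnit RunListener (or a JUnit 5 TestExecutionListener) can populate such entries automatically as tests execute, rather than recording them by hand.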