Using Integrated Testing Plans to Deliver Superior Software for Enterprises

Our clients’ customers expect their systems to be reliable, fast, and secure, and that’s exactly what we deliver. How do we do it? Exceptional staff, technology, architecture, and solid integrated testing plans are the key components of software development projects at QAT Global.

You’re probably familiar with the first three elements, but what about integrated testing? It matters because it allows us to manage risk and verify that the system meets the client’s requirements, including functional, design, performance, and implementation specifications. Testing is a continuous process within the software development lifecycle at QAT Global.

With each enterprise software development project, we produce an integrated test plan and the related test results for the client, conducting testing and resolving issues during development to ensure the system is ready before final implementation. Test results are communicated to stakeholders for ongoing system validation.

Integrated Testing Plans for Software Development

Wondering what’s involved in the integrated testing process? Here’s a general description of how it works:

First off, this process may be initiated when:

  • The integrated software components have been baselined.
  • The hardware is configured and complete.
  • The Build Instructions are complete and baselined.

Process Inputs

The following lists the artifacts and infrastructure that must be in place for use by one or more of the Process Activities, along with how each is used.

  • Requirements Traceability Matrix (RTM): Used for verification and validation, and to help resolve any problems found during testing.
  • Integrated Software Components: The software must be built as it will be in Production.
  • Hardware Components: The hardware must be configured as it will be in Production.
  • Design Documents: The documents used to develop the code.
  • Defect Report (DR) Log Template: The defect log captures any defects identified during the test phase and the name of the phase in which each defect was injected.

Process Description

Activity 1: Planning for Testing

The Project Manager plans the process activities using the inputs described above. The Project Manager also identifies the Technical Lead, identifies stakeholders for the process, ensures that test activities are planned, and ensures the inputs are available. The Technical Lead reports to the Project Manager. Involving the client in testing and integration, even as a participant on the testing team, gives the client deeper insight into the product and confirms that what is being built matches the requirements and is what the client wants before production.

Activity 2: Establish Test Environment

  1. The Technical Lead identifies requirements for the test environment.
  2. The Technical Lead determines whether to build test equipment, purchase test tools, develop simulators, use actual hardware, develop test drivers and stubs, and/or reuse existing test products.
  3. If simulators are needed, the Technical Lead plans and monitors simulation development.
  4. The Technical Lead identifies how much testing is required for tools, simulators, test drivers, etc. which are used to qualify the software.
  5. The Technical Lead maintains the test environment, as needed, and oversees the testing team.
  6. The Technical Lead works with the client to determine the client’s level of involvement in testing and integration.
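The test drivers and stubs mentioned in step 2 can be as simple as test doubles that stand in for hardware or external services that aren’t available in the test environment. A minimal sketch in Python (the `PaymentGateway` interface and `checkout` function are hypothetical examples, not part of any specific project):

```python
# Minimal sketch of a test stub and driver. PaymentGatewayStub and
# checkout are illustrative names, not a real project's API.

class PaymentGatewayStub:
    """Stub that mimics an external payment gateway during testing."""

    def __init__(self):
        self.charges = []  # record calls so the driver can assert on them

    def charge(self, account_id, amount_cents):
        # Simulate a deterministic success instead of hitting a live service.
        self.charges.append((account_id, amount_cents))
        return {"status": "approved", "account": account_id}


def checkout(gateway, account_id, amount_cents):
    """Code under test: depends only on the gateway's interface."""
    result = gateway.charge(account_id, amount_cents)
    return result["status"] == "approved"


# Driver: exercises the component through the stub.
stub = PaymentGatewayStub()
assert checkout(stub, "acct-42", 1999) is True
assert stub.charges == [("acct-42", 1999)]
print("stub-based test passed")
```

Because the stub records every call, the driver can verify both the outcome and the interaction, without any real hardware or network dependency.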

Activity 3: Develop Test Plan

This Test Plan may be part of the SDP or a standalone document. Testing may require its own WBS. The Technical Lead may establish and maintain a strategy for acceptance testing as part of the Test Plan. Acceptance testing is performed after system testing, usually by the client, in the client’s environment, to ensure that the product meets its intended use in its intended environment.

  1. The Test Plan and procedures are prepared by the testing team, independently of the designers and developers, and are developed from the Requirements and Design Documents.
  2. The testing team establishes and maintains a Test Plan.
  3. The plan:
     • identifies the products to be tested and the overall verification approach;
     • identifies test objectives;
     • identifies the regression testing approach;
     • identifies test cases / test threads;
     • defines responsibilities;
     • defines entry, exit, and acceptance criteria for each level of test (i.e., unit, component, system, acceptance);
     • identifies an integration strategy;
     • and defines acceptance testing activities.
  4. The Test Plan verifies that the product meets the requirements and validates that the product is suitable for use in the users’ environment.
  5. The Test Plan is approved by the Project Manager.
  6. The Test Plan is placed under CM control per the Configuration Management Process.
  7. The Technical Lead updates the Test Plan and procedures whenever the requirements, design, or implementation changes.
  8. The testing team uses the Requirements Traceability Matrix (RTM) to trace the requirements to products and test cases.
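Tracing requirements to test cases through the RTM also makes coverage gaps mechanical to find: any requirement with no linked test case is untested. A minimal sketch of that check, using hypothetical requirement and test-case IDs:

```python
# Hypothetical RTM fragment: each requirement maps to the test cases
# that verify it. IDs are illustrative.
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-201"],
    "REQ-003": [],  # no test case traced yet
}

# Any requirement with an empty test-case list is a coverage gap.
uncovered = [req for req, cases in rtm.items() if not cases]

for req in uncovered:
    print(f"WARNING: {req} has no test case")

assert uncovered == ["REQ-003"]
```

The same mapping works in reverse during defect resolution: when a test case fails, the RTM identifies which requirement is at risk.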

Activity 4: Develop Test Scripts

The testing team uses the RTM and other requirements and design documents to develop test scripts that thoroughly exercise all components, products, interfaces, and functions of the system. The test script set depends on the complexity of the system and may include:

  • Test Scripts;
  • Product verification test scripts for COTS products, if any;
  • Interface test scripts;
  • Hardware test scripts;
  • Sub-system test scripts, if part of the design;
  • End-to-end functional test scripts;
  • Scripts including intentional data errors to test error handling;
  • Scripts containing security breaches to test System Security;
  • Disaster recovery test scripts;
  • System back-out test scripts;
  • Interrupted power test scripts;
  • Scripts addressing all of the high risks identified for the system operations.
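A script with intentional data errors, as listed above, feeds the system input it must reject and fails the test if any bad value is accepted. A small sketch, where `parse_age` is a hypothetical stand-in for the code under test:

```python
# Sketch of an error-handling test script: feed intentionally bad data
# and verify each value is rejected. parse_age is a hypothetical example.

def parse_age(raw):
    """Code under test: must reject malformed or out-of-range input."""
    try:
        age = int(raw)
    except (TypeError, ValueError):
        raise ValueError(f"not a number: {raw!r}")
    if not 0 <= age <= 130:
        raise ValueError(f"out of range: {age}")
    return age

# Intentional data errors the script expects the system to reject.
bad_inputs = ["abc", "", None, "-1", "999"]

for raw in bad_inputs:
    try:
        parse_age(raw)
    except ValueError:
        pass  # expected: the error was handled
    else:
        raise AssertionError(f"bad input accepted: {raw!r}")

# A sanity check that valid input still passes.
assert parse_age("42") == 42
print("error-handling tests passed")
```

The same pattern extends to the security-breach scripts: replace the malformed values with injection payloads and assert they are refused rather than processed.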

Activity 5: Conduct Testing

  1. The Technical Lead ensures all hardware components are available and integrated (configured).
  2. The testing team executes the test scripts according to the Test Plan, documenting defects in a Defect Report (DR) Log. Problem tickets may be used in lieu of the DR Log; these tickets are tracked and maintained in the Problem Report Database. Problem Reports (PRs) are used when a separate tester is assigned to the project; the PR log is not used for developer testing. The problem tickets follow the steps below.
    1. For all defects, the tester completes the appropriate fields of the DR Log and submits it to the Technical Lead, who assigns the DR.
    2. The responsible party assigned to the DR corrects the problem as directed and completes the appropriate fields of the DR. If the Development team is the responsible party, all software defects are resolved, re-baselined in CM and the test re-started.
    3. If necessary, the solution is verified to solve the problem. For software development, this means executing the test cases that failed to verify the results. The Technical Lead and Project Manager approve the results and complete the DR Log.
    4. If it becomes apparent during the test that additional test scripts are necessary, they are documented in the Test Plan. The test team documents any deviations from the test procedure during test execution and forwards the documentation for these to the Project Manager for review. Serious deviation could cause the Project Manager to declare the test invalid.
    5. The test results document whether components meet their requirements, whether they meet the criteria defined in the integration strategy, and whether each component’s interfaces comply with its interface descriptions.
    6. The Technical Lead places the test results under CM control.
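The DR workflow above (tester opens the report, Technical Lead assigns it, the responsible party resolves it, and the fix is verified) can be sketched as a simple state machine. Field names and transitions here are illustrative, not a prescribed log format:

```python
# Minimal sketch of a Defect Report (DR) log following the workflow in
# steps 1-3 above. Field names are illustrative.

defects = []

def open_dr(summary, phase_injected):
    """Tester completes the appropriate fields and submits the DR."""
    dr = {"id": len(defects) + 1, "summary": summary,
          "phase_injected": phase_injected,
          "status": "open", "assignee": None}
    defects.append(dr)
    return dr

def assign(dr, assignee):
    """Technical Lead assigns the DR to a responsible party."""
    dr["assignee"] = assignee
    dr["status"] = "assigned"

def resolve(dr):
    """Responsible party corrects the problem as directed."""
    dr["status"] = "resolved"

def verify(dr):
    """Failed test cases are re-run; approvers complete the DR Log."""
    dr["status"] = "closed"

dr = open_dr("Login rejects valid password", phase_injected="Design")
assign(dr, "dev-team")
resolve(dr)
verify(dr)
assert dr["status"] == "closed"
print(f"DR-{dr['id']}: {dr['status']}")
```

Recording the phase in which each defect was injected, as the DR Log template requires, is what later lets the team analyze where defects originate.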

Outputs

The following is a common list of the work products which are produced by the execution of the Testing process activities:

  • Updated RTM
  • Test Scripts
  • Defect Reports
  • Hardware Configuration Specification
  • Software Configuration Specification

Exit Criteria

The system passes all required tests!

What do we get by testing software this way?

  • Reduced costs through less waste
  • Faster response to requirements changes and bug reports
  • Better application security
  • Happier clients!
Karie is the Creative Development and Marketing Director at QAT Global. She has over 20 years of proven success leading corporate marketing, communications, IT, and business strategy development. Karie is responsible for driving creative strategy and execution to develop and produce quality creative technology and marketing solutions that meet internal and external clients’ business objectives and goals.