Sunday, September 20, 2009

Testing Models

Over the past 20 years, various models for software development and testing have been presented in the field of Software Engineering. Let us discuss a few of the well-known models.

The following models are addressed:
Waterfall Model
Spiral Model
'V' Model
'W' Model
Butterfly Model

The Waterfall Model
This is one of the earliest models of software development, commonly attributed to Winston W. Royce. The Waterfall model is a step-by-step method of carrying out tasks: one can move on to the next phase of development only after completing the current phase, and one can go back only to the immediate previous phase.
In the Waterfall Model, each phase of development is followed by verification and validation activities. A phase is completed with its testing activities, and only then does the team proceed to the next phase. At any point in time, the team can step back only to the immediate previous phase; for example, one cannot move from the Testing phase back to the Design phase.
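To make the movement rule concrete, here is a minimal sketch in Python. The phase names are illustrative assumptions, not prescribed by the model itself:

```python
# Hypothetical illustration of the Waterfall movement rule: from any phase
# you may advance to the next phase, or step back to the immediate
# previous phase only.

PHASES = ["Requirements", "Design", "Implementation", "Testing", "Maintenance"]

def allowed_moves(phase: str) -> list[str]:
    """Return the phases reachable from the given phase."""
    i = PHASES.index(phase)
    moves = []
    if i > 0:
        moves.append(PHASES[i - 1])   # one step back is allowed
    if i < len(PHASES) - 1:
        moves.append(PHASES[i + 1])   # completing a phase unlocks the next
    return moves

print(allowed_moves("Testing"))
# ['Implementation', 'Maintenance'] -- Design is not reachable from Testing
```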

Spiral Model
In the Spiral Model, software development is viewed as a cyclical, prototyping-driven process. Tests are explicitly mentioned (risk analysis, validation of the requirements and of the development), and the test phase is divided into stages; the test activities include module, integration, and acceptance tests. However, in this model too, testing follows coding, with the exception that the test plan is to be constructed after the design of the system. The Spiral Model also identifies no activities associated with the removal of defects.

'V' Model

Many of the process models currently in use can be connected, in a general way, by the 'V' model, where the 'V' describes the graphical arrangement of the individual phases. The 'V' also stands for Verification and Validation.
By ordering the activities in time sequence and by abstraction level, the connection between development and test activities becomes clear. Activities lying opposite one another complement each other, i.e., the development activities serve as the basis for the test activities. For example, the system test is carried out on the basis of the results of the specification phase.
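The pairing of development phases with their opposite test levels can be sketched as a simple lookup. The phase and test-level names below are illustrative assumptions, since the 'V' model itself fixes only the principle of opposite-lying activities:

```python
# Illustrative V-model pairing: each left-side development phase serves
# as the basis for the test level lying opposite it on the right side.
V_MODEL_PAIRS = {
    "Requirements specification": "Acceptance test",
    "System specification":      "System test",
    "Architectural design":      "Integration test",
    "Detailed design":           "Unit test",
}

for basis, test_level in V_MODEL_PAIRS.items():
    print(f"{test_level} is designed against the {basis}")
```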

The 'W' Model
From the testing point of view, all of these models are deficient in various ways:
Test activities start only after implementation, so the connection between the various test stages and the basis for each test is not clear.
The tight link between the test, debug, and change tasks during the test phase is not made visible.

Why 'W' Model?
In the models presented above, testing usually appears as an unattractive task to be carried out after coding. In order to place testing on an equal footing, a second 'V' dedicated to testing is integrated into the model. Both 'V's put together give the 'W' of the 'W-Model'.

Butterfly Model of Test Development
Butterflies are composed of three pieces – two wings and a body. Each part represents a piece of software testing, as described hereafter.
Test Analysis
The left wing of the butterfly represents test analysis – the investigation, quantification, and/or re-expression of a facet of the software to be tested. Analysis is both the byproduct and the foundation of successful test design. In its earliest form, analysis is the thorough pre-examination of design and test artifacts to ensure adequate testability, including checking for ambiguities, inconsistencies, and omissions.
Test analysis must be distinguished from software design analysis. Software design analysis comprises the efforts to define the problem to be solved, break it down into manageable and cohesive chunks, create software that fulfills the needs of each chunk, and finally integrate the various software components into an overall program that solves the original problem. Test analysis, on the other hand, is concerned with validating the outputs of each software development stage or micro-iteration, as well as verifying compliance of those outputs with the (separately validated) products of previous stages.
Test analysis mechanisms vary according to the design artifact being examined. For an aerospace software requirement specification, the test engineer would do all of the following, as a minimum:
Verify that each requirement is tagged in a manner that allows correlation of the tests for that requirement to the requirement itself. (Establish Test Traceability – a sketch of such a check appears after this list.)
Verify traceability of the software requirements to system requirements.
Inspect for contradictory requirements.
Inspect for ambiguous requirements.
Inspect for missing requirements.
Check to make sure that each requirement, as well as the specification as a whole, is understandable.
Identify one or more measurement, demonstration, or analysis methods that may be used to verify the requirement’s implementation (during formal testing).
Create a test “sketch” that includes the tentative approach and indicates the test’s objectives.
Out of the items listed above, only the last two are specifically aimed at the act of creating test cases. The other items are almost mechanical in nature, in that the test design engineer is simply checking the software engineer’s work. But all of the items are germane to test analysis, since any error can manifest itself as a bug in the implemented application.
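As a minimal sketch of the traceability check in the first item, assuming hypothetical requirement tags (SRS-001, SRS-002) and test-case identifiers (TC-100, TC-101), none of which come from the model itself, the correlation of tests to tagged requirements might look like this:

```python
# Hypothetical traceability check: every tagged requirement should be
# correlated with at least one test, and every test should reference
# a known requirement tag.

requirements = {
    "SRS-001": "The system shall log all operator commands.",
    "SRS-002": "The system shall respond to an abort within 50 ms.",
}

tests = {
    "TC-100": ["SRS-001"],   # test case -> requirement tags it verifies
    "TC-101": [],            # a test with no traced requirement
}

covered = {tag for tags in tests.values() for tag in tags}

untested = sorted(set(requirements) - covered)
untraced = sorted(t for t, tags in tests.items() if not tags)

print("Requirements lacking tests:", untested)       # ['SRS-002']
print("Tests lacking a requirement tag:", untraced)  # ['TC-101']
```

The same requirement-to-test mapping, once complete, is what a test traceability matrix records.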
Test analysis also serves a valid and valuable purpose within the context of software development. By digesting and restating the contents of a design artifact (whether it be requirements or design), test analysis offers a second look – from another viewpoint – at the developer’s work. This is particularly true with regard to lower-level design artifacts like detailed design and source code. This kind of feedback has a counterpart in human conversation: to verify one’s understanding of another person’s statement, it is useful to rephrase it using the phrase “So, what you’re saying is…”. This powerful method of confirming comprehension and eliminating miscommunication is just as important in software development – it helps to weed out misconceptions on the part of both the developer and the tester, and in the process identifies potential problems in the software itself.
It should be clear from the above discussion that the tester’s analysis is both formal and informal. Formal analysis becomes the basis for documentary artifacts of the test side of the V. Informal analysis is used for immediate feedback to the designer in order to both verify that the artifact captures the intent of the designer and give the tester a starting point for understanding the software to be tested.
In the bulleted list shown above, the first two analyses are formal in nature (for an aerospace application). The verification of system requirement tags is a necessary step in the creation of a test traceability matrix. The software to system requirements traceability matrix similarly depends on the second analysis.
The three inspection analyses listed are more informal, aimed at ensuring that the specification being examined is of sufficient quality to drive the development of a quality implementation. The difference lies in how the analytical outputs are used, not in the level of effort or attention that goes into the analysis.
