Tuesday, June 9, 2015

Review – The Static Testing Technique


This article is about reviews and their importance. We often skip test case reviews because of time constraints. A tester's duty is to identify defects in the software under development; what happens if the testers misunderstand the requirements? They will raise the wrong defects, which results in a loss of valuable development time. A review helps us avoid this tricky situation, so we should build the review into our schedule as a key component of project success. The review is not limited to test cases; it spans everything from requirements to test cases, i.e., understanding and analysis of the business requirements, test scenarios, traceability, and the completeness and correctness of the test cases.
Review is a static testing technique, i.e., testing without executing the code. It can be done efficiently with a combination of human effort and tools.

Example: We can use Microsoft Excel/Word to catch spelling and grammatical errors, and manually review the scenarios covered and whether test design techniques such as BVA (Boundary Value Analysis) and ECP (Equivalence Class Partitioning) were used.
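
To make this concrete, here is a minimal Python sketch of how BVA and ECP candidate values could be derived for a numeric field; the field's valid range of 1 to 100 is an assumption for illustration only.

```python
# Sketch: deriving test inputs with Boundary Value Analysis (BVA) and
# Equivalence Class Partitioning (ECP) for a hypothetical field that
# accepts integers from 1 to 100 (the range is an assumed example).

def bva_values(lo: int, hi: int) -> list[int]:
    """Boundary values: just below, on, and just above each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def ecp_values(lo: int, hi: int) -> dict[str, int]:
    """One representative per equivalence class: below, inside, above."""
    return {
        "invalid_below": lo - 10,
        "valid": (lo + hi) // 2,
        "invalid_above": hi + 10,
    }

if __name__ == "__main__":
    print("BVA:", bva_values(1, 100))   # [0, 1, 2, 99, 100, 101]
    print("ECP:", ecp_values(1, 100))   # one value per class
```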

The items below should be reviewed in this priority order:
1. Business Requirements
2. Test Scenarios
3. Test Cases



Business Requirements Review:
The testing team should understand the business requirements before proceeding with their regular testing activities. For a proper understanding, the requirements should be documented at the expected level of detail.
The requirements checklist assists in determining whether the requirements are documented, traceable, correct, complete, unambiguous, consistent, verifiable, and approved.
Since I am concentrating on the testing point of view, only a subset of the overall checklist is given here, i.e., the items specific to the testing team.

Entry Criteria: The BRD/SID should be signed off, or close to sign-off.

First fill in the table below, then continue with the checklist.
| Version | BRD/SID Name | Author(s) | Reviewer(s) | Designation |
|---------|--------------|-----------|-------------|-------------|
|         |              |           |             |             |

Business Requirements Review Checklist

| ID | Items to Verify | Y/N/NA | Comments |
|----|-----------------|--------|----------|
| 1 | Is each requirement uniquely and correctly identified? | | |
| 2 | Is each requirement traceable to its source (including derived requirements)? | | |
| 3 | Are all requirements written at a consistent and appropriate level of detail? | | |
| 4 | Are individual requirements rated (or ranked), with descriptions of priority provided? | | |
| 5 | Do the requirements provide an adequate basis for design and system test? | | |
| 6 | Is each requirement verifiable by testing, demonstration, review, or analysis? | | |
| 7 | Are all internal cross-references to other requirements correct? | | |
| 8 | Does each functional requirement specify input and output, as well as function, as appropriate? | | |
| 9 | Are validity checks on the inputs defined? | | |
| 10 | Is the exact sequence of operations described? | | |
| 11 | Have all dependencies on other systems been identified (applications or application interfaces, databases, communications subsystems, networking, etc.)? | | |
| 12 | Are specific responses to abnormal situations needed (e.g., overflow, communication facilities, error handling/recovery)? | | |
| 13 | Are any design or implementation constraints described? | | |
| 14 | Have business scenarios been constructed to illustrate (or draw out) the requirements? | | |
| 15 | Are all reliability, recoverability (business continuity), and performance requirements properly specified? | | |
| 16 | Are all security requirements properly specified? | | |
| 17 | Have all data privacy requirements been included? | | |
| 18 | Are time-critical functions identified, and timing criteria specified for them? | | |
| 19 | Have any regulatory, legislative, or standards-driven requirements been addressed? | | |
| 20 | Have all quality attributes (characteristics) been properly specified (i.e., efficiency, flexibility, interoperability, maintainability, portability, reusability, usability, availability)? | | |
| 21 | Are the requirements free of duplication and conflict with other requirements? | | |
| 22 | Does each requirement have only one interpretation? If a term could have multiple meanings, is it defined? | | |
| 23 | Are there measurable acceptance criteria for each functional and non-functional requirement? | | |
| 24 | Is each requirement in scope for the project? | | |
| 25 | Are all requirements actually requirements, not design or implementation solutions? | | |
| 26 | Have internationalization issues been adequately addressed? | | |


Exit Criteria:
Repeat the review of the business requirements until most of the checklist responses are 'Y' and the remaining ones are 'NA' with supporting comments.
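
This exit criterion can be checked mechanically. Below is a minimal sketch, with an assumed (id, response, comment) layout for checklist entries, that flags the items blocking sign-off: any 'N' answer, and any 'NA' answer without a supporting comment.

```python
# Sketch: evaluating the review exit criteria over checklist responses.
# Each entry is (id, response, comment); the field layout is an assumption.

def blocking_items(checklist):
    """Return entries that prevent exiting the review:
    'N' answers, and 'NA' answers lacking a supporting comment."""
    blockers = []
    for item_id, response, comment in checklist:
        if response == "N":
            blockers.append((item_id, "answered 'N'"))
        elif response == "NA" and not comment.strip():
            blockers.append((item_id, "'NA' without supporting comment"))
    return blockers

responses = [
    (1, "Y", ""),
    (2, "NA", "No derived requirements in this release"),
    (3, "N", ""),          # blocks exit: must be fixed and re-reviewed
]
for item_id, reason in blocking_items(responses):
    print(f"Checklist item {item_id}: {reason}")
```
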
Test Scenarios Review:

Once the business requirements are understood and reviewed, the tester, in coordination with the lead, should prepare the test strategy for that BRD/SID. Based on the strategy, the tester should prepare the test scenarios. Once the scenarios are complete, the test lead/BA/SA or another appropriate person should review them using the checklist below.

The scenarios checklist assists in determining whether all the business requirements are covered by the scenarios, and whether the scenarios are documented, traceable, correct, complete, unambiguous, consistent, verifiable, and approved.

Entry Criteria:
·         The BRD/SID should be reviewed
·         The test strategy should be reviewed

First fill in the table below, then continue with the checklist.
| Version | BRD/SID/Strategy/Scenario Document Name | Author(s) | Reviewer(s) | Designation |
|---------|-----------------------------------------|-----------|-------------|-------------|
|         |                                         |           |             |             |
Scenarios Review Checklist

| ID | Items to Verify | Y/N/NA | Comments |
|----|-----------------|--------|----------|
| 1 | Is the document spell-checked? | | |
| 2 | Is the correct version of the BRD/SID referenced? | | |
| 3 | Has the correct template been used? | | |
| 4 | Does each scenario possess a unique identifier? | | |
| 5 | Does at least one positive scenario exist for each requirement? | | |
| 6 | Does at least one negative (error handling/recovery or exception) scenario exist for each requirement? | | |
| 7 | Is the BVA technique used to write the scenarios? | | |
| 8 | Is the ECP technique used to write the scenarios? | | |
| 9 | Is each business requirement split into the maximum possible number of scenarios? | | |
| 10 | Are all requirements covered, and does traceability exist? | | |
| 11 | Are all scenarios written at a consistent and appropriate level of detail? | | |
| 12 | Are test data requirements identified? | | |
| 13 | Are the scenarios free of duplication? | | |
| 14 | Have the related areas that could possibly be affected by the implementation of the requirement been identified and included in the scenarios? | | |

Exit Criteria:
Repeat the review of the scenarios until most of the checklist responses are 'Y' and the remaining ones are 'NA' with supporting comments.
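
Checklist item 10 above (coverage and traceability) lends itself to light automation. The sketch below assumes each scenario record lists the requirement IDs it covers and whether it is positive or negative; the IDs and record layout are illustrative assumptions, not a prescribed format.

```python
# Sketch: a requirements-to-scenarios traceability check. It assumes
# each scenario record lists the requirement IDs it covers and that
# IDs look like "BR-001"; both conventions are illustrative only.

def coverage_gaps(requirement_ids, scenarios):
    """Requirements missing a positive and/or a negative scenario."""
    covered = {}
    for scenario in scenarios:
        for req in scenario["covers"]:
            covered.setdefault(req, set()).add(scenario["kind"])
    gaps = {}
    for req in requirement_ids:
        missing = {"positive", "negative"} - covered.get(req, set())
        if missing:
            gaps[req] = sorted(missing)
    return gaps

requirements = ["BR-001", "BR-002"]
scenarios = [
    {"id": "SC-01", "kind": "positive", "covers": ["BR-001"]},
    {"id": "SC-02", "kind": "negative", "covers": ["BR-001"]},
    {"id": "SC-03", "kind": "positive", "covers": ["BR-002"]},
]
print(coverage_gaps(requirements, scenarios))
# {'BR-002': ['negative']}  -> BR-002 lacks a negative scenario
```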

Test Cases Review:

Once the scenarios review is successful, the tester prepares the test cases, making sure all the scenarios are covered and combining as many related scenarios as possible into a single test case. Once the test cases are complete, the test lead or another appropriate person should review them using the checklist below.

The test cases checklist assists in determining whether all the scenarios are covered by the test cases, and whether the test cases are documented, traceable, correct, complete, unambiguous, consistent, verifiable, and approved.

Entry Criteria:
·         The BRD/SID should be reviewed
·         The test strategy should be reviewed
·         The scenarios should be reviewed

First fill in the table below, then continue with the checklist.

| Version | BRD/SID/Strategy/Scenario/Test Case Document Name | Author(s) | Reviewer(s) | Designation |
|---------|---------------------------------------------------|-----------|-------------|-------------|
|         |                                                   |           |             |             |
Test Case Review Checklist

| ID | Items to Verify | Y/N/NA | Comments |
|----|-----------------|--------|----------|
| 1 | Is the document spell-checked? | | |
| 2 | Does each test case possess a unique identifier? | | |
| 3 | Are the test cases traceable backwards to requirements and design? | | |
| 4 | Are the test cases traceable forward to test results? | | |
| 5 | Do the test cases specify the desired test procedures (steps, actions, activities) for executing the tests? | | |
| 6 | Do the test cases define the input values required for test execution? | | |
| 7 | Do the test cases define the output values (expected results) required for test execution? | | |
| 8 | Do the test cases record the actual results for the tests? | | |
| 9 | Do the test cases specify test data requirements? | | |
| 10 | Do the test cases specify pre-conditions for test execution? | | |
| 11 | Do the test cases specify post-conditions for test execution? | | |
| 12 | Do the test cases provide adequate space for capturing detailed comments? | | |
| 13 | Do the test cases indicate whether the test passed or failed? | | |
| 14 | Has the test data been embedded into the test cases? | | |

Exit Criteria:
Repeat the review of the test cases until most of the checklist responses are 'Y' and the remaining ones are 'NA' with supporting comments.
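
Several of the checklist items above amount to "is this field present and filled in?". A small lint over test case records can pre-screen those before the human review; the field names below are assumptions chosen to mirror the checklist, not a mandated template.

```python
# Sketch: linting test case records against the checklist above.
# The field names are assumed for illustration; adapt to your template.

REQUIRED_FIELDS = [
    "id", "traces_to", "steps", "inputs",
    "expected_results", "preconditions", "postconditions", "test_data",
]

def lint_test_case(case: dict) -> list[str]:
    """Return the checklist fields that are missing or left empty."""
    return [f for f in REQUIRED_FIELDS if not case.get(f)]

case = {
    "id": "TC-001",
    "traces_to": ["BR-001"],
    "steps": ["Open login page", "Submit valid credentials"],
    "inputs": {"user": "alice", "password": "secret"},
    "expected_results": "User lands on the dashboard",
    "preconditions": "User account exists",
    # postconditions and test_data omitted -> flagged by the lint
}
print(lint_test_case(case))  # ['postconditions', 'test_data']
```
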



 Review Findings Summary Instructions:

A Review Findings Summary is a tool created to document and track anomalies and issues identified during reviews.

The Review Findings Summary contains the following information:

Review Type: Peer Review or Formal Review.

Artifact: The category of the artifact under review, such as Business Requirements Document, SID, Test Scenarios, or Test Cases.

Author: The person who created the work product under review.

Project: The official project name.

Version: The version number of the work product under review.

Date Review Started: The date of the review meeting.

Date Review Closed: The date all anomalies, issues, and action items are closed.

Identifier: A unique identifier that permits identification and sorting; the suggested format is project acronym + sequential number (e.g., PROJ0001).

Anomaly Category: DC = Documentation Content, TR = Traceability.

Anomaly or Issue: The items identified and described during the review.

Resolution: The solution for the identified anomaly.

Date Resolved: The date an issue was resolved and the review team agreed it was resolved correctly.

Status: The state an anomaly passes through on the way to resolution and closure. The anomaly states are:
·         Submitted – the item has been logged and reported for repair.
·         Assigned – the item has been assigned for repair.
·         Opened – the anomaly has been taken up for correction.
·         Deleted – the item was originally reported as an anomaly but later deleted because it is either a duplicate or not an anomaly.
·         Resolved – the anomaly has been corrected and sent for review or verification.
·         Re-Opened – the anomaly was closed and then reopened for modification.
·         Returned – the anomaly was reviewed, verified as incorrect, and returned to the author.
·         Verified – the anomaly was reviewed and verified as correct.
·         Closed – the anomaly was successfully reviewed and closed with a resolution and resolution date.
·         Deferred – the anomaly is designated for correction at a later date.
·         Duplicated – the item was assessed to be a duplicate of a prior record.
·         Escalated – the item requires evaluation by management.
Note: The statuses listed can be adapted to suit the project.

Impact: Classified at three levels: High, Medium, and Low.
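
To show how the summary fields and status lifecycle fit together, here is a sketch of a findings record in Python. The allowed-transitions map is a simplified assumption based on the state descriptions above; as the note says, teams can adapt the statuses to their own process.

```python
# Sketch: one Review Findings Summary record with the status lifecycle
# described above. The allowed-transition map is a simplified assumption.

from dataclasses import dataclass, field

TRANSITIONS = {
    "Submitted": {"Assigned", "Deleted", "Duplicated", "Escalated", "Deferred"},
    "Assigned":  {"Opened"},
    "Opened":    {"Resolved"},
    "Resolved":  {"Verified", "Returned"},
    "Returned":  {"Opened"},
    "Verified":  {"Closed"},
    "Closed":    {"Re-Opened"},
    "Re-Opened": {"Assigned"},
}

@dataclass
class Finding:
    identifier: str          # e.g. PROJ0001
    category: str            # DC or TR
    description: str
    impact: str              # High, Medium, or Low
    status: str = "Submitted"
    history: list = field(default_factory=list)

    def move_to(self, new_status: str) -> None:
        """Advance the finding, rejecting transitions the map disallows."""
        if new_status not in TRANSITIONS.get(self.status, set()):
            raise ValueError(f"{self.status} -> {new_status} not allowed")
        self.history.append(self.status)
        self.status = new_status

f = Finding("PROJ0001", "TR", "BR-002 has no negative scenario", "Medium")
f.move_to("Assigned"); f.move_to("Opened"); f.move_to("Resolved")
print(f.status, f.history)   # Resolved ['Submitted', 'Assigned', 'Opened']
```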



<Review Type> Review Findings Summary

Artifact:                                 Author:                                              Project:
Version:                                  Date Review Started:                         Date Review Closed:

For each finding, record:

Identifier (project acronym + number):
Anomaly Category:
Anomaly or Issue:
Location:
Resolution:
Date Resolved:
Status:
Impact:

(Repeat the block above for each anomaly or issue found.)

Common Review Defects, Their Causes, and Suggested Preventive Measures:


The most common defects reported during test case reviews, the causes of each, and the suggested preventive actions are tabulated below. Depending on the review defect trends in the project, these actions can be implemented accordingly.

Defect: Incomplete test cases; missed negative test cases
Causes:
·         Inadequate functionality knowledge
·         Inadequate testing experience
·         Too much or too little information in the requirements
·         Oversight
·         Changes to requirements after preparation of the test cases
Preventive Actions:
·         Provide application-specific training
·         Provide training in testing concepts and methods
·         Do a thorough requirements review before test case preparation
·         Use the test case review checklist for all reviews
·         Do periodic requirements reviews before submitting test cases for review

Defect: No test data; inappropriate or incorrect test data
Causes:
·         Inadequate information in the requirements
·         Inadequate functionality knowledge
Preventive Actions:
·         Do a thorough requirements review before test case preparation
·         Provide application-specific training

Defect: Incorrect expected behavior
Causes:
·         Inadequate functionality knowledge
·         Changes to requirements after preparation of the test cases
Preventive Actions:
·         Provide application-specific training
·         Do periodic requirements reviews before submitting test cases for review

Defect: Documentation errors (grammatical errors, typos, inconsistent tense/voice, incomplete results or number of test runs, defect details not updated)
Causes:
·         Oversight
·         Not attaching importance to grammar checks
·         Not updating the test cases with the actual results and defect details after every run
Preventive Actions:
·         Do a thorough spell check before submitting the document for review
·         Provide training in written communication
·         Discuss and reiterate the process of updating the test cases after every test run

Defect: Changes to requirements not updated in the test cases
Causes:
·         Changes to requirements not communicated by the business
·         Not checking the comments in the requirements document periodically
Preventive Actions:
·         Request the business analysts to pass on information about changes to requirements
·         Do periodic requirements reviews and update the test cases, when required, before submitting them for review

Knowing what could go wrong and how it can be prevented enables the team to prepare better test cases. This, coupled with the use of the review checklists in each phase, paves the way for effective and efficient reviews of test cases.