Tuesday, December 6, 2011

Test Case

DEFINITION
A test case is a set of conditions or variables under which a tester will determine whether a system under test satisfies requirements or works correctly.
The process of developing test cases can also help find problems in the requirements or design of an application.
TEST CASE TEMPLATE
A test case can have the following elements. Note, however, that companies normally use a test management tool, and the format is then determined by that tool.
Test Suite ID The ID of the test suite to which this test case belongs.
Test Case ID The ID of the test case.
Test Case Summary The summary / objective of the test case.
Related Requirement The ID of the requirement this test case relates/traces to.
Prerequisites Any prerequisites or preconditions that must be fulfilled prior to executing the test.
Test Procedure Step-by-step procedure to execute the test.
Test Data The test data, or links to the test data, that are to be used while conducting the test.
Expected Result The expected result of the test.
Actual Result The actual result of the test; to be filled after executing the test.
Status Pass or Fail. Other statuses can be ‘Not Executed’ if testing is not performed and ‘Blocked’ if testing is blocked.
Remarks Any comments on the test case or test execution.
Created By The name of the author of the test case.
Date of Creation The date of creation of the test case.
Executed By The name of the person who executed the test.
Date of Execution The date of execution of the test.
Test Environment The environment (Hardware/Software/Network) in which the test was executed.
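As an illustration, the fields above can be captured as a simple record type. The field names and status values below mirror the template; the structure itself is only a sketch, not a prescribed format.

```python
# Sketch of the test case template as a Python record.
# Field names follow the template above; status values are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestCase:
    test_suite_id: str
    test_case_id: str
    summary: str
    related_requirement: str
    prerequisites: List[str] = field(default_factory=list)
    test_procedure: List[str] = field(default_factory=list)
    test_data: List[str] = field(default_factory=list)
    expected_result: str = ""
    actual_result: str = ""
    status: str = "Not Executed"  # Pass / Fail / Not Executed / Blocked
    remarks: str = ""
    created_by: str = ""
    date_of_creation: str = ""
    executed_by: str = ""
    date_of_execution: str = ""
    test_environment: str = ""

tc = TestCase(
    test_suite_id="TS001",
    test_case_id="TC001",
    summary="Verify that clicking the Generate Coin button generates coins.",
    related_requirement="RS001",
)
print(tc.status)  # "Not Executed" until the test is run
```

A test management tool stores essentially the same record, plus history and traceability links.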
TEST CASE EXAMPLE / SAMPLE
Test Suite ID TS001
Test Case ID TC001
Test Case Summary To verify that clicking the Generate Coin button generates coins.
Related Requirement RS001
Prerequisites
  1. User is authorized.
  2. Coin balance is available.
Test Procedure
  1. Select the coin denomination in the Denomination field.
  2. Enter the number of coins in the Quantity field.
  3. Click Generate Coin.
Test Data
  1. Denominations: 0.05, 0.10, 0.25, 0.50, 1, 2, 5
  2. Quantities: 0, 1, 5, 10, 20
Expected Result
  1. Coins of the specified denomination should be produced if the specified quantity is valid (1, 5).
  2. A message “Please enter a valid quantity between 1 and 10” should be displayed if the specified quantity is invalid.
Actual Result
  1. If the specified quantity is valid, the result is as expected.
  2. If the specified quantity is invalid, nothing happens; the expected message is not displayed.
Status Fail
Remarks
Created By John Doe
Date of Creation 01/14/2020
Executed By Jane Roe
Date of Execution 02/16/2020
Test Environment
  • OS: Windows Y
  • Browser: Chrome N
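The sample above can also be sketched as an automated check. The generate_coins function below is a hypothetical stand-in for the application under test, written so that the valid range (1 to 10) matches the expected results in the sample; it is illustrative only.

```python
# Hypothetical implementation under test, modeled on the sample's expected
# results: quantities 1-10 are valid, anything else yields an error message.
def generate_coins(denomination, quantity):
    if not 1 <= quantity <= 10:
        return "Please enter a valid quantity between 1 and 10"
    return [denomination] * quantity

# TC001, valid quantities: coins of the requested denomination are produced.
for qty in (1, 5):
    assert generate_coins(0.25, qty) == [0.25] * qty

# TC001, invalid quantities (0, 20): the validation message appears instead.
for qty in (0, 20):
    assert generate_coins(0.25, qty) == (
        "Please enter a valid quantity between 1 and 10"
    )

print("TC001: Pass")
```

Note how the automated version records the same information as the manual template: expected results become assertions, and the printed status takes the place of the Status field.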
WRITING GOOD TEST CASES
  • As far as possible, write test cases in such a way that you test only one thing at a time. Do not overlap or complicate test cases. Attempt to make your test cases ‘atomic’.
  • Ensure that all positive scenarios and negative scenarios are covered.
  • Language:
    • Write in simple and easy to understand language.
    • Use active voice: Do this, do that.
    • Use exact and consistent names (of forms, fields, etc).
  • Characteristics of a good test case:
    • Accurate: Tests exactly what it is intended to test.
    • Economical: No unnecessary steps or words.
    • Traceable: Capable of being traced to requirements.
    • Repeatable: Can be used to perform the test over and over.
    • Reusable: Can be reused if necessary.

Test Plan

TEST PLAN DEFINITION
A Software Test Plan is a document describing the testing scope and activities. It is the basis for formally testing any software/product in a project.
ISTQB Definition
  • test plan: A document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst others test items, the features to be tested, the testing tasks, who will do each task, the degree of tester independence, the test environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process.
  • master test plan: A test plan that typically addresses multiple test levels.
  • phase test plan: A test plan that typically addresses one test phase.

TEST PLAN TYPES
One can have the following types of test plans:
  • Master Test Plan: A single high-level test plan for a project/product that unifies all other test plans.
  • Testing Level Specific Test Plans: Plans for each level of testing.
    • Unit Test Plan
    • Integration Test Plan
    • System Test Plan
    • Acceptance Test Plan
  • Testing Type Specific Test Plans: Plans for major types of testing like Performance Test Plan and Security Test Plan.
TEST PLAN TEMPLATE
The format and content of a software test plan vary depending on the processes, standards, and test management tools being implemented. Nevertheless, the following format, which is based on the IEEE 829 standard for software test documentation, provides a summary of what a test plan can/should contain.
Test Plan Identifier:
  • Provide a unique identifier for the document. (Adhere to the Configuration Management System if you have one.)
Introduction:
  • Provide an overview of the test plan.
  • Specify the goals/objectives.
  • Specify any constraints.
References:
  • List the related documents, with links to them if available, including the following:
    • Project Plan
    • Configuration Management Plan

Test Items:
  • List the test items (software/products) and their versions.
Features to be Tested:
  • List the features of the software/product to be tested.
  • Provide references to the Requirements and/or Design specifications of the features to be tested.
Features Not to Be Tested:
  • List the features of the software/product which will not be tested.
  • Specify the reasons these features won’t be tested.
Approach:
  • Mention the overall approach to testing.
  • Specify the testing levels, the testing types, and the testing methods.
Item Pass/Fail Criteria:
  • Specify the criteria that will be used to determine whether each test item (software/product) has passed or failed testing.
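For instance, pass/fail criteria are often expressed as measurable thresholds. The sketch below assumes a hypothetical rule (at least a 95% pass rate and no open critical defects); both the threshold and the rule are illustrative, not prescribed by the template.

```python
# Hypothetical item pass/fail rule: at least 95% of executed test cases
# pass and no critical defects remain open. Thresholds are illustrative.
def item_passes(passed, failed, open_critical_defects, threshold=0.95):
    executed = passed + failed
    if executed == 0:
        return False  # nothing executed: the item cannot pass
    pass_rate = passed / executed
    return pass_rate >= threshold and open_critical_defects == 0

print(item_passes(passed=98, failed=2, open_critical_defects=0))  # True
print(item_passes(passed=98, failed=2, open_critical_defects=1))  # False
```

Writing the criteria this concretely in the plan removes arguments later about whether an item "mostly" passed.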
Suspension Criteria and Resumption Requirements:
  • Specify criteria to be used to suspend the testing activity.
  • Specify testing activities which must be redone when testing is resumed.
Test Deliverables:
  • List test deliverables, and links to them if available, including the following:
    • Test Plan (this document itself)
    • Test Cases
    • Test Scripts
    • Defect/Enhancement Logs
    • Test Reports

Test Environment:
  • Specify the properties of the test environment: hardware, software, communications, etc.
  • List any testing or related tools.
Estimate:
  • Provide a summary of test estimates (cost or effort) and/or provide a link to the detailed estimation.
Schedule:
  • Provide a summary of the schedule, specifying key test milestones, and/or provide a link to the detailed schedule.
Staffing and Training Needs:
  • Specify staffing needs by role and required skills.
  • Identify training that is necessary to provide those skills, if not already acquired.
Responsibilities:
  • List the responsibilities of each team/role/individual.
Risks:
  • List the risks that have been identified.
  • Specify the mitigation plan and the contingency plan for each risk.
Assumptions and Dependencies:
  • List the assumptions that have been made during the preparation of this plan.
  • List the dependencies.
Approvals:
  • Specify the names and roles of all persons who must approve the plan.
  • Provide space for signatures and dates. (If the document is to be printed.)
Tools:
  • Specify the tools used for the project, such as a defect tracking tool, test case management tool, performance testing tool, or any other tools.
TEST PLAN GUIDELINES
  • Make the plan concise. Avoid redundancy and superfluousness. If you think you do not need a section that has been mentioned in the template above, go ahead and delete that section in your test plan.
  • Be specific. For example, when you specify an operating system as a property of a test environment, mention the OS Edition/Version as well, not just the OS Name.
  • Make use of lists and tables wherever possible. Avoid lengthy paragraphs.
  • Have the test plan reviewed a number of times prior to baselining it or sending it for approval. The quality of your test plan speaks volumes about the quality of the testing you or your team is going to perform.
  • Update the plan as and when necessary. An outdated and unused document stinks.
 

Thursday, May 19, 2011

Approach towards implementing agile testing method


  1. Introduction
Our approach to implementing an agile testing method was very different from traditional testing practices. We recently completed testing a product using this approach. It was a rich learning experience and helped us gain knowledge of testing products in an Agile environment. The product we were asked to work on was completely new to us, and we had no prior knowledge of it. We were responsible for the first release of this product. The following sections of this paper describe the methodology we used to release it successfully.
  2. Challenges Faced
    2.1. Initial Hiccups
• There was no formal documentation provided by the client: Requirements came in the form of user stories, very high-level definitions containing just enough information. This made it quite difficult to understand the scope and hence to draft the test cases.

• There was no prior knowledge of the product: The product was entirely new to us, and we were unsure of its workflows, which made it difficult to prioritize the business and technical requirements.

    2.2. Roadblocks during the course of the Project
• QA builds were short-lived: Due to tight project deadlines and frequent changes in the requirements, the testing time kept shrinking. Initially, biweekly QA builds were delivered, but this later changed to nightly builds, which made it very difficult to manage our QA efforts.

• Time to update the test case documents was not available: Requirements were never frozen throughout the life cycle of the project, so the documents (test cases, regression test cases) required frequent updates. With the QA build cycle getting continuously shorter as the release date approached, this update task became quite difficult to accomplish.

• Managing the regression task: The client instructed us to perform regression as frequently as possible, to ensure that no fixed bugs were reopened and that changes in one module did not affect other modules, so it became essential to perform regression testing with wide coverage. Keeping track of the bugs and updating them in the tracking tool was equally important.

    2.3. Are we ready for release?
It was challenging for the testing team to give a definitive answer to this question, because all the QA effort was compressed into a very short time frame; given the build cycles being followed and the corresponding testing, this question was difficult to answer.

  3. Our Approach to Overcome Challenges in Agile
    3.1. Efficient Documentation – Capture the essence, not the details!
To minimize the risk of over-documentation, and of wasting time on test cases that might become irrelevant as requirements changed in successive builds, we ensured that our documentation was highly effective and required minimal changes when requirements changed.
We ensured this by using the following techniques:
• Maintained test scenario sheets: A test scenario sheet is a matrix that includes all the possible flows a user could perform in the application. The steps to execute these scenarios were not included in the sheet; it was assumed that, during execution, the tester would be well aware of the steps to follow.


Payment Mode / Order Type    Pickup    Carry Out    On Hold    Backorder
Cash                            ×          ×           ×          ×
Check                           √          √           √          √
Gift Card                       √          √           √          √
Credit Card                     ×          ×           √          √
√ – Scenario Passed    × – Scenario Failed
Table 1: Test Scenario Sheet
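Kept in code, the scenario sheet becomes a small matrix that can be filtered programmatically. The sketch below assumes √ marks a passed scenario and × a failed one, and stands in for the spreadsheet; it does not model the application itself.

```python
# Test scenario sheet as a matrix: payment mode x order type -> result.
# Entries reproduce Table 1, reading "Pass" for √ and "Fail" for ×.
ORDER_TYPES = ["Pickup", "Carry Out", "On Hold", "Backorder"]

scenario_sheet = {
    "Cash":        ["Fail", "Fail", "Fail", "Fail"],
    "Check":       ["Pass", "Pass", "Pass", "Pass"],
    "Gift Card":   ["Pass", "Pass", "Pass", "Pass"],
    "Credit Card": ["Fail", "Fail", "Pass", "Pass"],
}

# Collect every failed scenario so it can be logged in the tracking tool.
failures = [
    (mode, order)
    for mode, results in scenario_sheet.items()
    for order, result in zip(ORDER_TYPES, results)
    if result == "Fail"
]
print(len(failures))  # 6 failed combinations in Table 1
```

A matrix like this is cheap to update when requirements change, which was exactly the point of keeping scenarios rather than step-by-step test cases.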

• Maintained a review comments sheet: The documents we prepared, based on communication with the client, were sent for review, and the review comments on these documents were maintained in a sheet known as the review comments sheet. Instead of incorporating the changes into our documents, we referred to the review comments sheet for any validation, which saved us the effort of documenting the changes.

• Maintained activity flow diagrams through mind mapping techniques, wherein we mapped the user's possible flows through the application.
Figure 1. Activity Flow Diagram

    3.2. Exploring the product – Understanding the business
As the product was new to us and we had no prior knowledge of it, we started with exploratory testing, using the software from the end user's perspective within its overall environment. This rough pass helped us uncover the critical functionalities, defect-prone areas, and a few important failure scenarios. It also helped us prioritize these features and areas in our subsequent testing.

    3.3. Active Communication – Let's talk!
We adopted an approach wherein we interacted with the client and the onsite development team not only through formal channels of communication such as client calls and emails, but also through informal ones such as IM and direct calls to their cell phones whenever issues arose.

We believed it was better to reach the client and clarify our issues the same day, rather than wait for a response through formal means of communication.

We also helped the onsite development team reproduce logged bugs whenever they had issues with them. We kept this communication as informal as possible, which established a comfort zone between the development and QA teams.

    3.4. Constant Feedback
We ensured frequent bug triages, wherein we discussed which bugs needed to be fixed and which could be deferred. We also provided feedback on the design features from the customer's perspective.

As the QA builds were short-lived, we discussed how to prioritize our testing efforts and requested from the client and the onsite development team a list of the impacted areas and of the modules modified or added in the coming build, which helped us focus on the impacted areas accordingly.

    3.5. Identification of different sets of test cases – Categorization: It always helps!
We identified and continuously prioritized executable and regression test cases with each new build. We felt the need to identify two sets of test cases:
• Executable Test Cases: Test cases that could be executed for the features/modules that had been built.
• Regression Test Cases: Test cases that needed to be run on a daily basis to ensure that newly added features did not affect the flow of previously working modules.
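One lightweight way to keep the two sets separate is to tag each test with a category and filter at run time. The tagging scheme below is a hand-rolled sketch (frameworks such as pytest provide markers for the same purpose), and all test names are illustrative.

```python
# Illustrative tagging scheme: each test function carries a category
# attribute, and the runner executes only the selected set for this build.
REGISTRY = []

def category(name):
    def decorator(func):
        func.category = name
        REGISTRY.append(func)
        return func
    return decorator

@category("executable")
def test_new_checkout_flow():
    assert True  # placeholder for a test of a newly built feature

@category("regression")
def test_existing_login_still_works():
    assert True  # placeholder for a daily regression check

def run(selected):
    executed = []
    for test in REGISTRY:
        if test.category == selected:
            test()
            executed.append(test.__name__)
    return executed

print(run("regression"))  # runs only the regression set
```

With the sets tagged, the nightly build can run the regression set automatically while the executable set is run against whatever was delivered that day.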

    3.6. Increasing the scope of the bug tracking tool
We used JIRA as the bug tracking tool in our project. Apart from logging and tracking bugs, we asked the client to raise any requirement change as a change request in JIRA. This kept the teams in sync and helped the testing team incorporate the changes in the next test cycle.

    3.7. Yes, we are ready for release – Let's go live!
When the time of the release approached, we sat with the client and judged the readiness of the build for release to production. Instead of looking at the number of open bugs, we analyzed the consequences of those bugs for the overall application.
In a couple of meetings with the client, we identified the bugs that could be left open without posing a serious threat to going live. The status of these bugs was changed to deferred, and it was decided that they would be resolved in the next version.
  4. Benefits of our Approach
    4.1. Adaptive Testing Approach – Plan as we Progress
Although the testing time was short, we were able to execute our testing efficiently, which might not have been possible had we followed any of the traditional testing approaches.

    4.2. Encouraged Futuristic Thinking
The brainstorming sessions between the client, the development team, and the testing team encouraged team members to provide valuable feedback to the client about the product. The client greatly appreciated this lateral thinking and decided to implement a few of those suggestions as enhancements in a future release.

    4.3. Identification of showstoppers at early stages
Running the prioritized regression tests with each new build helped us identify critical bugs that affected previously built features when the new modules were incorporated.

    4.4. Timely Delivery
With this approach, we were able to manage testing activities efficiently and hence give the green signal for the product to go live despite the tight testing deadlines.

    4.5. Avoiding waste in documentation
With our new technique for managing documents efficiently, we avoided a great deal of wasted time and resources in the documentation process.

  5. Improvement Areas
    5.1. Adding Automation for Test Management
To speed up the entire effort, we could have automated the regression tests that were executed with each new build.

    5.2. Increasing client awareness regarding our approach for future products
As this approach was planned as we progressed with the project, it was not well structured. Even so, we were able to initiate the process at the project level and make the client aware of the process we followed throughout the product delivery cycle, leaving them well informed about our method of testing in an Agile environment.

  6. Conclusion
The successful delivery and our experience with this approach made us believe that testing can be well justified even in an Agile environment.