Test Plan

Vinaysimha Varma Yadavali
3 min read · Apr 27, 2021


1. Objective:

This section mentions the objective or purpose of the Test Plan.

For example, it may read: "This document lays out the road map to be followed during the testing process of the xxx feature/project."

2. Pre-requisites:

Here I mention the project dependencies that must be ready or set up before this project can start.

For example, suppose my project is about testing a new promotion feature offered by a carrier. My pre-requisite is: "The promotion offers should already be loaded in the database that my application fetches from. Otherwise, I will not land on the screen required for testing."

3. Assumptions:

Based on the project and testing scope, there will be some assumptions, which I mention in this section. Documenting them helps any new team member or stakeholder stay aware of them.

4. References:

In this section, I provide links to the documents and resources that were taken into consideration while writing the plan, and that readers can consult for further detail.

5. In-Scope:

In this section, I list the testing scope of the project: the high-level testing areas/features and the different testing types that apply. Which testing types apply depends on the project and its scope.

What I include is: functional testing, browser version compatibility testing, MOW (Mobile Optimized Web) testing, platform compatibility testing, 508 compliance testing, L10n (localization) testing, and UI/UX standards testing.

6. Out-of Scope:

In this section, I list the test areas that are not applicable or are out of scope for the current project. This also depends on the project.

What I typically include is performance testing and security testing. Even if performance testing is applicable to my application, it will be handled by another team, so I note that here. Security testing is likewise outside our scope and handled by another team.

Also, depending on the testing scope and the roles of the different teams, a few test areas may fall outside my scope because they will be verified by other teams.

7. Entry Criteria:

In this section, I list the entry criteria, i.e. the criteria that must be met before testing can start. This depends on the project and scope.

What I typically include is: unit testing is completed, and all sanity tests have passed.

8. Exit Criteria:

In this section, I list the exit criteria, i.e. the criteria that must be met to sign off and close the project. This depends on the project.

What I typically include is:

All integration tests have passed.

All functional tests have passed.

All regression tests have passed.

100% of the planned test execution is done.

All reported issues have been fixed and verified.

No P1 issues are open.

Any remaining open issues are of Medium/Low priority and have been reviewed by PMs/stakeholders/business.

9. Test Environment:

Here, I mention the different environments where testing is needed, how those environments are set up, who creates them, and who troubleshoots if any issues arise.

I also mention the details of the devices on which the application needs to be tested, the browsers and their versions, and the platform details if platform compatibility is in scope.
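
As an illustration only (the environment URLs, owners, device names, and browser versions below are hypothetical, not taken from this article), such an environment and device matrix can be recorded as simple data that both the team and automated checks can read:

```python
# Hypothetical test environments (illustrative values only).
TEST_ENVIRONMENTS = {
    "QA":      {"url": "https://qa.example.com",      "owner": "QA team"},
    "Staging": {"url": "https://staging.example.com", "owner": "DevOps team"},
}

# Hypothetical devices, platforms, and browser versions in scope for compatibility testing.
DEVICE_MATRIX = [
    {"device": "Desktop", "platform": "Windows 10", "browser": "Chrome", "version": "90"},
    {"device": "Desktop", "platform": "macOS 11",   "browser": "Safari", "version": "14"},
    {"device": "Mobile",  "platform": "Android 11", "browser": "Chrome", "version": "90"},
    {"device": "Mobile",  "platform": "iOS 14",     "browser": "Safari", "version": "14"},
]
```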

10. Functional Testing:

Here I list the high-level test areas/features to be tested and capture the high-level test scenarios as a workflow. This helps any team member understand the flow and the features to be tested in the project.

11. Test data:

This section describes the ways and strategies to procure the test data necessary to perform the testing.
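
As a minimal sketch (the table name, columns, and promotion values are my own assumptions, tied to the promotion example from the Pre-requisites section), one such strategy is to seed the store the application reads from before the test run:

```python
import sqlite3

# Hypothetical promotion offers used as seed data; a real project would take
# these from the business requirements or a dedicated data-generation tool.
PROMO_OFFERS = [
    ("PROMO10", "10% off on the first recharge", "2021-12-31"),
    ("FREEDATA", "1 GB free data for 7 days", "2021-06-30"),
]

def seed_promotions(db_path="test_data.db"):
    """Load promotion offers into the database the application fetches from."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS promotions (code TEXT, description TEXT, expires TEXT)"
    )
    conn.executemany("INSERT INTO promotions VALUES (?, ?, ?)", PROMO_OFFERS)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    seed_promotions()
```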

12. Test Automation:

In this section, based on the automation scope, I list the features/scenarios that can be covered by existing automation scripts, either as-is or with changes.

I also list the high-level scenarios/features of the project that can be, or need to be, automated based on the project scope.
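
As a hedged sketch of what one such automated high-level scenario could look like (written with pytest; the get_promotion stub stands in for the real application call, which this article does not describe):

```python
import pytest

# Stand-in for the application call under test; a real suite would drive
# the UI or an API instead of this in-memory lookup.
PROMOTIONS = {
    "PROMO10": "10% off on the first recharge",
    "FREEDATA": "1 GB free data for 7 days",
}

def get_promotion(code):
    return PROMOTIONS.get(code)

@pytest.mark.parametrize("code", ["PROMO10", "FREEDATA"])
def test_loaded_promotion_is_available(code):
    """High-level scenario: every loaded promotion can be fetched and shown."""
    assert get_promotion(code) is not None

def test_unknown_promotion_is_not_shown():
    """Negative scenario: a promotion that was never loaded returns nothing."""
    assert get_promotion("EXPIRED99") is None
```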

13. Test Execution / Test Strategy:

Here, I mention the project timelines: when testing starts, the testing window, and the order in which the different features of the project will be tested.

14. Bug Tracking:

Here, I mention the standards we follow while tracking bugs.

This includes tagging/labelling bugs according to the feature or the team that needs to work on them, as well as milestoning and assigning the bugs.

It also covers any subject/keyword formats used to highlight specific issues such as regressions, and the keywords to add if a retest fails. An illustrative convention is sketched below.
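
For illustration only (the label names, milestone, and subject format are hypothetical conventions, not a prescribed standard), such a convention can be summarized as the fields each bug report carries:

```python
# Hypothetical bug-tracking convention, shown as the fields of one example report.
EXAMPLE_BUG = {
    "subject":   "[Regression][Promotions] Promo banner missing on checkout page",
    "priority":  "P2",
    "labels":    ["promotions-team", "regression"],  # feature / owning-team tags
    "milestone": "Release 4.2",                      # milestoning the fix
    "assignee":  "feature-developer",                # who works on it
    "keywords":  ["retest-failed"],                  # added when a retest fails
}
```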

15. Glossary:

Here, I list and explain any new or project-specific terminology that needs clarification.

Written by Vinaysimha Varma Yadavali

Seasoned QA and automation expert with nearly two decades of experience. Enthusiast in AI, focused on applying it to revolutionize software testing.