QA/Platform/DOM/Feature Testplan Template


Template

To be edited as necessary

Introduction

Brief description of the area/feature(s) covered by this document.

The primary purpose here is to provide enough information about this functional area that a new person can achieve a basic understanding of the feature, along with links to any documentation or engineering docs if further reading is needed. If this is a new feature, include which release it is targeted at.

Testing Approach

High-level overview of the testing methodologies used in each type of testing done for this project (manual and automated)

The purpose of this section is to provide guidance on how this area can be tested and which methodologies are most likely to be productive. For example, when testing WebRTC the manual approach would be to initiate call connections between two clients and verify audio and video quality. The automated approach would be to use predictable data sources for the audio and video streams, allowing you to perform data analysis on the call statistics. Additionally, you will want to provide some guidance on what can and cannot be tested.
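
As a hedged illustration of the automated approach, here is a minimal Python sketch that checks a set of collected call statistics against pass/fail thresholds. The metric names and threshold values are hypothetical placeholders, not real WebRTC getStats() fields; substitute whatever your harness actually collects.

  # Minimal sketch: evaluate collected call statistics against pass/fail
  # thresholds. Metric names and limits are hypothetical placeholders.

  THRESHOLDS = {
      "packet_loss_pct": 2.0,   # max acceptable packet loss, percent (assumed)
      "avg_jitter_ms": 30.0,    # max acceptable average jitter (assumed)
      "min_frame_rate": 24.0,   # min acceptable video frame rate (assumed)
  }

  def evaluate_call_stats(stats):
      """Return (metric, value, limit) tuples for every failed check."""
      failures = []
      if stats["packet_loss_pct"] > THRESHOLDS["packet_loss_pct"]:
          failures.append(("packet_loss_pct", stats["packet_loss_pct"],
                           THRESHOLDS["packet_loss_pct"]))
      if stats["avg_jitter_ms"] > THRESHOLDS["avg_jitter_ms"]:
          failures.append(("avg_jitter_ms", stats["avg_jitter_ms"],
                           THRESHOLDS["avg_jitter_ms"]))
      if stats["frame_rate"] < THRESHOLDS["min_frame_rate"]:
          failures.append(("frame_rate", stats["frame_rate"],
                           THRESHOLDS["min_frame_rate"]))
      return failures

  # Stats gathered from a test call that used a predictable media source.
  sample = {"packet_loss_pct": 0.4, "avg_jitter_ms": 12.5, "frame_rate": 29.8}
  for metric, value, limit in evaluate_call_stats(sample):
      print("FAIL: %s=%s (limit %s)" % (metric, value, limit))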

Include:

  • Examples of things to watch for.
  • What are some of the common errors and issues that this testing is targeted at finding?
  • Filing Bugs
    • How are bugs reported?
    • What component(s) should they be filed under?
    • Define keywords, whiteboard tags, and other flags or verbiage that are expected to be used when reporting bugs.

Get Involved

How can volunteers and community members become involved in this project?
  • Links to One and Done tasks
  • Links to MozTrap tests
  • Good First Verify in Bugzilla
  • Links to any tutorials and other QA introductory material
  • Contact information, meeting schedules, and information on how to join

Requirements

What are the minimum requirements for becoming involved (hardware, software, skills)?
  • Describe the required test environment and provide instructions on how to create it.
  • If special skills are required, provide links to any tutorials that may be available on the subject.
  • If special hardware is required, provide steps on how to verify that the testers' systems meet the minimum requirements (see the sketch below).
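
For example, a minimal sketch along these lines (Python, standard library only) could report the basics of a tester's environment; the minimum values shown are hypothetical and should be replaced with the feature's real requirements.

  # Minimal sketch: report the local environment so a tester can compare it
  # against the plan's minimum requirements. The limits below are hypothetical.
  import os
  import platform

  MIN_CPU_CORES = 2  # hypothetical minimum; adjust per feature

  print("OS:        %s %s" % (platform.system(), platform.release()))
  print("Arch:      %s" % platform.machine())
  cores = os.cpu_count() or 1
  print("CPU cores: %d (minimum %d: %s)"
        % (cores, MIN_CPU_CORES, "OK" if cores >= MIN_CPU_CORES else "FAIL"))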

Related Prefs

Define any preferences or about:config options that are related to the behavior of this area

Describe what each pref or option does, what values should be used, and how they change the behavior of the browser. Be sure to include what the default value should be.
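
One way to make such a pref list checkable is a small sketch like the following, which scans a profile's prefs.js for the prefs relevant to this area. The pref name below is a hypothetical placeholder; note that prefs.js only records values changed from the built-in default.

  # Minimal sketch: scan a profile's prefs.js for prefs relevant to this area.
  # PREFS_OF_INTEREST is a placeholder list; substitute the real pref names.
  import re

  PREFS_OF_INTEREST = ["dom.example.feature.enabled"]  # hypothetical pref name

  def read_user_prefs(prefs_path):
      """Return {name: raw_value} for user_pref() lines in prefs.js."""
      prefs = {}
      pattern = re.compile(r'user_pref\("([^"]+)",\s*(.+?)\);')
      with open(prefs_path) as f:
          for line in f:
              m = pattern.match(line.strip())
              if m:
                  prefs[m.group(1)] = m.group(2)
      return prefs

  prefs = read_user_prefs("/path/to/profile/prefs.js")  # adjust the path
  for name in PREFS_OF_INTEREST:
      # Prefs absent from prefs.js are still at their built-in default.
      print("%s = %s" % (name, prefs.get(name, "<default>")))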

Related Features

What other features are either directly related to, or can be affected by, changes made to this feature?

For instance, changes to the JavaScript engine can affect Emscripten and asm.js. WebRTC has dependencies on graphics (OpenH264) and networking.

Test Cases

Define the test cases required to test this feature/area.

Include which tests can and should be automated, which framework is used, and how often they should be executed. Provide links to the repository(ies) for automated tests.

  • Smoke
    • Describe basic smoke tests required to prove minimum acceptance (a skeleton sketch follows this list).
  • Functional
    • List each major functional area to be tested and basic concepts for testing.
  • End-to-end User Stories
    • Describe primary use cases.
  • Exploratory
    • Describe some related areas and user stories that may be useful to explore.
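
A hedged skeleton for the smoke tests mentioned above, using Python's unittest purely as a stand-in for whatever framework the project actually chooses; load_feature() is a hypothetical placeholder for the feature's real setup.

  # Minimal skeleton for a smoke test, using Python's unittest as a
  # stand-in for the project's real framework. load_feature() is a
  # hypothetical placeholder for whatever setup the feature needs.
  import unittest

  def load_feature():
      """Placeholder: stand up the feature under test and return a handle."""
      return {"enabled": True}

  class FeatureSmokeTest(unittest.TestCase):
      def setUp(self):
          self.feature = load_feature()

      def test_feature_loads(self):
          # Minimum acceptance: the feature initializes at all.
          self.assertTrue(self.feature["enabled"])

  if __name__ == "__main__":
      unittest.main()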

Bug Triage

Methodology for bug triage
  • Hold twice-weekly triage sessions for the bug states below, as follows:
    • Mondays from 4pm-5pm Eastern Standard Time (time slot intended for the US West coast)
    • Fridays from 9am-10am Eastern Standard Time (time slot for the US East coast and Europe)
    • In #qa on irc.mozilla.org
Queries (in order of priority)
  • verification of FIXED bugs:
    • We only have time for fixes that are part of a new feature, or a major rework of an existing one. The main task is determining whether the level of automated testing is sufficient in these focus areas.
      • If it is determined that a fix can't have sufficient automation coverage, flip the flags to in-testsuite- and qe-verify?. These must be verified manually, so ensure there are clear STR. (Once verified, flip the flag to qe-verify+; note: this is not part of this triage process.)
      • If sufficient automation exists, flip the flag to in-testsuite+.
      • If automation is possible but insufficient, flip the flags to in-testsuite? and qe-verify?. Manual verification, as above, will be needed if automated tests won't be added in a timely manner.
  • unconfirmed all: see if there are any bugs that need reproducing or need clearer STR
  • unconfirmed general: move bugs into the appropriate sub-component
  • intermittent failures: developers feel this is the least useful task we can be doing, but if time and interest allow, they suggest:
    • 1. Getting an idea of how reproducible the issue is (see the sketch after this list). For example, can you reproduce the failure in 20 test runs locally? Can you reproduce it on the try server? What if you got a loaner slave and ran the tests on it? If the failure happens on Linux, and you have a machine that engineers can log into remotely, capturing an rr <http://rr-project.org/> trace of the failure would be tremendously helpful.
    • 2. When did it start to happen? Has it happened ever since the test was added, or did it start well after the test was originally written? Can you go back on TBPL and retrigger test runs there to try to narrow down when the failure started? (Being able to reproduce locally obviously helps with bisection as well.)
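
A hedged sketch of step 1, assuming the test can be driven from a shell command; the command shown is a placeholder for the real test invocation.

  # Minimal sketch: rerun a test command N times and report the failure
  # rate, to estimate how reproducible an intermittent failure is locally.
  # The command below is a placeholder; substitute the real test invocation.
  import subprocess

  COMMAND = ["./mach", "mochitest", "path/to/test"]  # placeholder command
  RUNS = 20

  failures = 0
  for i in range(RUNS):
      result = subprocess.run(COMMAND, capture_output=True)
      if result.returncode != 0:
          failures += 1
          print("run %d: FAILED" % (i + 1))
  print("%d/%d runs failed" % (failures, RUNS))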

Risks

What are the primary areas of risk for this feature?

For example, Graphics runs the risk of not having a broad enough test bed to provide coverage for edge cases, which may result in unexpected behavior when released to a wider audience.

Reporting and Status

Describe how test results are reported and provide links to any automated test reports or dashboards.
  • List milestones and current progress
  • Include bug queries for tracked bugs (see the sketch below).
  • Sign-off status for each release tested.
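
For instance, tracked bugs could be pulled from Bugzilla's REST API with a query along the lines of the sketch below; the product, component, and whiteboard values are placeholders for whatever this feature actually uses.

  # Minimal sketch: fetch open tracked bugs via Bugzilla's REST API.
  # The product/component/whiteboard values are placeholders; substitute
  # the ones this feature actually uses.
  import json
  import urllib.parse
  import urllib.request

  params = urllib.parse.urlencode({
      "product": "Core",              # placeholder
      "component": "DOM",             # placeholder
      "whiteboard": "[example-tag]",  # placeholder whiteboard tag
      "resolution": "---",            # open bugs only
      "include_fields": "id,summary,status",
  })
  url = "https://bugzilla.mozilla.org/rest/bug?" + params
  with urllib.request.urlopen(url) as resp:
      bugs = json.load(resp)["bugs"]
  for bug in bugs:
      print("%d %s %s" % (bug["id"], bug["status"], bug["summary"]))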