QA/e10s + A11y Test Plan

From MozillaWiki

Overview

Purpose

Quality assurance plan to ensure that accessibility (a11y) features are ready for public release with e10s enabled on the Windows and Linux operating systems.

Note: Despite this wiki page's title, this test plan covers accessibility + e10s in general, not just touchscreen support.

Quality Criteria

Risk area | Requirement | Status
Screen readers | There should be no significant difference in using screen readers compared with non-e10s | TBD
Touch screens | Touch-enabled devices, including soft keyboards, should function as well as in non-e10s (confirm a11y isn't activated for touch-screen users) | TBD
IME clients | Common IME clients should function as well as in non-e10s | TBD
Popular sites (e.g. Gmail, Twitter, Facebook, and other Google apps) | No significant regression in site correctness, stability, or performance with e10s and a11y compared to a11y on non-e10s | TBD
ARIA markup | There should be no significant difference in a11y exposed interfaces for e10s vs non-e10s | TBD
General performance | Overall performance of Firefox with a11y on e10s should not be notably worse than with a11y on non-e10s | TBD

Testing summary

Scope of Testing

In Scope

The scope of our testing is a11y+e10s: the accessibility, functionality, and performance of the most popular sites and the most commonly used third-party tools.

  • Integration: Verify integration with current browser functionality and UI;
  • Functionality: Verify basic and advanced functionality according to the existing requirements;
  • Usability: Assess how intuitive the feature is and how users interact with it;
  • Performance: Reference observed and collected performance data, where applicable.

Out of Scope

We will not test less popular third-party tools or obscure websites.

Requirements for testing

Environments

Testing will be performed on the following OSes:

  • Windows 10 (x64)
  • Linux - Ubuntu 16.04 (x64)

Quality Assurance Strategy

Test Items

Screen Readers

Client | Free/Licensed | Demo Available? | Demo Expiration | Purchased Copy? | Owner/Location
NVDA (Win) (top priority) | free | N/A | N/A | N/A | TBD
Window-Eyes (Win) | licensed | link[1] | 60 days | TBD | TBD
JAWS (Win) | licensed | TBD | TBD | TBD | TBD
Dolphin (Win) | licensed | TBD | TBD | TBD | TBD

[1] requires filling in a contact information form

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
Manual testing | Test cases passed | # (passed - TBD) out of # (total test cases run - TBD) | # (passed - TBD) out of # (total test cases run - TBD) | TBD (date status updated) | SoftVision

Touch Screens

Because a11y should no longer be triggered by touch screen devices, only the first metric below should block release of a11y+e10s.

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
Manual testing | A11y enabled? (should be false) | true or false | true or false | TBD (date status updated) | SoftVision
Manual testing (TestRail or Google Docs) | Test cases passed | # (passed - TBD) out of 26 test cases run | # (passed - TBD) out of 26 test cases run | TBD (date status updated) | SoftVision

IME Clients

These are very difficult to test if one is not familiar with the language under test. We may have to recruit some help from our Asian counterparts.

Language | Primary IME | 3rd party
Japanese | Win: MS-IME, macOS: Apple Japanese IME | Google Japanese Input, ATOK (not free)
Simplified Chinese | Pinyin | ABC
Traditional Chinese | Changjie |
Korean | Win: MS-IME, macOS: Apple Korean IME |

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
Manual testing | Test cases passed | # (passed - TBD) out of # (total test cases run - TBD) | # (passed - TBD) out of # (total test cases run - TBD) | TBD (date status updated) | SoftVision

Popular Sites

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
Manual testing | Test cases passed | # (passed - TBD) out of # (total test cases run - TBD) | # (passed - TBD) out of # (total test cases run - TBD) | TBD (date status updated) | SoftVision

ARIA Markup

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
Manual testing using existing tools/suites | Test cases passed | # (passed - TBD) out of 44 test cases run | # (passed - TBD) out of 44 test cases run | TBD (date status updated) | SoftVision

General Performance

Acceptable regression ranges, if any, need to be determined.

Criteria Description | Metric | non-e10s value | e10s value | Criteria Met? | QA Owner
CPU usage (observed) | Peak/average % CPU | %peak/%average | %peak/%average | TBD (date status updated) | SoftVision
Memory usage (observed) | Peak/average % memory | %peak/%average | %peak/%average | TBD (date status updated) | SoftVision
Telemetry - overall crash rate with a11y enabled | Crashes per 1000 usage hours | # crashes | # crashes | a/b experiment on beta TBD | tracy
Talos - a11yr summary | Page load times | 490.6 ms (Win 7) | 531.6 ms (Win 7) | FAIL - 8.36% regression on Win 7 (20170115) | tracy
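The two derived figures in this table can be sketched as simple calculations. The Talos numbers (490.6 ms non-e10s vs 531.6 ms e10s) are taken from the table above; the crash count and usage hours in the second example are placeholders for illustration, not measured data.

```python
def regression_pct(baseline_ms: float, measured_ms: float) -> float:
    """Percent slowdown of the e10s value relative to the non-e10s baseline."""
    return (measured_ms - baseline_ms) / baseline_ms * 100


def crashes_per_1000_hours(crash_count: int, usage_hours: float) -> float:
    """Normalize a raw crash count to the 'crashes per 1000 usage hours' metric."""
    return crash_count / usage_hours * 1000


# Talos a11yr, Win 7 (20170115): 490.6 ms non-e10s vs 531.6 ms e10s.
print(round(regression_pct(490.6, 531.6), 2))  # 8.36, the FAIL figure above

# Hypothetical crash-rate example: 12 crashes over 4000 usage hours.
print(crashes_per_1000_hours(12, 4000))  # 3.0
```

This confirms how the 8.36% regression figure in the Talos row is obtained from the two page-load times.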

Builds

TBD

Test Execution Schedule

The following table identifies the anticipated testing period available for test execution.

Project phase | Start Date | End Date
Start project | December 2016 | -
Study documentation/specs received from developers | TBD | -
QA - Test plan creation | 12/13/2016 | -
QA - Test cases/Env preparation | 12/12/2016 | -
QA - Nightly Testing | - | -
QA - Aurora Testing | December 2016 | -
QA - Beta Testing | - | -
Release Date | - | -

Testing Tools

The following tools will be used for testing:

Process | Tool
Test plan creation | Mozilla wiki
Test case creation | TestRail
Test case execution | TestRail
Bug management | Bugzilla/GitHub (mainly)

Status

Overview

  • Track the dates and build number where feature was released to Nightly
  • Track the dates and build number where feature was merged to Aurora
  • Track the dates and build number where feature was merged to Release/Beta

References

Testcases

Available on TestRail and in Google Docs format.

To be filed:

  • Live regions test case (bug 1322532)
  • Browser hangs when opening View Source (IME-related, bug 1318900)

Overview

  • Summary of testing scenarios

Test Areas

Test Area | Covered | Details
Private Window | Yes |
Multi-Process Enabled | Yes |
Multi-process Disabled | Yes |
Theme (high contrast) | No |
UI | |
Mouse-only operation | Yes |
Keyboard-only operation | Yes |
Display (HiDPI) | No |
Interaction (scroll, zoom) | Yes |
Usable with a screen reader | Yes | e.g. with NVDA
Usability and/or discoverability testing | Yes | Is this feature user-friendly?
Help/Support | |
Help/support interface required | No | Make sure a link to the support/help page exists and is easily reachable.
Support documents planned (written) | Yes | Make sure support documents are written and correct.
Install/Upgrade | |
Feature upgrades/downgrades data as expected | No |
Does sync work across upgrades | No |
Requires install testing | Yes | Requires NVDA installation
Affects first-run or onboarding | No |
Does this affect partner builds? Partner build testing | No |
Enterprise | | Raise the topic with developers to see whether the feature is expected to work differently on ESR builds
Enterprise administration | No |
Network proxies/autoconfig | No |
ESR behavior changes | No |
Locked preferences | No |
Data Monitoring | |
Temporary or permanent telemetry monitoring | No | Testing was not conducted by the SV QA Eng team.
Telemetry correctness testing | No | Testing was not conducted by the SV QA Eng team.
Server integration testing | No | Testing was not conducted by the SV QA Eng team.
Offline and server failure testing | No |
Load testing | No | Testing was not conducted by the SV QA Eng team.
Add-ons | | If add-ons are available for the feature, or the feature will affect some add-ons, then API testing should be done for those add-ons.
Addon API required? | No |
Comprehensive API testing | No |
Permissions | No |
Testing with existing/popular add-ons | Yes | Ensure no performance/stability regressions
Security | |
3rd-party security review | No |
Privilege escalation testing | No |
Fuzzing | No |
Web Compatibility | Depends on the feature |
Testing against target sites | Yes |
Survey of many sites for compatibility | Yes |
Interoperability | Depends on the feature |
Common protocol/data format with other software: specification available. Interop testing with other common clients or servers. | Yes | NVDA should cover most of this. Other common clients are closed-source, expensive, and do not offer trial versions.
Coordinated testing/interop across the Firefoxes: Desktop, Android, iOS | No |
Interaction of this feature with other browser features | Yes |

Test suite

Bug Work

Sign off

Criteria

Check list

  • All criteria under each section of the Quality Assurance Strategy should be green.
  • All test cases should be executed.
  • All blockers and criticals must be fixed and verified, or have an agreed-upon timeline for being fixed (as determined by Engineering/RelMan/QA).

Results

Aurora testing

  • TBD on TestRail

Merge to Aurora Sign-off
List of OSes that will be covered by testing

  • Link for the tests run - TBD
    • Full Test suite - TBD

Checklist

Exit Criteria | Status | Notes/Details
Testing Prerequisites (specs, use cases) | |
Testing Infrastructure setup | No |
Test Plan Creation | IN PROGRESS |
Test Cases Creation | IN PROGRESS |
Full Functional Tests Execution | |
Smoke Tests Execution | |
Automation Coverage | |
Performance Testing | IN PROGRESS |
All Defects Logged | |
Critical/Blockers Fixed and Verified | |
Daily Status Report (email/etherpad statuses/gdoc with results) | |
Metrics/Telemetry | N/A |
QA Signoff - Nightly Release | | Email to be sent
QA Aurora - Full Testing | |
QA Signoff - Aurora Release | | Email to be sent
QA Beta - Full Testing | |
QA Signoff - Beta Release | | Email to be sent

Ownership

Product contact:
Erin Lancaster (IRC: elan)

User Experience contact:
Not applicable

Engineering contact:
Aaron Klotz (IRC: aklotz) (Windows)
Trevor Saunders (IRC: tbsaunde) (Linux)

QA contact:
Marco Zehe (IRC: MarcoZ)
Tracy Walker (IRC: tracy)

QA:
PM for QA team - Rares Bologa (IRC: RaresB)
QA Lead - Grover Wimberly IV (IRC: Grover-QA)
QA - Kanchan Kumari (IRC: Kanchan_QA)
QA - Justin Williams (IRC: JW_SoftvisionQA)
QA - Stefan Georgiev (IRC: StefanG_QA)
QA - Abe Masresha (IRC: Abe_LV)

Revision History

This section describes the modifications that have been made to this wiki page. A new row is added each time the content of this document is updated (small corrections for typographical errors do not need to be recorded). The description of each modification notes the differences from the prior version: which sections were updated and to what extent.

Date | Version | Author | Description
12/13/2016 | 1.0 | Grover Wimberly IV | Created first draft
12/13/2016 | 1.1 | Kanchan Kumari | Added some more info
01/10/2017 | 1.2 | Tracy Walker | Made Risk/Requirements/Status prominent
01/11/2017 | 1.3 | Grover Wimberly IV | Added details of test suite and test cases; updated status of project