QA/BrowserID/TestPlan

Overview

This test plan covers the general weekly testing that will happen against a built, unit-tested, and deployed BrowserID product in the Beta test environment. The goal is to ensure a defined, consistent level of quality and usability in the server-side and client-side portions of the BrowserID product. Since this is a public-facing product, we need to make sure that the Mozilla community can access, test, and develop around a solid product.

Strategy

Create a consistent and repeatable set of tests for qualifying and releasing the weekly builds and deployments for BrowserID. The focus needs to be on quality, but also on providing accurate feedback on successes and issues to Dev and PM so that the Production releases provide a useful product for the Mozilla and development communities.

  • Perform ad hoc testing to define test paths on the client and server sides.
  • Create useful client-side and server-side test cases that can be imported into Litmus for repeatable manual testing.
  • Create automated client and server tests that can more accurately cover the full functionality of the UI, the API, the main roles (PIA, RP, IP), and the various flows.
  • Emphasize security and privacy issues that might come up across OSes, browsers, accounts, and emails on the various clients tested against the server.
  • Identify and track issues in GitHub and Bugzilla.
  • Provide useful metrics to Dev and PM.

Scope of Testing

On the client side, testing will cover the basic functionality and UI, accounts and emails, interaction with the server, security/privacy, and usability/compatibility across OSes and browsers.

On the server side, testing will start with basic functionality, support for multiple client sites, user security/privacy, and communication with various clients, then move on to information handling and storage, security/privacy of information, information persistence across deployments, and logging.

General Test Information

Links and Documentation

Weekly Test Schedules

  • Thursdays: deployment to Production, Beta (QA), and Dev
  • Thursdays/Fridays: open testing and experimentation by Dev, QA, and community
  • Following week:
    • Monday - Wednesday: QA testing and sign off of current deployment

Weekly Meetings

Weekly meeting notes are kept on the following EtherPad site:

Posting Feedback

Bugs and Open Issues

Ops Bugs in Bugzilla

  • Classification: Other
  • Product: mozilla.org
  • Component: Server Operations: Labs

BrowserID Client/Server Bugs/Issues

Client/Server Test Environment

Beta test environment

  • TBD

Prod test environment

Supported OS and Browsers

Operating Systems

  • PC: Win 7, WinXP, Vista (in priority order)
  • Mac: 10.6, 10.7, 10.5
  • Linux: Fedora 14/15, EL5/6, Ubuntu
  • Mobile: Android, iOS

Browsers

  • Minimal support would have to include a native browser plus Firefox
    • Windows: IE 8/9, FF 4/5, Google Chrome latest
    • Mac: Safari latest, FF 4/5, Google Chrome latest
    • Linux: FF 3/4/5, Google Chrome latest

Logging

There are four primary logs that should be monitored during testing, two for BrowserID and two for nginx (a log-following sketch is given after this list):

  • Nginx logs: /var/log/nginx/
    • access.log
    • error.log
    • archived logs
  • BrowserID (node) logs
    • <install path>/var_browserid/server.log
    • <install path>/var_verifier/server.log
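
A minimal log-following sketch, assuming a POSIX host with the standard tail utility available; the BrowserID install path below is a hypothetical placeholder that must be replaced with the actual <install path>:

  import subprocess

  # Hypothetical placeholder: substitute the real <install path> of the BrowserID deployment.
  INSTALL_PATH = "/path/to/browserid"

  LOG_FILES = [
      "/var/log/nginx/access.log",
      "/var/log/nginx/error.log",
      INSTALL_PATH + "/var_browserid/server.log",
      INSTALL_PATH + "/var_verifier/server.log",
  ]

  # Follow all four logs in one terminal while a test pass is running;
  # tail -F keeps following across rotation into the archived logs.
  subprocess.run(["tail", "-F"] + LOG_FILES)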

Major Test Areas

Code Verification

Verify non-QA-testable fixes/code changes by direct inspection (as required)

Installation/Deployment Verification

QA needs access to the Beta test environment in order to verify the deployment on the server side (PIA/IP) and the client side (RP). It would be helpful for QA to verify the weekly deployment to the server, along with any client-side changes (JavaScript library, JavaScript API). For early trains, this should be covered by the information given in the BrowserID deployment ticket (Bugzilla). This would amount to a delta from the previous train that QA can record as part of their testing.

For subsequent trains, the Beta test environment or a QA environment (deployed with the same build) could be used to verify server deployments and client-side changes.
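
Where the deployment ticket lists the previous and current source revisions, that delta can be captured with a short script. A minimal sketch, assuming a local checkout of the BrowserID repository and two revision IDs taken from the ticket (the values below are placeholders):

  import subprocess

  # Placeholders: take the actual revision IDs from the Bugzilla deployment ticket.
  PREVIOUS_TRAIN = "<previous-revision>"
  CURRENT_TRAIN = "<current-revision>"

  # One-line-per-commit delta between the two trains, recorded as part of QA's test notes.
  delta = subprocess.run(
      ["git", "log", "--oneline", f"{PREVIOUS_TRAIN}..{CURRENT_TRAIN}"],
      capture_output=True, text=True, check=True,
  ).stdout
  print(delta)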

Sanity/Acceptance/Smoke

A small, repeatable set of tests with known good, expected results. Manual and automated testing on the client side to pass a minimal level of acceptance, without which QA testing of BrowserID cannot proceed. This will probably be a very small subset of the basic functional tests or some automated smoke test: TBD
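
A minimal smoke sketch along these lines, assuming the Python requests library and a placeholder base URL for the environment under test:

  import requests

  # Placeholder: point at the environment under test (Beta or Prod) once it is defined.
  BASE_URL = "https://browserid.example.org"

  def smoke():
      # The front page should load and identify itself as BrowserID.
      resp = requests.get(BASE_URL + "/", timeout=10)
      assert resp.status_code == 200, f"unexpected status {resp.status_code}"
      assert "BrowserID" in resp.text, "front page does not mention BrowserID"

  if __name__ == "__main__":
      smoke()
      print("smoke test passed")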

Bug Verification

Manual testing of bugs/issues resolved for this weekly cycle of testing. Test cases generated during this testing can be moved to an automation tool for bug regression (see below).

Basic Functional

Manual and automated testing on the client and the server to verify basic functionality of BrowserID:

  • Creating an account first
  • Creating an account inline (at first use)
  • Email verification for new accounts and emails
    • It would be useful to have email accounts on various popular email servers/services
    • Need a method to verify this by direct inspection on the server: comparing the token in the email link to the information on the server (or wherever it is temporarily stored)
  • Creating multiple accounts
  • Deleting one or more accounts (cancellation)
  • Adding additional emails
  • Deleting one or more emails (maintaining the account though)
  • Forgotten account information - mail, passwords
  • Leaving/returning to sites
  • Browser restart after creation of identity
  • Always logging out from sites vs. never logging out from sites
  • Email and Password character compatibility
  • Valid vs. invalid email formats (see the test-data sketch after this list)
  • Different accounts using same email/password combos
  • Shared access to same computer or profiles or accounts
  • Browser settings and preferences, esp. pop-ups, security, privacy
  • Firefox about:config settings, and similar settings, if any, in other browsers
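
For the email-format item above, a small test-data sketch; the valid/invalid split is illustrative only, and is_plausible_email is a stand-in for whatever validation the BrowserID dialog actually performs:

  import re

  # Illustrative test data; the authoritative validation lives in the BrowserID dialog itself.
  VALID = ["user@example.com", "first.last@example.co.uk", "user+tag@example.org"]
  INVALID = ["user@", "@example.com", "user example.com", "user@@example.com", ""]

  def is_plausible_email(address):
      # Stand-in check used to drive the manual/automated cases; not the product's validator.
      return re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", address) is not None

  for address in VALID:
      assert is_plausible_email(address), f"expected valid: {address}"
  for address in INVALID:
      assert not is_plausible_email(address), f"expected invalid: {address}"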


UI

Manual and automated testing on the client and the server to cover all aspects of the current UI (yes, this is a bit of overkill for a test plan...)

Bug Regression

Manual and automated testing of Verified/Closed bugs and issues from previous cycles of testing. Ideally, this would be in the form of a quick automated test/script to check key fixes/changes/updates. QA will work with Dev and PM to come up with a list of bug/issue candidates to form the test cases for manual and automated testing.
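
A possible shape for such a script, assuming pytest; the bug identifiers below are placeholders to be replaced by the agreed candidate list:

  import pytest

  # Placeholders: replace with the bug/issue candidates agreed with Dev and PM.
  CANDIDATE_BUGS = ["bugzilla-000000", "github-000"]

  @pytest.mark.parametrize("bug_id", CANDIDATE_BUGS)
  def test_bug_regression(bug_id):
      # Each candidate gets a concrete check (UI step, API call, or log inspection);
      # until that check is written, the case is recorded and skipped rather than missing.
      pytest.skip(f"regression check for {bug_id} not yet automated")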

OS/Browser integration

Manual testing/verification of the required OS and browser configurations:

  • Verify access and use across browsers on the same OS and on different OSes
  • Verify access and use across OS platforms with the same browser and with different browsers
  • This includes account creation/update/deletion and email addition/deletion
  • OS/Browser-specific local storage verification
  • Browser preferences, esp for privacy/security
  • Browser synchronization - same platform, across platforms

Mobile Testing

The milestone is completed, so some mobile-specific tests should be added: TBD

  • Sanity/Acceptance
  • Functional/UI
  • OS/Browser

Client-side Testing

QA will need the local storage information per browser in order to verify accurate creation/update/use/removal of BrowserID data.

  • Possible Firefox example:
    • Users > USER-NAME > Library > Application Support > Firefox > Profiles > PROFILE.default
  • Where is the local storage for BrowserID? (see the inspection sketch after this list)
    • Is it just localstore.rdf, or a whole set of files per profile?
  • Local storage verification
    • Per OS, per browser, per user/account
    • Is this also further defined by profile?
      • OS > browser > profile > account/user
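
One way to start answering the local-storage question for Firefox, offered as a sketch under an assumption that should be confirmed on the profiles under test: in the Firefox 4/5 era, DOM localStorage was kept in webappsstore.sqlite inside the profile directory (not in localstore.rdf, which holds window/toolbar state). The profile path below is a placeholder:

  import sqlite3

  # Placeholder: the profile under test (see the Mac example above, or the equivalent
  # location on Windows/Linux).
  PROFILE_DIR = "/path/to/PROFILE.default"

  # Assumption to confirm: localStorage rows live in table webappsstore2 with a
  # reversed-host "scope" column (browserid.org appears as gro.diresworb.).
  conn = sqlite3.connect(PROFILE_DIR + "/webappsstore.sqlite")
  rows = conn.execute(
      "SELECT scope, key, value FROM webappsstore2 WHERE scope LIKE ?",
      ("%diresworb%",),
  ).fetchall()
  for scope, key, value in rows:
      print(scope, key, value)
  conn.close()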

Server-side Testing

Need the list of servers, services, and components that make up the BrowserID architecture in order to accurately focus server-side testing.


Areas Not Covered or In Development

Server and Client Automation

Currently, QA has no automation configured for the weekly BrowserID testing.

  • Client-side automation using Selenium is being investigated (see the sketch after this list)
  • TBD: Server-side automation, primarily to cover the API
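
A minimal shape for the Selenium investigation, assuming the Python Selenium bindings and a locally installed Firefox; the RP URL and the sign-in element locator are hypothetical placeholders that depend on the test RP being used:

  from selenium import webdriver
  from selenium.webdriver.common.by import By

  # Placeholders: the RP under test and the locator of its BrowserID sign-in control.
  RP_URL = "https://example-rp.example.org"
  SIGN_IN_LOCATOR = (By.ID, "browserid-sign-in")

  driver = webdriver.Firefox()
  try:
      driver.get(RP_URL)
      # Clicking the RP's sign-in control should open the BrowserID dialog/popup.
      driver.find_element(*SIGN_IN_LOCATOR).click()
      # From here, switch to the popup window and drive the account/email flows.
      print("open windows:", driver.window_handles)
  finally:
      driver.quit()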

Security: VEP, VES, VEC

QA could use some help from Dev and PM on testable use cases.

Flows/End-To-End

Verifying each step in the various flows and also the complete End-To-End flow. This testing will need to be done manually, if possible, for a few weekly trains to get the required tests and tools scoped out. Then, this can be moved over to automated E2E testing (alongside the automated smoke testing). This seems like a better fit for server-side automation, where the appropriate actions and verifications could be tested per flow. Important flows to test/validate:

  • Certificate Provisioning: 8 steps to verify full certificate provisioning path
  • Assertion Generation: 4 steps to verify assertion generation path
  • Assertion Verification: 6 steps to verify assertion verification path

QA could use some help from PM and Dev here to better define use cases for each step of these flows: provide ideas/cases for testing each of the steps, plus the underlying assumptions and requirements. If applicable, we should design both client-side and server-side tests. QA could also leverage the design/purpose/coverage of the unit tests. REF: http://lloyd.io/how-browserid-works
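
For the assertion verification flow, a server-side sketch, assuming the remote verification interface described in the reference above (an HTTP POST of the assertion and audience to the verifier, answered with a JSON status) and the Python requests library; the verifier URL, assertion, and audience below are placeholders to be taken from a real sign-in:

  import requests

  # Placeholders: verifier endpoint of the environment under test, an assertion captured
  # from a real client sign-in, and the RP audience it was generated for.
  VERIFIER_URL = "https://browserid.example.org/verify"
  ASSERTION = "<captured-assertion>"
  AUDIENCE = "https://example-rp.example.org"

  # Assumption (per the REF above): the verifier accepts a POST with the assertion and
  # audience and returns JSON including a status field.
  resp = requests.post(
      VERIFIER_URL,
      data={"assertion": ASSERTION, "audience": AUDIENCE},
      timeout=10,
  )
  result = resp.json()
  assert result.get("status") == "okay", f"verification failed: {result}"
  print("verified email:", result.get("email"))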

Firefox Add-on

For Firefox users with the add-on, the add-on flow needs to be tested. Test design: TBD

Internationalization, Localization

  • Localization (L10n) requirements TBD
  • Internationalization (i18n) requirements TBD

Setting Up Test Emails

Email-side (PIA)

  • This is the Primary Identity Authority (PIA) for all the transactions.
  • Examples for testing: Mozilla, Yahoo, Google, ISPs
  • Also available for testing are sites like Mailinator

Using Internal Email Accounts

  • If secure enough to do so: <mozilla name>+tag@mozilla.com (see the alias sketch after this list)
  • Otherwise: Create emails at www.mailinator.com
  • Do we have other tested (re: reliable) methods for creating test/dummy email accounts?
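
A small sketch for generating unique, disposable test addresses from either source; the base account names below are placeholders:

  import uuid

  def plus_tag_address(base="qa.tester", domain="mozilla.com"):
      # Placeholder base account: a <mozilla name>+tag@mozilla.com style alias, unique per run.
      return f"{base}+{uuid.uuid4().hex[:8]}@{domain}"

  def mailinator_address(prefix="browserid-qa"):
      # Disposable inbox, readable afterwards at www.mailinator.com.
      return f"{prefix}-{uuid.uuid4().hex[:8]}@mailinator.com"

  print(plus_tag_address())
  print(mailinator_address())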


Open Questions/Issues for QA, Dev, and PM

  1. How do we verify this statement: "Unlike other sign-in systems, BrowserID does not leak information back to any server (not even to the BrowserID servers) about which sites a user visits." Could use some help from Dev and PM on testable use cases.
  2. How does this site/technology fit in? Or is it not applicable to our weekly testing? http://people.mozilla.com/~faaborg/files/projects/firefoxAccount/index.html
  3. Is there any benefit to testing/comparing BrowserID with OpenID or any other ID mechanism?
  4. Where is local storage for BrowserID? What data is stored per browser/profile/account/user?
  5. What is the best way to approach security and privacy testing? Through the UI? Through the API? Or by setting up some insecure environment or hackable instance?
  6. Is it possible to have multiple accounts referencing the same email(s)?
  7. What about multiple accounts per user (for work/professional/public emails vs. private/personal emails)?
  8. Do we need to test/verify deviations from the standard VEC?