QA/Execution/Web Testing/roles/buildmaster

From MozillaWiki

Introduction

Create a set of guidelines for investigating failures in our public Jenkins instance. This would be used by whoever is monitoring for failures, but would also be valuable for community members who want to dive in. It should include how to identify failures, how to determine if they're already known, how to replicate them locally, how to determine if they're application bugs or test bugs, where to raise them, who to notify, and even how to fix them (if they're test failures) and submit pull requests. This could form part of a boot camp similar to those of other teams. -- from our 2015, Q2 goals brainstorm.

Open Questions

  • In the interests of reducing complexity, do we need tiers? Are we anticipating that there will be so many failures that some need prioritizing over others?
  • On the topic of sending daily emails - this would create extra work and likely be considered noise by most recipients. It could instead be done via a whiteboard entry and Bugzilla whines.
  • On the topic of checking builds once a day - There are several methods for doing this: view the web dashboard, subscribe to RSS feeds, read e-mail alerts, or watch IRC notifications. We could consider others using Jenkins plugins.
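The daily build check mentioned above can also be scripted. As a hedged sketch: Jenkins exposes a JSON API (GET <jenkins-url>/api/json) where each job reports a "color" field ("blue" means passing, "red" means failing, and an "_anime" suffix means a build is in progress). The job names below are made up for illustration.

```python
# Sketch: find failing jobs from a Jenkins JSON API response.
# "color" values: "blue" = passing, "red" = failing,
# "red_anime" = failing and currently rebuilding.
import json

def failing_jobs(api_payload):
    """Return the names of jobs whose last build failed."""
    jobs = json.loads(api_payload).get("jobs", [])
    return [j["name"] for j in jobs if j.get("color", "").startswith("red")]

# Example payload shaped like a Jenkins /api/json response
# (job names are hypothetical):
sample = json.dumps({"jobs": [
    {"name": "amo.prod", "color": "blue"},
    {"name": "sumo.stage", "color": "red"},
    {"name": "socorro.prod", "color": "red_anime"},
]})
print(failing_jobs(sample))  # ['sumo.stage', 'socorro.prod']
```

In practice you would fetch the payload from the public Jenkins instance and run something like this from a cron job or Jenkins plugin instead of polling the dashboard by hand.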

Rotation

The Web QA Buildmaster Rotation page contains the past and upcoming schedule.

These entries are in reverse chronological order.

  • 2016-05-05 - 2016-05-19 - stephend
  • 2016-04-21 - 2016-05-05 - mbrandt
  • 2016-04-07 - 2016-04-21 - davehunt
  • 2016-03-24 - 2016-04-07 - rbillings
  • 2016-03-10 - 2016-03-24 - krupa

Definition

  • The buildmaster role lasts 2 weeks.
  • Edit the Jenkins description to list your name as buildmaster.
  • Check builds at least once per day (there are several methods for doing this: view the web dashboard, subscribe to RSS feeds, read e-mail alerts, or watch IRC notifications; we could consider others using Jenkins plugins).
    • Investigate failures.
    • If it is a locator issue, if you have a question, or if you wonder whether the test is still valid or important, file a GitHub issue.
    • File bugs on projects based on the information below, then contact the noted team members.
  • File GitHub issues for test failures that require a test update.
    • Label the issue "test failure" and assign a priority.
    • Xfail the failing test to get the build green.
  • The buildmaster is the point of contact for open issues/bugs.
  • The role includes filing bugs/issues, sending out emails, and investigating issues.
  • It does NOT include escalation paths, prioritizing fixes, or following up with other teams.
  • Send a daily email listing the prioritized GitHub issues to be fixed and any blocking bugs that were filed.
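The "xfail" step above refers to marking a known-failing test as expected-to-fail so the build goes green while a fix is pending. A minimal sketch, assuming the suite runs under pytest (as the Web QA Selenium suites did); the test name and issue URL are placeholders:

```python
# Sketch: mark a known-failing test as "expected to fail" so Jenkins
# reports the build green while the fix is pending. The issue URL is
# a hypothetical placeholder pointing at the tracking GitHub issue.
import pytest

@pytest.mark.xfail(reason="https://github.com/example/repo/issues/123")
def test_sign_in_button_is_visible():
    assert False  # stands in for the currently failing assertion
```

When the underlying issue is fixed, pytest reports the test as XPASS, which is the cue to remove the marker.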

Known Issues

For the latest known issues check this etherpad. If you are the current buildmaster please try to keep this pad updated. It helps when it comes time to hand the role to the next buildmaster, and can avoid duplicated effort investigating failures.

Support Tiers

Tier 1

  • Marketplace
  • AMO
  • Mozilla.org

Tier 2

  • SUMO
  • Socorro

Tier 3

  • BIDPOM
  • Moztrap
  • One and Done

Tier 4 (Unsupported)

  • Affiliates
  • Mozillians

Projects

amo

bidpom

  • Trending towards low priority.
  • On failure: contact John Morrison [jrgm] if infrastructure related (timeouts, buttons not loading, etc.); bob or davehunt are the ones to fix test issues.
  • IRC: #mozwebqa
  • For a known bug, file a bug and also needinfo him, especially if you know who checked in the change that made it fail.

Hello (Loop)

Marketplace

mozilla.org

mozillians.org

MozTrap

mozwebqa dashboard

One and Done

Socorro

Sumo

QMO

FAQ

Which bugs are currently open that correspond to known test failures?

The buildmaster maintains an etherpad which lists bugs that currently impact jobs.

Who do I contact if the issue is related to Persona?

If you trace an issue to Persona (the sign-on service) you should contact :jrgm in #persona. You can also raise issues in the project's GitHub repository.

Who do I contact if the issue is related to Firefox Accounts?

If you trace an issue to Firefox Accounts you should contact one of the following in #fxa: Shane Tomlinson (stomlinson), Zachary Carter (zaach), or Vladislav Filippov (vladikoff).

Why is the failure only happening on Sauce Labs?

It could be that the failure is only presenting itself on specific browser window sizes. Sauce Labs uses virtual machines with screen resolutions that may differ from our internal Selenium Grid. You could try specifying a browser window size to make the results consistent, or at least consider how the size of the browser might affect the tests that are failing.