TestEngineering/Performance

= Where to find us =
* [https://chat.mozilla.org/#/room/#perftest:mozilla.org #perftest]
= Team purpose =
To support the infrastructure and creation of automated tests for evaluating the performance of Firefox products. This provides value by exposing gaps in coverage, revealing areas where we can make performance gains, identifying performance regressions in a timely manner, and by providing key performance metrics that assist in determining how Firefox measures against release criteria.


= What we do =
We provide automated testing support for measuring the performance of Firefox products. Here are a few examples of what you can expect our team to be working on.
* Identifying gaps in performance test infrastructure and monitoring.
* Providing advice and troubleshooting related to performance testing or any of our tools and harnesses.
* Designing and building performance test infrastructure and monitoring solutions.
* Developing test plans for automated performance testing.
* Supporting Firefox engineers in writing performance tests.
* Prototyping, building, and maintaining test harnesses for performance testing.
* Supporting Firefox engineers in investigating regressions identified by tests.
* Collaborating with release operations on infrastructure requirements.
* Standing up performance tests in continuous integration environments.
* Monitoring performance test results and identifying potential regressions.
* Supporting performance sheriffs with tools to assist in identifying regressions.
* Running ad hoc manual or partially automated performance testing.


= What we don't do =
* Own all performance tests. We work on the test harnesses and tools used for performance testing, but the tests themselves are often developed outside our team. Every test should have an owner, who is responsible for responding to questions related to the test and may be asked to assist when the test detects a regression.
* Review all performance tests. As with test ownership, we enable others to contribute performance tests. We can provide advice and reviews, but we do not impose this as a requirement for landing test changes.
* Write or maintain all performance tests.
* Maintain the infrastructure hardware the tests run on.
* Maintain the continuous integration pipeline.
* Maintain the reporting tools.


= Meetings =