== General Development Workflow ==
A performance automation metric consists of a type of measurement applied to a series of test cases.
Depending on the metric, the test cases may be one basic scenario applied to a number of apps (such as launch latency testing) or a number of distinct test case designs (such as memory or power consumption testing).
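As a rough illustration of this structure (a sketch only; the app names and placeholder functions below are invented, not part of any real harness), a launch-latency style metric applies one measurement across several apps, while a memory or power style metric collects a set of distinct test case designs:

<pre>
# Illustrative sketch only: a metric pairs one kind of measurement with a
# series of test cases. None of these names come from a real harness.

APPS = ["Clock", "Settings", "Gallery"]

def launch_latency_ms(app):
    # Placeholder; a real test would launch the app on a device and time it.
    return 0.0

# One basic scenario applied to a number of apps (launch latency style):
launch_metric = {app: launch_latency_ms(app) for app in APPS}

# A number of distinct test case designs (memory/power style), where each
# case has its own procedure rather than just a different target app:
memory_cases = {
    "idle_after_boot": lambda: 0.0,
    "gallery_with_reference_workload": lambda: 0.0,
}
memory_metric = {name: run() for name, run in memory_cases.items()}
</pre>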
; Design : Detailed description of the test case's setup requirements, test procedure, execution details, and results to deliver. This might cover a common scenario, or might be written per-case.
; Validity Review : Review by test stakeholders of whether the test case design meets requirements and will measure the right thing. Tests that measure the wrong thing can still give repeatable results, so it is important to review the initial design.
; Workload : Data (a fixture) to be preloaded into the system under test prior to execution. There are default workloads described at [https://developer.mozilla.org/en-US/Firefox_OS/Developing_Gaia/make_options_reference#Reference_Workloads MDN], but some tests may need special-case workloads to be developed.
; Instrumentation : Code either injected into or permanently added to Firefox OS in order to let automation monitor performance details. Some tests do not need this.
; On-Demand Test : Test that a developer can run from the command line, on the try server, or in some other way against an arbitrary set of code.
; Results Review : Review by test stakeholders of a set of results from the implemented test. This is a sanity check of whether the results look valid (a reported frame rate of 70 fps on a 60 Hz screen would fail this, for example) and whether they are essentially repeatable (reasonably grouped, assuming no code changes); see the sketch after this list.
; Published Results : Automated results published to a dashboard or similar display. This usually also implies having a triggering mechanism (nightly, on-commit, etc.) to generate them.
; Documentation : Finalized documentation on wiki and/or MDN of the test case and its usage.
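As a minimal sketch of the kind of check a results review performs, assuming the results are plain numeric samples from repeated identical runs (the function, bounds, and threshold below are illustrative, not any real harness's API):

<pre>
# Minimal sketch of a results-review sanity check (illustrative only).
import statistics

def review_results(samples, lower_bound, upper_bound, max_rel_spread=0.10):
    """Return a list of problems; an empty list means the batch looks
    plausible and reasonably grouped."""
    problems = []
    # Validity: values outside the physically possible range are suspect,
    # e.g. a frame rate above a 60 Hz panel's refresh rate.
    bad = [s for s in samples if not (lower_bound <= s <= upper_bound)]
    if bad:
        problems.append("implausible values: %r" % bad)
    # Repeatability: with no code changes between runs, the relative
    # spread (stdev / mean) should stay small.
    mean = statistics.mean(samples)
    spread = statistics.stdev(samples) / mean if mean else float("inf")
    if spread > max_rel_spread:
        problems.append("poorly grouped: relative spread %.2f" % spread)
    return problems

# Example: frame-rate samples on a 60 Hz screen; 70 fps should be flagged.
print(review_results([58.9, 59.2, 70.0, 59.1], 0, 60))
</pre>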
Below are two graphs illustrating the general workflow.
The first is the general order of operations for developing a performance test case, expressed in terms of the deliverables above. It is split into three milestones. Each milestone results in a final deliverable and should be completed as a whole within a single development effort; however, milestones can be pipelined or frozen on a milestone-by-milestone basis.
The second is a basic bug dependency tree for a performance test case, assuming one bug per deliverable (including reviews) and that all deliverables are necessary.
[[File:Fxos-perf-auto-workflow.png|framed|left|FxOS Performance Automation Development Workflow]]
<small>[http://wiki.mozilla.org/FirefoxOS/Performance/Automation/Metric_Template template]</small>