Game Benchmark Automation

== What? ==

The Browser Benchmark Automation project will run automated performance tests on Firefox's four release channels (Nightly, Aurora, Beta, and Release) and Chrome's three release channels (Dev, Beta, and Release, plus possibly Canary) on platforms representative of typical Firefox users. Our first benchmark will run [https://github.com/padenot/webaudio-benchmark padenot's webaudio benchmark] in Firefox Nightly and Chrome Canary on Windows 7. Later, other benchmarks may be selected from the [[Web Browser Grand Prix|list of benchmarks]] used by Tom's Hardware Guide. This project sometimes goes by the name "game benchmark automation" or "browser benchmark automation".


== Why? ==

Mozilla wants to track Firefox's performance improvements and regressions and do competitive analysis with other browsers, like Chrome or IE.

== Who? ==

* Dan Minor <dminor>, A-Team engineer developing the benchmark automation framework
* Chris Peterson <cpeterson>, Engineering program manager
* Kyle Lahnakoski <klahnakoski>, Statistics and Visualization Engineer (for benchmark reports)
* Aaron Train <atrain>, QA Engineer
* Kamil Jozwiak <kjozwiak>, QA Engineer
* Alan Kligman <akligman>, Platform Engineer who designed the original automation framework

== When? ==


Alan started this project in 2013 Q4 to run game-related benchmarks. Around the same time, Chris was organizing a SpiderMonkey team project to profile and optimize the benchmarks used by Tom's Hardware Guide's Web Browser Grand Prix. These two efforts merged in 2014 Q1. Development picked up speed in 2014 Q2 when Dan from the A-Team joined the effort.


== Where? ==
* Weekly status meeting: Thursday at 10:30 AM PT (17:30 UTC) in the "Games" Vidyo room
* Meeting notes: https://etherpad.mozilla.org/Games-Performance-Testing
* A-Team's mozbench project page: https://wiki.mozilla.org/Auto-tools/Projects/Mozbench
* IRC: '''#games''' channel on irc.mozilla.org
* Test results are on [https://datazilla.mozilla.org/?product=chrome&repository=canary&test=webaudio-benchmark&page=Simple%20gain%20test&compare_product=chrome&compare_repository=canary&project=mozbench Datazilla]

The benchmark test machines are located in Mozilla's Toronto office, where Aaron and Kamil (or Alan) can fix machine issues in person that Dan may be unable to do remotely.

== Bugs ==
* See meta {{bug|1013650}}


<bugzilla>
{
    "blocks": "1013650",
    "resolution": "---",
    "include_fields": "id, summary, whiteboard, keywords, assigned_to, status"
}
</bugzilla>
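
For reference, the same bug list can be fetched outside the wiki through Bugzilla's REST API. This is a minimal Python sketch mirroring the query above; it assumes the standard bugzilla.mozilla.org REST endpoint and the third-party requests package.

<pre>
# Fetch the open bugs that block meta bug 1013650, mirroring the
# <bugzilla> wiki query above. Requires the third-party "requests" package.
import requests

params = {
    "blocks": "1013650",
    "resolution": "---",  # "---" selects bugs with no resolution, i.e. still open
    "include_fields": "id,summary,whiteboard,keywords,assigned_to,status",
}

response = requests.get("https://bugzilla.mozilla.org/rest/bug", params=params)
response.raise_for_status()

for bug in response.json()["bugs"]:
    print(bug["id"], bug["status"], bug["summary"])
</pre>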
