Buildbot/Talos/Tests

IsASCII and IsUTF8 gtest microbenchmarks

  • contact: :hsivonen
  • source: https://dxr.mozilla.org/mozilla-central/source/xpcom/tests/gtest/TestStrings.cpp
  • type: Microbench
  • reporting: intervals in ms (lower is better)
  • data: each test is run and measured 5 times
  • summarization: take the median of the 5 data points; source: MozGTestBench.cpp

Tests whose names start with PerfIsASCII test the performance of the XPCOM string IsASCII function with ASCII inputs of different lengths.
Tests whose names start with PerfIsUTF8 test the performance of the XPCOM string IsUTF8 function with ASCII inputs of different lengths.

Possible regression causes

  • The --enable-rust-simd build option accidentally getting turned off in automation.
  • Changes to encoding_rs internals.
  • LLVM optimizations regressing between updates to the copy of LLVM included in the Rust compiler.


Talos Tests

Where to get this information

  • Talos tests are defined in test.py
  • Treeherder abbreviations are defined in buildbot.py
  • Talos suites are configured for production in config.py; these names are mapped to Treeherder via regexes: buildbot.py

Talos Test Types

There are two different species of Talos tests:

  • #Startup Tests : start up the browser and wait for either the load event or the paint event and exit, measuring the time
  • #Page Load Tests : load a manifest of pages

Startup Tests

Startup tests launch Firefox and measure the time to the onload or paint event. We run this in a series of cycles (20 by default) to generate a full set of data. Tests that are currently startup tests are:

Page Load Tests

Many of the talos tests use the page loader to load a manifest of pages. These are tests that load a specific page and measure the time it takes to load the page, scroll the page, draw the page etc. In order to run a page load test, you need a manifest of pages to run. The manifest is simply a list of URLs of pages to load, separated by carriage returns, e.g.:

https://www.mozilla.org
https://www.mozilla.com

Example: svgx.manifest

Manifests may also specify that a test computes its own data by prepending a % in front of the line:

% https://www.mozilla.org
% https://www.mozilla.com

Example: v8.manifest
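
For illustration, here is a minimal sketch of how such a manifest could be read, assuming only the format described above (one URL per line, optional leading % for pages that report their own data). The function name is hypothetical and this is not the actual talos parsing code.

# Hypothetical sketch: parse a pageloader manifest as described above.
# A leading '%' marks a page that computes and reports its own data.
def parse_manifest(path):
    entries = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            reports_own_data = line.startswith('%')
            url = line.lstrip('%').strip()
            entries.append({'url': url, 'reports_own_data': reports_own_data})
    return entries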

The file you created should be referenced in your test config inside of test.py. For example, open test.py and look for the lines referring to the test you want to run:

tpmanifest = '${talos}/page_load_test/svgx/svgx.manifest'
tpcycles = 1 # run a single cycle
tppagecycles = 25 # load each page 25 times before moving onto the next page

Paint Tests

Paint tests measure the time to receive both the MozAfterPaint and OnLoad events instead of just the OnLoad event. Most tests now look for this unless they are an ASAP test or an internal benchmark.

ASAP Tests

We have a variety of tests which we now run in ASAP mode, where we render as fast as possible by disabling vsync and letting rendering iterate as fast as it can using requestAnimationFrame(). Some of the original tests have been replaced with 'x' versions that measure in this mode.

ASAP Tests are:

Internal Benchmarks

Many tests have internal benchmarks which we report as accurately as possible. These are the exceptions to the general rule of calculating the suite score as a geometric mean of the subtest values (which are median values of the raw data from the subtests).
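
As a rough illustration of that general rule (a sketch only, not the actual talos filter code; the function names are hypothetical and each subtest is assumed to be a list of raw replicates):

import math

def median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2.0

def suite_score(subtest_replicates):
    # One median per subtest, then the geometric mean of those medians.
    medians = [median(reps) for reps in subtest_replicates.values()]
    return math.exp(sum(math.log(m) for m in medians) / len(medians))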

Tests which are imported benchmarks are:

Row Major vs. Column Major

To get more stable numbers, tests are run multiple times. There are two ways that we do this: row major and column major. Row major means each test is run multiple times and then we move to the next test (and run it multiple times). Column major means that each test is run once one after the other and then the whole sequence of tests is run again.

More background information about these approaches can be found in Joel Maher's Reducing the Noise in Talos blog post.
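
A minimal sketch of the two orderings described above (illustrative only; the test and cycle names are hypothetical):

# Row major: finish all cycles of one test before moving to the next test.
def run_row_major(tests, cycles, run_test):
    for test in tests:
        for cycle in range(cycles):
            run_test(test, cycle)

# Column major: run each test once, then repeat the whole sequence.
def run_column_major(tests, cycles, run_test):
    for cycle in range(cycles):
        for test in tests:
            run_test(test, cycle)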

tp5n pages set

Some tests make use of a set of 50 "real world" pages, known as the tp5n set. These pages are not part of the talos repository, but without them the tests which use them won't run.

  • To add these pages to your local setup, download tp5n.zip, and extract it such that `tp5n` ends up as `testing/talos/talos/tests/tp5n`.
  • see also tp5 test.

Test Descriptions

Basic Compositor Video

  • contact: :ethlin, :avih
  • source: video
  • type: PageLoader
  • data: 12 cycles of the entire benchmark, each subtest will have 12 data points (see below)
  • summarization:
  • Lower is better
  • unit: ms/frame
Talos test name: basic_compositor_video
Example data
;0;240p.120fps.mp4_scale_fullscreen_startup;11.112;11.071;11.196;11.157;11.195;11.240;11.196;11.155;11.237;11.074;11.154;11.282
;1;240p.120fps.mp4_scale_fullscreen_inclip;10.995;11.114;11.052;10.991;10.876;11.115;10.995;10.991;10.997;10.994;10.992;10.993
;2;240p.120fps.mp4_scale_1_startup;1.686;1.690;1.694;1.683;1.689;1.692;1.686;1.692;1.689;1.704;1.684;1.686
;3;240p.120fps.mp4_scale_1_inclip;1.666;1.666;1.666;1.668;1.667;1.669;1.667;1.668;1.668;1.667;1.667;1.669
;4;240p.120fps.mp4_scale_1.1_startup;1.677;1.672;1.673;1.677;1.673;1.677;1.672;1.677;1.677;1.671;1.676;1.679
;5;240p.120fps.mp4_scale_1.1_inclip;1.667;1.668;1.666;1.667;1.667;1.668;1.667;1.667;1.667;1.667;1.668;1.668
;6;240p.120fps.mp4_scale_2_startup;1.927;1.908;1.947;1.946;1.902;1.932;1.916;1.936;1.921;1.896;1.908;1.894
;7;240p.120fps.mp4_scale_2_inclip;1.911;1.901;1.896;1.917;1.897;1.921;1.907;1.944;1.904;1.912;1.936;1.913
;8;480p.60fps.webm_scale_fullscreen_startup;11.675;11.587;11.539;11.454;11.723;11.410;11.629;11.410;11.454;11.498;11.540;11.540
;9;480p.60fps.webm_scale_fullscreen_inclip;11.304;11.238;11.370;11.300;11.364;11.368;11.237;11.238;11.434;11.238;11.304;11.368
;10;480p.60fps.webm_scale_1_startup;3.386;3.360;3.391;3.376;3.387;3.402;3.371;3.371;3.356;3.383;3.376;3.356
;11;480p.60fps.webm_scale_1_inclip;3.334;3.334;3.329;3.334;3.334;3.334;3.334;3.334;3.334;3.335;3.334;3.334
;12;480p.60fps.webm_scale_1.1_startup;3.363;3.363;3.368;3.356;3.356;3.379;3.364;3.360;3.360;3.356;3.363;3.356
;13;480p.60fps.webm_scale_1.1_inclip;3.329;3.334;3.329;3.334;3.333;3.334;3.334;3.334;3.340;3.335;3.329;3.335
;14;480p.60fps.webm_scale_2_startup;4.960;4.880;4.847;4.959;4.802;4.863;4.824;4.926;4.847;4.785;4.870;4.855
;15;480p.60fps.webm_scale_2_inclip;4.903;4.786;4.892;4.903;4.822;4.832;4.798;4.857;4.808;4.856;4.926;4.741
;16;1080p.60fps.mp4_scale_fullscreen_startup;14.638;14.495;14.496;14.710;14.781;14.853;14.639;14.637;14.707;14.637;14.711;14.636
;17;1080p.60fps.mp4_scale_fullscreen_inclip;13.795;13.798;13.893;13.702;13.799;13.607;13.798;13.705;13.896;13.896;13.896;14.088
;18;1080p.60fps.mp4_scale_1_startup;6.995;6.851;6.930;6.820;6.915;6.805;6.898;6.866;6.852;6.850;6.803;6.851
;19;1080p.60fps.mp4_scale_1_inclip;6.560;6.625;6.713;6.601;6.645;6.496;6.624;6.538;6.539;6.497;6.580;6.558
;20;1080p.60fps.mp4_scale_1.1_startup;7.354;7.230;7.195;7.300;7.266;7.283;7.196;7.249;7.230;7.230;7.212;7.264
;21;1080p.60fps.mp4_scale_1.1_inclip;6.969;7.222;7.018;6.993;7.045;6.970;6.970;6.807;7.118;6.969;6.997;6.972
;22;1080p.60fps.mp4_scale_2_startup;6.963;6.947;6.914;6.929;6.979;7.010;7.010245327102808;6.914;6.961;7.028;7.012;6.914
;23;1080p.60fps.mp4_scale_2_inclip;6.757;6.694;6.672;6.669;6.737;6.831;6.716;6.715;6.832;6.670;6.672;6.759

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.


CanvasMark

Talos test name: tcanvasmark

These tests run the third-party CanvasMark benchmark suite, which measures the browser's ability to render a variety of canvas animations at a smooth framerate as the scenes grow more complex.

Results are a score "based on the length of time the browser was able to maintain the test scene at greater than 30 FPS, multiplied by a weighting for the complexity of each test type".

Example data
1;Asteroids - Vectors;704;672;713;738;796
2;Asteroids - Bitmaps- shapes- text;1105;1139;1002;1122;1050
3;Asteroids - Shapes- shadows- blending;740;731;775;725;732
4;Arena5 - Vectors- shadows- bitmaps- text;841;679;865;942;873
5;Plasma - Maths- canvas shapes;583;583;557;604;632
6;3D Rendering - Maths- polygons- image transforms;340;445;374;425;425
7;Pixel blur - Math- getImageData- putImageData;1409;1408;1415;1508;1335

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

glterrain

  • contact: :avih :jgilbert
  • source: [glterrain]
  • type: PageLoader
  • data: we load the perftest.html page (which generates 4 metrics to track) 25 times, resulting in 4 sets of 25 data points
  • summarization:
Talos test name: glterrain
Description: Measures average frames interval while animating a simple WebGL scene

This test animates a simple WebGL scene (static textured landscape, one moving light source, rotating viewport) and measures the frame throughput (expressed as average interval) over 100 frames. It runs in ASAP mode (vsync off) and measures the same scene 4 times - once for each combination of antialiasing and alpha. It reports the results as 4 values - one for each combination. Lower results are better.

Example data
0;0.WebGL-terrain-alpha-no-AA-no;19.8189;20.57185;20.5069;21.09645;20.40045;20.89025;20.34285;20.8525;20.45845;20.6499;19.94505;20.05285;20.316049;19.46745;19.46135;20.63865;20.4789;19.97015;19.9546;20.40365;20.74385;20.828649;20.78295;20.51685;20.97069
1;1.WebGL-terrain-alpha-no-AA-yes;23.0464;23.5234;23.34595;23.40609;22.54349;22.0554;22.7933;23.00685;23.023649;22.51255;23.25975;23.65819;22.572249;22.9195;22.44325;22.95015;23.3567;23.02089;22.1459;23.04545;23.09235;23.40855;23.3296;23.18849;23.273249
2;2.WebGL-terrain-alpha-yes-AA-no;24.01795;23.889449;24.2683;24.34649;23.0562;24.02275;23.54819;24.1874;23.93545;23.53629;23.305149;23.62459;24.01589;24.06405;24.143449;23.998549;24.08205;24.26989;24.0736;24.2346;24.01145;23.7817;23.90785;24.7118;24.2834
3;3.WebGL-terrain-alpha-yes-AA-yes;25.91375;25.87005;25.64875;25.15615;25.5475;24.497449;24.56385;25.57529;25.54889;26.31559;24.143949;25.09895;24.75049;25.2087;25.52385;25.9017;25.4439;24.3495;25.9269;25.734449;26.4126;25.547449;25.667249;25.679349;25.9565

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

glvideo

  • Lower is better
  • unit: ms/100 ticks
Talos test name: glvideo
Description: WebGL video texture update with 1080p video. Measures mean tick time across 100 ticks.
Example data
0;Mean tick time across 100 ticks: ;54.6916;49.0534;51.21645;51.239650000000005;52.44295

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tp5o_scroll

Talos test name: tp5o_scroll
Description: Measures average frames interval while scrolling the pages of the tp5o set

This test is identical to tscrollx, but it scrolls the 50 pages of the tp5o set (rather than 6 synthetic pages which tscrollx scrolls). For each page, the test waits 500ms after the page load event fires, then iterates 100 scroll steps of 10px each (or until the bottom of the page is reached - whichever comes first), then reports the average frame interval.

Example data
0;163.com/www.163.com/index.html;9.73;8.61;7.37;8.17;7.58;7.29;6.88;7.45;6.91;6.61;8.47;7.12
1;56.com/www.56.com/index.html;10.85;10.24;10.75;10.30;10.23;10.10;10.31;10.06;11.10;10.06;9.56;10.30
2;aljazeera.net/aljazeera.net/portal.html;9.23;7.15;7.50;7.26;7.73;7.05;7.14;7.66;7.23;7.93;7.26;7.18
3;amazon.com/www.amazon.com/Kindle-Wireless-Reader-Wifi-Graphite/dp/B002Y27P3M/507846.html;7.14;6.87;7.18;6.31;6.93;6.71;6.37;7.00;6.59;5.37;7.31;6.13
4;bbc.co.uk/www.bbc.co.uk/news/index.html;7.39;6.33;6.22;7.66;6.67;7.77;6.91;7.74;7.08;6.36;6.03;7.12
5;beatonna.livejournal.com/beatonna.livejournal.com/index.html;5.79;5.79;5.68;5.46;5.55;5.48;5.69;5.83;5.88;5.97;5.93;5.88
6;bild.de/www.bild.de/index.html;8.65;7.63;7.17;6.36;7.44;7.68;8.63;6.71;8.34;7.15;7.82;7.70
7;cgi.ebay.com/cgi.ebay.com/ALL-NEW-KINDLE-3-eBOOK-WIRELESS-READING-DEVICE-W-WIFI-/130496077314@pt=LH_DefaultDomain_0&hash=item1e622c1e02.html;7.12;6.81;7.22;6.98;7.05;5.68;7.15;6.54;7.31;7.18;7.82;7.77
8;chemistry.about.com/chemistry.about.com/index.html;6.76;6.17;6.41;6.88;5.67;5.47;6.83;6.28;6.16;6.81;6.21;6.75
9;chinaz.com/chinaz.com/index.html;10.72;7.99;7.33;7.10;7.85;8.62;8.39;6.72;6.26;6.65;8.14;7.78
10;cnn.com/www.cnn.com/index.html;7.73;6.80;6.08;8.27;9.24;7.81;7.69;7.05;8.17;7.70;7.90;6.81
11;dailymail.co.uk/www.dailymail.co.uk/ushome/index.html;6.37;8.28;7.19;8.00;8.09;7.43;6.90;7.24;7.77;7.29;7.38;6.14
12;dailymotion.com/www.dailymotion.com/us.html;9.53;9.80;9.29;9.03;9.10;8.64;8.62;8.71;8.77;9.81;9.64;8.96
13;digg.com/digg.com/news/story/New_logo_for_Mozilla_Firefox_browser.html;7.72;7.06;7.60;5.67;6.85;7.32;7.80;5.98;8.27;6.68;7.52;8.39
14;ezinearticles.com/ezinearticles.com/index.html@Migraine-Ocular---The-Eye-Migraines&id=4684133.html;7.14;7.11;8.09;7.17;6.87;7.12;7.65;7.74;7.26;7.36;6.91;6.95
15;globo.com/www.globo.com/index.html;6.71;7.91;5.83;7.34;7.75;8.00;7.73;7.85;7.03;6.42;8.43;8.11
16;google.com/www.google.com/search@q=mozilla.html;6.49;6.23;7.96;6.39;7.23;8.19;7.35;7.39;6.94;7.24;7.55;7.62
17;goo.ne.jp/goo.ne.jp/index.html;8.56;7.18;7.15;7.03;6.85;7.62;7.66;6.99;7.84;7.51;7.23;7.18
18;guardian.co.uk/www.guardian.co.uk/index.html;7.32;7.62;8.18;7.62;7.83;8.08;7.60;8.17;8.47;7.54;7.92;8.09
19;homeway.com.cn/www.hexun.com/index.html;10.18;8.75;8.83;8.64;8.98;8.07;7.76;9.29;8.05;7.55;8.91;7.78
20;huffingtonpost.com/www.huffingtonpost.com/index.html;8.38;7.17;7.03;6.83;6.49;6.47;6.69;7.08;6.81;7.29;7.13;7.70
21;ifeng.com/ifeng.com/index.html;12.45;8.65;8.75;7.56;8.26;7.71;8.04;7.45;7.83;7.14;8.38;7.68
22;imdb.com/www.imdb.com/title/tt1099212/index.html;8.53;5.65;6.94;7.18;6.10;7.57;6.26;8.34;8.16;7.29;7.62;8.51
23;imgur.com/imgur.com/gallery/index.html;8.10;7.20;7.50;7.88;7.27;6.97;8.13;7.14;7.59;7.39;8.01;8.82
24;indiatimes.com/www.indiatimes.com/index.html;8.00;6.74;7.37;8.52;7.03;8.45;7.08;8.47;9.26;7.89;7.17;6.74
25;mail.ru/mail.ru/index.html;7.64;9.50;9.47;7.03;6.45;6.24;8.03;6.72;7.18;6.39;6.25;6.25
26;mashable.com/mashable.com/index.html;7.97;8.03;6.10;7.80;7.91;7.26;7.49;7.45;7.60;7.08;7.63;7.36
27;media.photobucket.com/media.photobucket.com/image/funny%20gif/findstuff22/Best%20Images/Funny/funny-gif1.jpg@o=1.html;290.00;195.00;217.00;199.00;204.00;196.00;198.00;206.00;209.00;208.00;192.00;196.00
28;myspace.com/www.myspace.com/albumart.html;14.40;13.45;13.29;13.62;13.42;14.15;13.86;14.34;14.69;14.10;13.82;14.13
29;naver.com/www.naver.com/index.html;9.15;8.31;9.40;9.89;7.29;8.43;8.87;8.77;8.96;8.24;8.16;8.21
30;noimpactman.typepad.com/noimpactman.typepad.com/index.html;7.27;7.14;7.70;7.86;7.43;6.95;7.30;7.58;7.51;7.95;7.43;7.05
31;page.renren.com/page.renren.com/index.html;7.94;8.13;6.76;7.77;6.93;6.60;7.62;7.61;6.88;7.56;7.55;7.48
32;people.com.cn/people.com.cn/index.html;11.92;9.22;8.49;8.55;8.34;8.49;6.91;9.92;8.69;8.63;7.69;9.34
33;rakuten.co.jp/www.rakuten.co.jp/index.html;11.10;7.13;8.68;7.85;8.37;7.91;6.74;8.27;8.55;8.93;7.15;9.02
34;reddit.com/www.reddit.com/index.html;6.38;7.95;6.84;7.04;6.96;7.15;8.05;7.71;8.13;7.13;6.60;7.53
35;reuters.com/www.reuters.com/index.html;7.51;7.25;6.60;6.98;7.41;6.45;7.61;7.46;6.11;7.15;7.05;6.94
36;slideshare.net/www.slideshare.net/jameswillamor/lolcats-in-popular-culture-a-historical-perspective.html;7.20;6.32;6.80;6.87;6.29;6.45;7.18;6.92;6.57;7.41;7.08;6.51
37;sohu.com/www.sohu.com/index.html;11.72;9.64;8.85;7.12;7.96;9.14;7.76;8.19;7.14;7.68;8.08;7.24
38;spiegel.de/www.spiegel.de/index.html;7.24;7.30;6.64;7.01;6.74;6.70;6.36;6.84;7.86;7.08;7.12;7.40
39;stackoverflow.com/stackoverflow.com/questions/184618/what-is-the-best-comment-in-source-code-you-have-ever-encountered.html;7.39;5.88;7.22;6.51;7.12;6.51;6.46;6.53;6.63;6.54;6.48;6.80
40;store.apple.com/store.apple.com/us@mco=Nzc1MjMwNA.html;6.23;7.17;7.39;8.98;7.99;8.03;9.12;8.37;8.56;7.61;8.06;7.55
41;thepiratebay.org/thepiratebay.org/top/201.html;9.08;8.93;8.09;7.49;7.30;7.80;7.54;7.65;7.91;7.53;8.37;8.04
42;tudou.com/www.tudou.com/index.html;10.06;9.38;8.68;7.37;8.57;9.11;8.20;7.91;8.78;9.64;8.11;8.47
43;uol.com.br/www.uol.com.br/index.html;9.04;9.49;9.48;9.31;8.68;8.41;9.16;8.91;9.49;8.37;9.77;8.73
44;w3.org/www.w3.org/standards/webdesign/htmlcss.html;6.62;5.98;6.87;6.47;7.22;6.05;6.42;6.50;7.47;7.18;5.82;7.11
45;wsj.com/online.wsj.com/home-page.html;7.49;8.57;6.84;8.12;7.60;7.24;8.16;8.22;6.81;8.28;8.11;8.58
46;xinhuanet.com/xinhuanet.com/index.html;13.66;9.21;10.09;9.56;8.99;10.29;10.24;8.91;11.23;10.82;9.64;10.11
47;xunlei.com/xunlei.com/index.html;8.99;8.16;8.82;8.37;7.01;8.48;7.98;8.69;8.10;8.10;7.10;6.41
48;yelp.com/www.yelp.com/biz/alexanders-steakhouse-cupertino.html;8.18;7.45;7.01;8.14;7.12;7.82;8.24;7.13;7.00;6.41;6.85;7.31
49;youku.com/www.youku.com/index.html;12.21;10.29;10.37;10.34;8.40;9.82;9.23;9.91;9.64;9.91;8.90;10.23
50;youtube.com/www.youtube.com/music.html;9.90;9.06;9.29;9.17;8.85;8.77;9.83;9.21;9.29;10.09;9.69;8.64

Possible regression causes

Some examples of things that cause regressions in this test are:

  • Increased displayport size (which causes a larger display list to be built)
  • Slowdown in building the display list
  • Slowdown in rasterization of content

tp5

Talos test name: tp5o
Description: tp5 with limited pageset (48 pages as others have too much noise)

Tests the time it takes Firefox to load the tp5 web page test set. The web set was culled from the Alexa top 500 April 8th, 2011 and consists of 51 pages.

Here are the broad steps we use to create the test set:

  1. Take the Alexa top 500 sites list
  2. Remove all sites with questionable or explicit content
  3. Remove duplicate sites (for ex. many Google search front pages)
  4. Manually select to keep interesting pages (such as pages in different locales)
  5. Select a more representative page from any site presenting a simple search/login/etc. page
  6. Deal with Windows 255 char limit for cached pages
  7. Limit test set to top 100 pages

Note that the above steps did not eliminate all outside network access, so we had to take further action to scrub all the pages so that there are 0 outside network accesses (this is done so that the tp test is as deterministic a measurement of our rendering/layout/paint process as possible).

Example data
0;163.com/www.163.com/index.html;1035;512;542;519;505;514;551;513;554;793;487;528;528;498;503;530;527;490;521;535;521;496;498;564;520
1;56.com/www.56.com/index.html;1081;583;580;577;597;580;623;558;572;592;598;580;564;583;596;600;579;580;566;573;566;581;571;600;586
2;aljazeera.net/aljazeera.net/portal.html;688;347;401;382;346;362;347;372;337;345;365;337;380;338;355;360;356;366;367;352;350;366;346;375;382
3;amazon.com/www.amazon.com/Kindle-Wireless-Reader-Wifi-Graphite/dp/B002Y27P3M/507846.html;1392;878;901;852;886;867;877;864;862;877;866;888;3308;870;863;869;873;850;851;850;857;873;869;860;855
4;bbc.co.uk/www.bbc.co.uk/news/index.html;455;271;272;271;279;289;276;285;277;291;281;286;278;286;274;285;276;285;287;286;276;288;279;272;278
5;beatonna.livejournal.com/beatonna.livejournal.com/index.html;290;123;123;129;120;121;124;125;119;125;120;150;121;147;121;121;113;121;119;122;117;112;127;117;139
6;bild.de/www.bild.de/index.html;1314;937;946;931;922;918;920;937;934;930;947;928;936;933;933;928;930;941;951;946;947;938;925;939;938
7;cgi.ebay.com/cgi.ebay.com/ALL-NEW-KINDLE-3-eBOOK-WIRELESS-READING-DEVICE-W-WIFI-/130496077314@pt=LH_DefaultDomain_0&hash=item1e622c1e02.html;495;324;330;328;321;308;315;308;321;313;327;330;317;339;333;322;312;370;336;327;310;312;312;355;330
8;chemistry.about.com/chemistry.about.com/index.html;238;156;156;154;158;161;152;151;152;167;179;152;154;156;161;161;157;167;151;167;154;149;178;153;160
9;chinaz.com/chinaz.com/index.html;347;223;255;234;245;233;264;234;244;228;260;224;258;223;280;220;243;225;251;226;258;232;258;232;247
10;cnn.com/www.cnn.com/index.html;551;384;436;394;391;375;371;407;371;374;398;372;368;388;376;380;386;377;363;383;384;370;388;381;374
11;dailymail.co.uk/www.dailymail.co.uk/ushome/index.html;984;606;551;561;545;542;576;564;543;560;566;557;561;544;545;576;548;539;568;567;557;560;545;544;578
12;dailymotion.com/www.dailymotion.com/us.html;473;271;286;272;285;288;290;290;280;268;286;269;287;275;289;282;293;287;304;261;289;284;281;277;286
13;digg.com/digg.com/news/story/New_logo_for_Mozilla_Firefox_browser.html;410;321;304;303;322;300;319;321;320;306;323;313;312;305;312;338;317;338;301;325;297;302;309;305;300
14;ezinearticles.com/ezinearticles.com/index.html@Migraine-Ocular---The-Eye-Migraines&id=4684133.html;234;177;163;163;186;176;185;175;167;156;162;199;163;190;173;181;175;178;165;159;182;170;183;169;158
15;globo.com/www.globo.com/index.html;684;468;466;485;482;445;433;467;467;450;487;466;440;484;444;451;511;443;429;469;468;430;485;459;447
16;google.com/www.google.com/search@q=mozilla.html;150;100;102;101;97;104;99;116;107;100;98;137;102;102;99;106;98;112;100;102;105;104;107;96;100
17;goo.ne.jp/goo.ne.jp/index.html;328;125;132;132;143;121;122;120;132;145;166;139;144;125;128;152;128;145;130;132;154;126;142;133;139
18;guardian.co.uk/www.guardian.co.uk/index.html;462;311;296;322;309;305;303;288;301;308;301;304;319;309;295;305;294;308;304;322;310;314;302;303;292
19;homeway.com.cn/www.hexun.com/index.html;584;456;396;357;417;374;376;406;363;392;400;378;378;402;390;373;398;393;366;385;383;361;418;386;351
20;huffingtonpost.com/www.huffingtonpost.com/index.html;811;609;575;596;568;585;589;571;568;600;571;588;585;570;574;616;576;564;598;594;589;590;572;572;612
21;ifeng.com/ifeng.com/index.html;829;438;478;462;448;465;469;470;429;463;465;432;482;444;476;453;460;476;461;484;467;510;447;477;443
22;imdb.com/www.imdb.com/title/tt1099212/index.html;476;337;358;332;414;379;344;420;354;363;387;345;358;371;341;385;359;379;353;349;392;349;358;378;347
23;imgur.com/imgur.com/gallery/index.html;419;205;224;231;207;222;206;231;204;219;209;210;210;208;202;215;203;210;203;212;218;219;202;224;230
24;indiatimes.com/www.indiatimes.com/index.html;530;339;348;384;376;381;353;350;403;333;356;393;350;328;393;329;389;346;394;349;382;332;409;327;354
25;mail.ru/mail.ru/index.html;500;256;308;251;271;270;270;246;273;252;279;249;301;252;251;256;271;246;267;254;265;248;277;247;275
26;mashable.com/mashable.com/index.html;699;497;439;508;440;448;512;446;431;500;445;427;495;455;458;499;440;432;522;443;447;488;445;461;489
27;media.photobucket.com/media.photobucket.com/image/funny%20gif/findstuff22/Best%20Images/Funny/funny-gif1.jpg@o=1.html;294;203;195;189;213;186;195;186;204;188;190;220;204;202;195;204;192;204;191;187;204;199;191;192;211
28;myspace.com/www.myspace.com/albumart.html;595;446;455;420;433;425;416;429;452;411;435;439;389;434;418;402;422;426;396;438;423;391;434;438;395
29;naver.com/www.naver.com/index.html;626;368;338;386;342;377;371;352;379;348;362;357;359;354;386;338;394;330;326;372;345;392;336;336;368
30;noimpactman.typepad.com/noimpactman.typepad.com/index.html;431;333;288;292;285;313;277;289;282;292;276;293;270;294;289;281;275;302;285;290;280;285;278;284;283
31;page.renren.com/page.renren.com/index.html;373;232;228;228;213;227;224;227;226;216;234;226;230;212;213;221;224;230;212;218;217;221;212;222;230
32;people.com.cn/people.com.cn/index.html;579;318;305;339;307;341;325;326;307;309;315;314;318;317;321;309;307;299;312;313;305;326;318;384;310
33;rakuten.co.jp/www.rakuten.co.jp/index.html;717;385;371;388;381;348;394;358;396;368;343;386;348;388;393;388;360;339;398;357;392;378;395;386;367
34;reddit.com/www.reddit.com/index.html;340;254;248;255;241;241;248;275;251;250;250;252;243;274;240;269;254;249;242;257;271;253;243;278;252
35;reuters.com/www.reuters.com/index.html;513;404;355;358;379;343;354;385;379;354;418;363;342;412;355;351;402;375;354;400;362;355;380;373;336
36;slideshare.net/www.slideshare.net/jameswillamor/lolcats-in-popular-culture-a-historical-perspective.html;397;279;270;283;285;283;291;286;289;284;275;281;288;284;280;279;290;301;290;270;292;282;289;267;278
37;sohu.com/www.sohu.com/index.html;727;414;479;414;431;485;418;440;488;431;432;464;442;407;488;435;416;465;445;414;480;416;403;463;429
38;spiegel.de/www.spiegel.de/index.html;543;430;391;380;440;387;375;430;380;397;415;383;434;420;384;399;421;392;384;418;388;380;427;403;392
39;stackoverflow.com/stackoverflow.com/questions/184618/what-is-the-best-comment-in-source-code-you-have-ever-encountered.html;503;377;356;438;370;388;409;367;366;407;375;363;393;360;363;396;376;391;426;363;378;408;400;359;408
40;store.apple.com/store.apple.com/us@mco=Nzc1MjMwNA.html;488;327;344;343;333;329;328;348;361;342;362;332;389;333;382;331;382;343;405;343;326;325;329;323;340
41;thepiratebay.org/thepiratebay.org/top/201.html;412;274;317;260;256;269;266;261;258;289;245;284;256;277;251;302;276;307;268;268;247;285;260;271;257
42;tudou.com/www.tudou.com/index.html;522;304;281;283;287;285;288;307;279;288;282;303;292;288;290;287;311;271;279;274;294;272;290;269;290
43;uol.com.br/www.uol.com.br/index.html;668;387;450;411;395;452;386;431;452;394;385;436;413;414;440;401;412;439;408;430;426;415;382;433;387
44;w3.org/www.w3.org/standards/webdesign/htmlcss.html;225;143;129;151;181;141;147;137;159;179;142;153;136;139;191;140;151;143;141;181;140;154;142;143;183
45;wsj.com/online.wsj.com/home-page.html;634;466;512;463;467;507;461;432;492;494;491;507;466;477;495;455;451;495;461;463;494;468;444;497;442
46;xinhuanet.com/xinhuanet.com/index.html;991;663;727;659;647;639;644;656;666;658;670;648;676;653;652;654;641;636;664;668;655;657;646;674;633
47;xunlei.com/xunlei.com/index.html;802;625;624;636;641;652;659;642;623;635;628;606;667;688;683;694;672;640;628;620;653;626;633;654;643
48;yelp.com/www.yelp.com/biz/alexanders-steakhouse-cupertino.html;752;475;502;472;477;512;489;478;501;472;454;517;487;474;521;467;450;513;491;464;536;507;455;511;481
49;youku.com/www.youku.com/index.html;844;448;498;441;417;497;478;439;467;436;447;465;438;461;466;446;452;496;457;446;486;449;467;499;442
50;youtube.com/www.youtube.com/music.html;443;338;253;289;238;296;254;290;242;302;237;290;253;305;253;293;251;311;244;293;255;291;246;316;249

File IO

Possible regression causes
  • nonmain_startup_fileio opt (with or without e10s) on windows7-32 (bug 1274018): This test seems to consistently report a higher result for mozilla-central compared to Try, even for an identical revision, due to extension signing checks. In other words, if you are comparing Try and mozilla-central you may see a false-positive regression on Perfherder. Graphs: non-e10s, e10s

Private Bytes

A memory metric tracked during tp4 test runs. This metric is sampled every 20 seconds.

For Windows, see the description from Microsoft TechNet.

Possible regression causes
  • Changing JavaScript GC policies or tuning parameters.
  • If you fix a regression for this test and have some tips to share, this is a good place for them.

RSS (Resident Set Size)

A memory metric tracked during tp5 test runs. This metric is sampled every 20 seconds. This metric is collected on linux/mac only.

Description from wikipedia.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Xres (X Resource Monitoring)

A memory metric tracked during tp5 test runs. This metric is sampled every 20 seconds. This metric is collected on linux only.

xres man page.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Working Set (tp5_memset)

A memory metric tracked during tp5 test runs. This metric is sampled every 20 seconds. This metric is collected on windows only. Description from Microsoft TechNet.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Modified Page List Bytes

A memory metric tracked during tp5 test runs. This metric is sampled every 20 seconds. This metric is collected on Windows7 only. Description from Microsoft MSDN.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

% CPU

CPU usage tracked during tp5 test runs. This metric is sampled every 20 seconds. This metric is collected on windows only.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Responsiveness

contact: :jimm, :jmaher

Measures the delay for the event loop to process a tracer event. For more details, see bug 631571.

The score on this benchmark is proportional to the sum of squares of all event delays that exceed a 20ms threshold. Lower is better.

We collect 8000+ data points from the browser during the test and apply this formula to the results:

return sum([float(x)*float(x) / 1000000.0 for x in val_list])
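
For clarity, the same formula wrapped as a complete function (a sketch; the function and parameter names are assumptions, only the formula itself comes from the harness):

# Sketch only: val_list is assumed to hold the recorded event delays (in ms)
# that exceeded the 20ms threshold; the score is the scaled sum of squares.
def responsiveness_score(val_list):
    return sum([float(x) * float(x) / 1000000.0 for x in val_list])
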
Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

ts_paint

  • contact: :mak, :jimm, :jmaher
  • source: tspaint_test.html
  • Perfomatic: "Ts, Paint"
  • type: Startup
  • data: 20 times we start the browser and time how long it takes to paint the startup test page, resulting in 1 set of 20 data points.
  • summarization:

Starts the browser to display tspaint_test.html with the start time in the url, waits for MozAfterPaint and onLoad to fire, then records the end time and calculates the time to startup.

Example data
[1666.0, 1195.0, 1139.0, 1198.0, 1248.0, 1224.0, 1213.0, 1194.0, 1229.0, 1196.0, 1191.0, 1230.0, 1247.0, 1169.0, 1217.0, 1184.0, 1196.0, 1192.0, 1224.0, 1192.0]

Possible regression causes

  • This test (and/or maybe tpaint?) will regress if a new <panel> element is added to the browser window (e.g. browser.xul) and its frame gets created. Fix this by ensuring it is display:none by default.

tsvgx

  • contact: :jwatt, :jmaher, :avih
  • source: svgx
  • type: PageLoader
  • data: we load the 7 svg pages 25 times, resulting in 7 sets of 25 data points
  • summarization:
Talos test name: tsvgx
Description: SVG-ASAP; replacing tscroll,tsvg with tscrollx,tsvgx

An svg-only number that measures SVG rendering performance, with animations or iterations of rendering. This is an ASAP test -- i.e. it iterates in unlimited frame-rate mode thus reflecting the maximum rendering throughput of each test. The reported value is the overall duration the sequence/animation took to complete. To turn on ASAP mode, we set these preferences:

preferences = {'layout.frame_rate': 0, 'docshell.event_starvation_delay_hint': 1}
Example data
0;hixie-001.xml;562;555;508;521;522;520;499;510;492;514;502;504;500;521;510;506;511;505;495;517;520;512;503;504;502
1;hixie-002.xml;510;613;536;530;536;522;498;505;500;504;498;529;498;509;493;512;501;506;504;499;496;505;508;511;503
2;hixie-003.xml;339;248;242;261;250;241;240;248;258;244;235;240;247;248;239;247;241;245;242;245;251;239;241;240;237
3;hixie-004.xml;709;540;538;536;540;536;552;539;535;535;543;533;536;535;545;537;537;537;537;539;538;535;536;538;536
4;hixie-005.xml;3096;3086;3003;3809;3213;3323;3143;3313;3192;3203;3225;3048;3069;3101;3189;3251;3172;3122;3266;3183;3159;3076;3014;3237;3100
5;hixie-006.xml;5586;5668;5565;5666;5668;5728;5886;5534;5484;5607;5678;5577;5745;5753;5532;5585;5506;5516;5648;5778;5894;5994;5794;5852;5810
6;hixie-007.xml;1823;1743;1739;1743;1744;1787;1802;1788;1782;1766;1787;1750;1748;1788;1766;1779;1767;1794;1758;1768;1781;1773;1765;1798;1805

Possible regression causes

  • Did you change the dimensions of the content area? Even a little? The tsvgx test seems to be sensitive to changes like this. See bug 1375479, for example. Usually, these sorts of "regressions" aren't real regressions - they just mean that we need to re-baseline our expectations from the test.

tsvg_static

  • contact: :jwatt, :jmaher, :dholbert, :neerja
  • source: svg_static
  • type: PageLoader
  • data: we load the 5 svg pages 25 times, resulting in 5 sets of 25 data points
  • summarization:
Talos test name: tsvg_static
Description: An svg-only number that measures SVG rendering performance of some complex (but static) SVG content.
Example data
0;gearflowers.svg;262;184;184;198;197;187;181;186;177;192;196;194;194;186;195;190;237;193;188;182;188;196;191;194;184
1;composite-scale.svg;69;52;48;49;57;51;52;87;52;49;49;51;58;53;64;57;49;65;67;58;53;59;56;68;50
2;composite-scale-opacity.svg;72;53;91;54;51;58;60;46;51;57;59;58;66;70;57;61;47;51;76;65;52;65;64;64;63
3;composite-scale-rotate.svg;70;76;89;62;62;78;57;77;79;82;74;56;61;79;73;64;75;74;81;82;76;58;77;61;62
4;composite-scale-rotate-opacity.svg;91;60;67;84;62;66;78;69;65;68;62;73;68;63;64;71;79;77;63;80;85;65;82;76;81

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tsvgr_opacity

  • contact: :jwatt, :jmaher, :avih
  • source: [1]
  • type: PageLoader
  • data: we load the 2 svg opacity pages 25 times, resulting in 2 sets of 25 data points
  • summarization:
Talos test name: tsvgr_opacity
Description: Row Major and 25 cycles/page.

Renders many semi-transparent, partially overlapping SVG rectangles, and measures time to completion of this rendering.

Note that this test also tends to reflect changes in network efficiency and navigation bar rendering issues:

  • Most of the page load tests measure from before the location is changed, until onload + mozafterpaint, therefore any changes in chrome performance from the location change, or network performance (the pages load from a local web server) would affect page load times. SVG opacity is rather quick by itself, so any such chrome/network/etc performance changes would affect this test more than other page load tests (relatively, in percentages).
Example data
0;big-optimizable-group-opacity-2500.svg;170;171;205;249;249;244;192;252;192;431;182;250;189;249;151;168;209;194;247;250;193;250;255;247;247
1;small-group-opacity-2500.svg;585;436;387;441;512;438;440;380;443;391;450;386;459;383;445;388;450;436;485;443;383;438;528;444;441

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tpaint

  • contact: :jimm, :jmaher
  • source: [tpaint-window.html]
  • type: Startup
  • data: we load the tpaint test window 20 times, resulting in 1 set of 20 data points.
  • summarization:
Talos test name: tpaint
Description: twinopen but measuring the time after we receive the MozAfterPaint and OnLoad events.

Tests the amount of time it takes to open a new window. This test does not include startup time. Multiple test windows are opened in succession; reported results are the average amount of time required to create and display a window in the running instance of the browser. (Measures ctrl-n performance.)

Example data
[209.219, 222.180, 225.299, 225.970, 228.090, 229.450, 230.625, 236.315, 239.804, 242.795, 244.5, 244.770, 250.524, 251.785, 253.074, 255.349, 264.729, 266.014, 269.399, 326.190]

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Dromaeo Tests

Dromaeo suite of tests for JavaScript performance testing. See the Dromaeo wiki for more information.

This suite is divided into several sub-suites.

Each sub-suite is divided into tests, and each test is divided into sub-tests. Each sub-test takes some (in theory) fixed piece of work and measures how many times that piece of work can be performed in one second. The score for a test is then the geometric mean of the runs/second numbers for its sub-tests. The score for a sub-suite is the geometric mean of the scores for its tests.

Dromaeo CSS

  • contact: :bz, :jmaher
  • source: [css.manifest]
  • type: PageLoader
  • reporting: speed in test runs per second (higher is better)
  • data: Dromaeo has 6 subtests which run internal benchmarks; each benchmark reports about 180 raw data points

summarization:

  • subtest: Dromaeo is a custom benchmark which has a lot of micro tests inside each subtest, so we use a custom dromaeo filter to summarize the subtest. Each micro test produces 5 data points; for each group of 5 data points we take the mean, leaving 36 data points to represent the subtest (assuming 180 points). These 36 micro test means are then run through a geometric mean to produce a single number for the dromaeo subtest (a rough sketch follows below). source: filter.py
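
A rough sketch of that filter, assuming 180 raw points per subtest and groups of 5 (illustrative only, not the actual filter.py code):

import math

# Mean of each group of 5 replicates, then the geometric mean of those
# per-micro-test means (36 means for 180 points).
def dromaeo_subtest(points, group_size=5):
    means = []
    for i in range(0, len(points), group_size):
        group = points[i:i + group_size]
        means.append(sum(group) / float(len(group)))
    return math.exp(sum(math.log(m) for m in means) / len(means))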

Each page in the manifest is part of the dromaeo css benchmark. Each page measures the performance of searching the DOM for nodes matching various CSS selectors, using different libraries for the selector implementation (jQuery, Dojo, Mootools, ExtJS, Prototype, and Yahoo UI).

Example data
0;dojo.html;2209.83;2269.68;2275.47;2278.83;2279.81;4224.43;4344.96;4346.74;4428.69;4459.82;4392.80;4396.38;4412.54;4414.34;4415.62;3909.94;4027.96;4069.08;4099.63;4099.94;4017.70;4018.96;4054.25;4068.74;4081.31;3825.10;3984.20;4053.23;4074.59;4106.63;3893.88;3971.80;4031.15;4046.68;4048.31;3978.24;4010.16;4046.66;4051.68;4056.37;4189.50;4287.98;4390.98;4449.89;4450.20;4536.23;4557.82;4588.40;4662.58;4664.42;4675.51;4693.13;4743.72;4758.12;4764.67;4138.00;4251.60;4346.22;4410.12;4417.23;4677.53;4702.48;4714.62;4802.59;4805.33;4445.07;4539.91;4598.93;4605.45;4618.79;4434.40;4543.09;4618.56;4683.98;4689.51;4485.26;4496.75;4511.23;4600.86;4602.08;4567.52;4608.33;4615.56;4619.31;4622.79;3469.44;3544.11;3605.80;3647.74;3658.56;3101.88;3126.41;3147.73;3159.92;3170.73;3672.28;3686.40;3730.74;3748.89;3753.59;4411.71;4521.50;4633.98;4702.72;4708.76;3626.62;3646.71;3713.07;3713.13;3718.91;3846.17;3846.25;3913.61;3914.63;3916.22;3982.88;4112.98;4132.26;4194.92;4201.54;4472.64;4575.22;4644.74;4645.42;4665.51;4120.13;4142.88;4171.29;4208.43;4211.03;4405.36;4517.89;4537.50;4637.77;4644.28;4548.25;4581.20;4614.54;4658.42;4671.09;4452.78;4460.09;4494.06;4521.30;4522.37;4252.81;4350.72;4364.93;4441.40;4492.78;4251.34;4346.70;4355.00;4358.89;4365.72;4494.64;4511.03;4582.11;4591.79;4592.36;4207.54;4308.94;4309.14;4406.71;4474.46
1;ext.html;479.65;486.21;489.61;492.94;495.81;24454.14;33580.33;34089.15;34182.83;34186.15;34690.83;35050.30;35051.30;35071.65;35099.82;5758.22;5872.32;6389.62;6525.38;6555.57;8303.96;8532.96;8540.91;8544.00;8571.49;8360.79;8408.79;8432.96;8447.28;8447.83;5817.71;5932.67;8371.83;8389.20;8643.44;7983.80;8073.27;8073.84;8076.48;8078.15;24596.00;32518.84;32787.34;32830.51;32861.00;2220.87;2853.84;3333.53;3345.17;3445.47;24785.75;24971.75;25044.25;25707.61;25799.00;2464.69;2481.89;2527.57;2534.65;2534.92;217793.00;219347.90;219495.00;220059.00;297168.00;40556.19;53062.47;54275.73;54276.00;54440.37;50636.75;50833.49;50983.49;51028.49;51032.74;10746.36;10972.45;11450.37;11692.18;11797.76;8402.58;8415.79;8418.66;8426.75;8428.16;16768.75;16896.00;16925.24;16945.58;17018.15;7047.68;7263.13;7313.16;7337.38;7383.22;713.88;723.72;751.47;861.35;931.00;25454.36;25644.90;25801.87;25992.61;25995.00;819.89;851.23;852.00;886.59;909.89;14325.79;15064.92;15240.39;15431.23;15510.61;452382.00;458194.00;458707.00;459226.00;459601.00;45699.54;46244.54;46270.54;46271.54;46319.00;1073.94;1080.66;1083.35;1085.84;1087.74;26622.33;27807.58;27856.72;28040.58;28217.86;37229.81;37683.81;37710.81;37746.62;37749.81;220386.00;222903.00;240808.00;247394.00;247578.00;25567.00;25568.49;25610.74;25650.74;25710.23;26466.21;28718.71;36175.64;36529.27;36556.00;26676.00;30757.69;31965.84;34521.83;34622.65;32791.18;32884.00;33194.83;33720.16;34192.66;32150.36;32520.02;32851.18;32947.18;33128.01;29472.85;30214.09;30708.54;30999.23;32879.51;23822.88;23978.28;24358.88;24470.88;24515.51
2;jquery.html;285.42;288.57;292.66;293.75;294.14;10313.00;10688.20;13659.11;13968.65;14003.93;13488.39;13967.51;13980.79;14545.13;15059.77;4361.37;4488.35;4489.44;4492.24;4496.69;3314.32;3445.07;4412.51;5020.75;5216.66;5113.49;5136.56;5141.31;5143.87;5156.28;5055.95;5135.02;5138.64;5215.82;5226.48;4550.98;4551.59;4553.07;4557.77;4559.16;18339.63;18731.53;18738.63;18741.16;18806.15;1474.99;1538.31;1557.52;1703.67;1772.16;12209.94;12335.44;12358.32;12516.50;12651.94;1520.94;1522.62;1541.37;1584.71;1642.50;57533.00;59169.41;59436.11;59758.70;59872.40;8669.13;8789.34;8994.37;9016.05;9064.95;11047.39;11058.39;11063.78;11077.89;11082.78;6401.81;6426.26;6504.35;6518.25;6529.61;6250.22;6280.65;6304.59;6318.91;6328.72;5144.28;5228.40;5236.21;5271.26;5273.79;1398.91;1450.05;1456.39;1494.66;1519.42;727.85;766.62;844.35;858.49;904.87;9912.55;10249.54;14936.71;16566.50;16685.00;378.04;381.34;381.44;385.67;387.23;5362.60;5392.78;5397.14;5497.12;5514.83;213309.00;318297.00;320682.00;322681.00;322707.00;56357.44;67892.66;68329.66;68463.32;69506.00;418.91;424.49;425.19;425.28;426.40;9363.39;9559.95;9644.00;9737.07;9752.80;33170.83;33677.33;34950.83;35607.47;35765.82;44079.34;44588.55;45396.00;46309.00;46427.30;6302.87;6586.51;6607.08;6637.44;6642.17;9776.17;9790.46;9931.90;10391.79;10392.43;8739.26;8838.38;8870.20;8911.50;8955.15;8422.83;8786.21;8914.00;9135.82;9145.36;8945.28;9028.37;9035.23;9116.64;9137.86;6433.90;6688.73;6822.11;6830.08;6833.90;8575.23;8599.87;8610.91;8655.65;9123.91
3;mootools.html;1161.69;1333.31;1425.89;1500.37;1557.37;6706.93;7648.46;8020.04;8031.36;8049.64;7861.80;7972.40;7978.12;7993.32;7993.88;1838.83;1862.93;1864.11;1866.28;1866.71;1909.93;1921.83;1928.53;1954.07;1969.98;1808.68;1820.01;1821.30;1825.92;1826.91;1849.07;1904.99;1908.26;1911.24;1912.50;1856.86;1871.78;1873.72;1878.54;1929.57;6506.67;6752.73;7799.22;7830.41;7855.18;4117.18;4262.42;4267.30;4268.27;4269.62;2720.56;2795.36;2840.08;2840.79;2842.62;699.12;703.75;774.36;791.73;798.18;11096.22;11126.39;11132.72;11147.16;11157.44;3934.33;4067.37;4140.94;4149.75;4150.38;9042.82;9077.46;9083.55;9084.41;9086.41;4431.47;4432.84;4437.33;4438.40;4440.44;3935.67;3937.31;3937.43;3940.53;3976.68;3247.17;3307.90;3319.90;3323.32;3330.60;1001.90;1016.87;1021.12;1021.67;1022.05;1016.34;1019.09;1036.62;1056.81;1057.76;7345.56;7348.56;7391.89;7393.85;7406.30;374.27;392.53;394.73;397.28;398.26;5588.58;5653.21;5655.07;5659.15;5660.66;9775.41;9860.51;9938.40;9959.85;9968.45;9733.42;9904.31;9953.05;9960.55;9967.20;6399.26;6580.11;7245.93;7336.96;7386.78;7162.00;7245.49;7249.38;7250.75;7304.63;8458.24;8583.40;8651.57;8717.39;8742.39;8896.42;8904.60;8927.96;8960.73;8961.82;7483.48;7747.77;7763.46;7766.34;7773.07;7784.00;7821.41;7827.18;7849.18;7855.49;7012.16;7141.57;7250.09;7253.13;7335.89;6977.97;7015.51;7042.40;7204.35;7237.20;7160.46;7293.23;7321.27;7321.82;7331.16;6268.69;6324.11;6325.78;6328.56;6342.40;6554.54;6625.30;6646.00;6650.30;6674.90
4;prototype.html;237.05;251.94;256.61;259.65;263.52;4488.53;4676.88;4745.24;4745.50;4748.81;4648.47;4660.21;4666.58;4671.88;4677.32;3602.84;3611.40;3613.69;3615.69;3619.15;3604.41;3619.44;3623.24;3627.66;3628.11;3526.59;3589.35;3615.93;3616.35;3622.80;3624.69;3626.84;3628.47;3631.22;3632.15;3184.76;3186.11;3187.16;3187.78;3189.35;4353.43;4466.46;4482.57;4616.72;4617.88;4012.18;4034.84;4047.07;4047.82;4055.29;4815.11;4815.21;4816.11;4817.08;4820.40;3300.31;3345.18;3369.55;3420.98;3447.97;5026.99;5033.82;5034.50;5034.95;5038.97;3516.72;3520.79;3520.95;3521.81;3523.47;3565.29;3574.23;3574.37;3575.82;3578.37;4045.19;4053.51;4056.76;4058.76;4059.00;4714.67;4868.66;4869.66;4873.54;4878.29;1278.20;1300.92;1301.13;1301.17;1302.47;868.94;871.16;878.50;883.40;884.85;3874.71;3878.44;3881.61;3882.67;3886.92;4959.83;4968.45;4969.50;4971.38;4972.30;3862.69;3870.15;3871.79;3873.83;3878.07;2690.15;2711.66;2714.42;2715.39;2715.89;4349.04;4349.63;4351.33;4353.59;4355.46;4950.95;5101.08;5107.69;5120.21;5120.39;4336.63;4360.76;4361.96;4362.28;4365.43;4928.75;4939.41;4939.56;4943.95;4966.78;4869.03;4886.24;4888.85;4889.14;4895.76;4362.39;4362.78;4363.96;4365.00;4365.08;3408.00;3470.03;3476.37;3546.65;3547.34;4905.73;4926.21;4926.70;4926.93;4929.43;4682.88;4694.91;4696.30;4697.06;4699.69;4688.86;4691.25;4691.46;4698.37;4699.41;4628.07;4631.31;4633.42;4634.00;4636.00;4699.44;4796.02;4808.83;4809.95;4813.52;4719.10;4720.41;4722.95;4723.03;4723.53
5;yui.html;569.72;602.22;627.02;647.49;692.84;9978.30;10117.54;10121.70;10129.75;10137.24;9278.68;9291.44;9349.00;9370.53;9375.86;475.79;481.92;606.51;607.42;618.73;617.68;618.89;623.30;626.58;631.85;501.81;649.76;653.22;655.69;656.71;510.62;645.56;657.42;657.88;658.39;475.53;476.77;476.80;476.92;476.96;9895.16;9976.15;9988.25;9989.85;9996.40;9483.15;9545.75;9676.37;9808.51;10360.22;8331.29;8397.87;8538.06;8714.69;8803.78;2748.93;2800.93;2802.59;2857.33;2864.46;33757.16;33804.83;33859.32;33931.00;33991.32;7818.65;7846.92;7892.09;8170.55;8217.75;13691.38;13692.86;13693.25;13698.73;13706.66;5378.70;5517.83;5615.86;5616.16;5624.00;2985.63;3002.97;3003.07;3037.73;3038.87;2459.10;2502.52;2504.91;2507.07;2507.26;396.62;405.78;411.43;412.03;412.56;543.45;550.75;568.50;578.59;592.25;6762.21;6901.72;6984.27;7064.22;7122.29;454.78;519.40;539.29;543.96;566.16;3235.39;3266.13;3453.26;3498.79;3518.54;39079.22;39722.80;41350.59;41422.38;41540.17;34435.14;34606.31;34623.31;34661.00;34672.48;29449.12;29530.11;30507.24;31938.52;31961.52;7449.33;7524.62;7629.73;7712.96;7796.42;22917.43;23319.00;23441.41;23582.88;23583.53;29780.40;30272.55;31761.00;31765.84;31839.36;6112.45;6218.35;6476.68;6603.54;6793.66;10385.79;10471.69;10518.53;10552.74;10644.95;9563.52;9571.33;9617.09;9946.35;9976.80;9406.11;9518.48;9806.46;10102.44;10173.19;9482.43;9550.28;9878.21;9902.90;9951.45;8343.17;8511.00;8606.00;8750.21;8869.29;8234.96;8462.70;8473.49;8499.58;8808.91
Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Dromaeo DOM (Linux64 only)

  • contact: :bz, :jmaher
  • source: [dom.manifest]
  • type: PageLoader
  • data: see Dromaeo DOM
  • reporting: speed in test runs per second (higher is better)

Each page in the manifest is part of the dromaeo dom benchmark. These are the specific areas that Dromaeo DOM covers:

DOM Attributes

Measures performance of getting and setting a DOM attribute, both via getAttribute and via a reflecting DOM property. Also throws in some expando getting/setting for good measure.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.
DOM Modification

Measures performance of various things that modify the DOM tree: creating element and text nodes and inserting them into the DOM.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.
DOM Query

Measures performance of various methods of looking for nodes in the DOM: getElementById, getElementsByTagName, and so forth.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.
DOM Traversal

Measures performance of various accessors (childNodes, firstChild, etc) that would be used when doing a walk over the DOM tree.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Please see Dromaeo CSS for examples of data.

a11y

  • contact: :davidb, :tbsaunde, :jmaher
  • source: [a11y.manifest]
  • type: PageLoader
  • measuring: ???
  • data: we load 2 pages 25 times each, collect 2 sets of 25 data points
  • summarization
  • reporting: test time in ms (lower is better)
Talos test name: a11yr
Description: Row Major testing with 25 cycles per page

This test ensures basic a11y tables and permutations do not cause performance regressions.

Example data
0;dhtml.html;1584;1637;1643;1665;1741;1529;1647;1645;1692;1647;1542;1750;1654;1649;1541;1656;1674;1645;1645;1740;1558;1652;1654;1656;1654
1;tablemutation.html;398;385;389;391;387;387;385;387;388;385;384;31746;386;387;384;387;389;387;387;387;388;391;386;387;388

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tscrollx

  • contact: :jrmuizel, :jmaher, :avih
  • source: [scroll.manifest]
  • type: PageLoader
  • measuring: Scroll performance
  • reporting: Average frame interval (1/FPS). Lower is better.
  • data: we load 6 pages 25 times each, collecting 6 sets of 25 data points
  • summarization:
Talos test name: tscrollx
Description: Replacing tscroll,tsvg with tscrollx,tsvgx


This test scrolls several pages, each of which represents a different known "hard" case to scroll (* needinfo), and measures the average frame interval (1/FPS) on each. The ASAP test (tscrollx) iterates in unlimited frame-rate mode, thus reflecting the maximum scroll throughput per page. To turn on ASAP mode, we set these preferences:

preferences = {'layout.frame_rate': 0, 'docshell.event_starvation_delay_hint': 1}
Example data
0;tiled.html;5.41;5.57;5.34;5.64;5.53;5.48;5.44;5.49;5.50;5.50;5.49;5.66;5.50;5.37;5.57;5.54;5.46;5.31;5.41;5.57;5.50;5.52;5.71;5.31;5.44
fixed.html;10.404609053497941;10.47;10.66;10.45;10.73;10.79;10.64;10.64;10.82;10.43;10.92;10.47;10.47;10.64;10.74;10.67;10.40;10.83;10.77;10.54;10.38;10.70;10.44;10.38;10.56
downscale.html;5.493209876543211;5.27;5.50;5.50;5.51;5.46;5.58;5.58;5.51;5.49;5.49;5.47;9.09;5.56;5.61;5.50;5.47;5.59;5.47;5.49;5.60;5.61;5.58;5.40;5.43
downscale.html;10.676522633744854;10.82;10.79;10.41;10.75;10.91;10.52;10.61;10.50;10.55;10.80;10.17;10.68;10.41;10.42;10.41;10.58;10.28;10.56;10.66;10.68;10.47;10.60;10.61;10.26
4;iframe.svg;13.82;14.87;14.78;14.35;14.73;14.50;14.15;14.46;14.80;14.48;15.10;14.93;14.77;14.52;14.08;15.01;14.58;14.52;15.23;14.35;14.72;14.28;14.30;14.27;14.96
5;reader.htm;10.72;10.62;10.23;10.48;10.42;10.64;10.40;10.40;10.14;10.60;10.51;10.36;10.57;10.41;10.52;10.75;10.19;10.72;10.44;9.75;10.49;10.07;10.54;10.46;10.44

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tresize

  • contact: :jimm, :jmaher, :avih
  • source: [tresize-test.html]
  • type: StartupTest
  • measuring: Time to do XUL resize, in ms (lower is better).
  • data: we run the tresize test page 20 times, resulting in 1 set of 20 data points.
  • summarization:

A purer form of paint measurement than tpaint. This test opens a single window positioned at 10,10 and sized to 300,300, then resizes the window outward |max| times measuring the amount of time it takes to repaint each resize. Dumps the resulting dataset and average to stdout or logfile.

In bug 1102479 tresize was rewritten to work in e10s mode which involved a full rewrite of the test.

To run tresize locally without talos, install the addon and use it to run the test.

Example data
[23.2565333333333, 23.763383333333362, 22.58369999999999, 22.802766666666653, 22.304050000000025, 23.010383333333326, 22.865466666666677, 24.233716666666705, 24.110983333333365, 22.21390000000004, 23.910333333333316, 23.409816666666647, 19.873049999999992, 21.103966666666686, 20.389749999999978, 20.777349999999984, 20.326283333333365, 22.341616666666667, 20.29813333333336, 20.769600000000104]

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

xperf

  • contact: :aklotz, :jmaher
  • source: xperf instrumentation
  • type: Pageloader (tp5n) / Startup
  • measuring: IO counters from windows (currently, only startup IO is in scope)
  • reporting: Summary of read/write counters for disk, network (lower is better)

Talos will turn orange for 'x' jobs on windows 7 if your changeset accesses files which are not predefined in the [whitelist] during startup; specifically, before the "sessionstore-windows-restored" Firefox event. If your job turns orange, you will see a list of files in Treeherder (or in the log file) which have been accessed unexpectedly (similar to this):

* TEST-UNEXPECTED-FAIL : xperf: File '{profile}\secmod.db' was accessed and we were not expecting it. DiskReadCount: 6, DiskWriteCount: 0, DiskReadBytes: 16904, DiskWriteBytes: 0
* TEST-UNEXPECTED-FAIL : xperf: File '{profile}\cert8.db' was accessed and we were not expecting it. DiskReadCount: 4, DiskWriteCount: 0, DiskReadBytes: 33288, DiskWriteBytes: 0
* TEST-UNEXPECTED-FAIL : xperf: File 'c:\$logfile' was accessed and we were not expecting it. DiskReadCount: 0, DiskWriteCount: 2, DiskReadBytes: 0, DiskWriteBytes: 32768

If these files are expected to be accessed during startup by your changeset, we can add them to the whitelist.
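
A rough sketch of the kind of check involved (illustrative only, not the actual xperf analyzer code; the data structures are assumptions):

# Illustrative only: flag files accessed during startup that are not in the
# whitelist. Paths are assumed to already be normalized, with {profile}
# standing in for the real profile directory as in the log lines above.
def check_startup_io(accessed_files, whitelist):
    failures = []
    for path, counters in accessed_files.items():
        if path not in whitelist:
            failures.append(
                "TEST-UNEXPECTED-FAIL : xperf: File '%s' was accessed and we "
                "were not expecting it. %s" % (path, counters))
    return failures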

Xperf runs tp5 while collecting xperf metrics for disk IO and network IO. The providers we listen for are:

The values we collect during stackwalk are:

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

kraken

  • contact: :h4writer, :jmaher
  • source: [kraken.manifest]
  • type: PageLoader
  • measuring: JavaScript performance
  • reporting: Total time for all tests, in ms (lower is better)
  • data: there are 14 subtests in kraken, each subtest is an internal benchmark and generates 10 data points, or 14 sets of 10 data points.
  • summarization:
    • subtest: For all of the 10 data points, we take the mean to report a single number.
    • suite: geometric mean of the 14 subtest results.

This is the Kraken javascript benchmark taken verbatim and slightly modified to fit into our pageloader extension and talos harness.
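
A rough sketch of that summarization (illustrative only, not the actual talos filter code; subtests is assumed to map each of the 14 subtest names to its list of 10 replicates):

import math

def kraken_suite_score(subtests):
    # Mean of the 10 replicates per subtest, then the geometric mean of the
    # 14 subtest means (lower is better).
    means = [sum(reps) / float(len(reps)) for reps in subtests.values()]
    return math.exp(sum(math.log(m) for m in means) / len(means))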

Example data
0;ai-astar;100;95;98;102;101;99;97;98;98;102
1;audio-beat-detection;147;147;191;173;145;139;186;143;183;140
2;audio-dft;161;156;158;157;160;158;160;160;159;158
3;audio-fft;82;83;83;154;83;83;82;83;160;82
4;audio-oscillator;96;96;141;95;95;95;129;96;95;134
5;imaging-gaussian-blur;116;115;116;115;115;115;115;115;117;116
6;imaging-darkroom;166;164;166;165;166;166;165;165;165;166
7;imaging-desaturate;87;87;87;87;88;87;88;87;87;87
8;json-parse-financial;75;77;77;76;77;76;77;76;77;77
9;json-stringify-tinderbox;79;79;80;79;78;79;79;78;79;79
10;stanford-crypto-aes;98;97;96;98;98;98;98;98;113;95
11;stanford-crypto-ccm;130;138;130;127;137;134;134;132;147;129
12;stanford-crypto-pbkdf2;176;187;183;183;176;174;181;187;175;173
13;stanford-crypto-sha256-iterative;86;85;83;84;86;85;85;86;83;83

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

V8, version 7

  • contact: :h4writer, :jmaher
  • source: [v8.manifest]
  • type: PageLoader
  • measuring: ???
  • reporting: weighted score (higher is better)
  • data: there are 10 subtests in v8, each subtest is an internal benchmark and generates a single replicate.
  • summarization:
    • subtest: each subtest's single data point is run through the v8_subtest filter, which applies the reference value for that subtest to generate a single number. source: test.py
    • suite: the v8 metric is basically the geometric mean of the 10 subtest results, multiplied by 100.

This is the V8 (version 7) javascript benchmark taken verbatim and slightly modified to fit into our pageloader extension and talos harness.

The previous version of this test is V8 version 5 which was run on selective branches and operating systems.
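
A rough sketch of the suite-level step (illustrative only; it assumes each subtest has already been reduced to a single score by the v8_subtest filter):

import math

def v8_suite_score(subtest_scores):
    # Geometric mean of the (assumed 10) per-subtest scores, times 100.
    gm = math.exp(sum(math.log(s) for s in subtest_scores) / len(subtest_scores))
    return gm * 100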

Example data
0;Richards;152.99877600979192
1;DeltaBlue;210.7481559536354
2;Encrypt;291.29041654529567
3;Decrypt;5358.288770053476
4;RayTrace;1833.3333333333333
5;Earley;678.42605156038
6;Boyer;10265.30612244898
7;RegExp;28333.333333333332
8;Splay;528.2620179609086
9;NavierStokes;5055.555555555556

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

TART/CART

  • contact: :avih, :jmaher, :MattN
  • source: tart
  • type: PageLoader
  • measuring: Desktop Firefox UI animation speed and smoothness
  • reporting: intervals in ms (lower is better) - see below for details
  • data: there are 30 reported subtests from TART which we load 25 times, resulting in 30 sets of 25 data points.
  • summarization:

TART is the Tab Animation Regression Test and CART is the Customize Animation Regression Test.

TART tests tab animation on these cases:

  • Simple: single new tab of about:blank open/close without affecting (shrinking/expanding) other tabs.
  • icon: same as above with favicons and long title instead of about:blank.
  • Newtab: newtab open with thumbnails preview - without affecting other tabs, with and without preload.
  • Fade: opens a tab, then measures fadeout/fadein (tab animation without the overhead of opening/closing a tab).
    • Case 1 is tested with DPI scaling of 1.
    • Case 2 is tested with DPI scaling of 1.0 and 2.0.
    • Case 3 is tested with the default scaling of the test system.
    • Case 4 is tested with DPI scaling of 2.0 with the "icon" tab (favicon and long title).
    • Each animation produces 3 test results:
      • error: difference between the designated duration and the actual completion duration from the trigger.
      • half: average frame interval over the 2nd half of the animation.
      • all: average frame interval over all recorded intervals.
      • And the run logs also include the explicit intervals from which these 3 values were derived (a sketch of how these values are computed appears after this list).
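A minimal sketch (in Python, for illustration only; the real computation lives in the TART addon) of how the three values could be derived from one animation's recorded frame timestamps:

def summarize_animation(frame_times_ms, designated_duration_ms):
    # frame_times_ms: paint timestamps in ms, measured from the animation trigger (t=0).
    intervals = [b - a for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    # error: difference between the designated duration and the actual completion
    # duration from the trigger (treating it as an absolute value is an assumption).
    error = abs(frame_times_ms[-1] - designated_duration_ms)
    # half: average frame interval over the 2nd half of the animation.
    second_half = intervals[len(intervals) // 2:]
    half = sum(second_half) / len(second_half)
    # all: average frame interval over all recorded intervals.
    overall = sum(intervals) / len(intervals)
    return {"error": error, "half": half, "all": overall}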

CART uses the same framework to measure performance of the Australis "customize" animation (for entering the toolbar/menu customization view). Subtests include:

  • Customize-enter animation (full and css-animation-only part).
  • Customize-exit animation

TART/CART can be used as a stand-alone addon:

  • Set the browser to ASAP mode (preferences layout.frame_rate=0, docshell.event_starvation_delay_hint=1). This makes the browser refresh the screen as fast as possible instead of limiting it to 60 Hz, allowing higher-resolution measurements. Requires a restart to take effect.
  • Install the latest addon xpi, or zip the addon dir of the source code, rename the extension to xpi and install.
  • Visit chrome://tart/content/tart.html
  • Select subtests to run. By default the selected tests are all the TART tests. CART is the "Customize" test.
Example data
0;simple-open-DPI1.half.TART;2.35;2.42;2.63;2.47;2.71;2.38;2.37;2.41;2.48;2.70;2.44;2.41;2.51;2.43;2.41;2.56;2.76;2.49;2.36;2.40;2.70;2.53;2.35;2.46;2.47
1;simple-open-DPI1.all.TART;2.80;2.95;3.12;2.92;3.46;2.87;2.92;2.99;2.89;3.24;2.94;2.95;3.25;2.92;3.02;3.00;3.21;3.31;2.84;2.87;3.10;3.13;3.10;2.94;2.95
2;simple-open-DPI1.error.TART;33.60;36.33;35.93;35.97;38.17;34.77;36.00;35.01;36.25;36.22;35.24;35.76;36.64;36.31;34.74;34.40;34.34;41.48;35.04;34.83;34.27;34.04;34.37;35.22;36.52
3;simple-close-DPI1.half.TART;1.95;1.88;1.91;1.94;2.00;1.97;1.88;1.76;1.84;2.18;1.99;1.83;2.04;1.93;1.81;1.77;1.79;1.90;1.82;1.84;1.78;1.75;1.76;1.89;1.81
4;simple-close-DPI1.all.TART;2.19;2.08;2.07;2.32;2.65;2.32;2.26;1.96;2.02;2.26;2.05;2.16;2.19;2.11;2.04;1.98;2.05;2.02;2.01;2.11;1.97;1.97;2.05;2.01;2.12
5;simple-close-DPI1.error.TART;21.35;23.87;22.60;22.02;22.97;22.35;22.15;22.79;21.81;21.90;22.26;22.58;23.15;22.43;22.76;23.36;21.86;22.70;22.96;22.70;22.28;22.03;21.78;22.33;22.23
6;icon-open-DPI1.half.TART;2.42;2.33;2.50;2.58;2.36;2.51;2.60;2.35;2.52;2.51;2.59;2.34;3.29;2.63;2.46;2.57;2.53;2.50;2.39;2.51;2.44;2.66;2.72;2.36;2.52
7;icon-open-DPI1.all.TART;3.12;2.94;3.42;3.23;3.10;3.21;3.33;3.14;3.24;3.32;3.46;2.90;3.65;3.19;3.27;3.47;3.32;3.13;2.95;3.23;3.21;3.33;3.47;3.15;3.32
8;icon-open-DPI1.error.TART;38.39;37.96;37.03;38.85;37.03;37.17;37.19;37.56;36.67;36.33;36.89;36.85;37.54;38.46;35.38;37.52;36.68;36.48;36.03;35.71;37.12;37.08;37.74;38.09;35.85
9;icon-close-DPI1.half.TART;1.94;1.93;1.79;1.89;1.83;1.83;1.90;1.89;1.75;1.76;1.77;1.74;1.81;1.86;1.95;1.99;1.73;1.83;1.97;1.80;1.94;1.84;2.01;1.88;2.03
10;icon-close-DPI1.all.TART;2.14;2.14;1.98;2.03;2.02;2.25;2.29;2.13;1.97;2.01;1.94;2.01;1.99;2.05;2.11;2.09;2.02;2.02;2.12;2.02;2.20;2.11;2.19;2.07;2.27
11;icon-close-DPI1.error.TART;24.51;25.03;25.17;24.54;23.86;23.70;24.02;23.61;24.10;24.53;23.92;23.75;23.73;23.78;23.42;25.40;24.21;24.55;23.96;24.96;24.41;24.96;24.16;24.20;23.65
12;icon-open-DPI2.half.TART;2.60;2.60;2.60;2.53;2.51;2.53;2.59;2.43;2.66;2.60;2.47;2.61;2.64;2.43;2.49;2.63;2.61;2.60;2.52;3.06;2.65;2.74;2.69;2.44;2.43
13;icon-open-DPI2.all.TART;3.45;3.22;3.38;3.47;3.10;3.31;3.47;3.13;3.37;3.14;3.28;3.20;3.40;3.15;3.14;3.23;3.41;3.16;3.26;3.55;3.29;3.49;3.44;3.11;3.22
14;icon-open-DPI2.error.TART;40.20;37.86;37.53;41.46;37.03;38.77;37.48;36.97;37.28;37.72;36.09;36.71;38.89;38.21;37.37;38.91;36.79;36.10;37.60;36.99;37.56;35.76;38.92;37.46;37.52
15;icon-close-DPI2.half.TART;2.27;1.97;1.83;1.86;2.08;1.88;1.80;1.90;1.77;1.89;2.06;1.89;1.89;1.86;2.01;1.79;1.78;1.83;1.89;1.85;1.74;1.82;1.84;1.81;1.79
16;icon-close-DPI2.all.TART;2.33;2.13;2.18;2.03;2.33;2.03;1.95;2.06;1.96;2.13;2.25;2.10;2.13;2.03;2.18;2.00;2.05;2.01;2.08;2.05;1.96;2.04;2.10;2.04;2.08
17;icon-close-DPI2.error.TART;22.99;23.02;24.32;23.58;23.05;23.34;22.92;23.22;22.90;23.33;23.33;23.05;22.80;23.45;24.05;22.39;22.14;22.97;22.85;22.13;22.89;22.98;23.69;22.99;23.08
18;iconFade-close-DPI2.half.TART;2.14;1.84;1.78;1.84;1.66;2.07;1.81;3.82;1.68;1.85;1.62;2.54;2.06;1.85;2.17;1.80;1.71;1.67;2.11;1.73;2.94;2.14;1.93;1.72;2.05
19;iconFade-close-DPI2.all.TART;2.17;1.76;1.80;1.89;1.70;1.93;1.80;3.38;1.78;1.90;1.70;2.50;1.94;1.81;2.29;1.82;1.79;1.76;2.23;1.80;2.85;2.06;1.84;1.83;2.09
20;iconFade-close-DPI2.error.TART;4.67;4.11;3.69;4.51;4.46;3.88;4.54;3.68;4.56;3.82;4.32;4.87;4.42;3.72;3.72;4.54;4.93;4.46;4.64;3.39;4.09;3.28;3.58;4.11;3.80
21;iconFade-open-DPI2.half.TART;2.37;2.61;2.37;2.62;2.54;2.84;2.57;2.44;4.33;2.57;2.59;2.67;2.58;2.48;2.38;2.39;2.50;2.39;2.50;2.57;2.52;2.55;2.47;2.69;2.41
22;iconFade-open-DPI2.all.TART;2.45;2.64;2.39;2.60;2.57;2.60;2.61;2.59;3.14;2.55;2.54;2.66;2.57;2.48;2.47;2.46;2.55;2.45;2.51;2.61;2.54;2.58;2.50;2.54;2.40
23;iconFade-open-DPI2.error.TART;3.64;4.67;4.31;5.79;6.43;3.64;4.82;8.68;5.78;4.38;3.80;3.98;4.64;653.63;4.63;3.76;4.23;5.01;5.48;4.99;3.48;5.10;5.02;6.14;5.58
24;newtab-open-preload-no.half.TART;5.02;2.90;3.06;3.03;2.94;2.94;3.08;3.12;3.60;3.19;2.82;2.96;3.67;7.85;2.79;3.12;3.18;2.92;2.86;2.96;2.96;3.00;2.90;2.97;2.94
25;newtab-open-preload-no.all.TART;7.11;4.66;5.03;4.68;4.50;4.58;4.76;4.76;5.67;4.96;4.36;4.51;5.21;11.16;4.38;4.69;4.77;4.45;4.45;4.70;4.51;4.61;4.54;4.69;4.60
26;newtab-open-preload-no.error.TART;40.82;40.85;37.38;37.40;36.30;36.47;36.89;37.63;37.12;38.65;36.73;36.95;36.11;38.59;37.39;37.77;37.93;37.54;37.46;38.29;36.58;38.25;38.32;37.92;36.93
27;newtab-open-preload-yes.half.TART;3.14;2.96;2.97;8.37;2.98;3.00;2.96;3.05;3.12;3.48;3.07;3.23;3.05;2.88;2.92;3.06;2.90;3.01;3.19;2.90;3.18;3.11;3.04;3.16;3.21
28;newtab-open-preload-yes.all.TART;5.10;4.60;4.63;8.94;5.01;4.69;4.63;4.67;4.93;5.43;4.78;5.12;4.77;4.65;4.50;4.78;4.75;4.63;4.76;4.45;4.86;4.88;4.69;4.86;4.92
29;newtab-open-preload-yes.error.TART;35.90;37.24;38.57;40.60;36.04;38.12;38.78;36.73;36.91;36.69;38.12;36.69;37.79;35.80;36.11;38.01;36.59;38.85;37.14;37.30;38.02;38.95;37.64;37.86;36.43

Possible regression causes

  • CART
    • Regressions might be reported for patches that add new buttons to the menu panel by default, since this can add new rows to the panel and make it scrollable, which increases the amount of work to do during the transition. See bug 1230671 for details.

DAMP

  • contact: :bgrins, :jmaher
  • source: damp
  • type: PageLoader
  • measuring: Developer Tools toolbox startup, shutdown, and reload performance
  • reporting: intervals in ms (lower is better) - see below for details
  • data: there are 36 reported subtests from DAMP which we load 25 times, resulting in 36 sets of 25 data points.
  • summarization:

To run this locally, you'll need to pull down the tp5 page set and run it in a local web server. See the tp5 section or contact jmaher for more information.

Example data
0;simple.webconsole.open.DAMP;1198.86;354.38;314.44;337.32;344.73;339.05;345.55;358.37;314.89;353.73;324.02;339.45;304.63;335.50;316.69;341.05;353.45;353.73;342.28;344.63;357.62;375.18;326.08;363.10;357.30
1;simple.webconsole.reload.DAMP;44.60;41.21;25.62;29.85;38.10;42.29;38.25;40.14;26.95;39.24;40.32;34.67;34.64;44.88;32.51;42.09;28.04;43.05;40.62;36.56;42.44;44.11;38.69;29.10;42.00
2;simple.webconsole.close.DAMP;27.26;26.97;25.45;27.82;25.98;26.05;38.00;26.89;24.90;26.61;24.90;27.22;26.95;25.18;24.24;25.60;28.91;26.90;25.57;26.04;26.79;27.33;25.76;26.47;27.43
3;simple.inspector.open.DAMP;507.80;442.03;424.93;444.62;412.94;451.18;441.39;435.83;441.27;460.69;440.93;413.13;418.73;443.41;413.93;447.34;434.69;459.24;453.60;412.58;445.41;466.34;441.89;417.59;428.82
4;simple.inspector.reload.DAMP;169.45;165.11;163.93;181.12;167.86;164.67;170.34;173.12;165.24;180.59;176.72;187.42;170.14;190.35;176.59;155.00;151.66;174.40;169.46;163.85;190.93;217.00;186.25;181.31;161.13
5;simple.inspector.close.DAMP;44.40;42.28;42.71;47.21;41.74;41.24;42.94;43.73;48.24;43.04;48.61;42.49;45.93;41.36;43.83;42.43;41.81;43.93;41.38;40.98;49.76;50.86;43.49;48.99;44.02
6;simple.jsdebugger.open.DAMP;642.59;464.02;540.62;445.46;471.09;466.57;466.70;511.91;424.12;480.70;448.37;477.51;488.99;437.97;442.32;459.03;421.54;467.99;472.78;440.27;431.47;454.76;436.86;453.61;485.59
7;simple.jsdebugger.reload.DAMP;51.65;55.46;225.46;53.32;58.78;53.23;54.39;51.59;55.46;48.03;50.70;46.34;230.94;53.71;54.23;53.01;61.03;51.23;51.45;293.01;56.93;51.44;59.85;63.35;57.44
8;simple.jsdebugger.close.DAMP;29.12;30.76;40.34;32.09;31.26;32.30;33.95;31.89;29.68;31.39;32.09;30.36;44.63;32.33;30.16;32.43;30.89;30.85;31.99;49.86;30.94;44.63;32.54;29.79;33.15
9;simple.styleeditor.open.DAMP;758.54;896.93;821.17;1026.24;887.14;867.39;927.86;962.80;740.40;919.39;741.01;925.21;807.39;1051.47;729.04;1095.78;755.03;888.70;900.52;810.30;1090.09;869.72;737.44;893.16;927.72
10;simple.styleeditor.reload.DAMP;57.32;178.13;59.23;60.82;71.45;78.86;74.35;60.11;66.43;77.41;61.96;69.22;65.97;45.53;67.88;74.76;124.61;60.01;36.66;59.24;65.01;165.68;34.61;69.02;71.42
11;simple.styleeditor.close.DAMP;28.28;56.50;36.18;30.00;36.32;34.85;35.33;36.24;25.45;36.72;26.53;36.90;28.88;30.94;26.56;31.34;47.79;30.90;30.52;27.95;30.75;56.28;26.76;30.25;37.42
12;simple.performance.open.DAMP;444.28;357.87;331.17;335.16;585.71;402.99;504.58;466.95;272.98;427.54;345.60;441.53;319.99;327.91;312.94;349.79;399.51;465.60;418.42;295.14;362.06;363.11;445.71;634.96;500.83
13;simple.performance.reload.DAMP;38.07;33.44;35.99;70.57;64.04;106.47;148.31;29.60;68.47;28.95;148.46;75.92;32.15;93.72;36.17;44.13;75.11;154.76;98.28;75.16;29.39;36.68;113.16;64.05;135.60
14;simple.performance.close.DAMP;23.98;25.49;24.19;24.61;27.56;40.33;33.85;25.13;22.62;25.28;41.84;25.09;26.39;25.20;23.76;25.44;25.92;30.40;40.77;25.41;24.57;26.15;43.65;28.54;30.16
15;simple.netmonitor.open.DAMP;438.85;350.64;318.04;329.12;341.91;352.33;344.05;334.15;514.57;327.95;471.50;334.55;344.94;364.39;727.56;374.48;339.45;344.31;345.61;329.78;325.74;334.74;350.36;342.85;344.64
16;simple.netmonitor.reload.DAMP;59.68;47.50;69.37;61.18;76.89;83.22;68.11;81.24;56.15;68.20;32.41;81.22;81.62;44.30;39.52;29.60;86.07;71.18;76.32;79.93;79.63;82.15;83.58;87.04;82.97
17;simple.netmonitor.close.DAMP;38.42;39.32;52.56;43.37;48.08;40.62;51.12;41.11;59.54;43.29;41.72;40.85;51.61;49.61;51.39;44.91;40.36;41.10;45.43;42.15;42.63;40.69;41.21;44.04;41.95
18;complicated.webconsole.open.DAMP;589.97;505.93;480.71;530.93;460.60;479.63;485.33;489.08;605.28;457.12;463.95;493.28;680.05;478.72;504.47;578.69;488.66;485.34;504.94;460.67;548.38;474.98;470.33;471.34;464.58
19;complicated.webconsole.reload.DAMP;2707.20;2700.17;2596.02;2728.09;2905.51;2716.65;2657.68;2707.74;2567.86;2726.36;2650.92;2839.14;2620.34;2718.36;2595.22;2686.28;2703.48;2609.75;2686.41;2577.93;2634.47;2745.56;2655.89;2540.09;2649.18
20;complicated.webconsole.close.DAMP;623.56;570.80;636.63;502.49;565.83;537.93;525.46;565.78;532.90;562.66;525.42;490.88;611.99;486.45;528.60;505.35;480.55;500.75;532.75;480.91;488.69;548.77;535.31;477.92;519.84
21;complicated.inspector.open.DAMP;1233.26;753.57;742.74;953.11;653.29;692.66;653.75;767.02;840.68;707.56;713.95;685.79;690.21;1020.47;685.67;721.69;1063.72;695.55;702.15;760.91;853.14;660.12;729.16;1044.86;724.34
22;complicated.inspector.reload.DAMP;2384.90;2436.35;2356.11;2436.58;2372.96;2558.86;2543.76;2351.03;2411.95;2358.04;2413.27;2339.85;2373.11;2338.94;2418.88;2360.87;2349.09;2498.96;2577.73;2445.07;2354.88;2424.90;2696.10;2362.39;2493.29
23;complicated.inspector.close.DAMP;541.96;509.38;476.91;456.48;545.48;634.04;603.10;488.09;599.20;480.45;617.93;420.39;514.92;439.99;727.41;469.04;458.59;539.74;611.55;725.03;473.36;484.60;481.46;458.93;554.76
24;complicated.jsdebugger.open.DAMP;644.97;578.41;542.23;595.94;704.80;603.08;689.18;552.99;597.23;584.17;682.14;758.16;791.71;738.43;640.30;809.26;704.85;601.32;696.10;683.44;796.34;657.25;631.89;739.96;641.82
25;complicated.jsdebugger.reload.DAMP;2676.82;2650.84;2687.78;2401.23;3421.32;2450.91;2464.13;2286.40;2399.40;2415.97;2481.48;2827.69;2652.03;2554.63;2631.36;2443.83;2564.73;2466.22;2597.57;2552.73;2539.42;2481.21;2319.50;2539.00;2576.43
26;complicated.jsdebugger.close.DAMP;795.68;616.48;598.88;536.77;435.02;635.61;558.67;841.64;613.48;886.60;581.38;580.96;571.40;605.34;671.00;882.02;619.01;579.63;643.05;656.78;699.64;928.99;549.76;560.96;676.32
27;complicated.styleeditor.open.DAMP;2327.30;2494.19;2190.29;2205.60;2198.11;2509.01;2189.79;2532.05;2178.03;2207.75;2224.96;2665.84;2294.40;2645.44;2661.41;2364.60;2395.36;2582.72;2872.03;2679.29;2561.24;2330.11;2580.16;2510.36;2860.83
28;complicated.styleeditor.reload.DAMP;2218.46;2335.18;2284.20;2345.05;2286.98;2453.47;2506.97;2661.19;2529.51;2289.78;2564.15;2608.24;2270.77;2362.17;2287.31;2300.19;2331.56;2300.86;2239.27;2231.33;2476.14;2286.28;2583.24;2540.29;2259.67
29;complicated.styleeditor.close.DAMP;302.67;343.10;313.15;305.60;317.92;328.44;350.70;370.12;339.77;308.72;312.71;320.63;305.52;316.69;324.92;306.60;313.65;312.17;326.26;321.45;334.56;307.38;312.95;350.94;339.36
30;complicated.performance.open.DAMP;477.99;537.96;564.85;515.05;502.03;515.58;492.80;689.06;448.76;588.91;509.76;485.39;548.17;479.14;638.67;535.86;541.61;611.52;554.72;665.37;694.04;470.60;746.16;547.85;700.02
31;complicated.performance.reload.DAMP;2258.31;2345.74;2509.24;2579.71;2367.94;2365.94;2260.86;2324.23;2579.01;2412.63;2540.38;2069.80;2534.91;2443.48;2193.01;2442.99;2422.42;2475.35;2076.48;2092.95;2444.53;2353.86;2154.28;2354.61;2104.82
32;complicated.performance.close.DAMP;334.44;516.66;432.49;341.29;309.30;365.20;332.16;311.42;370.81;301.81;381.13;299.39;317.60;314.10;372.44;314.76;306.24;349.85;382.08;352.53;309.40;298.44;314.10;315.44;405.22
33;complicated.netmonitor.open.DAMP;469.70;597.87;468.36;823.09;696.39;477.19;487.78;495.92;587.89;471.48;555.02;507.45;883.33;522.15;756.86;713.64;593.82;715.13;477.15;717.85;586.79;556.97;631.43;629.55;581.16
34;complicated.netmonitor.reload.DAMP;4033.55;3577.36;3655.61;3721.24;3874.29;3977.92;3778.62;3825.60;3984.65;3707.91;3985.24;3565.21;3702.40;3956.70;3627.14;3916.11;3929.11;3934.06;3590.60;3628.39;3618.84;3579.52;3953.04;3781.01;3682.69
35;complicated.netmonitor.close.DAMP;1042.98;920.21;928.19;940.38;950.25;1043.61;1078.16;1077.38;1132.91;1095.05;1176.31;1256.83;1143.14;1234.61;1248.97;1242.29;1378.63;1312.74;1371.48;1373.15;1544.55;1422.51;1549.48;1616.55;1506.58

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

sessionrestore/sessionrestore_no_auto_restore/sessionrestore_many_windows

Three tests measure the time spent reading and restoring the session from a valid sessionstore.js. Time starts ticking during event sessionRestoreInit and stops during event sessionRestored.

In sessionrestore, this is tested with a configuration that requires the session to be restored. In sessionrestore_no_auto_restore, this is tested with a configuration that requires the session to not be restored. Both of the above tests use a sessionstore.js file that contains one window and roughly 89 tabs. In sessionrestore_many_windows, this is tested with a sessionstore.js that contains 3 windows and 130 tabs. The first window contains 50 tabs; the remaining 80 tabs are divided equally between the second and the third window.

Example data
[2362.0, 2147.0, 2171.0, 2134.0, 2116.0, 2145.0, 2141.0, 2141.0, 2136.0, 2080.0]

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tps

Example data
0;amazon.com/www.amazon.com/Kindle-Wireless-Reader-Wifi-Graphite/dp/B002Y27P3M/507846.html;66.34;54.15;53.08;55.79;49.12
1;cgi.ebay.com/cgi.ebay.com/ALL-NEW-KINDLE-3-eBOOK-WIRELESS-READING-DEVICE-W-WIFI-/130496077314@pt=LH_DefaultDomain_0&hash=item1e622c1e02.html;50.85;46.57;39.51;36.71;36.47
2;163.com/www.163.com/index.html;95.05;80.80;76.09;69.29;68.96
3;mail.ru/mail.ru/index.html;66.21;52.04;56.33;55.11;45.61
4;bbc.co.uk/www.bbc.co.uk/news/index.html;35.80;44.59;48.02;45.71;42.58
5;store.apple.com/store.apple.com/us@mco=Nzc1MjMwNA.html;52.98;49.45;58.47;56.79;61.29
6;imdb.com/www.imdb.com/title/tt1099212/index.html;46.51;55.12;46.22;50.60;47.63
7;cnn.com/www.cnn.com/index.html;43.02;50.77;43.88;49.70;50.02
8;sohu.com/www.sohu.com/index.html;74.03;62.89;63.30;67.71;89.35
9;youku.com/www.youku.com/index.html;43.98;52.69;45.80;63.00;57.02
10;ifeng.com/ifeng.com/index.html;88.01;87.54;104.47;94.93;113.91
11;tudou.com/www.tudou.com/index.html;45.32;48.10;46.03;39.26;58.38
12;chemistry.about.com/chemistry.about.com/index.html;38.24;37.07;39.59;39.48;39.60
13;beatonna.livejournal.com/beatonna.livejournal.com/index.html;35.59;50.75;36.17;48.49;52.61
14;rakuten.co.jp/www.rakuten.co.jp/index.html;90.28;71.95;62.66;60.33;67.76
15;uol.com.br/www.uol.com.br/index.html;42.89;48.05;53.77;40.02;42.41
16;thepiratebay.org/thepiratebay.org/top/201.html;40.46;42.46;47.63;57.66;45.49
17;page.renren.com/page.renren.com/index.html;47.61;66.78;47.91;62.78;47.19
18;chinaz.com/chinaz.com/index.html;50.34;58.17;118.43;55.47;63.80
19;globo.com/www.globo.com/index.html;41.34;38.52;42.82;53.14;45.20
20;spiegel.de/www.spiegel.de/index.html;33.60;34.34;36.25;36.18;47.04
21;dailymotion.com/www.dailymotion.com/us.html;37.68;36.13;39.52;37.15;42.79
22;goo.ne.jp/goo.ne.jp/index.html;50.74;47.30;63.04;58.41;58.96
23;stackoverflow.com/stackoverflow.com/questions/184618/what-is-the-best-comment-in-source-code-you-have-ever-encountered.html;44.66;44.40;43.39;47.38;57.65
24;ezinearticles.com/ezinearticles.com/index.html@Migraine-Ocular---The-Eye-Migraines&id=4684133.html;37.38;45.01;40.29;36.26;39.28
25;huffingtonpost.com/www.huffingtonpost.com/index.html;39.57;43.35;55.01;44.13;58.28
26;media.photobucket.com/media.photobucket.com/image/funny%20gif/findstuff22/Best%20Images/Funny/funny-gif1.jpg@o=1.html;39.77;42.46;75.54;42.38;47.72
27;imgur.com/imgur.com/gallery/index.html;34.72;37.37;46.74;40.93;37.08
28;reddit.com/www.reddit.com/index.html;42.47;39.89;51.54;51.51;41.68
29;noimpactman.typepad.com/noimpactman.typepad.com/index.html;54.28;47.40;52.38;52.15;50.97
30;myspace.com/www.myspace.com/albumart.html;48.97;64.12;61.66;48.32;68.53
31;mashable.com/mashable.com/index.html;36.76;40.95;35.30;53.86;42.76
32;dailymail.co.uk/www.dailymail.co.uk/ushome/index.html;42.06;40.64;44.24;37.32;61.35
33;whois.domaintools.com/whois.domaintools.com/mozilla.com.html;34.73;35.23;39.25;48.24;35.72
34;indiatimes.com/www.indiatimes.com/index.html;52.67;55.51;46.29;52.69;58.82
35;reuters.com/www.reuters.com/index.html;32.92;33.08;36.95;39.23;39.27
36;xinhuanet.com/xinhuanet.com/index.html;125.85;102.56;138.89;130.34;147.45
37;56.com/www.56.com/index.html;63.89;75.00;61.45;62.20;58.67
38;bild.de/www.bild.de/index.html;35.61;43.74;34.79;33.45;31.83
39;guardian.co.uk/www.guardian.co.uk/index.html;38.91;55.93;62.34;42.63;45.99
40;naver.com/www.naver.com/index.html;78.10;89.07;127.67;75.18;109.32
41;yelp.com/www.yelp.com/biz/alexanders-steakhouse-cupertino.html;42.54;46.92;39.19;49.82;50.43
42;wsj.com/online.wsj.com/home-page.html;46.43;55.51;44.16;81.79;48.78
43;google.com/www.google.com/search@q=mozilla.html;35.62;36.71;44.47;45.00;40.22
44;xunlei.com/xunlei.com/index.html;67.57;60.69;83.83;85.53;85.08
45;aljazeera.net/aljazeera.net/portal.html;65.03;51.84;73.29;64.77;69.70
46;w3.org/www.w3.org/standards/webdesign/htmlcss.html;53.57;58.50;72.98;66.95;55.62
47;homeway.com.cn/www.hexun.com/index.html;105.59;117.32;108.95;116.10;70.32
48;youtube.com/www.youtube.com/music.html;40.53;41.48;59.67;40.81;40.07
49;people.com.cn/people.com.cn/index.html;96.49;103.64;115.12;66.05;117.84

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Microbench

  • contact: :benwa
  • source: [MozGTestBench.cpp]
  • type: Custom GTest micro-benchmarking
  • data: Time taken for a GTest function to execute
  • summarization: Not a Talos test. This suite provides a way to add low-level platform performance regression tests for things that are not suited to being tested by Talos. See the [Microbench Sheriffing Policy] for some notes on how to treat regressions.

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tabpaint

  • contact: :mconley
  • source: tabpaint
  • bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1253382
  • type: Pageloader
  • measuring:
    • The time it takes to paint the content of a newly opened tab when the tab is opened from the parent (ex: by hitting Ctrl-T)
    • The time it takes to paint the content of a newly opened tab when the tab is opened from content (ex: by clicking on a target="_blank" link)
  • NOT measuring:
    • The time it takes to animate the tabs. That's the responsibility of the TART test. tabpaint is strictly concerned with the painting of the web content.
  • data: we load the tabpaint trigger page 20 times, each run produces two values (the time it takes to paint content when opened from the parent, and the time it takes to paint content when opened from content), resulting in 2 sets of 20 data points.
Example data
0;tabpaint-from-parent;105;76;66;64;64;69;65;63;70;68;64;60;65;63;54;61;64;67;61;64
1; tabpaint-from-content;129;68;72;72;70;78;86;85;82;79;120;92;76;80;74;82;76;89;77;85
  • summarization:
    • subtest: ignore first data point, then take the median of the remaining 19 data points
    • suite: geometric_mean(subtests)
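A minimal sketch of that summarization (illustrative Python, not the actual Talos filter code):

import statistics
from functools import reduce

def tabpaint_subtest(replicates):
    # Ignore the first data point, then take the median of the remaining 19.
    return statistics.median(replicates[1:])

def tabpaint_suite(subtest_medians):
    # Suite value: geometric mean of the subtest medians.
    product = reduce(lambda a, b: a * b, subtest_medians, 1.0)
    return product ** (1.0 / len(subtest_medians))

# e.g. tabpaint_suite([tabpaint_subtest(from_parent), tabpaint_subtest(from_content)])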

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

Other data

These are not part of the Talos code, but like Talos they are benchmarks that record data using the graphserver and are analyzed by the same scripts for regressions.

Number of Constructors (num_ctors)

This test runs at build time and measures the number of static initializers in the compiled code. Reducing this number is helpful for startup optimizations.

Possible regression causes
  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

bloom_basic

  • contact: :bholley, :rwood, :jmaher
  • source: perf-reftest
  • type: PageLoader
  • reporting: intervals in ms (lower is better)
  • data: each test loads 25 times
  • summarization:

The bloom_basic test (perf-reftest suite) is a unique talos suite that loads two different test pages: a 'base' page (bloom_basic) and a 'reference' page (bloom_basic_ref), and then compares the two page load times against each other to determine the variance.

Talos runs each of the two pages as if they are stand-alone tests, and then calculates and reports the variance; the test output 'replicates' reported from bloom_basic are actually the comparisons between the 'base' and 'reference' pages for each page load cycle.

If you wish to see the individual 'base' and 'reference' page results instead of just the reported difference, the 'base_replicates' and 'ref_replicates' can be found in the PERFHERDER_DATA log file output, and in the 'local.json' talos output file when running talos locally. In production, both of the page replicates are also archived in the perfherder-data.json file. The perfherder-data.json file is archived after each run in production, and can be found on the Treeherder Job Details tab when the perf-reftest job symbol is selected.

Bloom_basic ensures that we correctly fast-reject ancestor selectors that clearly don't match any ancestor elements. If the bloom filter isn't working correctly, the 'test' page will take orders of magnitude longer. The 'ref' page will take somewhat longer, but not nearly as much.

Example data
"replicates": [1.185, 1.69, 1.22, 0.36, 11.26, 3.835, 3.315, 1.355, 3.185, 2.485, 2.2, 1.01, 0.9, 1.22, 1.9,
0.285, 1.52, 0.31, 2.58, 0.725, 2.31, 2.67, 3.295, 1.57, 0.3], "value": 1.7349999999999999, "unit": "ms",

"base_replicates": [166.94000000000003, 165.16, 165.64000000000001, 165.04000000000002, 167.355, 165.175,
165.325, 165.11, 164.175, 164.78, 165.555, 165.885, 166.83499999999998, 165.76500000000001, 164.375, 166.825,
167.13, 166.425, 169.22500000000002, 164.955, 165.335, 164.45000000000002, 164.85500000000002, 165.005, 166.035]}],

"ref_replicates": [165.755, 166.85000000000002, 166.85999999999999, 165.4, 178.615, 169.01, 168.64, 166.465,
167.36, 167.265, 167.75500000000002, 166.895, 167.735, 166.985, 166.275, 166.54000000000002, 165.61, 166.115,
166.64499999999998, 165.68, 167.64499999999998, 167.12, 168.15, 166.575, 166.335], 
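The reported 'replicates' appear to be per-cycle absolute differences between the 'base' and 'reference' page load times; for example, |166.94 - 165.755| is roughly 1.185, which matches the first value in the replicates list above. A hedged Python sketch of that comparison, assuming the per-cycle absolute difference is what gets reported:

def perf_reftest_replicates(base_replicates, ref_replicates):
    # Assumption: the reported replicate for each page-load cycle is the absolute
    # difference between the base and reference page load times for that cycle.
    return [abs(base - ref) for base, ref in zip(base_replicates, ref_replicates)]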

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

bloom_basic_singleton

The bloom_basic_singleton test (perf-reftest-singletons suite) runs the 'base' page (bloom_basic) test individually, and reports the values for that single test page alone, NOT the comparison of two different pages.

Bloom_basic ensures that we correctly fast-reject ancestor selectors that clearly don't match any ancestor elements. If the bloom filter isn't working correctly, the 'test' page will take orders of magnitude longer. The 'ref' page will take somewhat longer, but not nearly as much.

Example data
bloombasic.html;88.34000000000003;88.66499999999999;94.815;92.60500000000002;95.30000000000001;
98.80000000000001;91.975;87.73500000000001;86.925;86.965;93.00500000000001;98.93;87.45000000000002;
87.14500000000001;92.78500000000001;86.96499999999999;98.32000000000001;97.485;90.67000000000002;
86.72500000000001;95.665;100.67;101.095;94.32;91.87

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

tp6

  • contact: :rwood, :jmaher, :bsmedberg
  • source: tp6
  • type: PageLoader
  • reporting: intervals in ms (lower is better)
  • data: we load each of the tp6 pages 25 times
  • summarization:
    • subtest: ignore first five data points, then take the median of the remaining 20; source: test.py
    • suite: identical to subtest; each tp6_* test is reported individually (they're NOT aggregated)
  • NOTE: Currently only supported on Windows

The talos tp6 (also known as 'quantum-pageload') test suite loads various pages and reports their load times. Each test page/URL is reported individually. The result value (ms) is the 'time to first non-blank paint'.

Instead of loading the page sets from a local file, a tool called mitmproxy is used to play back page archives via an HTTPS proxy. When the tp6 suite is run via ./mach talos-test, mozharness automatically downloads and installs the mitmproxy tool and pageset archives.

The mitmproxy tool is started, the proxy is configured in Firefox, and the mitmproxy HTTPS security certificate is loaded into Firefox, all automatically. The pagesets are made available for playback via the proxy, at which point each of the tp6 test pages is ready for load and measurement.

The tp6 suite currently consists of these individual pageload tests:

Talos test name   Description
tp6_amazon        Measures page load of an amazon search result for the keyword "laptop"
tp6_facebook      Measures page load of a facebook timeline (post-login)
tp6_google        Measures page load of a google search result for the keywords "barack obama"
tp6_youtube       Measures page load of the youtube home page
Example data for tp6_youtube
https://www.youtube.co/;409;441;354;337;326;375;353;650;354;712;667;336;791;497;342;355;611;334;
692;320;714;434;582;610;333
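A minimal sketch of the per-page summarization described above (illustrative Python, not the actual Talos filter code):

import statistics

def tp6_page_summary(replicates):
    # Ignore the first five page loads, then take the median of the remaining 20.
    return statistics.median(replicates[5:])

# Applied to the tp6_youtube example data above, this drops the first five loads
# and reports the median of the remaining replicates.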

Possible regression causes

Some examples of things that cause regressions in this test are:

  • Please update this when regressions are found

Stylo gtest microbenchmarks

  • contact: :bholley, :SimonSapin
  • source: [2]
  • type: Microbench
  • reporting: intervals in ms (lower is better)
  • data: each test is run and measured 5 times
  • summarization: take the median of the 5 data points; source: MozGTestBench.cpp


Servo_StyleSheet_FromUTF8Bytes_Bench parses a sample stylesheet 20 times with Stylo’s CSS parser that is written in Rust. It starts from an in-memory UTF-8 string, so that I/O or UTF-16-to-UTF-8 conversion is not measured.

Gecko_nsCSSParser_ParseSheet_Bench does the same with Gecko’s previous CSS parser that is written in C++, for comparison.

Servo_DeclarationBlock_SetPropertyById_Bench parses the string "10px" with Stylo’s CSS parser and sets it as the value of a property in a declaration block, a million times. This is similar to animations that are based on JavaScript code modifying Element.style instead of using CSS @keyframes.

Servo_DeclarationBlock_SetPropertyById_WithInitialSpace_Bench is the same, but with the string " 10px" with an initial space. That initial space is less typical of JS animations, but is almost always there in stylesheets or full declarations like "width: 10px". This microbenchmark was used to test the effect of some specific code changes. Regressions here may be acceptable if Servo_StyleSheet_FromUTF8Bytes_Bench is not affected.

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

IsASCII and IsUTF8 gtest microbenchmarks

Tests whose names start with PerfIsASCII test the performance of the XPCOM string IsASCII function with ASCII inputs of different lengths.

Tests whose names start with PerfIsUTF8 test the performance of the XPCOM string IsUTF8 function with ASCII inputs of different lengths.

Possible regression causes

  • The --enable-rust-simd build flag accidentally getting turned off in automation.
  • Changes to encoding_rs internals.
  • LLVM optimizations regressing between updates to the copy of LLVM included in the Rust compiler.

TestStandardURL

  • contact: :valentin
  • source:
  • type:
  • reporting:
  • data:
  • summarization:

TestStandardURL.Perf - Runs 10K iterations on the most important getters and setters of nsStandardURL

TestStandardURL.NormalizePerf - Runs 20K x 5 iterations on nsStandardURL::NormalizeIPv4 on inputs that are valid IPv4 strings

TestStandardURL.NormalizePerfFails - Runs 20K x 5 iterations on nsStandardURL::NormalizeIPv4 on input that is not valid IPv4

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.

PerfStrip Tests

PerfStripWhitespace - call StripWhitespace() on 5 different test cases 20k times (each)

PerfStripCharsWhitespace - call StripChars("\f\t\r\n") on 5 different test cases 20k times (each)

PerfStripCRLF - call StripCRLF() on 5 different test cases 20k times (each)

PerfStripCharsCRLF - call StripChars("\r\n") on 5 different test cases 20k times (each)

Possible regression causes

  • None listed yet. If you fix a regression for this test and have some tips to share, this is a good place for them.