TestEngineering/Performance/Talos/Misc

== Hardware Profile of machines used in automation ==
{{note|Moved to [[../Platforms]]}}
 
=== 2018 - Firefox 60 and forward ===
 
==== linux64, win7x32, win10x64 ====
Note: We test win32 builds on win10x64 instead of win7
 
HPE Moonshot 1500 System (45 cartridges per 4.3U)
1500W Hot Plug redundant (1+1) Power Supply
45 m710x ProLiant cartridges
1 Intel E3-1585Lv5 3.0GHz CPU
8GB DDR4 2400MHz RAM
1 256GB PCIe M.2 2280 SSD
1 64GB SATA M.2 2242 SSD
1 Intel Iris Pro Graphics P580
 
==== Mac Mini r7 model ====
 
Model Name: Mac mini
Model Identifier: Macmini7,1
Processor Name: Intel Core i7
Processor Speed: 3 GHz
Number of Processors: 1
Total Number of Cores: 2
GPU: Intel Iris Graphics 5100
L2 Cache (per Core): 256 KB
L3 Cache: 4 MB
Memory: 16 GB
Disk: SSD 251 GB (251,000,193,024 bytes)
 
=== Older IX hardware for versions prior to Firefox 60 ===
 
==== linux64, win7x32, win10x64 ====
iX21X4 2U Neutron "Gemini" Series Four Node Hot-Pluggable Server (4 nodes per 2U)
920W High-Efficiency redundant (1+1) Power Supply
1 Intel X3450 CPU per node
8GB Total: 2 x 4GB DDR3 1333MHz ECC/REG RAM per node
1 WD5003ABYX hard drive per node
1 NVIDIA GPU GeForce GT 610 per node
 
==== Mac Mini r5 model ====
Model Name: Mac mini
Model Identifier: Macmini5,3
Processor Name: Intel Core i7
Processor Speed: 2 GHz
Number of Processors: 1
Total Number of Cores: 4
L2 Cache (per Core): 256 KB
L3 Cache: 6 MB
Memory: 8 GB
Disk: 2 x 500G SATA drives (RAID)
 
Note: All Mac Minis have EDID devices attached that set the resolution to 1600x1200.


== Naming convention ==
't' is prepended to the names to represent 'test'. Thus, ts = 'test startup', tp = 'test pageload', tdhtml = 'test dhtml'.

== Adding a new test ==

{{todo|move to [[TestEngineering/Performance/Talos/Adding tests]]}}

Adding a new performance test or modifying an existing test is much easier than most people think. In general, you need to create a patch for talos which does the following:

  • Determine whether this is a startup test or a page load test and create the appropriate folder for your test in talos (see the layout sketch after this list)
  • Add all files and resources for the test to that folder (make sure there are no external network accesses)
  • File a bug in the testing:talos component with your patch
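
For orientation, here is a minimal sketch of the files such a patch typically adds, using a hypothetical test name (in mozilla-central the talos tests live under testing/talos/talos/tests/, but check the current tree for the exact layout):

 testing/talos/talos/tests/myscroll/     <- new folder for the hypothetical test "myscroll"
     myscroll.manifest                   <- list of pages to load (see the pageloader section below)
     myscroll.html                       <- the page(s) the test exercises; no external network access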

=== What we need to know about tests ===

When adding a new test, we really need to understand what we are doing. Here are some questions you should be able to answer before adding it:

  • What does this test measure?
  • Does this test overlap with any existing test?
  • What is the unit of measurement that we are recording?
  • What would constitute a regression?
  • What is the expected range in the results over time?
  • Are there variables or conditions which would affect this test?
    • browser configuration (prefs, environment variables)?
    • OS, resources, time of day, etc.?
  • Independent of observation? Will this test produce the same numbers regardless of what was run before it?
  • What considerations are there for how this test should be run and what tools are required?

Please document the answers to these questions on TestEngineering/Performance/Talos/Tests.

=== Making a Pageloader test work ===

A pageloader test is the most common kind. Using the pageloader extension means that most of the reporting and talos integration is done for you. Here is what you need to do:

  • Create a new directory in the tests subdirectory of talos
  • Create a manifest (see the manifest sketch after this list)
    • add a <testname>.manifest file in your directory (svg example)
      • a raw page load entry is just a single URL line (svg example)
      • an internal-measurement entry has a % before the URL (svg example)
  • Add your tests
  • Add your test definition to talos by adding a new class for your test in [test.py]. ([tscrollx example]; see the class sketch after this list)
    • Add an item for tpmanifest; ${talos} is replaced with the current running directory.
      • example: tpmanifest = '${talos}/tests/scroll/scroll.manifest'
    • Add an item for tpcycles (we recommend 1) and tppagecycles (we recommend 25 for a simple page, and 5 for an internal recording benchmark)
    • Add an item for filters. ([tcanvasmark example])
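
To make this concrete, here is a minimal sketch for a hypothetical test called "myscroll". Each line of the manifest is the URL of a page to load; the leading % marks it as an internal measurement, meaning the page reports its own numbers instead of relying on raw page load time:

 % http://localhost/tests/myscroll/myscroll.html

A matching class in test.py could look roughly like the sketch below. The decorator, base class, and filter calls follow the pattern used by existing pageloader tests, but treat them as assumptions and copy the exact API from a current test definition:

 # sketch of a new entry in testing/talos/talos/test.py; register_test,
 # PageloaderTest and filter are already available in that module
 @register_test()
 class myscroll(PageloaderTest):
     """Hypothetical pageloader test exercising myscroll.html."""
     tpmanifest = '${talos}/tests/myscroll/myscroll.manifest'  # ${talos} -> current running directory
     tpcycles = 1         # passes over the whole manifest
     tppagecycles = 25    # loads of each page (use ~5 for an internal recording benchmark)
     filters = filter.ignore_first.prepare(5) + filter.median.prepare()  # drop warm-up runs, report the median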

=== Making a Startup test work ===

A startup test launches the browser and loads a single page many times. This is useful for tests which shouldn't have any extensions loaded, which handle all loading and measuring themselves, or which need to measure browser startup itself. Here is what you need to do:

=== Steps to add a test to production ===

  • for aurora/central on desktop only: land your new test as an in-tree commit (on inbound/autoland) and make sure you adjust talos.json to run the test (see the sketch after this list)
  • Update talos
    • just land code in-tree (inbound/autoland)
  • Ensure the test follows the branch through the [release stages]
    • if we are just updating talos.json, this will be automatic
  • if this is an update to an existing test that could change the numbers, this needs to be treated as a new test and run side by side for a week to get a new baseline for the numbers.
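
For reference, enabling a test usually means adding its name to a suite in talos.json. The snippet below is only a rough sketch: the suite name and surrounding keys are illustrative assumptions, so copy the structure from the current in-tree talos.json rather than from here.

 {
   "suites": {
     "other-e10s": {
       "tests": ["a11yr", "ts_paint", "myscroll"]
     }
   }
 }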

While that is a laundry list of items to do, if you are the developer of a component, just talk to the a*team ([jmaher]) and they will handle the majority of the steps above.
