StandaloneTalos

How to set up Talos for testing at home

First Steps

  1. Prerequisites
    1. Python 2.5
    2. PyYAML
    3. PyWin32 Extensions (if this is to be run on a Windows box)
  2. Get the Talos code
    • Check out talos:
      cd c:\
      cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d talos mozilla/testing/performance/talos
  3. Install the new pageloader (only for browsers not built with the --enable-tests flag)
    1. cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d scripts mozilla/tools/buildbot-configs/testing/talos/perfmaster/scripts/generate-tpcomponent.py
    2. Run it in the talos\page_load_test directory with
      python generate-tpcomponent.py
    3. You should see that chrome\ and components\ directories have been created; these must be present in your page_load_test directory for the tests to run
  4. Get the standalone directory
    1. standalone.zip
    2. Unzip it in the talos directory
  5. Set up your local web page set, as described in the next section (a consolidated command sketch of steps 1-4 follows this list)
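
Pulled together, steps 1-4 look roughly like the following command sketch. The Windows-style paths are the ones used above, and the copy of generate-tpcomponent.py assumes the checkout lands in a scripts\ subdirectory, so adjust if your layout differs:

  cd c:\
  cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d talos mozilla/testing/performance/talos
  cd c:\talos\page_load_test
  cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d scripts mozilla/tools/buildbot-configs/testing/talos/perfmaster/scripts/generate-tpcomponent.py
  rem copy the script out of the scripts\ checkout directory (assumption about the checkout layout)
  copy scripts\generate-tpcomponent.py .
  python generate-tpcomponent.py
  rem chrome\ and components\ should now exist under page_load_test
  cd c:\talos
  rem unzip standalone.zip here so that c:\talos\standalone exists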

How to generate your own page set for local testing

There are two ways to gather your own web page set for testing. The first uses a wget/bash script to collect and clean web pages; the second runs the page cycler through a proxied browser and then uses the proxy cache as the page set. Both have problems creating a repeatable, replayable page set, as described below.

Why can't Mozilla just give you a web page set for testing? The redistribution of copied web pages does not fall under fair use. It would certainly be nice to have everyone testing from the same set of web pages, but for now the best we can do is provide the tools to create your own test set.

Disclaimer

As a warning, be aware that the web page lists provided in the standalone directory (proxy_manifest.txt or wget_sitelist.txt) will result in the collection of a lot of pornographic pages. When creating these lists, the goal was simply to take the Alexa top 500 for a given day. If you are uncomfortable viewing this material, feel free to create your own list of sites, as long as the format matches the given files (a rough sketch follows below).
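
If you do build your own list, the files shipped in the standalone directory are the authoritative reference for the format. As a rough sketch only, assuming one site URL per line:

  http://www.example.com/
  http://www.example.org/
  http://www.example.net/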

Using wget/bash

  1. Prerequisites
    1. wget with CSS parsing support (the wget-1.10-css-parser build used below)
    2. Copy of getpages.sh
      • cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d grabber mozilla/testing/tools/grabber
    3. Apache
      • You'll need to serve the local pages from a local server
        • Why can't I just do file load? Isn't that simpler?
          • No. Loading from file takes a different path through rendering than loading pages over the web, so you can use file load, but you may not end up with the most meaningful load time results
  2. Grabbing the pages
    1. Run in a directory containing both getpages.sh and wget-1.10-css-parser
      • ./getpages.sh wget_sitelist.txt wget_manifest.txt
        • wget_sitelist.txt is in the standalone directory
      • a testpages directory will be created, it contains the web page collection
    2. Change the Apache DocumentRoot to the testpages directory (see the example httpd.conf excerpt after this list)
    3. In testpages/ you'll find the wget_manifest.txt file
      1. Edit wget_manifest.txt so that its URLs point at your localhost server
      2. Move the updated wget_manifest.txt file to your standalone directory
  3. Pitfalls
    1. Cleaning the web pages results in broken pages
      • You will have to run the page cycler over the page set to determine which pages have become broken; if the cycler stalls or is unable to continue, remove the offending page from the set
        • To do this, follow the directions for running the tests below; when a test stalls or fails, remove the page that caused the failure from the wget_manifest.txt file
      • This is tedious work, and for a large web page set it can take a long time to arrive at a series of pages that cycles consistently
    2. Pages won't look exactly as they do in the wild
      • The cleaning can leave web pages looking odd or incomplete, so you don't get pages exactly as they look when served live
  4. Advantages
    1. Initial collection of the page set is easy: set the getpages.sh script going and walk away
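
For the DocumentRoot change in step 2.2 above, a minimal httpd.conf excerpt for an Apache 2.2-era install looks like the following; the path is an example only, so point it at wherever your testpages directory actually lives:

  DocumentRoot "C:/testpages"
  <Directory "C:/testpages">
      Order allow,deny
      Allow from all
  </Directory>

After restarting Apache, the URLs in wget_manifest.txt should point at http://localhost/ so that the cycler loads your local copies rather than the live sites.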

Using the proxy server

  1. Prerequisites
    1. Copy of talos
    2. Copy of the standalone directory
    3. Copy of the proxy server
      • cvs -d:pserver:anonymous@cvs-mirror.mozilla.org:/cvsroot co -d proxyserver mozilla/testing/tools/proxyserver
      • copy proxyserver\proxyserver.py into the standalone directory
  2. From the talos directory
    1. Set up the proxy server
      python standalone\proxyserver.py -p 9001
    2. In standalone\proxy_setup.config, change the firefox: firefox/firefox.exe entry to a valid path to Firefox on your machine
    3. Run the tests once from the talos directory to fill the proxy cache (a consolidated sketch of this record step follows this list)
      python run_tests.py standalone\proxy_setup.config
  3. Pitfalls
    1. Not all pages in the set may be served successfully, which results in a stalled cycler. You'll have to remove the offending page from proxy_manifest.txt and give it another try
      • This is tedious work, and for a large web page set it can take a long time to arrive at a series of pages that cycles consistently
    2. Some pages are not served correctly from the cache. Again, these pages will have to be removed from the manifest
      • To do this, follow the steps for running the tests below; when a page stalls or fails, remove it from proxy_manifest.txt and start again
      • More tedious work after the set has been collected to ensure that it cycles correctly
  4. Advantages
    1. Pages are served exactly as they appear in the wild (no alterations during page set cleaning/localizing)
    2. No Apache dependency
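
Taken together, the record pass looks roughly like this: two shells, both run from the talos directory, using the same port and config file named above:

  rem shell 1: start the proxy server
  python standalone\proxyserver.py -p 9001

  rem shell 2: cycle the pages once so the proxy fills its cache
  python run_tests.py standalone\proxy_setup.config

Once a full cycle completes without stalls, the cached pages can be replayed with the -l flag as described under Start Testing below.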

Start Testing

  • If you are using a wget/bash generated page set...
    1. In standalone\wget_run.config, change the firefox: firefox/firefox.exe entry to a valid path to Firefox on your machine
    2. Run the tests from the talos directory
      python run_tests.py standalone\wget_run.config
    3. All the results are found in the talos\standalone\output directory
    4. Use your favorite spreadsheet program to examine the results
  • If you are using a proxy server cache...
    1. In standalone\proxy_run.config, change the firefox: firefox/firefox.exe entry to a valid path to Firefox on your machine
    2. Restart the proxy server so that it only serves pages from its cache
      python standalone\proxyserver.py -p 9001 -l
    3. Run the tests from the talos directory
      python run_tests.py standalone\proxy_run.config
    4. All the results are found in the talos\standalone\output directory
    5. Use your favorite spreadsheet program to examine the results (or see the Python sketch after this list for a quick command-line look)
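
If you would rather skim the output from the command line than open a spreadsheet, a minimal Python 2 sketch along these lines works. It assumes the files under talos\standalone\output are comma-separated text, which you should verify against the actual files before relying on it:

  import csv
  import glob
  import os

  # Run this from the talos directory; adjust the path if your layout differs.
  output_dir = os.path.join('standalone', 'output')

  for path in glob.glob(os.path.join(output_dir, '*')):
      print '==', path
      rows = list(csv.reader(open(path)))
      # Show only the first few rows of each file as a quick sanity check.
      for row in rows[:10]:
          print '   ' + '  '.join(row)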