== Static Analysis for Rooting Hazards ==
Treeherder can run two static analysis builds: the full browser (linux64-haz) and just the JS shell (linux64-shell-haz). They show up on Treeherder as '''H''' and '''SM(H)''', respectively.
=== Diagnosing a hazard failure === | |||
Click on the '''H''' build link, select the "Job details" pane on the bottom right, follow the "Inspect Task" link, and download the "public/build/hazards.txt.gz" file. | |||
As an example, suppose hazards.txt reports that decompilePC is called while a local variable ed holds an unrooted pointer, ed.obj, to a GC-controlled object. Because decompilePC can trigger a garbage collection, the following can happen:
* during the resulting garbage collection, the object pointed to by ed.obj is moved to a different location. All pointers stored in the JS heap are updated automatically, as are all rooted pointers. ed.obj is not, because the GC doesn't know about it.
* after decompilePC returns, something accesses ed.obj. This is now a stale pointer, and may refer to just about anything -- the wrong object, an invalid object, or whatever. Badness 10000, as TeX would say.
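For illustration, here is a minimal sketch of that pattern in code, together with the rooted version that avoids it. The function names (hazardous, safe, mightGC, use) are made up for this example -- the real decompilePC case differs in detail -- while JS_NewPlainObject and JS::Rooted are standard SpiderMonkey APIs.

<pre>
// Hypothetical sketch of the hazard pattern described above; not the real code.
#include "jsapi.h"

void mightGC(JSContext* cx);   // assume this call can trigger a garbage collection
void use(JSObject* obj);       // assume this dereferences the object afterwards

void hazardous(JSContext* cx) {
    JSObject* obj = JS_NewPlainObject(cx);   // bare, unrooted GC pointer
    mightGC(cx);   // a GC here may move the object; 'obj' is NOT updated
    use(obj);      // hazard: 'obj' may now be a stale pointer
}

void safe(JSContext* cx) {
    JS::Rooted<JSObject*> obj(cx, JS_NewPlainObject(cx));   // rooted pointer
    mightGC(cx);   // if the object moves, the rooted 'obj' is updated automatically
    use(obj);      // safe
}
</pre>

The second version is safe for the reason given above: rooted pointers are updated automatically when the GC moves the object, so the analysis does not flag them.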
=== Analysis implementation === | |||
These builds are performed as follows: | |||
* run the script testing/taskcluster/scripts/builder/build-haz-linux.sh, which sets up a build environment and runs the analysis within it, then uploads the resulting files | |||
** compile an optimized JS shell to later run the analysis | |||
** compile the browser with gcc, using a slightly modified version of the sixgill (http://svn.sixgill.org) gcc plugin, producing a set of .xdb files describing everything encountered during the compilation | |||
** analyze the .xdb files with scripts in js/src/devtools/rootAnalysis | |||
=== Running the analysis === | |||
==== Pushing to try ==== | |||
The easiest way to run an analysis is to push to try with the trychooser line |try: -b do -p linux64-haz| (or, if the hazards of interest are contained entirely within js/src, use |try: -b do -p linux64-shell-haz| for a much faster result). The expected turnaround time for linux64-haz is just under 2 hours. | |||
The output will be uploaded, and a link named "results" will be placed into the "job details" info pane on treeherder. If the analysis fails, you will see the number of failures; follow the "results" link and open the hazards.txt.gz file to see the details.
==== Running locally ==== | |||
To run the browser analysis, you must be on a Fedora/RedHat/CentOS linux64 machine. See js/src/devtools/rootAnalysis/README.md. | |||
If you are running Debian or Ubuntu, then there is currently a problem running the full browser analysis. You can coerce the shell-only build to work by doing something like: | |||
<pre>
sudo apt-get install autoconf2.13 libnspr4 libnspr4-dev
sudo ln -s autoconf2.13 /usr/bin/autoconf-2.13
export CFLAGS="-B/usr/lib/x86_64-linux-gnu -I/usr/include/x86_64-linux-gnu"
export CXXFLAGS="-B/usr/lib/x86_64-linux-gnu -I/usr/include/x86_64-linux-gnu"
</pre>
before running the script. | |||
=== So you broke the analysis by adding a hazard. Now what? ===