gecko-dev/servo/etc/ci/performance
README.md

Servo Page Load Time Test

Prerequisites

  • Python3

Basic Usage

./mach test-perf can be used to run a performance test on your Servo build. Each run saves a result JSON file to etc/ci/performance/output/. Run the test twice to produce two result files, then run python test_differ.py to compare them. Run python test_differ.py -h for instructions.

Setup for CI machine

CI for Servo

  • Set up your Treeherder client ID and secret as the environment variables TREEHERDER_CLIENT_ID and TREEHERDER_CLIENT_SECRET
  • Run ./mach test-perf --submit to run the tests and submit the results to Perfherder.

CI for Gecko

  • Install Firefox Nightly and make sure it is in your PATH
  • Download geckodriver and add it to your PATH (e.g. on Linux, export PATH=$PATH:path/to/geckodriver)
  • export FIREFOX_BIN=/path/to/firefox
  • pip install selenium
  • Run python gecko_driver.py to check the setup
  • Run test_all.sh --gecko --submit (omit --submit if you don't want to submit to Perfherder)

How it works

  • The test cases come from tp5; each test case runs 20 times, and we take the median load time.
  • Some tests make Servo hang forever; they are disabled for now. See https://github.com/servo/servo/issues/11087
  • Each test case is a subtest on Perfherder, and the summary time is the geometric mean of all the subtests.
  • Note that this test is different from the Talos TP5 test we run for Gecko, so you can NOT conclude that Servo is "faster" or "slower" than Gecko from it.
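The summarization above (take the median of the 20 runs for each page, then the geometric mean across the pages) can be sketched in a few lines of Python; the page names and timings below are made up for illustration:

```python
import statistics
from math import prod

def summarize(runs_by_page):
    """Median load time per page, then the geometric mean across pages."""
    medians = {page: statistics.median(times) for page, times in runs_by_page.items()}
    summary = prod(medians.values()) ** (1.0 / len(medians))
    return medians, summary

# Made-up timings (the real harness collects 20 runs per page).
medians, summary = summarize({
    "page_a.html": [120.0, 110.0, 130.0],
    "page_b.html": [480.0, 520.0, 500.0],
})
```

The geometric mean is used for the summary so that a regression on a fast page and one on a slow page weigh in proportionally, rather than being dominated by the slowest pages.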

Comparing the performance before and after a patch

  • Run the test once before you apply a patch, and once after you apply it.
  • python test_differ.py output/perf-<before time>.json output/perf-<after time>.json
  • Green lines mean the loading time decreased; blue lines mean the loading time increased.
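test_differ.py is the real tool for this comparison; the core idea can be sketched as below, assuming (hypothetically) each result file decodes to a list of records with "name" and "loadTime" keys, which is a simplification of the actual schema:

```python
def diff_results(before, after):
    """Map each test name to its change in load time (after - before);
    negative values mean the page got faster."""
    b = {r["name"]: r["loadTime"] for r in before}
    a = {r["name"]: r["loadTime"] for r in after}
    return {name: a[name] - t for name, t in b.items() if name in a}

# Illustrative records, not real output from the harness.
delta = diff_results(
    [{"name": "about:blank", "loadTime": 30}, {"name": "example.html", "loadTime": 120}],
    [{"name": "about:blank", "loadTime": 25}, {"name": "example.html", "loadTime": 100}],
)
```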

Add your own test

  • You can add two types of tests: sync tests and async tests
    • sync test: measures the page load time and exits automatically after the page has loaded.
    • async test: measures your custom time markers from JavaScript; see page_load_test/example/example_async.html for an example.
  • Add your test case (an HTML file) to the page_load_test/ folder. For example, we can create page_load_test/example/example.html
  • Add a manifest (or modify an existing one) named page_load_test/example.manifest
  • Add lines like these to the manifest:
# Pages are served by a local server at localhost:8000
# A test case without any flag is a sync test
http://localhost:8000/page_load_test/example/example_sync.html
# An async test must start with an `async` flag
async http://localhost:8000/page_load_test/example/example.html
  • Modify the MANIFEST=... line in test_all.sh to point to the new manifest file.
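For reference, the manifest format above (lines starting with # are comments, and an optional async flag precedes the URL) could be parsed roughly like this; this is an illustrative sketch, not the actual parser used by the harness:

```python
def parse_manifest(text):
    """Return (url, is_async) pairs, skipping comments and blank lines."""
    tests = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank line or comment
        if line.startswith("async "):
            tests.append((line[len("async "):].strip(), True))
        else:
            tests.append((line, False))
    return tests

tests = parse_manifest("""
# A test case without any flag is a sync test
http://localhost:8000/page_load_test/example/example_sync.html
async http://localhost:8000/page_load_test/example/example.html
""")
```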

Unit tests

You can run all the unit tests (including those of third-party libraries) with python -m pytest.

Individual tests can be run with python -m pytest <filename>:

  • test_runner.py
  • test_submit_to_perfherder.py

Advanced Usage

Reducing variance

Running the same performance test repeatedly produces a lot of variance, caused by the OS the test is running on. Experimentally, the things that seem to tame the randomness most are a) disabling CPU frequency changes, b) increasing the priority of the tests, c) running on one CPU core, d) loading files directly rather than via localhost HTTP, and e) serving files from memory rather than from disk.

First run the performance tests normally (this downloads the test suite):

./mach test-perf

Disable CPU frequency changes, e.g. on Linux:

sudo cpupower frequency-set --min 3.5GHz --max 3.5GHz

Copy the test files to a tmpfs directory, such as /run/user/; for example, if your uid is 1000:

cp -r etc/ci/performance /run/user/1000

Then run the test suite on one core, at high priority, using a file:// base URL:

sudo nice --20 chrt -r 99 sudo -u *userid* taskset 1 ./mach test-perf --base file:///run/user/1000/performance/

These fixes seem to bring the variance down to under 5% for individual tests, and under 0.5% for the total.

(IRC logs: 2017-11-09 | 2017-11-10 )

Test Perfherder Locally

If you want to test the data submission code in submit_to_perfherder.py without getting credentials for the production server, you can set up a local Treeherder VM. If you don't need to test submit_to_perfherder.py, you can skip this step.

  • Add 192.168.33.10 local.treeherder.mozilla.org to /etc/hosts
  • git clone https://github.com/mozilla/treeherder; cd treeherder
  • vagrant up
  • vagrant ssh
    • ./bin/run_gunicorn
  • Outside the VM, open http://local.treeherder.mozilla.org and log in to create an account
  • vagrant ssh
    • ./manage.py create_credentials <username> <email> "description"; the email has to match the user you logged in as. Remember to log in through the web UI once before you run this.
    • Set up your Treeherder client ID and secret as the environment variables TREEHERDER_CLIENT_ID and TREEHERDER_CLIENT_SECRET

Troubleshooting

If you see this error message:

venv/bin/activate: line 8: _OLD_VIRTUAL_PATH: unbound variable

This means your virtualenv is too old. Try running pip install -U virtualenv to upgrade. (If you installed Ubuntu's python-virtualenv package, uninstall it first, then install virtualenv through pip.)