Bug 1765457 - Enable custom browsertime testing through raptor. r=perftest-reviewers,kshampur

This patch builds on the ability to specify custom browsertime arguments on the raptor command line to let the user run custom tests in "vanilla" browsertime. In this patch, we create a new test called `browsertime` that can be used to get a "dummy" template for a raptor test. The user is then responsible for providing either a test script or a URL to test.

We can pass something like `--browsertime-arg test_script=/path/to/script` to use a custom script, or `--browsertime-arg url=https://www.sitespeed.io` to set the URL to test. Furthermore, we can also use `test_script=pageload` to specify that we want to use the browsertime pageload script.

Differential Revision: https://phabricator.services.mozilla.com/D144168
Gregory Mierzwinski 2022-04-21 18:05:53 +00:00
Parent 3b88452e0b
Commit b9466bda67
8 changed files with 237 additions and 124 deletions

View file

@@ -20,66 +20,25 @@ Running Locally
- A local mozilla repository clone with a `successful Firefox build <https://developer.mozilla.org/en-US/docs/Mozilla/Developer_guide/Build_Instructions>`_ completed
Setup
-----
Running on Firefox Desktop
--------------------------
Note that if you are running Raptor-Browsertime, it will be installed automatically and will also keep itself up to date.
Vanilla Browsertime tests
-------------------------
- ``./mach browsertime --clobber --setup --install-vismet-reqs``
If you want to run highly customized tests, you can make use of our customizable ``browsertime`` test.
This will automatically check your setup, which will output something like this:
With this test, you can customize the page to test, the test script to use, and anything else required. It will use the default settings that Raptor uses in browsertime, but these can be overridden with ``--browsertime-arg`` settings.
For example, here's a test on ``https://www.sitespeed.io`` using this custom test:
::
ffmpeg: OK
convert: OK
compare: OK
Pillow: OK
SSIM: OK
./mach raptor --browsertime -t browsertime --browsertime-arg test_script=pageload --browsertime-arg browsertime.url=https://www.sitespeed.io --browsertime-arg iterations=3
- To manually check your setup, run ``./mach browsertime --check``
That test will perform 3 iterations of the given URL. Note also that simplified names can be used for test scripts that are built into raptor: you can use ``pageload``, ``interactive``, or provide a path to another test script.
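The ``key=value`` overrides above can be sketched in isolation. This is an illustrative standalone example, not Raptor's actual implementation; the function and setting names are hypothetical:

```python
# Hypothetical sketch: overlay "key=value" strings from repeated
# --browsertime-arg options onto a dict of default settings.
def merge_overrides(defaults, user_args):
    settings = dict(defaults)
    for option in user_args:
        # Split only on the first "=" so URL values containing "=" survive.
        key, value = option.split("=", 1)
        settings[key] = value
    return settings

defaults = {"iterations": "1", "browsertime_video": "true"}
overrides = ["iterations=3", "browsertime.url=https://www.sitespeed.io/?a=b"]
print(merge_overrides(defaults, overrides))
```

Every value stays a string here; converting types (e.g. ``iterations`` to an integer) is left to the consumer, which matches how command-line strings are typically normalized later.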
Known Issues
^^^^^^^^^^^^
**If you aren't running visual metrics, then failures in** ``Pillow`` **and** ``SSIM`` **can be ignored.**
`Bug 1735410: [meta] browsertime visual metrics dependencies not installing correctly <https://bugzilla.mozilla.org/show_bug.cgi?id=1735410>`_
Currently there are issues on all platforms with installing browsertime vismet dependencies. There is a fix for Linux (`Bug 1746208 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746208>`__), but not for Windows (`Bug 1746206 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746206>`__) or OSX (`Bug 1746207 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746207>`__).
Linux
"""""
`Bug 1746208 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746208>`__ **(resolved)**
If ``ffmpeg`` is listed as FAIL, try `downloading ffmpeg manually <https://ffmpeg.org/>`_ and adding it to your PATH
OSX
"""
`Bug 1746207 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746207>`__ **(resolved)**
**Current Status**: ``convert`` and ``compare`` fail to install. Rebuilding Firefox and re-running the browsertime setup has not been shown to resolve this issue.
Windows
"""""""
`Bug 1746206 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746206>`__ **(unresolved)**
If the ImageMagick URL returns a 404 during setup, please `file a bug like this <https://bugzilla.mozilla.org/show_bug.cgi?id=1735540>`_ to have the URL updated.
**Current Status**: ``convert``, ``compare``, and ``ffmpeg`` fail to install. Neither adding ``ffmpeg`` to the PATH nor rebuilding Firefox has been shown to resolve this issue.
- For other issues, try deleting the ``~/.mozbuild/browsertime`` folder and re-running the browsertime setup command.
- If you plan on running Browsertime on Android, your Android device must already be set up (see more below in the :ref:`Running on Android` section)
- **If you encounter any issues not mentioned here, please** `file a bug <https://bugzilla.mozilla.org/enter_bug.cgi?product=Testing&component=Raptor>`_ **in the** ``Testing::Raptor`` **component.**
Running on Firefox Desktop
--------------------------
This custom test is only available locally.
Page-load tests
---------------
@@ -143,13 +102,13 @@ Chrome releases are tied to a specific version of ChromeDriver -- you will need
There are two ways of doing this:
1. Download the ChromeDriver that matches the chrome you wish to run from https://chromedriver.chromium.org/ and specify the path:
* Download the ChromeDriver that matches the chrome you wish to run from https://chromedriver.chromium.org/ and specify the path:
::
./mach browsertime https://www.sitespeed.io -b chrome --chrome.chromedriverPath <PATH/TO/VERSIONED/CHROMEDRIVER>
2. Upgrade the ChromeDriver version in ``tools/browsertime/package-lock.json`` (see https://www.npmjs.com/package/@sitespeed.io/chromedriver for versions).
* Upgrade the ChromeDriver version in ``tools/browsertime/package-lock.json`` (see https://www.npmjs.com/package/@sitespeed.io/chromedriver for versions).
Run ``npm install``.
@@ -175,14 +134,13 @@ Passing Additional Arguments to Browsertime
Browsertime has many command line flags to configure its usage, see `Browsertime configuration <https://www.sitespeed.io/documentation/browsertime/configuration/>`_.
We do not currently support passing additional arguments; this work can be tracked in `Bug 1750976 <https://bugzilla.mozilla.org/show_bug.cgi?id=1750976>`_.
There are multiple ways of adding additional arguments to Browsertime from Raptor. The primary method is to use ``--browsertime-arg``. For example: ``./mach raptor --browsertime -t amazon --browsertime-arg iterations=10``
There are two options to work around this and enable additional arguments.
Other methods for adding additional arguments are:
1. Define additional arguments in `testing/raptor/raptor/browsertime/base.py <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/browsertime/base.py#220-252>`_.
2. Add a ``browsertime_args`` entry to the appropriate manifest with the desired arguments, e.g. `browsertime-tp6.ini <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/tests/tp6/desktop/browsertime-tp6.ini>`_ for desktop page load tests. `Example of browsertime_args format <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/tests/custom/browsertime-process-switch.ini#27>`_.
* Define additional arguments in `testing/raptor/raptor/browsertime/base.py <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/browsertime/base.py#220-252>`_.
* Add a ``browsertime_args`` entry to the appropriate manifest with the desired arguments, e.g. `browsertime-tp6.ini <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/tests/tp6/desktop/browsertime-tp6.ini>`_ for desktop page load tests. `Example of browsertime_args format <https://searchfox.org/mozilla-central/source/testing/raptor/raptor/tests/custom/browsertime-process-switch.ini#27>`_.
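A ``browsertime_args`` manifest entry is, per the linked example, roughly a whitespace-separated list of flags. A sketch of turning such a string into argv-style tokens, assuming that format (the helper name is illustrative, not Raptor's API):

```python
import shlex

# Assumed manifest format: whitespace-separated browsertime flags, as in
# the browsertime-process-switch.ini example linked above.
def manifest_args_to_argv(value):
    """Split a manifest browsertime_args string into argv-style tokens."""
    return shlex.split(value)

entry = "--pageCompleteWaitTime=1000 --pageCompleteCheckInactivity=true"
print(manifest_args_to_argv(entry))
```

Using ``shlex.split`` rather than ``str.split`` keeps any quoted values with embedded spaces intact.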
Running Browsertime on Try
--------------------------
@@ -244,3 +202,60 @@ Comparing Before/After Browsertime Videos
We have some scripts that can produce side-by-side comparison videos for you of the worst pairing of videos. You can find the script here: https://github.com/mozilla/mozperftest-tools#browsertime-side-by-side-video-comparisons
Once the side-by-side comparison is produced, the video on the left is the old/base video, and the video on the right is the new video.
Mach Browsertime Setup
----------------------
Note that if you are running Raptor-Browsertime, it will be installed automatically and will also keep itself up to date.
- ``./mach browsertime --clobber --setup --install-vismet-reqs``
This will automatically check your setup, which will output something like this:
::
ffmpeg: OK
convert: OK
compare: OK
Pillow: OK
SSIM: OK
- To manually check your setup, run ``./mach browsertime --check``
Known Issues
^^^^^^^^^^^^
**If you aren't running visual metrics, then failures in** ``Pillow`` **and** ``SSIM`` **can be ignored.**
`Bug 1735410: [meta] browsertime visual metrics dependencies not installing correctly <https://bugzilla.mozilla.org/show_bug.cgi?id=1735410>`_
Currently there are issues on all platforms with installing browsertime vismet dependencies. There is a fix for Linux (`Bug 1746208 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746208>`__), but not for Windows (`Bug 1746206 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746206>`__) or OSX (`Bug 1746207 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746207>`__).
Linux
"""""
`Bug 1746208 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746208>`__ **(resolved)**
If ``ffmpeg`` is listed as FAIL, try `downloading ffmpeg manually <https://ffmpeg.org/>`_ and adding it to your PATH
OSX
"""
`Bug 1746207 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746207>`__ **(resolved)**
**Current Status**: ``convert`` and ``compare`` fail to install. Rebuilding Firefox and re-running the browsertime setup has not been shown to resolve this issue.
Windows
"""""""
`Bug 1746206 <https://bugzilla.mozilla.org/show_bug.cgi?id=1746206>`__ **(unresolved)**
If the ImageMagick URL returns a 404 during setup, please `file a bug like this <https://bugzilla.mozilla.org/show_bug.cgi?id=1735540>`_ to have the URL updated.
**Current Status**: ``convert``, ``compare``, and ``ffmpeg`` fail to install. Neither adding ``ffmpeg`` to the PATH nor rebuilding Firefox has been shown to resolve this issue.
- For other issues, try deleting the ``~/.mozbuild/browsertime`` folder and re-running the browsertime setup command.
- If you plan on running Browsertime on Android, your Android device must already be set up (see more below in the :ref:`Running on Android` section)
- **If you encounter any issues not mentioned here, please** `file a bug <https://bugzilla.mozilla.org/enter_bug.cgi?product=Testing&component=Raptor>`_ **in the** ``Testing::Raptor`` **component.**

View file

@@ -7319,6 +7319,31 @@ Custom
------
Browsertime tests that use a custom pageload test script. These use the pageload type, but may have other intentions.
.. dropdown:: browsertime
:container: + anchor-id-browsertime-c
**Owner**: PerfTest Team
* **alert on**: fcp, loadtime
* **alert threshold**: 2.0
* **apps**: firefox
* **browser cycles**: 1
* **expected**: pass
* **gecko profile entries**: 14000000
* **gecko profile interval**: 1
* **lower is better**: true
* **measure**: fnbpaint, fcp, dcf, loadtime
* **page cycles**: 1
* **page timeout**: 60000
* **playback**: mitmproxy
* **playback pageset manifest**: null.manifest
* **playback version**: 5.1.1
* **test script**: None
* **test url**: `<None>`__
* **type**: pageload
* **unit**: ms
* **use live sites**: true
.. dropdown:: process-switch
:container: + anchor-id-process-switch-c

View file

@@ -207,7 +207,32 @@ class Browsertime(Perftest):
)
else:
# Custom scripts are treated as pageload tests for now
if test.get("interactive", False):
if test.get("name", "") == "browsertime":
# Check for either a script or a url from the
# --browsertime-arg options
browsertime_script = None
for option in self.browsertime_user_args:
arg, val = option.split("=", 1)  # split once: URL values may contain "="
if arg in ("test_script", "url"):
browsertime_script = val
if browsertime_script is None:
raise Exception(
"You must provide a path to the test script or the url like so: "
"`--browsertime-arg test_script=PATH/TO/SCRIPT`, or "
"`--browsertime-arg url=https://www.sitespeed.io`"
)
# Make it simple to use our builtin test scripts
if browsertime_script == "pageload":
browsertime_script = os.path.join(
browsertime_path, "browsertime_pageload.js"
)
elif browsertime_script == "interactive":
browsertime_script = os.path.join(
browsertime_path, "browsertime_interactive.js"
)
elif test.get("interactive", False):
browsertime_script = os.path.join(
browsertime_path,
"browsertime_interactive.js",
@@ -257,6 +282,8 @@ class Browsertime(Perftest):
# Raptor's `post startup delay` is settle time after the browser has started
"--browsertime.post_startup_delay",
str(self.post_startup_delay),
"--iterations",
str(test.get("browser_cycles", 1)),
]
if test.get("secondary_url"):
@@ -428,8 +455,6 @@ class Browsertime(Perftest):
+ self.driver_paths
+ [browsertime_script]
+ browsertime_options
# -n option for the browsertime to restart the browser
+ ["-n", str(test.get("browser_cycles", 1))]
)
def _compute_process_timeout(self, test, timeout):


View file

@@ -150,6 +150,7 @@ suites:
custom:
description: "Browsertime tests that use a custom pageload test script. These use the pageload type, but may have other intentions."
tests:
browsertime: "Used to run vanilla browsertime tests through raptor. For example: `./mach raptor --browsertime -t browsertime --browsertime-arg url=https://www.sitespeed.io --browsertime-arg iterations=3`"
process-switch: "Measures process switch time"
welcome: "Measures pageload metrics for the first-install about:welcome page"
interactive:

View file

@@ -51,3 +51,6 @@
# Interactive raptor-browsertime tests
[include:tests/interactive/browsertime-responsiveness.ini]
# Local custom browsertime tests
[include:tests/custom/browsertime-custom.ini]

View file

@@ -0,0 +1,29 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
[DEFAULT]
alert_on = fcp, loadtime
alert_threshold = 2.0
apps = firefox
browser_cycles = 1
gecko_profile_entries = 14000000
gecko_profile_interval = 1
lower_is_better = true
measure = fnbpaint, fcp, dcf, loadtime
owner = PerfTest Team
page_cycles = 1
page_timeout = 60000
playback = mitmproxy
playback_version = 5.1.1
type = pageload
unit = ms
use_live_sites = true
# Use this to run a custom browsertime test locally (with a custom url).
# Essentially, this is "vanilla" browsertime in a raptor wrapper.
[browsertime]
playback_pageset_manifest = null.manifest
test_script = None
test_url = None
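Note that INI values are plain strings, so a consumer reading this manifest sees the literal string ``"None"`` rather than Python's ``None``. A sketch of that gotcha using the standard library's ``configparser`` (how Raptor actually normalizes these sentinels is not shown in this diff):

```python
import configparser

manifest = """
[browsertime]
playback_pageset_manifest = null.manifest
test_script = None
test_url = None
"""

config = configparser.ConfigParser()
config.read_string(manifest)

# Values come back as strings; "None" is a sentinel, not Python's None.
print(repr(config["browsertime"]["test_script"]))
```

Any code consuming such a manifest must explicitly translate the ``"None"`` string into a missing-value case before using it.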

View file

@@ -255,7 +255,7 @@ def test_cmd_arguments(ConcreteBrowsertime, browsertime_options, mock_test):
"--timeouts.script",
str(DEFAULT_TIMEOUT),
"--resultDir",
"-n",
"--iterations",
"1",
}
if browsertime_options.get("app") in ["chrome", "chrome-m"]:
@@ -280,8 +280,8 @@ def extract_arg_value(cmd, arg):
@pytest.mark.parametrize(
"arg_to_test, expected, test_patch, options_patch",
[
["-n", "1", {}, {"browser_cycles": None}],
["-n", "123", {"browser_cycles": 123}, {}],
["--iterations", "1", {}, {"browser_cycles": None}],
["--iterations", "123", {"browser_cycles": 123}, {}],
["--video", "false", {}, {"browsertime_video": None}],
["--video", "true", {}, {"browsertime_video": "dummy_value"}],
["--timeouts.script", str(DEFAULT_TIMEOUT), {}, {"page_cycles": None}],
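The parametrized tests above pull a flag's value out of the assembled command line via an ``extract_arg_value`` helper. One plausible implementation of such a helper, shown here as a sketch (the real in-tree helper may differ):

```python
def extract_arg_value(cmd, arg):
    """Return the token following `arg` in a command list, or None if absent."""
    for i, token in enumerate(cmd[:-1]):  # stop before the last token
        if token == arg:
            return cmd[i + 1]
    return None

cmd = ["browsertime", "--iterations", "3", "--video", "true"]
print(extract_arg_value(cmd, "--iterations"))
```

Iterating over ``cmd[:-1]`` guarantees ``cmd[i + 1]`` is always in range, so a flag that appears as the final token simply yields ``None``.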