Mirror of https://github.com/mozilla/gecko-dev.git
Bug 1868440 - [ci] Upgrade taskcluster-taskgraph to v7.4.0, r=perftest-reviewers,geckoview-reviewers,ci-and-tooling,devtools-reviewers,taskgraph-reviewers,releng-reviewers,mach-reviewers,omc-reviewers,jmaher,hneiva,aminomancer,jari,dom-storage-reviewers,afinder,asuth,bhearsum,owlish
Differential Revision: https://phabricator.services.mozilla.com/D206260
Parent: c7a25587de
Commit: f33979228a
@@ -43,7 +43,7 @@ build-clang.py accepts a JSON config format with the following fields:
 * assertions: Whether to enable LLVM assertions. The default is false.
 * pgo: Whether to build with PGO (requires stages == 4). The default is false.

-The revisions are defined in taskcluster/ci/fetch/toolchains.yml. They are usually commit sha1s corresponding to upstream tags.
+The revisions are defined in taskcluster/kinds/fetch/toolchains.yml. They are usually commit sha1s corresponding to upstream tags.

 Environment Variables
 ---------------------
@@ -68,7 +68,7 @@ must read::
 will contain the version of ``rustc`` used by automation builds, which may
 lag behind stable for a few days after Rust releases, which is specified by
 the task definition in
-`this file <https://hg.mozilla.org/mozilla-central/file/tip/taskcluster/ci/toolchain/dist-toolchains.yml>`_.
+`this file <https://hg.mozilla.org/mozilla-central/file/tip/taskcluster/kinds/toolchain/dist-toolchains.yml>`_.
 For instance, to specify 1.37.0 rather than the current stable, run
 ``rustup toolchain add 1.37.0`` and point to
 ``/path/to/home/.rustup/toolchains/1.37.0-x86_64-apple-darwin/bin/rustc`` in your
@@ -209,7 +209,7 @@ its content and ensure that's what you're looking for.
 (...)

 Once you have found the SDK you want, you can create or update toolchain tasks
-in ``taskcluster/ci/toolchain/macosx-sdk.yml``.
+in ``taskcluster/kinds/toolchain/macosx-sdk.yml``.

 The ``taskcluster/scripts/misc/unpack-sdk.py`` script takes the url of a SDK
 package, the sha512 hash for its content, the path to the SDK in the package,
@@ -36,7 +36,7 @@ glob:testing/perfdocs/generated/**
 # Python API docs.
 glob:**/*.py
 # Referenced by some python files.
-path:taskcluster/ci/docker-image/kind.yml
+path:taskcluster/kinds/docker-image/kind.yml

 # Included in ipc docs
 path:ipc/ipdl/test/ipdl/ok/PMyManaged.ipdl
@@ -19,4 +19,4 @@ The tests run on try on linux64 platforms. The complete name of try job is `devt
 Adding the tests to a try push depends on the try selector you are using.
 - try fuzzy: look for the job named `source-test-node-devtools-tests`

-The configuration file for try can be found at `taskcluster/ci/source-test/node.yml`
+The configuration file for try can be found at `taskcluster/kinds/source-test/node.yml`
@@ -19,4 +19,4 @@ The tests run on try on linux64 platforms. The complete name of try job is `devt
 Adding the tests to a try push depends on the try selector you are using.
 - try fuzzy: look for the job named `source-test-node-devtools-tests`

-The configuration file for try can be found at `taskcluster/ci/source-test/node.yml`
+The configuration file for try can be found at `taskcluster/kinds/source-test/node.yml`
@@ -10,7 +10,7 @@
 * This is a test runner dedicated to run DevTools node tests continuous integration
 * platforms. It will parse the logs to output errors compliant with treeherder tooling.
 *
-* See taskcluster/ci/source-test/node.yml for the definition of the task running those
+* See taskcluster/kinds/source-test/node.yml for the definition of the task running those
 * tests on try.
 */

@@ -19,4 +19,4 @@ The tests run on try on linux64 platforms. The complete name of the try job is `
 Adding the tests to a try push depends on the try selector you are using.
 - try fuzzy: look for the job named `source-test-node-devtools-tests`

-The configuration file for try can be found at `taskcluster/ci/source-test/node.yml`
+The configuration file for try can be found at `taskcluster/kinds/source-test/node.yml`
@@ -15,7 +15,7 @@ They are also run when using the "devtools" preset: `./mach try --preset devtool

 ### Node tests try job definition

-The definition of those try jobs can be found at [taskcluster/ci/source-test/node.yml](https://searchfox.org/mozilla-central/source/taskcluster/ci/source-test/node.yml).
+The definition of those try jobs can be found at [taskcluster/kinds/source-test/node.yml](https://searchfox.org/mozilla-central/source/taskcluster/kinds/source-test/node.yml).

 The definition also contains the list of files that will trigger the node test jobs. Currently the the devtools tests run when any file is modified under `devtools/client` or `devtools/shared`.

@@ -347,7 +347,7 @@ Adding the linter to the CI

 First, the job will have to be declared in Taskcluster.

-This should be done in the `mozlint Taskcluster configuration <https://searchfox.org/mozilla-central/source/taskcluster/ci/source-test/mozlint.yml>`_.
+This should be done in the `mozlint Taskcluster configuration <https://searchfox.org/mozilla-central/source/taskcluster/kinds/source-test/mozlint.yml>`_.
 You will need to define a symbol, how it is executed and on what kind of change.

 For example, for ruff, the configuration is the following:
@@ -5,7 +5,7 @@ CondProf Addons is a linter for condprof customization JSON files (see :searchfo
 it reports linting errors if:

 - any of the addons required by the customization files (e.g. see :searchfox:`testing/condprofile/condprof/customization/webext.json`)
-  is not found in the tar file fetched through the `firefox-addons` fetch task (see :searchfox:`taskcluster/ci/fetch/browsertime.yml`)
+  is not found in the tar file fetched through the `firefox-addons` fetch task (see :searchfox:`taskcluster/kinds/fetch/browsertime.yml`)
 - or the expected `firefox-addons` fetch task has not been found

 Run Locally
@@ -37,7 +37,7 @@ XPI file is missing from the firefox-addons.tar archive

 This linting errors is expected to be reported if the linter detected that a confprof customization file
 requires an addon but the related xpi filename is not included in the firefox-addons.tar file fetched
-through the `firefox-addons` fetch task (see :searchfox:`taskcluster/ci/fetch/browsertime.yml`).
+through the `firefox-addons` fetch task (see :searchfox:`taskcluster/kinds/fetch/browsertime.yml`).

 If the patch or phabricator revision is not meant to be landed, but only used as a temporary patch
 pushed on try or only applied locally (e.g. to run the tp6/tp6m webextensions perftests with a given
@@ -50,14 +50,14 @@ the linting error have to be fixed before or along landing the change, either by
 - removing the addition to the customization file if it wasn't intended to include that addon to all runs
   of the tp6/tp6m webextensions perftests

-- updating the `firefox-addons` fetch task as defined in :searchfox:`taskcluster/ci/fetch/browsertime.yml`
+- updating the `firefox-addons` fetch task as defined in :searchfox:`taskcluster/kinds/fetch/browsertime.yml`
   by creating a pull request in the github repository where the asset is stored, and ask a review from
   a peer of the `#webextensions-reviewer` and `#perftests-reviewers` review groups.

 firefox-addons taskcluster config 'add-prefix' attribute should be set to 'firefox-addons/'
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-If this linting error is hit, then the `firefox-addons` task defined in :searchfox:`taskcluster/ci/fetch/browsertime.yml`
+If this linting error is hit, then the `firefox-addons` task defined in :searchfox:`taskcluster/kinds/fetch/browsertime.yml`
 is missing the `add-prefix` attribute or its value is not set to the expected 'firefox-addons/' subdir name.

 This is enforced as a linting rule because when the condprof utility is going to build a conditioned profile
@@ -69,7 +69,7 @@ names are already available in `$MOZ_FETCHES_DIR/firefox-addons`.
 firefox-addons taskcluser fetch config section not found
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-This linting error is hit if the linter does not find the expected `firefox-addons` task defined in :searchfox:`taskcluster/ci/fetch/browsertime.yml`
+This linting error is hit if the linter does not find the expected `firefox-addons` task defined in :searchfox:`taskcluster/kinds/fetch/browsertime.yml`
 or it is missing the expected `fetch` attribute.

 Configuration
@@ -96,7 +96,7 @@ Regression Testing

 In order to prevent regressions in our clang-tidy based static analysis,
 we have created a
-:searchfox:`task <taskcluster/ci/static-analysis-autotest/kind.yml>`
+:searchfox:`task <taskcluster/kinds/static-analysis-autotest/kind.yml>`
 on automation. This task runs on each commit and launches a test suite
 that is integrated into mach.

@@ -127,7 +127,7 @@ Example: Re-Signing Official Nightly
 0:00.20 Using ad-hoc signing identity
 0:00.20 Using nightly channel signing configuration
 0:00.20 Using developer entitlements
-0:00.20 Reading build config file /Users/me/r/mc/taskcluster/ci/config.yml
+0:00.20 Reading build config file /Users/me/r/mc/taskcluster/config.yml
 0:00.23 Stripping existing xattrs and signatures
 0:01.91 Signing with codesign
 0:02.72 Verification of signed app /Users/me/Desktop/FirefoxNightly.app OK
@@ -149,7 +149,7 @@ can be exported from Keychain Access in .p12 format.
 0:00.26 Using pkcs12 signing identity
 0:00.26 Using devedition channel signing configuration
 0:00.26 Using developer entitlements
-0:00.26 Reading build config file /Users/me/r/mc/taskcluster/ci/config.yml
+0:00.26 Reading build config file /Users/me/r/mc/taskcluster/config.yml
 0:00.29 Stripping existing xattrs and signatures
 0:02.09 Signing with rcodesign
 0:11.16 Verification of signed app /Users/me/Desktop/DevEdition.app OK
@@ -27,8 +27,8 @@ Add files to `perftest.toml
 <https://searchfox.org/mozilla-central/source/dom/serviceworkers/test/performance/perftest.toml>`_
 as usual for mochitests.

-Modify linux.yml, macosx.yml, and windows.yml under `taskcluster/ci/perftest
-<https://searchfox.org/mozilla-central/source/taskcluster/ci/perftest>`_.
+Modify linux.yml, macosx.yml, and windows.yml under `taskcluster/kinds/perftest
+<https://searchfox.org/mozilla-central/source/taskcluster/kinds/perftest>`_.
 Currently, each test needs to be added individually to the run command (`here
 <https://searchfox.org/mozilla-central/rev/91cc8848427fdbbeb324e6ca56a0d08d32d3c308/taskcluster/ci/perftest/linux.yml#121-149>`_,
 for example). kind.yml can be ignored–it provides some defaults.
@@ -62,7 +62,7 @@ and `autoland
 <https://treeherder.mozilla.org/jobs?repo=autoland&searchStr=perftest>`_. Look
 for linux-sw, macosx-sw, and win-sw (`example
 <https://treeherder.mozilla.org/perfherder/graphs?series=mozilla-central,4967140,1,15&selected=4967140,1814245176>`_).
-These symbol names are defined in the .yml files under taskcluster/ci/perftest.
+These symbol names are defined in the .yml files under taskcluster/kinds/perftest.

 Contacts
 ========
@@ -51,7 +51,7 @@ How to add more tests?
 * Under the ``[test_name]`` section, specity the test parameters as a sequence of ``--browsertime.key=value`` arguments as a value of ``browsertime_args =``
 * Under the ``[test_name]`` section, override any other values as needed

-* Add test as a subtest to run for Desktop ``taskcluster/ci/test/browsertime-desktop.yml`` (maybe also for mobile)
+* Add test as a subtest to run for Desktop ``taskcluster/kinds/test/browsertime-desktop.yml`` (maybe also for mobile)
 * Add test documentation to ``testing/raptor/raptor/perfdocs/config.yml``

 * Generated files:
@@ -11,4 +11,4 @@ https://hg.mozilla.org/l10n/gecko-strings-quarantine/.

 The code is in https://hg.mozilla.org/mozilla-central/file/tip/python/l10n/mozxchannel/,
 supported as a mach subcommand in https://hg.mozilla.org/mozilla-central/file/tip/tools/compare-locales/mach_commands.py,
-as a taskcluster kind in https://hg.mozilla.org/mozilla-central/file/tip/taskcluster/ci/l10n-cross-channel, and scheduled in cron in https://hg.mozilla.org/mozilla-central/file/tip/.cron.yml.
+as a taskcluster kind in https://hg.mozilla.org/mozilla-central/file/tip/taskcluster/kinds/l10n-cross-channel, and scheduled in cron in https://hg.mozilla.org/mozilla-central/file/tip/.cron.yml.
@@ -18,7 +18,7 @@ By default the Fenix CI runs tests using virtual devices on `x86`.
 That's faster when the host is also a `x86(_64)` system, but most physical devices use the Arm platform.
 So first we need to instruct it to run tests on Arm.

-Which platform to test on is defined in [`taskcluster/ci/ui-test/kind.yml`](https://github.com/mozilla-mobile/fenix/blob/58e12b18e6e9f4f67c059fe9c9bf9f02579a55db/taskcluster/ci/ui-test/kind.yml#L65).
+Which platform to test on is defined in [`taskcluster/kinds/ui-test/kind.yml`](https://searchfox.org/mozilla-central/source/taskcluster/kinds/ui-test/kind.yml).
 Find the line where it downloads the `target.apk` produced in a previous step and change it from `x86` to `arm64-v8a`:

 ```patch
@@ -182,7 +182,7 @@ class CommonBackend(BuildBackend):
 # the order is not consistent across multiple runs.
 #
 # Exclude this file in order to avoid breaking the
-# taskcluster/ci/diffoscope/reproducible.yml jobs.
+# taskcluster/kinds/diffoscope/reproducible.yml jobs.
 continue
 fullpath = ObjDirPath(obj._context, "!" + f).full_path
 self._handle_generated_sources([fullpath])
@@ -14,9 +14,7 @@ def toolchain_task_definitions():
 # Don't import globally to allow this module being imported without
 # the taskgraph module being available (e.g. standalone js)
 params = {"level": os.environ.get("MOZ_SCM_LEVEL", "3")}
-root_dir = os.path.join(
-    os.path.dirname(__file__), "..", "..", "..", "taskcluster", "ci"
-)
+root_dir = os.path.join(os.path.dirname(__file__), "..", "..", "..", "taskcluster")
 toolchains = load_tasks_for_kind(params, "toolchain", root_dir=root_dir)
 aliased = {}
 for t in toolchains.values():
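The hunk above is the pattern most in-tree consumers follow after this upgrade: kind definitions now live directly under `taskcluster/` rather than `taskcluster/ci/`, so anything that evaluates a single kind just passes the shorter root directory. A minimal standalone sketch under stated assumptions — the upstream `taskgraph.generator.load_tasks_for_kind` helper (the module in this hunk imports its equivalent lazily) and a placeholder checkout path:

```python
import os

# Upstream Taskgraph helper for evaluating one kind outside a decision task.
# Assumption: taskcluster-taskgraph >= 7.4.0 is installed, as this commit requires.
from taskgraph.generator import load_tasks_for_kind

# Same parameter default as the hunk above: SCM level falls back to "3" locally.
params = {"level": os.environ.get("MOZ_SCM_LEVEL", "3")}

# The graph root is now the taskcluster/ directory itself (formerly taskcluster/ci/).
# "topsrcdir" is a placeholder for the mozilla-central checkout path.
root_dir = os.path.join("topsrcdir", "taskcluster")

# Returns a mapping of task names to Task objects for the requested kind.
toolchains = load_tasks_for_kind(params, "toolchain", root_dir=root_dir)
print(sorted(toolchains))
```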
@@ -125,7 +125,7 @@ def run_perftest(command_context, **kwargs):
 for plat in platform:
     if plat not in _TRY_PLATFORMS:
         # we can extend platform support here: linux, win, macOs
-        # by adding more jobs in taskcluster/ci/perftest/kind.yml
+        # by adding more jobs in taskcluster/kinds/perftest/kind.yml
         # then picking up the right one here
         raise NotImplementedError(
             "%r doesn't exist or is not yet supported" % plat
@@ -2,12 +2,12 @@
 # License, v. 2.0. If a copy of the MPL was not distributed with this
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.

-from taskgraph.target_tasks import _target_task
+from taskgraph.target_tasks import register_target_task

 from android_taskgraph.release_type import does_task_match_release_type


-@_target_task("promote_android")
+@register_target_task("promote_android")
 def target_tasks_promote(full_task_graph, parameters, graph_config):
     return _filter_release_promotion(
         full_task_graph,
@@ -17,7 +17,7 @@ def target_tasks_promote(full_task_graph, parameters, graph_config):
     )


-@_target_task("push_android")
+@register_target_task("push_android")
 def target_tasks_push(full_task_graph, parameters, graph_config):
     filtered_for_candidates = target_tasks_promote(
         full_task_graph,
@@ -29,7 +29,7 @@ def target_tasks_push(full_task_graph, parameters, graph_config):
     )


-@_target_task("ship_android")
+@register_target_task("ship_android")
 def target_tasks_ship(full_task_graph, parameters, graph_config):
     filtered_for_candidates = target_tasks_push(
         full_task_graph,
@@ -71,7 +71,7 @@ def _filter_release_promotion(
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("screenshots")
+@register_target_task("screenshots")
 def target_tasks_screnshots(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to generate screenshots on a real device."""

@@ -81,7 +81,7 @@ def target_tasks_screnshots(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("legacy_api_ui_tests")
+@register_target_task("legacy_api_ui_tests")
 def target_tasks_legacy_api_ui_tests(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to run select UI tests on other API."""

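The decorator rename in the hunks above (`_target_task` becoming `register_target_task`) is the Taskgraph 7.x registration API that the rest of this commit applies across `gecko_taskgraph` as well. As orientation, a hedged sketch of registering a target-task filter under the new name; the `"nightly_example"` name and the filter body are illustrative only, not part of this commit:

```python
from taskgraph.target_tasks import register_target_task


# The registered name is what the `target_tasks_method` parameter selects; the
# callable receives the full task graph, the decision parameters and the graph
# config, and returns the labels of the tasks to target.
@register_target_task("nightly_example")
def target_tasks_nightly_example(full_task_graph, parameters, graph_config):
    return [
        label
        for label, task in full_task_graph.tasks.items()
        if task.attributes.get("nightly")
    ]
```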
@@ -27,7 +27,7 @@ mv depot_tools.git depot_tools

 # Generating a new version of the preloaded depot_tools download can be done by:
 # 1) Running the task, uncommenting the variable assignment below, uncommenting the
-#    _GENERATE_DEPOT_TOOLS_BINARIES_ section in taskcluster/ci/updatebot/kind.yml,
+#    _GENERATE_DEPOT_TOOLS_BINARIES_ section in taskcluster/kinds/updatebot/kind.yml,
 #    and ensuring that an angle update will actually take place (so it downloads the depot_tools)
 # 2) Downloading and sanity-checking the depot_tools-preloaded-binaries-GIT_HASH-DATE.zip artifact
 # 3) Adding it to tooltool
@@ -121,7 +121,7 @@ if test -n "$GENERATE_DEPOT_TOOLS_BINARIES"; then

   # Convoluted way to get the git hash, because we don't have a .git directory
   # Adding extra print statements just in case we need to debug it
-  GIT_HASH=$(grep depot_tools -A 1 "$GECKO_PATH/taskcluster/ci/fetch/updatebot.yml" | tee /dev/tty | grep revision | tee /dev/tty | awk -F': *' '{print $2}' | tee /dev/tty)
+  GIT_HASH=$(grep depot_tools -A 1 "$GECKO_PATH/taskcluster/kinds/fetch/updatebot.yml" | tee /dev/tty | grep revision | tee /dev/tty | awk -F': *' '{print $2}' | tee /dev/tty)
   DATE=$(date -I)
   mv depot_tools-preloaded-binaries.zip "depot_tools-preloaded-binaries-$GIT_HASH-$DATE.zip"

@@ -35,7 +35,7 @@ Changing Test Characteristics
 .............................

 First, find the test description. This will be in
-``taskcluster/ci/*/tests.yml``, for the appropriate kind (consult
+``taskcluster/kinds/*/tests.yml``, for the appropriate kind (consult
 :ref:`kinds`). You will find a YAML stanza for each test suite, and each
 stanza defines the test's characteristics. For example, the ``chunks``
 property gives the number of chunks to run. This can be specified as a simple
@@ -65,7 +65,7 @@ Adding a Test Suite
 To add a new test suite, you will need to know the proper mozharness invocation
 for that suite, and which kind it fits into (consult :ref:`kinds`).

-Add a new stanza to ``taskcluster/ci/<kind>/tests.yml``, copying from the other
+Add a new stanza to ``taskcluster/kinds/<kind>/tests.yml``, copying from the other
 stanzas in that file. The meanings should be clear, but authoritative
 documentation is in
 ``taskcluster/gecko_taskgraph/transforms/test/__init__.py`` should you need
@@ -684,7 +684,7 @@ diffoscope
 ----------
 Tasks used to compare pairs of Firefox builds using https://diffoscope.org/.
 As of writing, this is mainly meant to be used in try builds, by editing
-taskcluster/ci/diffoscope/kind.yml for your needs.
+taskcluster/kinds/diffoscope/kind.yml for your needs.

 addon
 -----
@@ -120,7 +120,7 @@ manifest conditions pending the triage owner / manager to review.
 Please subscribe to alerts from `firefox-ci <https://groups.google.com/a/mozilla.com/g/firefox-ci>`
 group in order to be aware of changes to the CI, scheduling, or the policy.

-.. _variants.yml: https://searchfox.org/mozilla-central/source/taskcluster/ci/test/variants.yml
+.. _variants.yml: https://searchfox.org/mozilla-central/source/taskcluster/kinds/test/variants.yml
 .. _json-e: https://json-e.js.org/

@@ -27,7 +27,7 @@ partner repacks.
 logic. It's usually looked up during the release promotion action task, using the Github
 GraphQL API in the `get_partner_config_by_url()
 <python/taskgraph.util.html#taskgraph.util.partners.get_partner_config_by_url>`_ function, with the
-url defined in `taskcluster/ci/config.yml <https://searchfox.org/mozilla-central/search?q=partner-urls&path=taskcluster%2Fci%2Fconfig.yml&case=true&regexp=false&redirect=true>`_.
+url defined in `taskcluster/config.yml <https://searchfox.org/mozilla-central/search?q=partner-urls&path=taskcluster%2Fconfig.yml&case=true&regexp=false&redirect=true>`_.

 ``release_partner_build_number`` is an integer used to create unique upload paths in the firefox
 candidates directory, while ``release_partners`` is a list of partners that should be
@@ -40,7 +40,7 @@ release. They're both true for Firefox betas >= b8 and releases, but otherwise d
 logic. It's usually looked up during the release promotion action task, using the Github
 GraphQL API in the `get_partner_config_by_url()
 <python/taskgraph.util.html#taskgraph.util.partners.get_partner_config_by_url>`_ function, with the
-url defined in `taskcluster/ci/config.yml <https://searchfox
+url defined in `taskcluster/config.yml <https://searchfox
 .org/mozilla-release/search?q=regexp%3A^partner+path%3Aconfig.yml&redirect=true>`_.

 ``release_partner_build_number`` is an integer used to create unique upload paths in the firefox
@@ -32,7 +32,7 @@ Kinds
 Generation starts with "kinds". These are yaml files that denote groupings of
 tasks that are loosely related to one another. For example, in Gecko there are
 ``build`` and ``test`` kinds. Each kind has its own directory under
-`taskcluster/ci`_ which contains a ``kind.yml`` file.
+`taskcluster/kinds`_ which contains a ``kind.yml`` file.

 For more information on kinds, see Taskgraph's `kind documentation`_. For a
 list of available kinds in ``mozilla-central``, see the :doc:`kinds reference
@@ -114,7 +114,7 @@ Graph Configuration

 There are several configuration settings that are pertain to the entire
 taskgraph. These are specified in :file:`config.yml` at the root of the
-taskgraph configuration (typically :file:`taskcluster/ci/`). The available
+taskgraph configuration (typically :file:`taskcluster`). The available
 settings are documented inline in `taskcluster/gecko_taskgraph/config.py
 <https://searchfox.org/mozilla-central/source/taskcluster/gecko_taskgraph/config.py>`_.

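Because the configuration root moves from `taskcluster/ci/` to `taskcluster/` here, anything that reads `config.yml` simply points at the shorter directory, as the test-fixture hunks later in this diff do. A hedged sketch using Taskgraph's `load_graph_config` helper; the import path and the `GECKO_PATH` fallback are assumptions for illustration:

```python
import os

# Parses <root>/config.yml into a dict-like GraphConfig object.
from taskgraph.config import load_graph_config

topsrcdir = os.environ.get("GECKO_PATH", ".")  # placeholder checkout root
graph_config = load_graph_config(os.path.join(topsrcdir, "taskcluster"))

# Settings such as the trust domain or treeherder group names live here.
print(graph_config["trust-domain"])
```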
@@ -131,7 +131,7 @@ For more information, see Taskgraph's `actions documentation`_.

 .. _graph generation documentation: https://taskcluster-taskgraph.readthedocs.io/en/latest/concepts/task-graphs.html
 .. _this guide: https://taskcluster-taskgraph.readthedocs.io/en/latest/howto/run-locally.html
-.. _taskcluster/ci: https://searchfox.org/mozilla-central/source/taskcluster/ci
+.. _taskcluster/kinds: https://searchfox.org/mozilla-central/source/taskcluster/kinds
 .. _kind documentation: https://taskcluster-taskgraph.readthedocs.io/en/latest/concepts/kind.html
 .. _loader documentation: https://taskcluster-taskgraph.readthedocs.io/en/latest/concepts/loading.html
 .. _transforms documentation: https://taskcluster-taskgraph.readthedocs.io/en/latest/concepts/transforms.html
@@ -35,7 +35,7 @@ The locations are as follows:
 Debian Packages for Debian and Ubuntu Based Docker Images
 ---------------------------------------------------------

-``taskcluster/ci/packages/debian.yml`` and ``taskcluster/ci/packages/ubuntu.yml``
+``taskcluster/kinds/packages/debian.yml`` and ``taskcluster/kinds/packages/ubuntu.yml``
 define custom Debian packages for Mercurial. These are installed in various
 Docker images.

@@ -386,7 +386,7 @@ def show_taskgraph(options):
     output_file = options["output_file"]

     if options["diff"]:
-        # --root argument is taskgraph's config at <repo>/taskcluster/ci
+        # --root argument is taskgraph's config at <repo>/taskcluster
         repo_root = os.getcwd()
         if options["root"]:
             repo_root = f"{options['root']}/../.."
@@ -707,7 +707,7 @@ def decision(options):
 @argument(
     "--root",
     "-r",
-    default="taskcluster/ci",
+    default="taskcluster",
     help="root of the taskgraph definition relative to topsrcdir",
 )
 def action_callback(options):
@@ -743,7 +743,7 @@ def action_callback(options):
 @argument(
     "--root",
     "-r",
-    default="taskcluster/ci",
+    default="taskcluster",
     help="root of the taskgraph definition relative to topsrcdir",
 )
 @argument(
@@ -11,7 +11,7 @@ from datetime import datetime, timedelta

 from redo import retry
 from taskgraph.parameters import Parameters
-from taskgraph.target_tasks import _target_task, get_method
+from taskgraph.target_tasks import get_method, register_target_task
 from taskgraph.util.taskcluster import find_task_id

 from gecko_taskgraph import GECKO, try_option_syntax
@@ -396,7 +396,7 @@ def _try_option_syntax(full_task_graph, parameters, graph_config):
     return target_tasks_labels


-@_target_task("try_tasks")
+@register_target_task("try_tasks")
 def target_tasks_try(full_task_graph, parameters, graph_config):
     try_mode = parameters["try_mode"]
     if try_mode == "try_task_config":
@@ -408,13 +408,13 @@ def target_tasks_try(full_task_graph, parameters, graph_config):
     return []


-@_target_task("try_select_tasks")
+@register_target_task("try_select_tasks")
 def target_tasks_try_select(full_task_graph, parameters, graph_config):
     tasks = target_tasks_try_select_uncommon(full_task_graph, parameters, graph_config)
     return [l for l in tasks if filter_by_uncommon_try_tasks(l)]


-@_target_task("try_select_tasks_uncommon")
+@register_target_task("try_select_tasks_uncommon")
 def target_tasks_try_select_uncommon(full_task_graph, parameters, graph_config):
     from gecko_taskgraph.decision import PER_PROJECT_PARAMETERS

@@ -440,7 +440,7 @@ def target_tasks_try_select_uncommon(full_task_graph, parameters, graph_config):
     return sorted(tasks)


-@_target_task("try_auto")
+@register_target_task("try_auto")
 def target_tasks_try_auto(full_task_graph, parameters, graph_config):
     """Target the tasks which have indicated they should be run on autoland
     (rather than try) via the `run_on_projects` attributes.
@@ -471,7 +471,7 @@ def target_tasks_try_auto(full_task_graph, parameters, graph_config):
     ]


-@_target_task("default")
+@register_target_task("default")
 def target_tasks_default(full_task_graph, parameters, graph_config):
     """Target the tasks which have indicated they should be run on this project
     via the `run_on_projects` attributes."""
@@ -484,7 +484,7 @@ def target_tasks_default(full_task_graph, parameters, graph_config):
     ]


-@_target_task("autoland_tasks")
+@register_target_task("autoland_tasks")
 def target_tasks_autoland(full_task_graph, parameters, graph_config):
     """In addition to doing the filtering by project that the 'default'
     filter does, also remove any tests running against shippable builds
@@ -510,7 +510,7 @@ def target_tasks_autoland(full_task_graph, parameters, graph_config):
     return [l for l in filtered_for_project if filter(full_task_graph[l])]


-@_target_task("mozilla_central_tasks")
+@register_target_task("mozilla_central_tasks")
 def target_tasks_mozilla_central(full_task_graph, parameters, graph_config):
     """In addition to doing the filtering by project that the 'default'
     filter does, also remove any tests running against regular (aka not shippable,
@@ -550,7 +550,7 @@ def target_tasks_mozilla_central(full_task_graph, parameters, graph_config):
     return [l for l in filtered_for_project if filter(full_task_graph[l])]


-@_target_task("graphics_tasks")
+@register_target_task("graphics_tasks")
 def target_tasks_graphics(full_task_graph, parameters, graph_config):
     """In addition to doing the filtering by project that the 'default'
     filter does, also remove artifact builds because we have csets on
@@ -568,7 +568,7 @@ def target_tasks_graphics(full_task_graph, parameters, graph_config):
     return [l for l in filtered_for_project if filter(full_task_graph[l])]


-@_target_task("mozilla_beta_tasks")
+@register_target_task("mozilla_beta_tasks")
 def target_tasks_mozilla_beta(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a promotable beta or release build
     of desktop, plus android CI. The candidates build process involves a pipeline
@@ -581,7 +581,7 @@ def target_tasks_mozilla_beta(full_task_graph, parameters, graph_config):
     ]


-@_target_task("mozilla_release_tasks")
+@register_target_task("mozilla_release_tasks")
 def target_tasks_mozilla_release(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a promotable beta or release build
     of desktop, plus android CI. The candidates build process involves a pipeline
@@ -594,7 +594,7 @@ def target_tasks_mozilla_release(full_task_graph, parameters, graph_config):
     ]


-@_target_task("mozilla_esr115_tasks")
+@register_target_task("mozilla_esr115_tasks")
 def target_tasks_mozilla_esr115(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a promotable beta or release build
     of desktop, without android CI. The candidates build process involves a pipeline
@@ -618,7 +618,7 @@ def target_tasks_mozilla_esr115(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("promote_desktop")
+@register_target_task("promote_desktop")
 def target_tasks_promote_desktop(full_task_graph, parameters, graph_config):
     """Select the superset of tasks required to promote a beta or release build
     of a desktop product. This should include all non-android
@@ -642,7 +642,7 @@ def target_tasks_promote_desktop(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("push_desktop")
+@register_target_task("push_desktop")
 def target_tasks_push_desktop(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to push a build of desktop to cdns.
     Previous build deps will be optimized out via action task."""
@@ -668,7 +668,7 @@ def target_tasks_push_desktop(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("ship_desktop")
+@register_target_task("ship_desktop")
 def target_tasks_ship_desktop(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to ship desktop.
     Previous build deps will be optimized out via action task."""
@@ -709,7 +709,7 @@ def target_tasks_ship_desktop(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("pine_tasks")
+@register_target_task("pine_tasks")
 def target_tasks_pine(full_task_graph, parameters, graph_config):
     """Bug 1879960 - no reftests or wpt needed"""
     filtered_for_project = target_tasks_default(
@@ -725,7 +725,7 @@ def target_tasks_pine(full_task_graph, parameters, graph_config):
     return [l for l in filtered_for_project if filter(full_task_graph[l])]


-@_target_task("larch_tasks")
+@register_target_task("larch_tasks")
 def target_tasks_larch(full_task_graph, parameters, graph_config):
     """Bug 1879213 - only run necessary tasks on larch"""
     filtered_for_project = target_tasks_default(
@@ -749,7 +749,7 @@ def target_tasks_larch(full_task_graph, parameters, graph_config):
     return [l for l in filtered_for_project if filter(full_task_graph[l])]


-@_target_task("kaios_tasks")
+@register_target_task("kaios_tasks")
 def target_tasks_kaios(full_task_graph, parameters, graph_config):
     """The set of tasks to run for kaios integration"""

@@ -760,7 +760,7 @@ def target_tasks_kaios(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("custom-car_perf_testing")
+@register_target_task("custom-car_perf_testing")
 def target_tasks_custom_car_perf_testing(full_task_graph, parameters, graph_config):
     """Select tasks required for running daily performance tests for custom chromium-as-release."""

@@ -787,7 +787,7 @@ def target_tasks_custom_car_perf_testing(full_task_graph, parameters, graph_conf
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("general_perf_testing")
+@register_target_task("general_perf_testing")
 def target_tasks_general_perf_testing(full_task_graph, parameters, graph_config):
     """
     Select tasks required for running performance tests 3 times a week.
@@ -910,7 +910,7 @@ def make_desktop_nightly_filter(platforms):
     return filter


-@_target_task("sp-perftests")
+@register_target_task("sp-perftests")
 def target_tasks_speedometer_tests(full_task_graph, parameters, graph_config):
     def filter(task):
         platform = task.attributes.get("test_platform")
@@ -934,7 +934,7 @@ def target_tasks_speedometer_tests(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("nightly_linux")
+@register_target_task("nightly_linux")
 def target_tasks_nightly_linux(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of linux. The
     nightly build process involves a pipeline of builds, signing,
@@ -945,7 +945,7 @@ def target_tasks_nightly_linux(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("nightly_macosx")
+@register_target_task("nightly_macosx")
 def target_tasks_nightly_macosx(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of macosx. The
     nightly build process involves a pipeline of builds, signing,
@@ -954,7 +954,7 @@ def target_tasks_nightly_macosx(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("nightly_win32")
+@register_target_task("nightly_win32")
 def target_tasks_nightly_win32(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of win32 and win64.
     The nightly build process involves a pipeline of builds, signing,
@@ -963,7 +963,7 @@ def target_tasks_nightly_win32(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("nightly_win64")
+@register_target_task("nightly_win64")
 def target_tasks_nightly_win64(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of win32 and win64.
     The nightly build process involves a pipeline of builds, signing,
@@ -972,7 +972,7 @@ def target_tasks_nightly_win64(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("nightly_win64_aarch64")
+@register_target_task("nightly_win64_aarch64")
 def target_tasks_nightly_win64_aarch64(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of win32 and win64.
     The nightly build process involves a pipeline of builds, signing,
@@ -981,7 +981,7 @@ def target_tasks_nightly_win64_aarch64(full_task_graph, parameters, graph_config
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("nightly_asan")
+@register_target_task("nightly_asan")
 def target_tasks_nightly_asan(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of asan. The
     nightly build process involves a pipeline of builds, signing,
@@ -992,7 +992,7 @@ def target_tasks_nightly_asan(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("daily_releases")
+@register_target_task("daily_releases")
 def target_tasks_daily_releases(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to identify if we should release.
     If we determine that we should the task will communicate to ship-it to
@@ -1004,7 +1004,7 @@ def target_tasks_daily_releases(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("nightly_desktop")
+@register_target_task("nightly_desktop")
 def target_tasks_nightly_desktop(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of linux, mac,
     windows."""
@@ -1042,7 +1042,7 @@ def target_tasks_nightly_desktop(full_task_graph, parameters, graph_config):
     )


-@_target_task("nightly_all")
+@register_target_task("nightly_all")
 def target_tasks_nightly_all(full_task_graph, parameters, graph_config):
     """Select the set of tasks required for a nightly build of firefox desktop and android"""
     index_path = (
@@ -1065,7 +1065,7 @@ def target_tasks_nightly_all(full_task_graph, parameters, graph_config):


 # Run Searchfox analysis once daily.
-@_target_task("searchfox_index")
+@register_target_task("searchfox_index")
 def target_tasks_searchfox(full_task_graph, parameters, graph_config):
     """Select tasks required for indexing Firefox for Searchfox web site each day"""
     return [
@@ -1081,7 +1081,7 @@ def target_tasks_searchfox(full_task_graph, parameters, graph_config):


 # Run build linux64-plain-clang-trunk/opt on mozilla-central/beta with perf tests
-@_target_task("linux64_clang_trunk_perf")
+@register_target_task("linux64_clang_trunk_perf")
 def target_tasks_build_linux64_clang_trunk_perf(
     full_task_graph, parameters, graph_config
 ):
@@ -1097,19 +1097,19 @@ def target_tasks_build_linux64_clang_trunk_perf(


 # Run Updatebot's cron job 4 times daily.
-@_target_task("updatebot_cron")
+@register_target_task("updatebot_cron")
 def target_tasks_updatebot_cron(full_task_graph, parameters, graph_config):
     """Select tasks required to run Updatebot's cron job"""
     return ["updatebot-cron"]


-@_target_task("customv8_update")
+@register_target_task("customv8_update")
 def target_tasks_customv8_update(full_task_graph, parameters, graph_config):
     """Select tasks required for building latest d8/v8 version."""
     return ["toolchain-linux64-custom-v8"]


-@_target_task("file_update")
+@register_target_task("file_update")
 def target_tasks_file_update(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to perform nightly in-tree file updates"""

@@ -1120,7 +1120,7 @@ def target_tasks_file_update(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("l10n_bump")
+@register_target_task("l10n_bump")
 def target_tasks_l10n_bump(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to perform l10n bumping."""

@@ -1131,7 +1131,7 @@ def target_tasks_l10n_bump(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("merge_automation")
+@register_target_task("merge_automation")
 def target_tasks_merge_automation(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to perform repository merges."""

@@ -1142,7 +1142,7 @@ def target_tasks_merge_automation(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("scriptworker_canary")
+@register_target_task("scriptworker_canary")
 def target_tasks_scriptworker_canary(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to run scriptworker canaries."""

@@ -1153,7 +1153,7 @@ def target_tasks_scriptworker_canary(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("cron_bouncer_check")
+@register_target_task("cron_bouncer_check")
 def target_tasks_bouncer_check(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to perform bouncer version verification."""

@@ -1166,7 +1166,7 @@ def target_tasks_bouncer_check(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("staging_release_builds")
+@register_target_task("staging_release_builds")
 def target_tasks_staging_release(full_task_graph, parameters, graph_config):
     """
     Select all builds that are part of releases.
@@ -1190,7 +1190,7 @@ def target_tasks_staging_release(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("release_simulation")
+@register_target_task("release_simulation")
 def target_tasks_release_simulation(full_task_graph, parameters, graph_config):
     """
     Select builds that would run on push on a release branch.
@@ -1227,7 +1227,7 @@ def target_tasks_release_simulation(full_task_graph, parameters, graph_config):
     ]


-@_target_task("codereview")
+@register_target_task("codereview")
 def target_tasks_codereview(full_task_graph, parameters, graph_config):
     """Select all code review tasks needed to produce a report"""

@@ -1245,13 +1245,13 @@ def target_tasks_codereview(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("nothing")
+@register_target_task("nothing")
 def target_tasks_nothing(full_task_graph, parameters, graph_config):
     """Select nothing, for DONTBUILD pushes"""
     return []


-@_target_task("daily_beta_perf")
+@register_target_task("daily_beta_perf")
 def target_tasks_daily_beta_perf(full_task_graph, parameters, graph_config):
     """
     Select performance tests on the beta branch to be run daily
@@ -1353,7 +1353,7 @@ def target_tasks_daily_beta_perf(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("weekly_release_perf")
+@register_target_task("weekly_release_perf")
 def target_tasks_weekly_release_perf(full_task_graph, parameters, graph_config):
     """
     Select performance tests on the release branch to be run weekly
@@ -1421,7 +1421,7 @@ def target_tasks_weekly_release_perf(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("raptor_tp6m")
+@register_target_task("raptor_tp6m")
 def target_tasks_raptor_tp6m(full_task_graph, parameters, graph_config):
     """
     Select tasks required for running raptor cold page-load tests on fenix and refbrow
@@ -1448,7 +1448,7 @@ def target_tasks_raptor_tp6m(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("backfill_all_browsertime")
+@register_target_task("backfill_all_browsertime")
 def target_tasks_backfill_all_browsertime(full_task_graph, parameters, graph_config):
     """
     Search for revisions that contains patches that were reviewed by perftest reviewers
@@ -1516,7 +1516,7 @@ def target_tasks_backfill_all_browsertime(full_task_graph, parameters, graph_con
     return []


-@_target_task("condprof")
+@register_target_task("condprof")
 def target_tasks_condprof(full_task_graph, parameters, graph_config):
     """
     Select tasks required for building conditioned profiles.
@@ -1527,7 +1527,7 @@ def target_tasks_condprof(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("system_symbols")
+@register_target_task("system_symbols")
 def target_tasks_system_symbols(full_task_graph, parameters, graph_config):
     """
     Select tasks for scraping and uploading system symbols.
@@ -1541,7 +1541,7 @@ def target_tasks_system_symbols(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("perftest")
+@register_target_task("perftest")
 def target_tasks_perftest(full_task_graph, parameters, graph_config):
     """
     Select perftest tasks we want to run daily
@@ -1553,7 +1553,7 @@ def target_tasks_perftest(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("perftest-on-autoland")
+@register_target_task("perftest-on-autoland")
 def target_tasks_perftest_autoland(full_task_graph, parameters, graph_config):
     """
     Select perftest tasks we want to run daily
@@ -1567,7 +1567,7 @@ def target_tasks_perftest_autoland(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("l10n-cross-channel")
+@register_target_task("l10n-cross-channel")
 def target_tasks_l10n_cross_channel(full_task_graph, parameters, graph_config):
     """Select the set of tasks required to run l10n cross-channel."""

@@ -1577,7 +1577,7 @@ def target_tasks_l10n_cross_channel(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("eslint-build")
+@register_target_task("eslint-build")
 def target_tasks_eslint_build(full_task_graph, parameters, graph_config):
     """Select the task to run additional ESLint rules which require a build."""

@@ -1588,7 +1588,7 @@ def target_tasks_eslint_build(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("holly_tasks")
+@register_target_task("holly_tasks")
 def target_tasks_holly(full_task_graph, parameters, graph_config):
     """Bug 1814661: only run updatebot tasks on holly"""

@@ -1598,7 +1598,7 @@ def target_tasks_holly(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t)]


-@_target_task("snap_upstream_tests")
+@register_target_task("snap_upstream_tests")
 def target_tasks_snap_upstream_tests(full_task_graph, parameters, graph_config):
     """
     Select tasks for testing Snap package built as upstream. Omit -try because
@@ -1609,7 +1609,7 @@ def target_tasks_snap_upstream_tests(full_task_graph, parameters, graph_config):
         yield name


-@_target_task("nightly-android")
+@register_target_task("nightly-android")
 def target_tasks_nightly_android(full_task_graph, parameters, graph_config):
     def filter(task, parameters):
         # geckoview
@@ -1645,7 +1645,7 @@ def target_tasks_nightly_android(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if filter(t, parameters)]


-@_target_task("android-l10n-import")
+@register_target_task("android-l10n-import")
 def target_tasks_android_l10n_import(full_task_graph, parameters, graph_config):
     return [l for l, t in full_task_graph.tasks.items() if l == "android-l10n-import"]

@@ -47,7 +47,7 @@ def enable_logging():

 @pytest.fixture(scope="session")
 def graph_config():
-    return load_graph_config(os.path.join(GECKO, "taskcluster", "ci"))
+    return load_graph_config(os.path.join(GECKO, "taskcluster"))


 @pytest.fixture(scope="session")
@@ -155,6 +155,7 @@ def is_subset(subset, superset):
 def test_extract_applicable_action(
     responses, monkeypatch, actions_json, task_def, expected
 ):
+    actions.util.get_task_definition.clear()
     base_url = "https://taskcluster"
     decision_task_id = "dddd"
     task_id = "tttt"
@@ -37,7 +37,7 @@ TASK_DEFAULTS = {

 @pytest.fixture(scope="module")
 def config():
-    graph_config = load_graph_config(os.path.join(GECKO, "taskcluster", "ci"))
+    graph_config = load_graph_config(os.path.join(GECKO, "taskcluster"))
     params = FakeParameters(
         {
             "base_repository": "http://hg.example.com",
@@ -32,7 +32,7 @@ diff_description_schema = Schema(
         Required("original"): index_or_string,
         Required("new"): index_or_string,
         # Arguments to pass to diffoscope, used for job-defaults in
-        # taskcluster/ci/diffoscope/kind.yml
+        # taskcluster/kinds/diffoscope/kind.yml
         Optional("args"): str,
         # Extra arguments to pass to diffoscope, that can be set per job.
         Optional("extra-args"): str,
@@ -3,7 +3,7 @@
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 """
 Transform the reprocess-symbols task description template,
-taskcluster/ci/reprocess-symbols/job-template.yml into an actual task description.
+taskcluster/kinds/reprocess-symbols/job-template.yml into an actual task description.
 """

@@ -208,7 +208,7 @@ TC_TREEHERDER_SCHEMA_URL = (


 UNKNOWN_GROUP_NAME = (
-    "Treeherder group {} (from {}) has no name; " "add it to taskcluster/ci/config.yml"
+    "Treeherder group {} (from {}) has no name; " "add it to taskcluster/config.yml"
 )

 V2_ROUTE_TEMPLATES = [
@@ -313,7 +313,7 @@ def index_builder(name):

 UNSUPPORTED_INDEX_PRODUCT_ERROR = """\
 The gecko-v2 product {product} is not in the list of configured products in
-`taskcluster/ci/config.yml'.
+`taskcluster/config.yml'.
 """

@@ -1612,7 +1612,7 @@ def task_name_from_label(config, tasks):

 UNSUPPORTED_SHIPPING_PRODUCT_ERROR = """\
 The shipping product {product} is not in the list of configured products in
-`taskcluster/ci/config.yml'.
+`taskcluster/config.yml'.
 """

@@ -3,7 +3,7 @@
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 """
 Transform the upload-generated-files task description template,
-taskcluster/ci/upload-generated-sources/kind.yml, into an actual task description.
+taskcluster/kinds/upload-generated-sources/kind.yml, into an actual task description.
 """

 from taskgraph.transforms.base import TransformSequence
@@ -3,7 +3,7 @@
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 """
 Transform the upload-symbols task description template,
-taskcluster/ci/upload-symbols/job-template.yml into an actual task description.
+taskcluster/kinds/upload-symbols/job-template.yml into an actual task description.
 """

@@ -24,8 +24,8 @@ here = os.path.abspath(os.path.dirname(__file__))
 resolver = TestResolver.from_environment(cwd=here, loader_cls=TestManifestLoader)

 TEST_VARIANTS = {}
-if os.path.exists(os.path.join(GECKO, "taskcluster", "ci", "test", "variants.yml")):
-    TEST_VARIANTS = load_yaml(GECKO, "taskcluster", "ci", "test", "variants.yml")
+if os.path.exists(os.path.join(GECKO, "taskcluster", "kinds", "test", "variants.yml")):
+    TEST_VARIANTS = load_yaml(GECKO, "taskcluster", "kinds", "test", "variants.yml")

 WPT_SUBSUITES = {
     "canvas": "html/canvas",
@@ -298,7 +298,7 @@ class ImagePathsMap(Mapping):
         self.__update_image_paths(jobs, image_dir)


-image_paths = ImagePathsMap("taskcluster/ci/docker-image/kind.yml")
+image_paths = ImagePathsMap("taskcluster/kinds/docker-image/kind.yml")


 def image_path(name):
@@ -523,7 +523,7 @@ def apply_partner_priority(config, jobs):
     # Reduce the priority of the partner repack jobs because they don't block QE. Meanwhile
     # leave EME-free jobs alone because they do, and they'll get the branch priority like the rest
     # of the release. Only bother with this in production, not on staging releases on try.
-    # medium is the same as mozilla-central, see taskcluster/ci/config.yml. ie higher than
+    # medium is the same as mozilla-central, see taskcluster/config.yml. ie higher than
     # integration branches because we don't want to wait a lot for the graph to be done, but
     # for multiple releases the partner tasks always wait for non-partner.
     if (
Some files were not shown because too many files have changed in this diff.