For kind=hook, the spec doesn't include this value because it's untrustworthy.
For kind=task, it's equally untrustworthy, but since there is no privilege
escalation that matters less; the remaining problem is that including it
dramatically expands the size of the task definition.
MozReview-Commit-ID: 6scQ2ZwxP10
--HG--
extra : rebase_source : 4dc34390a510091ddc26023755992995fe358e47
The latest python-zstandard uses a newer zstandard that is faster.
It also has wheels available, which means installation doesn't require
Python development headers, etc.
MozReview-Commit-ID: 5gRq81KYmX4
--HG--
extra : rebase_source : 96ccc64e9707c6b4815c1bfa5c1a98b9a428b387
Now that `mach taskcluster-build-image` can do this work itself, we can avoid
all the manual curl- and jq-based handling in the image builder.
An additional advantage of relying on `mach taskcluster-build-image`
doing more is that fewer changes to the build-image.sh script will be
necessary, and thus fewer updates of the image builder docker image.
--HG--
extra : rebase_source : dd174d60675e41e4391894f28235c674c1840829
The zstd command we spawn, if available at all, might be the wrong
version: zstd changed its stream format in an incompatible way at some
point, and the version shipped in e.g. Ubuntu 16.04 uses the old format,
while the version taskcluster relies on uses the new format.
Relying on gps's zstandard library lets us ensure we use the right
version. Another advantage is that we can trivially pip install it in a
virtualenv if it isn't available on the system running the command.
If we're ridding ourselves of the subprocess spawning for zstd, we might
as well cover curl too, especially since error handling around
subprocesses is not trivial: the current error handling code is actually
broken and leads to deadlocks when, for example, curl is still waiting
for the python side to read data, but the python side has stopped
reading because an exception was thrown in the tar reading loop.
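
For illustration, a minimal sketch of the resulting approach, assuming the
`requests` and `zstandard` packages and a hypothetical artifact URL:

    import tarfile

    import requests
    import zstandard

    def download_and_extract(url, destdir):
        # Stream the .tar.zst artifact over HTTP; no curl subprocess involved.
        response = requests.get(url, stream=True)
        response.raise_for_status()
        dctx = zstandard.ZstdDecompressor()
        # stream_reader decompresses on the fly and tarfile consumes the
        # stream, so an exception in the tar loop simply propagates instead
        # of leaving a subprocess blocked on a pipe.
        with dctx.stream_reader(response.raw) as reader:
            with tarfile.open(fileobj=reader, mode='r|') as tar:
                tar.extractall(destdir)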
--HG--
extra : rebase_source : 054c37cfaa68bf475b37545ebaa99144584b93d4
Ideally, we'd simply use the --build-arg docker argument along with ARG
in the Dockerfile, but that's only supported from Docker API 1.21, and
we're stuck on 1.18 for the moment.
So we add another hack to how we handle the Dockerfile: a commented
syntax that allows declaring arguments in the Dockerfile.
The arguments can be defined in the docker images kind.yml file through
the `args` keyword. Under the hood, they are passed down to the docker
image task through the environment. The mach taskcluster-build-image
command then uses the corresponding values from the environment to
generate a "preprocessed" Dockerfile for its context.
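
As a rough sketch of the idea (the "# %ARG NAME" comment syntax and the
substitution rules below are illustrative assumptions, not the exact
implementation):

    import os
    import re

    def preprocess_dockerfile(source, env=os.environ):
        # Arguments are declared with a commented directive so that a plain
        # `docker build` still sees a valid Dockerfile.
        args = {}
        output = []
        for line in source.splitlines():
            match = re.match(r'#\s*%ARG\s+(\w+)', line)
            if match:
                name = match.group(1)
                args[name] = env[name]  # value comes from the task environment
                continue
            for name, value in args.items():
                line = line.replace('$%s' % name, value)
            output.append(line)
        return '\n'.join(output) + '\n'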
--HG--
extra : rebase_source : 26a43dd680c1ab97b1a4689a23c55594a3b21b67
Initially this will skip toolchain task optimizations, which avoids hashing
local directory contents and speeds up taskgraph generation by about 25%.
MozReview-Commit-ID: B4LB5BV86nw
--HG--
extra : rebase_source : c41e4d838e8920b865cd62bb8de38e64b85b2d84
Add a release promotion custom action for releng's TC release promotion
migration work.
This action generates a graph dependent on previously built tasks. To
track these, we add the `do_not_optimize` and `existing_tasks`
parameters. The `do_not_optimize` parameter specifies tasks that we want
to explicitly exclude from taskgraph optimization. The `existing_tasks`
parameter specifies a label-to-taskid map for tasks from previous
graphs.
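
A simplified sketch of how these two parameters could be consumed when
deciding whether to optimize a task (the function name is illustrative,
not the actual taskgraph code):

    def replacement_for(label, parameters):
        # Tasks listed in do_not_optimize are always rebuilt; tasks known
        # from a previous graph are replaced with their existing task id.
        if label in parameters.get('do_not_optimize', []):
            return None
        return parameters.get('existing_tasks', {}).get(label)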
MozReview-Commit-ID: 1vKrNUavM4V
--HG--
extra : rebase_source : b8ba95d270aafe1464c2b3bfc318b9568500a7a1
These tests can now be run with:
./mach python-test taskcluster/taskgraph
or:
./mach python-test taskcluster
They can now run in parallel by passing in -j.
MozReview-Commit-ID: JXeZV8B04Sf
--HG--
extra : rebase_source : 7427f96facaf75e6b6fcca532027c68fb6c1a1a3
The purpose of this parameter has been superseded by the `include_nightly`
property.
MozReview-Commit-ID: 4iXQsv9Drqg
--HG--
extra : rebase_source : c94142282909a88c29fe6809d87721bef1f198c2
Graph morphs modify the graph after optimization, without changing its meaning.
In this case, that means adding index tasks that will insert paths into the
index beyond the relatively limited number afforded in task.routes.
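
Schematically, assuming a limit of 10 routes per task (the constant and the
task shape below are illustrative, not the real morph code):

    MAX_ROUTES = 10  # assumed per-task limit on task.routes

    def derive_index_tasks(tasks):
        # For each task with too many routes, keep the first MAX_ROUTES on
        # the task itself and emit a minimal follow-up task whose only job
        # is to insert the remaining paths into the index.
        extra = []
        for label, task in tasks.items():
            routes = task.get('routes', [])
            if len(routes) <= MAX_ROUTES:
                continue
            task['routes'] = routes[:MAX_ROUTES]
            extra.append({
                'label': label + '-index',
                'dependencies': [label],
                'routes': routes[MAX_ROUTES:],
            })
        return extra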
MozReview-Commit-ID: AJy4exX7q2v
--HG--
extra : rebase_source : d61e7462defd41e7112739fb057edb493f495430
extra : source : c580568ed47c1ed2af40d98b47fbb0d136e63060
Various modules under taskcluster are doing ad-hoc url formatting or
requests to taskcluster services. While we could use the taskcluster
client python module, it's kind of overkill for the simple requests done
here. So instead of vendoring that module, create a smaller one with
a limited set of functions we need.
This changes the behavior of the get_artifact function to return a
file-like object when the file is neither JSON nor YAML, but that
branch was never used (and was actually returning an unassigned
variable, so it was broken anyway).
At the same time, make the function that does HTTP requests more
error-resistant, using urllib3's Retry with a backoff factor.
Also add a function that retrieves the list of artifacts; while
currently unused, it will be used by `mach artifact` shortly.
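
Condensed, the module amounts to something like the following (the URLs and
response shapes are illustrative of the queue API, not copied from the real
code):

    import json

    import requests
    import yaml
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    _session = requests.Session()
    _session.mount('https://', HTTPAdapter(
        max_retries=Retry(total=5, backoff_factor=0.1)))

    def _do_request(url):
        response = _session.get(url, stream=True)
        response.raise_for_status()
        return response

    def get_artifact(task_id, path):
        url = 'https://queue.taskcluster.net/v1/task/{}/artifacts/{}'.format(
            task_id, path)
        response = _do_request(url)
        if path.endswith('.json'):
            return json.loads(response.text)
        if path.endswith('.yml'):
            return yaml.safe_load(response.text)
        # Neither JSON nor YAML: hand back a file-like object.
        return response.raw

    def list_artifacts(task_id):
        url = 'https://queue.taskcluster.net/v1/task/{}/artifacts'.format(task_id)
        return _do_request(url).json()['artifacts']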
--HG--
extra : rebase_source : d7ef633e8e5041dc8450f3ff2f3751c85f144cdc
Framework for defining actions in-tree that can be displayed
and triggered from Treeherder.
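
As a toy illustration of the registration pattern such a framework implies
(the decorator and its fields are hypothetical, not the actual API):

    ACTIONS = []

    def register_action(title, symbol, description):
        # Record enough metadata for Treeherder to display the action; the
        # callback runs when a user triggers it.
        def wrap(callback):
            ACTIONS.append({
                'title': title,
                'symbol': symbol,
                'description': description,
                'callback': callback,
            })
            return callback
        return wrap

    @register_action(title='Retrigger', symbol='rt',
                     description='Run the selected task again.')
    def retrigger(parameters, task_id):
        # A real callback would schedule a new task based on task_id.
        print('would retrigger', task_id)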
MozReview-Commit-ID: 3rvwgy2i4xu
--HG--
extra : rebase_source : beca394f4337aae4ab149e4db810352f57ec4988
This adds `.cron.yml` and a new mach command to interpret it. While
functionality is limited to nightlies right now, there is room to expand to
more diverse periodic tasks. Let your imagination run wild!
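
A bare-bones sketch of what interpreting such a file could involve; the
schema used here (jobs with an every-n-minutes interval) is purely
illustrative:

    import datetime

    import yaml

    def due_jobs(path, now=None):
        # Yield the names of jobs whose interval matches the current time.
        now = now or datetime.datetime.utcnow()
        with open(path) as fh:
            config = yaml.safe_load(fh)
        for job in config.get('jobs', []):
            minutes = job['when']['every-n-minutes']
            if (now.hour * 60 + now.minute) % minutes == 0:
                yield job['name']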
MozReview-Commit-ID: KxQkaUbsjQs
--HG--
extra : rebase_source : ddf0a1eadae5a1169c0ead7bcb7b9ce61b255fbf
Previously, all callers outside of tests that passed
"target_tasks_method" to TaskGraphGenerator used the same pattern of
looking up a key in the parameters and calling a function in the
target_tasks module.
Future commits will refactor how the target task graph works. To
make the transition easier, we move the logic for obtaining the
target tasks method into TaskGraphGenerator.
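
Roughly the pattern that callers previously duplicated (the fallback name
and the attribute-lookup convention below are assumptions):

    import importlib

    def resolve_target_tasks_method(parameters):
        # Read the method name from the parameters and resolve it against
        # the target_tasks module, falling back to a default when absent.
        name = parameters.get('target_tasks_method', 'all_tasks')
        module = importlib.import_module('taskgraph.target_tasks')
        return getattr(module, 'target_tasks_' + name)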
MozReview-Commit-ID: 3QU09iGhoXh
--HG--
extra : rebase_source : fbcc31d705c4b0e148aa3709ddcb18ad99953231