At the moment, Python packages with native code are installed by calling
`pip install <package>`, which enforces neither a SHA hash for the installed
dependencies nor a specific version.
This commit adds `requirements.in` and `requirements.txt` files for native
packages and changes these packages to be installed by running `pip install`
and passing the requirements file for the package. This allows us to pin the
SHA of the various dependencies. The `.txt` files are generated using
`pip-compile`.
We also add the new requirements files to the sparse profile for `mach`.
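As an illustration (not the exact in-tree code), installing from a requirements file produced by `pip-compile --generate-hashes` might look like this; the helper name is hypothetical:

```python
import subprocess
import sys

def install_pinned(requirements_txt):
    # Hypothetical helper: install a native-code package from a
    # pip-compile generated requirements file. --require-hashes makes
    # pip refuse any dependency whose SHA doesn't match the pinned value.
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--require-hashes", "-r", requirements_txt,
    ])
```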
Differential Revision: https://phabricator.services.mozilla.com/D99912
Covers adding the new JS global `GleanPings`, the new C++ structs in
`mozilla::glean_pings`, ping-id and string-table-index codegen, the usual
JS and C++ boilerplate, and tests.
Unresolved:
* What happens if we call this on a non-parent process?
(This isn't a supported mode of operation)
Differential Revision: https://phabricator.services.mozilla.com/D98671
This extracts an `install-meson.sh` helper script to install meson in both
the wrench-deps task for Firefox CI and the `taskcluster.yml` in WebRender CI.
Differential Revision: https://phabricator.services.mozilla.com/D90441
Also define a scheme for storing the index of Glean definitions files in a file
separate from the build system for consumption by
* mach build
* mach doc
* (future) mozilla/probe-scraper
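One possible shape for such an index file, sketched here with illustrative paths (the real file name and contents may differ), is a plain Python module that all three consumers can import:

```python
# Hypothetical index module: lists of Glean definitions files, kept
# outside the build system proper so other consumers can read them too.
metrics_yamls = [
    "toolkit/components/glean/metrics.yaml",
]

pings_yamls = [
    "toolkit/components/glean/pings.yaml",
]
```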
Differential Revision: https://phabricator.services.mozilla.com/D87600
In two different places we've been encountering issues regarding 1) how we configure the system Python environment and 2) how the system Python environment relates to the `virtualenv`s that we use for building, testing, and other dev tasks. Specifically:
1. With the push to use `glean` for telemetry in `mach`, we are requiring (or rather, strongly encouraging) the `glean_sdk` Python package to be installed with bug 1651424. `mach bootstrap` upgrades the library using your system Python 3 in bug 1654607. We can't vendor it due to the package containing native code. Since we generally vendor all code required for `mach` to function, requiring that the system Python be configured with a certain version of `glean` is an unfortunate change.
2. The build uses the vendored `glean_parser` for a number of build tasks. Since the vendored `glean_parser` conflicts with the globally-installed `glean_sdk` package, we had to add special ad-hoc handling to allow us to circumvent this conflict in bug 1655781.
3. We have begun to rely more and more on the `zstandard` package during build tasks; again, this is a package we can't vendor because it contains native code. Bug 1654994 added more ad-hoc code which subprocesses out from the build system's `virtualenv` to the SYSTEM `python3` binary, assuming that the system `python3` has `zstandard` installed.
As we rely more on `glean_sdk`, `zstandard`, and other packages that are not vendorable, we need to settle on a standard model for how `mach`, the build process, and other `mach` commands that may make their own `virtualenv`s work in the presence of unvendorable packages.
With that in mind, this patch does all the following:
1. Separate out the `mach` `virtualenv_packages` from the in-build `virtualenv_packages`. Refactor the common stuff into `common_virtualenv_packages.txt`. Add functionality to the `virtualenv_packages` manifest parsing to allow the build `virtualenv` to "inherit" from the parent by pointing to the parent's `site-packages`. The `in-virtualenv` feature from bug 1655781 is no longer necessary, so delete it.
2. Add code to `bootstrap`, as well as a new `mach` command `create-mach-environment` to create `virtualenv`s in `~/.mozbuild`.
3. Add code to `mach` to dispatch either to the in-`~/.mozbuild` `virtualenv`s (or to the system Python 3 for commands which cannot run in the `virtualenv`s, namely `bootstrap` and `create-mach-environment`).
4. Remove the "add global argument" feature from `mach`. It isn't used and conflicts with (3).
5. Remove the `--print-command` feature from `mach` which is obsoleted by these changes.
This has the effect of allowing us to install packages that cannot be vendored into a "common" place (namely the global `~/.mozbuild` `virtualenv`s) and use those from the build without requiring us to hit the network. Miscellaneous implementation notes:
1. We allow users to force `mach` to run with the system Python if they prefer. For now it doesn't make sense to require everyone to create these `virtualenv`s when the old behavior still works for them. We also skip this in CI.
2. We needed to duplicate the global-argument logic into the `mach` script to allow for the dispatch behavior. This is something we avoided with the Python 2 -> Python 3 migration via the `--print-command` feature, justifying its use by saying it was only temporarily required until all `mach` commands were running with Python 3. With this change, we'll need to be able to determine the `mach` command from the shell script for the foreseeable future, and committing to this forever with the cost that `--print-command` incurs (namely `mach` startup time, an additional 0.4s on my machine) didn't seem worth it to me. It's not a ton of duplicated code.
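A minimal sketch of the dispatch rule described in (3), with illustrative paths and names rather than `mach`'s actual internals:

```python
import os
import sys

# Commands that must run with the system Python because the managed
# virtualenvs may not exist yet (or are being created).
SYSTEM_PYTHON_COMMANDS = {"bootstrap", "create-mach-environment"}

def interpreter_for(command):
    # Hypothetical location of the ~/.mozbuild virtualenv's interpreter.
    venv_python = os.path.expanduser("~/.mozbuild/mach_virtualenv/bin/python")
    if command in SYSTEM_PYTHON_COMMANDS or not os.path.exists(venv_python):
        # Fall back to the system interpreter; users can also opt to keep
        # running mach this way instead of creating the virtualenvs.
        return sys.executable
    return venv_python
```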
Differential Revision: https://phabricator.services.mozilla.com/D85916
Add an action that will trigger a task that runs
`mach release push-scriptworker-canary`
to test a new scriptworker deployment.
Differential Revision: https://phabricator.services.mozilla.com/D82821
This patch adds an option to push a perftest run to CI.
It's based on:
- sparse profiles
- push_to_try
- options passed through try_task_config.json
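For illustration, the kind of payload that might end up in `try_task_config.json` for such a push; the keys are assumptions about the schema, not the exact format:

```python
import json

# Illustrative try_task_config.json payload for a perftest push.
config = {
    "version": 1,
    "tasks": ["source-test-perftest-try"],  # hypothetical task label
}
print(json.dumps(config, indent=2))
```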
Differential Revision: https://phabricator.services.mozilla.com/D74115
Without this, taskgraph calculates the digest based on just the files in these
directories that are in the sparse profile (I suspect this is just the
moz.build files), and will rebuild it when those files change. This change
ensures that all files in those directories are used to calculate the digest.
Additionally, this will cause the same digest to be generated by developers
locally, since the files *are* present in the non-sparse checkout that most
developers have.
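Conceptually, the digest then behaves like hashing every file under the directory rather than only the sparse subset; a simplified sketch (not taskgraph's actual implementation):

```python
import hashlib
from pathlib import Path

def directory_digest(directory):
    # Hash the path and contents of every file so that any change in the
    # directory, not just sparse-profile files, alters the digest.
    h = hashlib.sha256()
    for path in sorted(Path(directory).rglob("*")):
        if path.is_file():
            h.update(str(path).encode())
            h.update(path.read_bytes())
    return h.hexdigest()
```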
Differential Revision: https://phabricator.services.mozilla.com/D74113
This change introduces a "github-sync" component into tools,
which aims to support synchronizing both wgpu and WebRender with github.
~~It also features a "cargo test" job for standalone wgpu (bug 1596127)~~
The code is ported from the "gfx/wr/scripts/wrupdater" folder. The changes are:
1. remove the explicit WR parts and make them configurable via command line parameters
2. detect "mozilla-xxx" tags and use them in addition to the commits
As a follow-up, wrupdater will be removed in favor of github-sync.
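A rough sketch of change (2), assuming tags arrive as plain strings (the pattern is an assumption, not the ported code):

```python
import re

# Tags like "mozilla-xxx" should be synchronized to github alongside
# the commits they point at.
MOZILLA_TAG = re.compile(r"^mozilla-[\w.-]+$")

def tags_to_sync(all_tags):
    return [tag for tag in all_tags if MOZILLA_TAG.match(tag)]
```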
Status:
- [x] get the CI test job working
- [x] get @kats to fork "wgpu" github for "moz-gfx" bot
- [x] remove the wgpu testing CI job (into separate PR)
- [x] create new secret and reference it
Differential Revision: https://phabricator.services.mozilla.com/D57057
We've long handled chunks by defining the total number of chunks in our CI
configuration, and then passing that value down into the test harnesses at task
runtime (via the '--this-chunk' and '--total-chunks' parameters). The test
harness then runs an algorithm to determine which tests should be run in "this"
chunk.
There are several problems with this approach, but by far the biggest is that
we can't use test information in our scheduling algorithms. The information
simply isn't available yet. This patch switches things around such that we
determine which tests go in which tasks during the taskgraph generation. This
means we have perfect information around which tasks are running which tests,
and if e.g. a ccov or machine learning algorithm deems a particular test
important, we can make sure to *only* schedule the tasks that contain that
test.
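For intuition, a runtime-based partition like the 'chunk_by_runtime' algorithm mentioned below can be sketched as a greedy longest-first assignment (illustrative, not the in-tree code):

```python
def chunk_by_runtime(test_runtimes, total_chunks):
    # Greedy longest-first assignment: each test goes into the chunk
    # with the least accumulated runtime, keeping durations balanced.
    chunks = [{"tests": [], "runtime": 0.0} for _ in range(total_chunks)]
    for test, runtime in sorted(test_runtimes.items(), key=lambda kv: -kv[1]):
        lightest = min(chunks, key=lambda c: c["runtime"])
        lightest["tests"].append(test)
        lightest["runtime"] += runtime
    return [c["tests"] for c in chunks]
```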
I'm planning to enable this a couple of suites at a time so we don't accidentally
stop running tests. This specifically only enables this mode for
'mochitest-media', 'mochitest-browser-chrome' and 'mochitest-devtools-chrome'.
I chose these suites because they are the ones that are already using the
'chunk_by_runtime' algorithm.
Differential Revision: https://phabricator.services.mozilla.com/D52729
Right now toolchain tasks that aren't dependencies of requested tasks
will run in response to source code changes only if the files are in the
sparse profile. That is, taskgraph calculates the digest based on just
the files in those directories that are in the sparse profile, and will
rebuild it when those files change. This change ensures that more
geckodriver files are used to calculate the digest, and thus changes
to them will cause geckodriver to be rebuilt.
As more things come to depend on the geckodriver toolchain tasks directly,
this becomes less valuable and can be removed from the sparse profile.
Differential Revision: https://phabricator.services.mozilla.com/D47459
In browsertime.zip we should have:
browsertime/
package.json
package-lock.json
node_modules/
.bin/
browsertime -> ../browsertime/bin/browsertime.js
browsertime/
...
The idea is that we'll fetch browsertime.zip in a generic-worker
environment and be able to run Node.js from within the top level
browsertime/ directory.
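For example (the exact entry point is an assumption based on the layout above), a task could invoke the tool via the `.bin` wrapper from the top-level directory:

```python
import subprocess

# Run the browsertime wrapper installed into node_modules/.bin,
# relative to the unpacked top-level browsertime/ directory.
subprocess.check_call(
    ["node_modules/.bin/browsertime", "--version"],
    cwd="browsertime",
)
```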
Differential Revision: https://phabricator.services.mozilla.com/D38773
This was originally from bug 1528374 for Mac PGO, but that isn't able to
land yet, and this should help Windows PGO runs in the meantime.
Differential Revision: https://phabricator.services.mozilla.com/D37807