* `pystache` is unused.
* `funcsigs` is only needed for WPT - use the WPT version instead.
* `moz.build` has a bunch of obsolete/redundant references, clean
them up.
* `mohawk` isn't used directly, but is only depended on via
  `taskcluster`. So, remove it from `requirements.in`.
Differential Revision: https://phabricator.services.mozilla.com/D126732
The previous behaviour was to:
* Never add a `pthfile` to the Mach virtualenv, and
* Always add Mach's paths to the `sys.path` when Mach initializes
However, this meant that `pip` would needlessly install packages
that already exist in the vendored environment.
Tweak `pth` behaviour so that `pip` behaves more efficiently.
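As a rough illustration of the mechanism (paths here are hypothetical): each line of a `.pth` file dropped into a virtualenv's site-packages is appended to `sys.path` at interpreter startup, which is how `pip` can see vendored packages as already available.

```python
import os
import tempfile

# Hypothetical vendored package paths, for illustration only.
vendored_paths = [
    "/src/third_party/python/attrs",
    "/src/third_party/python/six",
]

with tempfile.TemporaryDirectory() as site_packages:
    # Each non-comment line of a .pth file in site-packages is
    # appended to sys.path when the interpreter starts.
    pth_file = os.path.join(site_packages, "mach.pth")
    with open(pth_file, "w") as f:
        f.write("\n".join(vendored_paths) + "\n")

    with open(pth_file) as f:
        assert f.read().splitlines() == vendored_paths
```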
Differential Revision: https://phabricator.services.mozilla.com/D120402
There's an existing algorithm to check if a virtualenv's installed
packages are up-to-date with its requirements. This patch
extracts that logic so that, in cases where we can't automatically
download needed pip packages, we can at least assert that the ones
installed to the system Python are sufficient to meet our requirements.
Currently, the only case in which this system-checking logic is applied
is when starting the Mach virtualenv with the `MOZ_AUTOMATION` or
`MACH_USE_SYSTEM_PYTHON` environment variable set.
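A simplified sketch of such a check (the function name is illustrative, and the real Mach logic also understands specifiers other than `==`):

```python
from importlib import metadata

def system_satisfies(pinned_requirements):
    """Return True if the system Python already has every pinned
    requirement installed at exactly the required version."""
    for name, version in pinned_requirements.items():
        try:
            if metadata.version(name) != version:
                return False
        except metadata.PackageNotFoundError:
            # Package isn't installed at all.
            return False
    return True
```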
Differential Revision: https://phabricator.services.mozilla.com/D122890
`pytest` has platform-specific dependencies (`colorama` on Windows), but
our vendoring logic currently doesn't support having platform-specific
packages.
Since `python-test` jobs don't have to be isolated from the network,
migrate `pytest` to be a `pypi:` dependency.
Differential Revision: https://phabricator.services.mozilla.com/D126285
Both of them are version `2018.4.16`, but our local one is vendored in a
`pip`-compatible way (it includes a `.dist-info` directory).
FWIW, we'll probably need to keep these two versions in-sync (CI
should already be verifying this in the virtualenv-compatibility
tests) because:
* `certifi` is needed by `sentry-sdk`, a Mach-global dependency
* Web platform tests should use the version of `certifi` that
exists in the upstream `certifi` repo. These tests may operate in
a context that includes Mach and its dependencies.
Differential Revision: https://phabricator.services.mozilla.com/D126282
It's only needed for some commands.
Additionally, as we start verifying Python environments in CI, some of
the tasks don't have `zstandard` installed.
Differential Revision: https://phabricator.services.mozilla.com/D126281
There are some trade-offs in play here: the major issue is that we can't
pin `psutil`'s version because it's pre-installed on some CI workers with
a different version (5.4.2).
Additionally, we can't "just" bump that version, because CI workers jump
between different revisions to do work, so a specific pinned version
won't work when we try to update such a package.
One option is that we could avoid validating package versions in CI, but
then that would cause surprises (heck, I didn't know we were still using
`psutil==5.4.2` instead of `5.8.0` until now). By doing validation, we
make it more explicit and avoid accidentally depending on the behaviour
of a too-new version of such a package.
However, in most cases, we manage the installation environment and can
pin dependencies. So, I've made the top-level Mach virtualenv the _only_
one that is able to use requirement specifiers other than "==".
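The rule could be sketched like this (names are illustrative, not the actual Mach implementation):

```python
LOOSE_OPS = (">=", "<=", "~=", "!=", ">", "<")

def specifier_allowed(requirement, virtualenv_name):
    # Only the top-level "mach" virtualenv may use anything other
    # than an exact "==" pin, since CI workers may carry
    # pre-installed versions we don't control.
    if virtualenv_name == "mach":
        return True
    if any(op in requirement for op in LOOSE_OPS):
        return False
    return "==" in requirement
```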
Differential Revision: https://phabricator.services.mozilla.com/D122889
This will allow us to parse and compare pip package versions the same
way that `pip` does.
`pyparsing` was added because it's needed by `packaging`.
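For instance, PEP 440 comparisons via `packaging` are numeric per release segment rather than lexicographic, which plain string comparison gets wrong:

```python
try:
    from packaging.version import Version
except ImportError:
    # Fall back to pip's vendored copy of "packaging".
    from pip._vendor.packaging.version import Version

# Numeric, not lexicographic: "5.10.0" > "5.9.0" even though the
# strings would compare the other way.
assert Version("5.10.0") > Version("5.9.0")
# Leading zeros in release segments are normalized away.
assert Version("2018.04.16") == Version("2018.4.16")
```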
Differential Revision: https://phabricator.services.mozilla.com/D122888
This adds two main compatibility guarantees:
1. Vendored dependencies <=> PyPI-downloaded dependencies
2. Global Mach dependencies <=> command-specific dependencies
As part of this, a new `vendored:` action was added to the virtualenv
definition format. Otherwise similar to `pth:` paths, `vendored:`
packages are assumed to be "pip install"-able.
Some validation (the `.dist-info`/`PKG-INFO` checks) was added to
`requirements.py` to verify that `pth:` and `vendored:` are correctly
used.
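A minimal sketch of that metadata check (the function name is hypothetical): a `vendored:` entry must carry pip-compatible metadata, either a `.dist-info` directory or a `PKG-INFO` file.

```python
import os
import tempfile

def has_pip_metadata(package_dir):
    # A "vendored:" package must look "pip install"-able: it needs a
    # ".dist-info" directory or a "PKG-INFO" file.
    entries = os.listdir(package_dir)
    return "PKG-INFO" in entries or any(
        entry.endswith(".dist-info") for entry in entries
    )

with tempfile.TemporaryDirectory() as pkg:
    os.mkdir(os.path.join(pkg, "attrs-21.2.0.dist-info"))
    ok = has_pip_metadata(pkg)

with tempfile.TemporaryDirectory() as pkg:
    bad = has_pip_metadata(pkg)
```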
Differential Revision: https://phabricator.services.mozilla.com/D122900
Rather than re-implementing it as `search_path()`, use the existing
`MachEnvRequirements` tool to parse `mach_virtualenv_requirements.txt`.
Differential Revision: https://phabricator.services.mozilla.com/D126280
This removes the `@CommandProvider` decorator and the need to implement
mach commands inside subclasses of `MachCommandBase`, and moves all
existing commands out from classes to module level functions.
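The shape of the change can be illustrated with a stand-in registry (this is not mach's actual decorator, just a sketch of the module-level style):

```python
# Stand-in command registry; mach's real @Command differs in detail.
COMMANDS = {}

def Command(name):
    def register(func):
        COMMANDS[name] = func
        return func
    return register

# Previously a method on a MachCommandBase subclass; now a plain
# module-level function taking the command context as an argument.
@Command("uuid")
def uuid_command(command_context=None):
    return "new-uuid"
```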
Differential Revision: https://phabricator.services.mozilla.com/D121512
`Layer::GetDisplayListLog()` also still contains the mention of
LayerScope to dump the display list.
This change does not remove it, though, because it's a part of display
list debugging. If we remove it, I think we should open a new bug for it.
Differential Revision: https://phabricator.services.mozilla.com/D126512
It's currently added manually to CXXFLAGS and bindgen flags, and is
notably missing from HOST_CXXFLAGS. However, setting it at the toolchain
level makes it inherited anywhere it's needed, including host builds and
bindgen.
Differential Revision: https://phabricator.services.mozilla.com/D126153
In practice we already only use SourceSurfaceSharedData as our
rasterized image backing. This means we no longer need to lock the data
to keep it in memory (when we used volatile memory), nor to try to
optimize the surface for the DrawTarget.
Differential Revision: https://phabricator.services.mozilla.com/D124476
The build-clang step in the clang-mingw toolchains just replicates what
we already do for clang, except with no patches applied, which could
arguably be seen as an issue. Instead of rebuilding what we essentially
already have as a result of the clang-12 toolchain, we now use the
clang-12 toolchain directly.
Differential Revision: https://phabricator.services.mozilla.com/D125912
The reason the error mentioned in build-mingw32-nsis.sh happens is that
NSIS builds, by default, in a fully-installed mode, where it hardcodes
the locations of its data files. This is why nsis needs to be used from
the same place it was built for. But there's another mode, enabled with
NSIS_CONFIG_CONST_DATA_PATH=no, that makes it relocatable and makes it
find its data files relative to the nsis binary.
However, there's a bug in the nsis build scripts that causes the nsis
binary to be installed in the destination directory instead of a bin/
subdirectory, while the source code itself looks for data files relative
to the parent directory of the directory that contains the executable.
So we need to set PREFIX_BIN to force the executable to be installed in
a bin/ subdirectory.
There is also an issue in nsis itself when it's executed by anything
other than a shell, which we patch out.
Differential Revision: https://phabricator.services.mozilla.com/D125638
Using the sysroot with GCC requires some unnecessary complication, and
we don't really care if the first stages are not using the sysroot.
Differential Revision: https://phabricator.services.mozilla.com/D125161
Building the sanitizers in older versions of clang emits errors with
newer GCC because of some narrowing conversions. We only really need the
sanitizers in the final stage anyway, so we force-disable them (and
everything else that might enable "sanitizer-common") in early stages.
Building the lli tool (also used in tests) fails as well, and we don't
need it until the final stage (where we need it so we can ship it).
Differential Revision: https://phabricator.services.mozilla.com/D125159
There are complications with building a 1-stage clang with GCC, so just
use clang. Eventually, the clang-tidy toolchains will be removed in
favor of providing clang-tidy from the clang toolchain itself anyway.
Differential Revision: https://phabricator.services.mozilla.com/D125158
Tweak the `VirtualenvManager` API to accept a single base "python"
executable during initialization, then to consistently use it for
up-to-date checking, construction, etc.
This constraint allows future simplification for behaviour involving
the "base" python.
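A simplified sketch of the tweaked API (class and method names here are illustrative): the base interpreter is fixed at construction time and reused everywhere, instead of being supplied per operation.

```python
class VirtualenvManager:
    """Sketch: one base "python" executable, provided up front."""

    def __init__(self, base_python, virtualenv_root):
        self.base_python = base_python
        self.virtualenv_root = virtualenv_root

    def create_command(self):
        # Construction always uses the same base interpreter that
        # up-to-date checks would use.
        return [self.base_python, "-m", "venv", self.virtualenv_root]
```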
Differential Revision: https://phabricator.services.mozilla.com/D124516
Maps each virtualenv one-to-one with its associated requirements
definition.
For example:
```
Name: "docs"
Virtualenv location: <snip>/_virtualenvs/docs
Requirements location: $topsrcdir/build/docs_virtualenv_packages.txt
```
An issue to be resolved in the future is that, when you define a new
virtualenv, it's not obvious that a corresponding requirements
definition needs to exist in `build/`.
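The naming convention above could be sketched as follows (the helper name is hypothetical):

```python
import os

def requirements_definition(topsrcdir, virtualenv_name):
    # One-to-one convention: virtualenv "<name>" maps to
    # "build/<name>_virtualenv_packages.txt" in the source tree.
    return os.path.join(
        topsrcdir, "build", f"{virtualenv_name}_virtualenv_packages.txt"
    )
```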
As part of this change, the default virtualenv ("common") was
split from the build virtualenv ("build").
This required changes to the python-test virtualenv since
python-tests depend on `glean_parser`, but were getting it
implicitly from the "build" virtualenv's requirements.
This addition to the `python-test` virtualenv is temporary and
will be removed when bug 1724273 is resolved.
Differential Revision: https://phabricator.services.mozilla.com/D122891
Thunderbird can't currently define its own Mach commands, which means
that it doesn't define its own virtualenvs.
However, it does add some tweaks to the `common` virtualenv requirements
(for build scripts, I believe).
So, we should allow it to continue making those tweaks, but we should
remove the capability of specifying PyPI packages. This is needed for
lockfile-generation, since lockfiles are created and placed in the
Mozilla repo, and we don't want "conditional lockfiles" based on the
existence of a "comm" repo.
This will not regress existing Thunderbird expectations.
Differential Revision: https://phabricator.services.mozilla.com/D124515