These files were omitted from the original patch because reformatting them required some manual intervention in order to avoid breaking unit tests. Generally the `noqa` lines were already there and just needed to be moved from one line to another (due to the reformatting by `black`), but sometimes `black` saw fit to move a bunch of stuff all onto one line, requiring me to introduce new `noqa` lines.
Besides the autoformat by `black` and some manual fixups, this patch contains no other changes.
# ignore-this-changeset
Differential Revision: https://phabricator.services.mozilla.com/D94052
Allow-list all Python code in tree for use with the black linter, and re-format all code in-tree accordingly.
To produce this patch I did all of the following:
1. Make changes to tools/lint/black.yml to remove include: stanza and update list of source extensions.
2. Run ./mach lint --linter black --fix
3. Make some ad-hoc manual updates to python/mozbuild/mozbuild/test/configure/test_configure.py -- it has some hard-coded line numbers that the reformat breaks.
4. Make some ad-hoc manual updates to `testing/marionette/client/setup.py`, `testing/marionette/harness/setup.py`, and `testing/firefox-ui/harness/setup.py`, which have hard-coded regexes that break after the reformat.
5. Add a set of exclusions to black.yml. These will be deleted in a follow-up bug (1672023).
# ignore-this-changeset
Differential Revision: https://phabricator.services.mozilla.com/D94045
process_install_manifest now only prints the message (not the stack
trace) of ErrorMessage exceptions, and the error config has been updated
to consider such messages to have the "ERROR" severity.
Differential Revision: https://phabricator.services.mozilla.com/D93854
We used to have a complicated scheme to figure out the minimum supported
version of clang on OSX, based on some compiler feature, which wouldn't
allow us to do other version checks further down the line.
The main blocker for better tests was to be able to distinguish between
Xcode clang and plain clang, which turns out to be possible with the
__apple_build_version__ define.
We still need to map versions manually, but it's better than the status quo.
Differential Revision: https://phabricator.services.mozilla.com/D94261
These files were omitted from the original patch because reformatting them required some manual intervention in order to avoid breaking unit tests. Generally the `noqa` lines were already there and just needed to be moved from one line to another (due to the reformatting by `black`), but sometimes `black` saw fit to move a bunch of stuff all onto one line, requiring me to introduce new `noqa` lines.
Besides the autoformat by `black` and some manual fixups, this patch contains no other changes.
# ignore-this-changeset
Differential Revision: https://phabricator.services.mozilla.com/D94052
Allow-list all Python code in tree for use with the black linter, and re-format all code in-tree accordingly.
To produce this patch I did all of the following:
1. Make changes to tools/lint/black.yml to remove include: stanza and update list of source extensions.
2. Run ./mach lint --linter black --fix
3. Make some ad-hoc manual updates to python/mozbuild/mozbuild/test/configure/test_configure.py -- it has some hard-coded line numbers that the reformat breaks.
4. Add a set of exclusions to black.yml. These will be deleted in a follow-up bug (1672023).
# ignore-this-changeset
Differential Revision: https://phabricator.services.mozilla.com/D94045
`mach configure` currently runs the equivalent of `make -f client.mk`.
This is historical, and essentially does the following:
- Create `configure` and `js/src/configure` from `configure.in` and
`js/src/configure.in` respectively.
- Create the objdir.
- Run `configure` from the objdir.
The `configure` script is, nowadays, only really used as a means to set
OLD_CONFIGURE (and also for people who want to run `configure`,
literally, as in the `configure; make` workflow). `mach configure`
actually doesn't need it. Neither does recursing into `js/src` require
`js/src/configure`, since bug 1520340 (and now as of bug 1669633, we
don't even recurse).
Because configure.py can actually derive OLD_CONFIGURE on its own
(except for `js/src/configure`, but `mach configure` doesn't run that),
we don't really need `configure` for `mach configure`.
So all in all, we're at a point in history where it's straightforward to
just initiate configure.py from mach configure, so we just do that.
And in the hypothetical case where the `mach configure` code is somehow
running in python2, we get the mach virtualenv python3 and use it to
execute `configure.py`.
Differential Revision: https://phabricator.services.mozilla.com/D93741
Rustc >= 1.44 changed the file names of the static libraries it
produces with -windows-gnu targets, to match that of mingw clang/gcc.
Considering we still build on 1.43, the best fix would be to derive the
prefix/suffix based on the version of rust, but that actually turns into
a hard-to-solve problem because the configure tests for bindgen also
depend on the prefix/suffix value being known.
On the other hand, we're soon due for an update to 1.47, so the simpler
solution is to just push mingw builds to require 1.44 (settling for the
smallest upgrade possible for now) and to remove the split between C and
rust library prefix/suffixes.
Differential Revision: https://phabricator.services.mozilla.com/D93726
`mach configure` currently runs the equivalent of `make -f client.mk`.
This is historical, and essentially does the following:
- Create `configure` and `js/src/configure` from `configure.in` and
`js/src/configure.in` respectively.
- Create the objdir.
- Run `configure` from the objdir.
The `configure` script is, nowadays, only really used as a means to set
OLD_CONFIGURE (and also for people who want to run `configure`,
literally, as in the `configure; make` workflow). `mach configure`
actually doesn't need it. Neither does recursing into `js/src` require
`js/src/configure`, since bug 1520340 (and now as of bug 1669633, we
don't even recurse).
Because configure.py can actually derive OLD_CONFIGURE on its own
(except for `js/src/configure`, but `mach configure` doesn't run that),
we don't really need `configure` for `mach configure`.
So all in all, we're at a point in history where it's straightforward to
just initiate configure.py from mach configure, so we just do that.
And in the hypothetical case where the `mach configure` code is somehow
running in python2, we get the mach virtualenv python3 and use it to
execute `configure.py`.
Differential Revision: https://phabricator.services.mozilla.com/D93741
Due to the MacOS `__PYVENV_LAUNCHER__` environment variable, some
virtualenv operations were being run with the system python (and
packages), rather than the python and packages within the venv.
This was already partially solved by having `__PYVENV_LAUNCHER__`
get unset when some virtualenv operations were run.
This change makes this more consistent by unsetting the environment
variable once a `VirtualenvManager` is created.
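As a rough illustration (not the actual `VirtualenvManager` code), the fix amounts to dropping the variable from the process environment up front:
```
import os

def _unset_pyvenv_launcher():
    # On macOS, __PYVENV_LAUNCHER__ makes child pythons resolve back to the
    # system interpreter (and its site-packages) instead of the venv's own.
    os.environ.pop("__PYVENV_LAUNCHER__", None)
```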
Differential Revision: https://phabricator.services.mozilla.com/D93615
The `clobber` targets are superseded by `mach clobber`, so we don't need them for any reason. The `clean` target is meant to get you to a post-`configure` state, but it doesn't really work, and if it's necessary for you to be in that state for some reason you can just clobber and re-`configure`, so it doesn't seem worth it to get it working again. Instead, delete all of them. Also delete `everything` which is not useful when `clobber` doesn't exist.
Differential Revision: https://phabricator.services.mozilla.com/D93514
I accidentally broke the 'mach' unittests on Python 2 due to some difference in the unittest
module. Rather than poking into 'unittest', this patch moves us closer to the pytest format
while also fixing the issue.
Differential Revision: https://phabricator.services.mozilla.com/D93420
I try to increase the probability of this error message getting read and heeded by enumerating the list of changed files from the latest `central`, and printing them out.
Also, I think bug 1636797 renders the advice to run `mach clobber python` unnecessary, so I delete that suggestion.
Differential Revision: https://phabricator.services.mozilla.com/D93422
Some recent change apparently made the multiprocessing code reenter
python with the arguments `-s -c "..."` instead of `-c "..."`, which
broke the assumption of the hack.
Differential Revision: https://phabricator.services.mozilla.com/D93060
People keep using `gecko-dev` and trying to run artifact builds, although this is very much unsupported, and the existing error message is useless if you're not knowledgeable about how these systems work. Since this is the most common case where people come in with questions about artifact builds not working, try to detect this case and print a helpful error message.
Differential Revision: https://phabricator.services.mozilla.com/D92492
Bug 1666244 added this. `sccache-dist` is not the recommended build configuration for arbitrary build scenarios, is not actively supported, and is only relevant for a subset of those building Firefox. Instead, point to more relevant general documentation.
Differential Revision: https://phabricator.services.mozilla.com/D92973
`imp` is deprecated since Python 3.4 and later Python versions are very noisy about printing `DeprecationWarning`s; instead, use its replacement, `importlib`.
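As an illustrative sketch (which specific `imp` calls the patch replaced is an assumption here), loading a module from a file path moves from `imp.load_source` to `importlib.util`:
```
import importlib.util

def load_module_from_path(name, path):
    # importlib-based replacement for the deprecated imp.load_source(name, path).
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```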
Differential Revision: https://phabricator.services.mozilla.com/D92632
The existing implementation of `@imports()` in the `configure` sandbox doesn't translate an import of the form `@imports('distutils.sysconfig')` into an `import distutils.sysconfig` statement; instead, it transforms the input `@imports()` request a few times in such a way that we eventually just do `import distutils`, and expect that `distutils.sysconfig` will be populated that way. This would be fine, except that this isn't the way that Python's `import` system works:
```
>>> import distutils
>>> distutils.sysconfig
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: module 'distutils' has no attribute 'sysconfig'
>>> import distutils.sysconfig
>>> distutils.sysconfig
<module 'distutils.sysconfig' from '/usr/lib/python3.8/distutils/sysconfig.py'>
```
i.e., we can't just import a parent module and expect that we can indirectly access all child packages of that module without importing them specifically.
So instead, we simplify the current model somewhat by not transforming the `@imports()` request at all and instead just performing the exact `import` that the user requested. This resolves the `distutils.sysconfig` issue as well as hopefully preventing any other similar issues from popping up in the future.
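For illustration only (the sandbox's own wrapping machinery is more involved than this), `importlib.import_module` has the semantics we want: importing the full dotted name loads the submodule and makes it reachable from the parent package:
```
import importlib

# Importing the full dotted name actually loads distutils.sysconfig, unlike a
# bare `import distutils`.
sysconfig = importlib.import_module("distutils.sysconfig")
print(sysconfig.get_python_lib())
```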
While I'm here, I also refactored some stuff so that the way that we're patching in wrapped modules for the sandbox is more structured.
Differential Revision: https://phabricator.services.mozilla.com/D90627
Before, this would be written to `sitecustomize.py` irrespective of the value of `populate_local_paths`. This doesn't make sense -- since the local paths aren't included in the `virtualenv`'s `PYTHONPATH` when Python starts up, it doesn't know how to `import mach_bootstrap`. Since on `mach` startup the import hook will be loaded anyway, and the `virtualenv`s in `~/.mozbuild` (i.e. the only `virtualenv`s for which we don't `populate_local_paths`) are just used to run `mach`, this is fine and won't regress anything.
Also, since the `import` hook is only necessary for Python 2, add a couple conditional checks to get rid of the added overhead when we're running with Python 3.
This was never noticed because importing `sitecustomize` is allowed to throw an `ImportError`, a failure which is silently ignored. This may be fixed in the latest version of `virtualenv`.
Differential Revision: https://phabricator.services.mozilla.com/D92290
This document was imported from MDN and contained very outdated/incorrect information, and much of the information here is duplicated from the existing `mach` documentation. For the little content that isn't already expressed in the existing documentation in a better way, merge it into `python/mach/docs`.
The unique content is mainly in the FAQ, so I added a new page for that.
Differential Revision: https://phabricator.services.mozilla.com/D91455
This patch changes how/when we decide to (re-)install browsertime. We do this by checking the package.json entries to see if they match or not.
Differential Revision: https://phabricator.services.mozilla.com/D91329
Also this patch adds tests for `ClangdBackend` and `StaticAnalysisBackend` since
both of them are derived from `CompileDB`.
Differential Revision: https://phabricator.services.mozilla.com/D91768
1. Provide a new backend dedicated to C++ static code analysis.
2. Build a list of directories from non-unified-compat that have been fixed and
permit compiling C++ files outside of the unified environment. With this list
we eliminate the unified sources and instead use the original source for the command
attribute in compile_commands.json.
In this way, if a regression appears, clang-tidy will report it, since it no longer uses
the unified environment for files that can be compiled standalone.
3. Remove the coverity functionality that was reading and using non-unified build files,
since in practice it proved to be sub-optimal.
Differential Revision: https://phabricator.services.mozilla.com/D91011
`bootstrap` does not/cannot "fix" a broken Python environment, but we can do some checks and make sensible suggestions about how to fix the problem.
We expect that both of the checks we do here are primarily going to be triggered on new installs of Ubuntu. For macOS, installing `python` with `brew` will also install `pip`, and Windows has MozillaBuild. It's just Ubuntu that goes out of its way to be confusing, so add particularly informative error messages for Debian/Ubuntu.
Note that this doesn't change the functionality of `bootstrap` in any fundamental way -- `mach` is already broken if either of these conditions doesn't hold, but now we show an informative error message.
Differential Revision: https://phabricator.services.mozilla.com/D91470
If the "ui.username" config option doesn't have a value in the expected
format, assume that the user email address is unknown instead of
throwing an exception.
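A minimal sketch of the intended behavior (the function name is illustrative, not the actual code): extract the address from the conventional `Full Name <email>` form and fall back to `None` rather than raising:
```
import re

def email_from_username(ui_username):
    # ui.username is conventionally "Full Name <email@example.com>".
    match = re.search(r"<([^>]+)>", ui_username or "")
    return match.group(1) if match else None
```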
Differential Revision: https://phabricator.services.mozilla.com/D91626
In order to support building with relative paths to third-party code we need
to normalize the non-unified source paths prior to comparing them to incoming
source paths during moz.build file generation.
Depends on D91319
Differential Revision: https://phabricator.services.mozilla.com/D91320
Some hooks modules use global variables to share state between hook calls.
This patch makes sure we reuse the same module instance once it's loaded by a
Hook class.
Differential Revision: https://phabricator.services.mozilla.com/D91096
The supported way of submitting Firefox patches is via moz-phab, so
install it automatically in bootstrap rather than redirecting to docs.
Depends on D90455
Differential Revision: https://phabricator.services.mozilla.com/D90459
Windows handles files in a case-insensitive way. So, when checking if a
file matches a certain extension, that check needs to be
case-insensitive as well.
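A minimal sketch of such a check (names and extensions here are illustrative):
```
def matches_extension(filename, extensions=(".cpp", ".h")):
    # Lower-case both sides so "Foo.CPP" matches ".cpp" on Windows.
    return filename.lower().endswith(tuple(ext.lower() for ext in extensions))
```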
Differential Revision: https://phabricator.services.mozilla.com/D91057
In the build process, there are two ways that java is used:
* From the path
* From the java-bin-path specified in the mozconfig
Before, to assert that both "java" usages would be consistent, the
implementation assumed that there was only a single "java" binary
per-JDK-version, and all duplicate "binaries" were symlinks to the
original.
However, the JDK on Fedora ships two identical full binaries: one in
$JDK/bin, and one in $JDK/jre/bin. The symlink theory was incorrect.
So instead, we can assert that both "java" usages are consistent
by checking their versions and asserting that they are equivalent.
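A rough sketch of that version comparison, assuming Python 3.7+ and illustrative helper names (not the actual bootstrap code):
```
import subprocess

def java_version(java_path):
    # `java -version` prints its banner to stderr.
    result = subprocess.run(
        [java_path, "-version"], capture_output=True, universal_newlines=True
    )
    return result.stderr.strip()

def assert_consistent_java(path_java, mozconfig_java):
    if java_version(path_java) != java_version(mozconfig_java):
        raise Exception("java on PATH and java-bin-path report different versions")
```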
Differential Revision: https://phabricator.services.mozilla.com/D90918
In the build process, there are two ways that java is used:
* From the path
* From the java-bin-path specified in the mozconfig
Before, to assert that both "java" usages would be consistent, the
implementation assumed that there was only a single "java" binary
per-JDK-version, and all duplicate "binaries" were symlinks to the
original.
However, the JDK on Fedora ships two identical full binaries: one in
$JDK/bin, and one in $JDK/jre/bin. The symlink theory was incorrect.
So instead, we can assert that both "java" usages are consistent
by checking their versions and asserting that they are equivalent.
Differential Revision: https://phabricator.services.mozilla.com/D90774
I randomly noticed in a log file that we don't apply the flags to wasm compilations. I don't have a super strong motivation for this, but eh, we might as well.
Differential Revision: https://phabricator.services.mozilla.com/D90741
If you set a `DEFAULT` `requirements` file in a Python test manifest, the current implementation of `python-test` will try (and usually fail) to install the file once for every test file in parallel. Instead, initializing the environment should be done once when the command starts up.
Differential Revision: https://phabricator.services.mozilla.com/D90475
The manifest file hasn't actually done anything since XPT definitions were
moved to the libxul binary, and now just generates warnings in local builds.
Differential Revision: https://phabricator.services.mozilla.com/D89197
I think they're remnants from the past that we don't really need anymore.
And they're making things more complicated for some pending work of mine.
Differential Revision: https://phabricator.services.mozilla.com/D89687
Allows mach commands to define their own glean metrics with the `metrics_path` @CommandProvider parameter.
When `metrics_path` is defined:
* A `metrics` kwarg is provided to the decorated class. This `metrics` handle is a Glean instance, so Glean documentation should be consulted for usage information.
* When `mach doc telemetry` is run, metrics docs will be generated from all the registered metrics files.
Note: there was some consideration between making `metrics_path` a @CommandProvider or @Command parameter.
In the end, @CommandProvider seemed like a better fit because:
* Metrics seem to be more associated with the entire class than a specific command/method. This is because a class represents a "domain", and that domain may have different commands that have overlapping metrics. Accordingly, all the metrics should be defined once as available to the entire class.
* Currently, @Command methods only take parameters that map one-to-one with CLI arguments. It could seem inconsistent to have one exception: the metrics handle.
Differential Revision: https://phabricator.services.mozilla.com/D85953
In the patch for bug 1656993, the case in which
get_command was being set was removed.
Accordingly, its usage in CommandAction will always evaluate to
`False`, and it can be deleted.
Differential Revision: https://phabricator.services.mozilla.com/D90198
When multiple java versions are in use, some OSes have a mechanism to
change the "default"/system Java.
This can cause unexpected build failures if the system Java changes.
So, in bootstrap, if the Java version found is valid, then its path
should be encoded into mozconfig.
Differential Revision: https://phabricator.services.mozilla.com/D90163
This was originally meant to allow `virtualenv`s to use packages from a parent Python environment without having to re-install them. This turned out to not pan out as we would have liked, so we're going another way to solve the same problem. Bug 1660351 walked back a bunch of this logic; this patch deletes the rest of it.
Differential Revision: https://phabricator.services.mozilla.com/D89492
I think they're remnants from the past that we don't really need anymore.
And they're making things more complicated for some pending work of mine.
Differential Revision: https://phabricator.services.mozilla.com/D89687
The idea here was that we keep the NDKs downloaded in this directory as a "cache", such that if the download gets interrupted, then we can resume the download from an earlier point. This logic was walked back by bug 1637379, which deletes the cache.
In the spirit of the original version of the code, remove the `os.rmdir()` that was introduced by that bug.
An alternative fix for this is to download the NDK to a temporary directory and just clean that up entirely after every call to `mach bootstrap`, but then we would be forgoing the build cache behavior, which I'm not sure makes sense at this point in time.
Differential Revision: https://phabricator.services.mozilla.com/D90125
VSCode doesn't install its command line integration toolset by default on macOS,
so we don't have a link at `/usr/local/bin/code`; instead we must use the binary from the
`Applications` folder.
Also extend this to other platforms, even though it's improbable that we are going to use it
there.
Differential Revision: https://phabricator.services.mozilla.com/D90088
In bootstrap, pacman should ensure its database is up-to-date.
Otherwise, when installing packages, pacman will slowly query every
mirror when attempting to download an out-of-date package.
Differential Revision: https://phabricator.services.mozilla.com/D89958
In addition to the existing build telemetry, also gather the stats and
report with Glean. This new telemetry is reported in tandem with the existing
telemetry to allow testing and confidence before a full roll-out.
Additionally, Glean isn't compatible with Python 2, so the new telemetry only runs
on Python 3 mach commands.
Differential Revision: https://phabricator.services.mozilla.com/D83572
This was originally meant to allow `virtualenv`s to use packages from a parent Python environment without having to re-install them. This turned out to not pan out as we would have liked, so we're going another way to solve the same problem. Bug 1660351 walked back a bunch of this logic; this patch deletes the rest of it.
Differential Revision: https://phabricator.services.mozilla.com/D89492
This commit does the following.
- Renames `slashslash` as `dumbComments`. As a result, it now comes before
`emptyLines` in alphabetical ordering, which means that if you apply both
`dumbComments` and `emptyLines`, lines that contain only comments will be
fully removed.
(I contemplated changing the filter ordering to match the order specified,
rather than using alphabetical ordering, but that was more invasive and not
obviously better.)
- Changes `dumbComments` so it only applies if the comment is at the start of
the line (with optional leading whitespace). This is so it can be used with
prefs files, which contain lines like `pref("foo", "https://mozilla.org");`
where the `//` must not be treated as a comment.
Note that `slashslash` wasn't being used anywhere.
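A minimal sketch of the new `dumbComments` behavior described above (not the actual filter implementation):
```
import re

def dumb_comments(lines):
    # Blank out lines that are entirely a // comment (optionally indented); a
    # // inside a value, e.g. pref("foo", "https://mozilla.org");, is kept.
    # Combined with an emptyLines-style filter, such lines disappear entirely.
    return [re.sub(r"^\s*//.*$", "", line) for line in lines]
```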
Depends on D88240
Differential Revision: https://phabricator.services.mozilla.com/D88242
It's not used, probably because it's pretty strange and hard to imagine using
safely. (Stripping leading and trailing space could be useful, but collapsing
sequences of spaces? Hmm.)
Differential Revision: https://phabricator.services.mozilla.com/D88240
The llvm-strip from clang-11 complains about this file. This file doesn't really interest us anyway -- it's imported from elsewhere -- so just avoid it.
Differential Revision: https://phabricator.services.mozilla.com/D89491
`bootstrap` is written in such a way that we don't necessarily assume the existence of either the global state directory (`~/.mozbuild`) or a local checkout of `mozilla-central`. The independence from `~/.mozbuild` is a design decision that may have been appropriate at some point, but today it's arguably useless to continue executing `bootstrap` without a global state directory (we install a bunch of build dependencies there, as well as the `mach` `virtualenv`s that are needed for running non-`bootstrap` `mach` commands after bug 1656993). The independence from a local checkout is an artifact of the old design of `python/mozboot/bin/bootstrap.py`, which is no longer necessary as of bug 1647792.
With that in mind, 1) throw errors if we can't create the global state directory or find the topsrcdir, and 2) remove all existing conditionals of the form "if the global state directory exists" or "if the root of the checkout exists" since these conditions are always going to be true in practice.
Differential Revision: https://phabricator.services.mozilla.com/D89220
This will be found in a couple places, so we might as well make a helper function. For symmetry put it in the same file where we keep the helper function to locate the `state_dir`.
Differential Revision: https://phabricator.services.mozilla.com/D89156
Bug 1659539 caused the unit tests of this class to suddenly start running on Linux; it failed with a type error that suggests this test has never really properly worked, at least not with the version of Mercurial we're using in CI (`unsupported changeid '0' of type <type 'unicode'>`). The class itself isn't used anywhere besides these tests, so just delete the entire class.
Differential Revision: https://phabricator.services.mozilla.com/D89205
The manifest file hasn't actually done anything since XPT definitions were
moved to the libxul binary, and now just generates warnings in local builds.
Differential Revision: https://phabricator.services.mozilla.com/D89197
`mach artifact` has a dependency on `zstandard`, which is installed in the `mach` `virtualenv`s, so we have to run `mach artifact` with the correct `virtualenv`. Also create the `virtualenv`s earlier in the process to account for this.
This all has a dependency on the existence of a checkout (which has the `mach` script with all its dependencies on everything else), but after bug 1647792 that's not a concern.
Differential Revision: https://phabricator.services.mozilla.com/D87920
The ability to get the path to the Python executable from a given `virtualenv` location is generally useful outside the context of all the extra stuff a `VirtualenvManager` provides, so refactor it out into a lighter-weight helper class.
Differential Revision: https://phabricator.services.mozilla.com/D89175
Pipenv is heavyweight and overkill for the purposes for which it is being used. We'd like to remove it from the tree and |mach python-test| was one of the last remaining use cases.
Remove the `--python` command-line argument as a result. Users who wish to run unit tests with Python 2 can do `MACH_PY2=1 ./mach python-test ...` or `python2 ./mach python-test ...`.
Also update a few unit tests that would break otherwise in the presence of this change.
There were a couple of lines in the `setup.py` for `mozlog` that were problematic for tests and were resulting in errors due to the `mozlog` plugin being loaded by `pytest` more than once. We just delete those lines and bump up the major version number of the package to fix it.
Differential Revision: https://phabricator.services.mozilla.com/D88296
Otherwise `gyp` can choose incorrectly when trying to figure out which Python to use to run its internal scripts, which can manifest as test failures in certain circumstances.
Differential Revision: https://phabricator.services.mozilla.com/D89165
This consolidates the `have_clone` logic in one place unconditionally. After bug 1647792 we're deprecating the use case where `bootstrap` is run without a clone, so that's not a problem.
In reality the whole `have_clone` thing isn't necessary any more (`have_clone` is always going to be `True` in practice), but I'll save that for a bigger refactoring.
Differential Revision: https://phabricator.services.mozilla.com/D89152
Note that when I refer to "standalone `bootstrap.py`" here, I'm referring to the file `python/mozboot/bin/bootstrap.py` and no other similarly-named file in-tree.
The current design, where standalone `bootstrap.py` downloads a small portion of the `mozilla-central` repo and then works through all the `bootstrap` logic, performing a clone in the middle of the `bootstrap` process, has some deficiencies, namely:
1. Refactoring code that is shared between the `bootstrap` logic and the mainline `mach` logic is, if not impossible, more difficult than it needs to be, since standalone `bootstrap.py` needs to download a set of files that bootstraps a build environment perfectly with no other dependencies in `mozilla-central`. As a result we have some [duplicated or redundant code](https://searchfox.org/mozilla-central/rev/c6676771df58c6e0098574bc6b11517acbf264cf/python/mozboot/mozboot/base.py#349) and some stuff that has been [refactored into the `mozboot` directory](https://searchfox.org/mozilla-central/source/python/mozboot/mozboot/util.py) irrespective of whether it actually makes sense to go there (see bug 1649850).
2. Since `mach bootstrap` has access to the entire `mozilla-central` environment, but the (much less frequently exercised) standalone `bootstrap.py` script does *not*, this can lead people to write patches that work fine in `mach bootstrap` but which regress standalone `bootstrap.py`. Furthermore, testing `bootstrap` patches with standalone `bootstrap.py` is difficult. So this is a not infrequent source of regressions; bugs 1652736 and 1643158 are recent examples. Furthermore, typically these regressions are "fixed" by adding more code duplication or by replacing battle-tested frequently-used libraries (either `m-c`-internal or third-party) with less robust bespoke code.
3. Issue (2) is avoidable if people are sufficiently critical during code review, but at *best*, this is a weird extra level of mental overhead that we need to keep in mind only for `bootstrap` patches.
This patch preserves back-compatibility and the validity of existing documentation by factoring out all the logic to clone `mozilla-central` into standalone `bootstrap.py` directly, and cloning *before* calling into `mach bootstrap` directly. This gives us only one official entry point into the bootstrapper, namely, `mach bootstrap`.
There are a couple concrete implications of this change:
1. Now, the clone happens before any other interesting work happens, so people may have to wait ~an hour before actually beginning to engage with the `bootstrap` wizard. While it's arguably slightly less convenient, I'm not sure it matters enough that we should block this patch or a similar one on it.
2. The `hg`/`git` configuration step now happens *after* the clone rather than before it. Looking at what the `hg` and `git` configuration wizards actually do today, I don't think this matters either (all of the configurations can easily happen after cloning the repo).
Another note: `bootstrap` installs `git-cinnabar` to the `.mozbuild` state directory. We could have duplicated all of that logic over to standalone `bootstrap.py`, but it's non-trivial and I didn't think that made any sense, so instead we have standalone `bootstrap.py` download a temporary version and use it for the initial clone if necessary.
Differential Revision: https://phabricator.services.mozilla.com/D85058
This brings it up-to-date with the minimum allowable version in `version-control-tools`; these two versions should be equal to each other, as the [comment in the source code](4b0de666d1/hgext/configwizard/__init__.py (l59)) suggests. Also add a similar comment in this file to reduce the likelihood of a mismatch going forward.
Differential Revision: https://phabricator.services.mozilla.com/D89022
This logic is meant to expose packages from a globally-installed Python to be used by the in-`objdir` `virtualenv`s, so for example we don't have to figure out how to install `zstandard` (or other Python packages with native code that may or may not have prebuilt wheels for any given platform) in those `virtualenv`s. Bug 1660351 augmented that logic to work within the requirements of bug 1660353. This mostly worked, but it causes builds to unconditionally break on Arch Linux, caused a couple of test failures, and in general introduces other weird behaviors downstream; issues with the resultant `PYTHONPATH`s are hard to diagnose and fix.
In the long-term we'll have to permanently solve the `zstandard` problem and pave the way for other Python packages with native code as well, but that's not an urgent need.
The ultimate goal is to completely remove `inherit-from-parent-environment`, but we can't do that until bug 1659539 is solved.
Partially reverts bug 1660351. Entirely reverts bug 1660353, restoring that file to the state it was in before that patch.
Differential Revision: https://phabricator.services.mozilla.com/D89001
This patch adds two new options to mozperftest. The --simplify-names argument can be provided to enable simplification of metric names and the --simplify-exclude option allows users to select which metrics to skip in the simplification. A bug relating to setting the default of a list option is also fixed here.
Differential Revision: https://phabricator.services.mozilla.com/D88917
This fixes a local failure in mozperftest by mocking the find_node_executable function in the visual-metrics tests. It is not needed for those tests.
Depends on D88915
Differential Revision: https://phabricator.services.mozilla.com/D88916
Improves glean performance.
Prior to this change, using Glean adds ~500ms to each `mach` run.
After this change, using Glean adds ~200ms to each `mach` run.
Differential Revision: https://phabricator.services.mozilla.com/D88691
This was always a temporary hack because getting `zstandard` installed into the `objdir` `virtualenv`s was impossible. With the changes made in bug 1656993, this is possible now, so we can remove all this.
Differential Revision: https://phabricator.services.mozilla.com/D87809
The intended behavior of `inherit-from-parent-environment` is that the packages from the parent Python environment are available to the sub-`virtualenv`. The implementation of that behavior thus far has been around "site directories", the idea being that custom (non-stdlib) packages are likely to be installed in the "site directory". The limitation of this approach is that there's no one location, in practice, where packages are installed, and it's hard to enumerate a static list of all those possible locations across all platforms.
This patch circumvents the issue by ignoring the "site directory" question entirely and just looking at `sys.path`. If we're inheriting from the parent environment when creating a `virtualenv`, we just ask the parent Python what its `sys.path` is and configure the `virtualenv`'s `sys.path` on startup.
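A minimal sketch of that approach, with illustrative names (the real wiring also has to write the result into the `virtualenv`'s startup configuration):
```
import json
import subprocess

def parent_sys_path(parent_python):
    # Ask the parent interpreter what its sys.path is, rather than guessing at
    # platform-specific site directories.
    out = subprocess.check_output(
        [parent_python, "-c", "import json, sys; print(json.dumps(sys.path))"],
        universal_newlines=True,
    )
    return [p for p in json.loads(out) if p]
```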
Differential Revision: https://phabricator.services.mozilla.com/D87808
We `normpath()` the `_root` path when we save it, but the input `path` to `get()` is not necessarily also normalized. Normalizing it prevents unnecessary test failures.
Differential Revision: https://phabricator.services.mozilla.com/D88635
There's a usage of `mozpath.join(...)` found while investigating
wildcard-management. Since the return value isn't used here, and the
parameters here aren't mutated, this function call doesn't do anything.
Differential Revision: https://phabricator.services.mozilla.com/D88508
Bug 1645986 solved the problem for most generated files by moving their
rules to the top-level, but we're going to add rules that will end up in
subdirectories, so we have to solve the same problem again, in the
subdirectories.
Differential Revision: https://phabricator.services.mozilla.com/D88389
A single task is created to do all partner attributions. The partner_attribution transform processes the configuration into an environment variable for the tools/attribution/attribute.py script to use. This is quite verbose so a large number of configurations may cause problems.
Applies the same priority modification to attribution tasks as to partner repacks, to not impede the main part of the graph.
Differential Revision: https://phabricator.services.mozilla.com/D87729
There are various problems happening when dealing with the output from
setup.py during virtualenv setup, all of which stem from the process
command output not being a unicode string in python.
As this code is still used to set up the python2 virtualenv, we need to use
the backwards-compatible universal_newlines=True trick.
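For reference, a sketch of the trick (the setup.py invocation shown is just a placeholder):
```
import subprocess

# universal_newlines=True decodes the output to str on Python 3; unlike
# text=True (3.7+), the keyword also exists on Python 2, so the same call
# works when this code sets up the python2 virtualenv.
output = subprocess.check_output(
    ["python", "setup.py", "--version"], universal_newlines=True
)
```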
Differential Revision: https://phabricator.services.mozilla.com/D88372
This, hopefully, begins to address an ongoing global problem where we have few, if any, insights into the performance of individual build tasks (compilations, calls into Python scripts, etc.) At most we have aggregated statistics about how long tiers last, combined with `sccache` aggregates across the entire build (which don't cover non-compilation tasks). This has a few implications:
1. It's impossible to identify bottlenecks, except by going out of your way to notice and reproduce them. e.g. no one, to my knowledge, was aware that `make_dafsa.py` was a bottleneck until someone happened to notice and report it in bug 1629337. We could have systems that automatically detect this sort of thing, or at least that make it easier to do so than by CTRL-C'ing in the middle of the build several times to try to reproduce the problem.
2. It's impossible to detect regressions, unless the regression is so pronounced and severe that it has an immediate impact on the overall build time and triggers build time alerts.
3. It's impossible to identify that you have *fixed* regressions, except by doing ad-hoc timing measurements by building individual `make` targets. This is error-prone and annoying.
Here we propose a low-friction system wherein individual build tasks log their own build perf info. For now, that's a write to `stdout` consisting of the string `BUILDTASK ` followed by a simple JSON object with a start time, end time, the `argv` of the task, and an additional `"context"` key (I anticipate this could be used to annotate the task with relevant per-task data for later aggregation, for example: was this an `sccache` cache hit or not? For now, it's empty everywhere). The build controller then collects this data, validates it, and writes out the entire list of build tasks as a JSON file after the build has completed, similarly to what we already do with `build_resources.json`. We already parse some `make` output to do stuff like tracking when we switch tiers, so this isn't a huge architectural shift or anything.
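To make the format concrete, here is a hedged sketch of a consumer of those lines (the helper name and validation details are illustrative, not the controller's actual code):
```
import json

def parse_buildtask_lines(make_output_lines):
    # Collect the per-task perf records that build tasks print to stdout,
    # skipping lines that were corrupted by interleaved concurrent writes.
    tasks = []
    for line in make_output_lines:
        if not line.startswith("BUILDTASK "):
            continue
        try:
            tasks.append(json.loads(line[len("BUILDTASK "):]))
        except ValueError:
            pass
    return tasks
```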
In my opinion this "should" happen at the build system, or `make`, level, but `make` doesn't expose anything resembling this information to my knowledge, so this has to be implemented outside of `make`. One could implement something like this at the `sccache` level but that doesn't touch anything but C/C++/Rust compilation tasks; an ideal solution would support other generic build tasks. We could also fork `make` to add this feature ourselves, but for several reasons I don't think that's tractable. :)
Of course, this approach has downsides:
1. We depend on parsing the `stdout` of `make`, and processes can unfortunately sometimes trample on each other, leading to data loss for individual build tasks occasionally. This is a necessary limitation of the model to my knowledge, and I don't know that it can be fixed generally. In my testing, not much data tends to be lost usually.
2. Dumping arbitrary data to `stdout` isn't always possible or desirable. If you're not careful about it this can also result in noisier-than-necessary tasks, especially when those tasks are not invoked by a parent process that knows how to handle the special `BUILDTASK` lines.
3. This data is raw enough where aggregation is not completely trivial.
4. This functionality has to be added for any new kind of build task whose performance we'd like to track; it doesn't come "for free" due to not being able to be implemented at the build system level.
5. The data isn't awfully small due to the `argv`'s (at this point, not nearly big enough where we need to be concerned about it IMO, but maybe that will change in the future?)
One can imagine a couple other architectures that could avoid the first two problems, namely: 1) we could use a "real" database that would not dump info to `stdout` and wouldn't lose data, like `sqlite3`; or, 2) we could set up another server, similar to `sccache`, that collects this data from subprocesses and aggregates it, making sure not to lose any along the way. Both of these have enough overhead, in terms of engineering effort or actual impact on latency, that I don't know that they make any sense to even attempt implementing. The remaining problems continue to be real issues, however.
After this is landed there are a few ways forward. We can start uploading these files as build artifacts in CI to allow us to reason about performance impacts of changes in `central`. We can easily add this functionality to the `sccache` client to start tracking those builds as well. We already have a very simple visualization of build tier timing in `mach resource-usage`; we could join that data against the `BUILDTASK` data to produce a very clear visualization of build bottlenecks, i.e., "why is the `export` tier taking so long", etc.
Differential Revision: https://phabricator.services.mozilla.com/D80284
It can be tricky to find out why artifact-lookup fails, especially since
it can be implicitly affected by environment variables.
Log if a known troublemaker (TASKCLUSTER_ROOT_URL) is set when an
artifact-lookup fails.
Differential Revision: https://phabricator.services.mozilla.com/D88304
On some platforms re-creating the `virtualenv`s can be very time-consuming so we'd like to avoid deleting these `virtualenv`s unnecessarily.
Preserve the existing behavior behind a `-f` flag in case unconditionally wiping the `virtualenv`s is what's needed for any reason.
Differential Revision: https://phabricator.services.mozilla.com/D87668
Bug 1659906 removed `../`s from some generated file paths, which turned
out not to be supported by the faster make backend. Instead of returning
to those relative paths, just support the topobjdir-relative paths
correctly. The new code is derived from the equivalent code in the
recursive make backend.
Differential Revision: https://phabricator.services.mozilla.com/D88097
This generally happens because people cleverly create custom forks of `mozilla-central` that don't have `git-cinnabar` metadata. This is ALWAYS broken for artifact builds, but people generally don't know that and the error message isn't informative. Instead, identify this case as it happens and suggest an immediate, working alternative.
Differential Revision: https://phabricator.services.mozilla.com/D87923
`glean_sdk` can't currently be installed on Apple Silicon or OpenBSD since Glean can't be built from source. While that issue is being resolved (see bug 1660120), allow this installation to fail.
Differential Revision: https://phabricator.services.mozilla.com/D87667
Bug #985141 added this argument without changing all the callers.
Instead of fixing each caller individually, just allow a value not to be
passed in. This is what the underlying MozbuildObject class does
anyway.
Differential Revision: https://phabricator.services.mozilla.com/D87386
There are zero uses of this `mach` command over the past 90 days according to our telemetry. There are no external references to `mach python-safety` in-tree, and indeed if you track the history of the originating bug 1468394, it appears that once the `mach` command was created, none of the follow-up work that was discussed (i.e. running this in CI and triaging failures to appropriate owners) was done over the following 2 years.
If this ever does appear to be useful in the future, we can just resurrect this code from source control.
Differential Revision: https://phabricator.services.mozilla.com/D87351
These arguments will get converted to `bytes` later on if necessary; if we don't defer to `hglib`, then we need proper strings.
Differential Revision: https://phabricator.services.mozilla.com/D87440
In two different places we've been encountering issues regarding 1) how we configure the system Python environment and 2) how the system Python environment relates to the `virtualenv`s that we use for building, testing, and other dev tasks. Specifically:
1. With the push to use `glean` for telemetry in `mach`, we are requiring (or rather, strongly encouraging) the `glean_sdk` Python package to be installed with bug 1651424. `mach bootstrap` upgrades the library using your system Python 3 in bug 1654607. We can't vendor it due to the package containing native code. Since we generally vendor all code required for `mach` to function, requiring that the system Python be configured with a certain version of `glean` is an unfortunate change.
2. The build uses the vendored `glean_parser` for a number of build tasks. Since the vendored `glean_parser` conflicts with the globally-installed `glean_sdk` package, we had to add special ad-hoc handling to allow us to circumvent this conflict in bug 1655781.
3. We begin to rely more and more on the `zstandard` package during build tasks, this package again being one that we can't vendor due to containing native code. Bug 1654994 contained more ad-hoc code which subprocesses out from the build system's `virtualenv` to the SYSTEM `python3` binary, assuming that the system `python3` has `zstandard` installed.
As we rely more on `glean_sdk`, `zstandard`, and other packages that are not vendorable, we need to settle on a standard model for how `mach`, the build process, and other `mach` commands that may make their own `virtualenv`s work in the presence of unvendorable packages.
With that in mind, this patch does all the following:
1. Separate out the `mach` `virtualenv_packages` from the in-build `virtualenv_packages`. Refactor the common stuff into `common_virtualenv_packages.txt`. Add functionality to the `virtualenv_packages` manifest parsing to allow the build `virtualenv` to "inherit" from the parent by pointing to the parent's `site-packages`. The `in-virtualenv` feature from bug 1655781 is no longer necessary, so delete it.
2. Add code to `bootstrap`, as well as a new `mach` command `create-mach-environment` to create `virtualenv`s in `~/.mozbuild`.
3. Add code to `mach` to dispatch either to the in-`~/.mozbuild` `virtualenv`s (or to the system Python 3 for commands which cannot run in the `virtualenv`s, namely `bootstrap` and `create-mach-environment`).
4. Remove the "add global argument" feature from `mach`. It isn't used and conflicts with (3).
5. Remove the `--print-command` feature from `mach` which is obsoleted by these changes.
This has the effect of allowing us to install packages that cannot be vendored into a "common" place (namely the global `~/.mozbuild` `virtualenv`s) and use those from the build without requiring us to hit the network. Miscellaneous implementation notes:
1. We allow users to force running `mach` with the system Python if they like. For now it doesn't make any sense to require 100% of people to create these `virtualenv`s when they're allowed to continue on with the old behavior if they like. We also skip this in CI.
2. We needed to duplicate the global-argument logic into the `mach` script to allow for the dispatch behavior. This is something we avoided with the Python 2 -> Python 3 migration with the `--print-command` feature, justifying its use by saying it was only temporarily required until all `mach` commands were running with Python 3. With this change, we'll need to be able to determine the `mach` command from the shell script for the foreseeable future, and committing to this forever with the cost that `--print-command` incurs (namely `mach` startup time, an additional .4s on my machine) didn't seem worth it to me. It's not a ton of duplicated code.
Differential Revision: https://phabricator.services.mozilla.com/D85916
This resolves a long-standing issue in development where `mach artifact` (and therefore `mach bootstrap`) would fail unpredictably if you had dirty, but ignored, files in your checkout. Resolving this problem often required unwieldy `hg purge`/`git ignore` incantations that are easy to get wrong.
This patch addresses the problem by doing what we "should" have been doing all along, and consulting the VCS to list tracked files rather than listing EVERY file on disk and applying heuristics to determine whether they should be included in the hash.
Differential Revision: https://phabricator.services.mozilla.com/D86780
Now you can pass the `virtualenv_name` kwarg to the `Command` decorator which will configure the `_virtualenv_manager` accordingly.
Differential Revision: https://phabricator.services.mozilla.com/D86256
This check is unsound; `virtualenv` binaries are apparently not guaranteed to have the same file size as the `python` binaries used to create those `virtualenv`s, at least not with our current vendored version of the `virtualenv` library on macOS. This is trivially reproducible on my own Macbook:
```
rickystewart-a5lvdq:mozilla-unified rickystewart$ rm -rf obj-x86_64-apple-darwin19.5.0/
rickystewart-a5lvdq:mozilla-unified rickystewart$ ./mach configure
...
rickystewart-a5lvdq:mozilla-unified rickystewart$ python3 -c 'import os; print(os.path.getsize("obj-x86_64-apple-darwin19.5.0/_virtualenvs/init_py3/bin/python"))'
16644 # <- ACTUAL VIRTUALENV SIZE
rickystewart-a5lvdq:mozilla-unified rickystewart$ python3 -c 'import os; print(os.path.getsize("/usr/local/opt/python/bin/python3.7"))'
17704 # <- SIZE OF THE PYTHON USED TO CREATE THE VIRTUALENV
```
Concretely, this was causing unit tests to be very aggressive about deleting the parent `init_py3` `virtualenv` repeatedly in unit tests, resulting in failures. The removal of this check fixes the issue.
Differential Revision: https://phabricator.services.mozilla.com/D86872
Now you can pass the `virtualenv_name` kwarg to the `Command` decorator which will configure the `_virtualenv_manager` accordingly.
Differential Revision: https://phabricator.services.mozilla.com/D86256
To check if pipenv is installed, the existing logic looked to see if the "pipenv" binary existed. However, on Windows, this binary is named "pipenv.exe".
Differential Revision: https://phabricator.services.mozilla.com/D86680
Pipenv enthusiastically checks pyenv for possible Python installations, and it seems to always want to use the most
up-to-date version possible. This can lead to issues if the version of python used for the regular venvs is older
than another version of python that exists on the machine (such as if the system python is 3.8.2, but pyenv has
3.8.3 installed).
This patch addresses this by scoping pipenv to match the currently-running version of python (if major versions line
up).
Differential Revision: https://phabricator.services.mozilla.com/D86448
Without this patch, the last "path" in this list will always be the empty string due to how the `-z` option to `git` works. This mirrors what we already do in the `get_files_in_working_directory` implementation for `hg`.
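A sketch of the fix (the exact `git` invocation is illustrative): NUL-delimited output ends with a trailing NUL, so a naive split leaves an empty entry at the end:
```
import subprocess

def get_files_in_working_directory():
    out = subprocess.check_output(
        ["git", "ls-files", "-z"], universal_newlines=True
    )
    # The output is NUL-terminated as well as NUL-separated, so drop the
    # trailing empty string produced by split().
    return [path for path in out.split("\0") if path]
```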
Differential Revision: https://phabricator.services.mozilla.com/D86752
There's a Windows Defender CLI (`Get-MpComputerStatus`) available, but MozillaBuild
can't (easily) access PowerShell. So, instead, we find Windows Defender status and path exclusions
by checking the registry at HKLM\SOFTWARE\Microsoft\Windows Defender.
Determining if Windows Defender is "on" or not is a surprisingly vague task.
Not only does it encompass a lot of components (of which Ricky and I believe "Real-time protection"
is the part slowing down the build), but there's (at least?) two different kinds of "disabled"
states that it can be in:
* If "disabled" via Settings, it will turn itself back on after a reboot
* If an antivirus is installed, it will turn itself off permanently
Unfortunately, disabling "Real-time protection" in Settings doesn't affect any registry keys, but
I'm opting to ignore this because I doubt many users are manually disabling this protection before
each build. The repercussion of this is that users may be incorrectly warned "your Firefox
directory isn't excluded from Windows Defender!".
Focusing on the antivirus use case and querying the registry I found that:
* Querying the `IsServiceRunning` value wasn't sufficient because new Windows installations didn't have that key
* Querying the `DisableRealtimeMonitoring` value was inconsistent - it can be missing, and not always because an antivirus removed it
* Querying the `DisableAntiVirus` value doesn't _sound_ accurate (we care about Real-time Protection), but it's consistently "off" for non-AV machines, and "on" for machines with an AV installed. So, this is our winner!
TL;DR: there may be some "false positive" warnings about excluding the Firefox srcdir, but they're
accurate for my test cases and workaround-able (just add the exclusion to Windows Defender).
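A minimal sketch of that registry query, assuming Python's standard `winreg` module (the helper name and the interpretation of the value are ours, per the observations above):
```
import winreg

def defender_antivirus_disabled():
    # HKLM\SOFTWARE\Microsoft\Windows Defender\DisableAntiVirus was consistently
    # 1 on machines with another antivirus installed, and 0/absent otherwise.
    try:
        with winreg.OpenKey(
            winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\Windows Defender"
        ) as key:
            value, _ = winreg.QueryValueEx(key, "DisableAntiVirus")
            return bool(value)
    except OSError:
        # Key or value missing: assume Defender is enabled and warn.
        return False
```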
Also, this patch updates our Windows Defender docs to make them:
* More easily link-able
* Have direct advice to resolve the issue
Differential Revision: https://phabricator.services.mozilla.com/D85952
I wrote this patch because I noticed that the `.pth` files in my `objdir` `virtualenv`s were extremely repetitive, containing multiple references to the same directories. This happens because we [append](https://searchfox.org/mozilla-central/rev/03794edd6edcc3fc1e222de966cb27256ce08998/python/mozbuild/mozbuild/virtualenv.py#366) to the in-`virtualenv` `.pth` files when calling `populate()`, but we don't ever clean up the old ones, meaning that whenever we determine that the `virtualenv`s are out of date and need to be recreated, we actually leave A LOT of state lying around on-disk that is going to go on to impact further uses of the `virtualenv`. Concretely, how this manifested is that when I erroneously removed an entry from `virtualenv_packages.txt`, the build actually succeeded because that entry was still in the `.pth` file in the `virtualenv`; instead of "creating" a new `virtualenv` with the correct `.pth` files, it just appended the new `.pth` data to the old, stale data.
I've chosen to address this by completely deleting the entire `virtualenv` when we try to re-create it. Another way you might solve this problem is by doing a `find $VIRTUALENV -name '*.pth' | xargs rm` before doing the `virtualenv` re-creation, but I'm suggesting we do it this way because we have had a long history of difficulty with `virtualenv` persistence. Bug 1628498 is an obvious example; note that we would never have encountered that bug if we always unconditionally deleted the `virtualenv` before creating a new one, as in this patch. A patch that is laser-targeted at handling the issue with `.pth` files might be fine for now but this is more foolproof and future-proof.
Differential Revision: https://phabricator.services.mozilla.com/D85636
To ensure that a python 3 virtualenv exists for tests, python-test will ensure that it's created (regardless of which version of python is currently in use).
However, the existing logic was incorrectly creating an "extra py3 virtualenv" even when it was already running python 3 and had already created a "first" py3 virtualenv.
Differential Revision: https://phabricator.services.mozilla.com/D86269
Today we don't require that `mach` `CommandProvider`s subclass from any particular parent class and we're very lax about the requirements they must meet. While that's convenient in certain circumstances, it has some unfortunate implications for feature development.
Today the only requirements that we have for `CommandProvider`s are that they have an `__init__()` method that takes either 1 or 2 arguments, the second of which must be called `context` and is populated with the `mach` `CommandContext`. Again, while this flexibility is occasionally convenient, it is limiting. As we add features to `mach`, having a better idea what the shape of our `CommandProvider`s are and how we can instantiate them and use them is increasingly important, and this gives us additional control when having `mach` configure `CommandProvider`s based on data that is only available at the `mach` level. In particular, we plan to leverage this in bugs 985141 and 1654074.
Here we add validation to the `CommandProvider` decorator to ensure all classes inherit from `MachCommandBase`, update all `CommandProvider`s in-tree to inherit from `MachCommandBase`, and update source and test code accordingly.
Follow-up work: we now require (de facto) that the `context` be populated with a `topdir` attribute by the `populate_context_handler` function, since instantiating the `MachCommandBase` requires a `topdir` be provided. This is fine for now in the interest of keeping this patch reasonably sized, but some additional refactoring could make this cleaner.
Differential Revision: https://phabricator.services.mozilla.com/D86255
When subprocessing to get the venv from pipenv with PyCharm, the returned path has a command sequence to reset the terminal color.
This command sequence is unexpected, and causes the returned path to be incorrect.
The root issue is that colorama mistakenly believes that, if it's ever running underneath PyCharm, it's probably attached to a tty.
However, this assumption isn't true if PyCharm is running a script which subprocesses out a colorama-enabled script.
This issue has already been reported here: https://github.com/tartley/colorama/issues/263
To work around this issue, we clear the "PYCHARM_HOSTED" environment variable when we invoke pipenv, which sidesteps the colorama logic.
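A rough sketch of the workaround (the pipenv invocation shown is illustrative):
```
import os
import subprocess

def run_pipenv(args):
    # Drop PYCHARM_HOSTED so colorama inside pipenv doesn't think it is
    # attached to a tty and emit color escape sequences into the output.
    env = os.environ.copy()
    env.pop("PYCHARM_HOSTED", None)
    return subprocess.check_output(
        ["pipenv"] + list(args), env=env, universal_newlines=True
    )
```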
Differential Revision: https://phabricator.services.mozilla.com/D86242
I noticed that the `objdir:build` entry in `build/virtualenv_packages.txt` was apparently unused. This originates from bug 841713, seven years ago, when the `objdir` handling was introduced. Today, this doesn't appear to be serving a purpose. There is no Python library in my `$objdir/build` directory; nor can I see anything in `build/moz.build` or any related files suggesting one could ever appear. I can only assume this feature has outlived its usefulness, so delete it and the relevant in-tree support.
This necessitates slightly changing the signature and implementation of the `activate_pipenv()` method; also update all callers.
Differential Revision: https://phabricator.services.mozilla.com/D85635
The resource file is always generated so being able to configure its name
is not useful. On the other hand, the way things are currently implemented,
the lack of RESFILE also makes RCFILE ignored, which we fix at the same
time.
And remove a spurious RESFILE in widget/windows/moz.build, where no binary
is produced, which means RESFILE had no meaning.
Differential Revision: https://phabricator.services.mozilla.com/D86154
Add a modular approach for the integration of the `static-analysis` module in order
to be able to share components of it with other modules, like the integration of
`clangd` in `vscode`, where we need access to the configuration of `clang-tidy`
in order to have in-IDE `static-analysis` messages.
In this initial step we make a separate module for the clang-tidy configuration.
Differential Revision: https://phabricator.services.mozilla.com/D85979
In order to have a cross-platform IDE experience for C++ language support, we've added the `clangd`
extension and artifact as part of the `vscode` suite.
To generate the configuration you simply run:
`./mach ide vscode`.
Differential Revision: https://phabricator.services.mozilla.com/D85416
Contrary to python2, python3 considers the values in co_lnotab to be signed
integers, so with python3, offsets larger than 127 would be encoded in a way
that would make them wrong, or worse, negative.
Differential Revision: https://phabricator.services.mozilla.com/D86122
These tests depend on the `mach uuid` command which was deleted with bug 1639509, and now that `mach uuid` is gone it's broken unconditionally. We could replace the reference to `uuid` with a new no-op `mach` command, but we're in the process of replacing our telemetry code with use of the `glean` API; and the new telemetry code won't have the same semantics (namely, we are unlikely to want to continue to guarantee that sub-`mach` invocations aren't covered by telemetry), so this test might as well just be deleted now.
Differential Revision: https://phabricator.services.mozilla.com/D85911
This patch fixes an issue where the metric settings were not being used because they don't use the test name. It also handles some changes (from a bad copy-paste) that didn't make it into the last live-site patch series.
Differential Revision: https://phabricator.services.mozilla.com/D85609
@CommandProvider does parameter validation and collects information (such
as "pass_context") that will be needed by Registrar.
However, rather than just checking parameter length, we can be more precise
and assert that the specific expected parameter ("context") exists.
Differential Revision: https://phabricator.services.mozilla.com/D85482
This patch adds the `--<LAYER>-split-by` option to the metric layers. It allows users to split the data they obtain using a given data field name. For instance, if `browserScripts.pageinfo.url` is provided, then the data will be split based on the unique URLs that are found.
Differential Revision: https://phabricator.services.mozilla.com/D84822
The "what" value contains a list of build targets, the warning should be printed if any one of the targets is unexpected.
Differential Revision: https://phabricator.services.mozilla.com/D85462
This allows testing them in the exact same environment as the tests are
going to run in, which turns out to have revealed a few issues that would
only appear once xpcshell tests fail, impeding debugging of those
failures.
Differential Revision: https://phabricator.services.mozilla.com/D84781