Mach's import scope includes:
* Its `pth` entries
* Its pip packages, which come from either:
  * The Mach virtualenv site-packages, if packages were "pip installed"
    over the internet, or
  * The system environment's site-packages, if installing packages over
    the internet is disabled and Mach is pulling packages from the
    system Python instead.
Command virtualenvs _already_ had this import scope when they were
dynamically activated from an existing Mach process. However, when
used directly (e.g. via `<venv>/bin/python <script>`), they were
missing this import scope, which was a confusing inconsistency.
Resolving this inconsistency adds a new risk: when Mach is using the
system Python, the system packages now populate the same context as
command virtualenv packages, even though they have never been checked
for compatibility. So, this patch also includes behaviour to verify
system<=>command-venv compatibility at activation-time.
A few notes about this system-package verification:
* It happens at virtualenv activation-time instead of build-time because
system packages may change after the virtualenv is built.
* It takes roughly 1.5s, which is significant, but it should mainly occur
in CI, where it's acceptable. Devs using `MACH_USE_SYSTEM_PACKAGES`
should unset the variable to avoid the time delay.
* The algorithm asserts top-level requirements (e.g.
`psutil>=5.4.2,<=5.8.0`), then runs `pip check` over the combined set
of packages that would be in scope (see the sketch after this list).
* Note that, in this patch, system packages are *not* asserted to be
the same version as their vendored counterparts. This is because, until
we parse `third_party/python/requirements.in`, we don't know which
packages we depend on directly (and whose APIs we care about if they
change). Since leaning on system packages essentially only happens in
CI (which we have strong control over), this downside seemed acceptable.
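A minimal sketch of that verification, assuming hypothetical helper and
input names (the real logic lives in Mach's site manager) and using the
`packaging` library plus `pip check`:

```python
import subprocess
import sys
from importlib import metadata

from packaging.requirements import Requirement


def verify_system_compatibility(top_level_requirements):
    """Assert top-level requirement specifiers, then let `pip check`
    validate the dependency metadata of the combined set of in-scope
    packages. `top_level_requirements` is a hypothetical list of
    strings such as "psutil>=5.4.2,<=5.8.0"."""
    for line in top_level_requirements:
        req = Requirement(line)
        installed = metadata.version(req.name)  # raises if not installed
        if installed not in req.specifier:
            raise Exception(
                f"{req.name}=={installed} does not satisfy '{line}'"
            )

    # `pip check` flags packages whose declared dependencies aren't
    # satisfied by what is actually importable in this environment.
    subprocess.check_call([sys.executable, "-m", "pip", "check"])
```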
Differential Revision: https://phabricator.services.mozilla.com/D126288
The command site manager needs to be able to do ad-hoc pip
installations, while the Mach site manager needs to manage
the system `sys.path` and conditionally create an on-disk
virtualenv.
By splitting the class into two, we can now give each use case the
attention it deserves.
Differential Revision: https://phabricator.services.mozilla.com/D129529
The existing terminology had two issues:
* `VirtualenvManager` wasn't always associated with an on-disk
`virtualenv`: for example, when running in automation, Mach
"activates" a `VirtualenvManager`, updating its import scope,
but without ever creating an on-disk `virtualenv`.
* An upcoming patch splits the `VirtualenvManager` class, pulling the
"on-disk virtualenv-handling" functions out of the project-wide
interface for managing Python's import scope.
After some good discussion with Ahal, I think we've settled on
terminology that handles this distinction well: we'll call the
"import scope"-handling part the "site", and we'll continue to refer
to on-disk virtualenvs (and their representative classes) as, well,
virtualenvs.
Differential Revision: https://phabricator.services.mozilla.com/D130391
It's possible for the `PYTHON3` config to point to a different Python
than the one executing the configure scripts themselves.
This flexibility was needed for the Python 2->3 migration, but now that
it's complete, we can remove the extra configuration and just lean on
the Python interpreter used to run configure.
Differential Revision: https://phabricator.services.mozilla.com/D129863
There are some trade-offs in play here: the major issue is that we can't
pin `psutil` because it comes pre-installed on some CI workers at a
different version (5.4.2).
Additionally, we can't "just" bump that version, because CI workers jump
between different revisions to do work, so a specific pinned version
won't work when we try to update such a package.
One option would be to avoid validating package versions in CI, but
that would cause surprises (heck, I didn't know we were still using
`psutil==5.4.2` instead of `5.8.0` until now). By doing validation, we
make the situation explicit and avoid accidentally depending on the
behaviour of too new a version of such a package.
However, in most cases, we manage the installation environment and can
pin dependencies. So, I've made the top-level Mach virtualenv the _only_
one that is able to use requirement specifiers other than "==".
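As a rough sketch of how that rule can be enforced (hypothetical helper
and input names; the real check lives in Mach's requirements parsing):

```python
from packaging.requirements import Requirement


def check_pinned(requirement_lines, is_mach_site):
    """Reject any specifier other than an exact "==" pin, except for
    the top-level Mach site. Both arguments are hypothetical stand-ins
    for the parsed requirements definition."""
    for line in requirement_lines:
        req = Requirement(line)
        operators = {spec.operator for spec in req.specifier}
        if not is_mach_site and operators != {"=="}:
            raise Exception(
                f"'{line}' must be pinned with '==' outside of the "
                "top-level Mach virtualenv"
            )
```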
Differential Revision: https://phabricator.services.mozilla.com/D122889
Rather than re-implementing it as `search_path()`, use the existing
`MachEnvRequirements` tool to parse `mach_virtualenv_requirements.txt`.
Differential Revision: https://phabricator.services.mozilla.com/D126280
Allow-list all Python code in tree for use with the black linter, and re-format all code in-tree accordingly.
To produce this patch I did all of the following:
1. Make changes to `tools/lint/black.yml` to remove the `include:` stanza and update the list of source extensions.
2. Run `./mach lint --linter black --fix`
3. Make some ad-hoc manual updates to `python/mozbuild/mozbuild/test/configure/test_configure.py` -- it has some hard-coded line numbers that the reformat breaks.
4. Make some ad-hoc manual updates to `testing/marionette/client/setup.py`, `testing/marionette/harness/setup.py`, and `testing/firefox-ui/harness/setup.py`, which have hard-coded regexes that break after the reformat.
5. Add a set of exclusions to `black.yml`. These will be deleted in a follow-up bug (1672023).
# ignore-this-changeset
Differential Revision: https://phabricator.services.mozilla.com/D94045
`mach configure` currently runs the equivalent of `make -f client.mk`.
This is historical, and essentially does the following:
- Create `configure` and `js/src/configure` from `configure.in` and
`js/src/configure.in` respectively.
- Create the objdir.
- Run `configure` from the objdir.
The `configure` script is, nowadays, only really used as a means to set
OLD_CONFIGURE (and also for people who want to run `configure`,
literally, as in the `configure; make` workflow). `mach configure`
actually doesn't need it. Neither does recursing into `js/src` require
`js/src/configure`, since bug 1520340 (and now as of bug 1669633, we
don't even recurse).
Because configure.py can actually derive OLD_CONFIGURE on its own
(except for `js/src/configure`, but `mach configure` doesn't run that),
we don't really need `configure` for `mach configure`.
So all in all, we're at a point in history where it's straightforward
to initiate configure.py directly from `mach configure`, so that's what
we now do.
And in the hypothetical case where the `mach configure` code is somehow
still running under Python 2, we grab the Mach virtualenv's Python 3
and use it to execute `configure.py` (see the sketch below).
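A minimal sketch of that dispatch, with hypothetical parameter names
(the real code resolves the virtualenv's interpreter from the Mach
state directory):

```python
import os
import subprocess
import sys


def run_configure_py(topsrcdir, virtualenv_python3, args):
    # Hypothetical helper: prefer the Mach virtualenv's Python 3 when
    # the current interpreter is (somehow) still Python 2.
    if sys.version_info[0] < 3:
        python = virtualenv_python3
    else:
        python = sys.executable
    configure_py = os.path.join(topsrcdir, "configure.py")
    return subprocess.call([python, configure_py] + list(args))
```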
Differential Revision: https://phabricator.services.mozilla.com/D93741
Instead, we now run js/src/old-configure from the top-level configure
after having run old-configure and extracted a few variables to inherit
from it.
Because we're now running from the top-level, $_objdir is always the
top-level objdir, which simplifies some things. The topobjdir in
js/src/config.status, however, needs to stay in js/src because of the
build frontend expecting it there.
When running js/src/old-configure, we used to need special treatment
for a large number of variables for historic reasons: for some we'd
take the values assigned before running old-configure, and for others
the AC_SUBSTs produced after running old-configure.
Now that both old-configure and js/src/old-configure get the same
assignments from old-configure.vars, we don't need anything special for
the former, and only a few remaining variables still need manual work
for the latter.
One notable difference, though, is that the new code doesn't try to
avoid running js subconfigure, which added complexity, and was actually
error-prone.
Differential Revision: https://phabricator.services.mozilla.com/D92725
In order to be able to run both old-configure and js/src/old-configure
from the same python configure run, we need to stop injecting the items
set by old-configure into the global sandbox config, and instead store
them to be handled later by configure.py.
Differential Revision: https://phabricator.services.mozilla.com/D92718
I think they're remnants from the past that we don't really need anymore.
And they're making things more complicated for some pending work of mine.
Differential Revision: https://phabricator.services.mozilla.com/D89687
Rather than always printing instructions at the end of the bootstrap phase, we will now create a mozconfig
file if one doesn't exist and there's configuration to be written.
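A rough sketch of that behaviour, with hypothetical names for the
inputs (the real code knows the user's srcdir and the configuration
bootstrap wants to suggest):

```python
import os


def maybe_write_mozconfig(topsrcdir, suggested_lines):
    """Write a mozconfig only when none exists yet and bootstrap
    actually has configuration to record; otherwise leave the user's
    existing file alone."""
    mozconfig_path = os.path.join(topsrcdir, "mozconfig")
    if os.path.exists(mozconfig_path) or not suggested_lines:
        return None
    with open(mozconfig_path, "w") as fh:
        fh.write("\n".join(suggested_lines) + "\n")
    return mozconfig_path
```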
Differential Revision: https://phabricator.services.mozilla.com/D78417
This also makes a few remaining Python 2-incompatible changes to
.configure files.
Differential Revision: https://phabricator.services.mozilla.com/D69538
--HG--
extra : moz-landing-system : lando
Many values we get out of configure are of types that look like
lists/tuples, which write_indented_repr will serialize as lists... but
only in Python 2, because the alternative implementation for Python 3
doesn't do that. So sanitize first.
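A minimal sketch of that kind of sanitization, assuming only container
types and plain scalars need handling (the real normalization covers
more cases):

```python
def sanitize_config(value):
    # Recursively coerce list/tuple/set-like values into plain lists
    # (and dict-like values into plain dicts) so they serialize the
    # same way regardless of the Python version doing the writing.
    if isinstance(value, dict):
        return {k: sanitize_config(v) for k, v in value.items()}
    if isinstance(value, (list, tuple, set, frozenset)):
        return [sanitize_config(v) for v in value]
    return value
```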
Differential Revision: https://phabricator.services.mozilla.com/D69949
--HG--
extra : moz-landing-system : lando
If configure is invoked manually, clobber.py is bypassed and the CLOBBER
file doesn't get touched. The clobber check in Makefile.in then gets
triggered, causing the build to stop.
Moving the objdir/CLOBBER file creation into configure.py should cause
it to be created regardless of how configure is invoked.
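A sketch of the idea, with a hypothetical helper name (the real
configure.py may also record why a clobber is required):

```python
import os


def ensure_objdir_clobber(topobjdir):
    # Make sure <objdir>/CLOBBER exists so the Makefile.in clobber
    # check passes regardless of how configure was invoked.
    clobber_path = os.path.join(topobjdir, "CLOBBER")
    if not os.path.exists(clobber_path):
        open(clobber_path, "a").close()  # create an empty marker file
```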
Depends on D53290
Differential Revision: https://phabricator.services.mozilla.com/D53291
--HG--
extra : moz-landing-system : lando
Now that there are no bytes strings in it, we don't need to store
config.status in the system encoding to keep those valid. Moreover, the
system encoding is lossy, which utf-8 is not.
Differential Revision: https://phabricator.services.mozilla.com/D42731
--HG--
extra : moz-landing-system : lando
With bug 1575135, we ensure nothing gets out of the configure sandbox
as a bytes string. We can thus now avoid the encode() pass in
configure.py. We still need, however, to normalize the configuration
so that it doesn't contain unexpected types, conforming to what
indented_repr does to the configuration in config.status.
While here, convert some obj.iteritems() to six.iteritems(obj).
And remove the now unused encode function.
Differential Revision: https://phabricator.services.mozilla.com/D42630
--HG--
extra : moz-landing-system : lando
Make it a hard error when the sandbox returns non-unicode strings.
This should help quickly catch any remaining non-unicode strings that
are not caught by automation.
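A minimal sketch of such a hard error, assuming a hypothetical hook
that sees every value leaving the sandbox:

```python
def ensure_unicode(name, value):
    # Raise instead of silently accepting bytes that leak out of the
    # configure sandbox; containers are checked recursively.
    if isinstance(value, bytes):
        raise TypeError(
            "%s returned a bytes value: %r; expected a unicode string"
            % (name, value)
        )
    if isinstance(value, (list, tuple)):
        for item in value:
            ensure_unicode(name, item)
    elif isinstance(value, dict):
        for key, item in value.items():
            ensure_unicode(name, key)
            ensure_unicode(name, item)
    return value
```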
Differential Revision: https://phabricator.services.mozilla.com/D42607
--HG--
extra : moz-landing-system : lando
This is necessary for any attempt at `@import('six')` or doing any
similar py2+py3 work in the configure sandbox.
Differential Revision: https://phabricator.services.mozilla.com/D29604
--HG--
extra : moz-landing-system : lando
Since js configure is also python configure, we can actually create
a ConfigureSandbox directly, with the right environment and arguments.
Depends on D16668
Differential Revision: https://phabricator.services.mozilla.com/D16669
--HG--
extra : moz-landing-system : lando
Since MozbuildObject.from_environment() reads from mozinfo.json, tup
picks up that file as a dependency for anything that imports buildconfig
(e.g. all generated files). Using FileAvoidWrite when creating
mozinfo.json will help avoid unnecessary work after re-running configure
in the tup backend.
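The effect of FileAvoidWrite, sketched in a minimal form (the real
mozbuild.util.FileAvoidWrite has more features, such as diff capture):

```python
import os


def write_if_changed(path, new_content):
    # Only touch the file when its content actually changes, so that
    # mtime-based dependency tracking (e.g. in the tup backend) doesn't
    # see spurious updates after re-running configure.
    if os.path.exists(path):
        with open(path) as fh:
            if fh.read() == new_content:
                return False  # unchanged; leave the mtime alone
    with open(path, "w") as fh:
        fh.write(new_content)
    return True
```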
MozReview-Commit-ID: EEOPQYJA1MV
--HG--
extra : rebase_source : 136a0579090776dc55ea5cee870574f13cc27c58
So the file can be more easily consumed without this variable defined.
MozReview-Commit-ID: DF3ASwx4SZP
--HG--
extra : rebase_source : 7d760a71c8f0fa389c2f31f1b383a1f298220545
The config.statusd directory is created alongside config.status; it
contains the same information, but split across many files instead of
a single one. This allows the build system to track dependencies on
individual configure values.
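A rough sketch of the split, with a hypothetical layout and
serialization (the real backend chooses its own structure):

```python
import json
import os


def write_config_statusd(topobjdir, config):
    # Write each configure value to its own file under config.statusd,
    # so consumers can depend on individual values rather than on the
    # whole config.status.
    statusd = os.path.join(topobjdir, "config.statusd")
    if not os.path.isdir(statusd):
        os.makedirs(statusd)
    for key, value in config.items():
        with open(os.path.join(statusd, key), "w") as fh:
            json.dump(value, fh)
```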
MozReview-Commit-ID: 2DbwKCJuNSX
--HG--
extra : rebase_source : 8b6257fd9c75cff3e4b6513d69048c0e3fdda5f4