When ./breeze stop is run, we run docker-compose down under the
hood - by default with the --volumes flag, which also removes the
volumes. But the volumes were only defined for the database you
had selected.
We want to clean up all the volumes on breeze stop
in order to avoid surprises when you switch the DB and find
the old DB data still there.
Otherwise, when you switch databases while they are running,
stop will delete the volumes for only the most recently used
database.
The fix makes sure that all the DB volumes are always defined,
so they are all deleted on stop.
(cherry picked from commit a983273413)
Previously we skipped building Python 2.7 and 3.5 only for the master
branch, but now we also have v2-0-test, so it is better to skip those
versions whenever the branch is != v1-10-test (this is the only
DEFAULT_BRANCH where they are needed - even v1-10-stable builds use
v1-10-test, as DEFAULT_BRANCH there is v1-10-test).
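Roughly, the skip boils down to logic like this (a hypothetical Python
sketch - the real check lives in the CI/Breeze scripts, and the version
list here is illustrative):

```python
ALL_PYTHON_VERSIONS = ["2.7", "3.5", "3.6", "3.7", "3.8"]  # illustrative

def python_versions_to_build(default_branch):
    # Python 2.7 and 3.5 are only built when the default branch is
    # v1-10-test; every other branch (master, v2-0-test, ...) skips them.
    if default_branch == "v1-10-test":
        return ALL_PYTHON_VERSIONS
    return [v for v in ALL_PYTHON_VERSIONS if v not in ("2.7", "3.5")]
```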
(cherry picked from commit 0f01faa486)
These three functions were in test_core, but the separate test_dates
file is better suited to them.
In addition, I have removed the use of `assert_array_almost_equal` from
numpy, as pytest provides its own version.
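For illustration, the pytest idiom replacing the numpy helper looks
roughly like this (a sketch, assuming `pytest.approx` is the version
referred to above):

```python
import pytest

def test_floats_are_close():
    # pytest.approx compares scalars or whole sequences within a
    # tolerance, much like numpy's assert_array_almost_equal.
    assert [0.1 + 0.2, 1.0 / 3.0] == pytest.approx([0.3, 0.333333], abs=1e-6)
```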
(cherry picked from commit 95d9088ab1)
We've upgraded the pyarrow release to 2.0.0, which seems to work
well with the upcoming pip 20.3.2 version. This PR triggers an
automatic rebuild of our images so that the pyarrow constraint is
taken into account.
(cherry picked from commit 0eb210df3e)
* Fetch inventories for third-party services only once
* fixup! Fetch inventories for third-party services only once
(cherry picked from commit fa9c6b47d3)
The snowflake provider, when imported, breaks other providers.
Until https://github.com/apache/airflow/issues/12881 is fixed,
we should skip discovering the snowflake provider in development mode.
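The skip itself is conceptually simple - something along these lines (a
hypothetical sketch, not the actual provider-discovery code):

```python
SKIPPED_PROVIDERS = {"snowflake"}  # until apache/airflow#12881 is fixed

def discover_providers(provider_names, development_mode=False):
    # Hypothetical discovery loop; the names here are illustrative only.
    for name in provider_names:
        if development_mode and name in SKIPPED_PROVIDERS:
            continue  # importing snowflake breaks the other providers
        yield name
```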
(cherry picked from commit 169aa019c7)
This fixes three problems:
1. Remote logs weren't being uploaded due to the fork change.
2. The S3 hook attempted to fetch credentials from the DB, but the
ORM had already been disposed.
3. Even if forking was disabled, S3 logs would fail due to the use
of concurrent.futures. See https://bugs.python.org/issue33097
The extras configured by the --extras Breeze switch are now
passed to the pip install command in case airflow is installed via
the --install-airflow-version or --install-airflow-reference switch.
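In effect, the two switches are combined into a single pip requirement,
roughly like this (a hypothetical sketch of the string that ends up on
the pip command line - the helper name is made up):

```python
def airflow_requirement(version, extras=""):
    # e.g. ("1.10.14", "postgres,google")
    #  ->  "apache-airflow[postgres,google]==1.10.14"
    package = f"apache-airflow[{extras}]" if extras else "apache-airflow"
    return f"{package}=={version}"
```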
Flask before 2.0 (unreleased at the time of writing) will prefer simplejson if it is installed.
But unfortunately simplejson is not compatible with the stock JSONEncoder -- it always
passes an encoding argument. Changing the base class of our encoder to whatever
Flask is using makes this more resilient.
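A minimal sketch of the idea, assuming the pre-2.0 Flask layout where
flask.json.JSONEncoder is whichever encoder Flask itself picked
(simplejson's or the stdlib's):

```python
from flask.json import JSONEncoder as FlaskJSONEncoder

class CustomJSONEncoder(FlaskJSONEncoder):
    # Subclassing Flask's encoder, rather than json.JSONEncoder directly,
    # keeps working whether Flask chose simplejson or the stdlib json.
    def default(self, obj):
        if hasattr(obj, "isoformat"):  # hypothetical extra type handling
            return obj.isoformat()
        return super().default(obj)
```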
This PR refactors the airflow plugins command to support the
'output' parameter, which allows users to get output in the form of a
table, JSON, or YAML.
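Conceptually, the command renders the same rows in whichever format was
requested - something like this hypothetical sketch (the real command
uses Airflow's shared CLI output formatting, not this code):

```python
import json

import yaml  # PyYAML, assumed to be available

def render(rows, output="table"):
    if output == "json":
        return json.dumps(rows, indent=2)
    if output == "yaml":
        return yaml.safe_dump(rows)
    # naive table rendering, for illustration only
    headers = list(rows[0])
    lines = [" | ".join(headers)]
    lines += [" | ".join(str(row[h]) for h in headers) for row in rows]
    return "\n".join(lines)
```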
* Install airflow and providers from dist and verify them
This check is there to prevent problems similar to those reported
in #13027 and fixed in #13031.
Previously we always built airflow from wheels; only providers were
installed from sdist packages and tested. In this version, both
airflow and providers are installed using the same package format
(sdist or wheel).
* Update scripts/in_container/entrypoint_ci.sh
Co-authored-by: Kaxil Naik <kaxilnaik@gmail.com>
* Changes release image preparation to use PyPI packages
Since we have now released all the provider packages to PyPI in
RC versions, we can change the mechanism for preparing the
production image to use released packages in the case of tagged builds.
The "branch" production images are still prepared using the
CI images and .whl packages built from sources, but the
release images are built from officially released PyPI
packages.
Also, some corrections and updates were made to the release process:
* the constraint tags should carry the rcN suffix when an RC
candidate is sent out
* there was a missing step about pushing the release tag once the
release is out
* pushing the tag to GitHub should be done after the PyPI packages
are uploaded, so that automated image building in DockerHub
can use those packages
* added a note that in case we release some provider
packages that depend on the just-released airflow version,
they should be released after airflow is in PyPI but before
the tag is pushed to GitHub (also to allow the image to be
built automatically from the released packages)
Fixes: #12970
* Update dev/README_RELEASE_AIRFLOW.md
Co-authored-by: Ash Berlin-Taylor <ash_github@firemirror.com>