[AIRFLOW-6763] Make systems tests ready for backport tests (#7389)
We will run system tests on back-ported operators for the 1.10.* series of Airflow, and for that we need support for running system tests using pytest markers and for reading environment variables passed from the HOST machine (to pass credentials). This is the first step to automate system test execution.
Parent: 6502cfa8e6
Commit: 848fbab5bd

BREEZE.rst: 78 lines changed

@@ -206,6 +206,12 @@ run tests immediately.

You can `set up autocomplete <#setting-up-autocomplete>`_ for commands and add the
checked-out Airflow repository to your PATH to run Breeze without the ``./`` and from any directory.

When you enter the Breeze environment, an environment file is automatically sourced from
``files/airflow-breeze-config/variables.env``. The ``files`` folder from your local sources is
automatically mounted to the container under the ``/files`` path, and you can put there any files you
want to make available to the Breeze container.
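
For example, a minimal ``files/airflow-breeze-config/variables.env`` might look like this (the
variable names and values below are only illustrative):

.. code-block:: bash

    # Sourced automatically when you enter the Breeze environment.
    export GOOGLE_APPLICATION_CREDENTIALS="/files/airflow-breeze-config/keys/my-service-account.json"
    export SYSTEM_TESTS_BUCKET="my-test-bucket"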

Stopping Breeze
---------------

@@ -610,8 +616,14 @@ This is the current syntax for `./breeze <./breeze>`_:

-S, --static-check <STATIC_CHECK>
Run selected static checks for currently changed files. You should specify the static check that
you would like to run or 'all' to run all checks. One of
[ all all-but-pylint bat-tests check-apache-license check-executables-have-shebangs check-hooks-apply check-merge-conflict check-xml debug-statements doctoc detect-private-key end-of-file-fixer flake8 forbid-tabs insert-license lint-dockerfile mixed-line-ending mypy pylint pylint-test setup-order shellcheck].
you would like to run or 'all' to run all checks. One of:

all all-but-pylint bat-tests check-apache-license
check-executables-have-shebangs check-hooks-apply check-merge-conflict
check-xml debug-statements doctoc detect-private-key end-of-file-fixer flake8
forbid-tabs insert-license lint-dockerfile mixed-line-ending mypy pylint
pylint-test setup-order shellcheck

You can pass extra arguments including options to the pre-commit framework as
<EXTRA_ARGS> passed after --. For example:

@@ -624,8 +636,14 @@ This is the current syntax for `./breeze <./breeze>`_:

-F, --static-check-all-files <STATIC_CHECK>
Run selected static checks for all applicable files. You should specify the static check that
you would like to run or 'all' to run all checks. One of
[ all all-but-pylint bat-tests check-apache-license check-executables-have-shebangs check-hooks-apply check-merge-conflict check-xml debug-statements doctoc detect-private-key end-of-file-fixer flake8 forbid-tabs insert-license lint-dockerfile mixed-line-ending mypy pylint pylint-test setup-order shellcheck].
you would like to run or 'all' to run all checks. One of:

all all-but-pylint bat-tests check-apache-license
check-executables-have-shebangs check-hooks-apply check-merge-conflict
check-xml debug-statements doctoc detect-private-key end-of-file-fixer flake8
forbid-tabs insert-license lint-dockerfile mixed-line-ending mypy pylint
pylint-test setup-order shellcheck

You can pass extra arguments including options to the pre-commit framework as
<EXTRA_ARGS> passed after --. For example:

@@ -660,17 +678,26 @@ This is the current syntax for `./breeze <./breeze>`_:

-P, --python <PYTHON_VERSION>
Python version used for the image. This is always major/minor version.
One of [ 3.6 3.7 ]. Default is the python3 or python on the path.
One of:

3.6 3.7

-B, --backend <BACKEND>
Backend to use for tests - it determines which database is used.
One of [ sqlite mysql postgres ]. Default: sqlite
One of:

sqlite mysql postgres

Default: sqlite

-I, --integration <INTEGRATION>
Integration to start during tests - it determines which integrations are started for integration
tests. There can be more than one integration started, or all to start all integrations.
Selected integrations are not saved for future execution.
One of [ cassandra kerberos mongo openldap rabbitmq redis all ].
One of:

cassandra kerberos mongo openldap rabbitmq redis all

*********************************************************************************************************
**
@@ -697,11 +724,19 @@ This is the current syntax for `./breeze <./breeze>`_:

-M, --kubernetes-mode <KUBERNETES_MODE>
Kubernetes mode - only used in case --start-kind-cluster flag is specified.
One of [ persistent_mode git_mode ]. Default: git_mode
One of:

persistent_mode git_mode

Default: git_mode

-V, --kubernetes-version <KUBERNETES_VERSION>
Kubernetes version - only used in case --start-kind-cluster flag is specified.
One of [ v1.15.3 v1.16.2 ]. Default: v1.15.3
One of:

v1.15.3 v1.16.2

Default: v1.15.3

*********************************************************************************************************
**
@@ -713,6 +748,21 @@ This is the current syntax for `./breeze <./breeze>`_:

Skips mounting local volume with sources - you get exactly what is in the
docker image rather than your current local sources of airflow.

*********************************************************************************************************
**
** Install Airflow if different than current
**
*********************************************************************************************************

-N, --install-airflow-version <INSTALL_AIRFLOW_VERSION>
If different than 'current', removes the source-installed airflow and installs a
released version of Airflow instead. One of:

current 1.10.9 1.10.8 1.10.7 1.10.6 1.10.5 1.10.4 1.10.3 1.10.2 1.10.1

Default: current.

*********************************************************************************************************
**
** Assume answers to questions
@@ -725,6 +775,16 @@ This is the current syntax for `./breeze <./breeze>`_:

-n, --assume-no
Assume 'no' answer to all questions.

*********************************************************************************************************
**
** Credentials
**
*********************************************************************************************************

-f, --forward-credentials
Forwards host credentials to docker container. Use with care as it will make your credentials
available to everything you install in Docker.

*********************************************************************************************************
**
** Increase verbosity of the script


TESTING.rst: 392 lines changed

@@ -25,36 +25,36 @@ Airflow Test Infrastructure

and local virtualenv.

* **Integration tests** are available in the Breeze development environment
that is also used for Airflow Travis CI tests. Integration test are special tests that require
additional services running - such as Postgres/Mysql/Kerberos etc. Those tests are not yet
clearly marked as integration tests but soon they will be clearly separated by pytest annotations.
that is also used for Airflow Travis CI tests. Integration tests are special tests that require
additional services running, such as Postgres, MySQL, Kerberos, etc. Currently, these tests are not
marked as integration tests but soon they will be clearly separated by ``pytest`` annotations.

* **System tests** are automatic tests that use external systems like
Google Cloud Platform. These tests are intended for an end-to-end DAG execution.
Note that automated execution of these tests is still
`work in progress <https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-4+Support+for+System+Tests+for+external+systems#app-switcher>`_.
The tests can be executed on both the current version of Apache Airflow and any of the older
versions from the 1.10.* series.

This document is about running python tests, before the tests are run we also use
`static code checks <STATIC_CODE_CHECKS.rst>`__ which allow to catch typical errors in code
before tests are executed.
This document is about running Python tests. Before the tests are run, use
`static code checks <STATIC_CODE_CHECKS.rst>`__ that enable catching typical errors in the code.

Airflow Unit Tests
==================

All tests for Apache Airflow are run using `pytest <http://doc.pytest.org/en/latest/>`_ .

Writing unit tests
Writing Unit Tests
------------------

There are a few guidelines that you should follow when writing unit tests:
Follow these guidelines when writing unit tests:

* Standard unit tests that do not require integrations with external systems should mock all communication
* All our tests are run with pytest make sure you set your IDE/runners (see below) to use pytest by default
* For new tests we should use standard "asserts" of python and pytest decorators/context managers for testing
rather than unittest ones. Look at `Pytest docs <http://doc.pytest.org/en/latest/assert.html>`_ for details.
* We use parameterized framework for tests that have variations in parameters
* We plan to convert all unittests to standard "asserts" semi-automatically but this will be done later
in Airflow 2.0 development phase. That will include setUp/tearDown/context managers and decorators
* For standard unit tests that do not require integrations with external systems, make sure to simulate all communications.
* All Airflow tests are run with ``pytest``. Make sure to set your IDE/runners (see below) to use ``pytest`` by default.
* For new tests, use standard "asserts" of Python and ``pytest`` decorators/context managers for testing
  rather than ``unittest`` ones. See `Pytest docs <http://doc.pytest.org/en/latest/assert.html>`_ for details.
* Use a parameterized framework for tests that have variations in parameters.

**NOTE:** We plan to convert all unit tests to standard "asserts" semi-automatically but this will be done later
in the Airflow 2.0 development phase. That will include setUp/tearDown/context managers and decorators.

Running Unit Tests from IDE
---------------------------

@@ -72,68 +72,65 @@ and run unit tests as follows:

:align: center
:alt: Running unit tests

Note that you can run the unit tests in the standalone local virtualenv
**NOTE:** You can run the unit tests in the standalone local virtualenv
(with no Breeze installed) if they do not have dependencies such as
Postgres/MySQL/Hadoop/etc.


Running Unit Tests
------------------

To run unit, integration and system tests from the Breeze and your
virtualenv you can use `pytest <http://doc.pytest.org/en/latest/>`_ framework.
To run unit, integration, and system tests from the Breeze environment and your
virtualenv, you can use the `pytest <http://doc.pytest.org/en/latest/>`_ framework.

Custom pytest plugin run ``airflow db init`` and ``airflow db reset`` the first
time you launch them, so you can count on the database being initialized. Currently,
when you run tests not supported **in the local virtualenv, the tests may either fail
A custom ``pytest`` plugin runs ``airflow db init`` and ``airflow db reset`` the first
time you run tests. So, you can count on the database being initialized. Currently,
when you run tests not supported **in the local virtualenv, they may either fail
or provide an error message**.

There are many available options for selecting specific test in pytest. Details could be found
in official documentation but here are few basic examples:
There are many available options for selecting a specific test in ``pytest``. Details can be found
in the official documentation but here are a few basic examples:

.. code-block:: bash

    pytest -k "TestCore and not check"

This will run ``TestCore`` class but will skip tests of this class that includes 'check' in their names.
For better performance (due to test collection) you should do:
This runs the ``TestCore`` class but skips tests of this class that include 'check' in their names.
For better performance (due to test collection), run:

.. code-block:: bash

    pytest tests/test_core.py -k "TestCore and not bash"

This flag is useful when used like this:
This flag is useful when used to run a single test like this:

.. code-block:: bash

    pytest tests/test_core.py -k "test_check_operators"

to run single test. This can also be done by specifying full path to the test:
This can also be done by specifying a full path to the test:

.. code-block:: bash

    pytest tests/test_core.py::TestCore::test_check_operators

To run whole test class:
To run the whole test class, enter:

.. code-block:: bash

    pytest tests/test_core.py::TestCore

You can use all available pytest flags, for example to increase log level
for debugging purposes:
You can use all available ``pytest`` flags. For example, to increase the log level
for debugging purposes, enter:

.. code-block:: bash

    pytest --log-level=DEBUG tests/test_core.py::TestCore

**Note:** We do not provide a clear distinction between tests
(Unit/Integration/System tests), but we are working on it.


Running Tests for a Specified Target using Breeze from the host
Running Tests for a Specified Target Using Breeze from the Host
---------------------------------------------------------------

If you wish to only run tests and not to drop into shell, you can do this by providing the
If you wish to only run tests and not to drop into the shell, apply the
``-t``, ``--test-target`` flag. You can add extra pytest flags after ``--`` in the command line.

.. code-block:: bash

@@ -156,24 +153,24 @@ You can also specify individual tests or a group of tests:

Airflow Integration Tests
=========================

Some of the tests in Airflow are Integration tests. Those tests require not only airflow-testing docker
image but also extra images with integrations (such as redis/mongodb etc.).
Some of the tests in Airflow are integration tests. These tests require not only the ``airflow-testing`` Docker
image but also extra images with integrations (such as ``redis``, ``mongodb``, etc.).


Enabling integrations
Enabling Integrations
---------------------

Running Airflow integration tests cannot be run in local virtualenv. They can only run in Breeze
Airflow integration tests cannot be run in the local virtualenv. They can only run in the Breeze
environment with enabled integrations and in Travis CI.

When you are in Breeze environment, by default all integrations are disabled - this way only true unit tests
can be executed in Breeze. You can enable the integration by passing ``--integration <INTEGRATION>``
When you are in the Breeze environment, by default all integrations are disabled. This enables only true unit tests
to be executed in Breeze. You can enable an integration by passing the ``--integration <INTEGRATION>``
switch when starting Breeze. You can specify multiple integrations by repeating the ``--integration`` switch
or by using ``--integration all`` switch which enables all integrations.
or by using the ``--integration all`` switch that enables all integrations.

Note, that every integration requires separate container with the corresponding integration image,
so they take precious resources on your PC - mainly memory. The integrations started are not stopped
until you stop the Breeze environment with ``--stop-environment`` switch.
NOTE: Every integration requires a separate container with the corresponding integration image.
So, they take precious resources on your PC - mainly memory. The started integrations are not stopped
until you stop the Breeze environment with the ``--stop-environment`` switch.

The following integrations are available:

@@ -196,40 +193,40 @@ The following integrations are available:

* - redis
  - Integration required for Celery executor tests

Below command starts mongo integration only:
To start the ``mongo`` integration only, enter:

.. code-block:: bash

    ./breeze --integration mongo

Below command starts mongo and cassandra integrations:
To start ``mongo`` and ``cassandra`` integrations, enter:

.. code-block:: bash

    ./breeze --integration mongo --integration cassandra

Below command starts all integrations:
To start all integrations, enter:

.. code-block:: bash

    ./breeze --integration all

In the CI environment integrations can be enabled by specifying ``ENABLED_INTEGRATIONS`` variable
storing space-separated list of integrations to start. Thanks to that we can run integration and
integration-less tests separately in different jobs which is desired from the memory usage point of view.
In the CI environment, integrations can be enabled by specifying the ``ENABLED_INTEGRATIONS`` variable
storing a space-separated list of integrations to start. Thanks to that, we can run integration and
integration-less tests separately in different jobs, which is desired from the memory usage point of view.
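
For example, a CI job that should run only the ``mongo`` and ``rabbitmq`` integration tests could set
the variable before the tests start (the concrete values here are illustrative):

.. code-block:: bash

    export ENABLED_INTEGRATIONS="mongo rabbitmq"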

Note that Kerberos is a special kind of integration. There are some tests that run differently when
Kerberos integration is enabled (they retrieve and use Kerberos authentication token) and differently when the
Kerberos integration is disabled (they do not retrieve nor use the token). Therefore one of the test job
for the CI system should run all tests with kerberos integration enabled to test both scenarios.
Kerberos integration is enabled (they retrieve and use a Kerberos authentication token) and differently when the
Kerberos integration is disabled (they neither retrieve nor use the token). Therefore, one of the test jobs
for the CI system should run all tests with the Kerberos integration enabled to test both scenarios.

Running integration tests
Running Integration Tests
-------------------------

All tests that are using an integration are marked with custom pytest marker ``pytest.mark.integration``.
The marker has single parameter - name of the integration.
All tests using an integration are marked with a custom pytest marker ``pytest.mark.integration``.
The marker has a single parameter - the name of an integration.

Example redis-integration test:
Example of the ``redis`` integration test:

.. code-block:: python

@@ -241,53 +238,56 @@ Example redis-integration test:

        self.assertTrue(redis.ping(), 'Connection to Redis with PING works.')

The markers can be specified at the test level or at the class level (then all tests in this class
require the integration). You can add multiple markers with different integrations for tests that
require an integration). You can add multiple markers with different integrations for tests that
require more than one integration.
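
For instance, a test that needs both Redis and Cassandra could stack the markers like this
(a minimal sketch; the class and test names are illustrative):

.. code-block:: python

    import unittest

    import pytest


    @pytest.mark.integration("redis")
    @pytest.mark.integration("cassandra")
    class TestRedisAndCassandra(unittest.TestCase):
        def test_both_services_reachable(self):
            # Runs only when both integrations are enabled in Breeze.
            ...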

The behaviour of such marked tests is that it is skipped in case required integration is not enabled.
The skip message will clearly say what's needed in order to use that tests.
If such a marked test does not have a required integration enabled, it is skipped.
The skip message clearly says what is needed to use the test.

You can run all tests that are using certain integration with the custom pytest flag ``--integrations``,
where you can pass integrations as comma separated values. You can also specify ``all`` in order to start
tests for all integrations. Note that if an integration is not enabled in Breeze or CI.
To run all tests with a certain integration, use the custom pytest flag ``--integrations``,
where you can pass integrations as comma-separated values. You can also specify ``all`` to start
tests for all integrations.

Example that runs only ``mongo`` integration tests:
**NOTE:** If an integration is not enabled in Breeze or Travis CI,
the affected test will be skipped.

To run only ``mongo`` integration tests:

.. code-block:: bash

    pytest --integrations mongo

Example that runs integration tests fot ``mogo`` and ``rabbitmq``:
To run integration tests for ``mongo`` and ``rabbitmq``:

.. code-block:: bash

    pytest --integrations mongo,rabbitmq

Example that runs all integration tests:
To run all integration tests:

.. code-block:: bash

    pytest --integrations all

Note that collecting all tests takes quite some time, so if you know where your tests are located you can
speed up test collection significantly by providing the folder where the tests are located.
Note that collecting all tests takes some time. So, if you know where your tests are located, you can
speed up the test collection significantly by providing the folder where the tests are located.

Here is an example of collection limited only to apache providers directory:
Here is an example of the collection limited to the ``providers/apache`` directory:

.. code-block:: bash

    pytest --integrations cassandra tests/providers/apache/

Running backend-specific tests
Running Backend-Specific Tests
------------------------------

Some tests that are using a specific backend are marked with custom pytest marker ``pytest.mark.backend``.
The marker has single parameter - name of the backend. It correspond with the ``--backend`` switch of
the Breeze environment (one of ``mysql``, ``sqlite``, ``postgres``). Those tests will only run when
the Breeze environment is running with the right backend. You can specify more than one backend
in the marker - then the test will run for all those backends specified.
Tests that are using a specific backend are marked with a custom pytest marker ``pytest.mark.backend``.
The marker has a single parameter - the name of a backend. It corresponds to the ``--backend`` switch of
the Breeze environment (one of ``mysql``, ``sqlite``, or ``postgres``). Backend-specific tests only run when
the Breeze environment is running with the right backend. If you specify more than one backend
in the marker, the test runs for all specified backends.

Example postgres-only test:
Example of the ``postgres`` only test:

.. code-block:: python

@@ -296,7 +296,7 @@ Example postgres-only test:

    ...

Example postgres,mysql test (they are skipped with sqlite backend):
Example of the ``postgres,mysql`` test (they are skipped with the ``sqlite`` backend):

.. code-block:: python

@@ -305,8 +305,8 @@ Example postgres,mysql test (they are skipped with sqlite backend):

    ...

You can use custom ``--backend`` switch in pytest to only run tests specific for that backend.
Here is an example of only running postgres-specific backend tests:
You can use the custom ``--backend`` switch in pytest to only run tests specific for that backend.
Here is an example of running only postgres-specific backend tests:

.. code-block:: bash

@@ -315,74 +315,74 @@ Here is an example of only running postgres-specific backend tests:
Running Tests with Kubernetes
-----------------------------

Starting Kubernetes Cluster when starting Breeze
Starting Kubernetes Cluster when Starting Breeze
................................................

In order to run Kubernetes in Breeze you can start Breeze with ``--start-kind-cluster`` switch. This will
automatically create a Kind Kubernetes cluster in the same ``docker`` engine that is used to run Breeze
To run Kubernetes in Breeze, you can start Breeze with the ``--start-kind-cluster`` switch. This
automatically creates a Kind Kubernetes cluster in the same ``docker`` engine that is used to run Breeze.
Setting up the Kubernetes cluster takes some time so the cluster continues running
until the cluster is stopped with ``--stop-kind-cluster`` switch or until ``--recreate-kind-cluster``
switch is used rather than ``--start-kind-cluster``. Starting breeze with kind cluster automatically
until it is stopped with the ``--stop-kind-cluster`` switch or until the ``--recreate-kind-cluster``
switch is used rather than ``--start-kind-cluster``. Starting Breeze with the Kind Cluster automatically
sets ``runtime`` to ``kubernetes`` (see below).

The cluster name follows the pattern ``airflow-python-X.Y.Z-vA.B.C`` where X.Y.Z is Python version
and A.B.C is kubernetes version. This way you can have multiple clusters setup and running at the same
time for different python versions and different kubernetes versions.
The cluster name follows the pattern ``airflow-python-X.Y.Z-vA.B.C`` where X.Y.Z is a Python version
and A.B.C is a Kubernetes version. This way you can have multiple clusters set up and running at the same
time for different Python versions and different Kubernetes versions.
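
For example, assuming Python 3.6 and Kubernetes v1.15.3, listing the clusters with Kind would show
a name like this (illustrative output):

.. code-block:: bash

    kind get clusters
    # airflow-python-3.6-v1.15.3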

The Control Plane is available from inside the docker image via ``<CLUSTER_NAME>-control-plane:6443``
host:port, the worker of the kind cluster is available at <CLUSTER_NAME>-worker
The Control Plane is available from inside the Docker image via the ``<CLUSTER_NAME>-control-plane:6443``
host:port, the worker of the Kind Cluster is available at ``<CLUSTER_NAME>-worker``,
and the webserver port for the worker is 30809.

The Kubernetes Cluster is started but in order to deploy airflow to Kubernetes cluster you need to:
After the Kubernetes Cluster is started, you need to deploy Airflow to the cluster:

1. Build the image.
2. Load it to Kubernetes cluster.
3. Deploy airflow application.
2. Load it to the Kubernetes cluster.
3. Deploy the Airflow application.

It can be done with single script: ``./scripts/ci/in_container/kubernetes/deploy_airflow_to_kubernetes.sh``
It can be done with a single script: ``./scripts/ci/in_container/kubernetes/deploy_airflow_to_kubernetes.sh``.
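
A sketch of running it from inside the Breeze environment:

.. code-block:: bash

    ./scripts/ci/in_container/kubernetes/deploy_airflow_to_kubernetes.sh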

You can, however, work separately on the image in Kubernetes and on deploying the Airflow app in the cluster.

Building Airflow Images and Loading them to Kubernetes cluster
Building and Loading Airflow Images to Kubernetes Cluster
..............................................................

This is done using ``./scripts/ci/in_container/kubernetes/docker/rebuild_airflow_image.sh`` script:
Use the script ``./scripts/ci/in_container/kubernetes/docker/rebuild_airflow_image.sh`` that does the following:

1. Latest ``apache/airflow:master-pythonX.Y-ci`` images are rebuilt using latest sources.
2. New Kubernetes image based on the ``apache/airflow:master-pythonX.Y-ci`` is built with
   necessary scripts added to run in kubernetes. The image is tagged with
   ``apache/airflow:master-pythonX.Y-ci-kubernetes`` tag.
3. The image is loaded to the kind cluster using ``kind load`` command
1. Rebuilds the latest ``apache/airflow:master-pythonX.Y-ci`` images using the latest sources.
2. Builds a new Kubernetes image based on ``apache/airflow:master-pythonX.Y-ci`` with the
   necessary scripts added to run in Kubernetes. The image is tagged as
   ``apache/airflow:master-pythonX.Y-ci-kubernetes``.
3. Loads the image to the Kind Cluster using the ``kind load`` command.
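
Step 3 corresponds to a ``kind load`` invocation along these lines (illustrative; the script handles
the exact image tag and cluster name):

.. code-block:: bash

    kind load docker-image apache/airflow:master-python3.6-ci-kubernetes --name airflow-python-3.6-v1.15.3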

Deploying Airflow Application in the Kubernetes cluster
Deploying the Airflow Application in the Kubernetes Cluster
............................................................

This is done using ``./scripts/ci/in_container/kubernetes/app/deploy_app.sh`` script:
Use the script ``./scripts/ci/in_container/kubernetes/app/deploy_app.sh`` that does the following:

1. Kubernetes resources are prepared by processing template from ``template`` directory, replacing
1. Prepares Kubernetes resources by processing a template from the ``template`` directory and replacing
   variables with the right images and locations:

   - configmaps.yaml
   - airflow.yaml

2. The existing resources are used without replacing any variables inside:
2. Uses the existing resources without replacing any variables inside:

   - secrets.yaml
   - postgres.yaml
   - volumes.yaml

3. All the resources are applied in the Kind cluster
4. The script will wait until all the applications are ready and reachable
3. Applies all the resources to the Kind Cluster.
4. Waits for all the applications to be ready and reachable.
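
Once the script reports success, a quick sanity check from inside the container might look like this
(an illustrative command, not part of the script):

.. code-block:: bash

    # Verify that the Airflow and Postgres pods are up.
    kubectl get pods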

After the deployment is finished you can run Kubernetes tests immediately in the same way as other tests.
The Kubernetes tests are in ``tests/runtime/kubernetes`` folder.
After the deployment is finished, you can run Kubernetes tests immediately in the same way as other tests.
The Kubernetes tests are available in the ``tests/runtime/kubernetes`` folder.

You can run all the integration tests for Kubernetes with ``pytest tests/runtime/kubernetes``.


Running runtime-specific tests
Running Runtime-Specific Tests
------------------------------

Some tests that are using a specific runtime are marked with custom pytest marker ``pytest.mark.runtime``.
The marker has single parameter - name of the runtime. For the moment the only supported runtime is
``kubernetes``. This runtime is set when you run Breeze with ``--start-kind-cluster`` option.
Those tests will only run when the selectd runtime is started.
Tests using a specific runtime are marked with a custom pytest marker ``pytest.mark.runtime``.
The marker has a single parameter - the name of a runtime. At the moment the only supported runtime is
``kubernetes``. This runtime is set when you run Breeze with the ``--start-kind-cluster`` option.
Runtime-specific tests run only when the selected runtime is started.

.. code-block:: python

@@ -391,16 +391,16 @@ Those tests will only run when the selectd runtime is started.

    class TestKubernetesExecutor(unittest.TestCase):

You can use custom ``--runtime`` switch in pytest to only run tests specific for that backend.
You can use the custom ``--runtime`` switch in pytest to only run tests specific to that runtime.

Here is an example of only running kubernetes-runtime backend tests:
To run only kubernetes-runtime tests, enter:

.. code-block:: bash

    pytest --runtime kubernetes

Note! For convenience and faster search, all runtime tests are stored in ``tests.runtime`` package. You
can speed up collection of tests in this case by:
**NOTE:** For convenience and faster search, all runtime tests are stored in the ``tests.runtime`` package. In this case, you
can speed up the collection of tests by running:

.. code-block:: bash

@@ -474,21 +474,99 @@ More information:
Airflow System Tests
====================

The System tests for Airflow are not yet fully implemented. They are Work In Progress of the
`AIP-4 Support for System Tests for external systems <https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-4+Support+for+System+Tests+for+external+systems>`__.
These tests need to communicate with external services/systems that are available
System tests need to communicate with external services/systems that are available
if you have appropriate credentials configured for your tests.
The tests derive from ``tests.system_test_class.SystemTests`` class.
The system tests derive from the ``tests.test_utils.system_test_class.SystemTests`` class. They should also
be marked with ``@pytest.mark.system(SYSTEM)`` where ``SYSTEM`` designates the system
to be tested (for example, ``google.cloud``). These tests are skipped by default.
You can execute the system tests by providing the ``--systems SYSTEMS`` flag to ``pytest``.

The system tests execute a specified
example DAG file that runs the DAG end-to-end.
The system tests execute a specified example DAG file that runs the DAG end-to-end.

An example of such a system test is
``airflow.tests.providers.google.operators.test_natural_language_system.CloudNaturalLanguageExampleDagsTest``.
See more details about adding new system tests below.

For now you can execute the system tests and follow messages printed to get them running. Soon more information on
running the tests will be available.
Running System Tests
--------------------

**Prerequisites:** You may need to set some variables to run system tests. If you need to
add some initialization of environment variables to Breeze, you can always add the
``files/airflow-breeze-config/variables.env`` file. It will be automatically
sourced when entering the Breeze environment.

To execute system tests, specify the ``--systems SYSTEMS``
flag where ``SYSTEMS`` is a comma-separated list of systems to run the system tests for.
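
For example, to run only the system tests for Google Cloud operators (the test path here is illustrative):

.. code-block:: bash

    pytest --systems google.cloud tests/providers/google/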

Forwarding Authentication from the Host
---------------------------------------

For system tests, you can also forward authentication from the host to your Breeze container. You can specify
the ``--forward-credentials`` flag when starting Breeze. Then, it will also forward the most commonly used
credentials stored in your ``home`` directory. Use this feature with care as it makes your personal credentials
visible to anything that you have installed inside the Docker container.
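
For example, to enter Breeze with your host credentials forwarded:

.. code-block:: bash

    ./breeze --forward-credentials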

Currently forwarded credentials are:

* all credentials stored in ``${HOME}/.config`` (for example, GCP credentials)
* credentials stored in ``${HOME}/.gsutil`` for the ``gsutil`` tool from GCS
* credentials stored in ``${HOME}/.boto`` and ``${HOME}/.s3`` (for AWS authentication)
* credentials stored in ``${HOME}/.docker`` for docker
* credentials stored in ``${HOME}/.kube`` for kubectl
* credentials stored in ``${HOME}/.ssh`` for SSH

Adding a New System Test
------------------------

We are working on automating system tests execution (AIP-4) but for now system tests are skipped when
tests are run in our CI system. To enable this future automation, we encourage you to add system
tests whenever an operator/hook/sensor is added or modified in a given system.

* To add your own system tests, derive them from the
  ``tests.test_utils.system_tests_class.SystemTest`` class and mark them with the
  ``@pytest.mark.system(SYSTEM_NAME)`` marker. The system name should follow the path defined in
  the ``providers`` package (for example, the system tests from the ``tests.providers.google.cloud``
  package should be marked with ``@pytest.mark.system("google.cloud")``).
* If your system tests need some credential files to be available for
  authentication with external systems, make sure to keep these credentials in the
  ``files/airflow-breeze-config/keys`` directory. Mark your tests with
  ``@pytest.mark.credential_file(<FILE>)`` so that they are skipped if such a credential file is not there.
  The tests should read the right credentials and authenticate on their own. The credentials are read
  in Breeze from the ``/files`` directory. The local ``files`` folder is mounted to the ``/files`` folder in Breeze.
* If your system tests are long-lasting ones (i.e., require more than 20-30 minutes
  to complete), mark them with the ``@pytest.mark.long_running`` marker.
  Such tests are skipped by default unless you specify the ``--long-lasting`` flag to pytest.
* The system test itself (the Python class) does not have any logic. Such a test runs
  the DAG specified by its ID. This DAG should contain the actual DAG logic
  to execute. Make sure to define the DAG in ``providers/<SYSTEM_NAME>/example_dags``. These example DAGs
  are also used as the source of code snippets when documentation is generated. So, having these
  DAGs runnable is a great way to make sure the documentation is describing a working example. Inside
  your test class/test method, simply use ``self.run_dag(<DAG_ID>,<DAG_FOLDER>)`` to run the DAG; see the
  sketch after this list. Then, the system class will take care of running the DAG. Note that the DAG_FOLDER should be
  a subdirectory of the ``tests.test_utils.AIRFLOW_MAIN_FOLDER`` + ``providers/<SYSTEM_NAME>/example_dags``.
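
A minimal sketch of such a test class following the conventions above (the DAG ID, key file name,
and folder layout are illustrative):

.. code-block:: python

    import os

    import pytest

    from tests.test_utils import AIRFLOW_MAIN_FOLDER
    from tests.test_utils.system_tests_class import SystemTest

    # Illustrative: the example DAGs folder for the google.cloud system.
    EXAMPLE_DAGS_FOLDER = os.path.join(
        AIRFLOW_MAIN_FOLDER, "airflow", "providers", "google", "cloud", "example_dags"
    )


    @pytest.mark.system("google.cloud")
    @pytest.mark.credential_file("gcp_natural_language.json")
    class CloudNaturalLanguageExampleDagsTest(SystemTest):
        def test_run_example_dag(self):
            # The test has no logic of its own; it just runs the example DAG end-to-end.
            self.run_dag("example_gcp_natural_language", EXAMPLE_DAGS_FOLDER)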

An example of a system test is available in:

``airflow.tests.providers.google.operators.test_natural_language_system.CloudNaturalLanguageExampleDagsTest``.

It runs the DAG defined in ``airflow.providers.google.cloud.example_dags.example_natural_language.py``.

Running Tests for Older Airflow Versions
----------------------------------------

The tests can be executed against the master version of Airflow but they also work
with older versions. This is especially useful to test back-ported operators
from Airflow 2.0 to 1.10.* versions.

To run the tests for the Airflow 1.10.* series, you need to run Breeze with
``--install-airflow-version=<VERSION>`` to install a different version of Airflow.
If ``current`` is specified (default), then the current version of Airflow is used.
Otherwise, the released version of Airflow is installed.
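
For example, to test against the released Airflow 1.10.9:

.. code-block:: bash

    ./breeze --install-airflow-version 1.10.9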

The commands make sure that the source version of master Airflow is removed and the released version of
Airflow from ``PyPI`` is installed. Note that test sources are not removed and they can be used
to run tests (unit tests and system tests) against the freshly installed version.

This works best for system tests: all the system tests should work for at least the latest released 1.10.x
Airflow version. Some of the unit and integration tests might also work in the same
fashion but it is not necessary or expected.

Local and Remote Debugging in IDE
=================================

@@ -518,14 +596,14 @@ your local sources to the ``/opt/airflow`` location of the sources within the co

:align: center
:alt: Source code mapping

DAG testing
DAG Testing
===========

To ease and speed up process of developing DAGs you can use
py:class:`~airflow.executors.debug_executor.DebugExecutor` - a single process executor
for debugging purposes. Using this executor you can run and debug DAGs from your IDE.
To ease and speed up the process of developing DAGs, you can use
py:class:`~airflow.executors.debug_executor.DebugExecutor`, which is a single-process executor
for debugging purposes. Using this executor, you can run and debug DAGs from your IDE.

**IDE setup steps:**
To set up the IDE:

1. Add a ``main`` block at the end of your DAG file to make it runnable.
It will run a backfill job:

@@ -537,16 +615,16 @@ It will run a backfill job:

    dag.run()

2. Setup ``AIRFLOW__CORE__EXECUTOR=DebugExecutor`` in run configuration of your IDE. In
this step you should also setup all environment variables required by your DAG.
2. Set up ``AIRFLOW__CORE__EXECUTOR=DebugExecutor`` in the run configuration of your IDE.
Make sure to also set up all environment variables required by your DAG.

3. Run and debug the DAG file.

Additionally ``DebugExecutor`` can be used in a fail-fast mode that will make
all other running or scheduled tasks fail immediately. To enable this option set
Additionally, ``DebugExecutor`` can be used in a fail-fast mode that will make
all other running or scheduled tasks fail immediately. To enable this option, set
``AIRFLOW__DEBUG__FAIL_FAST=True`` or adjust the ``fail_fast`` option in your ``airflow.cfg``.
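
For example, in a shell-based run configuration, both settings can be exported before starting
the DAG file (a sketch; the DAG file name is illustrative):

.. code-block:: bash

    export AIRFLOW__CORE__EXECUTOR=DebugExecutor
    export AIRFLOW__DEBUG__FAIL_FAST=True
    python my_dag_file.py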

Also, with the Airflow CLI command ``airflow dags test`` you can execute one complete run of a DAG:
Also, with the Airflow CLI command ``airflow dags test``, you can execute one complete run of a DAG:

.. code-block:: bash

@@ -554,52 +632,52 @@ Also, with the Airflow CLI command ``airflow dags test`` you can execute one com

    airflow dags test example_branch_operator 2018-01-01

BASH unit testing (BATS)
BASH Unit Testing (BATS)
========================

We have started to add tests to cover Bash scripts we have in our codeabase.
The tests are placed in ``tests\bats`` folder.
They require BAT CLI to be installed if you want to run them in your
host or via docker image.
We have started adding tests to cover Bash scripts we have in our codebase.
The tests are placed in the ``tests/bats`` folder.
They require the BATS CLI to be installed if you want to run them on your
host or via a Docker image.

BATS CLI installation
Installing BATS CLI
---------------------

You can find installation guide as well as information on how to write
the bash tests in [BATS installation](https://github.com/bats-core/bats-core#installation)
You can find an installation guide as well as information on how to write
the bash tests in `BATS Installation <https://github.com/bats-core/bats-core#installation>`_.

Running BATS tests in the host
Running BATS Tests on the Host
------------------------------

Running all tests:
To run all tests:

```
bats -r tests/bats/
```

Running single test:
To run a single test:

```
bats tests/bats/your_test_file.bats
```

Running BATS tests via docker
Running BATS Tests via Docker
-----------------------------

Running all tests:
To run all tests:

```
docker run -it --workdir /airflow -v $(pwd):/airflow bats/bats:latest -r /airflow/tests/bats
```

Running single test:
To run a single test:

```
docker run -it --workdir /airflow -v $(pwd):/airflow bats/bats:latest /airflow/tests/bats/your_test_file.bats
```

BATS usage
Using BATS
----------

You can read more about using BATS CLI and writing tests in:
[BATS usage](https://github.com/bats-core/bats-core#usage)
You can read more about using the BATS CLI and writing tests in
`BATS Usage <https://github.com/bats-core/bats-core#usage>`_.

breeze: 147 lines changed

@@ -84,6 +84,7 @@ function setup_default_breeze_variables() {

    # If true, the docker images are rebuilt locally.
    export NEEDS_DOCKER_BUILD="false"
    export NEEDS_DOCKER_BUILD="false"

    # By default we only pull images if we do not have them locally.
    # This can be overridden by -p flag
@@ -101,7 +102,14 @@ function setup_default_breeze_variables() {

    # By default we do not push images. This can be overridden by -u flag.
    export PUSH_IMAGES=${PUSH_IMAGES:="false"}

    # Determine version of the Airflow from version.py
    # Forward credentials to docker
    export FORWARD_CREDENTIALS="false"

    # If INSTALL_AIRFLOW_VERSION is set to a released version, the source-installed airflow
    # is removed and that version of airflow is installed from PyPI instead
    export INSTALL_AIRFLOW_VERSION=${INSTALL_AIRFLOW_VERSION:="current"}

    # Determine version of the Airflow from version.py
    AIRFLOW_VERSION=$(cat airflow/version.py - << EOF | python
print(version.replace("+",""))
EOF
@@ -123,6 +131,7 @@ EOF

    _BREEZE_DEFAULT_BACKEND="sqlite"
    _BREEZE_DEFAULT_KUBERNETES_MODE="git_mode"
    _BREEZE_DEFAULT_KUBERNETES_VERSION="v1.15.3"
    _BREEZE_DEFAULT_INSTALL_AIRFLOW_VERSION="current"

    STATIC_CHECK_PYTHON_VERSION=3.6
}
@@ -299,13 +308,14 @@ function print_badge {

          &&&&&&&&&&&&& && &&&&@ &&&&&&&&&&&@ &&&&&&&&&&&& @&&&&&&&&&&& &&&&&&&&&&&

Branch name: ${BRANCH_NAME}
Docker image: ${AIRFLOW_CI_IMAGE}
Airflow version: ${AIRFLOW_VERSION}
Python version: ${PYTHON_VERSION}
DockerHub user: ${DOCKERHUB_USER}
DockerHub repo: ${DOCKERHUB_REPO}
Backend: ${BACKEND}
Branch name: ${BRANCH_NAME}
Docker image: ${AIRFLOW_CI_IMAGE}
Airflow source version: ${AIRFLOW_VERSION}
Airflow installed: ${INSTALL_AIRFLOW_VERSION}
Python version: ${PYTHON_VERSION}
DockerHub user: ${DOCKERHUB_USER}
DockerHub repo: ${DOCKERHUB_REPO}
Backend: ${BACKEND}
EOF
    if [[ ${RUNTIME} == "kubernetes" ]]; then
        cat <<EOF
@@ -322,13 +332,14 @@ EOF

    else
        cat <<EOF

Branch name: ${BRANCH_NAME}
Docker image: ${AIRFLOW_CI_IMAGE}
Airflow version: ${AIRFLOW_VERSION}
Python version: ${PYTHON_VERSION}
DockerHub user: ${DOCKERHUB_USER}
DockerHub repo: ${DOCKERHUB_REPO}
Backend: ${BACKEND}
Branch name: ${BRANCH_NAME}
Docker image: ${AIRFLOW_CI_IMAGE}
Airflow source version: ${AIRFLOW_VERSION}
Airflow installed: ${INSTALL_AIRFLOW_VERSION}
Python version: ${PYTHON_VERSION}
DockerHub user: ${DOCKERHUB_USER}
DockerHub repo: ${DOCKERHUB_REPO}
Backend: ${BACKEND}
EOF
    if [[ ${RUNTIME} == "kubernetes" ]]; then
        cat <<EOF
@@ -363,6 +374,7 @@ export ENABLE_KIND_CLUSTER="${ENABLE_KIND_CLUSTER}"

export KUBERNETES_MODE="${KUBERNETES_MODE}"
export KUBERNETES_VERSION="${KUBERNETES_VERSION}"
export AIRFLOW_VERSION="${AIRFLOW_VERSION}"
export INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION}"
export RUN_TESTS="${TESTS}"
export WEBSERVER_HOST_PORT="${WEBSERVER_HOST_PORT}"
export POSTGRES_HOST_PORT="${POSTGRES_HOST_PORT}"
@@ -378,6 +390,8 @@ function prepare_command_files() {

    BACKEND_DOCKER_COMPOSE_FILE=${SCRIPTS_CI_DIR}/docker-compose/backend-${BACKEND}.yml
    LOCAL_DOCKER_COMPOSE_FILE=${SCRIPTS_CI_DIR}/docker-compose/local.yml
    KUBERNETES_DOCKER_COMPOSE_FILE=${SCRIPTS_CI_DIR}/docker-compose/runtime-kubernetes.yml
    REMOVE_SOURCES_DOCKER_COMPOSE_FILE=${SCRIPTS_CI_DIR}/docker-compose/remove-sources.yml
    FORWARD_CREDENTIALS_DOCKER_COMPOSE_FILE=${SCRIPTS_CI_DIR}/docker-compose/forward-credentials.yml

    COMPOSE_FILE=${MAIN_DOCKER_COMPOSE_FILE}:${BACKEND_DOCKER_COMPOSE_FILE}

@@ -385,6 +399,14 @@ function prepare_command_files() {

        COMPOSE_FILE=${COMPOSE_FILE}:${LOCAL_DOCKER_COMPOSE_FILE}
    fi

    if [[ ${FORWARD_CREDENTIALS} == "true" ]]; then
        COMPOSE_FILE=${COMPOSE_FILE}:${FORWARD_CREDENTIALS_DOCKER_COMPOSE_FILE}
    fi

    if [[ ${INSTALL_AIRFLOW_VERSION} != "current" ]]; then
        COMPOSE_FILE=${COMPOSE_FILE}:${REMOVE_SOURCES_DOCKER_COMPOSE_FILE}
    fi

    if [[ ${RUNTIME} == "kubernetes" ]]; then
        COMPOSE_FILE=${COMPOSE_FILE}:${KUBERNETES_DOCKER_COMPOSE_FILE}
    fi
@@ -510,6 +532,11 @@ function parse_arguments() {

            echo "Skip mounting local sources: ${SKIP_MOUNTING_LOCAL_SOURCES}"
            echo
            shift ;;
        -N|--install-airflow-version)
            INSTALL_AIRFLOW_VERSION="${2}"
            echo "Installs version of Airflow: ${INSTALL_AIRFLOW_VERSION}"
            echo
            shift 2 ;;
        -b|--build-only)
            COMMAND_TO_RUN="build_ci_images_only"
            # if you want to build an image - assume you want to build it :)
@@ -611,6 +638,11 @@ function parse_arguments() {

            echo
            COMMAND_TO_RUN="perform_initialize_local_virtualenv"
            shift ;;
        -f|--forward-credentials)
            echo "Forwarding credentials. Be careful as your credentials are available in the container!"
            echo
            export FORWARD_CREDENTIALS="true"
            shift 1 ;;
        -a|--setup-autocomplete)
            echo "Setting up autocomplete"
            echo
@@ -639,14 +671,14 @@ function parse_arguments() {

            export PYTHON_VERSION=${STATIC_CHECK_PYTHON_VERSION}
            export STATIC_CHECK="${2}"
            export STATIC_CHECK_ALL_FILES="false"
            export EXTRA_STATIC_CHECK_OPTIONS=("--show-diff-on-failure")
            EXTRA_STATIC_CHECK_OPTIONS+=("--show-diff-on-failure")
            shift 2 ;;
        -F|--static-check-all-files)
            COMMAND_TO_RUN="perform_static_checks"
            export PYTHON_VERSION=${STATIC_CHECK_PYTHON_VERSION}
            export STATIC_CHECK="${2}"
            export STATIC_CHECK_ALL_FILES="true"
            export EXTRA_STATIC_CHECK_OPTIONS=("--all-files" "--show-diff-on-failure")
            EXTRA_STATIC_CHECK_OPTIONS+=("--all-files" "--show-diff-on-failure")
            shift 2 ;;
        -O|--build-docs)
            COMMAND_TO_RUN="build_docs"
@@ -667,6 +699,22 @@ function parse_arguments() {

}

usage() {
    LIST_PREFIX=" "
    WIDTH=$((SEPARATOR_WIDTH>80 ? 80: SEPARATOR_WIDTH))
    ALLOWED_PYTHON_VERSIONS=$(echo "${_BREEZE_ALLOWED_PYTHON_VERSIONS=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_BACKENDS=$(echo "${_BREEZE_ALLOWED_BACKENDS=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_STATIC_CHECKS=$(echo "${_BREEZE_ALLOWED_STATIC_CHECKS=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_INTEGRATIONS=$(echo "${_BREEZE_ALLOWED_INTEGRATIONS=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_KUBERNETES_MODES=$(echo "${_BREEZE_ALLOWED_KUBERNETES_MODES=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_KUBERNETES_VERSIONS=$(echo "${_BREEZE_ALLOWED_KUBERNETES_VERSIONS=""}" | tr '\n' ' ' | \
        fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    ALLOWED_INSTALL_AIRFLOW_VERSIONS=$(echo "${_BREEZE_ALLOWED_INSTALL_AIRFLOW_VERSIONS=""}" | \
        tr '\n' ' ' | fold -w "${WIDTH}" -s | sed "s/^/${LIST_PREFIX}/")
    echo """

*********************************************************************************************************
@@ -721,8 +769,10 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

-S, --static-check <STATIC_CHECK>
Run selected static checks for currently changed files. You should specify the static check that
you would like to run or 'all' to run all checks. One of
[${_BREEZE_ALLOWED_STATIC_CHECKS:=}].
you would like to run or 'all' to run all checks. One of:

${ALLOWED_STATIC_CHECKS}

You can pass extra arguments including options to the pre-commit framework as
<EXTRA_ARGS> passed after --. For example:

@@ -735,8 +785,10 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

-F, --static-check-all-files <STATIC_CHECK>
Run selected static checks for all applicable files. You should specify the static check that
you would like to run or 'all' to run all checks. One of
[${_BREEZE_ALLOWED_STATIC_CHECKS:=}].
you would like to run or 'all' to run all checks. One of:

${ALLOWED_STATIC_CHECKS}

You can pass extra arguments including options to the pre-commit framework as
<EXTRA_ARGS> passed after --. For example:

@@ -771,17 +823,26 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

-P, --python <PYTHON_VERSION>
Python version used for the image. This is always major/minor version.
One of [${_BREEZE_ALLOWED_PYTHON_VERSIONS:=}]. Default is the python3 or python on the path.
One of:

${ALLOWED_PYTHON_VERSIONS}

-B, --backend <BACKEND>
Backend to use for tests - it determines which database is used.
One of [${_BREEZE_ALLOWED_BACKENDS:=}]. Default: ${_BREEZE_DEFAULT_BACKEND:=}
One of:

${ALLOWED_BACKENDS}

Default: ${_BREEZE_DEFAULT_BACKEND:=}

-I, --integration <INTEGRATION>
Integration to start during tests - it determines which integrations are started for integration
tests. There can be more than one integration started, or all to start all integrations.
Selected integrations are not saved for future execution.
One of [${_BREEZE_ALLOWED_INTEGRATIONS:=}].
One of:

${ALLOWED_INTEGRATIONS}

*********************************************************************************************************
**
@@ -808,11 +869,19 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

-M, --kubernetes-mode <KUBERNETES_MODE>
Kubernetes mode - only used in case --start-kind-cluster flag is specified.
One of [${_BREEZE_ALLOWED_KUBERNETES_MODES:=}]. Default: ${_BREEZE_DEFAULT_KUBERNETES_MODE:=}
One of:

${ALLOWED_KUBERNETES_MODES}

Default: ${_BREEZE_DEFAULT_KUBERNETES_MODE:=}

-V, --kubernetes-version <KUBERNETES_VERSION>
Kubernetes version - only used in case --start-kind-cluster flag is specified.
One of [${_BREEZE_ALLOWED_KUBERNETES_VERSIONS:=}]. Default: ${_BREEZE_DEFAULT_KUBERNETES_VERSION:=}
One of:

${ALLOWED_KUBERNETES_VERSIONS}

Default: ${_BREEZE_DEFAULT_KUBERNETES_VERSION:=}

*********************************************************************************************************
**
@@ -824,6 +893,21 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

Skips mounting local volume with sources - you get exactly what is in the
docker image rather than your current local sources of airflow.

*********************************************************************************************************
**
** Install Airflow if different than current
**
*********************************************************************************************************

-N, --install-airflow-version <INSTALL_AIRFLOW_VERSION>
If different than 'current', removes the source-installed airflow and installs a
released version of Airflow instead. One of:

${ALLOWED_INSTALL_AIRFLOW_VERSIONS}

Default: ${_BREEZE_DEFAULT_INSTALL_AIRFLOW_VERSION:=}.

*********************************************************************************************************
**
** Assume answers to questions
@@ -836,6 +920,16 @@ The swiss-knife-army tool for Airflow testings. It allows to perform various tes

-n, --assume-no
Assume 'no' answer to all questions.

*********************************************************************************************************
**
** Credentials
**
*********************************************************************************************************

-f, --forward-credentials
Forwards host credentials to docker container. Use with care as it will make your credentials
available to everything you install in Docker.

*********************************************************************************************************
**
** Increase verbosity of the script
@@ -975,6 +1069,7 @@ function check_and_save_all_params() {

check_and_save_allowed_param "BACKEND" "backend" "--backend"
check_and_save_allowed_param "KUBERNETES_MODE" "Kubernetes mode" "--kubernetes-mode"
check_and_save_allowed_param "KUBERNETES_VERSION" "Kubernetes version" "--kubernetes-version"
check_and_save_allowed_param "INSTALL_AIRFLOW_VERSION" "Install airflow version" "--install-airflow-version"

# Can't verify those
save_to_file DOCKERHUB_USER

breeze-complete (141 changes)
@@ -17,20 +17,57 @@

# specific language governing permissions and limitations
# under the License.

_BREEZE_ALLOWED_PYTHON_VERSIONS=" 3.6 3.7 "
_BREEZE_ALLOWED_BACKENDS=" sqlite mysql postgres "
_BREEZE_ALLOWED_INTEGRATIONS=" cassandra kerberos mongo openldap rabbitmq redis all "
_BREEZE_ALLOWED_KUBERNETES_MODES=" persistent_mode git_mode "
_BREEZE_ALLOWED_KUBERNETES_VERSIONS=" v1.15.3 v1.16.2 "
_BREEZE_ALLOWED_STATIC_CHECKS=" all all-but-pylint bat-tests check-apache-license check-executables-have-shebangs check-hooks-apply check-merge-conflict check-xml debug-statements doctoc detect-private-key end-of-file-fixer flake8 forbid-tabs insert-license lint-dockerfile mixed-line-ending mypy pylint pylint-test setup-order shellcheck"
_BREEZE_ALLOWED_PYTHON_VERSIONS="3.6 3.7"
_BREEZE_ALLOWED_BACKENDS="sqlite mysql postgres"
_BREEZE_ALLOWED_INTEGRATIONS="cassandra kerberos mongo openldap rabbitmq redis all"
_BREEZE_ALLOWED_KUBERNETES_MODES="persistent_mode git_mode"
_BREEZE_ALLOWED_KUBERNETES_VERSIONS="v1.15.3 v1.16.2"
_BREEZE_ALLOWED_INSTALL_AIRFLOW_VERSIONS=$(cat <<-EOF
current
1.10.9
1.10.8
1.10.7
1.10.6
1.10.5
1.10.4
1.10.3
1.10.2
1.10.1
EOF
)
_BREEZE_ALLOWED_STATIC_CHECKS=$(cat <<-EOF
all
all-but-pylint
bat-tests
check-apache-license
check-executables-have-shebangs
check-hooks-apply
check-merge-conflict
check-xml
debug-statements
doctoc
detect-private-key
end-of-file-fixer
flake8
forbid-tabs
insert-license
lint-dockerfile
mixed-line-ending
mypy
pylint
pylint-test
setup-order
shellcheck
EOF
)
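The `<<-EOF` form matters here: the `-` tells bash to strip leading tabs from the heredoc body, so the lists can be tab-indented in the script yet stored without leading whitespace. A minimal sketch (the variable name is illustrative):

    # Minimal sketch of the <<-EOF pattern used above (variable name illustrative).
    # The "-" lets the body be tab-indented in the script; leading tabs are stripped.
    LIST=$(cat <<-EOF
    one
    two
    EOF
    )
    echo "${LIST}"   # prints "one" and "two", one per line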
_BREEZE_DEFAULT_DOCKERHUB_USER="apache"
_BREEZE_DEFAULT_DOCKERHUB_REPO="airflow"

_BREEZE_SHORT_OPTIONS="
h P: B: I: K Z X
M: V:
s b O N
v y n q C A
s N: b O
v y n q C A f
r p R L u
c D: H: e a
t: d: k x: S: F:
@@ -39,8 +76,8 @@ t: d: k x: S: F:

_BREEZE_LONG_OPTIONS="
help python: backend: integration: start-kind-cluster recreate-kind-cluster stop-kind-cluster
kubernetes-mode: kubernetes-version:
skip-mounting-local-sources build-only build-docs
verbose assume-yes assume-no assume-quit toggle-suppress-cheatsheet toggle-suppress-asciiart
skip-mounting-local-sources install-airflow-version: build-only build-docs
verbose assume-yes assume-no assume-quit toggle-suppress-cheatsheet toggle-suppress-asciiart forward-credentials
force-build-images force-pull-images force-clean-images use-local-cache push-images
cleanup-image dockerhub-user: dockerhub-repo: initialize-local-virtualenv setup-autocomplete
test-target: docker-compose: stop-environment execute-command: static-check: static-check-all-files:
@@ -50,34 +87,48 @@ test-target: docker-compose: stop-environment execute-command: static-check: sta

_BREEZE_KNOWN_VALUES=""

function _get_known_values_breeze {
function _get_known_values_breeze() {
case "$1" in
-P | --python )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_PYTHON_VERSIONS} ;;
-B | --backend )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_BACKENDS} ;;
-I | --integration )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_INTEGRATIONS} ;;
-M | --kubernetes-mode )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_KUBERNETES_MODES} ;;
-V | --kubernetes-version )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_KUBERNETES_VERSIONS} ;;
-S | --static-check )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_STATIC_CHECKS} ;;
-F | --static-check-all-files )
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_STATIC_CHECKS} ;;
-d | --docker-compose )
# shellcheck disable=SC2034
if typeset -f "_docker_compose" > /dev/null; then
_docker_compose
fi
_BREEZE_KNOWN_VALUES="" ;;
-D | --dockerhub-user )
_BREEZE_KNOWN_VALUES="${_BREEZE_DEFAULT_DOCKERHUB_USER}" ;;
-H | --dockerhub-repo )
_BREEZE_KNOWN_VALUES="${_BREEZE_DEFAULT_DOCKERHUB_REPO}" ;;
*)
_BREEZE_KNOWN_VALUES=""
-P | --python)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_PYTHON_VERSIONS}
;;
-B | --backend)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_BACKENDS}
;;
-I | --integration)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_INTEGRATIONS}
;;
-M | --kubernetes-mode)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_KUBERNETES_MODES}
;;
-V | --kubernetes-version)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_KUBERNETES_VERSIONS}
;;
-S | --static-check)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_STATIC_CHECKS}
;;
-F | --static-check-all-files)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_STATIC_CHECKS}
;;
-N | --install-airflow-version)
_BREEZE_KNOWN_VALUES=${_BREEZE_ALLOWED_INSTALL_AIRFLOW_VERSIONS}
;;
-d | --docker-compose)
# shellcheck disable=SC2034
if typeset -f "_docker_compose" >/dev/null; then
_docker_compose
fi
_BREEZE_KNOWN_VALUES=""
;;
-D | --dockerhub-user)
_BREEZE_KNOWN_VALUES="${_BREEZE_DEFAULT_DOCKERHUB_USER}"
;;
-H | --dockerhub-repo)
_BREEZE_KNOWN_VALUES="${_BREEZE_DEFAULT_DOCKERHUB_REPO}"
;;
*)
_BREEZE_KNOWN_VALUES=""
;;
esac
}
@@ -103,11 +154,11 @@ function _build_options_breeze {

}

function _listcontains_breeze {
local WORD
for WORD in $1; do
local WORD
for WORD in $1; do
[[ ${WORD} = "$2" ]] && return 0
done
return 1
done
return 1
}
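A hypothetical call to the helper above (the list and word are illustrative); it succeeds when the second argument appears as a whole word in the first:

    # Assumed usage of the helper above: returns 0 when the word is in the space-separated list
    _listcontains_breeze "sqlite mysql postgres" "mysql" && echo "known backend"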
# A completion function for breeze

@@ -121,7 +172,7 @@ function _comp_breeze {

for OPTION in ${_BREEZE_SHORT_OPTIONS}
do
LAST_CHAR="${OPTION:$((${#OPTION}-1)):1}"
LAST_CHAR="${OPTION:$((${#OPTION} - 1)):1}"
GETOPT_OPTION='-'${OPTION//:/}
if [[ "${LAST_CHAR}" == ":" ]]; then
EXTRA_ARG_OPTIONS="${EXTRA_ARG_OPTIONS} ${GETOPT_OPTION}"

@@ -130,7 +181,7 @@ function _comp_breeze {

done
for OPTION in ${_BREEZE_LONG_OPTIONS}
do
LAST_CHAR="${OPTION:$((${#OPTION}-1)):1}"
LAST_CHAR="${OPTION:$((${#OPTION} - 1)):1}"
GETOPT_OPTION='--'${OPTION//:/}
ALL_OPTIONS="${ALL_OPTIONS} ${GETOPT_OPTION}"
if [[ "${LAST_CHAR}" == ":" ]]; then

@@ -138,9 +189,9 @@ function _comp_breeze {

fi
done

LAST_COMMAND_PREFIX="${COMP_WORDS[${#COMP_WORDS[@]}-1]}"
LAST_COMMAND_PREFIX="${COMP_WORDS[${#COMP_WORDS[@]} - 1]}"
if [[ ${#COMP_WORDS[@]} -gt 1 ]]; then
PREVIOUS_COMMAND="${COMP_WORDS[${#COMP_WORDS[@]}-2]}"
PREVIOUS_COMMAND="${COMP_WORDS[${#COMP_WORDS[@]} - 2]}"
else
PREVIOUS_COMMAND=""
fi
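The completion code leans on two expansions: a one-character substring starting at the last index (to see whether the getopt spec ends in a colon, meaning the option takes an argument), and `${OPTION//:/}` to strip that colon. A standalone sketch (variable names illustrative):

    # Assumed standalone illustration of the expansions used above
    OPT="python:"
    LAST_CHAR="${OPT:$((${#OPT} - 1)):1}"   # ":" - the option expects an argument
    GETOPT_OPTION="--${OPT//:/}"            # "--python" - colon stripped
    echo "${GETOPT_OPTION} takes arg: ${LAST_CHAR}"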
@@ -133,7 +133,6 @@ function print_info() {

# Simple (?) no-dependency needed Yaml PARSER
# From https://stackoverflow.com/questions/5014632/how-can-i-parse-a-yaml-file-from-a-linux-shell-script
function parse_yaml {

if [[ -z $1 ]]; then
echo "Please provide yaml filename as first parameter."
exit 1

@@ -168,6 +167,7 @@ function parse_yaml {

# from airflow-testing section to "-v" "volume mapping" series of options
function convert_docker_mounts_to_docker_params() {
ESCAPED_AIRFLOW_SOURCES=$(echo "${AIRFLOW_SOURCES}" | sed -e 's/[\/&]/\\&/g')
ESCAPED_HOME=$(echo "${HOME}" | sed -e 's/[\/&]/\\&/g')
# shellcheck disable=2046
while IFS= read -r LINE; do
echo "-v"

@@ -175,6 +175,7 @@ function convert_docker_mounts_to_docker_params() {

done < <(parse_yaml scripts/ci/docker-compose/local.yml COMPOSE_ | \
grep "COMPOSE_services_airflow-testing_volumes_" | \
sed "s/..\/..\/../${ESCAPED_AIRFLOW_SOURCES}/" | \
sed "s/\${HOME}/${ESCAPED_HOME}/" | \
sed "s/COMPOSE_services_airflow-testing_volumes_//" | \
sort -t "=" -k 1 -n | \
cut -d "=" -f 2- | \
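The escaping step exists because the paths are substituted into the replacement side of a sed expression, where `/` and `&` are special. A minimal sketch, assuming an illustrative path:

    # Assumed illustration of why the escaping above is needed
    SRC="/home/user/airflow"
    ESCAPED=$(echo "${SRC}" | sed -e 's/[\/&]/\\&/g')   # escape "/" and "&" for the replacement side
    echo "../../.." | sed "s/..\/..\/../${ESCAPED}/"    # prints /home/user/airflow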
@@ -811,7 +812,7 @@ function check_and_save_allowed_param {

_VARIABLE_DESCRIPTIVE_NAME="${2}"
_FLAG="${3}"
_ALLOWED_VALUES_ENV_NAME="_BREEZE_ALLOWED_${_VARIABLE_NAME}S"
_ALLOWED_VALUES=${!_ALLOWED_VALUES_ENV_NAME}
_ALLOWED_VALUES=" ${!_ALLOWED_VALUES_ENV_NAME//$'\n'/ } "
_VALUE=${!_VARIABLE_NAME}
if [[ ${_ALLOWED_VALUES:=} != *" ${_VALUE} "* ]]; then
echo >&2
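The new expansion flattens the newline-separated heredoc lists into a single space-separated, space-padded string, so the `*" value "*` glob can test membership. A standalone sketch (names illustrative):

    # Assumed illustration of the expansion used above
    ALLOWED=$'current\n1.10.9\n1.10.8'        # newline-separated, as the heredocs produce
    PADDED=" ${ALLOWED//$'\n'/ } "            # -> " current 1.10.9 1.10.8 "
    VALUE="1.10.9"
    [[ ${PADDED} == *" ${VALUE} "* ]] && echo "allowed value"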
@@ -833,7 +834,7 @@ function check_and_save_allowed_param {

}

function run_docs() {
docker run "${AIRFLOW_CONTAINER_EXTRA_DOCKER_FLAGS[@]}" -t \
docker run "${EXTRA_DOCKER_FLAGS[@]}" -t \
--entrypoint "/usr/local/bin/dumb-init" \
--env PYTHONDONTWRITEBYTECODE \
--env AIRFLOW_CI_VERBOSE="${VERBOSE}" \
@@ -42,12 +42,26 @@ export AIRFLOW_CI_VERBOSE=${VERBOSE}

# opposite - whether diagnostic messages should be silenced
export AIRFLOW_CI_SILENT=${AIRFLOW_CI_SILENT:="true"}

# Forwards host credentials to the container
export FORWARD_CREDENTIALS=${FORWARD_CREDENTIALS:="false"}

# Installs different airflow version than current from the sources
export INSTALL_AIRFLOW_VERSION=${INSTALL_AIRFLOW_VERSION:="current"}

if [[ ${MOUNT_LOCAL_SOURCES} == "true" ]]; then
DOCKER_COMPOSE_LOCAL=("-f" "${MY_DIR}/docker-compose/local.yml")
else
DOCKER_COMPOSE_LOCAL=()
fi

if [[ ${FORWARD_CREDENTIALS} == "true" ]]; then
DOCKER_COMPOSE_LOCAL+=("-f" "${MY_DIR}/docker-compose/forward-credentials.yml")
fi

if [[ ${INSTALL_AIRFLOW_VERSION} != "current" ]]; then
DOCKER_COMPOSE_LOCAL+=("-f" "${MY_DIR}/docker-compose/remove-sources.yml")
fi

echo
echo "Using docker image: ${AIRFLOW_CI_IMAGE} for docker compose runs"
echo
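Each optional behaviour is switched on by appending another compose file; docker-compose merges files passed with repeated `-f` flags left to right, so later files override or extend earlier ones. A hypothetical expansion of the array built above (the base file name is illustrative):

    # Assumed final command line built from the array above (base file name illustrative)
    docker-compose -f "${MY_DIR}/docker-compose/base.yml" "${DOCKER_COMPOSE_LOCAL[@]}" run airflow-testing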
@@ -55,6 +55,9 @@ services:

      - ENABLED_INTEGRATIONS
      - RUN_INTEGRATION_TESTS
      - BREEZE
      - INSTALL_AIRFLOW_VERSION
      - ENABLED_SYSTEMS
      - RUN_SYSTEM_TESTS
    volumes:
      # Pass docker to inside of the container so that Kind and Moto tests can use it.
      - /var/run/docker.sock:/var/run/docker.sock
@@ -0,0 +1,32 @@

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
---
version: "2.2"
services:
  airflow-testing:
    # Forwards local credentials to the docker image.
    # Useful for gcloud/aws/kubernetes etc. authorisation to be passed
    # to inside docker. Use with care - your credentials will be available to
    # everything you install in Docker.
    volumes:
      - ${HOME}/.config:/root/.config:cached
      - ${HOME}/.boto:/root/.boto:cached
      - ${HOME}/.docker:/root/.docker:cached
      - ${HOME}/.gsutil:/root/.gsutil:cached
      - ${HOME}/.kube:/root/.kube:cached
      - ${HOME}/.s3:/root/.s3:cached
      - ${HOME}/.ssh:/root/.ssh:cached
@@ -0,0 +1,28 @@

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
---
version: "2.2"
services:
  airflow-testing:
    # Replaces the mounted airflow sources with an empty folder so that
    # tests run against the installed airflow package rather than the
    # local sources.
    volumes:
      - ./empty:/opt/airflow/airflow:cached
      # necessary for system tests - we need to take example dags from there
      - ../../../airflow/providers:/providers:cached
@@ -202,7 +202,7 @@ function in_container_refresh_pylint_todo() {

export DISABLE_CHECKS_FOR_TESTS="missing-docstring,no-self-use,too-many-public-methods,protected-access,do-not-use-asserts"

function start_output_heartbeat() {
MESSAGE=${1:="Still working!"}
MESSAGE=${1:-"Still working!"}
INTERVAL=${2:=10}
echo
echo "Starting output heartbeat"
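The switch from `:=` to `:-` is not cosmetic: `${1:="default"}` tries to assign the default back to the positional parameter `$1`, which bash rejects, while `${1:-"default"}` only substitutes the value. A minimal sketch:

    set --                          # no positional arguments
    MSG=${1:-"Still working!"}      # substitutes the default without assigning - fine
    echo "${MSG}"
    # MSG=${1:="Still working!"}    # would fail: "$1: cannot assign in this way"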
@@ -281,6 +281,16 @@ function dump_container_logs() {

done
}

function dump_airflow_logs() {
echo "###########################################################################################"
echo " Dumping logs from all the airflow tasks"
echo "###########################################################################################"
pushd /root/airflow/ || exit 1
tar -czf "${1}" logs
popd || exit 1
echo "###########################################################################################"
}


function send_docker_logs_to_file_io() {
echo "##############################################################################"
@@ -297,6 +307,21 @@ function send_docker_logs_to_file_io() {

curl -F "file=@${DUMP_FILE}" https://file.io
}

function send_airflow_logs_to_file_io() {
echo "##############################################################################"
echo
echo " DUMPING LOG FILES FROM AIRFLOW AND SENDING THEM TO file.io"
echo
echo "##############################################################################"
DUMP_FILE=/tmp/$(date "+%Y-%m-%d")_airflow_${TRAVIS_BUILD_ID:="default"}_${TRAVIS_JOB_ID:="default"}.log.tar.gz
dump_airflow_logs "${DUMP_FILE}"
echo
echo " Logs saved to ${DUMP_FILE}"
echo
echo "##############################################################################"
curl -F "file=@${DUMP_FILE}" https://file.io
}


function dump_kind_logs() {
echo "###########################################################################################"

@@ -324,3 +349,12 @@ function send_kubernetes_logs_to_file_io() {

echo "##############################################################################"
curl -F "file=@${DUMP_DIR}.tar.gz" https://file.io
}

function install_released_airflow_version() {
pip uninstall apache-airflow -y || true
find /root/airflow/ -type f -print0 | xargs -0 rm -f
if [[ ${1} == "1.10.2" || ${1} == "1.10.1" ]]; then
export SLUGIFY_USES_TEXT_UNIDECODE=yes
fi
pip install "apache-airflow==${1}"
}
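A hypothetical call inside the container (the version string is illustrative). Note the function exports SLUGIFY_USES_TEXT_UNIDECODE for the two oldest releases, which require it at install time:

    # Assumed usage: wipe the source-installed airflow and install a released version
    install_released_airflow_version "1.10.9"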
@@ -70,74 +70,93 @@ function check_integration {

echo "-----------------------------------------------------------------------------------------------"
}

export EXIT_CODE=0
function check_db_connection {
MAX_CHECK=${1:=3}

if [[ -n ${BACKEND:=} ]]; then
echo "Checking backend: ${BACKEND}"

set +e
if [[ ${BACKEND} == "mysql" ]]; then
# Wait until mysql is ready!
MYSQL_CONTAINER=$(docker ps -qf "name=mysql")
if [[ -z ${MYSQL_CONTAINER} ]]; then
echo
echo "ERROR! MYSQL container is not started. Exiting!"
echo
exit 1
fi
MAX_CHECK=60
echo "Checking if MySQL is ready for connections (double restarts in the logs)"
while true
do
CONNECTION_READY_MESSAGES=$(docker logs "${MYSQL_CONTAINER}" 2>&1 | \
grep -c "mysqld: ready for connections" )
# MySQL when starting from dockerfile starts a temporary server first because it
# starts with an empty database first and it will create the airflow database and then
# it will start a second server to serve this newly created database
# That's why we should wait until docker logs contain "ready for connections" twice
# more info: https://github.com/docker-library/mysql/issues/527
if [[ ${CONNECTION_READY_MESSAGES} -gt 1 ]];
then
echo
echo
echo "MySQL is ready for connections!"
echo
break
else
echo -n "."
fi
MAX_CHECK=$((MAX_CHECK-1))
if [[ ${MAX_CHECK} == 0 ]]; then
echo
echo "ERROR! Maximum number of retries while waiting for MySQL. Exiting"
echo
echo "Last check: ${CONNECTION_READY_MESSAGES} connection ready messages (expected >=2)"
echo
echo "==============================================================================================="
echo
exit 1
else
sleep 1
fi
done
if [[ ${BACKEND} == "postgres" ]]; then
HOSTNAME=postgres
PORT=5432
elif [[ ${BACKEND} == "mysql" ]]; then
HOSTNAME=mysql
PORT=3306
else
return
fi

MAX_CHECK=3
echo "-----------------------------------------------------------------------------------------------"
echo " Checking DB ${BACKEND}"
echo "-----------------------------------------------------------------------------------------------"
while true
do
LAST_CHECK_RESULT=$(AIRFLOW__LOGGING__LOGGING_LEVEL=error airflow db check 2>&1)
set +e
LAST_CHECK_RESULT=$(nc -zvv ${HOSTNAME} ${PORT} 2>&1)
RES=$?
set -e
if [[ ${RES} == 0 ]]; then
echo
echo " Backend ${BACKEND} OK!"
echo
break
else
echo -n "."
MAX_CHECK=$((MAX_CHECK-1))
fi
if [[ ${MAX_CHECK} == 0 ]]; then
echo
echo "ERROR! Maximum number of retries while checking ${BACKEND} db. Exiting"
echo
break
else
sleep 1
fi
done
if [[ ${RES} != 0 ]]; then
echo " ERROR: ${BACKEND} db could not be reached!"
echo
echo "${LAST_CHECK_RESULT}"
echo
export EXIT_CODE=${RES}
fi
echo "-----------------------------------------------------------------------------------------------"
}
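The rewritten check no longer talks to the database client at all; it only probes the TCP port with netcat (`-z` scans without sending data, `-v` is verbose). An assumed standalone equivalent, relying on the compose network resolving the service host names:

    # Assumed standalone equivalent of the probe used above
    nc -zvv postgres 5432 && echo "postgres is reachable"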
function check_mysql_logs {
MAX_CHECK=${1:=60}
# Wait until mysql is ready!
MYSQL_CONTAINER=$(docker ps -qf "name=mysql")
if [[ -z ${MYSQL_CONTAINER} ]]; then
echo
echo "ERROR! MYSQL container is not started. Exiting!"
echo
exit 1
fi
echo
echo "Checking if MySQL is ready for connections (double restarts in the logs)"
echo
while true
do
CONNECTION_READY_MESSAGES=$(docker logs "${MYSQL_CONTAINER}" 2>&1 | \
grep -c "mysqld: ready for connections" )
# MySQL when starting from dockerfile starts a temporary server first because it
# starts with an empty database first and it will create the airflow database and then
# it will start a second server to serve this newly created database
# That's why we should wait until docker logs contain "ready for connections" twice
# more info: https://github.com/docker-library/mysql/issues/527
if [[ ${CONNECTION_READY_MESSAGES} -gt 1 ]];
then
echo
echo
echo "MySQL is ready for connections!"
echo
break
else
echo -n "."
fi
echo -n "."
MAX_CHECK=$((MAX_CHECK-1))
if [[ ${MAX_CHECK} == 0 ]]; then
echo
echo "==============================================================================================="
echo " ERROR! Failure while checking backend database!"
echo "ERROR! Maximum number of retries while waiting for MySQL. Exiting"
echo
echo "${LAST_CHECK_RESULT}"
echo "Last check: ${CONNECTION_READY_MESSAGES} connection ready messages (expected >=2)"
echo
echo "==============================================================================================="
echo

@@ -146,8 +165,27 @@ if [[ -n ${BACKEND:=} ]]; then

sleep 1
fi
done
}

if [[ -n ${BACKEND:=} ]]; then
echo "==============================================================================================="
echo " Checking backend: ${BACKEND}"
echo "==============================================================================================="

set +e
if [[ ${BACKEND} == "mysql" ]]; then
check_mysql_logs 60
fi

check_db_connection 5

set -e
export EXIT_CODE=${RES}
if [[ ${EXIT_CODE} == 0 ]]; then
echo "==============================================================================================="
echo " Backend database is sane"
echo "==============================================================================================="
echo
fi
else
echo "==============================================================================================="
echo " Skip checking backend - BACKEND not set"
@@ -51,26 +51,42 @@ echo

ARGS=( "$@" )

RUN_TESTS=${RUN_TESTS:="true"}
INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION:=""}"

if [[ ! -d "${AIRFLOW_SOURCES}/airflow/www/node_modules" ]]; then
echo
echo "Installing node modules as they are not yet installed (Sources mounted from Host)"
echo
pushd "${AIRFLOW_SOURCES}/airflow/www/" &>/dev/null || exit 1
yarn install --frozen-lockfile
echo
popd &>/dev/null || exit 1
fi
if [[ ! -d "${AIRFLOW_SOURCES}/airflow/www/static/dist" ]]; then
pushd "${AIRFLOW_SOURCES}/airflow/www/" &>/dev/null || exit 1
echo
echo "Building production version of javascript files (Sources mounted from Host)"
echo
echo
yarn run prod
echo
echo
popd &>/dev/null || exit 1
if [[ ${INSTALL_AIRFLOW_VERSION} == "current" ]]; then
if [[ ! -d "${AIRFLOW_SOURCES}/airflow/www/node_modules" ]]; then
echo
echo "Installing node modules as they are not yet installed (Sources mounted from Host)"
echo
pushd "${AIRFLOW_SOURCES}/airflow/www/" &>/dev/null || exit 1
yarn install --frozen-lockfile
echo
popd &>/dev/null || exit 1
fi
if [[ ! -d "${AIRFLOW_SOURCES}/airflow/www/static/dist" ]]; then
pushd "${AIRFLOW_SOURCES}/airflow/www/" &>/dev/null || exit 1
echo
echo "Building production version of javascript files (Sources mounted from Host)"
echo
echo
yarn run prod
echo
echo
popd &>/dev/null || exit 1
fi
# Cleanup the logs, tmp when entering the environment
sudo rm -rf "${AIRFLOW_SOURCES}"/logs/*
sudo rm -rf "${AIRFLOW_SOURCES}"/tmp/*
mkdir -p "${AIRFLOW_SOURCES}"/logs/
mkdir -p "${AIRFLOW_SOURCES}"/tmp/
export PYTHONPATH=${AIRFLOW_SOURCES}
else
if [[ ${AIRFLOW_VERSION} == *1.10* || ${INSTALL_AIRFLOW_VERSION} == *1.10* ]]; then
export RUN_AIRFLOW_1_10="true"
else
export RUN_AIRFLOW_1_10="false"
fi
install_released_airflow_version "${INSTALL_AIRFLOW_VERSION}"
fi

export HADOOP_DISTRO="${HADOOP_DISTRO:="cdh"}"

@@ -92,16 +108,9 @@ if [[ ! -h /home/travis/build/apache/airflow ]]; then

sudo ln -s "${AIRFLOW_SOURCES}" /home/travis/build/apache/airflow
fi

# Cleanup the logs, tmp when entering the environment
sudo rm -rf "${AIRFLOW_SOURCES}"/logs/*
sudo rm -rf "${AIRFLOW_SOURCES}"/tmp/*
mkdir -p "${AIRFLOW_SOURCES}"/logs/
mkdir -p "${AIRFLOW_SOURCES}"/tmp/

mkdir -pv "${AIRFLOW_HOME}/logs/"
cp -f "${MY_DIR}/airflow_ci.cfg" "${AIRFLOW_HOME}/unittests.cfg"

export PYTHONPATH=${AIRFLOW_SOURCES}

"${MY_DIR}/check_environment.sh"

@@ -164,6 +173,26 @@ if [[ "${ENABLE_KIND_CLUSTER}" == "true" ]]; then

fi
fi

export FILES_DIR="/files"
export AIRFLOW_BREEZE_CONFIG_DIR="${FILES_DIR}/airflow-breeze-config"
VARIABLES_ENV_FILE="variables.env"

if [[ -d "${AIRFLOW_BREEZE_CONFIG_DIR}" && \
-f "${AIRFLOW_BREEZE_CONFIG_DIR}/${VARIABLES_ENV_FILE}" ]]; then
pushd "${AIRFLOW_BREEZE_CONFIG_DIR}" >/dev/null 2>&1 || exit 1
echo
echo "Sourcing environment variables from ${VARIABLES_ENV_FILE} in ${AIRFLOW_BREEZE_CONFIG_DIR}"
echo
# shellcheck disable=1090
source "${VARIABLES_ENV_FILE}"
popd >/dev/null 2>&1 || exit 1
else
echo
echo "You can add the ${AIRFLOW_BREEZE_CONFIG_DIR} directory and place ${VARIABLES_ENV_FILE}"
echo "in it to make breeze source the variables automatically for you"
echo
fi
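A hypothetical files/airflow-breeze-config/variables.env, sourced by the block above when present. The variable names are illustrative - typically these are credentials and project settings that the system tests need:

    # Assumed example content of variables.env (names and values illustrative)
    export GCP_PROJECT_ID="my-system-tests-project"
    export AWS_DEFAULT_REGION="us-east-1"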
set +u
# If we do not want to run tests, we simply drop into bash
if [[ "${RUN_TESTS}" == "false" ]]; then

@@ -212,5 +241,11 @@ if [[ -n ${RUNTIME} ]]; then

fi
fi

ARGS=("${CI_ARGS[@]}" "${TESTS_TO_RUN}")
"${MY_DIR}/run_ci_tests.sh" "${ARGS[@]}"

if [[ ${RUN_SYSTEM_TESTS:="false"} == "true" ]]; then
"${MY_DIR}/run_system_tests.sh" "${ARGS[@]}"
else
"${MY_DIR}/run_ci_tests.sh" "${ARGS[@]}"
fi

@@ -38,6 +38,7 @@ fi

if [[ ${CI} == "true" ]]; then
send_docker_logs_to_file_io
send_airflow_logs_to_file_io
fi

if [[ ${CI} == "true" && ${ENABLE_KIND_CLUSTER} == "true" ]]; then
@@ -0,0 +1,56 @@

#!/usr/bin/env bash
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

#
# Bash sanity settings (error on exit, complain for undefined vars, error when pipe fails)
set -euo pipefail

MY_DIR=$(cd "$(dirname "$0")" || exit 1; pwd)

# shellcheck source=scripts/ci/in_container/_in_container_utils.sh
. "${MY_DIR}/_in_container_utils.sh"

in_container_basic_sanity_check

in_container_script_start

# any argument received is overriding the default pytest execution arguments:
PYTEST_ARGS=( "$@" )

echo
echo "Starting the tests with those pytest arguments: ${PYTEST_ARGS[*]}"
echo
set +e

pytest "${PYTEST_ARGS[@]}"

RES=$?

set +x
if [[ "${RES}" == "0" && ${CI} == "true" ]]; then
echo "All tests successful"
fi

if [[ ${CI} == "true" ]]; then
send_docker_logs_to_file_io
send_airflow_logs_to_file_io
fi

in_container_script_end

exit "${RES}"
@@ -29,7 +29,7 @@ echo "

.. code-block:: text

" >"${TMP_FILE}"

export SEPARATOR_WIDTH=100
export SEPARATOR_WIDTH=80
export AIRFLOW_CI_SILENT="true"
./breeze --help | sed 's/^/  /' | sed 's/ *$//' >>"${TMP_FILE}"

setup.py (2 changes)
@@ -494,7 +494,7 @@ def do_setup():

'tabulate>=0.7.5, <0.9',
'tenacity==4.12.0',
'termcolor==1.1.0',
'text-unidecode==1.3',
'text-unidecode==1.2',
'thrift>=0.9.2',
'typing;python_version<"3.6"',
'typing-extensions>=3.7.4;python_version<"3.8"',
@@ -31,7 +31,7 @@ teardown() {

@test "Test missing value for a parameter" {
load bats_utils
export _BREEZE_ALLOWED_TEST_PARAMS=" a b c "
export _BREEZE_ALLOWED_TEST_PARAMS="a b c"
run check_and_save_allowed_param "TEST_PARAM" "Test Param" "--message"
diff <(echo "${output}") - <<EOF

@@ -47,7 +47,7 @@ EOF

initialize_breeze_environment

export _BREEZE_ALLOWED_TEST_PARAMS=" a b c "
export _BREEZE_ALLOWED_TEST_PARAMS="a b c"
export TEST_PARAM=x
echo "a" > "${AIRFLOW_SOURCES}/.build/.TEST_PARAM"
run check_and_save_allowed_param "TEST_PARAM" "Test Param" "--message"

@@ -67,7 +67,7 @@ EOF

initialize_breeze_environment

export _BREEZE_ALLOWED_TEST_PARAMS=" a b c "
export _BREEZE_ALLOWED_TEST_PARAMS="a b c"
export TEST_PARAM=x
echo "x" > "${AIRFLOW_SOURCES}/.build/.TEST_PARAM"
run check_and_save_allowed_param "TEST_PARAM" "Test Param" "--message"

@@ -89,7 +89,7 @@ EOF

initialize_breeze_environment

export _BREEZE_ALLOWED_TEST_PARAMS=" a b c "
export _BREEZE_ALLOWED_TEST_PARAMS="a b c"
export TEST_PARAM=a
run check_and_save_allowed_param "TEST_PARAM" "Test Param" "--message"
diff <(echo "${output}") <(echo "")

@@ -98,6 +98,27 @@ EOF

[ "${status}" == "0" ]
}

@test "Test correct value for a parameter from multi-line values" {
load bats_utils

initialize_breeze_environment

_BREEZE_ALLOWED_TEST_PARAMS=$(cat <<-EOF
a
b
c
EOF
)
export _BREEZE_ALLOWED_TEST_PARAMS
export TEST_PARAM=a
run check_and_save_allowed_param "TEST_PARAM" "Test Param" "--message"
diff <(echo "${output}") <(echo "")
[ -f "${AIRFLOW_SOURCES}/.build/.TEST_PARAM" ]
diff <(echo "a") <(cat "${AIRFLOW_SOURCES}/.build/.TEST_PARAM")
[ "${status}" == "0" ]
}


@test "Test read_parameter from missing file" {
load bats_utils
@@ -28,6 +28,7 @@ tests_directory = os.path.dirname(os.path.realpath(__file__))

os.environ["AIRFLOW__CORE__DAGS_FOLDER"] = os.path.join(tests_directory, "dags")
os.environ["AIRFLOW__CORE__UNIT_TEST_MODE"] = "True"
os.environ["AWS_DEFAULT_REGION"] = (os.environ.get("AWS_DEFAULT_REGION") or "us-east-1")
os.environ["CREDENTIALS_DIR"] = (os.environ.get('CREDENTIALS_DIR') or "/files/airflow-breeze-config/keys")


@pytest.fixture()

@@ -87,6 +88,26 @@ def pytest_addoption(parser):

        metavar="RUNTIME",
        help="only run tests matching the runtime: [kubernetes].",
    )
    group.addoption(
        "--systems",
        action="store",
        metavar="SYSTEMS",
        help="only run tests matching the systems specified [google.cloud, google.marketing_platform]",
    )
    group.addoption(
        "--include-long-running",
        action="store_true",
        help="Includes long running tests (marked with the long_running marker)",
    )
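With these options registered, system tests are opted into from the command line. Hypothetical invocations (the test paths are illustrative):

    # Assumed example invocations (test paths illustrative)
    pytest --systems=google.cloud tests/providers/google/cloud/
    pytest --systems=google.cloud --include-long-running tests/providers/google/cloud/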
|
||||
def initial_db_init():
|
||||
if os.environ.get("RUN_AIRFLOW_1_10") == "true":
|
||||
print("Attempting to reset the db using airflow command")
|
||||
os.system("airflow resetdb -y")
|
||||
else:
|
||||
from airflow.utils import db
|
||||
db.resetdb()
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True, scope="session")
|
||||
|
@ -101,6 +122,10 @@ def breeze_test_helper(request):
|
|||
print("Skipping db initialization. Tests do not require database")
|
||||
return
|
||||
|
||||
from airflow import __version__
|
||||
if __version__.startswith("1.10"):
|
||||
os.environ['RUN_AIRFLOW_1_10'] = "true"
|
||||
|
||||
print(" AIRFLOW ".center(60, "="))
|
||||
|
||||
# Setup test environment for breeze
|
||||
|
@ -109,27 +134,17 @@ def breeze_test_helper(request):
|
|||
|
||||
print(f"Home of the user: {home}\nAirflow home {airflow_home}")
|
||||
|
||||
from airflow.utils import db
|
||||
|
||||
# Initialize Airflow db if required
|
||||
lock_file = os.path.join(airflow_home, ".airflow_db_initialised")
|
||||
if request.config.option.db_init:
|
||||
print("Initializing the DB - forced with --with-db-init switch.")
|
||||
try:
|
||||
db.initdb()
|
||||
except: # pylint: disable=bare-except # noqa
|
||||
print("Skipping db initialization because database already exists.")
|
||||
db.resetdb()
|
||||
initial_db_init()
|
||||
elif not os.path.exists(lock_file):
|
||||
print(
|
||||
"Initializing the DB - first time after entering the container.\n"
|
||||
"You can force re-initialization the database by adding --with-db-init switch to run-tests."
|
||||
)
|
||||
try:
|
||||
db.initdb()
|
||||
except: # pylint: disable=bare-except # noqa
|
||||
print("Skipping db initialization because database already exists.")
|
||||
db.resetdb()
|
||||
initial_db_init()
|
||||
# Create pid file
|
||||
with open(lock_file, "w+"):
|
||||
pass
|
||||
|
@ -159,6 +174,15 @@ def pytest_configure(config):
|
|||
config.addinivalue_line(
|
||||
"markers", "runtime(name): mark test to run with named runtime"
|
||||
)
|
||||
config.addinivalue_line(
|
||||
"markers", "system(name): mark test to run with named system"
|
||||
)
|
||||
config.addinivalue_line(
|
||||
"markers", "long_running(name): mark test that run for a long time (many minutes)"
|
||||
)
|
||||
config.addinivalue_line(
|
||||
"markers", "credential_file(name): mark tests that require credential file in CREDENTIALS_DIR"
|
||||
)
|
||||
|
||||
|
||||
def skip_if_not_marked_with_integration(selected_integrations, item):
|
||||
|
@ -168,8 +192,8 @@ def skip_if_not_marked_with_integration(selected_integrations, item):
|
|||
return
|
||||
pytest.skip("The test is skipped because it does not have the right integration marker. "
|
||||
"Only tests marked with pytest.mark.integration(INTEGRATION) are run with INTEGRATION"
|
||||
" being one of {}. {item}".
|
||||
format(selected_integrations, item=item))
|
||||
" being one of {integration}. {item}".
|
||||
format(integration=selected_integrations, item=item))
|
||||
|
||||
|
||||
def skip_if_not_marked_with_backend(selected_backend, item):
|
||||
|
@ -178,9 +202,9 @@ def skip_if_not_marked_with_backend(selected_backend, item):
|
|||
if selected_backend in backend_names:
|
||||
return
|
||||
pytest.skip("The test is skipped because it does not have the right backend marker "
|
||||
"Only tests marked with pytest.mark.backend('{}') are run"
|
||||
"Only tests marked with pytest.mark.backend('{backend}') are run"
|
||||
": {item}".
|
||||
format(selected_backend, item=item))
|
||||
format(backend=selected_backend, item=item))
|
||||
|
||||
|
||||
def skip_if_not_marked_with_runtime(selected_runtime, item):
|
||||
|
@ -189,8 +213,35 @@ def skip_if_not_marked_with_runtime(selected_runtime, item):
|
|||
if runtime_name == selected_runtime:
|
||||
return
|
||||
pytest.skip("The test is skipped because it has not been selected via --runtime switch. "
|
||||
"Only tests marked with pytest.mark.runtime('{}') are run: {item}".
|
||||
format(selected_runtime, item=item))
|
||||
"Only tests marked with pytest.mark.runtime('{runtime}') are run: {item}".
|
||||
format(runtime=selected_runtime, item=item))
|
||||
|
||||
|
||||
def skip_if_not_marked_with_system(selected_systems, item):
|
||||
for marker in item.iter_markers(name="system"):
|
||||
systems_name = marker.args[0]
|
||||
if systems_name in selected_systems or "all" in selected_systems:
|
||||
return
|
||||
pytest.skip("The test is skipped because it does not have the right system marker. "
|
||||
"Only tests marked with pytest.mark.system(SYSTEM) are run with SYSTEM"
|
||||
" being one of {systems}. {item}".
|
||||
format(systems=selected_systems, item=item))
|
||||
|
||||
|
||||
def skip_system_test(item):
|
||||
for marker in item.iter_markers(name="system"):
|
||||
pytest.skip("The test is skipped because it has system marker. "
|
||||
"System tests are only run when --systems flag "
|
||||
"with the right system ({system}) is passed to pytest. {item}".
|
||||
format(system=marker.args[0], item=item))
|
||||
|
||||
|
||||
def skip_long_running_test(item):
|
||||
for _ in item.iter_markers(name="long_running"):
|
||||
pytest.skip("The test is skipped because it has long_running marker. "
|
||||
"And system tests are only run when --long-lasting flag "
|
||||
"is passed to pytest. {item}".
|
||||
format(item=item))
|
||||
|
||||
|
||||
def skip_if_integration_disabled(marker, item):
|
||||
|
@ -199,10 +250,10 @@ def skip_if_integration_disabled(marker, item):
|
|||
environment_variable_value = os.environ.get(environment_variable_name)
|
||||
if not environment_variable_value or environment_variable_value != "true":
|
||||
pytest.skip("The test requires {integration_name} integration started and "
|
||||
"{} environment variable to be set to true (it is '{}')."
|
||||
"{name} environment variable to be set to true (it is '{value}')."
|
||||
" It can be set by specifying '--integration {integration_name}' at breeze startup"
|
||||
": {item}".
|
||||
format(environment_variable_name, environment_variable_value,
|
||||
format(name=environment_variable_name, value=environment_variable_value,
|
||||
integration_name=integration_name, item=item))
|
||||
|
||||
|
||||
|
@ -212,10 +263,10 @@ def skip_if_runtime_disabled(marker, item):
|
|||
environment_variable_value = os.environ.get(environment_variable_name)
|
||||
if not environment_variable_value or environment_variable_value != runtime_name:
|
||||
pytest.skip("The test requires {runtime_name} integration started and "
|
||||
"{} environment variable to be set to true (it is '{}')."
|
||||
"{name} environment variable to be set to true (it is '{value}')."
|
||||
" It can be set by specifying '--environment {runtime_name}' at breeze startup"
|
||||
": {item}".
|
||||
format(environment_variable_name, environment_variable_value,
|
||||
format(name=environment_variable_name, value=environment_variable_value,
|
||||
runtime_name=runtime_name, item=item))
|
||||
|
||||
|
||||
|
@ -225,20 +276,36 @@ def skip_if_wrong_backend(marker, item):
|
|||
environment_variable_value = os.environ.get(environment_variable_name)
|
||||
if not environment_variable_value or environment_variable_value not in valid_backend_names:
|
||||
pytest.skip("The test requires one of {valid_backend_names} backend started and "
|
||||
"{} environment variable to be set to true (it is '{}')."
|
||||
"{name} environment variable to be set to 'true' (it is '{value}')."
|
||||
" It can be set by specifying backend at breeze startup"
|
||||
": {item}".
|
||||
format(environment_variable_name, environment_variable_value,
|
||||
format(name=environment_variable_name, value=environment_variable_value,
|
||||
valid_backend_names=valid_backend_names, item=item))
|
||||
|
||||
|
||||
def skip_if_credential_file_missing(item):
|
||||
for marker in item.iter_markers(name="credential_file"):
|
||||
credential_file = marker.args[0]
|
||||
credential_path = os.path.join(os.environ.get('CREDENTIALS_DIR'), credential_file)
|
||||
if not os.path.exists(credential_path):
|
||||
pytest.skip("The test requires credential file {path}: {item}".
|
||||
format(path=credential_path, item=item))
|
||||
|
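Putting it together: a test marked with credential_file only runs when the matching key exists under CREDENTIALS_DIR, which defaults to the breeze-config keys folder mounted from the host. An assumed shell check mirroring the marker logic (the key file name is illustrative):

    # Assumed check mirroring the marker logic above (key file name illustrative)
    export CREDENTIALS_DIR="/files/airflow-breeze-config/keys"
    [[ -f "${CREDENTIALS_DIR}/gcp_bigquery.json" ]] || echo "credential_file-marked tests will be skipped"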
def pytest_runtest_setup(item):
    selected_integrations = item.config.getoption("--integrations")
    selected_integrations_list = selected_integrations.split(",") if selected_integrations else []
    selected_systems = item.config.getoption("--systems")
    selected_systems_list = selected_systems.split(",") if selected_systems else []
    include_long_running = item.config.getoption("--include-long-running")
    for marker in item.iter_markers(name="integration"):
        skip_if_integration_disabled(marker, item)
    if selected_integrations_list:
        skip_if_not_marked_with_integration(selected_integrations, item)
        skip_if_not_marked_with_integration(selected_integrations_list, item)
    if selected_systems_list:
        skip_if_not_marked_with_system(selected_systems_list, item)
    else:
        skip_system_test(item)
    for marker in item.iter_markers(name="backend"):
        skip_if_wrong_backend(marker, item)
    selected_backend = item.config.getoption("--backend")

@@ -249,3 +316,6 @@ def pytest_runtest_setup(item):

    selected_runtime = item.config.getoption("--runtime")
    if selected_runtime:
        skip_if_not_marked_with_runtime(selected_runtime, item)
    if not include_long_running:
        skip_long_running_test(item)
    skip_if_credential_file_missing(item)
@@ -19,12 +19,14 @@

import unittest

import pytest

from airflow.providers.google.cloud.hooks import bigquery as hook
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_BIGQUERY_KEY
from tests.test_utils.gcp_system_helpers import skip_gcp_system


@skip_gcp_system(GCP_BIGQUERY_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_BIGQUERY_KEY)
class BigQueryDataframeResultsSystemTest(unittest.TestCase):
    def setUp(self):
        self.instance = hook.BigQueryHook()
@@ -15,21 +15,27 @@

# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AUTOML_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_AUTOML_KEY, require_local_executor=True, long_lasting=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AUTOML_KEY)
@pytest.mark.long_running
class AutoMLDatasetOperationsSystemTest(SystemTest):
    @provide_gcp_context(GCP_AUTOML_KEY)
    def test_run_example_dag(self):
        self.run_dag('example_automl_dataset', CLOUD_DAG_FOLDER)


@skip_gcp_system(GCP_AUTOML_KEY, require_local_executor=True, long_lasting=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AUTOML_KEY)
@pytest.mark.long_running
class AutoMLModelOperationsSystemTest(SystemTest):
    @provide_gcp_context(GCP_AUTOML_KEY)
    def test_run_example_dag(self):
@@ -15,17 +15,20 @@

# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from airflow.providers.google.cloud.example_dags.example_bigquery_dts import (
    BUCKET_URI, GCP_DTS_BQ_DATASET, GCP_DTS_BQ_TABLE, GCP_PROJECT_ID,
)
from tests.providers.google.cloud.operators.test_bigquery_dts_system_helper import GcpBigqueryDtsTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_BIGQUERY_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_BIGQUERY_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_BIGQUERY_KEY)
class GcpBigqueryDtsSystemTest(SystemTest):
    helper = GcpBigqueryDtsTestHelper()
@@ -16,14 +16,17 @@

# specific language governing permissions and limitations
# under the License.
"""System tests for Google BigQuery operators"""
import pytest

from tests.providers.google.cloud.operators.test_bigquery_system_helper import GCPBigQueryTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_BIGQUERY_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_BIGQUERY_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_BIGQUERY_KEY)
class BigQueryExampleDagsSystemTest(SystemTest):
    """
    System tests for Google BigQuery operators
@@ -15,15 +15,17 @@

# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_bigtable_system_helper import GCPBigtableTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_BIGTABLE_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_BIGTABLE_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_BIGTABLE_KEY)
class BigTableExampleDagsSystemTest(SystemTest):
    helper = GCPBigtableTestHelper()
@@ -16,14 +16,17 @@

# specific language governing permissions and limitations
# under the License.
"""System tests for Google Cloud Build operators"""
import pytest

from tests.providers.google.cloud.operators.test_cloud_build_system_helper import GCPCloudBuildTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_CLOUD_BUILD_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_CLOUD_BUILD_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_CLOUD_BUILD_KEY)
class CloudBuildExampleDagsSystemTest(SystemTest):
    """
    System tests for Google Cloud Build operators
@@ -16,16 +16,19 @@

# specific language governing permissions and limitations
# under the License.
"""System tests for Google Cloud Memorystore operators"""
import pytest

from tests.providers.google.cloud.operators.test_cloud_memorystore_system_helper import (
    GCPCloudMemorystoreTestHelper,
)
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_MEMORYSTORE  # TODO: Update it
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_MEMORYSTORE, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_MEMORYSTORE)
class CloudBuildExampleDagsSystemTest(SystemTest):
    """
    System tests for Google Cloud Memorystore operators
@@ -20,11 +20,13 @@ import random

import string
import time

import pytest

from airflow import AirflowException
from airflow.providers.google.cloud.hooks.cloud_sql import CloudSqlProxyRunner
from tests.providers.google.cloud.operators.test_cloud_sql_system_helper import CloudSqlQueryTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_CLOUDSQL_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'project-id')

@@ -32,7 +34,8 @@ GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'project-id')

SQL_QUERY_TEST_HELPER = CloudSqlQueryTestHelper()


@skip_gcp_system(GCP_CLOUDSQL_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_CLOUDSQL_KEY)
class CloudSqlProxySystemTest(SystemTest):
    @provide_gcp_context(GCP_CLOUDSQL_KEY)
    def setUp(self):

@@ -95,7 +98,8 @@ class CloudSqlProxySystemTest(SystemTest):

        self.assertEqual(runner.get_proxy_version(), "1.13")


@skip_gcp_system(GCP_CLOUDSQL_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_CLOUDSQL_KEY)
class CloudSqlQueryExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_CLOUDSQL_KEY)
    def setUp(self):
@@ -17,10 +17,12 @@

# under the License.
import os

import pytest

from airflow import AirflowException
from tests.providers.google.cloud.operators.test_cloud_sql_system_helper import CloudSqlQueryTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_CLOUDSQL_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'project-id')

@@ -28,7 +30,9 @@ GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID', 'project-id')

SQL_QUERY_TEST_HELPER = CloudSqlQueryTestHelper()


@skip_gcp_system(GCP_CLOUDSQL_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_CLOUDSQL_KEY)
class CloudSqlExampleDagsIntegrationTest(SystemTest):
    @provide_gcp_context(GCP_CLOUDSQL_KEY)
    def tearDown(self):
@@ -15,16 +15,19 @@

# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.operators.test_cloud_storage_transfer_service_system_helper import (
    GCPTransferTestHelper,
)
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_TRANSFER_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_TRANSFER_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_TRANSFER_KEY)
class GcpTransferExampleDagsSystemTest(SystemTest):
    helper = GCPTransferTestHelper()
@@ -15,17 +15,20 @@

# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_compute_system_helper import GCPComputeTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_COMPUTE_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_COMPUTE_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_COMPUTE_KEY)
class GcpComputeExampleDagsSystemTest(SystemTest):
    helper = GCPComputeTestHelper()

    @provide_gcp_context(GCP_COMPUTE_KEY)
    def setUp(self):
        super().setUp()

@@ -42,7 +45,9 @@ class GcpComputeExampleDagsSystemTest(SystemTest):

        self.run_dag('example_gcp_compute', CLOUD_DAG_FOLDER)


@skip_gcp_system(GCP_COMPUTE_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_COMPUTE_KEY)
class GcpComputeIgmExampleDagsSystemTest(SystemTest):
    helper = GCPComputeTestHelper()
@@ -15,14 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_DATAFLOW_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_DATAFLOW_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_DATAFLOW_KEY)
class CloudDataflowExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_DATAFLOW_KEY)
    def test_run_example_dag_function(self):


@@ -17,9 +17,11 @@
# under the License.
import os

import pytest

from tests.providers.google.cloud.operators.test_dataproc_operator_system_helper import DataprocTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_DATAPROC_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

BUCKET = os.environ.get("GCP_DATAPROC_BUCKET", "dataproc-system-tests")

@@ -27,7 +29,9 @@ PYSPARK_MAIN = os.environ.get("PYSPARK_MAIN", "hello_world.py")
PYSPARK_URI = "gs://{}/{}".format(BUCKET, PYSPARK_MAIN)


@skip_gcp_system(GCP_DATAPROC_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_DATAPROC_KEY)
class DataprocExampleDagsTest(SystemTest):
    helper = DataprocTestHelper()
@@ -15,15 +15,17 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_datastore_system_helper import GcpDatastoreSystemTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_DATASTORE_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_DATASTORE_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_DATASTORE_KEY)
class GcpDatastoreSystemTest(SystemTest):
    helper = GcpDatastoreSystemTestHelper()


@@ -21,13 +21,16 @@
This module contains various unit tests for
example_gcp_dlp DAG
"""
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_DLP_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_DLP_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_DLP_KEY)
class GcpDLPExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_DLP_KEY)
    def test_run_example_dag_function(self):
@@ -15,13 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_FUNCTION_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_FUNCTION_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_FUNCTION_KEY)
class GcpFunctionExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_FUNCTION_KEY)
    def test_run_example_dag_function(self):


@@ -15,15 +15,17 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_gcs_system_helper import GcsSystemTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class GoogleCloudStorageExampleDagsTest(SystemTest):
    helper = GcsSystemTestHelper()
@@ -16,15 +16,17 @@
# specific language governing permissions and limitations
# under the License.
"""System tests for Google Cloud Storage to Google Cloud Storage transfer operators"""

import pytest

from tests.providers.google.cloud.operators.test_gcs_to_gcs_system_helper import GcsToGcsTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class GcsToGcsExampleDagsSystemTest(SystemTest):
    """
    System tests for Google Cloud Storage to Google Cloud Storage transfer operators


@@ -16,14 +16,16 @@
# specific language governing permissions and limitations
# under the License.
"""System tests for Google Cloud Storage to SFTP transfer operator"""
import pytest

from tests.providers.google.cloud.operators.test_gcs_to_sftp_system_helper import GcsToSFTPTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class GcsToSftpExampleDagsSystemTest(SystemTest):
    """
    System tests for Google Cloud Storage to SFTP transfer operator
@@ -15,14 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GKE_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GKE_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GKE_KEY)
class KubernetesEngineExampleDagTest(SystemTest):
    @provide_gcp_context(GCP_GKE_KEY)
    def test_run_example_dag(self):


@@ -15,13 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.operators.test_mlengine_system_helper import MlEngineSystemTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AI_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_AI_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AI_KEY)
class MlEngineExampleDagTest(SystemTest):
    helper = MlEngineSystemTestHelper()

    @provide_gcp_context(GCP_AI_KEY)
@@ -15,13 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AI_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_AI_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AI_KEY)
class CloudNaturalLanguageExampleDagsTest(SystemTest):
    @provide_gcp_context(GCP_AI_KEY)
    def test_run_example_dag(self):


@@ -15,13 +15,13 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest
from psycopg2 import ProgrammingError

from airflow.providers.postgres.hooks.postgres import PostgresHook
from tests.contrib.utils.logging_command_executor import LoggingCommandExecutor
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

GCS_BUCKET = "postgres_to_gcs_example"

@@ -80,7 +80,9 @@ class GcsHelper(LoggingCommandExecutor):
        self.execute_cmd(["gsutil", "-m", "rm", "-r", "gs://{}".format(GCS_BUCKET)])


@skip_gcp_system(GCP_GCS_KEY, require_local_executor=True)
@pytest.mark.backend("postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class PostgresToGCSSystemTest(SystemTest):
    helper = GcsHelper()
@@ -15,14 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_PUBSUB_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_PUBSUB_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_PUBSUB_KEY)
class PubSubSystemTest(SystemTest):
    @provide_gcp_context(GCP_PUBSUB_KEY)
    def test_run_example_dag(self):


@@ -16,14 +16,16 @@
# specific language governing permissions and limitations
# under the License.
"""System tests for SFTP to Google Cloud Storage transfer operator"""
import pytest

from tests.providers.google.cloud.operators.test_sftp_to_gcs_system_helper import SFTPtoGcsTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class SFTPToGcsExampleDagsSystemTest(SystemTest):
    """
    System tests for SFTP to Google Cloud Storage transfer operator
@@ -15,14 +15,17 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.operators.test_spanner_system_helper import GCPSpannerTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_SPANNER_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_SPANNER_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_SPANNER_KEY)
class CloudSpannerExampleDagsTest(SystemTest):
    helper = GCPSpannerTestHelper()


@@ -15,15 +15,17 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_speech_system_helper import GCPTextToSpeechTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_GCS_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class GCPTextToSpeechExampleDagSystemTest(SystemTest):
    helper = GCPTextToSpeechTestHelper()
@@ -15,13 +15,15 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_TASKS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_TASKS_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_TASKS_KEY)
class GcpTasksExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_TASKS_KEY)
    def test_run_example_dag_function(self):


@@ -15,14 +15,16 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AI_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_AI_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AI_KEY)
class CloudTranslateExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_AI_KEY)
    def test_run_example_dag_function(self):
@@ -15,17 +15,19 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

import pytest

from tests.providers.google.cloud.operators.test_video_intelligence_system_helper import (
    GCPVideoIntelligenceHelper,
)
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AI_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_AI_KEY, require_local_executor=True)
@pytest.mark.backend("mysql", "postgres")
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AI_KEY)
class CloudVideoIntelligenceExampleDagsTest(SystemTest):
    helper = GCPVideoIntelligenceHelper()


@@ -15,16 +15,18 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.operators.test_vision_system_helper import GCPVisionTestHelper
from tests.providers.google.cloud.utils.gcp_authenticator import GCP_AI_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

VISION_HELPER = GCPVisionTestHelper()


@skip_gcp_system(GCP_AI_KEY)
@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_AI_KEY)
class CloudVisionExampleDagsSystemTest(SystemTest):
    @provide_gcp_context(GCP_AI_KEY)
    def setUp(self):
@@ -23,8 +23,9 @@ from typing import Optional  # noqa: W0611
from airflow import AirflowException, settings
from airflow.models import Connection
from tests.contrib.utils.logging_command_executor import LoggingCommandExecutor

# Please keep these variables in alphabetical order.
from tests.test_utils import AIRFLOW_MAIN_FOLDER

GCP_AI_KEY = 'gcp_ai.json'
GCP_AUTOML_KEY = 'gcp_automl.json'
GCP_BIGQUERY_KEY = 'gcp_bigquery.json'

@@ -53,10 +54,6 @@ KEYFILE_DICT_EXTRA = 'extra__google_cloud_platform__keyfile_dict'
SCOPE_EXTRA = 'extra__google_cloud_platform__scope'
PROJECT_EXTRA = 'extra__google_cloud_platform__project'

AIRFLOW_MAIN_FOLDER = os.path.realpath(
    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir, os.pardir)
)


class GcpAuthenticator(LoggingCommandExecutor):
    """
@@ -15,12 +15,13 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GOOGLE_CAMPAIGN_MANAGER_KEY
from tests.providers.google.marketing_platform.operators.test_campaign_manager_system_helper import (
    GoogleCampaignManagerTestHelper,
)
from tests.test_utils.gcp_system_helpers import MARKETING_DAG_FOLDER, provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import MARKETING_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

# Required scopes

@@ -31,7 +32,8 @@ SCOPES = [
]


@skip_gcp_system(GOOGLE_CAMPAIGN_MANAGER_KEY)
@pytest.mark.system("google.marketing_platform")
@pytest.mark.credential_file(GOOGLE_CAMPAIGN_MANAGER_KEY)
class CampaignManagerSystemTest(SystemTest):
    helper = GoogleCampaignManagerTestHelper()


@@ -14,19 +14,21 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_DISPLAY_VIDEO_KEY
from tests.providers.google.marketing_platform.operators.test_display_video_system_helper import (
    GcpDisplayVideoSystemTestHelper,
)
from tests.test_utils.gcp_system_helpers import provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest

# Requires the following scope:
SCOPES = ["https://www.googleapis.com/auth/doubleclickbidmanager"]


@skip_gcp_system(GCP_DISPLAY_VIDEO_KEY)
@pytest.mark.system("google.marketing_platform")
@pytest.mark.credential_file(GCP_DISPLAY_VIDEO_KEY)
class DisplayVideoSystemTest(SystemTest):
    helper = GcpDisplayVideoSystemTestHelper()
@@ -15,16 +15,18 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_SEARCHADS_KEY
from tests.providers.google.marketing_platform.operators.test_search_ads_system_helper import (
    GoogleSearchAdsSystemTestHelper,
)
from tests.test_utils.gcp_system_helpers import provide_gcp_context, skip_gcp_system
from tests.test_utils.gcp_system_helpers import provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@skip_gcp_system(GCP_SEARCHADS_KEY)
@pytest.mark.system("google.marketing_platform")
@pytest.mark.credential_file(GCP_SEARCHADS_KEY)
class SearchAdsSystemTest(SystemTest):
    helper = GoogleSearchAdsSystemTestHelper()
@@ -20,9 +20,8 @@ import unittest

from parameterized import parameterized

AIRFLOW_MAIN_FOLDER = os.path.realpath(
    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir)
)
from tests.test_utils import AIRFLOW_MAIN_FOLDER

CONFIG_TEMPLATES_FOLDER = os.path.join(AIRFLOW_MAIN_FOLDER, "airflow", "config_templates")

DEFAULT_AIRFLOW_SECTIONS = [
@@ -15,3 +15,8 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
import os

AIRFLOW_MAIN_FOLDER = os.path.realpath(
    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir)
)
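This new tests/test_utils/__init__.py definition replaces the per-module AIRFLOW_MAIN_FOLDER computations removed elsewhere in this commit: two os.pardir hops up from the module land on the repository root. A minimal check sketch (the assertion target is an assumption about the checkout layout):

# Sketch: confirm AIRFLOW_MAIN_FOLDER points at the repository root, i.e. the
# directory that contains both the "airflow" and "tests" packages.
import os

from tests.test_utils import AIRFLOW_MAIN_FOLDER

assert os.path.isdir(os.path.join(AIRFLOW_MAIN_FOLDER, "airflow"))
assert os.path.isdir(os.path.join(AIRFLOW_MAIN_FOLDER, "tests"))
print(AIRFLOW_MAIN_FOLDER)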
@@ -16,14 +16,11 @@
# specific language governing permissions and limitations
# under the License.
import os
import unittest
from typing import Optional, Sequence

from airflow.providers.google.cloud.utils.credentials_provider import provide_gcp_conn_and_credentials
from tests.test_utils import AIRFLOW_MAIN_FOLDER

AIRFLOW_MAIN_FOLDER = os.path.realpath(
    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir)
)
CLOUD_DAG_FOLDER = os.path.join(
    AIRFLOW_MAIN_FOLDER, "airflow", "providers", "google", "cloud", "example_dags"
)

@@ -35,43 +32,6 @@ POSTGRES_LOCAL_EXECUTOR = os.path.join(
)


SKIP_TEST_WARNING = """
The test is only run in an environment with GCP system tests enabled.
You can enable it in one of two ways:

* Set the GCP_CONFIG_DIR environment variable to point to the GCP configuration
  directory which keeps the {} key.
* Run this test within an automated environment workspace where the
  config directory is checked out next to the airflow one.
"""

SKIP_LONG_TEST_WARNING = """
The test is only run in an environment with GCP system tests enabled
and the GCP_ENABLE_LONG_TESTS environment variable set to True.
You can enable it in one of two ways:

* Set the GCP_CONFIG_DIR environment variable to point to the GCP configuration
  directory which keeps the variables.env file with environment variables to set
  and the keys directory which keeps service account keys in .json format, and
  set GCP_ENABLE_LONG_TESTS to True.
* Run this test within an automated environment workspace where the
  config directory is checked out next to the airflow one.
"""


LOCAL_EXECUTOR_WARNING = """
The test requires the local executor. Please set the AIRFLOW_CONFIG variable to '{}'
and make sure you have a Postgres server running locally with the
airflow/airflow.db database created.

You can create the database via these commands:
'createuser root'
'createdb airflow/airflow.db'
"""


def resolve_full_gcp_key_path(key: str) -> str:
    """
    Returns the full path to the provided GCP key.

@@ -80,42 +40,11 @@ def resolve_full_gcp_key_path(key: str) -> str:
    :type key: str
    :returns: Full path to the key
    """
    path = os.environ.get("GCP_CONFIG_DIR", "/config")
    key = os.path.join(path, "keys", key)
    path = os.environ.get("CREDENTIALS_DIR", "/files/airflow-breeze-config/keys")
    key = os.path.join(path, key)
    return key
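The hunk above moves key resolution from GCP_CONFIG_DIR/keys to CREDENTIALS_DIR, defaulting to /files/airflow-breeze-config/keys, the folder Breeze mounts from the host. A minimal usage sketch; the resolved filename assumes GCP_GCS_KEY == 'gcp_gcs.json', which follows the naming pattern shown above but is not spelled out in this hunk:

# Usage sketch: key name from gcp_authenticator, directory from CREDENTIALS_DIR
# (forwarded from the host), else the Breeze default mount.
import os

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import resolve_full_gcp_key_path

os.environ.setdefault("CREDENTIALS_DIR", "/files/airflow-breeze-config/keys")
print(resolve_full_gcp_key_path(GCP_GCS_KEY))
# expected: /files/airflow-breeze-config/keys/gcp_gcs.json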

def skip_gcp_system(
    service_key: str, long_lasting: bool = False, require_local_executor: bool = False
):
    """
    Decorator for skipping GCP system tests.

    :param service_key: name of the service key that will be used to provide credentials
    :type service_key: str
    :param long_lasting: set True if a test takes a relatively long time
    :type long_lasting: bool
    :param require_local_executor: set True if the test config must use the local executor
    :type require_local_executor: bool
    """
    try:
        full_key_path = resolve_full_gcp_key_path(service_key)
        with open(full_key_path):
            pass
    except FileNotFoundError:
        return unittest.skip(SKIP_TEST_WARNING.format(service_key))

    if long_lasting and os.environ.get("GCP_ENABLE_LONG_TESTS") != "True":
        return unittest.skip(SKIP_LONG_TEST_WARNING)

    if require_local_executor and POSTGRES_LOCAL_EXECUTOR != os.environ.get("AIRFLOW_CONFIG"):
        return unittest.skip(LOCAL_EXECUTOR_WARNING.format(POSTGRES_LOCAL_EXECUTOR))

    return lambda cls: cls


def provide_gcp_context(
    key_file_path: Optional[str] = None,
    scopes: Optional[Sequence] = None,
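With skip_gcp_system removed, provide_gcp_context remains the one helper every test in this commit keeps: it is applied as a decorator on test methods and, per the import above, builds on provide_gcp_conn_and_credentials. A hedged usage sketch mirroring the test classes in this diff; the class name and DAG id below are placeholders, not part of the commit:

# Illustrative test class, not part of this commit.
import pytest

from tests.providers.google.cloud.utils.gcp_authenticator import GCP_GCS_KEY
from tests.test_utils.gcp_system_helpers import CLOUD_DAG_FOLDER, provide_gcp_context
from tests.test_utils.system_tests_class import SystemTest


@pytest.mark.system("google.cloud")
@pytest.mark.credential_file(GCP_GCS_KEY)
class MyGcsSystemTest(SystemTest):
    @provide_gcp_context(GCP_GCS_KEY)
    def test_run_example_dag(self):
        # 'example_gcs' is a placeholder DAG id.
        self.run_dag('example_gcs', CLOUD_DAG_FOLDER)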
@@ -16,25 +16,21 @@
# specific language governing permissions and limitations
# under the License.
import os
import shutil
from contextlib import ContextDecorator
from datetime import datetime
from shutil import move
from tempfile import mkdtemp
from unittest import SkipTest, TestCase
from unittest import TestCase

from airflow import AirflowException, models
from airflow.configuration import AIRFLOW_HOME, AirflowConfigParser, get_airflow_config
from airflow.utils import db
from airflow.utils.file import mkdirs
from airflow.utils.log.logging_mixin import LoggingMixin
from tests.test_utils import AIRFLOW_MAIN_FOLDER

AIRFLOW_MAIN_FOLDER = os.path.realpath(
    os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir, os.pardir)
)
DEFAULT_DAG_FOLDER = os.path.join(AIRFLOW_MAIN_FOLDER, "airflow", "example_dags")

SKIP_SYSTEM_TEST_WARNING = """Skipping system test.
To allow system tests set ENABLE_SYSTEM_TESTS=true.
"""


def resolve_dags_folder() -> str:
    """

@@ -50,7 +46,24 @@ def resolve_dags_folder() -> str:
    return dags


class empty_dags_directory(  # pylint: disable=invalid-name
def resolve_logs_folder() -> str:
    """
    Returns the logs folder specified in the current Airflow config.
    """
    config_file = get_airflow_config(AIRFLOW_HOME)
    conf = AirflowConfigParser()
    conf.read(config_file)
    try:
        logs = conf.get("logging", "base_log_folder")
    except AirflowException:
        try:
            logs = conf.get("core", "base_log_folder")
        except AirflowException:
            logs = os.path.join(AIRFLOW_HOME, 'logs')
    return logs


class EmptyDagsDirectory(  # pylint: disable=invalid-name
    ContextDecorator, LoggingMixin
):
    """

@@ -93,23 +106,93 @@ class empty_dags_directory(  # pylint: disable=invalid-name


class SystemTest(TestCase, LoggingMixin):
    def run(self, result=None):
        if os.environ.get('ENABLE_SYSTEM_TESTS') != 'true':
            raise SkipTest(SKIP_SYSTEM_TEST_WARNING)
        return super().run(result)

    def setUp(self) -> None:
        """
        We want to avoid random errors while the database gets reset - those
        are apparently triggered by the parser trying to parse DAGs while
        the tables are dropped. We move the DAGs temporarily out of the dags folder
        and move them back after reset
        and move them back after reset.

        We also remove all logs from the logs directory to have a clear log state
        and see only logs from this test.
        """
        dag_folder = resolve_dags_folder()
        with empty_dags_directory(dag_folder):
            db.resetdb()
        with EmptyDagsDirectory(dag_folder):
            self.initial_db_init()
        print()
        print("Removing all log files except previous_runs")
        print()
        logs_folder = resolve_logs_folder()
        files = os.listdir(logs_folder)
        for file in files:
            file_path = os.path.join(logs_folder, file)
            if os.path.isfile(file_path):
                os.remove(file_path)
            elif os.path.isdir(file_path) and file != "previous_runs":
                shutil.rmtree(file_path, ignore_errors=True)
        super().setUp()

    def tearDown(self) -> None:
        """
        We save the logs to a separate directory so that we can see them later.
        """
        date_str = datetime.now().strftime("%Y-%m-%d_%H_%M_%S")
        logs_folder = resolve_logs_folder()
        print()
        print(f"Saving all log files to {logs_folder}/previous_runs/{date_str}")
        print()
        target_dir = os.path.join(logs_folder, "previous_runs", date_str)
        mkdirs(target_dir, 0o755)
        files = os.listdir(logs_folder)
        for file in files:
            if file != "previous_runs":
                file_path = os.path.join(logs_folder, file)
                shutil.move(file_path, target_dir)
        super().tearDown()

    def initial_db_init(self):
        if os.environ.get("RUN_AIRFLOW_1_10"):
            print("Attempting to reset the db using airflow command")
            os.system("airflow resetdb -y")
        else:
            from airflow.utils import db
            db.resetdb()
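The RUN_AIRFLOW_1_10 environment variable is the 1.10-series toggle used throughout this class: initial_db_init resets the database through the CLI on 1.10 (the internal db module path differs there) and through airflow.utils.db on master. A minimal standalone sketch of the same dispatch, assuming subprocess is acceptable in place of os.system:

# Sketch of the same branching as initial_db_init above; subprocess.run is
# used here instead of os.system, purely for illustration.
import os
import subprocess


def initial_db_init_sketch():
    if os.environ.get("RUN_AIRFLOW_1_10"):
        # Airflow 1.10.* series: reset via the resetdb CLI command.
        subprocess.run(["airflow", "resetdb", "-y"], check=True)
    else:
        from airflow.utils import db
        db.resetdb()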
    def _print_all_log_files(self):
        print()
        print("Printing all log files")
        print()
        logs_folder = resolve_logs_folder()
        for dirpath, _, filenames in os.walk(logs_folder):
            if "/previous_runs" not in dirpath:
                for name in filenames:
                    filepath = os.path.join(dirpath, name)
                    print()
                    print(f" ================ Content of {filepath} ===============================")
                    print()
                    with open(filepath, "r") as f:
                        print(f.read())

    def correct_imports_for_airflow_1_10(self, directory):
        for dirpath, _, filenames in os.walk(directory):
            for filename in filenames:
                filepath = os.path.join(dirpath, filename)
                if filepath.endswith(".py"):
                    self.replace_airflow_1_10_imports(filepath)

    def replace_airflow_1_10_imports(self, filepath):
        replacements = [
            ("airflow.operators.bash", "airflow.operators.bash_operator"),
            ("airflow.operators.python", "airflow.operators.python_operator"),
        ]
        with open(filepath, "rt") as file:
            data = file.read()
        for replacement in replacements:
            data = data.replace(replacement[0], replacement[1])
        with open(filepath, "wt") as file:
            file.write(data)
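A caveat worth noting about the plain str.replace approach above: each old module name is a prefix of its replacement, so the mapping only works one way. A small demo of the rewrite, with the edge case spelled out:

# Demo of the import rewrite performed for 1.10 backport runs; standalone
# re-implementation of the method above, for illustration only.
replacements = [
    ("airflow.operators.bash", "airflow.operators.bash_operator"),
    ("airflow.operators.python", "airflow.operators.python_operator"),
]

src = "from airflow.operators.bash import BashOperator\n"
for old, new in replacements:
    src = src.replace(old, new)
print(src)  # -> from airflow.operators.bash_operator import BashOperator

# Edge case: a file that already imported airflow.operators.bash_operator would
# be rewritten to airflow.operators.bash_operator_operator. The master example
# DAGs copied by run_dag use only the new-style imports, so this does not occur.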
    def run_dag(self, dag_id: str, dag_folder: str = DEFAULT_DAG_FOLDER) -> None:
        """
        Runs an example DAG by its ID.

@@ -119,6 +202,15 @@ class SystemTest(TestCase, LoggingMixin):
        :param dag_folder: directory where to look for the specific DAG. Relative to AIRFLOW_HOME.
        :type dag_folder: str
        """
        if os.environ.get("RUN_AIRFLOW_1_10"):
            # For system test purposes we mount airflow/providers to the /providers folder
            # so that we can get the example_dags from there
            dag_folder = dag_folder.replace("/opt/airflow/airflow/providers", "/providers")
            temp_dir = mkdtemp()
            os.rmdir(temp_dir)
            shutil.copytree(dag_folder, temp_dir)
            dag_folder = temp_dir
            self.correct_imports_for_airflow_1_10(temp_dir)
        self.log.info("Looking for DAG: %s in %s", dag_id, dag_folder)
        dag_bag = models.DagBag(dag_folder=dag_folder, include_examples=False)
        dag = dag_bag.get_dag(dag_id)

@@ -135,4 +227,8 @@ class SystemTest(TestCase, LoggingMixin):

        self.log.info("Attempting to run DAG: %s", dag_id)
        dag.clear(reset_dag_runs=True)
        dag.run(ignore_first_depends_on_past=True, verbose=True)
        try:
            dag.run(ignore_first_depends_on_past=True, verbose=True)
        except Exception:
            self._print_all_log_files()
            raise
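With the markers in place, the system tests can be selected at collection time instead of being skipped one by one. A sketch of a programmatic run, equivalent to passing -m on the pytest command line; the path is one of the test packages touched by this commit:

# Select only tests carrying @pytest.mark.system, e.g. system("google.cloud").
import pytest

pytest.main([
    "tests/providers/google/cloud/operators/",
    "-m", "system",
])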