README.md

Table of contents

Apache Airflow source releases

Apache Airflow releases come in one of two types:

  • Releases of the Apache Airflow package
  • Releases of the Backport Providers Packages

Apache Airflow Package

This package contains sources that allow the user to build a fully functional Apache Airflow 2.0 package. It contains sources for:

  • "apache-airflow" python package that installs "airflow" Python package and includes all the assets required to release the webserver UI coming with Apache Airflow
  • Dockerfile and corresponding scripts that build and use an official DockerImage
  • Breeze development environment that helps with building images and testing locally apache airflow built from sources

In the future (Airflow 2.0) this package will be split into separate "core" and "providers" packages that will be distributed separately, following the mechanisms introduced in the Backport Provider Packages. We also plan to release the official Helm Chart sources that will allow the user to install Apache Airflow via a Helm 3.0 chart in a distributed fashion.

The Source releases are the only "official" Apache Software Foundation releases, and they are distributed via Official Apache Download sources.

Following the source releases, the Apache Airflow release manager also distributes convenience packages, such as the packages published to PyPI and the Docker images.

Those convenience packages are not "official releases" of Apache Airflow; users who cannot or do not want to build the packages themselves can use them as a convenient way of installing Apache Airflow, but they are not considered "official source releases". You can read more details about it in the ASF Release Policy.
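
As a hedged illustration of what "official source release" means in practice, a downloaded source archive can be verified against its published checksum and PGP signature. The file names and version below are hypothetical; the actual artifact names are listed on the download page:

# Import the Airflow developers' public keys from the KEYS file
gpg --import KEYS

# Compare the SHA512 checksum of the downloaded source archive with the published one
sha512sum apache-airflow-2.0.0-source.tar.gz
cat apache-airflow-2.0.0-source.tar.gz.sha512

# Verify the detached PGP signature of the archive
gpg --verify apache-airflow-2.0.0-source.tar.gz.asc apache-airflow-2.0.0-source.tar.gz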

Detailed instructions for releasing Apache Airflow can be found in the README_RELEASE_AIRFLOW.md

Provider packages

The Provider packages are packages (per provider) that make it possible to easily install Hooks, Operators, Sensors, and Secrets for different providers (external services used by Airflow).

There are also Backport Provider Packages that allow you to use the Operators, Hooks, and Secrets from the 2.0 version of Airflow in the 1.10.* series.

Once you release the packages, you can simply install them with:

pip install apache-airflow-providers-<PROVIDER>[<EXTRAS>]

for regular providers and

pip install apache-airflow-backport-providers-<PROVIDER>[<EXTRAS>]

for backport providers.

Where <PROVIDER> is the provider id and <EXTRAS> are optional extra packages to install. You can find the provider packages dependencies and extras in the README.md files in each provider package (in airflow/providers/<PROVIDER> folder) as well as in the PyPI installation page.
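
For example, installing the Google provider (the provider id and the extra below are only illustrative; check the provider's README for the extras it actually exposes) might look like this:

# Install the regular Google provider package (for Airflow 2.0)
pip install apache-airflow-providers-google

# Install the backport variant for Airflow 1.10.*, with an illustrative extra
pip install "apache-airflow-backport-providers-google[amazon]"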

Backport providers are a great way to migrate your DAGs to Airflow 2.0-compatible DAGs. You can switch to the new Airflow 2.0 packages in your DAGs long before you attempt to migrate Airflow to the 2.0 line.

The sources released in SVN allow the user to build all the provider packages, following the instructions and scripts provided. Those are also "official source releases" as described in the ASF Release Policy, and they are available via the Official Apache Download for providers and the Official Apache Download for backport providers.

The full list of providers can be found here: Provider Packages Reference

There are also convenience packages released as "apache-airflow-providers" and "apache-airflow-backport-providers" separately in PyPI. You can find all providers via the PyPI query for providers and all backport providers via the PyPI query for backport providers.

Detailed instructions for releasing Provider Packages can be found in the README_RELEASE_PROVIDER_PACKAGES.md

Prerequisites for the release manager preparing the release

The person acting as release manager has to fulfill certain prerequisites. More details and FAQs are available in the ASF Release Policy, but some important prerequisites are listed below. Note that the release manager does not have to be a PMC member - it is enough to be a committer to assume the release manager role - but there are final steps in the process (uploading final releases to SVN) that can only be done by a PMC member. If needed, the release manager can ask the PMC to perform that final step of the release.

Upload Public keys to id.apache.org

Make sure your public key is on id.apache.org and in KEYS. You will need to sign the release artifacts with your PGP key. After you have created a key, make sure you:

# Create PGP Key
gpg --gen-key

# Checkout ASF dist repo
svn checkout https://dist.apache.org/repos/dist/release/airflow
cd airflow


# Add your GPG pub key to KEYS file. Replace "Kaxil Naik" with your name
(gpg --list-sigs "Kaxil Naik" && gpg --armor --export "Kaxil Naik" ) >> KEYS


# Commit the changes
svn commit -m "Add PGP keys of Airflow developers"

See the following page for more details on creating keys and what is required for signing releases:

http://www.apache.org/dev/release-signing.html#basic-facts
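
As a rough sketch of what signing looks like (the artifact name is hypothetical, and the actual signing during a release is normally driven by the sign.sh script in this directory), the key is used to create and check a detached, ASCII-armored signature:

# Create a detached, ASCII-armored signature (produces a .asc file next to the artifact)
gpg --armor --detach-sign apache-airflow-2.0.0-source.tar.gz

# Verify the signature against the artifact
gpg --verify apache-airflow-2.0.0-source.tar.gz.asc apache-airflow-2.0.0-source.tar.gz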

Configure PyPI uploads

In order not to reveal your password in plain text, it's best to create and configure API upload tokens. You can add and copy the tokens here: https://pypi.org/manage/account/token/ and https://test.pypi.org/manage/account/token/

Create a ~/.pypirc file:

[distutils]
index-servers =
  pypi
  pypitest

[pypi]
username=__token__
password=<API Upload Token>

[pypitest]
repository=https://test.pypi.org/legacy/
username=__token__
password=<API Upload Token>

Set proper permissions for the pypirc file:

chmod 600 ~/.pypirc

Install twine if you do not have it already (it can be done in a separate virtual environment):

pip install twine

(more details here.)
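
With the tokens configured, the actual upload is done with twine. A minimal sketch, assuming the built distributions are in dist/ and using the repository names from the ~/.pypirc above:

# Upload to Test PyPI first (uses the [pypitest] section of ~/.pypirc)
twine upload -r pypitest dist/*

# After checking the test release, upload to the production PyPI index
twine upload -r pypi dist/*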

Hardware used to prepare and verify the packages

The best way to prepare and verify the releases is to prepare them on hardware owned and controlled by the committer acting as release manager. While, strictly speaking, releases must only be verified on hardware owned and controlled by the committer, for practical reasons it's best if the packages are prepared using such hardware. More information can be found in this FAQ.