.. Licensed to the Apache Software Foundation (ASF) under one
   or more contributor license agreements.  See the NOTICE file
   distributed with this work for additional information
   regarding copyright ownership.  The ASF licenses this file
   to you under the Apache License, Version 2.0 (the
   "License"); you may not use this file except in compliance
   with the License.  You may obtain a copy of the License at

..   http://www.apache.org/licenses/LICENSE-2.0

.. Unless required by applicable law or agreed to in writing,
   software distributed under the License is distributed on an
   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
   KIND, either express or implied.  See the License for the
   specific language governing permissions and limitations
   under the License.

Security
========

.. include:: ../.github/SECURITY.rst

Web Authentication
------------------

By default, Airflow requires users to specify a password prior to login. You can use the
following CLI commands to create an account:

.. code-block:: bash

    # create an admin user
    airflow users -c --username admin --firstname Peter --lastname Parker --role Admin --email spiderman@superhero.org

It is however possible to switch on authentication by either using one of the supplied
backends or creating your own.

Be sure to check out :doc:`api` for securing the API.

.. note::

    Airflow uses the config parser of Python. This config parser interpolates
    '%'-signs. Make sure to escape any ``%`` signs in your config file (but not
    environment variables) as ``%%``, otherwise Airflow might leak these
    passwords on a config parser exception to a log.
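
For example, a connection string whose password contains a literal ``%`` would be written like this in ``airflow.cfg`` (a minimal sketch; the credentials are illustrative):

.. code-block:: ini

    [core]
    # the real password is "pass%word"; the %-sign is doubled for the config parser
    sql_alchemy_conn = mysql://airflow:pass%%word@localhost:3306/airflow
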
Password
''''''''

One of the simplest mechanisms for authentication is requiring users to specify a password before logging in.

Please use the command line interface ``airflow users --create`` to create accounts, or do that in the UI.

LDAP
''''

To turn on LDAP authentication configure your ``airflow.cfg`` as follows. Please note that the example uses
an encrypted connection to the ldap server as we do not want passwords to be readable on the network level.

Additionally, if you are using Active Directory, and are not explicitly specifying an OU that your users are in,
you will need to change ``search_scope`` to "SUBTREE".

Valid search_scope options can be found in the `ldap3 Documentation <http://ldap3.readthedocs.org/searches.html?highlight=search_scope>`_

.. code-block:: bash

    [webserver]
    authenticate = True
    auth_backend = airflow.contrib.auth.backends.ldap_auth

    [ldap]
    # set a connection without encryption: uri = ldap://<your.ldap.server>:<port>
    uri = ldaps://<your.ldap.server>:<port>
    user_filter = objectClass=*
    # in case of Active Directory you would use: user_name_attr = sAMAccountName
    user_name_attr = uid
    # group_member_attr should be set accordingly with *_filter
    # eg:
    #     group_member_attr = groupMembership
    #     superuser_filter = groupMembership=CN=airflow-super-users...
    group_member_attr = memberOf
    superuser_filter = memberOf=CN=airflow-super-users,OU=Groups,OU=RWC,OU=US,OU=NORAM,DC=example,DC=com
    data_profiler_filter = memberOf=CN=airflow-data-profilers,OU=Groups,OU=RWC,OU=US,OU=NORAM,DC=example,DC=com
    bind_user = cn=Manager,dc=example,dc=com
    bind_password = insecure
    basedn = dc=example,dc=com
    cacert = /etc/ca/ldap_ca.crt
    # Set search_scope to one of: BASE, LEVEL, SUBTREE
    # Set search_scope to SUBTREE if using Active Directory, and not specifying an Organizational Unit
    search_scope = LEVEL

    # This option tells ldap3 to ignore schemas that are considered malformed. This sometimes comes up
    # when using hosted ldap services.
    ignore_malformed_schema = False

The superuser_filter and data_profiler_filter are optional. If defined, these configurations allow you to specify LDAP groups that users must belong to in order to have superuser (admin) and data-profiler permissions. If undefined, all users will be superusers and data profilers.

Roll your own
'''''''''''''

Airflow uses ``flask_login`` and
exposes a set of hooks in the ``airflow.default_login`` module. You can
alter the content and make it part of the ``PYTHONPATH`` and configure it as a backend in ``airflow.cfg``.

.. code-block:: bash

    [webserver]
    authenticate = True
    auth_backend = mypackage.auth
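
As a starting point, you can copy the ``airflow.default_login`` module into your own package, edit the hooks it defines, and put the package on the ``PYTHONPATH`` (a sketch; the paths and the ``mypackage`` name from the example above are illustrative):

.. code-block:: bash

    # create a package to hold your customized backend
    mkdir -p /opt/airflow-auth/mypackage
    touch /opt/airflow-auth/mypackage/__init__.py

    # copy the default backend as a template and edit it
    cp "$(python -c 'import airflow.default_login as m; print(m.__file__)')" \
        /opt/airflow-auth/mypackage/auth.py

    # make the package importable before starting the webserver
    export PYTHONPATH=/opt/airflow-auth:$PYTHONPATH
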
API Authentication
------------------

Authentication for the API is handled separately from the Web Authentication. The default is to not
require any authentication on the API, i.e. it is wide open by default. This is not recommended if your
Airflow webserver is publicly accessible, and you should probably use the ``deny all`` backend:

.. code-block:: ini

    [api]
    auth_backend = airflow.api.auth.backend.deny_all

Two "real" methods for authentication are currently supported for the API.

To enable password authentication, set the following in the configuration:

.. code-block:: ini

    [api]
    auth_backend = airflow.contrib.auth.backends.password_auth

Its usage is similar to the Password Authentication used for the Web interface.
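
For example, a client could then call the experimental API with HTTP basic credentials (a sketch; host and credentials are illustrative):

.. code-block:: bash

    # the /test endpoint simply confirms the API is reachable
    curl -u admin:admin http://localhost:8080/api/experimental/test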

To enable Kerberos authentication, set the following in the configuration:

.. code-block:: ini

    [api]
    auth_backend = airflow.api.auth.backend.kerberos_auth

    [kerberos]
    keytab = <KEYTAB>

The Kerberos service is configured as ``airflow/fully.qualified.domainname@REALM``. Make sure this
principal exists in the keytab file.
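
With Kerberos enabled, a client first obtains a ticket and then lets ``curl`` negotiate via SPNEGO (a sketch; the host is illustrative):

.. code-block:: bash

    # obtain a ticket for the calling user, then authenticate against the API
    kinit user@EXAMPLE.COM
    curl --negotiate -u : http://localhost:8080/api/experimental/test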

Kerberos
--------

Airflow has initial support for Kerberos. This means that Airflow can renew Kerberos
tickets for itself and store them in the ticket cache. The hooks and DAGs can make use of the ticket
to authenticate against kerberized services.

Limitations
'''''''''''

Please note that at this time, not all hooks have been adjusted to make use of this functionality.

Also, Airflow does not integrate Kerberos into the web interface and you will have to rely on
network-level security for now to make sure your service remains secure.

Celery integration has not been tried and tested yet. However, if you generate a key tab for every
host and launch a ticket renewer next to every worker it will most likely work.

Enabling kerberos
'''''''''''''''''

Airflow
^^^^^^^

To enable kerberos you will need to generate a (service) key tab.

.. code-block:: bash

    # in the kadmin.local or kadmin shell, create the airflow principal
    kadmin: addprinc -randkey airflow/fully.qualified.domain.name@YOUR-REALM.COM

    # Create the airflow keytab file that will contain the airflow principal
    kadmin: xst -norandkey -k airflow.keytab airflow/fully.qualified.domain.name

Now store this file in a location where the airflow user can read it (chmod 600), and then add the following to
your ``airflow.cfg``:

.. code-block:: bash

    [core]
    security = kerberos

    [kerberos]
    keytab = /etc/airflow/airflow.keytab
    reinit_frequency = 3600
    principal = airflow

Launch the ticket renewer by running:

.. code-block:: bash

    # run ticket renewer
    airflow kerberos
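
You can check that the renewer obtained a ticket by listing the ticket cache (a quick sanity check; output varies by environment):

.. code-block:: bash

    # the airflow principal should appear in the ticket cache
    klist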

Hadoop
^^^^^^

If you want to use impersonation, this needs to be enabled in the ``core-site.xml`` of your Hadoop config.

.. code-block:: xml

    <property>
      <name>hadoop.proxyuser.airflow.groups</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.airflow.users</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.airflow.hosts</name>
      <value>*</value>
    </property>

Of course if you need to tighten your security, replace the asterisk with something more appropriate, as in the sketch below.
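
For example, a locked-down configuration might restrict impersonation to a specific group and host (the values here are illustrative):

.. code-block:: xml

    <property>
      <name>hadoop.proxyuser.airflow.groups</name>
      <value>etl-users</value>
    </property>
    <property>
      <name>hadoop.proxyuser.airflow.hosts</name>
      <value>airflow-worker.example.com</value>
    </property>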

Using kerberos authentication
'''''''''''''''''''''''''''''

The hive hook has been updated to take advantage of kerberos authentication. To allow your DAGs to
use it, simply update the connection details with, for example:

.. code-block:: bash

    { "use_beeline": true, "principal": "hive/_HOST@EXAMPLE.COM"}

Adjust the principal to your settings. The _HOST part will be replaced by the fully qualified domain name of
the server.

You can specify if you would like to use the dag owner as the user for the connection or the user specified in the login
section of the connection. For the login user, specify the following as extra:

.. code-block:: bash

    { "use_beeline": true, "principal": "hive/_HOST@EXAMPLE.COM", "proxy_user": "login"}

For the DAG owner use:

.. code-block:: bash

    { "use_beeline": true, "principal": "hive/_HOST@EXAMPLE.COM", "proxy_user": "owner"}

and in your DAG, when initializing the HiveOperator, specify:

.. code-block:: bash

    run_as_owner=True
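
Putting this together, a task using the kerberized hive connection might be declared as follows (a sketch; the task id, query, and connection id are illustrative):

.. code-block:: python

    from airflow.operators.hive_operator import HiveOperator

    # runs the query as the DAG owner via the kerberized connection configured above
    count_rows = HiveOperator(
        task_id='count_rows',
        hql='SELECT count(*) FROM my_table',
        hive_cli_conn_id='hive_cli_default',
        run_as_owner=True,
        dag=dag,
    )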

To use kerberos authentication, you must install Airflow with the ``kerberos`` extras group:

.. code-block:: bash

    pip install 'apache-airflow[kerberos]'

OAuth Authentication
--------------------

GitHub Enterprise (GHE) Authentication
''''''''''''''''''''''''''''''''''''''

The GitHub Enterprise authentication backend can be used to authenticate users
against an installation of GitHub Enterprise using OAuth2. You can optionally
specify a team whitelist (composed of slug cased team names) to restrict login
to only members of those teams.

.. code-block:: bash

    [webserver]
    authenticate = True
    auth_backend = airflow.contrib.auth.backends.github_enterprise_auth

    [github_enterprise]
    host = github.example.com
    client_id = oauth_key_from_github_enterprise
    client_secret = oauth_secret_from_github_enterprise
    oauth_callback_route = /example/ghe_oauth/callback
    allowed_teams = 1, 345, 23

.. note:: If you do not specify a team whitelist, anyone with a valid account on
   your GHE installation will be able to login to Airflow.

To use GHE authentication, you must install Airflow with the ``github_enterprise`` extras group:

.. code-block:: bash

    pip install 'apache-airflow[github_enterprise]'

Setting up GHE Authentication
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

An application must be set up in GHE before you can use the GHE authentication
backend. In order to set up an application:

1. Navigate to your GHE profile
2. Select 'Applications' from the left hand nav
3. Select the 'Developer Applications' tab
4. Click 'Register new application'
5. Fill in the required information (the 'Authorization callback URL' must be fully qualified e.g. http://airflow.example.com/example/ghe_oauth/callback)
6. Click 'Register application'
7. Copy 'Client ID', 'Client Secret', and your callback route to your airflow.cfg according to the above example

Using GHE Authentication with github.com
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

It is possible to use GHE authentication with github.com:

1. `Create an OAuth App <https://developer.github.com/apps/building-oauth-apps/creating-an-oauth-app/>`_
2. Copy 'Client ID', 'Client Secret' to your airflow.cfg according to the above example
3. Set ``host = github.com`` and ``oauth_callback_route = /oauth/callback`` in airflow.cfg
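
Combining these steps, the resulting ``airflow.cfg`` would look roughly like this (a sketch; the credentials are placeholders):

.. code-block:: bash

    [webserver]
    authenticate = True
    auth_backend = airflow.contrib.auth.backends.github_enterprise_auth

    [github_enterprise]
    host = github.com
    client_id = oauth_key_from_github
    client_secret = oauth_secret_from_github
    oauth_callback_route = /oauth/callback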

Google Authentication
'''''''''''''''''''''

The Google authentication backend can be used to authenticate users
against Google using OAuth2. You must specify the email domains, separated by commas, to restrict
login to only members of those domains.

.. code-block:: bash

    [webserver]
    authenticate = True
    auth_backend = airflow.contrib.auth.backends.google_auth

    [google]
    client_id = google_client_id
    client_secret = google_client_secret
    oauth_callback_route = /oauth2callback
    domain = example1.com,example2.com

To use Google authentication, you must install Airflow with the ``google_auth`` extras group:

.. code-block:: bash

    pip install 'apache-airflow[google_auth]'

Setting up Google Authentication
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

An application must be set up in the Google API Console before you can use the Google authentication
backend. In order to set up an application:

1. Navigate to https://console.developers.google.com/apis/
2. Select 'Credentials' from the left hand nav
3. Click 'Create credentials' and choose 'OAuth client ID'
4. Choose 'Web application'
5. Fill in the required information (the 'Authorized redirect URIs' must be fully qualified e.g. http://airflow.example.com/oauth2callback)
6. Click 'Create'
7. Copy 'Client ID', 'Client Secret', and your redirect URI to your airflow.cfg according to the above example

SSL
---

SSL can be enabled by providing a certificate and key. Once enabled, be sure to use
"https://" in your browser.

.. code-block:: bash

    [webserver]
    web_server_ssl_cert = <path to cert>
    web_server_ssl_key = <path to key>

Enabling SSL will not automatically change the web server port. If you want to use the
standard port 443, you'll need to configure that too. Be aware that super user privileges
(or cap_net_bind_service on Linux) are required to listen on port 443.

.. code-block:: bash

    # Optionally, set the server to listen on the standard SSL port.
    web_server_port = 443
    base_url = https://<hostname or IP>:443
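
To quickly confirm the webserver is serving TLS, you can probe it with ``curl`` (a sketch; the hostname is illustrative, and ``-k`` skips certificate verification for self-signed certs):

.. code-block:: bash

    # expect a response over HTTPS from the webserver's health endpoint
    curl -kv https://airflow.example.com:443/health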

Enable CeleryExecutor with SSL. Ensure you properly generate client and server
certs and keys.

.. code-block:: bash

    [celery]
    ssl_active = True
    ssl_key = <path to key>
    ssl_cert = <path to cert>
    ssl_cacert = <path to cacert>

Impersonation
-------------

Airflow has the ability to impersonate a unix user while running task
instances based on the task's ``run_as_user`` parameter, which takes a user's name.

**NOTE:** For impersonation to work, Airflow must be run with ``sudo`` as subtasks are run
with ``sudo -u`` and permissions of files are changed. Furthermore, the unix user needs to
exist on the worker. Here is what a simple sudoers file entry could look like to achieve
this, assuming Airflow is running as the ``airflow`` user. Note that this means that
the airflow user must be trusted and treated the same way as the root user.

.. code-block:: none

    airflow ALL=(ALL) NOPASSWD: ALL

Subtasks with impersonation will still log to the same folder, except that the files they
log to will have permissions changed such that only the unix user can write to it.
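
For example, a task that should run as a dedicated service account could be declared as follows (a sketch; the operator, command, and ``etl_svc`` user are illustrative):

.. code-block:: python

    from airflow.operators.bash_operator import BashOperator

    # the subtask is executed via `sudo -u etl_svc`, so etl_svc must exist on the worker
    print_user = BashOperator(
        task_id='print_user',
        bash_command='whoami',
        run_as_user='etl_svc',
        dag=dag,
    )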

Default Impersonation
'''''''''''''''''''''

To prevent tasks that don't use impersonation from being run with ``sudo`` privileges, you can set the
``core:default_impersonation`` config which sets a default user to impersonate if ``run_as_user`` is
not set.

.. code-block:: bash

    [core]
    default_impersonation = airflow

Flower Authentication
---------------------

Basic authentication for Celery Flower is supported.

You can specify the details either as an optional argument in the Flower process launching
command, or as a configuration item in your ``airflow.cfg``. For both cases, please provide
``user:password`` pairs separated by a comma.

.. code-block:: bash

    airflow flower --basic_auth=user1:password1,user2:password2

.. code-block:: bash

    [celery]
    flower_basic_auth = user1:password1,user2:password2

RBAC UI Security
----------------

Security of Airflow Webserver UI is handled by Flask AppBuilder (FAB).
Please read its related `security document <http://flask-appbuilder.readthedocs.io/en/latest/security.html>`_
regarding its security model.

Default Roles
'''''''''''''

Airflow ships with a set of roles by default: Admin, User, Op, Viewer, and Public.
Only ``Admin`` users can configure or alter the permissions of other roles, and it is not recommended
that ``Admin`` users alter these default roles in any way by removing
or adding permissions to these roles.

Admin
^^^^^

``Admin`` users have all possible permissions, including granting or revoking permissions from
other users.

Public
^^^^^^

``Public`` users (anonymous) don't have any permissions.

Viewer
^^^^^^

``Viewer`` users have limited viewer permissions

.. exampleinclude:: ../airflow/www/security.py
    :language: python
    :start-after: [START security_viewer_perms]
    :end-before: [END security_viewer_perms]

on limited web views

.. exampleinclude:: ../airflow/www/security.py
    :language: python
    :start-after: [START security_viewer_vms]
    :end-before: [END security_viewer_vms]

User
^^^^

``User`` users have ``Viewer`` permissions plus additional user permissions

.. exampleinclude:: ../airflow/www/security.py
    :language: python
    :start-after: [START security_user_perms]
    :end-before: [END security_user_perms]

on User web views which is the same as Viewer web views.

Op
^^

``Op`` users have ``User`` permissions plus additional op permissions

.. exampleinclude:: ../airflow/www/security.py
    :language: python
    :start-after: [START security_op_perms]
    :end-before: [END security_op_perms]

on ``User`` web views plus these additional op web views

.. exampleinclude:: ../airflow/www/security.py
    :language: python
    :start-after: [START security_op_vms]
    :end-before: [END security_op_vms]

Custom Roles
'''''''''''''

DAG Level Role
^^^^^^^^^^^^^^

``Admin`` can create a set of roles which are only allowed to view a certain set of dags. This is called DAG level access. Each dag defined in the dag model table
is treated as a ``View`` which has two permissions associated with it (``can_dag_read`` and ``can_dag_edit``). There is a special view called ``all_dags`` which
allows the role to access all the dags. The default ``Admin``, ``Viewer``, ``User``, ``Op`` roles can all access ``all_dags`` view.