Improving the TriggerDagRunOperator example
Parent: 5d3cb5fa7b
Commit: 2a3e526a30
@@ -1,6 +1,6 @@
-from airflow.operators import *
+from airflow.operators import BashOperator, PythonOperator
 from airflow.models import DAG
-from datetime import date, datetime, time, timedelta
+from datetime import datetime
 
 import pprint
 pp = pprint.PrettyPrinter(indent=4)
@@ -41,3 +41,9 @@ run_this = PythonOperator(
     provide_context=True,
     python_callable=run_this_func,
     dag=dag)
+
+# You can also access the DagRun object in templates
+bash_task = BashOperator(
+    task_id="bash_task",
+    bash_command='echo "Here is the message: {{ dag_run.conf["message"] if dag_run else "" }}" ',
+    dag=dag)
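As a stand-alone sketch (no Airflow install needed) of what the hunk above relies on: with ``provide_context=True``, the ``python_callable`` receives the template context as keyword arguments, and the payload passed by the triggering DAG sits in ``dag_run.conf``. The ``SimpleNamespace`` below is a hypothetical stand-in for Airflow's DagRun object, used only to make the pattern runnable here; the print message is illustrative, not the example's exact wording.

```python
from types import SimpleNamespace


def run_this_func(**context):
    # With provide_context=True, Airflow passes the template context
    # (dag_run, run_id, execution_date, ...) as keyword arguments.
    dag_run = context.get("dag_run")
    # The triggering DAG's payload arrives via dag_run.conf; guard for
    # runs that were not externally triggered, like the Jinja template
    # `{{ dag_run.conf["message"] if dag_run else "" }}` does.
    message = dag_run.conf.get("message", "") if dag_run else ""
    print("Received value %r for key=message" % message)
    return message


# Simulate the context of an externally triggered run.
fake_dag_run = SimpleNamespace(conf={"message": "Hello World"})
run_this_func(dag_run=fake_dag_run, run_id="trig__2015-01-01")
```

The same guard matters in the Bash task's template: on a scheduled (non-triggered) run there is no conf payload, so the expression falls back to the empty string instead of raising.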
@@ -135,7 +135,7 @@ Variable Description
                                   represents the content of your
                                   ``airflow.cfg``
 ``run_id``                        the ``run_id`` of the current DAG run
-``dag_run``                       a reference to the DAG run object
+``dag_run``                       a reference to the DagRun object
 ``test_mode``                     whether the task instance was called using
                                   the CLI's test subcommand
 ================================= ====================================
@@ -247,12 +247,15 @@ For any queries/bugs on `MesosExecutor`, please contact `@kapil-malik <https://g
 
 Integration with systemd
 ''''''''''''''''''''''''
-Airflow can integrate with systemd based systems. This makes watching your daemons easy as systemd
-can take care of restarting a daemon on failure. In the ``scripts/systemd`` directory you can find unit files that
-have been tested on Redhat based systems. You can copy those to ``/usr/lib/systemd/system``. It is assumed that
-Airflow will run under ``airflow:airflow``. If not (or if you are running on a non Redhat based system) you
-probably need to adjust the unit files.
+Airflow can integrate with systemd based systems. This makes watching your
+daemons easy as systemd can take care of restarting a daemon on failure.
+In the ``scripts/systemd`` directory you can find unit files that
+have been tested on Redhat based systems. You can copy those to
+``/usr/lib/systemd/system``. It is assumed that Airflow will run under
+``airflow:airflow``. If not (or if you are running on a non Redhat
+based system) you probably need to adjust the unit files.
 
-Environment configuration is picked up from ``/etc/sysconfig/airflow``. An example file is supplied.
-Make sure to specify the ``SCHEDULER_RUNS`` variable in this file when you run the schduler. You
-can also define here, for example, ``AIRFLOW_HOME`` or ``AIRFLOW_CONFIG``.
+Environment configuration is picked up from ``/etc/sysconfig/airflow``.
+An example file is supplied. Make sure to specify the ``SCHEDULER_RUNS``
+variable in this file when you run the scheduler. You
+can also define here, for example, ``AIRFLOW_HOME`` or ``AIRFLOW_CONFIG``.
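For orientation, a scheduler unit along the lines of those shipped in ``scripts/systemd`` could look like the sketch below. Treat every detail as illustrative rather than authoritative: the exact paths, ``After=`` targets, and the use of ``SCHEDULER_RUNS`` should be checked against the actual unit files in the repository.

```
[Unit]
Description=Airflow scheduler daemon
After=network.target

[Service]
# Environment (e.g. AIRFLOW_HOME, AIRFLOW_CONFIG, SCHEDULER_RUNS) is read
# from the sysconfig file mentioned in the docs above.
EnvironmentFile=/etc/sysconfig/airflow
# The docs assume Airflow runs under airflow:airflow; adjust if not.
User=airflow
Group=airflow
ExecStart=/bin/airflow scheduler -n ${SCHEDULER_RUNS}
# Let systemd restart the daemon on failure, as described above.
Restart=always
RestartSec=5s

[Install]
WantedBy=multi-user.target
```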