Some community-contributed sensors are missing from the API reference. This PR fixes docs/code.rst to reference those sensor classes.
Closes #3125 from sekikn/AIRFLOW-2212
* Fix autodoc import path for BaseSensorOperator
* Add autodoc imports for many core & contrib Operators that were missing
* Rename "Operator API" section to "Core Operators" for better contrast with the following "Community-contributed Operators" section
* Add subheadings to API Reference#Operators. Since all the Sensors were already alphabetized separately from the rest of the operators, this formalizes that distinction and moves all the Transfer operators to their own section as well.
* Alphabetize Operator class names
* Improve formatting in top-level Operators section
This also fixes the earlier, more narrowly scoped [AIRFLOW-951].
- Added `create_bucket` method to `gcs_hook` and created the corresponding operator `GoogleCloudStorageCreateBucket` (see the usage sketch below)
- Added tests
- Added documentation
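A minimal usage sketch of the new operator; the `Operator`-suffixed class name, module path, and parameter names below are assumptions patterned on the contrib layout, not taken from this commit.

```python
from datetime import datetime

from airflow import DAG
# Assumed module path and class name:
from airflow.contrib.operators.gcs_operator import GoogleCloudStorageCreateBucketOperator

with DAG("gcs_create_bucket_example",
         start_date=datetime(2018, 1, 1),
         schedule_interval=None) as dag:
    create_bucket = GoogleCloudStorageCreateBucketOperator(
        task_id="create_bucket",
        bucket_name="my-new-bucket",     # assumed parameter name
        storage_class="MULTI_REGIONAL",  # assumed parameter name
        location="US",                   # assumed parameter name
    )
```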
Closes #3044 from kaxil/AIRFLOW-1618
- Add missing operator in `code.rst` and `integration.rst`
- Fix documentation in DataProc operator
- Minor doc fix in GCS operators
- Fix codeblocks & links in docstrings for BigQuery, DataProc, DataFlow, MLEngine, GCS hooks & operators
Closes #3003 from kaxil/doc_update
Moving the sensors to separate files increases the readability of the code and shrinks the big core.py file.
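A sketch of the import-path change this move implies; the new per-file module path is an assumption based on the described layout.

```python
# Before: sensors were bundled into the monolithic sensors module.
from airflow.operators.sensors import HttpSensor

# After the move (assumed new per-file module path):
from airflow.sensors.http_sensor import HttpSensor
```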
Closes #2875 from Fokko/AIRFLOW-1889-move-sensors-to-separate-package
This sensor succeeds once a bash command/script returns 0, and keeps poking otherwise. The implementation closely mirrors that of BashOperator.
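A minimal usage sketch, assuming the sensor takes a BashOperator-style `bash_command` plus the standard sensor arguments; the module path is an assumption.

```python
from airflow.contrib.sensors.bash_sensor import BashSensor  # assumed module path

wait_for_flag = BashSensor(
    task_id="wait_for_flag",
    bash_command="test -f /tmp/ready.flag",  # succeeds once this exits with 0
    poke_interval=30,  # re-run the command every 30 seconds until it returns 0
    timeout=60 * 60,   # fail the task if still poking after an hour
)
```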
Closes #2489 from diogoalexandrefranco/master
Update Sphinx docs to use the correct import structure. Fixes improperly mocked modules that resulted in hooks not displaying. Fixes the executors and operators sections, which weren't displaying anything.
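The fix concerns how docs/conf.py stubs out heavyweight imports so autodoc can import the modules; a generic sketch of that mechanism follows (the module list is illustrative, not the literal diff).

```python
# docs/conf.py (sketch): stub heavy third-party imports so Sphinx autodoc
# can import hooks and operators without the real dependencies installed.
import sys
from unittest import mock

MOCK_MODULES = ["boto3", "google.cloud", "azure.storage"]  # illustrative list
for mod_name in MOCK_MODULES:
    sys.modules[mod_name] = mock.MagicMock()
```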
Closes #2894 from andyxhadji/AIRFLOW-1942
Adds the necessary hooks to support pulling and acknowledging Pub/Sub messages. This is implemented by adding a PubSubPullSensor operator that will attempt to retrieve messages from a specified subscription and will meet its criteria when one or more messages are available. The configuration allows those messages to be acknowledged immediately. In addition, the messages are passed to downstream workers via the return value of the operator's execute method.
An end-to-end example is included showing topic and subscription creation, parallel tasks to publish and pull messages, and a downstream chain to echo the contents of each message before cleaning up.
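A condensed sketch of the pull-then-echo pattern from that example; the module path and parameter names are assumptions patterned on the contrib conventions.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.sensors.pubsub_sensor import PubSubPullSensor  # assumed path
from airflow.operators.python_operator import PythonOperator


def echo_messages(**context):
    # The sensor's execute() return value reaches downstream tasks via XCom.
    for message in context["ti"].xcom_pull(task_ids="pull_messages"):
        print(message)


with DAG("pubsub_pull_example",
         start_date=datetime(2018, 1, 1),
         schedule_interval=None) as dag:
    pull = PubSubPullSensor(
        task_id="pull_messages",
        project="my-gcp-project",        # assumed parameter name
        subscription="my-subscription",  # assumed parameter name
        ack_messages=True,               # acknowledge immediately, per the description
    )
    echo = PythonOperator(
        task_id="echo_messages",
        python_callable=echo_messages,
        provide_context=True,  # needed on Airflow 1.x to receive **context
    )
    pull >> echo
```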
Closes #2885 from prodonjs/airflow-1932-pr
Add DatabricksSubmitRun Operator
In this PR, we contribute a DatabricksSubmitRun operator and a Databricks hook. This operator enables easy integration of Airflow with Databricks. In addition to the operator, we have created a databricks_default connection, an example_dag using this DatabricksSubmitRunOperator, and matching documentation.
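A minimal sketch of a notebook run; the payload fields mirror the Databricks Runs Submit API, and the specific values are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.databricks_operator import DatabricksSubmitRunOperator

with DAG("databricks_example",
         start_date=datetime(2017, 1, 1),
         schedule_interval=None) as dag:
    notebook_run = DatabricksSubmitRunOperator(
        task_id="notebook_run",
        databricks_conn_id="databricks_default",  # connection created by this PR
        new_cluster={  # cluster spec per the Runs Submit API; values are placeholders
            "spark_version": "2.1.0-db3-scala2.11",
            "node_type_id": "r3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Users/someone@example.com/ExampleNotebook"},
    )
```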
Closes #2202 from andrewmchen/databricks-operator-squashed
This PR implements a hook to interface with Azure storage over wasb:// via azure-storage; adds sensors to check for blobs or prefixes; and adds an operator to transfer a local file to Blob Storage. The design is similar to that of the S3Hook in airflow.operators.S3_hook.
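A usage sketch tying the sensor and transfer operator together; class names, module paths, and parameter names are assumptions patterned on the contrib naming.

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.file_to_wasb import FileToWasbOperator  # assumed path
from airflow.contrib.sensors.wasb_sensor import WasbBlobSensor         # assumed path

with DAG("wasb_example",
         start_date=datetime(2017, 1, 1),
         schedule_interval=None) as dag:
    wait_for_blob = WasbBlobSensor(
        task_id="wait_for_blob",
        container_name="mycontainer",  # assumed parameter name
        blob_name="data/input.csv",    # assumed parameter name
        wasb_conn_id="wasb_default",
    )
    upload = FileToWasbOperator(
        task_id="upload_file",
        file_path="/tmp/output.csv",   # local file to transfer
        container_name="mycontainer",
        blob_name="results/output.csv",
        wasb_conn_id="wasb_default",
    )
    wait_for_blob >> upload
```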
Closes #2216 from hgrif/AIRFLOW-1065
Dear Airflow Maintainers,
Please accept this PR that addresses the following issue:
- https://issues.apache.org/jira/browse/AIRFLOW-155
Thanks,
Sumit
Author: Sumit Maheshwari <sumitm@qubole.com>
Closes #1560 from msumit/AIRFLOW-155.