dag_id is missing for all entries in metadata
table "job" with job_type "LocalTaskJob".
This is because dag_id was not specified
within class LocalTaskJob.
A test is added to check that the essential
attributes of LocalTaskJob are assigned
proper values without intervention.
- The default value of fs_conn_id was incorrect.
- Added a new test in which fs_conn_id is
  intentionally not set explicitly.
- A minor change to how a path is concatenated
Executes a task in a Kubernetes pod in the specified Google Kubernetes
Engine cluster. This makes it easier to interact with the GCP Kubernetes
Engine service because it encapsulates acquiring credentials.
There are several scenarios where Task Instance view tries to render
Python callables where 'x' is not the correct artefact to target.
This commit adds a helper function to test for known scenarios, and
derives the source from the correct artefact, or by default returns 'No
source available for <type>'. This means that even in unknown or
unfixable edge cases, the Task Instance view still renders instead of
displaying an exception.
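A minimal sketch of such a helper (the name and exact behaviour are illustrative, not the actual Airflow implementation): it unwraps `functools.partial` objects before asking `inspect` for the source, and falls back to a placeholder string instead of raising.

```python
import functools
import inspect

def get_python_source(x):
    # Illustrative sketch: unwrap known wrappers before
    # asking inspect for the source.
    if isinstance(x, functools.partial):
        x = x.func
    try:
        return inspect.getsource(x)
    except (TypeError, OSError):
        # Built-ins and other edge cases have no retrievable
        # source; return a placeholder so the view still renders.
        return "No source available for {}".format(type(x))
```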
Closes#3571 from night0wl/AIRFLOW-2099_task_view_type_check
Fix scripts/ci/kubernetes/minikube/start_minikube.sh
as follows:
- Make minikube version configurable via
environment variable
- Remove unused variables for readability
- Reorder some lines to remove warnings
- Replace ineffective `return` with `exit`
- Add -E to `sudo minikube` so that non-root
users can use this script locally
The documentation page "Scheduling & Triggers"
only mentions the CLI method to
manually trigger a DAG run.
However, the manual-trigger feature in the Web UI
should be mentioned as well
(it may be used even more frequently).
By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (or upgrade) it without an explicit choice.
This is part of the Apache requirements as we cannot depend on Category X
software.
Fixes PendingDeprecationWarning on HipChatAPISendRoomNotificationOperator
Using `HipChatAPISendRoomNotificationOperator` on Airflow master branch (2.0) gives:
airflow/models.py:2390: PendingDeprecationWarning:
Invalid arguments were passed to HipChatAPISendRoomNotificationOperator.
Support for passing such arguments will be dropped in Airflow 2.0.
Invalid arguments were:
*args: ()
**kwargs: {'color': 'green'}
category=PendingDeprecationWarning
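A minimal sketch of the failure mode and the fix, using toy classes rather than the real Airflow operator: the warning fires when operator-specific arguments such as `color` fall through to the base class via `**kwargs`, and goes away once the subclass declares them explicitly.

```python
import warnings

class BaseOperatorSketch:
    # Toy stand-in for BaseOperator's unknown-argument check
    def __init__(self, task_id, *args, **kwargs):
        if args or kwargs:
            warnings.warn(
                "Invalid arguments were passed to {}. "
                "*args: {} **kwargs: {}".format(type(self).__name__, args, kwargs),
                PendingDeprecationWarning,
            )
        self.task_id = task_id

class RoomNotifierBefore(BaseOperatorSketch):
    # `color` is not declared, so it falls through to
    # **kwargs and triggers the warning.
    pass

class RoomNotifierAfter(BaseOperatorSketch):
    def __init__(self, task_id, color="yellow", **kwargs):
        super().__init__(task_id, **kwargs)  # `color` is consumed here
        self.color = color
```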
Because upper/lower case was not considered
in the file extension check, the S3ToHiveTransfer
operator may mistakenly treat a GZIP file with
the uppercase extension ".GZ" as not a GZIP file
and raise an exception.
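The fix amounts to comparing the extension case-insensitively; a minimal sketch (the helper name is illustrative):

```python
import os

def is_gzip(key):
    # Lowercase the extension so ".GZ" is treated the same as ".gz"
    return os.path.splitext(key)[1].lower() == ".gz"
```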
It's recommended by the Flask community to use a
random SECRET_KEY for security reasons.
However, in Airflow there is a default value for
secret_key and most users will neglect to change
it.
This may cause a security concern.
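One way to generate a random key per installation is Python's `secrets` module (a sketch, not necessarily the exact mechanism used by Airflow):

```python
import secrets

# Generate a fresh 128-bit hex key instead of shipping a shared default
secret_key = secrets.token_hex(16)
```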
Closes#3651 from XD-DENG/patch-2
The value of min_file_process_interval in the
config template is 0.
However, it's supposed to be 180 according to
airflow/jobs.py line 592.
Closes#3659 from XD-DENG/patch-3
Currently the role assumption method works only
if the granting account does not specify an
External ID. The external ID is used to solve
the confused deputy problem. When using the AWS
hook to export data to multiple customers, it's
good security practice to use the external ID.
There is no backwards-compatibility break: the ID
will be `None` in existing cases. Moto doesn't
provide any convenient way to verify in tests
that the value was passed in the credential
response, so existing test cases are kept.
Documentation: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user_externalid.html
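A sketch of how the optional external ID can be threaded into the STS `assume_role` call without changing existing behaviour (the helper name is hypothetical):

```python
def build_assume_role_kwargs(role_arn, session_name, external_id=None):
    # Hypothetical helper: only include ExternalId when one is
    # configured, so existing callers (external_id=None) behave
    # exactly as before.
    kwargs = {"RoleArn": role_arn, "RoleSessionName": session_name}
    if external_id is not None:
        kwargs["ExternalId"] = external_id
    return kwargs

# Usage against boto3 (not executed here):
# sts_client.assume_role(**build_assume_role_kwargs(role_arn, "airflow", ext_id))
```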
Closes#3647 from vvondra/support_sts_external_id
The tree view generates JSON that can be massive
for bigger DAGs, up to 10s of MBs. The JSON is
currently prettified, which both takes up more
CPU time during serialization and slows down
everything else that uses it. Considering the
JSON is only meant to be used programmatically,
this is an easy win.
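The change boils down to serializing without prettification, e.g. with Python's `json` module (toy payload standing in for the tree-view data):

```python
import json

# Toy payload standing in for the tree-view data
data = {"name": "dag", "children": [{"name": "task_%d" % i} for i in range(3)]}

pretty = json.dumps(data, indent=4)                # prettified output
compact = json.dumps(data, separators=(",", ":"))  # no indentation or spaces

# Same content, smaller payload
assert json.loads(compact) == json.loads(pretty)
```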
Closes#3620 from abdul-stripe/smaller-tree-view-json
The previous version created the subdag by
copying over all the tasks and then filtering
them down. It's a lot faster if we only copy
over the tasks we need.
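The idea, sketched with plain dicts standing in for tasks (not the actual DAG code): filter first, then copy only what is kept.

```python
import copy

def sub_dag_slow(tasks, wanted_ids):
    # Old approach: deep-copy every task, then filter the copies down
    return [t for t in copy.deepcopy(tasks) if t["task_id"] in wanted_ids]

def sub_dag_fast(tasks, wanted_ids):
    # New approach: filter first, copy only the tasks we keep
    return [copy.deepcopy(t) for t in tasks if t["task_id"] in wanted_ids]
```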
Closes#3621 from abdul-stripe/faster-subdag