Update PostgresHook's get_conn method to look up the connection id via the
specified conn_name_attr rather than always using self.postgres_conn_id.
Currently, subclassing PostgresHook requires overriding the
postgres_conn_id attribute in order to establish a separate connection.
Add a unit test for this case, checking that the subclassed
PostgresHook's get_conn uses the correct arguments and that the hook
passes the correct connection id to get_connection.
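A minimal sketch of such a test, assuming a hypothetical subclass and connection id (the `mock.patch` target reflects the 1.x module layout and may differ):

```python
from unittest import mock

from airflow.hooks.postgres_hook import PostgresHook


class UnitTestPostgresHook(PostgresHook):
    # Hypothetical subclass: resolve the connection from a different attribute.
    conn_name_attr = 'test_conn_id'


def test_get_conn_uses_subclass_conn_id():
    hook = UnitTestPostgresHook()
    hook.test_conn_id = 'non_default_postgres'
    with mock.patch.object(UnitTestPostgresHook, 'get_connection') as get_connection, \
            mock.patch('airflow.hooks.postgres_hook.psycopg2.connect'):
        hook.get_conn()
    # With the change, get_conn resolves the id via conn_name_attr,
    # not via self.postgres_conn_id.
    get_connection.assert_called_once_with('non_default_postgres')
```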
Using mock's assert_called_with method can result in flaky tests when the
order of calls is not deterministic (e.g. when iterating over a dict in
Python 3.5, which does not preserve insertion order), because it only
inspects the most recent call. That's why it's better to use
assert_called_once_with or assert_has_calls.
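A small illustration of the difference (the dict here just stands in for any source of nondeterministic call order):

```python
from unittest import mock

m = mock.Mock()
for key in {'a': 1, 'b': 2}:  # iteration order is arbitrary on Python 3.5
    m(key)

# Flaky: assert_called_with only checks the *last* call, which depends on
# the dict's iteration order.
# m.assert_called_with('b')

# Stable: verify both calls happened, regardless of order.
m.assert_has_calls([mock.call('a'), mock.call('b')], any_order=True)
```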
Note: The order of arguments has changed for `check_for_prefix`.
`bucket_name` is now optional; it falls back to the connection's `schema` attribute. An example call is sketched after the list below.
- refactor code
- complete docs
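A minimal usage sketch, assuming the new signature is `check_for_prefix(prefix, delimiter, bucket_name=None)` and using illustrative bucket and prefix names:

```python
from airflow.hooks.S3_hook import S3Hook

hook = S3Hook(aws_conn_id='aws_default')

# bucket_name may be omitted entirely if the connection's schema field
# already names the bucket.
hook.check_for_prefix(prefix='data/2019/', delimiter='/', bucket_name='my-bucket')
```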
Make the `run` method in the HttpHook compare the `method` attribute in a case-insensitive way.
This resolves the issue where an HttpHook created with parameter `method='get'` would not be
treated as a GET request in the run method, and the query `params` would be omitted from the HTTP request.
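For illustration, a call that relies on the fix, using a hypothetical connection id and endpoint:

```python
from airflow.hooks.http_hook import HttpHook

# Lower-case method: after the fix this is treated as a GET, so `data`
# is sent as query parameters instead of being dropped.
hook = HttpHook(method='get', http_conn_id='http_default')
response = hook.run(endpoint='search', data={'q': 'airflow'})
```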
* HA for Metastore
* [AIRFLOW-3888] HA for metastore connection
Creating a connection to a metastore with two hosts for high availability (e.g. connection 1, connection 2) is not possible because the entire entered value is used as-is. For our needs, it is necessary to go through the hosts one by one and connect to the first one that works.
This change allows checking the hosts and then connecting to a working metastore.
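A rough sketch of the idea, assuming the HA pair is stored as two connection rows sharing one conn_id (the class and method names here are illustrative, not the actual patch):

```python
import socket

from airflow.hooks.base_hook import BaseHook


class MetastoreHAHook(BaseHook):
    """Illustrative hook: connect to the first reachable metastore host."""

    def __init__(self, metastore_conn_id='metastore_default'):
        self.metastore_conn_id = metastore_conn_id

    def find_valid_server(self):
        # get_connections() returns every connection stored under this
        # conn_id, so an HA pair can be two rows with the same id.
        for conn in self.get_connections(self.metastore_conn_id):
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            try:
                if sock.connect_ex((conn.host, conn.port)) == 0:
                    return conn
            finally:
                sock.close()
        return None
```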
* add function to base_hook
* update webhdfs_hook
* back to original version
* back to original version
* Update hive_hooks.py
Thank you. I made a few changes because I found several errors during testing.
I have a question: after I merge into my pull request, will it still be possible to land it in the Airflow main branch?
* [AIRFLOW-3888] HA for metastore connection
flake8 code repair
* [AIRFLOW-3888] HA for metastore connection
Flake8 repair
* [AIRFLOW-3888] HA for metastore connection
Code behavior improvements
* [AIRFLOW-3888] HA for metastore connection
Add test
* [AIRFLOW-3888] HA for metastore connection
Test improvement
* [AIRFLOW-3888] HA for metastore connection
Add test
[AIRFLOW-3888] HA for metastore connection
Test improvement
* [AIRFLOW-3888] HA for metastore connection
Add test
[AIRFLOW-3888] HA for metastore connection
Test improvement
[AIRFLOW-3888] HA for metastore connection
Test improvement
* [AIRFLOW-3888] HA for metastore connection
Fix a typo in a variable name
* [AIRFLOW-3888] HA for metastore connection
Mock return_value edit
* [AIRFLOW-3888] HA for metastore connection
Flake8 repair
* [AIRFLOW-3888] HA for metastore connection
Test repair
* [AIRFLOW-3888] HA for metastore connection
Flake8 repair
[AIRFLOW-3888] HA for metastore connection
Test repair
`airflow initdb` creates both `beeline_default` and `hive_cli_default`
as default connections. But in the Airflow source folder only
`hive_cli_default` is used as the conn_id in Hive-related
hooks/operators, while `beeline_default` is only used in the Airflow
test folder to test the Hive hook/operator. That is why I think we
should merge them into one default connection.
Also, the [Hive doc](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Cli#LanguageManualCli-DeprecationinfavorofBeelineCLI)
says that the Hive CLI will be deprecated in favor of Beeline. Given
that, I think we should remove `beeline_default` and give
`hive_cli_default` the same configuration as `beeline_default`. A sketch
of the resulting connection follows.
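Illustrative only; this assumes the merged default simply carries over `beeline_default`'s stock `initdb` settings under the `hive_cli_default` conn_id:

```python
from airflow.models import Connection

hive_cli_default = Connection(
    conn_id='hive_cli_default',
    conn_type='hive_cli',
    host='localhost',
    port=10000,
    schema='default',
    extra='{"use_beeline": true, "auth": ""}',
)
```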
* [AIRFLOW-3767] Correct bulk insert function
Fix the Oracle hook bulk_insert bug that occurs when the
target_fields param is None or rows is an empty iterable.
* Rework the change to avoid overwriting variables, as Fokko suggested.
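A small sketch of the guarded statement building (a standalone helper for illustration, not the hook's actual code):

```python
def build_oracle_insert(table, rows, target_fields=None):
    # Guard: an empty rows iterable should fail loudly instead of
    # producing a malformed statement.
    if not rows:
        raise ValueError("rows must be a non-empty iterable")
    # With no target_fields, skip the column list and size the bind
    # variables from the first row instead.
    width = len(target_fields) if target_fields else len(rows[0])
    columns = '({}) '.format(', '.join(target_fields)) if target_fields else ''
    values = ', '.join(':{}'.format(i + 1) for i in range(width))
    return 'insert into {table} {columns}values ({values})'.format(
        table=table, columns=columns, values=values)


build_oracle_insert('tbl', [(1, 2)])              # insert into tbl values (:1, :2)
build_oracle_insert('tbl', [(1, 2)], ['a', 'b'])  # insert into tbl (a, b) values (:1, :2)
```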
This re-works the SageMaker functionality in Airflow to be more complete, and more useful for the kinds of operations that SageMaker supports.
We removed some files and operators here, but these were only added after the last release so we don't need to worry about any sort of back-compat.
The MySQL hook does not support a "unix_socket" extra, which would allow
specifying a Unix socket location other than the default one.
This is a blocker for tools like cloud-sql-proxy, which
create sockets in arbitrary locations:
https://mysqlclient.readthedocs.io/user_guide.html
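For illustration, a connection that would rely on the new extra (the socket path and conn_id are hypothetical):

```python
from airflow.models import Connection

conn = Connection(
    conn_id='mysql_via_proxy',
    conn_type='mysql',
    login='user',
    schema='mydb',
    # Point MySQLdb at the proxy-created socket instead of the default
    # /var/run/mysqld/mysqld.sock.
    extra='{"unix_socket": "/cloudsql/my-project:us-central1:my-instance"}',
)
```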
S3Hook will silently fail if given a conn_id that does not exist. The
calls to check_for_key made by an S3KeySensor will never fail if the
credentials object is not configured correctly. This narrows the
exception handling to ClientError, the exception type expected when
performing a HEAD operation on an object that doesn't exist, so that
other exceptions are properly raised.
Closes #3616 from mascah/AIRFLOW-2771-S3hook-except-type
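A sketch of the narrowed handler inside S3Hook.check_for_key (shape only, not the verbatim patch):

```python
from botocore.exceptions import ClientError


def check_for_key(self, key, bucket_name=None):
    try:
        self.get_conn().head_object(Bucket=bucket_name, Key=key)
        return True
    except ClientError as e:
        # A missing object surfaces as a ClientError from the HEAD call;
        # anything else (bad credentials, unknown conn_id) now propagates
        # instead of being swallowed.
        self.log.info(e.response['Error']['Message'])
        return False
```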