Enable Markdownlint rule MD046/code-block-style (#12429)

Kaxil Naik 2020-11-18 01:57:02 +00:00 committed by GitHub
Parent: 8a8f54ffec
Commit: 9b3a3332c4
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
4 changed files with 43 additions and 26 deletions

View file

@@ -60,6 +60,3 @@ MD040: false
# MD041/first-line-heading/first-line-h1
MD041: false
# MD046/code-block-style
MD046: false
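For context, MD046 enforces a single, consistent code-block style across a document; removing `MD046: false` enables the rule, which is why the remaining hunks convert indented (four-space) code blocks to the fenced style already used elsewhere. A rough Python illustration of the kind of check involved (a simplification for intuition, not markdownlint's actual implementation):

```python
# Simplified sketch of an MD046-style check when fenced blocks are expected:
# flag indented code lines that appear outside fenced regions. Edge cases the
# real rule handles (list continuations, tabs, ~~~ info strings) are ignored.
def find_indented_code_lines(markdown_text):
    violations = []
    in_fence = False
    for lineno, line in enumerate(markdown_text.splitlines(), start=1):
        stripped = line.strip()
        if stripped.startswith("```") or stripped.startswith("~~~"):
            in_fence = not in_fence  # toggle on opening/closing fences
            continue
        if not in_fence and stripped and line.startswith("    "):
            violations.append(lineno)
    return violations


if __name__ == "__main__":
    sample = "text\n\n    def f():\n        pass\n"
    print(find_indented_code_lines(sample))  # -> [3, 4]
```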

View file

@@ -2234,14 +2234,18 @@ custom auth backends might need a small change: `is_active`,
`is_authenticated`, and `is_anonymous` should now be properties. What this means is if
previously you had this in your user class
    def is_active(self):
        return self.active
```python
def is_active(self):
    return self.active
```
then you need to change it like this
    @property
    def is_active(self):
        return self.active
```python
@property
def is_active(self):
    return self.active
```
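For illustration, a minimal sketch of a custom user class after this change (the class and attribute names are hypothetical; the pattern matches what Flask-Login expects once these are properties):

```python
class ApplicationUser:
    """Hypothetical user object for a custom auth backend."""

    def __init__(self, username, active=True):
        self.username = username
        self.active = active

    @property
    def is_active(self):
        return self.active

    @property
    def is_authenticated(self):
        # A real backend would check session or token state here.
        return True

    @property
    def is_anonymous(self):
        return False
```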
### Support autodetected schemas to GoogleCloudStorageToBigQueryOperator
@@ -2251,21 +2255,27 @@ If BigQuery tables are created outside of airflow and the schema is not defined
define a schema_fields:
    gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
        ...
        schema_fields={...})
```python
gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
    ...
    schema_fields={...})
```
or define a schema_object:
    gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
        ...
        schema_object='path/to/schema/object)
```python
gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
    ...
    schema_object='path/to/schema/object')
```
or enable schema autodetection:
    gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
        ...
        autodetect=True)
```python
gcs_to_bq.GoogleCloudStorageToBigQueryOperator(
    ...
    autodetect=True)
```
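Put together, a hedged sketch of the autodetect variant as a full task definition (bucket, object, and table names are placeholders; assumes the 1.10-era contrib import path):

```python
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

load_events = GoogleCloudStorageToBigQueryOperator(
    task_id="gcs_to_bq_autodetect",
    bucket="example-bucket",  # placeholder bucket name
    source_objects=["events/*.json"],  # placeholder object pattern
    destination_project_dataset_table="example-project.dataset.events",  # placeholder
    source_format="NEWLINE_DELIMITED_JSON",
    autodetect=True,  # let BigQuery infer the schema from the data
)
```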
## Airflow 1.10.1

View file

@@ -817,7 +817,9 @@ The table below shows the differences:
This endpoint ``/api/v1/dags/{dag_id}/dagRuns`` also allows you to filter dag_runs with parameters such as ``start_date``, ``end_date``, ``execution_date``, etc., in the query string.
Therefore the operation previously performed by this endpoint
    /api/experimental/dags/<string:dag_id>/dag_runs/<string:execution_date>
```
/api/experimental/dags/<string:dag_id>/dag_runs/<string:execution_date>
```
can now be handled with filter parameters in the query string.
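As an example, a hedged sketch of querying the stable endpoint with filter parameters (host, credentials, DAG id, and dates are placeholders; assumes a basic-auth API backend is enabled, and field names follow the stable API's documented response shape):

```python
import requests

# Placeholder host, DAG id, credentials, and date range.
resp = requests.get(
    "http://localhost:8080/api/v1/dags/example_dag/dagRuns",
    params={
        "execution_date_gte": "2020-01-01T00:00:00+00:00",
        "execution_date_lte": "2020-01-02T00:00:00+00:00",
    },
    auth=("admin", "admin"),
)
resp.raise_for_status()
for run in resp.json()["dag_runs"]:
    print(run["dag_run_id"], run["state"])
```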
Getting information about latest runs can be accomplished with the help of

View file

@@ -314,23 +314,31 @@ to port-forward the Airflow UI to http://localhost:8080/ to confirm Airflow is w
1. Start a project using [astro-cli](https://github.com/astronomer/astro-cli), which will generate a Dockerfile, and load your DAGs in. You can test locally before pushing to kind with `astro airflow start`. (A sample smoke-test DAG is sketched after this list.)
    mkdir my-airflow-project && cd my-airflow-project
    astro dev init
```shell script
mkdir my-airflow-project && cd my-airflow-project
astro dev init
```
2. Then build the image:
    docker build -t my-dags:0.0.1 .
```shell script
docker build -t my-dags:0.0.1 .
```
3. Load the image into kind:
    kind load docker-image my-dags:0.0.1
```shell script
kind load docker-image my-dags:0.0.1
```
4. Upgrade Helm deployment:
    helm upgrade airflow -n airflow \
        --set images.airflow.repository=my-dags \
        --set images.airflow.tag=0.0.1 \
        astronomer/airflow
```shell script
helm upgrade airflow -n airflow \
    --set images.airflow.repository=my-dags \
    --set images.airflow.tag=0.0.1 \
    astronomer/airflow
```
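As referenced in step 1, a minimal smoke-test DAG you could drop into the project's `dags/` folder to confirm the built image picks up your code (the file name and DAG id are hypothetical; uses 1.10-era import paths):

```python
# dags/hello_kind.py -- hypothetical smoke-test DAG
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id="hello_kind",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,  # trigger manually from the UI
) as dag:
    BashOperator(task_id="hello", bash_command="echo hello from kind")
```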
## Contributing