CHORE - Remove Trailing Spaces
Parent: ab1c90b4e1
Commit: 39d64561c7
@@ -100,7 +100,7 @@ Airflow requirement and should be used where possible).
 
 ### Testing locally
 
-#### TL;DR
+#### TL;DR
 Tests can then be run with (see also the [Running unit tests](#running-unit-tests) section below):
 
     ./run_unit_tests.sh
@@ -13,8 +13,8 @@ The second set is based on the [`gcloud` Python library](https://googlecloudplat
 To use this set, install `airflow[gcloud]`.
 
 ##### Which should I use?
-New users should probably build on the `gcloud` set because the `gcloud` library is the recommended way for Python apps to interact with the Google Cloud Platform. The interface is easier to extend than the API approach.
+New users should probably build on the `gcloud` set because the `gcloud` library is the recommended way for Python apps to interact with the Google Cloud Platform. The interface is easier to extend than the API approach.
 
-More pragmatically, if your existing code (hooks/operators/DAGs) depends on EITHER `gcloud >= 0.10` OR `google-api-python-client >= 1.5` (which both require `oauth2client >= 2.0`), then you won't be able to use `airflow[gcp_api]` due to compatibility issues.
+More pragmatically, if your existing code (hooks/operators/DAGs) depends on EITHER `gcloud >= 0.10` OR `google-api-python-client >= 1.5` (which both require `oauth2client >= 2.0`), then you won't be able to use `airflow[gcp_api]` due to compatibility issues.
 
 However, if the hooks/operators in the `gcp_api` set meet your needs and you do not have other dependencies, then by all means use them!
@@ -11,7 +11,7 @@ logging.getLogger("google_cloud_storage").setLevel(logging.INFO)
 
 class GoogleCloudStorageHook(GoogleCloudBaseHook):
     """
-    Interact with Google Cloud Storage. Connections must be defined with an
+    Interact with Google Cloud Storage. Connections must be defined with an
     extras JSON field containing:
 
     {
@@ -20,8 +20,8 @@ class GoogleCloudStorageHook(GoogleCloudBaseHook):
        "key_path": "<p12 key path>"
    }
 
-    If you have used ``gcloud auth`` to authenticate on the machine that's
-    running Airflow, you can exclude the service_account and key_path
+    If you have used ``gcloud auth`` to authenticate on the machine that's
+    running Airflow, you can exclude the service_account and key_path
     parameters.
     """
     conn_name_attr = 'google_cloud_storage_conn_id'
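Illustration (not part of this change): the extras JSON sketched in the docstring above could be filled in along these lines; only the two fields named in the docstring are shown, and all values are placeholders.

    # Hypothetical extras for the connection referenced by
    # google_cloud_storage_conn_id; values are placeholders.
    gcs_connection_extras = {
        "service_account": "my-service-account@example.iam.gserviceaccount.com",
        "key_path": "/path/to/key.p12",
    }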
@@ -14,14 +14,14 @@ class BigQueryToBigQueryOperator(BaseOperator):
 
     @apply_defaults
     def __init__(
-        self,
+        self,
         source_project_dataset_tables,
         destination_project_dataset_table,
         write_disposition='WRITE_EMPTY',
         create_disposition='CREATE_IF_NEEDED',
         bigquery_conn_id='bigquery_default',
         delegate_to=None,
-        *args,
+        *args,
         **kwargs):
         """
         Copies data from one BigQuery table to another. See here:
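Illustration (not part of this change): a minimal sketch of how the operator above might be instantiated, using the parameters visible in the signature; the import path, table names, and the dag object are assumptions.

    # Assumed import path for the contrib operator; adjust to your version.
    from airflow.contrib.operators.bigquery_to_bigquery import BigQueryToBigQueryOperator

    copy_table = BigQueryToBigQueryOperator(
        task_id='bq_copy_table',
        source_project_dataset_tables='my-project.my_dataset.source_table',
        destination_project_dataset_table='my-project.my_dataset.dest_table',
        write_disposition='WRITE_EMPTY',        # default shown in the signature above
        create_disposition='CREATE_IF_NEEDED',  # default shown in the signature above
        bigquery_conn_id='bigquery_default',
        dag=dag)  # dag is assumed to be an existing DAG object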
@@ -14,19 +14,19 @@ class BigQueryToCloudStorageOperator(BaseOperator):
 
     @apply_defaults
     def __init__(
-        self,
+        self,
         source_project_dataset_table,
-        destination_cloud_storage_uris,
-        compression='NONE',
-        export_format='CSV',
-        field_delimiter=',',
-        print_header=True,
+        destination_cloud_storage_uris,
+        compression='NONE',
+        export_format='CSV',
+        field_delimiter=',',
+        print_header=True,
         bigquery_conn_id='bigquery_default',
         delegate_to=None,
-        *args,
+        *args,
         **kwargs):
         """
-        Create a new BigQueryToCloudStorage to move data from BigQuery to
+        Create a new BigQueryToCloudStorage to move data from BigQuery to
         Google Cloud Storage. See here:
 
         https://cloud.google.com/bigquery/docs/reference/v2/jobs
@@ -36,9 +36,9 @@ class BigQueryToCloudStorageOperator(BaseOperator):
         :param source_project_dataset_table: The dotted (<project>.)<dataset>.<table> BigQuery table to use as the
             source data. If <project> is not included, project will be the project defined in the connection json.
         :type source_project_dataset_table: string
-        :param destination_cloud_storage_uris: The destination Google Cloud
-            Storage URI (e.g. gs://some-bucket/some-file.txt). Follows
-            convention defined here:
+        :param destination_cloud_storage_uris: The destination Google Cloud
+            Storage URI (e.g. gs://some-bucket/some-file.txt). Follows
+            convention defined here:
             https://cloud.google.com/bigquery/exporting-data-from-bigquery#exportingmultiple
         :type destination_cloud_storage_uris: list
         :param compression: Type of compression to use.
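Illustration (not part of this change): a sketch of the export operator defined above, with parameter names taken from the signature and docstring; the import path, URIs, and the dag object are assumptions.

    # Assumed import path for the contrib operator; adjust to your version.
    from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator

    export_table = BigQueryToCloudStorageOperator(
        task_id='bq_export_table',
        source_project_dataset_table='my-project.my_dataset.source_table',
        destination_cloud_storage_uris=['gs://some-bucket/some-file-*.csv'],  # a list, per the docstring
        export_format='CSV',
        field_delimiter=',',
        print_header=True,
        compression='NONE',
        bigquery_conn_id='bigquery_default',
        dag=dag)  # dag is assumed to be an existing DAG object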
@@ -2,7 +2,7 @@
 
 {% block body %}
 <div>
-  <h3 style="float: left">
+  <h3 style="float: left">
     {% block page_header %}Hive Metastore Browser{% endblock%}
   </h3>
   <div id="object" class="select2-drop-mask" style="margin-top: 25px; width: 400px;float: right"></div>
@@ -1,7 +1,7 @@
 {% extends 'metastore_browser/base.html' %}
 
 {% block plugin_content %}
-<h3 style="float: left">
+<h3 style="float: left">
   <span style="color:#AAA;">Database: </span>
   <span>{{ db }}</span>
 </h3>
@@ -1,7 +1,7 @@
 {% extends 'metastore_browser/base.html' %}
 
 {% block plugin_content %}
-<h4>
+<h4>
   <span>Hive Databases</span>
 </h4>
 {{ table|safe }}
@@ -2,7 +2,7 @@
 
 {% block plugin_content %}
 <div>
-  <h4>
+  <h4>
     <span style="color:#AAA;">Table:</span>
     <span>{{ table.dbName }}.{{ table.tableName }}</span>
   </h4>
@@ -114,21 +114,21 @@ $('#tabs a').click(function (e) {
 });
 
 $.get("{{ url_for(".data" , table=table_name) }}", function( data ) {
-  $("#data").html(data);
+  $("#data").html(data);
   $('#data table.dataframe').dataTable({
     "iDisplayLength": 30,
   });
 });
 
 $.get("{{ url_for(".partitions" , table=table_name) }}", function( data ) {
-  $("#partitions").html(data);
+  $("#partitions").html(data);
   $('#partitions table.dataframe').dataTable({
     "iDisplayLength": 30,
   });
 });
 
 $.get("{{ url_for(".ddl" , table=table_name) }}", function( data ) {
-  $("#ddl_content").html(data);
+  $("#ddl_content").html(data);
 });
 </script>
 {% endblock %}
@@ -58,5 +58,5 @@ class HDFSHook(BaseHook):
                 hdfs_namenode_principal=hdfs_namenode_principal)
         else:
             raise HDFSHookException("conn_id doesn't exist in the repository")
-
+
         return client
@@ -19,11 +19,11 @@ class OracleHook(DbApiHook):
         """
         Returns a oracle connection object
         Optional parameters for using a custom DSN connection (instead of using a server alias from tnsnames.ora)
-        The dsn (data source name) is the TNS entry (from the Oracle names server or tnsnames.ora file)
+        The dsn (data source name) is the TNS entry (from the Oracle names server or tnsnames.ora file)
         or is a string like the one returned from makedsn().
         :param dsn: the host address for the Oracle server
         :param service_name: the db_unique_name of the database that you are connecting to (CONNECT_DATA part of TNS)
-        You can set these parameters in the extra fields of your connection
+        You can set these parameters in the extra fields of your connection
         as in ``{ "dsn":"some.host.address" , "service_name":"some.service.name" }``
         """
         conn = self.get_connection(self.oracle_conn_id)
(The diff for one file is not shown because it is too large.)
@@ -493,7 +493,7 @@ function defaultZoomSetup(graph, svg) {
 
 // By default allow pan and zoom
 function defaultZoom(graph, svg) {
-
+
   this.zoom_obj = d3.behavior.zoom().on('zoom', function() {
     svg.attr('transform', 'translate(' + d3.event.translate + ')scale(' + d3.event.scale + ')');
   });
@@ -2654,7 +2654,7 @@ function acyclic(g) {
   var onStack = {},
       visited = {},
       reverseCount = 0;
-
+
   function dfs(u) {
     if (u in visited) return;
     visited[u] = onStack[u] = true;
@@ -3110,7 +3110,7 @@ function initCutValues(graph, spanningTree) {
  */
 function computeLowLim(tree) {
   var postOrderNum = 0;
-
+
   function dfs(n) {
     var children = tree.successors(n);
     var low = postOrderNum;
@@ -3837,7 +3837,7 @@ Digraph.prototype.isDirected = function() {
 /*
  * Returns all successors of the node with the id `u`. That is, all nodes
  * that have the node `u` as their source are returned.
- *
+ *
  * If no node `u` exists in the graph this function throws an Error.
  *
  * @param {String} u a node id
@@ -3851,7 +3851,7 @@ Digraph.prototype.successors = function(u) {
 /*
  * Returns all predecessors of the node with the id `u`. That is, all nodes
  * that have the node `u` as their target are returned.
- *
+ *
  * If no node `u` exists in the graph this function throws an Error.
  *
  * @param {String} u a node id
@@ -196,7 +196,7 @@ ul.DTTT_dropdown.dropdown-menu li:hover a {
 }
 
 div.DTTT_collection_background {
-    z-index: 2002;
+    z-index: 2002;
 }
 
 /* TableTools information display */
@@ -216,7 +216,7 @@ div.DTTT_print_info {
     background-color: white;
     border: 1px solid rgba(0, 0, 0, 0.2);
     border-radius: 6px;
-
+
     -webkit-box-shadow: 0 3px 7px rgba(0, 0, 0, 0.5);
     box-shadow: 0 3px 7px rgba(0, 0, 0, 0.5);
 }
@@ -267,7 +267,7 @@ table.DTFC_Cloned tr.even {
     background-color: white;
     margin-bottom: 0;
 }
-
+
 div.DTFC_RightHeadWrapper table ,
 div.DTFC_LeftHeadWrapper table {
     border-bottom: none !important;
@@ -276,7 +276,7 @@ div.DTFC_LeftHeadWrapper table {
     border-bottom-left-radius: 0 !important;
     border-bottom-right-radius: 0 !important;
 }
-
+
 div.DTFC_RightHeadWrapper table thead tr:last-child th:first-child,
 div.DTFC_RightHeadWrapper table thead tr:last-child td:first-child,
 div.DTFC_LeftHeadWrapper table thead tr:last-child th:first-child,
@@ -284,20 +284,20 @@ div.DTFC_LeftHeadWrapper table thead tr:last-child td:first-child {
     border-bottom-left-radius: 0 !important;
     border-bottom-right-radius: 0 !important;
 }
-
+
 div.DTFC_RightBodyWrapper table,
 div.DTFC_LeftBodyWrapper table {
     border-top: none;
     margin: 0 !important;
 }
-
+
 div.DTFC_RightBodyWrapper tbody tr:first-child th,
 div.DTFC_RightBodyWrapper tbody tr:first-child td,
 div.DTFC_LeftBodyWrapper tbody tr:first-child th,
 div.DTFC_LeftBodyWrapper tbody tr:first-child td {
     border-top: none;
 }
-
+
 div.DTFC_RightFootWrapper table,
 div.DTFC_LeftFootWrapper table {
     border-top: none;
@@ -153,15 +153,15 @@ div.form-inline{
 body div.panel {
   padding: 0px;
 }
-.blur {
-  filter:url(#blur-effect-1);
+.blur {
+  filter:url(#blur-effect-1);
 }
 div.legend_item {
   -moz-border-radius: 5px/5px;
   -webkit-border-radius: 5px 5px;
   border-radius: 5px/5px;
   float:right;
-  margin: 0px 3px 0px 0px;
+  margin: 0px 3px 0px 0px;
   padding:0px 3px;
   border:solid 2px grey;
   font-size: 11px;
@@ -4,7 +4,7 @@ html, body {
   height: 100%;
   padding: 0;
 }
-body {
+body {
   font-family: Ubuntu, Tahoma, Helvetica, sans-serif;
   font-size: 12px;
   line-height: 1.4em;
@@ -223,12 +223,12 @@ button#exclude-data:hover {
   width: 10px;
   height: 10px;
 }
-
+
 ::-webkit-scrollbar-track {
   background: #ddd;
   border-radius: 12px;
 }
-
+
 ::-webkit-scrollbar-thumb {
   background: #b5b5b5;
   border-radius: 12px;
@@ -17,7 +17,7 @@ var m = [60, 0, 10, 0],
     foreground,
     background,
     highlighted,
-    dimensions,
+    dimensions,
     legend,
     render_speed = 50,
     brush_count = 0,
@@ -232,7 +232,7 @@ function create_legend(colors,brush) {
   var legend = legend_data
     .enter().append("div")
       .attr("title", "Hide group")
-      .on("click", function(d) {
+      .on("click", function(d) {
         // toggle food group
         if (_.contains(excluded_groups, d)) {
           d3.select(this).attr("title", "Hide group")
@@ -253,16 +253,16 @@ function create_legend(colors,brush) {
   legend
     .append("span")
     .attr("class", "tally")
-    .text(function(d,i) { return 0});
+    .text(function(d,i) { return 0});
 
   legend
     .append("span")
-    .text(function(d,i) { return " " + d});
+    .text(function(d,i) { return " " + d});
 
   return legend;
 }
-
-// render polylines i to i+render_speed
+
+// render polylines i to i+render_speed
 function render_range(selection, i, max, opacity) {
   selection.slice(i,max).forEach(function(d) {
     path(d, foreground, color(d,opacity));
@@ -295,7 +295,7 @@ function data_table(sample) {
       .text(function(d) { return d.name; })
 }
 
-// Adjusts rendering speed
+// Adjusts rendering speed
 function optimize(timer) {
   var delta = (new Date()).getTime() - timer;
   render_speed = Math.max(Math.ceil(render_speed * 30 / delta), 8);
@@ -404,7 +404,7 @@ function color(d, a){
     c = d3.rgb(color_scaler(d[color_column]));
     return ["rgba(",c.r,",",c.g,",",c.b,",",a,")"].join("");
   }
-
+
 }
 var ci = 0;
 function color_cat(cat,a) {
@@ -442,7 +442,7 @@ function brush() {
       .selectAll('text')
         .style('font-weight', 'bold')
         .style('font-size', '13px')
-        .style('display', function() {
+        .style('display', function() {
           var value = d3.select(this).data();
           return extent[0] <= value && value <= extent[1] ? null : "none"
         });
@@ -458,7 +458,7 @@ function brush() {
         .style('display', null);
     });
   ;
-
+
   // bold dimensions with label
   d3.selectAll('.label')
     .style("font-weight", function(dimension) {
@@ -513,7 +513,7 @@ function brush() {
   });
 
   legend.selectAll(".tally")
-    .text(function(d,i) { return tallies[d].length });
+    .text(function(d,i) { return tallies[d].length });
 
   // Render selected lines
   paths(selected, foreground, brush_count, true);
@@ -662,7 +662,7 @@ window.onresize = function() {
       .attr("height", h + m[0] + m[2])
     .select("g")
       .attr("transform", "translate(" + m[3] + "," + m[0] + ")");
-
+
   xscale = d3.scale.ordinal().rangePoints([0, w], 1).domain(dimensions);
   dimensions.forEach(function(d) {
     yscale[d].range([h, 0]);
@@ -710,7 +710,7 @@ function remove_axis(d,g) {
   dimensions = _.difference(dimensions, [d]);
   xscale.domain(dimensions);
   g.attr("transform", function(p) { return "translate(" + position(p) + ")"; });
-  g.filter(function(p) { return p == d; }).remove();
+  g.filter(function(p) { return p == d; }).remove();
   update_ticks();
 }
 
@@ -18,7 +18,7 @@
     editor.getSession().on('change', function(){
       textarea.val(editor.getSession().getValue());
     });
-    editor.focus();
+    editor.focus();
     $(":checkbox").removeClass("form-control");
   });
 </script>
@@ -18,7 +18,7 @@
     editor.getSession().on('change', function(){
       textarea.val(editor.getSession().getValue());
     });
-    editor.focus();
+    editor.focus();
 
     // Getting column_descriptions in tooltips
     $(":checkbox").removeClass("form-control");
@@ -1,10 +1,10 @@
 <div style="font-family: verdana;">
   <h1>Airflow 404 = lots of circles</h1>
   <div style="color: white">{{ hostname }}</div>
-  <div
+  <div
     id="div_svg"
-    class="content"
-    class="centered text-center"
+    class="content"
+    class="centered text-center"
     style="border: 1px solid #CCC; padding:0px;margin:0;">
     <svg></svg>
 
@@ -19,7 +19,7 @@
   var i = 0;
   var flip = 0;
   var colors = [
-    "#FF5A5F", "#007A87", "#7B0051", "#00D1C1", "#8CE071", "#FFB400",
+    "#FF5A5F", "#007A87", "#7B0051", "#00D1C1", "#8CE071", "#FFB400",
     "#FFAA91", "#B4A76C", "#9CA299", "#565A5C"
   ];
 
@@ -79,7 +79,7 @@
     }
     if(Math.random() > 0.8){
       col = function() {return choose(colors)};
     }
   }
-
+
 
@@ -117,7 +117,7 @@
       .attr("r", function(d, i) {return 0});
     }
   }
-
+
   }
 
   setInterval(toggle, duration*3);
@@ -19,7 +19,7 @@
   {% if code %}
     <pre>{{ code }}</pre>
   {% endif %}
-
+
   {% if code_html %}
     {{ code_html|safe }}
   {% endif %}
@@ -6,10 +6,10 @@
 {% endif %}
 {% if admin_view.alert_fernet_key() %}
   <div class="alert alert-danger"><b>Warning:</b>
-    Airflow is currently storing passwords in <b>plain text</b>.
+    Airflow is currently storing passwords in <b>plain text</b>.
     To turn on password encryption for connections, you need to add a
     "fernet_key" option to the "core" section of your airflow.cfg file.
-    To generate a key, you can call the function
+    To generate a key, you can call the function
     <code>airflow.configuration.generate_fernet_key()</code>
   </div>
 {% endif %}
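Illustration (not part of this change): the warning above names the helper for generating a key; a minimal sketch of calling it and printing the result for pasting into airflow.cfg.

    # generate_fernet_key is the function referenced in the template above.
    # Run once, then copy the output into the [core] section as fernet_key.
    from airflow.configuration import generate_fernet_key
    print(generate_fernet_key())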
@@ -62,7 +62,7 @@
   <li>
     <a href="{{ url_for("airflow.dag_details", dag_id=dag.dag_id) }}">
       <span class="glyphicon glyphicon-list" aria-hidden="true"></span>
-      Details
+      Details
     </a>
   </li>
   <li>
@@ -145,7 +145,7 @@ body {
   $( document ).ready(function() {
     Highcharts.setOptions({
       colors: [
-        "#FF5A5F", "#007A87", "#7B0051", "#00D1C1", "#8CE071", "#FFB400",
+        "#FF5A5F", "#007A87", "#7B0051", "#00D1C1", "#8CE071", "#FFB400",
         "#FFAA91", "#B4A76C", "#9CA299", "#565A5C"
       ],
     });
@@ -166,9 +166,9 @@ body {
           "columns": payload.data.columns,
           "scrollX": true,
           "iDisplayLength": 100,
-        });
+        });
       {% endif %}
-    }
+    }
     else {
       error(payload.error);
     }
@@ -234,14 +234,14 @@
 
       g.append('circle')
         .attr('stroke-width', function(d) {
-          if (d.count > 0)
+          if (d.count > 0)
             return stroke_width;
           else {
             return 1;
           }
         })
        .attr('stroke', function(d) {
-          if (d.count > 0)
+          if (d.count > 0)
             return d.color;
           else {
             return 'grey';
@@ -255,7 +255,7 @@
           return"cursor:pointer;"
         })
         .on('click', function(d, i) {
-          if (d.count > 0)
+          if (d.count > 0)
             window.location = "/admin/taskinstance/?flt1_dag_id_equals=" + d.dag_id + "&flt2_state_equals=" + d.state;
         })
         .on('mouseover', function(d, i) {
@@ -46,7 +46,7 @@
     });
     editor.getSession().setMode("ace/mode/sql");
     editor.getSession().on('change', sync);
-    editor.focus();
+    editor.focus();
     $('table.dataframe').dataTable({
       "scrollX": true,
       "iDisplayLength": 100,
@@ -48,10 +48,10 @@
     }
     $("input#execution_date").on("change.daterangepicker", function(){
       date_change();
-    });
+    });
     $("input#execution_date").on("apply.daterangepicker", function(){
       date_change();
-    });
+    });
   });
 </script>
 {% endblock %}
@@ -47,7 +47,7 @@
   </div>
   <hr/>
   <div id="svg_container">
-    <img id='loading' width="50"
+    <img id='loading' width="50"
       src="{{ url_for('static', filename='loading.gif') }}">
     <svg class='tree' width="100%">
       <filter id="blur-effect-1">
@@ -148,7 +148,7 @@ attributes and methods.
 
 Macros
 ''''''
-Macros are a way to expose objects to your templates and live under the
+Macros are a way to expose objects to your templates and live under the
 ``macros`` namespace in your templates.
 
 A few commonly used libraries and methods are made available.
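Illustration (not part of this change): a short sketch of referencing the ``macros`` namespace from a templated field; ``macros.ds_add`` is used as an example macro, and the operator import path and the ``dag`` object are assumptions.

    # Assumed import path; adjust to your Airflow version.
    from airflow.operators import BashOperator

    templated = BashOperator(
        task_id='print_date_plus_week',
        # The template resolves macros.ds_add(ds, 7) when the task runs.
        bash_command='echo "{{ macros.ds_add(ds, 7) }}"',
        dag=dag)  # dag is assumed to be an existing DAG object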
@@ -177,10 +177,10 @@ It is also possible to pull XCom directly in a template, here's an example
 of what this may look like:
 
 .. code:: sql
-
+
     SELECT * FROM {{ task_instance.xcom_pull(task_ids='foo', key='table_name') }}
 
-Note that XComs are similar to `Variables`_, but are specifically designed
+Note that XComs are similar to `Variables`_, but are specifically designed
 for inter-task communication rather than global settings.
 
 
@@ -441,7 +441,7 @@ configuration files, it allows you to expose the configuration that led
 to the related tasks in Airflow.
 
 .. code:: python
-
+
     t = BashOperator("foo", dag=dag)
     t.doc_md = """\
     #Title"
@@ -42,7 +42,7 @@ Here are some of the common causes:
 
 - Is the ``concurrency`` parameter of your DAG reached? ``concurency`` defines
   how many ``running`` task instances a DAG is allowed to have, beyond which
-  point things get queued.
+  point things get queued.
 
 - Is the ``max_active_runs`` parameter of your DAG reached? ``max_active_runs`` defines
   how many ``running`` concurrent instances of a DAG there are allowed to be.
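Illustration (not part of this change): both parameters described above are set on the DAG itself; a minimal sketch with placeholder values.

    from datetime import datetime
    from airflow import DAG

    dag = DAG(
        dag_id='example_dag',
        start_date=datetime(2016, 1, 1),
        concurrency=16,      # max running task instances for this DAG
        max_active_runs=1,   # max concurrently running DAG runs
        schedule_interval='@daily')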
@@ -88,7 +88,7 @@ To Keep in Mind
   tasks in your DAG.
 * Subsequent ``DAG Runs`` are created by the scheduler process, based on
   your DAG's ``schedule_interval``, sequentially.
-* When clearing a set of tasks' state in hope of getting them to re-run,
+* When clearing a set of tasks' state in hope of getting them to re-run,
   it is important to keep in mind the ``DAG Run``'s state too as it defines
   whether the scheduler should look into triggering tasks for that run.
 
@@ -141,7 +141,7 @@ Instantiate a DAG
 
 We'll need a DAG object to nest our tasks into. Here we pass a string
 that defines the ``dag_id``, which serves as a unique identifier for your DAG.
-We also pass the default argument dictionary that we just defined and
+We also pass the default argument dictionary that we just defined and
 define a ``schedule_interval`` of 1 day for the DAG.
 
 .. code:: python
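Illustration (not part of this change): the tutorial's code block that follows this paragraph is truncated in this diff; the instantiation it describes is roughly of this shape (a sketch only; the dag_id and variable names are assumptions).

    from datetime import timedelta
    from airflow import DAG

    # 'tutorial' stands in for the dag_id string; default_args is the
    # dictionary defined earlier; one day is the schedule_interval above.
    dag = DAG('tutorial', default_args=default_args,
              schedule_interval=timedelta(days=1))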
@@ -358,8 +358,8 @@ Let's run a few commands to validate this script further.
 
 Testing
 '''''''
-Let's test by running the actual task instances on a specific date. The
-date specified in this context is an ``execution_date``, which simulates the
+Let's test by running the actual task instances on a specific date. The
+date specified in this context is an ``execution_date``, which simulates the
 scheduler running your task or dag at a specific date + time:
 
 .. code-block:: bash