Setup CLI, logging and templates

This commit is contained in:
Anna Scholtz 2022-03-01 14:11:45 -08:00
Parent 0b1426c1d3
Commit deb274d9da
19 changed files with 1775 additions and 2 deletions

@@ -1,3 +1,59 @@
# Operational Monitoring
To understand how to use Operational Monitoring, see the [documentation at dtmo](https://docs.telemetry.mozilla.org/cookbooks/operational_monitoring.html).
## Overview
![](./docs/images/overview.png)
The diagram above shows the relationship between different parts of the Operational Monitoring system. At a high level, data flows through the system in the following way:
1. Users create project definition files as described on [dtmo](https://docs.telemetry.mozilla.org/cookbooks/operational_monitoring.html).
2. Two daily jobs run in Airflow to process the config files: one to generate and run the ETL, and one to generate LookML for views/explores/dashboards.
3. Updated LookML dashboards and explores are available once per day; loading them runs aggregates on the fly by referencing the relevant BigQuery tables.
Below we will dive deeper into what's happening under the hood.
## ETL Generator
The ETL generator takes the project definition files as input and uses [Jinja](https://jinja.palletsprojects.com/en/3.0.x/) templates to generate different SQL queries to process the relevant data. At a high level, it works by doing the following steps for each project file:
1) Check if the project file was updated since the last time its SQL queries were generated - if so, regenerate.
2) For each data source in the `analysis` list in the project definition:
a. For each data type (e.g. scalars, histograms), generate a different query
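The per-project loop above can be sketched as follows. This is a minimal illustration only: the `generate_queries` helper and the template strings are hypothetical, and the stdlib `string.Template` stands in for the Jinja templates the real generator uses.

```python
from string import Template

# Hypothetical stand-ins for the Jinja templates, one per data type.
TEMPLATES = {
    "scalar": Template("SELECT client_id FROM `$source`  -- scalar query"),
    "histogram": Template("SELECT client_id FROM `$source`  -- histogram query"),
}


def generate_queries(project: dict, last_generated: float, config_mtime: float) -> dict:
    """Regenerate SQL for one project file if it changed since the last run."""
    # Step 1: skip projects whose config has not been updated.
    if config_mtime <= last_generated:
        return {}
    queries = {}
    # Step 2: one query per data source and data type.
    for source in project["analysis"]:
        for data_type, template in TEMPLATES.items():
            queries[(source, data_type)] = template.substitute(source=source)
    return queries
```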
The aggregations done in this ETL are at the client level. The queries are grouped by branch, x-axis value (build ID or submission date), and the dimensions listed in the project config.
Normalization is done so that clients with many submissions only count once.
* For histograms, this is done by summing up values for each bucket then dividing each bucket by the total number of submissions. For more detail, see [`histogram_normalized_sum()`](https://github.com/mozilla/bigquery-etl/blob/main/sql/mozfun/glam/histogram_normalized_sum/udf.sql)
* For scalars, this is done by computing an aggregate function (e.g. sum, avg) for each client
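In pure Python, the per-client histogram normalization described above looks roughly like this. This is an illustrative sketch of the idea behind `histogram_normalized_sum()`, not the UDF itself; histograms are represented as plain bucket-to-count dicts.

```python
from collections import defaultdict


def normalized_sum(histograms):
    """Sum values per bucket across one client's submissions, then divide
    each bucket by the number of submissions so the client counts once."""
    totals = defaultdict(float)
    for hist in histograms:
        for bucket, value in hist.items():
            totals[bucket] += value
    return {bucket: total / len(histograms) for bucket, total in totals.items()}
```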
Although the ETL uses a single set of SQL templates, the data is stored/represented slightly differently depending on whether build IDs or submission dates are on the x-axis.
* For build IDs on the x-axis, submission date is still the partition, but previous submission dates exist only as backups. Only the most recent submission date is of interest, as it includes all relevant builds to be graphed.
* For submission dates on the x-axis, submission date is also the partition, but previous partitions are not backups: they hold the dates that need to be graphed.
## ETL Runner
The [Operational Monitoring DAG](https://workflow.telemetry.mozilla.org/tree?dag_id=operational_monitoring) runs once per day in Airflow.
A separate table is generated for each operational monitoring project + data type. For example, a given project will have one table for scalars, which may pull scalars in from a variety of different tables in BigQuery.
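The per-project, per-data-type tables follow the `{{gcp_project}}.{{dataset}}.{{slug}}_<data type>` pattern used in the templates below; a tiny sketch (the helper name is hypothetical):

```python
def table_name(project: str, dataset: str, slug: str, data_type: str) -> str:
    """One destination table per operational monitoring project + data type."""
    return f"{project}.{dataset}.{slug}_{data_type}"
```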
## LookML Generator
Specific LookML is generated for Operational Monitoring. The code for this lives in the [lookml-generator](https://github.com/mozilla/lookml-generator) repo and runs daily as part of the [probe_scraper DAG](https://workflow.telemetry.mozilla.org/tree?dag_id=probe_scraper). Each run performs the following steps:
1) A view is generated for each table output by the ETL runner. The view contains the dimensions (e.g. metric, branch, build_id) and a measure that computes percentiles.
2) Explores are generated for each view. These include Looker [aggregate tables](https://docs.looker.com/reference/explore-params/aggregate_table) for each graph shown in the default view of a dashboard.
3) Dashboards are generated for each project.
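As an illustration of step 1 only (the real templates live in lookml-generator), view generation might look like the hypothetical `make_view` helper below; the measure shape is an assumption.

```python
def make_view(table: str, dimensions: list) -> str:
    """Render a minimal LookML view with dimensions and a percentile measure."""
    dims = "\n".join(
        f"  dimension: {name} {{ sql: ${{TABLE}}.{name} ;; }}" for name in dimensions
    )
    return (
        f"view: {table.split('.')[-1]} {{\n"
        f"  sql_table_name: `{table}` ;;\n"
        f"{dims}\n"
        "  measure: p50 { type: percentile; percentile: 50; sql: ${TABLE}.value ;; }\n"
        "}"
    )
```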
## Output
Below is an example of a dashboard generated by Operational Monitoring for the [Fission Release Rollout](https://mozilla.cloud.looker.com/dashboards/operational_monitoring::fission_release_rollout?Percentile=50&Cores%20Count=2&Os=Windows)
![](./docs/images/example.png)
Note that the dropdowns shown, aside from `Percentile`, are generated based on the project definition. There is one dropdown for each dimension specified, and it is populated by querying unique values for that dimension. The default value for each dropdown is the most common value found in the table.
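The "most common value" default amounts to a mode computation per dimension; sketched in Python (illustrative only, with rows as plain dicts rather than query results):

```python
from collections import Counter


def dropdown_defaults(rows, dimensions):
    """Pick the most common value of each dimension as the dropdown default."""
    return {
        dim: Counter(row[dim] for row in rows).most_common(1)[0][0]
        for dim in dimensions
    }
```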

Binary data
docs/images/example.png Normal file

Binary file not shown. Size: 196 KiB

Binary data
docs/images/overview.png Normal file

Binary file not shown. Size: 209 KiB

62
opmon/bigquery_client.py Normal file
@@ -0,0 +1,62 @@
from typing import Dict, Iterable, Mapping, Optional

import attr
import google.cloud.bigquery
import google.cloud.bigquery.client
import google.cloud.bigquery.dataset
import google.cloud.bigquery.job
import google.cloud.bigquery.table


@attr.s(auto_attribs=True, slots=True)
class BigQueryClient:
    project: str
    dataset: str
    _client: Optional[google.cloud.bigquery.client.Client] = None

    @property
    def client(self):
        self._client = self._client or google.cloud.bigquery.client.Client(self.project)
        return self._client

    def add_labels_to_table(self, table_name: str, labels: Mapping[str, str]) -> None:
        """Adds the provided labels to the table."""
        table_ref = self.client.dataset(self.dataset).table(table_name)
        table = self.client.get_table(table_ref)
        table.labels = labels
        self.client.update_table(table, ["labels"])

    def execute(
        self,
        query: str,
        destination_table: Optional[str] = None,
        write_disposition: Optional[google.cloud.bigquery.job.WriteDisposition] = None,
    ) -> None:
        dataset = google.cloud.bigquery.dataset.DatasetReference.from_string(
            self.dataset,
            default_project=self.project,
        )
        kwargs = {}
        if destination_table:
            kwargs["destination"] = dataset.table(destination_table)
            kwargs["write_disposition"] = google.cloud.bigquery.job.WriteDisposition.WRITE_TRUNCATE
        if write_disposition:
            kwargs["write_disposition"] = write_disposition
        config = google.cloud.bigquery.job.QueryJobConfig(default_dataset=dataset, **kwargs)
        job = self.client.query(query, config)
        # block on result
        job.result(max_results=1)

    def load_table_from_json(
        self, results: Iterable[Dict], table: str, job_config: google.cloud.bigquery.LoadJobConfig
    ):
        # wait for the job to complete
        destination_table = f"{self.project}.{self.dataset}.{table}"
        self.client.load_table_from_json(results, destination_table, job_config=job_config).result()

    def delete_table(self, table_id: str) -> None:
        """Delete the table."""
        self.client.delete_table(table_id, not_found_ok=True)

121
opmon/cli.py Normal file
@@ -0,0 +1,121 @@
import logging
import os
import sys
from datetime import datetime
from pathlib import Path
from typing import Iterable

import click
import pytz

from .logging import LogConfiguration

logger = logging.getLogger(__name__)


class ClickDate(click.ParamType):
    name = "date"

    def convert(self, value, param, ctx):
        if isinstance(value, datetime):
            return value
        return datetime.strptime(value, "%Y-%m-%d").replace(tzinfo=pytz.utc)


project_id_option = click.option(
    "--project_id",
    "--project-id",
    default="moz-fx-data-experiments",
    help="Project to write to",
)
dataset_id_option = click.option(
    "--dataset_id", "--dataset-id", default="mozanalysis", help="Dataset to write to", required=True
)
slug_option = click.option(
    "--slug",
    help="Experimenter or Normandy slug associated with the project to (re)run the analysis for",
)
config_file_option = click.option(
    "--config_file", "--config-file", type=click.File("rt"), hidden=True
)


@click.group()
@click.option(
    "--log_project_id",
    "--log-project-id",
    default="moz-fx-data-experiments",
    help="GCP project to write logs to",
)
@click.option(
    "--log_dataset_id",
    "--log-dataset-id",
    default="monitoring",
    help="Dataset to write logs to",
)
@click.option(
    "--log_table_id", "--log-table-id", default="opmon_logs", help="Table to write logs to"
)
@click.option("--log_to_bigquery", "--log-to-bigquery", is_flag=True, default=False)
@click.pass_context
def cli(
    ctx,
    log_project_id,
    log_dataset_id,
    log_table_id,
    log_to_bigquery,
):
    log_config = LogConfiguration(
        log_project_id,
        log_dataset_id,
        log_table_id,
        log_to_bigquery,
    )
    log_config.setup_logger()
    ctx.ensure_object(dict)
    ctx.obj["log_config"] = log_config


@cli.command()
@project_id_option
@dataset_id_option
@click.option(
    "--date",
    type=ClickDate(),
    help="Date for which experiments should be analyzed",
    metavar="YYYY-MM-DD",
    required=True,
)
@slug_option
@click.pass_context
def run(
    ctx,
    project_id,
    dataset_id,
    date,
    slug,
):
    # todo: run analysis for date
    success = True
    sys.exit(0 if success else 1)


@cli.command("validate_config")
@click.argument("path", type=click.Path(exists=True), nargs=-1)
def validate_config(path: Iterable[os.PathLike]):
    """Validate config files."""
    dirty = False
    # collection = ExperimentCollection.from_experimenter()
    for config_file in path:
        config_file = Path(config_file)
        if not config_file.is_file():
            continue
        if ".example" in config_file.suffixes:
            print(f"Skipping example config {config_file}")
            continue
        print(f"Evaluating {config_file}...")
        # todo: run validation
    sys.exit(1 if dirty else 0)

31
opmon/logging/__init__.py Normal file
@@ -0,0 +1,31 @@
import logging
from typing import Optional

import attr

from .bigquery_log_handler import BigQueryLogHandler


@attr.s(auto_attribs=True)
class LogConfiguration:
    """Configuration for setting up logging."""

    log_project_id: Optional[str]
    log_dataset_id: Optional[str]
    log_table_id: Optional[str]
    log_to_bigquery: bool = False
    capacity: int = 50

    def setup_logger(self, client=None):
        logging.basicConfig(
            level=logging.INFO,
            format="%(levelname)s:%(asctime)s:%(name)s:%(message)s",
        )
        logger = logging.getLogger()

        if self.log_to_bigquery:
            bigquery_handler = BigQueryLogHandler(
                self.log_project_id, self.log_dataset_id, self.log_table_id, client, self.capacity
            )
            bigquery_handler.setLevel(logging.WARNING)
            logger.addHandler(bigquery_handler)

opmon/logging/bigquery_log_handler.py Normal file
@@ -0,0 +1,64 @@
import datetime
from logging.handlers import BufferingHandler
from typing import Optional

from google.cloud import bigquery


class BigQueryLogHandler(BufferingHandler):
    """Custom logging handler for writing logs to BigQuery."""

    def __init__(
        self,
        project_id: str,
        dataset_id: str,
        table_id: str,
        client: Optional[bigquery.Client] = None,
        capacity=50,
    ):
        self.project_id = project_id
        self.dataset_id = dataset_id
        self.table_id = table_id
        self.client = client
        if client is None:
            self.client = bigquery.Client(project_id)

        super().__init__(capacity)

    def _buffer_to_json(self, buffer):
        """Converts the records in the buffer to JSON."""
        return [
            {
                "timestamp": datetime.datetime.fromtimestamp(record.created).strftime(
                    "%Y-%m-%d %H:%M:%S"
                ),
                "slug": None if not hasattr(record, "slug") else record.slug,
                "message": record.getMessage(),
                "log_level": record.levelname,
                "exception": str(record.exc_info),
                "filename": record.filename,
                "func_name": record.funcName,
                "exception_type": None if not record.exc_info else record.exc_info[0].__name__,
            }
            for record in buffer
        ]

    def flush(self):
        """
        Override default flushing behaviour.

        Write the buffer to BigQuery.
        """
        self.acquire()
        try:
            if self.buffer:
                destination_table = f"{self.project_id}.{self.dataset_id}.{self.table_id}"
                self.client.load_table_from_json(
                    self._buffer_to_json(self.buffer), destination_table
                ).result()
                self.buffer = []
        except Exception as e:
            print(f"Exception while flushing logs: {e}")
        finally:
            self.release()
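The buffering pattern used by `BigQueryLogHandler` can be exercised without GCP by flushing into an in-memory sink. The `ListLogHandler` stub below is hypothetical and stands in for the BigQuery client; it shows how `BufferingHandler` accumulates records and flushes once `capacity` is reached.

```python
import logging
from logging.handlers import BufferingHandler


class ListLogHandler(BufferingHandler):
    """Buffers records and flushes their messages into a plain list."""

    def __init__(self, sink, capacity=3):
        self.sink = sink
        super().__init__(capacity)

    def flush(self):
        self.acquire()
        try:
            self.sink.extend(record.getMessage() for record in self.buffer)
            self.buffer = []
        finally:
            self.release()


sink = []
logger = logging.getLogger("opmon.demo")
logger.addHandler(ListLogHandler(sink, capacity=2))
logger.warning("first")   # buffered, below capacity
logger.warning("second")  # hitting capacity triggers flush()
```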

@@ -0,0 +1,33 @@
CREATE TABLE IF NOT EXISTS
`{{gcp_project}}.{{dataset}}.{{slug}}_histogram` (
submission_date DATE,
client_id STRING,
build_id STRING,
branch STRING,
{% for dimension in dimensions %}
{{ dimension.name }} STRING,
{% endfor %}
metrics ARRAY<
STRUCT<
name STRING,
histogram STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
        values ARRAY<STRUCT<key STRING, value INT64>>
>
>
>)
PARTITION BY submission_date
CLUSTER BY
build_id
OPTIONS
(require_partition_filter = TRUE,
{% if xaxis == "submission_date" %}
partition_expiration_days = NULL
{% else %}
partition_expiration_days = 5
{% endif %}
)

@@ -0,0 +1,168 @@
{{ header }}
WITH merged_probes AS (
SELECT
{% if xaxis == "submission_date" %}
DATE(submission_timestamp) AS submission_date,
{% else %}
@submission_date AS submission_date,
{% endif %}
client_id,
SAFE.SUBSTR(application.build_id, 0, 8) AS build_id,
{% for dimension in dimensions %}
CAST({{ dimension.sql }} AS STRING) AS {{ dimension.name }},
{% endfor %}
-- If a pref is defined, treat it as a rollout with an enabled and disabled branch.
-- If branches are provided, use those instead.
-- If neither a pref or branches are available, use the slug and treat it as a rollout
-- where those with the slug have the feature enabled and those without do not.
{% if pref %}
CASE
WHEN SAFE_CAST({{pref}} as BOOLEAN) THEN 'enabled'
WHEN NOT SAFE_CAST({{pref}} as BOOLEAN) THEN 'disabled'
END
AS branch,
{% elif branches %}
mozfun.map.get_key(
environment.experiments,
"{{slug}}"
).branch AS branch,
{% else %}
CASE WHEN
mozfun.map.get_key(
environment.experiments,
"{{slug}}"
).branch IS NULL THEN 'disabled'
ELSE 'enabled'
END AS branch,
{% endif %}
ARRAY<
STRUCT<
metric STRING,
histograms ARRAY<
STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
values ARRAY<STRUCT<key INT64, value INT64>>>
>>
>[
{% for probe in probes %}
(
"{{ probe.name }}",
ARRAY_AGG(mozfun.hist.extract({{ probe.sql }}) IGNORE NULLS)
)
{{ "," if not loop.last else "" }}
{% endfor %}
] AS metrics,
FROM
`{{source}}`
WHERE
DATE(submission_timestamp) >= DATE_SUB(@submission_date, INTERVAL 60 DAY)
AND normalized_channel = '{{channel}}'
GROUP BY
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch
),
merged_histograms AS (
SELECT
submission_date,
client_id,
build_id,
branch,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
ARRAY_AGG(
STRUCT<
name STRING,
histogram STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
values ARRAY<STRUCT<key INT64, value INT64>>
>
> (
metric,
CASE
WHEN
histograms IS NULL
THEN
NULL
ELSE
mozfun.hist.merge(histograms)
END
)
) AS metrics
FROM
merged_probes
CROSS JOIN
UNNEST(metrics)
WHERE branch IN (
-- If branches are not defined, assume it's a rollout
-- and fall back to branches labeled as enabled/disabled
{% if branches %}
{% for branch in branches %}
"{{ branch }}"
{{ "," if not loop.last else "" }}
{% endfor %}
{% else %}
"enabled", "disabled"
{% endif %}
)
GROUP BY
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch)
-- Cast histograms to have string keys so we can use the histogram normalization function
SELECT
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
ARRAY_AGG(
STRUCT<
name STRING,
histogram STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
      values ARRAY<STRUCT<key STRING, value INT64>>
>
>(name, (histogram.bucket_count,
histogram.sum,
histogram.histogram_type,
histogram.range,
ARRAY(SELECT AS STRUCT CAST(keyval.key AS STRING), keyval.value FROM UNNEST(histogram.values) keyval))
)
) AS metrics
FROM merged_histograms
CROSS JOIN UNNEST(metrics)
GROUP BY
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch

@@ -0,0 +1,94 @@
{{ header }}
CREATE OR REPLACE VIEW
`{{gcp_project}}.operational_monitoring.{{slug}}_histogram`
AS
WITH valid_builds AS (
SELECT build_id
FROM `{{gcp_project}}.{{dataset}}.{{slug}}_histogram`
WHERE {% include 'where_clause.sql' %}
GROUP BY 1
HAVING COUNT(DISTINCT client_id) >= {{user_count_threshold}}
),
filtered_histograms AS (
SELECT *
FROM valid_builds
INNER JOIN `{{gcp_project}}.{{dataset}}.{{slug}}_histogram`
USING (build_id)
WHERE {% include 'where_clause.sql' %}
),
normalized AS (
SELECT
client_id,
{% if xaxis == "submission_date" %}
submission_date,
{% else %}
build_id,
{% endif %}
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
name AS probe,
STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
      values ARRAY<STRUCT<key STRING, value FLOAT64>>
>(
ANY_VALUE(histogram.bucket_count),
ANY_VALUE(histogram.sum),
ANY_VALUE(histogram.histogram_type),
ANY_VALUE(histogram.range),
mozfun.glam.histogram_normalized_sum(
mozfun.hist.merge(ARRAY_AGG(histogram IGNORE NULLS)).values,
1.0
)
) AS histogram
FROM filtered_histograms
CROSS JOIN UNNEST(metrics)
GROUP BY
client_id,
{% if xaxis == "submission_date" %}
submission_date,
{% else %}
build_id,
{% endif %}
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
probe)
-- Cast histograms to have FLOAT64 keys
-- so we can use the histogram jackknife percentile function.
SELECT
client_id,
{% if xaxis == "submission_date" %}
submission_date,
{% else %}
build_id,
{% endif %}
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
probe,
STRUCT<
bucket_count INT64,
sum INT64,
histogram_type INT64,
`range` ARRAY<INT64>,
    values ARRAY<STRUCT<key FLOAT64, value FLOAT64>>
  >(histogram.bucket_count,
histogram.sum,
histogram.histogram_type,
histogram.range,
ARRAY(SELECT AS STRUCT CAST(keyval.key AS FLOAT64), keyval.value FROM UNNEST(histogram.values) keyval)
) AS histogram
FROM normalized

@@ -0,0 +1,24 @@
CREATE TABLE IF NOT EXISTS
`{{gcp_project}}.{{dataset}}.{{slug}}_scalar` (
submission_date DATE,
client_id STRING,
build_id STRING,
branch STRING,
{% for dimension in dimensions %}
{{ dimension.name }} STRING,
{% endfor %}
name STRING,
agg_type STRING,
value FLOAT64
)
PARTITION BY submission_date
CLUSTER BY
build_id
OPTIONS
(require_partition_filter = TRUE,
{% if xaxis == "submission_date" %}
partition_expiration_days = NULL
{% else %}
partition_expiration_days = 5
{% endif %}
)

@@ -0,0 +1,124 @@
{{ header }}
WITH merged_scalars AS (
SELECT
{% if xaxis == "submission_date" %}
DATE(submission_timestamp) AS submission_date,
{% else %}
@submission_date AS submission_date,
{% endif %}
client_id,
SAFE.SUBSTR(application.build_id, 0, 8) AS build_id,
{% for dimension in dimensions %}
CAST({{ dimension.sql }} AS STRING) AS {{ dimension.name }},
{% endfor %}
-- If a pref is defined, treat it as a rollout with an enabled and disabled branch.
-- If branches are provided, use those instead.
-- If neither a pref or branches are available, use the slug and treat it as a rollout
-- where those with the slug have the feature enabled and those without do not.
{% if pref %}
CASE
WHEN SAFE_CAST({{pref}} as BOOLEAN) THEN 'enabled'
WHEN NOT SAFE_CAST({{pref}} as BOOLEAN) THEN 'disabled'
END
AS branch,
{% elif branches %}
mozfun.map.get_key(
environment.experiments,
"{{slug}}"
).branch AS branch,
{% else %}
CASE WHEN
mozfun.map.get_key(
environment.experiments,
"{{slug}}"
).branch IS NULL THEN 'disabled'
ELSE 'enabled'
END AS branch,
{% endif %}
ARRAY<
STRUCT<
name STRING,
agg_type STRING,
value INT64
>
>[
{% for probe in probes %}
(
"{{ probe.name }}",
"MAX",
MAX(CAST({{ probe.sql }} AS INT64))
),
(
"{{ probe.name }}",
"SUM",
SUM(CAST({{ probe.sql }} AS INT64))
)
{{ "," if not loop.last else "" }}
{% endfor %}
] AS metrics,
FROM
`{{source}}`
WHERE
DATE(submission_timestamp) >= DATE_SUB(@submission_date, INTERVAL 60 DAY)
AND normalized_channel = '{{channel}}'
GROUP BY
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch
),
flattened_scalars AS (
SELECT *
FROM merged_scalars
CROSS JOIN UNNEST(metrics)
WHERE branch IN (
-- If branches are not defined, assume it's a rollout
-- and fall back to branches labeled as enabled/disabled
{% if branches %}
{% for branch in branches %}
"{{ branch }}"
{{ "," if not loop.last else "" }}
{% endfor %}
{% else %}
"enabled", "disabled"
{% endif %}
)
),
log_min_max AS (
SELECT
name,
LOG(IF(MIN(value) <= 0, 1, MIN(value)), 2) log_min,
LOG(IF(MAX(value) <= 0, 1, MAX(value)), 2) log_max
FROM
flattened_scalars
GROUP BY 1),
buckets_by_metric AS (
SELECT name, ARRAY(SELECT FORMAT("%.*f", 2, bucket) FROM UNNEST(
mozfun.glam.histogram_generate_scalar_buckets(log_min, log_max, 100)
) AS bucket ORDER BY bucket) AS buckets
FROM log_min_max)
SELECT
submission_date,
client_id,
build_id,
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
name,
agg_type,
-- Replace value with its bucket value
SAFE_CAST(FORMAT("%.*f", 2, COALESCE(mozfun.glam.histogram_bucket_from_value(buckets, SAFE_CAST(value AS FLOAT64)), 0) + 0.0001) AS FLOAT64) AS value
FROM
flattened_scalars
LEFT JOIN buckets_by_metric USING(name)

@@ -0,0 +1,50 @@
CREATE OR REPLACE VIEW
`{{gcp_project}}.operational_monitoring.{{slug}}_scalar`
AS
WITH valid_builds AS (
SELECT build_id
FROM `{{gcp_project}}.{{dataset}}.{{slug}}_scalar`
WHERE {% include 'where_clause.sql' %}
GROUP BY 1
HAVING COUNT(DISTINCT client_id) >= {{user_count_threshold}}
),
filtered_scalars AS (
SELECT *
FROM valid_builds
INNER JOIN `{{gcp_project}}.{{dataset}}.{{slug}}_scalar`
USING (build_id)
WHERE {% include 'where_clause.sql' %}
)
SELECT
client_id,
{% if xaxis == "submission_date" %}
submission_date,
{% else %}
build_id,
{% endif %}
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
agg_type,
name AS probe,
CASE agg_type
WHEN "MAX" THEN MAX(value)
ELSE SUM(value)
END AS value
FROM filtered_scalars
GROUP BY
client_id,
{% if xaxis == "submission_date" %}
submission_date,
{% else %}
build_id,
{% endif %}
{% for dimension in dimensions %}
{{ dimension.name }},
{% endfor %}
branch,
agg_type,
probe

@@ -0,0 +1,14 @@
{% if xaxis == "submission_date" %}
{% if start_date %}
DATE(submission_date) >= "{{start_date}}"
{% else %}
DATE(submission_date) > DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
{% endif %}
{% else %}
{% if start_date %}
PARSE_DATE('%Y%m%d', CAST(build_id AS STRING)) >= "{{start_date}}"
{% else %}
PARSE_DATE('%Y%m%d', CAST(build_id AS STRING)) > DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
{% endif %}
AND DATE(submission_date) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
{% endif %}

@@ -0,0 +1,188 @@
#
# This file is autogenerated by update_deps
# To update, run:
#
# update_deps
#
attrs==21.4.0
# via
# cattrs
# jsonschema
# mozilla-opmon
# pytest
black==22.1.0
# via pytest-black
cachetools==5.0.0
# via google-auth
cattrs==1.10.0
# via mozilla-opmon
certifi==2021.10.8
# via requests
charset-normalizer==2.0.12
# via requests
click==8.0.4
# via
# black
# mozilla-opmon
coverage[toml]==6.3.2
# via
# mozilla-opmon
# pytest-cov
flake8==4.0.1
# via pytest-flake8
gitdb==4.0.9
# via gitpython
gitpython==3.1.27
# via mozilla-opmon
google-api-core[grpc]==2.5.0
# via
# google-cloud-bigquery
# google-cloud-core
google-auth==2.6.0
# via
# google-api-core
# google-cloud-core
google-cloud-bigquery==2.34.0
# via mozilla-opmon
google-cloud-core==2.2.2
# via google-cloud-bigquery
google-crc32c==1.3.0
# via google-resumable-media
google-resumable-media==2.3.0
# via google-cloud-bigquery
googleapis-common-protos==1.55.0
# via
# google-api-core
# grpcio-status
grpcio==1.44.0
# via
# google-api-core
# google-cloud-bigquery
# grpcio-status
# mozilla-opmon
grpcio-status==1.44.0
# via google-api-core
idna==3.3
# via requests
importlib-resources==5.4.0
# via jsonschema
iniconfig==1.1.1
# via pytest
isort==5.10.1
# via mozilla-opmon
jinja2==3.0.3
# via mozilla-opmon
jsonschema==4.4.0
# via mozilla-opmon
markupsafe==2.1.0
# via jinja2
mccabe==0.6.1
# via flake8
# via -r -
mypy==0.931
# via mozilla-opmon
mypy-extensions==0.4.3
# via
# black
# mypy
packaging==21.3
# via
# google-cloud-bigquery
# pytest
pathspec==0.9.0
# via black
platformdirs==2.5.1
# via black
pluggy==1.0.0
# via pytest
proto-plus==1.20.3
# via google-cloud-bigquery
protobuf==3.19.4
# via
# google-api-core
# google-cloud-bigquery
# googleapis-common-protos
# grpcio-status
# proto-plus
py==1.11.0
# via pytest
pyasn1==0.4.8
# via
# pyasn1-modules
# rsa
pyasn1-modules==0.2.8
# via google-auth
pycodestyle==2.8.0
# via flake8
pyflakes==2.4.0
# via flake8
pyparsing==3.0.7
# via packaging
pyrsistent==0.18.1
# via jsonschema
pytest==7.0.1
# via
# mozilla-opmon
# pytest-black
# pytest-cov
# pytest-flake8
pytest-black==0.3.12
# via mozilla-opmon
pytest-cov==3.0.0
# via mozilla-opmon
pytest-flake8==1.0.7
# via mozilla-opmon
python-dateutil==2.8.2
# via google-cloud-bigquery
pytz==2021.3
# via mozilla-opmon
requests==2.27.1
# via
# google-api-core
# google-cloud-bigquery
# mozilla-opmon
rsa==4.8
# via google-auth
six==1.16.0
# via
# google-auth
# grpcio
# python-dateutil
smmap==5.0.0
# via gitdb
toml==0.10.2
# via
# mozilla-opmon
# pytest-black
tomli==2.0.1
# via
# black
# coverage
# mypy
# pytest
types-futures==3.3.8
# via mozilla-opmon
types-pkg-resources==0.1.3
# via mozilla-opmon
types-protobuf==3.19.12
# via mozilla-opmon
types-pytz==2021.3.5
# via mozilla-opmon
types-pyyaml==6.0.4
# via mozilla-opmon
types-requests==2.27.11
# via mozilla-opmon
types-six==1.16.11
# via mozilla-opmon
types-toml==0.10.4
# via mozilla-opmon
types-urllib3==1.26.10
# via types-requests
typing-extensions==4.1.1
# via
# black
# mypy
urllib3==1.26.8
# via requests
zipp==3.7.0
# via importlib-resources

634
requirements.txt Normal file
@@ -0,0 +1,634 @@
#
# This file is autogenerated by pip-compile with python 3.8
# To update, run:
#
# pip-compile --generate-hashes --output-file=requirements.txt requirements.in
#
attrs==21.4.0 \
--hash=sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4 \
--hash=sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd
# via
# -r requirements.in
# cattrs
# jsonschema
# pytest
black==22.1.0 \
--hash=sha256:07e5c049442d7ca1a2fc273c79d1aecbbf1bc858f62e8184abe1ad175c4f7cc2 \
--hash=sha256:0e21e1f1efa65a50e3960edd068b6ae6d64ad6235bd8bfea116a03b21836af71 \
--hash=sha256:1297c63b9e1b96a3d0da2d85d11cd9bf8664251fd69ddac068b98dc4f34f73b6 \
--hash=sha256:228b5ae2c8e3d6227e4bde5920d2fc66cc3400fde7bcc74f480cb07ef0b570d5 \
--hash=sha256:2d6f331c02f0f40aa51a22e479c8209d37fcd520c77721c034517d44eecf5912 \
--hash=sha256:2ff96450d3ad9ea499fc4c60e425a1439c2120cbbc1ab959ff20f7c76ec7e866 \
--hash=sha256:3524739d76b6b3ed1132422bf9d82123cd1705086723bc3e235ca39fd21c667d \
--hash=sha256:35944b7100af4a985abfcaa860b06af15590deb1f392f06c8683b4381e8eeaf0 \
--hash=sha256:373922fc66676133ddc3e754e4509196a8c392fec3f5ca4486673e685a421321 \
--hash=sha256:5fa1db02410b1924b6749c245ab38d30621564e658297484952f3d8a39fce7e8 \
--hash=sha256:6f2f01381f91c1efb1451998bd65a129b3ed6f64f79663a55fe0e9b74a5f81fd \
--hash=sha256:742ce9af3086e5bd07e58c8feb09dbb2b047b7f566eb5f5bc63fd455814979f3 \
--hash=sha256:7835fee5238fc0a0baf6c9268fb816b5f5cd9b8793423a75e8cd663c48d073ba \
--hash=sha256:8871fcb4b447206904932b54b567923e5be802b9b19b744fdff092bd2f3118d0 \
--hash=sha256:a7c0192d35635f6fc1174be575cb7915e92e5dd629ee79fdaf0dcfa41a80afb5 \
--hash=sha256:b1a5ed73ab4c482208d20434f700d514f66ffe2840f63a6252ecc43a9bc77e8a \
--hash=sha256:c8226f50b8c34a14608b848dc23a46e5d08397d009446353dad45e04af0c8e28 \
--hash=sha256:ccad888050f5393f0d6029deea2a33e5ae371fd182a697313bdbd835d3edaf9c \
--hash=sha256:dae63f2dbf82882fa3b2a3c49c32bffe144970a573cd68d247af6560fc493ae1 \
--hash=sha256:e2f69158a7d120fd641d1fa9a921d898e20d52e44a74a6fbbcc570a62a6bc8ab \
--hash=sha256:efbadd9b52c060a8fc3b9658744091cb33c31f830b3f074422ed27bad2b18e8f \
--hash=sha256:f5660feab44c2e3cb24b2419b998846cbb01c23c7fe645fee45087efa3da2d61 \
--hash=sha256:fdb8754b453fb15fad3f72cd9cad3e16776f0964d67cf30ebcbf10327a3777a3
# via
# -r requirements.in
# pytest-black
cachetools==5.0.0 \
--hash=sha256:486471dfa8799eb7ec503a8059e263db000cdda20075ce5e48903087f79d5fd6 \
--hash=sha256:8fecd4203a38af17928be7b90689d8083603073622229ca7077b72d8e5a976e4
# via
# -r requirements.in
# google-auth
cattrs==1.10.0 \
--hash=sha256:211800f725cdecedcbcf4c753bbd22d248312b37d130f06045434acb7d9b34e1 \
--hash=sha256:35dd9063244263e63bd0bd24ea61e3015b00272cead084b2c40d788b0f857c46
# via -r requirements.in
certifi==2021.10.8 \
--hash=sha256:78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872 \
--hash=sha256:d62a0163eb4c2344ac042ab2bdf75399a71a2d8c7d47eac2e2ee91b9d6339569
# via
# -r requirements.in
# requests
charset-normalizer==2.0.12 \
--hash=sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597 \
--hash=sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df
# via
# -r requirements.in
# requests
click==8.0.4 \
--hash=sha256:6a7a62563bbfabfda3a38f3023a1db4a35978c0abd76f6c9605ecd6554d6d9b1 \
--hash=sha256:8458d7b1287c5fb128c90e23381cf99dcde74beaf6c7ff6384ce84d6fe090adb
# via
# -r requirements.in
# black
coverage[toml]==6.3.2 \
--hash=sha256:03e2a7826086b91ef345ff18742ee9fc47a6839ccd517061ef8fa1976e652ce9 \
--hash=sha256:07e6db90cd9686c767dcc593dff16c8c09f9814f5e9c51034066cad3373b914d \
--hash=sha256:18d520c6860515a771708937d2f78f63cc47ab3b80cb78e86573b0a760161faf \
--hash=sha256:1ebf730d2381158ecf3dfd4453fbca0613e16eaa547b4170e2450c9707665ce7 \
--hash=sha256:21b7745788866028adeb1e0eca3bf1101109e2dc58456cb49d2d9b99a8c516e6 \
--hash=sha256:26e2deacd414fc2f97dd9f7676ee3eaecd299ca751412d89f40bc01557a6b1b4 \
--hash=sha256:2c6dbb42f3ad25760010c45191e9757e7dce981cbfb90e42feef301d71540059 \
--hash=sha256:2fea046bfb455510e05be95e879f0e768d45c10c11509e20e06d8fcaa31d9e39 \
--hash=sha256:34626a7eee2a3da12af0507780bb51eb52dca0e1751fd1471d0810539cefb536 \
--hash=sha256:37d1141ad6b2466a7b53a22e08fe76994c2d35a5b6b469590424a9953155afac \
--hash=sha256:46191097ebc381fbf89bdce207a6c107ac4ec0890d8d20f3360345ff5976155c \
--hash=sha256:4dd8bafa458b5c7d061540f1ee9f18025a68e2d8471b3e858a9dad47c8d41903 \
--hash=sha256:4e21876082ed887baed0146fe222f861b5815455ada3b33b890f4105d806128d \
--hash=sha256:58303469e9a272b4abdb9e302a780072c0633cdcc0165db7eec0f9e32f901e05 \
--hash=sha256:5ca5aeb4344b30d0bec47481536b8ba1181d50dbe783b0e4ad03c95dc1296684 \
--hash=sha256:68353fe7cdf91f109fc7d474461b46e7f1f14e533e911a2a2cbb8b0fc8613cf1 \
--hash=sha256:6f89d05e028d274ce4fa1a86887b071ae1755082ef94a6740238cd7a8178804f \
--hash=sha256:7a15dc0a14008f1da3d1ebd44bdda3e357dbabdf5a0b5034d38fcde0b5c234b7 \
--hash=sha256:8bdde1177f2311ee552f47ae6e5aa7750c0e3291ca6b75f71f7ffe1f1dab3dca \
--hash=sha256:8ce257cac556cb03be4a248d92ed36904a59a4a5ff55a994e92214cde15c5bad \
--hash=sha256:8cf5cfcb1521dc3255d845d9dca3ff204b3229401994ef8d1984b32746bb45ca \
--hash=sha256:8fbbdc8d55990eac1b0919ca69eb5a988a802b854488c34b8f37f3e2025fa90d \
--hash=sha256:9548f10d8be799551eb3a9c74bbf2b4934ddb330e08a73320123c07f95cc2d92 \
--hash=sha256:96f8a1cb43ca1422f36492bebe63312d396491a9165ed3b9231e778d43a7fca4 \
--hash=sha256:9b27d894748475fa858f9597c0ee1d4829f44683f3813633aaf94b19cb5453cf \
--hash=sha256:9baff2a45ae1f17c8078452e9e5962e518eab705e50a0aa8083733ea7d45f3a6 \
--hash=sha256:a2a8b8bcc399edb4347a5ca8b9b87e7524c0967b335fbb08a83c8421489ddee1 \
--hash=sha256:acf53bc2cf7282ab9b8ba346746afe703474004d9e566ad164c91a7a59f188a4 \
--hash=sha256:b0be84e5a6209858a1d3e8d1806c46214e867ce1b0fd32e4ea03f4bd8b2e3359 \
--hash=sha256:b31651d018b23ec463e95cf10070d0b2c548aa950a03d0b559eaa11c7e5a6fa3 \
--hash=sha256:b78e5afb39941572209f71866aa0b206c12f0109835aa0d601e41552f9b3e620 \
--hash=sha256:c76aeef1b95aff3905fb2ae2d96e319caca5b76fa41d3470b19d4e4a3a313512 \
--hash=sha256:dd035edafefee4d573140a76fdc785dc38829fe5a455c4bb12bac8c20cfc3d69 \
--hash=sha256:dd6fe30bd519694b356cbfcaca9bd5c1737cddd20778c6a581ae20dc8c04def2 \
--hash=sha256:e5f4e1edcf57ce94e5475fe09e5afa3e3145081318e5fd1a43a6b4539a97e518 \
--hash=sha256:ec6bc7fe73a938933d4178c9b23c4e0568e43e220aef9472c4f6044bfc6dd0f0 \
--hash=sha256:f1555ea6d6da108e1999b2463ea1003fe03f29213e459145e70edbaf3e004aaa \
--hash=sha256:f5fa5803f47e095d7ad8443d28b01d48c0359484fec1b9d8606d0e3282084bc4 \
--hash=sha256:f7331dbf301b7289013175087636bbaf5b2405e57259dd2c42fdcc9fcc47325e \
--hash=sha256:f9987b0354b06d4df0f4d3e0ec1ae76d7ce7cbca9a2f98c25041eb79eec766f1 \
--hash=sha256:fd9e830e9d8d89b20ab1e5af09b32d33e1a08ef4c4e14411e559556fd788e6b2
# via
# -r requirements.in
# pytest-cov
flake8==4.0.1 \
--hash=sha256:479b1304f72536a55948cb40a32dce8bb0ffe3501e26eaf292c7e60eb5e0428d \
--hash=sha256:806e034dda44114815e23c16ef92f95c91e4c71100ff52813adf7132a6ad870d
# via
# -r requirements.in
# pytest-flake8
gitdb==4.0.9 \
--hash=sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd \
--hash=sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa
# via
# -r requirements.in
# gitpython
gitpython==3.1.27 \
--hash=sha256:1c885ce809e8ba2d88a29befeb385fcea06338d3640712b59ca623c220bb5704 \
--hash=sha256:5b68b000463593e05ff2b261acff0ff0972df8ab1b70d3cdbd41b546c8b8fc3d
# via -r requirements.in
google-api-core[grpc]==2.5.0 \
--hash=sha256:7d030edbd3a0e994d796e62716022752684e863a6df9864b6ca82a1616c2a5a6 \
--hash=sha256:f33863a6709651703b8b18b67093514838c79f2b04d02aa501203079f24b8018
# via
# -r requirements.in
# google-cloud-bigquery
# google-cloud-core
google-auth==2.6.0 \
--hash=sha256:218ca03d7744ca0c8b6697b6083334be7df49b7bf76a69d555962fd1a7657b5f \
--hash=sha256:ad160fc1ea8f19e331a16a14a79f3d643d813a69534ba9611d2c80dc10439dad
# via
# -r requirements.in
# google-api-core
# google-cloud-core
google-cloud-bigquery==2.34.0 \
--hash=sha256:2b75e19e53c859244120180c72a1aa99e849156e3c0413071a4bfbae0b74c530 \
--hash=sha256:653bf2465bc1ae56063c83a75ce0af980b816cafe587a7bba851e6a81be85ae2
# via -r requirements.in
google-cloud-core==2.2.2 \
--hash=sha256:7d19bf8868b410d0bdf5a03468a3f3f2db233c0ee86a023f4ecc2b7a4b15f736 \
--hash=sha256:d9cffaf86df6a876438d4e8471183bbe404c9a15de9afe60433bc7dce8cb4252
# via
# -r requirements.in
# google-cloud-bigquery
google-crc32c==1.3.0 \
--hash=sha256:04e7c220798a72fd0f08242bc8d7a05986b2a08a0573396187fd32c1dcdd58b3 \
--hash=sha256:05340b60bf05b574159e9bd940152a47d38af3fb43803ffe71f11d704b7696a6 \
--hash=sha256:12674a4c3b56b706153a358eaa1018c4137a5a04635b92b4652440d3d7386206 \
--hash=sha256:127f9cc3ac41b6a859bd9dc4321097b1a4f6aa7fdf71b4f9227b9e3ebffb4422 \
--hash=sha256:13af315c3a0eec8bb8b8d80b8b128cb3fcd17d7e4edafc39647846345a3f003a \
--hash=sha256:1926fd8de0acb9d15ee757175ce7242e235482a783cd4ec711cc999fc103c24e \
--hash=sha256:226f2f9b8e128a6ca6a9af9b9e8384f7b53a801907425c9a292553a3a7218ce0 \
--hash=sha256:276de6273eb074a35bc598f8efbc00c7869c5cf2e29c90748fccc8c898c244df \
--hash=sha256:318f73f5484b5671f0c7f5f63741ab020a599504ed81d209b5c7129ee4667407 \
--hash=sha256:3bbce1be3687bbfebe29abdb7631b83e6b25da3f4e1856a1611eb21854b689ea \
--hash=sha256:42ae4781333e331a1743445931b08ebdad73e188fd554259e772556fc4937c48 \
--hash=sha256:58be56ae0529c664cc04a9c76e68bb92b091e0194d6e3c50bea7e0f266f73713 \
--hash=sha256:5da2c81575cc3ccf05d9830f9e8d3c70954819ca9a63828210498c0774fda1a3 \
--hash=sha256:6311853aa2bba4064d0c28ca54e7b50c4d48e3de04f6770f6c60ebda1e975267 \
--hash=sha256:650e2917660e696041ab3dcd7abac160b4121cd9a484c08406f24c5964099829 \
--hash=sha256:6a4db36f9721fdf391646685ecffa404eb986cbe007a3289499020daf72e88a2 \
--hash=sha256:779cbf1ce375b96111db98fca913c1f5ec11b1d870e529b1dc7354b2681a8c3a \
--hash=sha256:7f6fe42536d9dcd3e2ffb9d3053f5d05221ae3bbcefbe472bdf2c71c793e3183 \
--hash=sha256:891f712ce54e0d631370e1f4997b3f182f3368179198efc30d477c75d1f44942 \
--hash=sha256:95c68a4b9b7828ba0428f8f7e3109c5d476ca44996ed9a5f8aac6269296e2d59 \
--hash=sha256:96a8918a78d5d64e07c8ea4ed2bc44354e3f93f46a4866a40e8db934e4c0d74b \
--hash=sha256:9c3cf890c3c0ecfe1510a452a165431b5831e24160c5fcf2071f0f85ca5a47cd \
--hash=sha256:9f58099ad7affc0754ae42e6d87443299f15d739b0ce03c76f515153a5cda06c \
--hash=sha256:a0b9e622c3b2b8d0ce32f77eba617ab0d6768b82836391e4f8f9e2074582bf02 \
--hash=sha256:a7f9cbea4245ee36190f85fe1814e2d7b1e5f2186381b082f5d59f99b7f11328 \
--hash=sha256:bab4aebd525218bab4ee615786c4581952eadc16b1ff031813a2fd51f0cc7b08 \
--hash=sha256:c124b8c8779bf2d35d9b721e52d4adb41c9bfbde45e6a3f25f0820caa9aba73f \
--hash=sha256:c9da0a39b53d2fab3e5467329ed50e951eb91386e9d0d5b12daf593973c3b168 \
--hash=sha256:ca60076c388728d3b6ac3846842474f4250c91efbfe5afa872d3ffd69dd4b318 \
--hash=sha256:cb6994fff247987c66a8a4e550ef374671c2b82e3c0d2115e689d21e511a652d \
--hash=sha256:d1c1d6236feab51200272d79b3d3e0f12cf2cbb12b208c835b175a21efdb0a73 \
--hash=sha256:dd7760a88a8d3d705ff562aa93f8445ead54f58fd482e4f9e2bafb7e177375d4 \
--hash=sha256:dda4d8a3bb0b50f540f6ff4b6033f3a74e8bf0bd5320b70fab2c03e512a62812 \
--hash=sha256:e0f1ff55dde0ebcfbef027edc21f71c205845585fffe30d4ec4979416613e9b3 \
--hash=sha256:e7a539b9be7b9c00f11ef16b55486141bc2cdb0c54762f84e3c6fc091917436d \
--hash=sha256:eb0b14523758e37802f27b7f8cd973f5f3d33be7613952c0df904b68c4842f0e \
--hash=sha256:ed447680ff21c14aaceb6a9f99a5f639f583ccfe4ce1a5e1d48eb41c3d6b3217 \
--hash=sha256:f52a4ad2568314ee713715b1e2d79ab55fab11e8b304fd1462ff5cccf4264b3e \
--hash=sha256:fbd60c6aaa07c31d7754edbc2334aef50601b7f1ada67a96eb1eb57c7c72378f \
--hash=sha256:fc28e0db232c62ca0c3600884933178f0825c99be4474cdd645e378a10588125 \
--hash=sha256:fe31de3002e7b08eb20823b3735b97c86c5926dd0581c7710a680b418a8709d4 \
--hash=sha256:fec221a051150eeddfdfcff162e6db92c65ecf46cb0f7bb1bf812a1520ec026b \
--hash=sha256:ff71073ebf0e42258a42a0b34f2c09ec384977e7f6808999102eedd5b49920e3
# via
# -r requirements.in
# google-resumable-media
google-resumable-media==2.3.0 \
--hash=sha256:1a7dce5790b04518edc02c2ce33965556660d64957106d66a945086e2b642572 \
--hash=sha256:36dc2f7201ee1cb360ef502187aa4e1f2b6ec4467fcee92e08a8cf165e36f587
# via
# -r requirements.in
# google-cloud-bigquery
googleapis-common-protos==1.55.0 \
--hash=sha256:183bb0356bd614c4330ad5158bc1c1bcf9bcf7f5e7f911317559fe209496eeee \
--hash=sha256:53eb313064738f45d5ac634155ae208e121c963659627b90dfcb61ef514c03e1
# via
# -r requirements.in
# google-api-core
# grpcio-status
grpcio==1.44.0 \
--hash=sha256:05467acd391e3fffb05991c76cb2ed2fa1309d0e3815ac379764bc5670b4b5d4 \
--hash=sha256:0ac72d4b953b76924f8fa21436af060d7e6d8581e279863f30ee14f20751ac27 \
--hash=sha256:11f811c0fffd84fca747fbc742464575e5eb130fd4fb4d6012ccc34febd001db \
--hash=sha256:13343e7b840c20f43b44f0e6d3bbdc037c964f0aec9735d7cb685c407731c9ff \
--hash=sha256:14eefcf623890f3f7dd7831decd2a2116652b5ce1e0f1d4b464b8f52110743b0 \
--hash=sha256:19e54f0c7083c8332b5a75a9081fc5127f1dbb67b6c1a32bd7fe896ef0934918 \
--hash=sha256:36a7bdd6ef9bca050c7ade8cba5f0e743343ea0756d5d3d520e915098a9dc503 \
--hash=sha256:3d47553b8e86ab1e59b0185ba6491a187f94a0239f414c8fc867a22b0405b798 \
--hash=sha256:41036a574cab3468f24d41d6ed2b52588fb85ed60f8feaa925d7e424a250740b \
--hash=sha256:4201c597e5057a9bfef9ea5777a6d83f6252cb78044db7d57d941ec2300734a5 \
--hash=sha256:46d4843192e7d36278884282e100b8f305cf37d1b3d8c6b4f736d4454640a069 \
--hash=sha256:4bae1c99896045d3062ab95478411c8d5a52cb84b91a1517312629fa6cfeb50e \
--hash=sha256:4ee51964edfd0a1293a95bb0d72d134ecf889379d90d2612cbf663623ce832b4 \
--hash=sha256:4fcb53e4eb8c271032c91b8981df5fc1bb974bc73e306ec2c27da41bd95c44b5 \
--hash=sha256:5c30a9a7d3a05920368a60b080cbbeaf06335303be23ac244034c71c03a0fd24 \
--hash=sha256:5f3c54ebb5d9633a557335c01d88d3d4928e9b1b131692283b6184da1edbec0b \
--hash=sha256:6641a28cc826a92ef717201cca9a035c34a0185e38b0c93f3ce5f01a01a1570a \
--hash=sha256:790d7493337558ae168477d1be3178f4c9b8f91d8cd9b8b719d06fd9b2d48836 \
--hash=sha256:871078218fa9117e2a378678f327e32fda04e363ed6bc0477275444273255d4d \
--hash=sha256:898c159148f27e23c08a337fb80d31ece6b76bb24f359d83929460d813665b74 \
--hash=sha256:89b390b1c0de909965280d175c53128ce2f0f4f5c0f011382243dd7f2f894060 \
--hash=sha256:8fa6584046a7cf281649975a363673fa5d9c6faf9dc923f261cc0e56713b5892 \
--hash=sha256:9075c0c003c1ff14ebce8f0ba55cc692158cb55c68da09cf8b0f9fc5b749e343 \
--hash=sha256:9a86a91201f8345502ea81dee0a55ae13add5fafadf109b17acd858fe8239651 \
--hash=sha256:a8d610b7b557a7609fecee80b6dd793ecb7a9a3c3497fbdce63ce7d151cdd705 \
--hash=sha256:b81dc7894062ed2d25b74a2725aaa0a6895ce97ce854f432fe4e87cad5a07316 \
--hash=sha256:b8d852329336c584c636caa9c2db990f3a332b19bc86a80f4646b58d27c142db \
--hash=sha256:be857b7ec2ac43455156e6ba89262f7d7ae60227049427d01a3fecd218a3f88d \
--hash=sha256:bebe90b8020b4248e5a2076b56154cc6ff45691bbbe980579fc9db26717ac968 \
--hash=sha256:bfd36b959c3c4e945119387baed1414ea46f7116886aa23de0172302b49d7ff1 \
--hash=sha256:c122dac5cb299b8ad7308d61bd9fe0413de13b0347cce465398436b3fdf1f609 \
--hash=sha256:c5c2f8417d13386e18ccc8c61467cb6a6f9667a1ff7000a2d7d378e5d7df693f \
--hash=sha256:ccd388b8f37b19d06e4152189726ce309e36dc03b53f2216a4ea49f09a7438e6 \
--hash=sha256:cd61b52d9cf8fcf8d9628c0b640b9e44fdc5e93d989cc268086a858540ed370c \
--hash=sha256:cf220199b7b4992729ad4d55d5d3f652f4ccfe1a35b5eacdbecf189c245e1859 \
--hash=sha256:d1e22d3a510438b7f3365c0071b810672d09febac6e8ca8a47eab657ae5f347b \
--hash=sha256:d2ec124a986093e26420a5fb10fa3f02b2c232f924cdd7b844ddf7e846c020cd \
--hash=sha256:dc3290d0411ddd2bd49adba5793223de8de8b01588d45e9376f1a9f7d25414f4 \
--hash=sha256:e2149077d71e060678130644670389ddf1491200bcea16c5560d4ccdc65e3f2e \
--hash=sha256:e2de61005118ae59d48d5d749283ebfd1ba4ca68cc1000f8a395cd2bdcff7ceb \
--hash=sha256:e50ddea6de76c09b656df4b5a55ae222e2a56e625c44250e501ff3c904113ec1 \
--hash=sha256:e898194f76212facbaeb6d7545debff29351afa23b53ff8f0834d66611af5139 \
--hash=sha256:f6a9cf0e77f72f2ac30c9c6e086bc7446c984c51bebc6c7f50fbcd718037edba \
--hash=sha256:fdb0a3e0e64843441793923d9532a3a23907b07b2a1e0a7a31f186dc185bb772
# via
# -r requirements.in
# google-api-core
# google-cloud-bigquery
# grpcio-status
grpcio-status==1.44.0 \
--hash=sha256:ac613ab7a45380cbfa3e529022d0b37317d858f172ba6e65c188aa7355539398 \
--hash=sha256:caf831c1fdcafeb3f48f7f2500e6ffb0c755120354a302f8695b698b0a2faace
# via
# -r requirements.in
# google-api-core
idna==3.3 \
--hash=sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff \
--hash=sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d
# via
# -r requirements.in
# requests
importlib-resources==5.4.0 \
--hash=sha256:33a95faed5fc19b4bc16b29a6eeae248a3fe69dd55d4d229d2b480e23eeaad45 \
--hash=sha256:d756e2f85dd4de2ba89be0b21dba2a3bbec2e871a42a3a16719258a11f87506b
# via
# -r requirements.in
# jsonschema
iniconfig==1.1.1 \
--hash=sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3 \
--hash=sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32
# via
# -r requirements.in
# pytest
isort==5.10.1 \
--hash=sha256:6f62d78e2f89b4500b080fe3a81690850cd254227f27f75c3a0c491a1f351ba7 \
--hash=sha256:e8443a5e7a020e9d7f97f1d7d9cd17c88bcb3bc7e218bf9cf5095fe550be2951
# via -r requirements.in
jinja2==3.0.3 \
--hash=sha256:077ce6014f7b40d03b47d1f1ca4b0fc8328a692bd284016f806ed0eaca390ad8 \
--hash=sha256:611bb273cd68f3b993fabdc4064fc858c5b47a973cb5aa7999ec1ba405c87cd7
# via -r requirements.in
jsonschema==4.4.0 \
--hash=sha256:636694eb41b3535ed608fe04129f26542b59ed99808b4f688aa32dcf55317a83 \
--hash=sha256:77281a1f71684953ee8b3d488371b162419767973789272434bbc3f29d9c8823
# via -r requirements.in
markupsafe==2.1.0 \
--hash=sha256:023af8c54fe63530545f70dd2a2a7eed18d07a9a77b94e8bf1e2ff7f252db9a3 \
--hash=sha256:09c86c9643cceb1d87ca08cdc30160d1b7ab49a8a21564868921959bd16441b8 \
--hash=sha256:142119fb14a1ef6d758912b25c4e803c3ff66920635c44078666fe7cc3f8f759 \
--hash=sha256:1d1fb9b2eec3c9714dd936860850300b51dbaa37404209c8d4cb66547884b7ed \
--hash=sha256:204730fd5fe2fe3b1e9ccadb2bd18ba8712b111dcabce185af0b3b5285a7c989 \
--hash=sha256:24c3be29abb6b34052fd26fc7a8e0a49b1ee9d282e3665e8ad09a0a68faee5b3 \
--hash=sha256:290b02bab3c9e216da57c1d11d2ba73a9f73a614bbdcc027d299a60cdfabb11a \
--hash=sha256:3028252424c72b2602a323f70fbf50aa80a5d3aa616ea6add4ba21ae9cc9da4c \
--hash=sha256:30c653fde75a6e5eb814d2a0a89378f83d1d3f502ab710904ee585c38888816c \
--hash=sha256:3cace1837bc84e63b3fd2dfce37f08f8c18aeb81ef5cf6bb9b51f625cb4e6cd8 \
--hash=sha256:4056f752015dfa9828dce3140dbadd543b555afb3252507348c493def166d454 \
--hash=sha256:454ffc1cbb75227d15667c09f164a0099159da0c1f3d2636aa648f12675491ad \
--hash=sha256:598b65d74615c021423bd45c2bc5e9b59539c875a9bdb7e5f2a6b92dfcfc268d \
--hash=sha256:599941da468f2cf22bf90a84f6e2a65524e87be2fce844f96f2dd9a6c9d1e635 \
--hash=sha256:5ddea4c352a488b5e1069069f2f501006b1a4362cb906bee9a193ef1245a7a61 \
--hash=sha256:62c0285e91414f5c8f621a17b69fc0088394ccdaa961ef469e833dbff64bd5ea \
--hash=sha256:679cbb78914ab212c49c67ba2c7396dc599a8479de51b9a87b174700abd9ea49 \
--hash=sha256:6e104c0c2b4cd765b4e83909cde7ec61a1e313f8a75775897db321450e928cce \
--hash=sha256:736895a020e31b428b3382a7887bfea96102c529530299f426bf2e636aacec9e \
--hash=sha256:75bb36f134883fdbe13d8e63b8675f5f12b80bb6627f7714c7d6c5becf22719f \
--hash=sha256:7d2f5d97fcbd004c03df8d8fe2b973fe2b14e7bfeb2cfa012eaa8759ce9a762f \
--hash=sha256:80beaf63ddfbc64a0452b841d8036ca0611e049650e20afcb882f5d3c266d65f \
--hash=sha256:84ad5e29bf8bab3ad70fd707d3c05524862bddc54dc040982b0dbcff36481de7 \
--hash=sha256:8da5924cb1f9064589767b0f3fc39d03e3d0fb5aa29e0cb21d43106519bd624a \
--hash=sha256:961eb86e5be7d0973789f30ebcf6caab60b844203f4396ece27310295a6082c7 \
--hash=sha256:96de1932237abe0a13ba68b63e94113678c379dca45afa040a17b6e1ad7ed076 \
--hash=sha256:a0a0abef2ca47b33fb615b491ce31b055ef2430de52c5b3fb19a4042dbc5cadb \
--hash=sha256:b2a5a856019d2833c56a3dcac1b80fe795c95f401818ea963594b345929dffa7 \
--hash=sha256:b8811d48078d1cf2a6863dafb896e68406c5f513048451cd2ded0473133473c7 \
--hash=sha256:c532d5ab79be0199fa2658e24a02fce8542df196e60665dd322409a03db6a52c \
--hash=sha256:d3b64c65328cb4cd252c94f83e66e3d7acf8891e60ebf588d7b493a55a1dbf26 \
--hash=sha256:d4e702eea4a2903441f2735799d217f4ac1b55f7d8ad96ab7d4e25417cb0827c \
--hash=sha256:d5653619b3eb5cbd35bfba3c12d575db2a74d15e0e1c08bf1db788069d410ce8 \
--hash=sha256:d66624f04de4af8bbf1c7f21cc06649c1c69a7f84109179add573ce35e46d448 \
--hash=sha256:e67ec74fada3841b8c5f4c4f197bea916025cb9aa3fe5abf7d52b655d042f956 \
--hash=sha256:e6f7f3f41faffaea6596da86ecc2389672fa949bd035251eab26dc6697451d05 \
--hash=sha256:f02cf7221d5cd915d7fa58ab64f7ee6dd0f6cddbb48683debf5d04ae9b1c2cc1 \
--hash=sha256:f0eddfcabd6936558ec020130f932d479930581171368fd728efcfb6ef0dd357 \
--hash=sha256:fabbe18087c3d33c5824cb145ffca52eccd053061df1d79d4b66dafa5ad2a5ea \
--hash=sha256:fc3150f85e2dbcf99e65238c842d1cfe69d3e7649b19864c1cc043213d9cd730
# via
# -r requirements.in
# jinja2
mccabe==0.6.1 \
--hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 \
--hash=sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f
# via
# -r requirements.in
# flake8
mypy==0.931 \
--hash=sha256:0038b21890867793581e4cb0d810829f5fd4441aa75796b53033af3aa30430ce \
--hash=sha256:1171f2e0859cfff2d366da2c7092b06130f232c636a3f7301e3feb8b41f6377d \
--hash=sha256:1b06268df7eb53a8feea99cbfff77a6e2b205e70bf31743e786678ef87ee8069 \
--hash=sha256:1b65714dc296a7991000b6ee59a35b3f550e0073411ac9d3202f6516621ba66c \
--hash=sha256:1bf752559797c897cdd2c65f7b60c2b6969ffe458417b8d947b8340cc9cec08d \
--hash=sha256:300717a07ad09525401a508ef5d105e6b56646f7942eb92715a1c8d610149714 \
--hash=sha256:3c5b42d0815e15518b1f0990cff7a705805961613e701db60387e6fb663fe78a \
--hash=sha256:4365c60266b95a3f216a3047f1d8e3f895da6c7402e9e1ddfab96393122cc58d \
--hash=sha256:50c7346a46dc76a4ed88f3277d4959de8a2bd0a0fa47fa87a4cde36fe247ac05 \
--hash=sha256:5b56154f8c09427bae082b32275a21f500b24d93c88d69a5e82f3978018a0266 \
--hash=sha256:74f7eccbfd436abe9c352ad9fb65872cc0f1f0a868e9d9c44db0893440f0c697 \
--hash=sha256:7b3f6f557ba4afc7f2ce6d3215d5db279bcf120b3cfd0add20a5d4f4abdae5bc \
--hash=sha256:8c11003aaeaf7cc2d0f1bc101c1cc9454ec4cc9cb825aef3cafff8a5fdf4c799 \
--hash=sha256:8ca7f8c4b1584d63c9a0f827c37ba7a47226c19a23a753d52e5b5eddb201afcd \
--hash=sha256:c89702cac5b302f0c5d33b172d2b55b5df2bede3344a2fbed99ff96bddb2cf00 \
--hash=sha256:d8f1ff62f7a879c9fe5917b3f9eb93a79b78aad47b533911b853a757223f72e7 \
--hash=sha256:d9d2b84b2007cea426e327d2483238f040c49405a6bf4074f605f0156c91a47a \
--hash=sha256:e839191b8da5b4e5d805f940537efcaa13ea5dd98418f06dc585d2891d228cf0 \
--hash=sha256:f9fe20d0872b26c4bba1c1be02c5340de1019530302cf2dcc85c7f9fc3252ae0 \
--hash=sha256:ff3bf387c14c805ab1388185dd22d6b210824e164d4bb324b195ff34e322d166
# via -r requirements.in
mypy-extensions==0.4.3 \
--hash=sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d \
--hash=sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8
# via
# -r requirements.in
# black
# mypy
packaging==21.3 \
--hash=sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb \
--hash=sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522
# via
# -r requirements.in
# google-cloud-bigquery
# pytest
pathspec==0.9.0 \
--hash=sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a \
--hash=sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1
# via
# -r requirements.in
# black
platformdirs==2.5.1 \
--hash=sha256:7535e70dfa32e84d4b34996ea99c5e432fa29a708d0f4e394bbcb2a8faa4f16d \
--hash=sha256:bcae7cab893c2d310a711b70b24efb93334febe65f8de776ee320b517471e227
# via
# -r requirements.in
# black
pluggy==1.0.0 \
--hash=sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159 \
--hash=sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3
# via
# -r requirements.in
# pytest
proto-plus==1.20.3 \
--hash=sha256:b06be21c3848fbc20387d1d6891a9b97dfa1cdd0f10d3d42ef70b5700ec0f423 \
--hash=sha256:f28b225bc9e6c14e206fb7f8e996a46fb2ccd902648e512d496abb6a716a4ae5
# via
# -r requirements.in
# google-cloud-bigquery
protobuf==3.19.4 \
--hash=sha256:072fbc78d705d3edc7ccac58a62c4c8e0cec856987da7df8aca86e647be4e35c \
--hash=sha256:09297b7972da685ce269ec52af761743714996b4381c085205914c41fcab59fb \
--hash=sha256:16f519de1313f1b7139ad70772e7db515b1420d208cb16c6d7858ea989fc64a9 \
--hash=sha256:1c91ef4110fdd2c590effb5dca8fdbdcb3bf563eece99287019c4204f53d81a4 \
--hash=sha256:3112b58aac3bac9c8be2b60a9daf6b558ca3f7681c130dcdd788ade7c9ffbdca \
--hash=sha256:36cecbabbda242915529b8ff364f2263cd4de7c46bbe361418b5ed859677ba58 \
--hash=sha256:4276cdec4447bd5015453e41bdc0c0c1234eda08420b7c9a18b8d647add51e4b \
--hash=sha256:435bb78b37fc386f9275a7035fe4fb1364484e38980d0dd91bc834a02c5ec909 \
--hash=sha256:48ed3877fa43e22bcacc852ca76d4775741f9709dd9575881a373bd3e85e54b2 \
--hash=sha256:54a1473077f3b616779ce31f477351a45b4fef8c9fd7892d6d87e287a38df368 \
--hash=sha256:69da7d39e39942bd52848438462674c463e23963a1fdaa84d88df7fbd7e749b2 \
--hash=sha256:6cbc312be5e71869d9d5ea25147cdf652a6781cf4d906497ca7690b7b9b5df13 \
--hash=sha256:7bb03bc2873a2842e5ebb4801f5c7ff1bfbdf426f85d0172f7644fcda0671ae0 \
--hash=sha256:7ca7da9c339ca8890d66958f5462beabd611eca6c958691a8fe6eccbd1eb0c6e \
--hash=sha256:835a9c949dc193953c319603b2961c5c8f4327957fe23d914ca80d982665e8ee \
--hash=sha256:84123274d982b9e248a143dadd1b9815049f4477dc783bf84efe6250eb4b836a \
--hash=sha256:8961c3a78ebfcd000920c9060a262f082f29838682b1f7201889300c1fbe0616 \
--hash=sha256:96bd766831596d6014ca88d86dc8fe0fb2e428c0b02432fd9db3943202bf8c5e \
--hash=sha256:9df0c10adf3e83015ced42a9a7bd64e13d06c4cf45c340d2c63020ea04499d0a \
--hash=sha256:b38057450a0c566cbd04890a40edf916db890f2818e8682221611d78dc32ae26 \
--hash=sha256:bd95d1dfb9c4f4563e6093a9aa19d9c186bf98fa54da5252531cc0d3a07977e7 \
--hash=sha256:c1068287025f8ea025103e37d62ffd63fec8e9e636246b89c341aeda8a67c934 \
--hash=sha256:c438268eebb8cf039552897d78f402d734a404f1360592fef55297285f7f953f \
--hash=sha256:cdc076c03381f5c1d9bb1abdcc5503d9ca8b53cf0a9d31a9f6754ec9e6c8af0f \
--hash=sha256:f358aa33e03b7a84e0d91270a4d4d8f5df6921abe99a377828839e8ed0c04e07 \
--hash=sha256:f51d5a9f137f7a2cec2d326a74b6e3fc79d635d69ffe1b036d39fc7d75430d37
# via
# -r requirements.in
# google-api-core
# google-cloud-bigquery
# googleapis-common-protos
# grpcio-status
# proto-plus
py==1.11.0 \
--hash=sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719 \
--hash=sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378
# via
# -r requirements.in
# pytest
pyasn1==0.4.8 \
--hash=sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d \
--hash=sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba
# via
# -r requirements.in
# pyasn1-modules
# rsa
pyasn1-modules==0.2.8 \
--hash=sha256:905f84c712230b2c592c19470d3ca8d552de726050d1d1716282a1f6146be65e \
--hash=sha256:a50b808ffeb97cb3601dd25981f6b016cbb3d31fbf57a8b8a87428e6158d0c74
# via
# -r requirements.in
# google-auth
pycodestyle==2.8.0 \
--hash=sha256:720f8b39dde8b293825e7ff02c475f3077124006db4f440dcbc9a20b76548a20 \
--hash=sha256:eddd5847ef438ea1c7870ca7eb78a9d47ce0cdb4851a5523949f2601d0cbbe7f
# via
# -r requirements.in
# flake8
pyflakes==2.4.0 \
--hash=sha256:05a85c2872edf37a4ed30b0cce2f6093e1d0581f8c19d7393122da7e25b2b24c \
--hash=sha256:3bb3a3f256f4b7968c9c788781e4ff07dce46bdf12339dcda61053375426ee2e
# via
# -r requirements.in
# flake8
pyparsing==3.0.7 \
--hash=sha256:18ee9022775d270c55187733956460083db60b37d0d0fb357445f3094eed3eea \
--hash=sha256:a6c06a88f252e6c322f65faf8f418b16213b51bdfaece0524c1c1bc30c63c484
# via
# -r requirements.in
# packaging
pyrsistent==0.18.1 \
--hash=sha256:0e3e1fcc45199df76053026a51cc59ab2ea3fc7c094c6627e93b7b44cdae2c8c \
--hash=sha256:1b34eedd6812bf4d33814fca1b66005805d3640ce53140ab8bbb1e2651b0d9bc \
--hash=sha256:4ed6784ceac462a7d6fcb7e9b663e93b9a6fb373b7f43594f9ff68875788e01e \
--hash=sha256:5d45866ececf4a5fff8742c25722da6d4c9e180daa7b405dc0a2a2790d668c26 \
--hash=sha256:636ce2dc235046ccd3d8c56a7ad54e99d5c1cd0ef07d9ae847306c91d11b5fec \
--hash=sha256:6455fc599df93d1f60e1c5c4fe471499f08d190d57eca040c0ea182301321286 \
--hash=sha256:6bc66318fb7ee012071b2792024564973ecc80e9522842eb4e17743604b5e045 \
--hash=sha256:7bfe2388663fd18bd8ce7db2c91c7400bf3e1a9e8bd7d63bf7e77d39051b85ec \
--hash=sha256:7ec335fc998faa4febe75cc5268a9eac0478b3f681602c1f27befaf2a1abe1d8 \
--hash=sha256:914474c9f1d93080338ace89cb2acee74f4f666fb0424896fcfb8d86058bf17c \
--hash=sha256:b568f35ad53a7b07ed9b1b2bae09eb15cdd671a5ba5d2c66caee40dbf91c68ca \
--hash=sha256:cdfd2c361b8a8e5d9499b9082b501c452ade8bbf42aef97ea04854f4a3f43b22 \
--hash=sha256:d1b96547410f76078eaf66d282ddca2e4baae8964364abb4f4dcdde855cd123a \
--hash=sha256:d4d61f8b993a7255ba714df3aca52700f8125289f84f704cf80916517c46eb96 \
--hash=sha256:d7a096646eab884bf8bed965bad63ea327e0d0c38989fc83c5ea7b8a87037bfc \
--hash=sha256:df46c854f490f81210870e509818b729db4488e1f30f2a1ce1698b2295a878d1 \
--hash=sha256:e24a828f57e0c337c8d8bb9f6b12f09dfdf0273da25fda9e314f0b684b415a07 \
--hash=sha256:e4f3149fd5eb9b285d6bfb54d2e5173f6a116fe19172686797c056672689daf6 \
--hash=sha256:e92a52c166426efbe0d1ec1332ee9119b6d32fc1f0bbfd55d5c1088070e7fc1b \
--hash=sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5 \
--hash=sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6
# via
# -r requirements.in
# jsonschema
pytest==7.0.1 \
--hash=sha256:9ce3ff477af913ecf6321fe337b93a2c0dcf2a0a1439c43f5452112c1e4280db \
--hash=sha256:e30905a0c131d3d94b89624a1cc5afec3e0ba2fbdb151867d8e0ebd49850f171
# via
# -r requirements.in
# pytest-black
# pytest-cov
# pytest-flake8
pytest-black==0.3.12 \
--hash=sha256:1d339b004f764d6cd0f06e690f6dd748df3d62e6fe1a692d6a5500ac2c5b75a5
# via -r requirements.in
pytest-cov==3.0.0 \
--hash=sha256:578d5d15ac4a25e5f961c938b85a05b09fdaae9deef3bb6de9a6e766622ca7a6 \
--hash=sha256:e7f0f5b1617d2210a2cabc266dfe2f4c75a8d32fb89eafb7ad9d06f6d076d470
# via -r requirements.in
pytest-flake8==1.0.7 \
--hash=sha256:c28cf23e7d359753c896745fd4ba859495d02e16c84bac36caa8b1eec58f5bc1 \
--hash=sha256:f0259761a903563f33d6f099914afef339c085085e643bee8343eb323b32dd6b
# via -r requirements.in
python-dateutil==2.8.2 \
--hash=sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86 \
--hash=sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9
# via
# -r requirements.in
# google-cloud-bigquery
pytz==2021.3 \
--hash=sha256:3672058bc3453457b622aab7a1c3bfd5ab0bdae451512f6cf25f64ed37f5b87c \
--hash=sha256:acad2d8b20a1af07d4e4c9d2e9285c5ed9104354062f275f3fcd88dcef4f1326
# via -r requirements.in
requests==2.27.1 \
--hash=sha256:68d7c56fd5a8999887728ef304a6d12edc7be74f1cfa47714fc8b414525c9a61 \
--hash=sha256:f22fa1e554c9ddfd16e6e41ac79759e17be9e492b3587efa038054674760e72d
# via
# -r requirements.in
# google-api-core
# google-cloud-bigquery
rsa==4.8 \
--hash=sha256:5c6bd9dc7a543b7fe4304a631f8a8a3b674e2bbfc49c2ae96200cdbe55df6b17 \
--hash=sha256:95c5d300c4e879ee69708c428ba566c59478fd653cc3a22243eeb8ed846950bb
# via
# -r requirements.in
# google-auth
six==1.16.0 \
--hash=sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926 \
--hash=sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254
# via
# -r requirements.in
# google-auth
# grpcio
# python-dateutil
smmap==5.0.0 \
--hash=sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94 \
--hash=sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936
# via
# -r requirements.in
# gitdb
toml==0.10.2 \
--hash=sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b \
--hash=sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f
# via
# -r requirements.in
# pytest-black
tomli==2.0.1 \
--hash=sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc \
--hash=sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f
# via
# -r requirements.in
# black
# coverage
# mypy
# pytest
types-futures==3.3.8 \
--hash=sha256:6fe8ccc2c2af7ef2fdd9bf73eab6d617074f09f30ad7d373510b4043d39c42de \
--hash=sha256:d6e97ec51d56b96debfbf1dea32ebec22c1687f16d2547ea0a34b48db45df205
# via -r requirements.in
types-pkg-resources==0.1.3 \
--hash=sha256:0cb9972cee992249f93fff1a491bf2dc3ce674e5a1926e27d4f0866f7d9b6d9c \
--hash=sha256:834a9b8d3dbea343562fd99d5d3359a726f6bf9d3733bccd2b4f3096fbab9dae
# via -r requirements.in
types-protobuf==3.19.12 \
--hash=sha256:130bf00e688ae9376682ac8f3d6610be797d8a6072f102f530ca1d579b52e54b \
--hash=sha256:b022247347471219acbe2fe68d2f3623d788252e99e9b97358b7ab904d268a78
# via -r requirements.in
types-pytz==2021.3.5 \
--hash=sha256:8831f689379ac9e2a62668157381379ed74b3702980e08e71f8673c179c4e3c7 \
--hash=sha256:fef8de238ee95135952229a2a23bfb87bd63d5a6c8598106a46cfcf48f069ea8
# via -r requirements.in
types-pyyaml==6.0.4 \
--hash=sha256:6252f62d785e730e454dfa0c9f0fb99d8dae254c5c3c686903cf878ea27c04b7 \
--hash=sha256:693b01c713464a6851f36ff41077f8adbc6e355eda929addfb4a97208aea9b4b
# via -r requirements.in
types-requests==2.27.11 \
--hash=sha256:506279bad570c7b4b19ac1f22e50146538befbe0c133b2cea66a9b04a533a859 \
--hash=sha256:6a7ed24b21780af4a5b5e24c310b2cd885fb612df5fd95584d03d87e5f2a195a
# via -r requirements.in
types-six==1.16.11 \
--hash=sha256:12967c3c1ed9bc82e1fb371bbc7b9edc5aa42741bd836cd77987d20f60dff90d \
--hash=sha256:b09ce896e543cb6b9376bbdd3cd536036dfa3d3d41f77abe3ecbd3ef791309c0
# via -r requirements.in
types-toml==0.10.4 \
--hash=sha256:4a9ffd47bbcec49c6fde6351a889b2c1bd3c0ef309fa0eed60dc28e58c8b9ea6 \
--hash=sha256:9340e7c1587715581bb13905b3af30b79fe68afaccfca377665d5e63b694129a
# via -r requirements.in
types-urllib3==1.26.10 \
--hash=sha256:a26898f530e6c3f43f25b907f2b884486868ffd56a9faa94cbf9b3eb6e165d6a \
--hash=sha256:d755278d5ecd7a7a6479a190e54230f241f1a99c19b81518b756b19dc69e518c
# via
# -r requirements.in
# types-requests
typing-extensions==4.1.1 \
--hash=sha256:1a9462dcc3347a79b1f1c0271fbe79e844580bb598bafa1ed208b94da3cdcd42 \
--hash=sha256:21c85e0fe4b9a155d0799430b0ad741cdce7e359660ccbd8b530613e8df88ce2
# via
# -r requirements.in
# black
# mypy
urllib3==1.26.8 \
--hash=sha256:000ca7f471a233c2251c6c7023ee85305721bfdf18621ebff4fd17a8653427ed \
--hash=sha256:0e7c33d9a63e7ddfcb86780aac87befc2fbddf46c58dbb487e0855f7ceec283c
# via
# -r requirements.in
# requests
zipp==3.7.0 \
--hash=sha256:9f50f446828eb9d45b267433fd3e9da8d801f614129124863f9c51ebceafb87d \
--hash=sha256:b47250dd24f92b7dd6a0a8fc5244da14608f3ca90a5efcd37a3b1642fac9a375
# via
# -r requirements.in
# importlib-resources

script/update_deps Executable file

@@ -0,0 +1,6 @@
#!/bin/bash
# Resolve this package's own dependencies (including the "testing" extra),
# drop any requirements pinned to local file:// paths, and rewrite the
# autogenerated "pip-compile ..." header comment to name this script.
pip-compile -o - - <<< '.[testing]' |
grep -v 'file://' |
sed 's/pip-compile.*/update_deps/' > requirements.in
# Re-resolve the cleaned input and pin every dependency with hashes.
pip-compile --generate-hashes -o requirements.txt requirements.in
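The `grep`/`sed` cleanup step in the script above can be sketched in Python to make the transformation explicit (`clean_requirements` is a hypothetical helper for illustration, not part of the repo):

```python
import re


def clean_requirements(text: str) -> str:
    # Mirror `grep -v 'file://'`: drop requirements resolved to local paths,
    # then mirror `sed 's/pip-compile.*/update_deps/'`: rewrite the
    # autogenerated "pip-compile ..." header comment to name the script.
    kept = []
    for line in text.splitlines():
        if "file://" in line:
            continue
        kept.append(re.sub(r"pip-compile.*", "update_deps", line))
    return "\n".join(kept)
```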

setup.py Normal file

@@ -0,0 +1,76 @@
from setuptools import setup


def text_from_file(path):
    with open(path, encoding="utf-8") as f:
        return f.read()


test_dependencies = [
"coverage",
"isort",
"jsonschema",
"pytest",
"pytest-black",
"pytest-cov",
"pytest-flake8",
"mypy",
"types-futures",
"types-pkg-resources",
"types-protobuf",
"types-pytz",
"types-PyYAML",
"types-requests",
"types-six",
"types-toml",
]

extras = {
    "testing": test_dependencies,
}

setup(
name="mozilla-opmon",
author="Mozilla Corporation",
author_email="fx-data-dev@mozilla.org",
description="Continuous monitoring of experiments and rollouts",
url="https://github.com/mozilla/opmon",
packages=[
"opmon",
"opmon.config",
"opmon.templates",
"opmon.tests",
"opmon.tests.integration",
],
package_data={
"opmon.config": ["*.toml"],
"opmon.templates": ["*.sql"],
"opmon.tests": ["data/*"],
"opmon": ["../*.toml"],
},
install_requires=[
"attrs",
"cattrs",
"Click",
"GitPython",
"google-cloud-bigquery",
"grpcio", # https://github.com/googleapis/google-cloud-python/issues/6259
"jinja2",
"pytz",
"requests",
"toml",
],
include_package_data=True,
tests_require=test_dependencies,
extras_require=extras,
long_description=text_from_file("README.md"),
long_description_content_type="text/markdown",
python_requires=">=3.8",
entry_points="""
[console_scripts]
opmon=opmon.cli:cli
""",
# This project does not issue releases, so this number is not meaningful
# and should not need to change.
version="2022.3.0",
)
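The `console_scripts` entry point above wires an `opmon` command on PATH to the `cli` callable in `opmon.cli`; the project pulls in Click for this per `install_requires`. As an illustration of the dispatch pattern only, here is a minimal stand-in using stdlib argparse in place of Click — the `validate` subcommand is a hypothetical example, not the project's actual interface:

```python
import argparse


def cli(argv=None):
    # Stand-in for opmon.cli:cli. When the package is installed, setuptools
    # generates a small `opmon` wrapper script that imports and calls this.
    parser = argparse.ArgumentParser(prog="opmon")
    sub = parser.add_subparsers(dest="command", required=True)
    validate = sub.add_parser("validate", help="check a project config file")
    validate.add_argument("config_file")
    return parser.parse_args(argv)
```

After `pip install -e .`, running `opmon validate my_project.toml` would invoke this function with those arguments.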

tox.ini Normal file

@@ -0,0 +1,28 @@
[tox]
envlist = py38, py38-mypy

[testenv]
deps = -rrequirements.txt
commands =
    flake8 opmon
    black --check opmon
    isort --check opmon
    pytest \
        {envsitepackagesdir}/opmon \
        --cov={envsitepackagesdir}/opmon \
        --cov-report=term \
        --cov-report=html \
        {posargs}
passenv = GOOGLE_APPLICATION_CREDENTIALS

[testenv:py38-integration]
commands = pytest --integration {envsitepackagesdir}/opmon {posargs}

[testenv:py38-mypy]
commands = mypy -p opmon

[flake8]
max_line_length=100

[coverage:run]
omit = */tests/*