* Upgrade Airflow from 2.7.3 to 2.8.2.
* Update CircleCI Docker orb from 2.2.0 to 2.5.0.
* Update Ruff config for version 0.2.
* Fix Dockerfile URL scheme to use HTTPS for the gcloud CLI.
* Restore `google-auth` extra, with its built-in dependency on `authlib`.
* Add missing Airflow state colors for "restarting", "deferred", and "removed".
* Explain the difference between our configured Airflow state colors and the defaults.
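For reference, Airflow reads these overrides from a `STATE_COLORS` mapping in `airflow_local_settings.py`; a minimal sketch (the color values below are illustrative placeholders, not our configured palette):

```python
# airflow_local_settings.py
# Airflow picks up STATE_COLORS from this module to override the default
# UI colors for task instance states.
STATE_COLORS = {
    # States the UI already colors, repeated here for completeness
    # (placeholder values, not our palette).
    "queued": "darkgray",
    "running": "lime",
    "success": "green",
    "failed": "red",
    # Previously missing states now given explicit colors.
    "restarting": "violet",
    "deferred": "mediumpurple",
    "removed": "lightgrey",
}
```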
---------
Co-authored-by: mikaeld <mducharme@mozilla.com>
* CI runtime decreased by 70-80%
* docker-compose setup time decreased significantly. **Local use now takes a few seconds instead of a few minutes.** Airflow Variables and Connections are loaded via the `import` CLI command, replacing the `bin/run` script that wrapped the Airflow CLI.
* Improved dev secrets security by dynamically generating a Fernet key in the Makefile (see the Fernet sketch after this list).
* A `.env` file is generated by `make up` and contains the UID and Fernet key; `docker-compose` automatically loads it into environment variables.
* Retire shell scripts in favor of built-in features of our stack:
* `bin/run` replaced by docker-compose and CI
* `bin/test-dag-tags` replaced by pytest unit tests
* `bin/test-parse` replaced by pytest unit tests (see the pytest sketch after this list)
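The Fernet key generation referenced above amounts to a one-liner; a sketch of what the Makefile can invoke, assuming the `cryptography` package Airflow already depends on:

```python
# Print a freshly generated Fernet key; the Makefile captures this output
# and writes it into the generated .env file so Airflow can encrypt
# Connections and Variables in the local dev database.
from cryptography.fernet import Fernet

print(Fernet.generate_key().decode())
```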
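The checks formerly done by `bin/test-parse` and `bin/test-dag-tags` translate to ordinary pytest tests over Airflow's `DagBag`; a minimal sketch (the "every DAG must have tags" rule is an assumed example of the policy, not necessarily the exact checks in the repo):

```python
import pytest
from airflow.models import DagBag

# Parse every DAG file once; any import error means a DAG failed to parse.
dag_bag = DagBag(include_examples=False)


def test_dags_parse_without_errors():
    assert not dag_bag.import_errors, f"DAG import failures: {dag_bag.import_errors}"


@pytest.mark.parametrize("dag_id", sorted(dag_bag.dag_ids))
def test_dag_has_tags(dag_id):
    # Assumed policy for illustration: every DAG declares at least one tag.
    assert dag_bag.get_dag(dag_id).tags, f"{dag_id} has no tags"
```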
* Bug 1572115 - Add dataproc version aggregates job that writes to a dev database
* Add BigQuery source and project ID
* Expose storage_bucket and artifact_bucket for local testing
* Add GOOGLE_APPLICATION_CREDENTIALS to docker-compose for local testing
* Update prerelease aggregates to run locally
* Minimize mozaggregator runner
* Use the default service account in production
* Use n1-standard-8 machines for the mozaggregator job
* Move telemetry_aggregates_parquet to Databricks
* Move telemetry aggregates to Databricks
* Update mobile aggregates to use Databricks
* Propagate DB_TOKEN properly
* Use python2 by default for mozaggregator jobs
* Increase the dev instance count for the prerelease aggregate view
Airflow has two issues:
- DAGs break if the connections they reference are undefined
- Environment connections are checked before the metadata database
To alleviate these issues we initially set the connections via the UI;
however, because the environment is consulted first, those UI-defined
connections never overrode the environment connections.
This change fixes the DAGs while still allowing connections
to be defined via the UI.
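A sketch of the resolution order behind this, written against the Airflow 2 API with a hypothetical connection id `gcp_conn`: because `AIRFLOW_CONN_*` environment variables are consulted before the metadata database, a connection edited in the UI is silently shadowed when the same id also exists in the environment.

```python
import os
from airflow.hooks.base import BaseHook

# Hypothetical connection id, defined both in the environment and (via the UI)
# in the metadata database.
os.environ["AIRFLOW_CONN_GCP_CONN"] = "google-cloud-platform://"

# Airflow checks the environment before the metadata DB, so this returns the
# env-var definition even though a UI-defined connection with the same id exists.
conn = BaseHook.get_connection("gcp_conn")
print(conn.conn_type)  # reflects the environment definition, not the UI edit
```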
* Add bgbb jobs to Airflow
* Update bgbb jobs
* Update the environment to use the external module runner
* Change the submission date to ISO 8601 format
* Add environment configuration to docker-compose for Databricks and GCP
* Update naming, comment on dependencies, and use ds_next