
Apache Airflow

[Badges: PyPI version, Build Status, Coverage Status, Documentation Status, License, Python Version, Twitter Follow, Slack Status]

Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.

When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
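
For example, a minimal DAG could look like the sketch below; this is an illustration only, and the DAG id, schedule, and commands are made up rather than taken from the Airflow documentation:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    # Default arguments applied to every task in the DAG (illustrative values).
    default_args = {
        "owner": "airflow",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    }

    # The DAG object groups tasks and defines when they run.
    with DAG(
        dag_id="example_hello",
        default_args=default_args,
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo extract")
        load = BashOperator(task_id="load", bash_command="echo load")

        # ">>" declares the dependency: extract must finish before load starts.
        extract >> load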

Table of contents

  • Requirements
  • Getting started
  • Installing from PyPI
  • Beyond the Horizon
  • Principles
  • User Interface
  • Using Hooks and Operators from "master" in Airflow 1.10
  • Contributing
  • Who uses Apache Airflow?
  • Who Maintains Apache Airflow?
  • Can I use the Apache Airflow logo in my presentation?

Requirements

Apache Airflow is tested with:

Master version (2.0.0dev)

  • Python versions: 3.6, 3.7
  • Postgres DB: 9.6, 10
  • MySQL DB: 5.7
  • SQLite - latest stable (used mainly for development purposes)

Stable version (1.10.9)

  • Python versions: 2.7, 3.5, 3.6, 3.7
  • Postgres DB: 9.6, 10
  • MySQL DB: 5.6, 5.7
  • SQLite - latest stable (used mainly for development purposes)

Additional notes on Python version requirements

  • The stable version requires at least Python 3.5.3 when using Python 3
  • Both versions are currently incompatible with Python 3.8 due to a known compatibility issue with a dependent library

Getting started

Please visit the Airflow Platform documentation (latest stable release) for help with installing Airflow, getting started, or walking through a more complete tutorial.

Documentation of GitHub master (latest development branch): ReadTheDocs Documentation

For further information, please visit the Airflow Wiki.

Official container (Docker) images for Apache Airflow are described in IMAGES.rst.

Installing from PyPI

Airflow is published as the apache-airflow package on PyPI. Installing it, however, can sometimes be tricky because Airflow is a bit of both a library and an application. Libraries usually keep their dependencies open and applications usually pin them, but we should do neither and both at the same time. We decided to keep our dependencies as open as possible (in setup.py) so users can install different versions of libraries if needed. This means that from time to time a plain pip install apache-airflow will not work or will produce an unusable Airflow installation.

To get a repeatable installation, however, starting from Airflow 1.10.10 we also keep a set of "known-to-be-working" requirement files in the requirements folder. Those "known-to-be-working" requirements are per major/minor Python version (3.6/3.7). You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify the correct Airflow and Python versions in the URL.

  1. Installing just Airflow:

     pip install apache-airflow==1.10.10 \
      --constraint https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt

  2. Installing with extras (for example postgres,gcp):

     pip install apache-airflow[postgres,gcp]==1.10.10 \
      --constraint https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt

Beyond the Horizon

Airflow is not a data streaming solution. Tasks do not move data from one to the other (though tasks can exchange metadata!). Airflow is not in the Spark Streaming or Storm space, it is more comparable to Oozie or Azkaban.

Workflows are expected to be mostly static or slowly changing. You can think of the structure of the tasks in your workflow as slightly more dynamic than a database structure would be. Airflow workflows are expected to look similar from one run to the next; this allows for clarity around the unit of work and continuity.

Principles

  • Dynamic: Airflow pipelines are configuration as code (Python), allowing for dynamic pipeline generation: you can write code that instantiates pipelines dynamically (see the sketch after this list).
  • Extensible: Easily define your own operators and executors, and extend the library so that it fits the level of abstraction that suits your environment.
  • Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine.
  • Scalable: Airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers.
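
As a small illustration of the Dynamic and Elegant principles, the sketch below generates one task per table in a plain Python loop and parameterizes the command with a Jinja template; the DAG id and table names are invented for the example:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    with DAG(
        dag_id="example_dynamic_tasks",
        start_date=datetime(2020, 1, 1),
        schedule_interval="@daily",
    ) as dag:
        # One task per table, generated dynamically from ordinary Python code.
        for table in ["users", "orders", "payments"]:
            BashOperator(
                task_id="export_" + table,
                # "{{ ds }}" is rendered by Airflow's Jinja templating engine
                # to the execution date of the DAG run.
                bash_command="echo exporting " + table + " for {{ ds }}",
            )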

User Interface

  • DAGs: Overview of all DAGs in your environment.

  • Tree View: Tree representation of a DAG that spans across time.

  • Graph View: Visualization of a DAG's dependencies and their current status for a specific run.

  • Task Duration: Total time spent on different tasks over time.

  • Gantt View: Duration and overlap of a DAG.

  • Code View: Quick way to view source code of a DAG.

Using Hooks and Operators from "master" in Airflow 1.10

Currently, stable versions of Apache Airflow are released in the 1.10.* series. We are working on the next major version of Airflow, the 2.0.* series, which is going to be released in 2020; the exact release date depends on many factors and is not yet known. We already have a lot of changes in the hooks/operators/sensors for many external systems, but they are not being used because they are only part of the master/2.0 release.

In Airflow 2.0, following AIP-21 ("change in import paths"), all the non-core operators/hooks/sensors of Apache Airflow have been moved to the "airflow.providers" package. This opened up the possibility of using the operators from Airflow 2.0 in Airflow 1.10, with the constraint that those packages can only be used in a Python 3.6+ environment.

Therefore we decided to prepare and release backport packages that can be installed on older Airflow versions. Those backport packages are released more frequently, and users do not have to upgrade their Airflow version to use them. There are a number of changes between Airflow 2.0 and 1.10.*, documented in UPDATING.md. With the backported providers packages, users can migrate their DAGs to the new providers packages incrementally, and once they have converted to the new operators/sensors/hooks they can seamlessly migrate their environments to Airflow 2.0.
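
As a rough sketch of what that migration looks like (the provider package, module path, and operator below are examples of the airflow.providers layout, not a prescription), switching a DAG to a backported provider mostly means installing the backport package and changing the import path:

    # On Airflow 1.10 (Python 3.6+), install the backport package first, e.g.:
    #   pip install apache-airflow-backport-providers-google

    # Old Airflow 1.10 core import path:
    # from airflow.contrib.operators.bigquery_operator import BigQueryOperator

    # New Airflow 2.0-style import path, provided by the backport package:
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryExecuteQueryOperator,
    )

    # The operator is then used inside a DAG like any other operator.
    run_query = BigQueryExecuteQueryOperator(
        task_id="run_query",
        sql="SELECT 1",
        use_legacy_sql=False,
    )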

More information about the status and releases of the back-ported packages is available on the Backported providers package page.

Dependencies between packages are stored in airflow/providers/dependencies.json. See CONTRIBUTING.rst for details.

Contributing

Want to help build Apache Airflow? Check out our contributing documentation.

Who uses Apache Airflow?

As the Apache Airflow community grows, we'd like to keep track of who is using the platform. Please send a PR with your company name and @githubhandle if you would like to be added.

Currently officially using Airflow:

  1. 4G Capital [@posei]
  2. 6play [@lemourA, @achaussende, @d-nguyen, @julien-gm]
  3. 8fit [@nicor88, @frnzska]
  4. 90 Seconds [@aaronmak]
  5. 99 [@fbenevides, @gustavoamigo & @mmmaia]
  6. AdBOOST [AdBOOST]
  7. Adobe [@mishikaSingh, @ramandumcs, @vardancse]
  8. Agari [@r39132]
  9. Agoda [@akki]
  10. Airbnb [@mistercrunch, @artwr]
  11. AirDNA
  12. Airfinity [@sibowyer]
  13. Airtel [@harishbisht]
  14. Akamas [@GiovanniPaoloGibilisco, @lucacavazzana]
  15. Alan [@charles-go]
  16. allegro.pl [@kretes]
  17. AloPeyk [@blcksrx, @AloPeyk]
  18. AltX [@pedromduarte]
  19. AMPATH[@AMPATH, @fatmali]
  20. Apigee [@btallman]
  21. ARGO Labs [@California Data Collaborative]
  22. ARMEDANGELS [@swiffer]
  23. Arquivei [@arquivei]
  24. Arrive
  25. Asana [@chang, @dima-asana, @jdavidheiser, @ricardoandresrojas]
  26. Astronomer [@schnie, @ashb, @kaxil, @dimberman, @andriisoldatenko, @ryw, @andrewhharmon]
  27. Auth0 [@sicarul]
  28. Automattic [@anandnalya, @bperson, @khrol, @xyu]
  29. Away [@trunsky]
  30. Azri Solutions [@userimack]
  31. Bagelcode
  32. BalanceHero [@swalloow]
  33. Banco de Formaturas [@guiligan]
  34. BandwidthX [@dineshdsharma]
  35. Basetis
  36. BBM
  37. Beamly [@christopheralcock]
  38. Beeswax
  39. Bellhops
  40. BelugaDB [@fabio-nukui & @joao-sallaberry & @lucianoviola & @tmatuki]
  41. Betterment [@betterment]
  42. Bexs Bank [@felipefb & @ilarsen]
  43. BigQuant [@bigquant]
  44. Birdz by Veolia [@benjamingrenier]
  45. BlaBlaCar [@puckel & @wmorin]
  46. Blacklane [@serkef]
  47. Bloc [@dpaola2]
  48. Bloomberg [@dimberman]
  49. Blue Yonder [@blue-yonder]
  50. BlueApron [@jasonjho & @matthewdavidhauser]
  51. Bluecore [@JLDLaughlin]
  52. Bluekiri [@Bluekiri]
  53. Boda Telecom Suite - CE [@erssebaggala, @bodastage]
  54. Bodastage Solutions [@erssebaggala, @bodastage]
  55. Bombora Inc [@jeffkpayne, @pakelley, @dNavalta, @austynh, @TheOriginalAlex]
  56. Bonial International GmbH
  57. Bonnier Broadcasting [@wileeam]
  58. BounceX [@JoshFerge, @hudsonrio, @ronniekritou]
  59. Braintree [@coopergillan, @curiousjazz77, @raymondberg]
  60. Branch [@sdebarshi, @dmitrig01]
  61. Caesars Entertainment
  62. California Data Collaborative powered by ARGO Labs
  63. Capital One [@anoopengineer]
  64. Carbonite [@ajbosco]
  65. CarLabs [@sganz & @odannyc]
  66. CAVA [@minh5 & @patchus]
  67. Celect [@superdosh & @chadcelect]
  68. Censys [@zakird, @dadrian, & @andrewsardone]
  69. Change.org [@change, @vijaykramesh]
  70. Chartboost [@cgelman & @dclubb]
  71. Checkr [@tongboh]
  72. Children's Hospital of Philadelphia Division of Genomic Diagnostics [@genomics-geek]
  73. Cinimex DataLab [@kdubovikov]
  74. City of San Diego [@MrMaksimize, @andrell81 & @arnaudvedy]
  75. City of Toronto [@CityofToronto, @radumas]
  76. ciValue [@chencivalue, @YoavGaudin, @saleem-boshnak]
  77. Civey [@WesleyBatista]
  78. Clairvoyant [@shekharv]
  79. Classmethod, Inc. [@shoito]
  80. Cleartax [@anks & @codebuff]
  81. Clover Health [@gwax & @vansivallab]
  82. Colgate-Palmolive [@fhoda]
  83. Collectivehealth Inc. [@retornam]
  84. Compass [@wdhorton]
  85. ConnectWise [@jacobeturpin]
  86. ContaAzul [@bern4rdelli, @renanleme & @sabino]
  87. Cotap [@maraca & @richardchew]
  88. Craig@Work
  89. Crealytics
  90. Credit Karma [@preete-dixit-ck & @harish-gaggar-ck & @greg-finley-ck]
  91. Creditas [@dcassiano]
  92. CreditCards.com[@vmAggies & @jay-wallaby]
  93. Cryptalizer.com
  94. Custom Ink [@david-dalisay, @dmartin11 & @mpeteuil]
  95. Cyscale [@ocical]
  96. Dailymotion [@germaintanguy & @hc]
  97. Danamica [@testvinder]
  98. Data Reply [@kaxil]
  99. DataCamp [@dgrtwo]
  100. DataFox [@sudowork]
  101. Dentsu Inc. [@bryan831 & @loozhengyuan]
  102. Digital First Media [@duffn & @mschmo & @seanmuth]
  103. DigitalOcean [@ajbosco]
  104. Digitas Pixelpark [@feluelle]
  105. DoorDash
  106. Dotmodus [@dannylee12]
  107. Drivy [@AntoineAugusti]
  108. Easy Taxi [@caique-lima & @diraol]
  109. EllisDon [@d2kalra & @zbasama]
  110. Endesa [@drexpp]
  111. Enigma [@hydrosquall]
  112. Datamaran [@valexharo]
  113. Etsy [@mchalek]
  114. evo.company [@orhideous]
  115. Experity (formerly DocuTAP) [@cloneluke & @tobyjoliver]
  116. Fathom Health
  117. Firestone Inventing [@zihengCat]
  118. Flipp [@sethwilsonwishabi]
  119. Format [@format & @jasonicarter]
  120. FreeNow [@freenowtech]
  121. FreshBooks [@DinoCow]
  122. Freshworks [@shaikshakeel]
  123. FullContact
  124. Fuller, Inc. [@wutali & @sh-tech]
  125. Fundera [@andyxhadji]
  126. G Adventures [@chchtv11, @tgumbley, @tomwross]
  127. GameWisp [@tjbiii & @theryanwalls]
  128. Geekie [@wolney]
  129. GeneCards [@oferze]
  130. Gentner Lab [@neuromusic]
  131. Get Simpl [@rootcss]
  132. GitLab [@tayloramurphy & @m_walker]
  133. Glassdoor [@syvineckruyk & @sid88in]
  134. Global Fashion Group [@GFG]
  135. GoDataDriven [@BasPH, @danielvdende, @ffinfo, @Fokko, @gglanzani, @hgrif, @jrderuiter, @NielsZeilemaker]
  136. Gojek [@gojek]
  137. GovTech GDS [@chrissng & @datagovsg]
  138. Grab [@calvintran]
  139. Gradeup [@gradeup]
  140. Grand Rounds [@richddr, @timz1290, @wenever, & @runongirlrunon]
  141. Groupalia [@jesusfcr]
  142. Groupon [@stevencasey]
  143. Growbots[@exploy]
  144. GSN Games
  145. Gusto [@frankhsu]
  146. Handshake [@mhickman]
  147. Handy [@marcintustin / @mtustin-handy]
  148. happn [@pcorbel]
  149. HAVAN [@botbiz]
  150. HBC Digital [@tmccartan & @dmateusp]
  151. HBO[@yiwang]
  152. Healthjump [@miscbits]
  153. HelloFresh [@tammymendt & @davidsbatista & @iuriinedostup]
  154. Hipages [@arihantsurana]
  155. Holimetrix [@thibault-ketterer]
  156. HomeToGo [@HomeToGo, @AurimasGr]
  157. Hootsuite
  158. Hostnfly [@CyrilLeMat & @pierrechopin & @alexisrosuel]
  159. HotelQuickly [@zinuzoid]
  160. Huq Industries [@huqindustries, @alepuccetti, @turbomerl]
  161. Iflix [@ChaturvediSulabh]
  162. IFTTT [@apurvajoshi]
  163. iHeartRadio[@yiwang]
  164. imgix [@dclubb]
  165. ING
  166. Instacart 🥕 [@arp1t & @code-sauce & @jasonlew & @j4p3 & @lubert & @mmontagna & @RyanAD &@zzadeh]
  167. Intercom [@fox & @paulvic]
  168. Interia
  169. Investorise [@svenvarkel]
  170. iS2.co [@iS2co]
  171. Jampp
  172. Jeitto [@BrennerPablo & @ds-mauri]
  173. Jetlore [@bderose]
  174. JobTeaser [@stefani75 & @knil-sama]
  175. JULO [@sepam & @tenapril & @verzqy]
  176. Kalibrr [@charlesverdad]
  177. Kargo [@chaithra-yenikapati, @akarsh3007 & @dineshanchan]
  178. Karmic [@hyw]
  179. King [@nathadfield]
  180. King Abdullah Petroleum Studies and Research Center(KAPSARC) [@saianupkumarp]
  181. Kiwi.com [@underyx]
  182. Kogan.com [@geeknam]
  183. Korbit [@jensenity]
  184. KPN B.V. [@biyanisuraj & @gmic]
  185. Kroton Educacional
  186. Lemann Foundation [@fernandosjp]
  187. LeMans Corporation [@alloydwhitlock] & [@tinyrye]
  188. LendUp [@lendup]
  189. LetsBonus [@jesusfcr & @OpringaoDoTurno]
  190. Liberty Global [@LibertyGlobal]
  191. liligo [@tromika]
  192. LingoChamp [@haitaoyao]
  193. Logitravel Group
  194. Los Angeles Times [@standyro]
  195. LokSuvidha [@saurabhwahile]
  196. Lucid [@jbrownlucid & @kkourtchikov]
  197. Lumos Labs [@rfroetscher & @zzztimbo]
  198. Lyft [@feng-tao, @milton0825, @astahlman, @youngyjd, @ArgentFalcon]
  199. M4U [@msantino]
  200. Madrone [@mbreining & @scotthb]
  201. Markovian [@al-xv, @skogsbaeck, @waltherg]
  202. Mercadoni [@demorenoc]
  203. Mercari [@yu-iskw]
  204. MFG Labs
  205. MiNODES [@dice89, @diazcelsa]
  206. Modernizing Medicine[@kehv1n, @dalupus]
  207. Movember
  208. Multiply [@nrhvyc]
  209. National Bank of Canada [@brilhana]
  210. Neoway [@neowaylabs]
  211. Nerdwallet
  212. New Relic [@marcweil]
  213. Newzoo [@newzoo-nexus]
  214. NEXT Trucking [@earthmancash2, @kppullin]
  215. Nextdoor [@SivaPandeti, @zshapiro & @jthomas123]
  216. Nine [@TheZepto]
  217. OdysseyPrime [@davideberdin]
  218. OfferUp
  219. OneFineStay [@slangwald]
  220. Open Knowledge International [@vitorbaptista]
  221. Optum - UnitedHealthGroup [@fhoda, @ianstanton, @nilaybhatt,@hiteshrd]
  222. Outcome Health [@mikethoun, @rolandotribo]
  223. Overstock [@mhousley & @mct0006]
  224. OVH [@ncrocfer & @anthonyolea]
  225. Pagar.me [@pagarme]
  226. Palo Alto Networks [@PaloAltoNetworks]
  227. Pandora Media [@Acehaidrey & @wolfier]
  228. PayFit [@pcorbel]
  229. PAYMILL [@paymill & @matthiashuschle]
  230. PayPal [@r39132 & @jhsenjaliya]
  231. Pecan [@ohadmata]
  232. Pernod-Ricard [@romain-nio]
  233. Plaid [@plaid, @AustinBGibbons & @jeeyoungk]
  234. Playbuzz [@clintonboys & @dbn]
  235. PMC [@andrewm4894]
  236. Polidea [@potiuk, @mschickensoup, @mik-laj, @turbaszek, @michalslowikowski00, @olchas]
  237. Poshmark
  238. Postmates [@syeoryn]
  239. Premise [@jmccallum-premise]
  240. Pronto Tools [@zkan & @mesodiar]
  241. proton.ai [@prmsolutions]
  242. PubNub [@jzucker2]
  243. PXYData [@patchus]
  244. Qplum [@manti]
  245. Quantopian [@eronarn]
  246. Qubole [@msumit]
  247. QuintoAndar [@quintoandar]
  248. Quizlet [@quizlet]
  249. Quora
  250. Qoala [@gnomeria, @qoala-engineering]
  251. Rakuten
  252. Raízen [@rudlac & @guifneves]
  253. Rapido [@ChethanUK]
  254. REA Group
  255. Reddit [@reddit]
  256. Reverb[@reverbdotcom]
  257. Revolut [@sztanko & @nautilus28]
  258. Robinhood [@vineet-rh]
  259. Scaleway [@kdeldycke]
  260. Seasoned [@joshuacano] & [@mmyers] & [@tjward]
  261. Secret Escapes [@secretescapes]
  262. Semantics3 [@abishekk92]
  263. Sense360 [@kamilmroczek]
  264. Sentry.io [@tiopi]
  265. ShopBack [@shopback]
  266. Shopkick [@shopkick]
  267. Sidecar [@getsidecar]
  268. SimilarWeb [@similarweb]
  269. Skyscanner [@skyscanner]
  270. SmartNews [@takus]
  271. SnapTravel
  272. SocialCops [@vinayak-mehta & @sharky93]
  273. Société générale [@medmrgh & @s83]
  274. Spotahome [@spotahome]
  275. SpotHero [@benjigoldberg]
  276. Spotify [@znichols]
  277. Square
  278. Stackspace
  279. StoneCo [@lgwacker]
  280. Strava [@strava, @dhuang & @liamstewart]
  281. Stripe [@jbalogh]
  282. Strongmind [@tomchapin & @wongstein]
  283. Surfline [@jawang35]
  284. T2 Systems [@unclaimedpants]
  285. Tails.com [@alanmcruickshank]
  286. TEK [@telac]
  287. Telefonica Innovation Alpha [@Alpha-Health]
  288. Telia Company
  289. Ternary Data [@mhousley, @JoeReis]
  290. Tesla [@thoralf-gutierrez]
  291. The Home Depot[@apekshithr]
  292. THE ICONIC [@revathijay] [@ilikedata]
  293. Thinking Machines [@marksteve]
  294. Thinknear [@d3cay1, @ccson, & @ababian]
  295. ThoughtWorks [@sann3]
  296. Thumbtack [@natekupp]
  297. Tictail
  298. Tile [@ranjanmanish]
  299. Tinder [@kbendick]
  300. Tink [@tink-ab]
  301. TokenAnalyst [@simonohanlon101, @ankitchiplunkar, @sidshekhar, @sp6pe]
  302. Tokopedia [@topedmaria]
  303. Trocafone [@idontdomath & @gseva & @ordonezf & @PalmaLeandro]
  304. Twine Labs [@ivorpeles]
  305. Twitter [@aoen]
  306. Ubisoft [@Walkoss]
  307. Udacity [@dandikunited, @simon-uc]
  308. United Airlines [@ilopezfr]
  309. Upsight
  310. VeeR VR [@pishilong]
  311. Veikkaus [@hixus]
  312. Vente-Exclusive.com [@alexvanboxel]
  313. Vevo [@csetiawan & @jerrygillespie]
  314. Vidio
  315. Ville de Montréal [@VilledeMontreal]
  316. Vnomics [@lpalum]
  317. Walmart Labs [@bharathpalaksha, @vipul007ravi]
  318. Waze [@waze]
  319. WePay [@criccomini & @mtagle]
  320. WeTransfer [@coredipper & @higee & @azclub]
  321. Whistle Labs [@ananya77041]
  322. Wildlifestudios
  323. WiseBanyan
  324. Wooga
  325. Wrike [@eliseealex & teoretic6]
  326. Xero [@yan9yu & adamantnz]
  327. Xoom
  328. Yahoo!
  329. Yieldr [@ggeorgiadis]
  330. Zapier [@drknexus & @statwonk]
  331. Zego [@ruimffl, @james-welly, @ken-payne]
  332. Zendesk
  333. Zenly [@cerisier & @jbdalido]
  334. Zymergen
  335. Zynga

Who Maintains Apache Airflow?

Airflow is the work of the community, but the core committers/maintainers are responsible for reviewing and merging PRs as well as steering conversation around new feature requests. If you would like to become a maintainer, please review the Apache Airflow committer requirements.

Can I use the Apache Airflow logo in my presentation?

Yes! Be sure to abide by the Apache Foundation trademark policies and the Apache Airflow Brandbook. The most up-to-date logos are found in this repo and on the Apache Software Foundation website.