[AIRFLOW-5556] Add separate config for timeout from scheduler dag processing (#6186)

Right now the scheduler uses dagbag_import_timeout to time out the
DagFileProcessor, but that setting is intended to bound the import of a
single python file. The DagFileProcessor does a bit more than that, so
it needs its own, longer timeout.
Ping Zhang 2019-09-26 14:42:16 -07:00 committed by Ash Berlin-Taylor
Parent 93e856e702
Commit 2abf0f5b70
2 changed files with 5 additions and 2 deletions
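To make the reasoning in the commit message concrete, here is a minimal illustrative sketch (plain Python, not Airflow source; the function names and sleep durations are made up) of why the processor needs a bigger budget than the import alone: dagbag_import_timeout only has to cover executing the DAG file, while the file processor also handles callbacks and syncs DAG state to the metadata DB on top of that.

import time

def import_dag_file():
    # Covered by [core] dagbag_import_timeout: just executing the .py file
    # while filling the DagBag.
    time.sleep(2)  # stand-in for a slow top-level import

def process_dag_file():
    # Covered by [core] dag_file_processor_timeout: the whole per-file pass.
    import_dag_file()  # fill the DagBag
    time.sleep(1)      # stand-in for pickling / callback handling
    time.sleep(1)      # stand-in for syncing DAG state to the metadata DB

if __name__ == "__main__":
    start = time.monotonic()
    process_dag_file()
    print("whole pass took %.1fs -- more than the import alone" % (time.monotonic() - start))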


@@ -159,9 +159,12 @@ fernet_key = {FERNET_KEY}
 # Whether to disable pickling dags
 donot_pickle = True
-# How long before timing out a python file import while filling the DagBag
+# How long before timing out a python file import
 dagbag_import_timeout = 30
+# How long before timing out a DagFileProcessor, which processes a dag file
+dag_file_processor_timeout = 50
 # The class to use for running task instances in a subprocess
 task_runner = StandardTaskRunner
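A hedged sketch of how these two [core] options relate at runtime: conf.getint is the same call the scheduler hunk below switches to, while the import path and the sanity check are illustrative additions, not part of the commit.

from airflow.configuration import conf  # assumed import path; adjust to your Airflow version

# Budget for importing one python DAG file while filling the DagBag.
dagbag_import_timeout = conf.getint('core', 'dagbag_import_timeout')            # default 30
# Budget for everything a DagFileProcessor does with that file.
dag_file_processor_timeout = conf.getint('core', 'dag_file_processor_timeout')  # default 50

# The processor does the import plus extra work, so give it at least as much time.
assert dag_file_processor_timeout >= dagbag_import_timeout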


@@ -1296,7 +1296,7 @@ class SchedulerJob(BaseJob):
         # so the scheduler job and DAG parser don't access the DB at the same time.
         async_mode = not self.using_sqlite
-        processor_timeout_seconds = conf.getint('core', 'dagbag_import_timeout')
+        processor_timeout_seconds = conf.getint('core', 'dag_file_processor_timeout')
         processor_timeout = timedelta(seconds=processor_timeout_seconds)
         self.processor_agent = DagFileProcessorAgent(self.subdir,
                                                      known_file_paths,
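For context on what the scheduler does with processor_timeout, here is a minimal watchdog sketch. It is not the DagFileProcessorAgent implementation; parse_one_dag_file and the 1-second polling interval are invented for illustration. It only shows how a timedelta budget can be enforced by terminating a child processor that runs past it.

import time
from datetime import timedelta
from multiprocessing import Process

def parse_one_dag_file():
    time.sleep(120)  # stand-in for a DAG file that takes far too long to process

if __name__ == "__main__":
    processor_timeout = timedelta(seconds=50)  # [core] dag_file_processor_timeout

    proc = Process(target=parse_one_dag_file)
    proc.start()
    started_at = time.monotonic()

    while proc.is_alive():
        if time.monotonic() - started_at > processor_timeout.total_seconds():
            # Kill a processor that exceeds its budget instead of letting it block scheduling.
            proc.terminate()
            proc.join()
            print("file processor killed after exceeding dag_file_processor_timeout")
            break
        time.sleep(1)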