Airflow uses gunicorn only to serve the web UI; how tasks run is controlled by the executor setting, not by gunicorn. To run tasks on dedicated worker processes instead of on the local machine, switch to the CeleryExecutor. The following walks through a Celery setup using Redis as both the broker and the result backend.
First, install Celery and the Redis client library:

pip install celery redis
Next, in airflow.cfg, set executor to CeleryExecutor and add the Celery connection settings:

# airflow.cfg
executor = CeleryExecutor
[celery]
broker_url = redis://localhost:6379/0
result_backend = redis://localhost:6379/0
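Before starting workers, it can help to sanity-check that the broker URL is well formed, since a malformed URL only surfaces as a connection error at worker startup. A minimal sketch using only Python's standard library (check_broker_url is a hypothetical helper for illustration; Celery does its own URL parsing):

```python
from urllib.parse import urlparse

def check_broker_url(url: str) -> dict:
    """Parse a broker URL into its components.

    Hypothetical helper for illustration only; not part of Airflow or Celery.
    """
    parts = urlparse(url)
    if parts.scheme not in ("redis", "amqp", "sqs"):
        raise ValueError(f"unexpected broker scheme: {parts.scheme!r}")
    return {
        "scheme": parts.scheme,
        "host": parts.hostname,
        "port": parts.port,
        # For redis:// URLs the path selects the database number.
        "db": parts.path.lstrip("/") or "0",
    }

info = check_broker_url("redis://localhost:6379/0")
print(info)
```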
Then start a worker. You do not need to write a custom celery_worker.py: Airflow ships with its own Celery application, and the imports sometimes shown in older snippets (airflow.contrib.jobs.get_current_context, CeleryExecutor.Celery) do not exist in Airflow's API. On Airflow 2.x, start a worker with the built-in CLI command:

airflow celery worker

(On Airflow 1.10 the equivalent command is airflow worker.) Run this on each machine you want to execute tasks; every worker must be able to reach the Redis broker and share the same metadata database and DAG files.
Airflow will now hand tasks off to the Celery workers rather than running them locally on the scheduler host. The webserver (and its gunicorn processes) is unaffected; only task execution moves to Celery.
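Conceptually, the broker/worker split set up above behaves like a shared task queue that independent workers consume. A toy stdlib-only model of that pattern (not Airflow or Celery code; the queue stands in for the Redis broker, the dict for the result backend):

```python
import queue
import threading

task_queue: "queue.Queue" = queue.Queue()  # stands in for the Redis broker
results = {}                               # stands in for the result backend
lock = threading.Lock()

def worker(worker_id: int) -> None:
    # Each worker pulls tasks off the shared queue until it sees a sentinel.
    while True:
        task = task_queue.get()
        if task is None:
            task_queue.task_done()
            break
        with lock:
            results[task] = f"done by worker {worker_id}"
        task_queue.task_done()

# The "scheduler" enqueues tasks; two "workers" consume them concurrently.
for name in ["extract", "transform", "load"]:
    task_queue.put(name)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for _ in threads:
    task_queue.put(None)  # one stop sentinel per worker, after the real tasks
for t in threads:
    t.join()

print(sorted(results))
```

Because the queue is FIFO, the sentinels are only reached after all real tasks have been consumed, so every task is recorded in results regardless of which worker picked it up.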