If you are running Airflow in Docker, set up user mapping in your docker-compose file:
services:
  webserver:
    image: puckel/docker-airflow
    environment:
      - LOAD_EX=n
      - EXECUTOR=CeleryExecutor
      - FERNET_KEY=myFernetKey
    volumes:
      - ./dags:/usr/local/airflow/dags
      - ./logs:/usr/local/airflow/logs
      - ./plugins:/usr/local/airflow/plugins
    depends_on:
      - postgres
    user: "${UID}:${GID}"
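For `${UID}:${GID}` to expand, docker-compose needs those values in its environment or in a `.env` file next to the compose file. One way to supply them is a small sketch like the following; note that in bash `UID` is a read-only shell variable, so the values are generated with `id(1)` rather than exported:

```shell
# Write the host user's UID/GID into a .env file so docker-compose can
# substitute them into user: "${UID}:${GID}".
echo "UID=$(id -u)" >  .env
echo "GID=$(id -g)" >> .env
cat .env
```

This keeps files created inside the container (logs, DAG artifacts) owned by your host user instead of root.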
If your Airflow DAG file is mis-indented, check that the code conforms to PEP 8, paying particular attention to indentation. Running flake8 from the command line will flag PEP 8 violations, including indentation errors.
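Even before reaching for flake8, Python itself rejects badly indented code when the file is compiled, which is exactly what happens when Airflow parses a DAG file. A minimal illustration of catching this early (the snippet below is a deliberately broken stand-in, not a real DAG):

```python
# Compiling a source string surfaces IndentationError the same way
# "python -m py_compile my_dag.py" would for a file on disk.
bad_snippet = "def f():\nreturn 1\n"  # function body is not indented

try:
    compile(bad_snippet, "<dag>", "exec")
    compiles = True
except IndentationError:
    compiles = False

print(compiles)  # False: the mis-indented snippet never compiles
```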
In addition, make sure every task in your DAG definition declares the correct dependencies so the DAG runs in the intended order. In the following example, task_b and task_c both depend on task_a:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # airflow.operators.bash in Airflow 2+
from datetime import datetime

dag = DAG(
    'example_dag',
    description='Example DAG',
    schedule_interval='0 0 * * *',
    start_date=datetime(2021, 1, 1),
    catchup=False,
)

task_a = BashOperator(task_id='task_a', bash_command='echo "Task A"', dag=dag)
# Note: depends_on_past=True makes each run of task_b wait for its own
# previous run to succeed; it does not express a dependency on task_a.
task_b = BashOperator(task_id='task_b', bash_command='echo "Task B"', dag=dag, depends_on_past=True)
task_c = BashOperator(task_id='task_c', bash_command='echo "Task C"', dag=dag)

# task_b and task_c both fan out from task_a
task_a >> [task_b, task_c]
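You can sanity-check a dependency graph like this without running Airflow at all. The sketch below models the same fan-out with the standard-library `graphlib` (it is a stand-alone illustration, not Airflow's API; Airflow exposes the equivalent information via attributes such as a task's downstream task IDs):

```python
from graphlib import TopologicalSorter

# Model the example DAG: task_b and task_c both depend on task_a.
deps = {
    "task_b": {"task_a"},
    "task_c": {"task_a"},
}

# static_order() yields tasks only after all of their dependencies,
# so task_a must come first; a cycle would raise CycleError instead.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

If a topological order exists and starts with the task you expect, the dependency wiring matches your intent.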