BeamRunPythonPipelineOperator and DataFlowPythonOperator are both Airflow operators for running Python-based Apache Beam pipelines, but there are some important differences between them.
BeamRunPythonPipelineOperator (BRP) can run a Beam pipeline locally or in any other environment that supports Python and Beam. DataFlowPythonOperator (DFP), on the other hand, runs the Beam pipeline on Google Cloud Platform's Dataflow service.
Example with BRP:
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator

t1 = BeamRunPythonPipelineOperator(
    task_id='run_beam_pipeline',
    # The Beam pipeline itself lives in the script referenced by py_file;
    # the operator executes that script rather than taking a pipeline object.
    py_file='/path/to/python/code',
    runner='DirectRunner',
)
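For context, here is a minimal sketch of what the script referenced by py_file might contain; the path, transforms, and output location are placeholders rather than part of the original example. Both operators execute a self-contained script like this and pass pipeline options to it on the command line.

# /path/to/python/code (placeholder) -- a minimal Beam pipeline script.
# PipelineOptions() picks up any --runner/--project/... flags that the
# operator appends to the command line.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | 'Create' >> beam.Create(['hello', 'beam'])
            | 'Upper' >> beam.Map(str.upper)
            | 'Write' >> beam.io.WriteToText('/tmp/beam-output')
        )


if __name__ == '__main__':
    run()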
Example with DFP:
from airflow.contrib.operators.dataflow_operator import DataFlowPythonOperator

t1 = DataFlowPythonOperator(
    task_id='run_beam_pipeline',
    # As with BRP, the pipeline is defined inside the script at py_file.
    py_file='/path/to/python/code',
    # No runner needs to be set: this operator always submits the job to Dataflow.
    job_name='my-dataflow-job',
    dataflow_default_options={
        'project': 'my-gcp-project',
        'zone': 'us-central1-f',
        'temp_location': 'gs://my-gcs-bucket/tmp',
        'staging_location': 'gs://my-gcs-bucket/staging',
    },
    py_options=[],                      # interpreter flags, e.g. ['-m'] if py_file is a module name
    py_requirements=['pandas==1.0.1'],  # optional
    py_interpreter='python3',
    py_system_site_packages=False,
)
The difference between BRP and DFP, then, is the environment in which the Beam pipeline runs: BeamRunPythonPipelineOperator can run a Beam pipeline in any environment that supports Beam, while DataFlowPythonOperator only runs Beam pipelines on Google Cloud Platform's Dataflow service.
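A closing note on that point: because BRP is runner-agnostic, it can also submit the same script to Dataflow by switching the runner and attaching a Dataflow configuration. The sketch below assumes a newer Airflow installation with the apache-beam and google providers available, and the project, bucket, and region values are placeholders.

from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator
from airflow.providers.google.cloud.operators.dataflow import DataflowConfiguration

t2 = BeamRunPythonPipelineOperator(
    task_id='run_beam_pipeline_on_dataflow',
    py_file='/path/to/python/code',
    runner='DataflowRunner',
    pipeline_options={
        'temp_location': 'gs://my-gcs-bucket/tmp',
        'staging_location': 'gs://my-gcs-bucket/staging',
    },
    dataflow_config=DataflowConfiguration(
        job_name='my-dataflow-job',
        project_id='my-gcp-project',
        location='us-central1',
    ),
)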