Read a CSV file and load it into BigQuery through a Dataflow job, using Python code instead of templates. How can I perform this task using Terraform (GCP)? Can anyone help?
I am trying to do it, but I don't understand what Terraform script I should write for it.
It's not the responsibility of Terraform to deploy a Dataflow job.
There is only a Terraform resource to instantiate a Dataflow template: `google_dataflow_job`.
You can delegate this to your CI CD.
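For illustration, here is a minimal sketch of that template-based resource; the bucket, template path, and parameter names are placeholders, and it only works with a pre-built template, which is exactly what you want to avoid:

```hcl
# Minimal sketch: Terraform can only launch a Dataflow job from an existing
# template. All names and paths below are placeholders.
resource "google_dataflow_job" "csv_to_bq" {
  name              = "csv-to-bq"
  template_gcs_path = "gs://my-bucket/templates/csv_to_bq"  # pre-built template
  temp_gcs_location = "gs://my-bucket/tmp"

  parameters = {
    inputFile   = "gs://my-bucket/data/input.csv"
    outputTable = "my-project:my_dataset.my_table"
  }
}
```

Since your pipeline is plain Python code rather than a template, this resource doesn't fit, hence the CI CD options below.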
Example with Beam Python:
- Project containing the Python Beam code
- The CI CD deploys the Python Beam code to a Cloud Storage bucket
- The CI CD runs the Dataflow job and the main file with a Python command line (see the sketch after this list)
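To make this first option concrete, here is a minimal sketch of the Beam Python main file for your CSV-to-BigQuery case. Everything specific is an assumption: the CSV layout (a `name,age` header), the `--input`/`--output_table` flag names, and the schema are placeholders to adapt to your data.

```python
# Minimal sketch of main.py: read a CSV from Cloud Storage, write to BigQuery.
import argparse

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Turn one CSV line into a dict matching the BigQuery schema (placeholder columns)."""
    name, age = line.split(",")
    return {"name": name, "age": int(age)}


def run() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--input", required=True, help="gs://... path to the CSV file")
    parser.add_argument("--output_table", required=True, help="PROJECT:DATASET.TABLE")
    known_args, pipeline_args = parser.parse_known_args()

    # Remaining flags (runner, project, region, temp_location...) go to Beam.
    options = PipelineOptions(pipeline_args)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCsv" >> beam.io.ReadFromText(known_args.input, skip_header_lines=1)
            | "ParseLines" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                known_args.output_table,
                schema="name:STRING,age:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The CI CD then launches it with a plain Python command line, something like `python main.py --runner DataflowRunner --project <PROJECT> --region <REGION> --temp_location gs://<BUCKET>/tmp --input gs://<BUCKET>/data.csv --output_table <PROJECT>:<DATASET>.<TABLE>`; no template is involved.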
Example with Beam Java and mvn compile:
- Project with Beam Java and Maven or Gradle
- The CI CD runs a mvn compile command to execute the Dataflow job
Example with Beam Java and a fat jar:
- Project with Beam Java and Maven or Gradle
- The CI CD builds the fat jar
- The CI CD copies the fat jar to a Cloud Storage bucket
- The CI CD runs the Dataflow job and the Main inside the fat jar with a java -jar command
Example with Beam Python and Airflow/Cloud Composer:
- Project containing the Python Beam code
- The CI CD copies the Python Beam code to the Cloud Composer bucket with gcloud composer
- The Airflow code uses BeamRunPythonPipelineOperator to instantiate the Dataflow job
- An Airflow DAG runs the Dataflow job (see the DAG sketch after this list)
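Here is a minimal DAG sketch for this Composer option, assuming the apache-airflow-providers-apache-beam package is available (it is bundled with Cloud Composer) and that the pipeline file was copied to a bucket beforehand; every path, project and region below is a placeholder:

```python
# Minimal Airflow DAG sketch: run the Beam Python pipeline on Dataflow.
# All paths, project ids and regions are placeholders to adapt.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.beam.hooks.beam import BeamRunnerType
from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator

with DAG(
    dag_id="csv_to_bq_dataflow",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # trigger manually
    catchup=False,
) as dag:
    run_dataflow = BeamRunPythonPipelineOperator(
        task_id="run_csv_to_bq",
        runner=BeamRunnerType.DataflowRunner,
        py_file="gs://my-bucket/beam/main.py",  # the main file from the sketch above
        pipeline_options={
            "input": "gs://my-bucket/data/input.csv",
            "output_table": "my-project:my_dataset.my_table",
            "temp_location": "gs://my-bucket/tmp",
        },
        py_requirements=["apache-beam[gcp]"],
        py_system_site_packages=False,
        dataflow_config={"project_id": "my-project", "location": "europe-west1"},
    )
```

The Java variant below is symmetric: BeamRunJavaPipelineOperator takes a jar argument pointing at the fat jar instead of py_file.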
Example with Beam Java and Airflow/Cloud Composer:
- Project containing the Java Beam code
- The CI CD builds the fat jar
- The CI CD copies the fat jar to a Cloud Storage bucket
- The Airflow code uses BeamRunJavaPipelineOperator to instantiate the Dataflow job, targeting the path of the fat jar
- An Airflow DAG runs the Dataflow job