Hello,
I am creating a file drop location to process a CSV file and then archive it once it has been imported. I can read the CSV from a Google Cloud Storage location and load it into BigQuery. However, I am struggling to move the CSV from the drop folder to an archive folder. Ideally I would use GoogleCloudStorageToGoogleCloudStorageOperator, but from research online, v1.9.0 doesn't support this.
Following the link below, I attempted to get the hook and operator from GitHub and placed them in the plugins/ folder.
When I attempt to load the DAG, I get the following error message: No module named gcs_to_gcs
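For reference, I haven't done anything beyond copying the files into plugins/, but from the plugin docs I believe the registration would look roughly like this (file and class names are my own; a minimal sketch, not a known-good setup):

# plugins/gcs_to_gcs_plugin.py  (my own file name)
from airflow.plugins_manager import AirflowPlugin
# gcs_to_gcs.py is the file copied from the Airflow GitHub repo,
# sitting next to this one in plugins/ (assuming plugins/ is importable)
from gcs_to_gcs import GoogleCloudStorageToGoogleCloudStorageOperator

class GcsToGcsPlugin(AirflowPlugin):
    name = "gcs_to_gcs_plugin"
    operators = [GoogleCloudStorageToGoogleCloudStorageOperator]

My understanding is that a registered plugin operator would then be imported as airflow.operators.gcs_to_gcs_plugin rather than airflow.contrib.operators.gcs_to_gcs, which may be why my import below fails.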
The main parts of my DAG:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
from airflow.contrib.operators.bigquery_operator import BigQueryOperator
from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook
from airflow.contrib.operators.gcs_to_gcs import GoogleCloudStorageToGoogleCloudStorageOperator
from airflow.models import BaseOperator
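If the plugin route is the wrong approach, I was also considering doing the move myself from a PythonOperator via GoogleCloudStorageHook's underlying API client. A rough sketch of what I mean (bucket and object names are placeholders):

from airflow.operators.python_operator import PythonOperator
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

def move_csv_to_archive():
    hook = GoogleCloudStorageHook()  # default GCP connection
    # get_conn() returns the raw Cloud Storage JSON API (v1) client
    service = hook.get_conn()
    # copy drop/file.csv to archive/file.csv in the same bucket...
    service.objects().copy(
        sourceBucket='my-bucket', sourceObject='drop/file.csv',
        destinationBucket='my-bucket', destinationObject='archive/file.csv',
        body={}).execute()
    # ...then delete the original, i.e. a move
    service.objects().delete(bucket='my-bucket', object='drop/file.csv').execute()

move_to_archive = PythonOperator(
    task_id='move_to_archive',
    python_callable=move_csv_to_archive,
    dag=dag)

But that feels like reimplementing the operator by hand.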
So, three questions: is my attempt to load GoogleCloudStorageToGoogleCloudStorageOperator correct? Is there a better way? And can Airflow / Composer be upgraded?
Thanks!
Aron