Google Dataflow Not Passing All Runtime Parameters


Adam Williams

Feb 2, 2021, 11:02:05 AM
to Google Cloud Developers
Hello,

Let me preface this with the fact that I'm completely new to the Google Cloud platform and all of the resources it provides. Please let me know if this is the appropriate place to have this discussion!

I'm working on a pipeline that takes an input file, parses it, and splits it into multiple files based on an element within the file. However, I'm running into an issue where Google Dataflow isn't passing all of the runtime parameters I list in Cloud Shell (see below) to the template I created. Attached is some code that writes the runtime parameters to the Dataflow log, and it confirms that Dataflow is only passing the first parameter (inputFile) to the template. What do I need to do to have it pass all of the runtime parameters to the template?

gcloud dataflow jobs run fileParser --gcs-location gs://dataflow-dev/templates/fileParser --parameters=fileType="txt",inputFile="gs://dataflow-dev/pre-processor/testfile.csv",outputDest1="gs://dataflow-dev/dest1/",outputLegacy="gs://dataflow-dev/dest2/"  
testParams.py
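
[Editor's note: the attached testParams.py isn't reproduced in the thread. For context, a minimal sketch of a classic-template pipeline that logs its runtime parameters could look like the code below. The parameter names mirror the --parameters keys from the command above; everything else is an assumption. One detail worth checking in the attached code: ValueProvider values only resolve while the pipeline is running, so .get() has to be called from inside a transform, not at pipeline-construction time.]

# Minimal sketch (not the attached testParams.py) of a classic Dataflow
# template that declares and logs runtime parameters. Parameter names
# are taken from the gcloud command above; the rest is assumed.
import logging

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class FileParserOptions(PipelineOptions):
    @classmethod
    def _add_argparse_args(cls, parser):
        # add_value_provider_argument (rather than add_argument) is what
        # makes a parameter settable at template *run* time instead of
        # being baked in at template build time.
        parser.add_value_provider_argument('--fileType', type=str)
        parser.add_value_provider_argument('--inputFile', type=str)
        parser.add_value_provider_argument('--outputDest1', type=str)
        parser.add_value_provider_argument('--outputLegacy', type=str)


def run():
    options = FileParserOptions()
    # ValueProviders are resolved only while the job is running, so we
    # capture the providers here and call .get() inside a Map callable.
    params = {
        'fileType': options.fileType,
        'inputFile': options.inputFile,
        'outputDest1': options.outputDest1,
        'outputLegacy': options.outputLegacy,
    }

    def log_params(_):
        for name, value_provider in params.items():
            logging.info('%s = %s', name, value_provider.get())

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | 'Start' >> beam.Create([None])
         | 'LogParams' >> beam.Map(log_params))


if __name__ == '__main__':
    logging.getLogger().setLevel(logging.INFO)
    run()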

Jun (Cloud Platform Support)

Feb 5, 2021, 6:02:07 PM
to Google Cloud Developers
Hi, 

Have you tried putting "--parameters" before each custom parameter, rather than separating them with commas, as suggested by Travis Webb at [1], to see if that works?
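
[Editor's note: applied to the command in the original post, that suggestion would look something like the following. This is only a restatement of the suggested form, not a verified fix; how gcloud handles repeated --parameters flags may depend on your gcloud version.]

gcloud dataflow jobs run fileParser \
  --gcs-location gs://dataflow-dev/templates/fileParser \
  --parameters fileType=txt \
  --parameters inputFile=gs://dataflow-dev/pre-processor/testfile.csv \
  --parameters outputDest1=gs://dataflow-dev/dest1/ \
  --parameters outputLegacy=gs://dataflow-dev/dest2/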

In the meantime, please note that Google Groups is meant for general product discussions; technical questions are normally better directed to Stack Overflow, and product bugs (unexpected behaviors) or feature requests to the Issue Tracker.

To get better support, please post to the relevant forum; the Community Support article explains where each kind of question belongs.
