Hello,
Let me preface this by saying I'm completely new to Google Cloud Platform and the resources it provides. Please let me know if this is the appropriate place for this discussion!
I'm working on a pipeline that takes an input file, parses it, and splits it into multiple files based on an element within the file. However, I'm running into an issue where Google Dataflow isn't passing all of the runtime parameters I specify in Cloud Shell (see the command below) to the template I created. I added some code (attached) that writes the runtime parameters to the Dataflow log, and it confirms that Dataflow is only passing the first parameter (inputFile) to the template. What do I need to do so that all of the runtime parameters are passed to the template?
gcloud dataflow jobs run fileParser \
    --gcs-location gs://dataflow-dev/templates/fileParser \
    --parameters=fileType="txt",inputFile="gs://dataflow-dev/pre-processor/testfile.csv",outputDest1="gs://dataflow-dev/dest1/",outputLegacy="gs://dataflow-dev/dest2/"
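For what it's worth, I sanity-checked the `--parameters` value itself. gcloud interprets it as comma-separated KEY=VALUE pairs, so splitting the same way (a quick stand-alone Python sketch, not Dataflow code) shows all four parameters are present in the string:

```python
# The value passed to gcloud's --parameters flag, copied from the command above
# (shell quoting around the values is stripped by the shell before gcloud sees it).
params_arg = (
    "fileType=txt,"
    "inputFile=gs://dataflow-dev/pre-processor/testfile.csv,"
    "outputDest1=gs://dataflow-dev/dest1/,"
    "outputLegacy=gs://dataflow-dev/dest2/"
)

# Split on commas into KEY=VALUE pairs, then on the first '=' of each pair.
params = dict(pair.split("=", 1) for pair in params_arg.split(","))

for key, value in params.items():
    print(f"{key} = {value}")
```

All four keys come out of that split, so the string itself looks well-formed; the parameters seem to be getting lost somewhere between gcloud and the template.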