<Not an exact Delta Lake question, but I want to get an idea of whether anyone else has faced a similar situation. Otherwise, kindly redirect me to the appropriate forum for this.>
I have some ETL pipelines running on Azure Databricks that read data from blob storage and, after some transformations, insert it into a Delta table. I have a few similar ETLs.
All the ETLs involve a simple join, a column selection, and a filter on a column against a list of values.
To avoid maintaining each ETL as separate code, I am looking for a way to turn these pipelines into a single generic driver that reads a config (blob folder location, join table/column names, filter columns and values) from a config file and then executes it, so that I have one codebase but a different config file per ETL, roughly like the sketch below.
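For illustration, here is a minimal sketch of the kind of config-driven driver I have in mind, assuming PySpark on Databricks; all config keys, paths, and table names below are hypothetical placeholders, not an existing framework:

```python
# Minimal config-driven ETL sketch (PySpark). One JSON config file per ETL,
# same driver code for all of them. All key names here are made up.
import json
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

def run_etl(config_path: str) -> None:
    # Load the per-pipeline config (source, join, filters, target).
    with open(config_path) as f:
        cfg = json.load(f)

    # Read the source data from blob storage.
    df = spark.read.format(cfg["source_format"]).load(cfg["source_path"])

    # Optional join against a lookup table, driven by config.
    if "join" in cfg:
        lookup = spark.read.table(cfg["join"]["table"])
        df = df.join(lookup, on=cfg["join"]["on"], how=cfg["join"].get("how", "inner"))

    # Column selection and IN-list filters, also driven by config.
    df = df.select(*cfg["columns"])
    for col_name, values in cfg.get("filters", {}).items():
        df = df.filter(F.col(col_name).isin(values))

    # Append the result into the target Delta table.
    df.write.format("delta").mode("append").saveAsTable(cfg["target_table"])
```

Each ETL would then just ship its own config file, something like `{"source_format": "parquet", "source_path": "...", "columns": [...], "filters": {"country": ["US", "CA"]}, "target_table": "..."}`, and call `run_etl("/dbfs/configs/my_etl.json")`.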
Any suggestions for an existing tool that achieves this, or any guidance on how to implement it, would be appreciated.
Thanks