Justin,
This is indeed the realm of regular pipeline ETL tools. Any programming language like Python, Ruby, etc., could also hack this up quickly, but perhaps your team is limited in that skill set.
Your use case is exactly what I am doing in Pentaho at this very moment... I took a customer's CSV file and split multi-valued fields where appropriate, using a simple regex along with an additional formula. I built the Transformation from one I had previously built for the same case just a few months ago. Many ETL products let you rinse, repeat, and reuse your existing code or transformations to adapt to changing customer needs. For your case, it's probably just a few tweaks here and there as you go through each customer's CSV file. Piece of cake.
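For comparison, here is what that regex split looks like as a quick Python hack, in case your team goes that route instead. This is just a sketch: the column names ("name", "phones") and the delimiter pattern are hypothetical stand-ins for whatever your customer's CSV actually contains.

```python
import csv
import io
import re

# Hypothetical input: a "phones" column holding multiple values
# separated by semicolons or slashes (adjust to your real file).
raw = io.StringIO(
    "name,phones\n"
    "Acme Corp,555-0100; 555-0101 / 555-0102\n"
    "Beta LLC,555-0200\n"
)

# Simple regex for the multi-valued field's delimiters,
# tolerating surrounding whitespace.
splitter = re.compile(r"\s*[;/]\s*")

rows = []
for record in csv.DictReader(raw):
    # Emit one output row per value in the multi-valued field.
    for phone in splitter.split(record["phones"]):
        rows.append({"name": record["name"], "phone": phone})

for row in rows:
    print(row["name"], row["phone"])
```

Per-customer tweaks then come down to editing the delimiter regex and the column names, much like tweaking the Transformation in Pentaho.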
I'd be happy to help you wire something up quickly in about 5 minutes that handles your case in Pentaho.
In fact, a quick sample of your use case is already done for you in Pentaho.
You can also just download and install Pentaho Data Integration 4.4 and, if on Windows, start it with the spoon.bat file (which launches the Java startup and configuration and gets you running).
Go to File, then Import From An XML File, navigate to the folder /pdi-ce-4.4.0-stable/data-integration/samples/transformations, and load the sample file called "CSV Input - Reading customer data.ktr".
Then double-click the component on the canvas called "CSV file input" and tweak it to your needs.
Reach me in private mail, offlist, if you want more help.