Even with the breadth of knowledge shared online, it can be surprisingly difficult to handle CSV files in Power Automate without costly premium services. Here’s how I managed to get it done with out-of-the-box tools.

Email Attachment Handling
Before I begin, here’s a tip to avoid automatic loop creation in Power Automate cloud flows. If you know your trigger email will always contain a single attachment, you can skip the for-each-attachment loop by using this expression for the Attachment Id in the Get Attachment (V2) step: first(triggerOutputs()?['body/attachments'])?['id']
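To make that concrete, here’s a minimal sketch of the Get Attachment (V2) inputs using this trick, assuming the standard Office 365 Outlook email trigger (your step names may differ):
    Message Id: triggerOutputs()?['body/id']
    Attachment Id: first(triggerOutputs()?['body/attachments'])?['id']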
CSV Parsing
Quick confession: for my purposes, I opted to use a tab-delimited file instead of a true CSV (although the software sending me the file still uses a .csv file name). This makes no significant difference to the flow (split by comma simply becomes split by tab for me), but it avoids the extra parsing a quoted CSV would need if any data fields contained commas. In my case the fields can’t contain tabs anyway, so tabs make a perfect delimiter. Let’s look at the flow steps that made this process easy for me:
Step 1: Use Compose to convert the attachment bytes to text: base64ToString(outputs('Get_Attachment_(V2)')?['body/contentBytes'])
Step 2: Use Compose to remove double quotes (not necessary but I don’t want quotes in the extracted text later): replace(outputs('Compose_String'),'"', '')
Step 3: Use Compose to split the text at every line-break character (every line becomes an element in an array): split(outputs('Remove_quotes'),decodeUriComponent('%0A'))
Step 4: Use Apply to Each. We’re going to perform the same actions on every row (record) of the CSV. I used a formula that selects every row of the step 3 output array except the first row (the header) and the last row (an empty trailing row my software adds to each CSV it creates): take(skip(outputs('splitNewLine'),1), sub(length(skip(outputs('splitNewLine'),1)),1))
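To illustrate with made-up data: if the file had a header row, two data rows, and a trailing blank line, the step 3 array would have four elements; the skip() removes the header and the take() keeps everything except the final blank element, so the Apply to Each runs over exactly the two data rows.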
Step 5: Use Compose to split the row at every tab character: split(item(), decodeUriComponent('%09')) (the %09 was originally a comma, but I swapped it as noted above). After this is done, we have an array with a length equal to the number of fields in the CSV data. Remember we’re inside the for-each loop right now, so this only handles a single row at a time.
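For a made-up row like Widget / 2 / In Stock (tab-separated), the step 5 output is ["Widget","2","In Stock"], and any field can then be read by position, e.g. outputs('splitTab')[1] returns the quantity (here I’m assuming the step 5 Compose is named splitTab; use whatever you named yours).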
Step 6: Use Compose to create a JSON object text string with the data in this array.
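One way to build that string is a single concat() expression. A rough sketch, assuming the step 5 Compose is named splitTab and using three illustrative column names (Item, Qty, Status) rather than my real ones:
    concat('{"Item":"', outputs('splitTab')[0], '","Qty":"', outputs('splitTab')[1], '","Status":"', outputs('splitTab')[2], '"}')
Because step 2 stripped double quotes out of the data, the field values can be dropped straight into a quoted JSON string without breaking it.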

Step 7: Back outside the for-each loop, use a Compose that references the outputs of the step 6 Compose. Because that step sits inside the Apply to Each, referencing it from outside the loop returns an array containing the output of every iteration, giving you a complete JSON payload with all the rows. Note this only works when you do not use any nested loops (now you see why it was important to avoid the for-each-attachment loop when referencing the triggering email attachment, yes?).
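A minimal sketch, assuming the step 6 Compose is named Compose_JSON_Row (use whatever you called yours); joining the per-row JSON strings and wrapping them in brackets yields one JSON array string ready for step 8:
    concat('[', join(outputs('Compose_JSON_Row'), ','), ']')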

Step 8: Use a Parse JSON step to parse that combined output against a schema and set each field’s data type, which lets you reference the data later in the flow by friendly name rather than by position in the data file.
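Here’s a sketch of the schema for the illustrative field names above; the easiest route is to run the flow once and use the Parse JSON step’s Generate from sample button with the step 7 output, so treat this as a placeholder:
    {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Item": { "type": "string" },
          "Qty": { "type": "string" },
          "Status": { "type": "string" }
        }
      }
    }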

At this point, I go on to loop again with a for-each of this JSON array, and perform more steps for specific conditions (ignoring some rows based on value of the data found there for example). That’s not really the point of this post, although you can read about that in my earlier Automating a Manual Order Fulfillment Task post.
This flow totally satisfies my needs right now, but I will admit that it was easier and more robust (for variable definitions of robust) when I was leaning on the Encodian CSV Parsing API service to convert the CSV attachment directly to JSON. However, that service’s free tier didn’t offer enough usage to meet my needs, and spending an afternoon rolling my own CSV (or TSV) parsing in Power Automate saves the $500/year cost of purchasing their bare-minimum paid plan.