In Microsoft Flow it is possible to use the Excel connectors to read data from a spreadsheet. When you want to read large Excel files, however, you might run into issues.
Excel connector issues
The major issues with the Excel connector are:
- Your data needs to exist within Excel tables.
- Large tables take a long time to load (4000 rows takes about 2 minutes).
- The Excel connector expects hard-coded names for sites, libraries and Excel files.
- Excel files get locked as you configure flows, and it can take a while for the locks to be removed.
If the above is not a problem, please feel free to create a flow like this:
Read large Excel files
If you want more, however, you might want to look into the Graph API to read large Excel files. With the help of Flow Ninja John Liu's post on Twitter I got this all working.
I started by setting up my app permissions, which gave me a Tenant ID, Client ID and Secret that I then used in 3 variables. If you need some help setting this up you could look at the infoPulse site.
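Those three variables feed the standard client-credentials token call to the Microsoft identity platform. As a sketch of what that request looks like (the tenant, client and secret values below are placeholders, not real credentials):

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the URL and form body of a client-credentials token request
    against the Microsoft identity platform v2.0 endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default asks for the application permissions granted to the app.
        "scope": "https://graph.microsoft.com/.default",
    })
    return url, body

# Placeholder values for illustration only.
url, body = build_token_request("my-tenant-id", "my-client-id", "my-secret")
```

In the flow itself you don't need to build this by hand: the HTTP action's Active Directory OAuth authentication does the same thing with the three variables.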
Next I collect the site details from Microsoft Graph. This can be done by configuring an HTTP action as shown below. After the v1.0/sites bit, simply supply the URL of your subsite.
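The URL that the HTTP action calls follows the Graph site-by-path addressing scheme. A small sketch of how it is put together (the hostname and subsite path here are made-up examples):

```python
def site_lookup_url(hostname: str, site_path: str) -> str:
    """Build the Graph v1.0 URL that resolves a SharePoint subsite to a site resource.

    Graph addresses a site by path as /sites/{hostname}:/{server-relative-path}.
    """
    return f"https://graph.microsoft.com/v1.0/sites/{hostname}:/sites/{site_path}"

# Example values only; substitute your own tenant hostname and subsite name.
lookup = site_lookup_url("contoso.sharepoint.com", "MySubsite")
```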
This gives us the ID of the site, which we need to get the lists and libraries that we are interested in. Remember that you will need to specify the same authentication details as in the previous HTTP call.
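With the site ID in hand, the next Graph call asks for the site's lists (document libraries are included in this collection). A sketch of that URL, assuming the site ID came back from the previous call:

```python
def site_lists_url(site_id: str) -> str:
    """Build the Graph v1.0 URL that returns the lists and libraries of a site."""
    return f"https://graph.microsoft.com/v1.0/sites/{site_id}/lists"

# The site ID is a placeholder; in the flow it comes from the first HTTP call.
lists_url = site_lists_url("contoso.sharepoint.com,guid1,guid2")
```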
To get the ID (if it is not available in dynamic content) you can use the following expression.
Now I can use the returned ID for the library. I used a Compose action in between these actions to hold the library details.
To create this Compose action you might have to use a Filter array action to select the right document library from the HTTP 3 action.
For help on filtering arrays in Power Automate, please look at my post about array filtering. This should now give you the single library ID. I've created a compose action called Compose 2 to hold this library record.
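The Filter array step boils down to matching one library by its display name. A sketch of the same logic in Python (the library names and IDs below are invented for the example):

```python
# Sample of the libraries array returned by the lists call; values are made up.
libraries = [
    {"id": "lib-001", "displayName": "Site Pages"},
    {"id": "lib-002", "displayName": "Documents"},
    {"id": "lib-003", "displayName": "Style Library"},
]

def pick_library_id(items, name):
    """Return the ID of the single library whose displayName equals name,
    mirroring the Filter array action followed by taking the first item."""
    matches = [lib for lib in items if lib["displayName"] == name]
    if len(matches) != 1:
        raise ValueError(f"expected exactly one match for {name!r}, got {len(matches)}")
    return matches[0]["id"]
```

Raising when zero or multiple libraries match makes the flow fail loudly instead of silently picking the wrong library.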
This gives me a fully generic solution that lets me copy the flow between my Dev, UAT and production environments.
Then finally I can run through the Excel data with an Apply to each step. In this case I only used a Compose to collect each row, but most likely you would now do something with the data, like updating a SharePoint list.
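The workbook call (for example against the worksheet's usedRange) returns the cell data as a `values` array of rows, with the header as the first row. A sketch of turning that into per-row records ready for the "do something with the data" step; the header and data values here are invented:

```python
# Sample shape of a Graph workbook usedRange response; contents are made up.
used_range = {
    "values": [
        ["Title", "Amount"],   # header row
        ["First item", 10],
        ["Second item", 20],
    ]
}

# Split off the header and zip it with each data row to get dict records,
# one per spreadsheet row — the equivalent of the Apply to each iteration.
header, *rows = used_range["values"]
records = [dict(zip(header, row)) for row in rows]
```

Each record can then drive the per-row action, such as creating or updating a SharePoint list item.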
- No need for Excel Tables.
- Large tables are quick to load. (4000 rows takes about 4 seconds)
- My flow is a generic process that is easy to copy between environments.
- No file locking on my Excel files.