The history of a Dynamics 365 Business Central data processing workflow…

Once upon a time there was a workflow named SpongeFlow, created with the intention of processing complex data coming from different data sources, merging information between sources and, based on the result of this heavy processing, performing CRUD operations in Dynamics 365 Business Central.

Our workflow wandered around the cloud scared and lost for days, looking for someone to show it the correct tool to reach its goal.

One day our workflow met a very elegant Power Automate consultant who told him bewitching phrases like “with the Power Platform we will change the world!”, “with Power Automate we can manage any integration”, “I am an expert in low-code cloud solutions” and so on.

The consultant immediately proposed a brand new Power Automate machine to SpongeFlow.

SpongeFlow was fascinated by this consultant’s words and immediately jumped on the elegant new Power Automate machine. SpongeFlow did a short test drive with the new machine and everything went perfectly.

SpongeFlow was very happy with his new car. One fine day he said “OK, I’m now ready to face the difficult roads of the Production environment. Let’s go!”. And here he is, ready to run along the roads of Production.

But when the going got tough, SpongeFlow‘s car started having performance problems. “This machine is a slug!!” SpongeFlow said angrily to his Power Automate consultant.

The Power Automate consultant was desperate. “Oh my God, what do I do now? The car is terribly slow and I have no other means at my disposal.”

The Power Automate consultant then contacted an expert friend. The friend said “Hey… I have the right solution for your problems. This car is too slow because it is not aerodynamic. We need to upgrade to a more aerodynamic version. Let’s change the bodywork of the car: just take the engine of your Power Automate machine and mount it on the bodywork of a Ferrari Logic App. You’ll see that you’ll fly!”

SpongeFlow was doubtful about this solution, but agreed. The Power Automate consultant and his friend then took the Power Automate car engine and put it into the Ferrari Logic App bodywork. “Oh wow… now I will have a very performant machine!” said SpongeFlow.

But as soon as SpongeFlow tested the new car on the roads of Production, the results were not as expected. The car was only slightly faster than before…

The Power Automate consultant was desperate… “Help, I have no more solutions. It’s not possible to go faster than this. All I have to do is raise the white flag… I’ve failed.”

And then the Power Automate “I-Do-Everything-With-PowerFluffs” consultant was sent home by SpongeFlow in a few minutes.

SpongeFlow now was desperate… “Oh my god, I cannot go faster than this. I cannot reach my goal…“.

But one day SpongeFlow had an apparition: the Azure God appeared to him.

“Don’t worry, my friend SpongeFlow. There’s a solution for everything; you just need to use the right tools for what you really want to achieve. Don’t trust those who only offer what they know; broaden your mind…” the Azure God said.

The Azure God fired a magical bolt towards SpongeFlow, saying “This is the Logic App Standard bolt. It will give you the power and speed you need…”

Suddenly SpongeFlow felt full of energy. He took his Ferrari Logic App and began to drive through the streets of Production at full speed. He finally had the speed he needed to achieve all his goals!!

And now?

This story is just a metaphor that describes what often happens in real-world low-code workflow projects when the topic is integration between systems and complex data processing. Don’t trust those who want to solve all these tasks with Power Automate. Not because Power Automate is not powerful (it is, and it’s lovely!), but simply because when you need extreme performance and scalability, it is not the best low-code solution available.

Here I want to report the results of a real-world low-code workflow that mirrors this story exactly. The workflow was written with low-code technologies (this was a customer requirement). The timer-triggered flow retrieves data from 3 external systems (data coming in complex JSON and XML formats), then needs to parse and merge that data. The output of this complex operation is a JSON document containing a set of CRUD operations that must be performed in Dynamics 365 Business Central.
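To make the shape of this processing concrete, here is a minimal sketch of the parse-merge-plan step in Python. All field and element names (`itemNo`, `No`, `Price`, etc.) are hypothetical, invented for illustration; the real flow runs on low-code technology, not Python, so this only illustrates the logical steps: parse JSON and XML feeds, merge records by key, and emit the list of CRUD operations destined for Business Central.

```python
import json
import xml.etree.ElementTree as ET

def parse_source_a(json_text):
    """Parse the JSON feed from source A, keyed by item number (hypothetical schema)."""
    return {rec["itemNo"]: rec for rec in json.loads(json_text)["items"]}

def parse_source_b(xml_text):
    """Parse the XML feed from source B into a dict keyed by item number (hypothetical schema)."""
    root = ET.fromstring(xml_text)
    return {item.findtext("No"): {"price": float(item.findtext("Price"))}
            for item in root.iter("Item")}

def build_crud_plan(source_a, source_b, existing_ids):
    """Merge both sources and emit the CRUD operations to perform in Business Central."""
    ops = []
    for item_no, record in source_a.items():
        merged = {**record, **source_b.get(item_no, {})}
        # Update records Business Central already knows, create the rest.
        action = "update" if item_no in existing_ids else "create"
        ops.append({"action": action, "entity": "item", "data": merged})
    return ops

# Tiny usage example with inline sample data:
a = parse_source_a('{"items": [{"itemNo": "A1", "name": "Widget"}]}')
b = parse_source_b('<Items><Item><No>A1</No><Price>9.5</Price></Item></Items>')
plan = build_crud_plan(a, b, existing_ids={"A1"})
```

In the real flow this same merge logic is expressed with low-code actions (parse JSON, XML transformation, loops), which is exactly where the per-file execution time gets spent.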

The workflow was initially designed as a Power Automate flow by one of the previously mentioned “I-Do-All-With-Power-Stuffs” consultants. The workflow was tested with demo data and it performed quite well: an operation on a single set of data files took about 3–4 minutes to complete. Wonderful… 3 minutes × about 50 iterations = 150 minutes. Not great, but acceptable for the customer.

But in production, things changed. The volume of data files grew after a few months: the workflow now needed to process about 300 data files per day instead of the original 50. The result? At roughly 3 minutes per file, the flow took about 15 hours to complete!! Unacceptable! Another problem we saw is that the Power Automate flow frequently had spikes in its execution time.

The “I-Do-All-With-Power-Stuffs” consultant was sent home, and the new cloud partner (SD, alias me) needed to take care of the problem. The first test was moving this cloud flow to an Azure Logic App Consumption instance. We immediately saw a more reliable and constant execution time (no spikes), and there was a gain in execution time, but it was not significant enough for the customer.

This is the chart showing the time for processing a set of data files during the day with the Power Automate flow and with the same flow moved to an Azure Logic App Consumption instance:

As you can see from the chart, there’s a gain in execution time when moving to Azure Logic Apps Consumption, but honestly it’s not very significant (the gain per iteration ranged from a maximum of 20 seconds down to 2–3 seconds).

But then we did the magic, and we moved from the above result to the result you can see in the new chart below (see the yellow line):

Yes… you read that correctly. We moved from a low-code solution running in 3–4 minutes per data file to a low-code solution running in 4–6 seconds per data file.
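For readers who have never opened a Logic Apps Standard project: each workflow is defined in a `workflow.json` file. The fragment below is a minimal sketch of a stateful, timer-triggered workflow skeleton like the one described in this post; the actions themselves (parsing, merging, Business Central calls) are intentionally omitted and would need to be filled in for a real project.

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
          "frequency": "Day",
          "interval": 1
        }
      }
    },
    "actions": {},
    "outputs": {}
  },
  "kind": "Stateful"
}
```

A key difference from the Consumption model is the hosting: Logic Apps Standard runs on a dedicated App Service plan, which is one of the reasons execution times become far more predictable.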

The entire process at the customer’s site went from an execution time of 14–15 hours to less than 1 hour!

How did I do that?

The finale of this story will be presented LIVE at this session at Directions EMEA 2023 in Lyon, on November 1 at 11:15 AM:

The customer mentioned in this project will probably also be at the session.

If you’re passionate about workflows, or if you frequently work on integrating Dynamics 365 Business Central with different applications and those integrations require high performance and scalability, I encourage you to attend this session. You’ll probably see something new (and cool!) that could change your workflow implementations (and you could also enrich your laptop with a new sticker, because I have something to share with attendees…) 🙂

Always remember that Dynamics 365 Business Central ❤ Azure, and if you need performance you need to use the right tool for the job, not only the tool you already know.

Waiting for you in Lyon then…

P.S. Just to be clear to everyone: I ❤ Power Automate, and I don’t ❤ only Azure Logic Apps. I simply see too often that Power Automate is abused and not used for the right job.
