Azure Logic Apps and polling: are you always making the best choice?

How often in your real-world projects do you need to poll an external system or an external source for data? How often do your customers ask for something like “Every 10 minutes I want to receive data from source X into my Dynamics 365 Business Central environment for processing”? I think this is quite a common scenario.

When using Azure Logic Apps for integrations (because I assume here that you WILL NEVER DO THIS TYPE OF INTEGRATIONS USING AL AND JOB QUEUE!), a Recurrence trigger is often the solution.

To start and run your workflow on a schedule, you can use the generic Recurrence trigger as the first step. You can set a date, time, and time zone for starting the workflow and a recurrence for repeating that workflow. The Recurrence trigger is part of the built-in Schedule connector and runs natively on the Azure Logic Apps runtime.

If you want a workflow that runs every hour, you can set the Recurrence trigger’s frequency to “Hour” with an interval of 1, and then add your actions to the workflow.
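In the underlying workflow definition (Workflow Definition Language), that hourly recurrence looks roughly like this:

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
```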

The common situation I see is that these workflows are created in the Consumption plan. The pay-as-you-go pricing model charges your flow executions based on the triggers and actions specified in a logic app: every time a logic app definition runs, the trigger, action, and connector executions are metered.

For details about Logic Apps pricing, you can see the following page. At the time of writing this post, a workflow running in the West Europe region in the Consumption plan is billed as follows:

  • Actions: €0.000024 per execution, first 4000 actions per subscription (not per-workflow!) are free
  • Standard Connector: €0.000116 per call (Dynamics 365 Business Central connector is a Standard connector for Logic Apps)
  • Enterprise Connector: €0.000923 per call

Now let’s start with some simplified calculations based on the Recurrence trigger. Imagine a workflow that every X minutes polls a system for available new data, retrieves that data, parses the JSON, and calls Dynamics 365 Business Central. In this scenario we have a Recurrence trigger + 3 actions using 2 connectors (HTTP and Business Central).

Let’s do some rough calculations of the costs for the Recurrence trigger only (I don’t consider the free actions here).

For a workflow running once an hour, every day, for 30 days a month, the cost is about:

1 * 24 * 30 = 720 executions * 0.000024 = 0.01728 euro/month

Now let’s consider a scenario where a workflow needs to poll an external system for data every 10 seconds. The cost will be:

6 * 60 * 24 * 30 = 259200 executions * 0.000024 = 6.2208 euro/month

Now imagine having 20 of these polling workflows (for different scenarios). The monthly cost will be more than 124 euro, whether there is data to retrieve or not.
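The arithmetic above can be reproduced in a few lines (using the per-execution price quoted at the time of writing):

```python
PRICE_PER_ACTION = 0.000024  # euro per trigger/action execution (Consumption plan, West Europe)

hourly_runs = 1 * 24 * 30        # one run per hour, 30 days a month
every_10s_runs = 6 * 60 * 24 * 30  # six runs per minute, 30 days a month

print(hourly_runs, round(hourly_runs * PRICE_PER_ACTION, 5))        # 720 0.01728
print(every_10s_runs, round(every_10s_runs * PRICE_PER_ACTION, 4))  # 259200 6.2208
print(round(20 * every_10s_runs * PRICE_PER_ACTION, 3))             # 124.416 (20 polling workflows)
```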

These “polling costs” can sometimes become too high for some customers. The first recommendation I can give when using timer-triggered cloud workflows with Logic Apps is: avoid excessive polling! If you poll at a low frequency, the standard Azure Logic Apps Recurrence trigger is a great, low-cost choice.

But what can I do if polling is a business requirement and I cannot reduce the polling frequency? I absolutely need to poll every 10 seconds, but I want to reduce costs.

The solution can be the following:

  • Use a Timer Trigger Azure Function
  • Call your Logic Apps workflow from the Azure Function

The Azure Functions Consumption plan is billed based on per-second resource consumption and executions. Its pay-as-you-go pricing includes a monthly free grant of 1 million requests and 400,000 GB-s of resource consumption per subscription, across all function apps in that subscription.

How can I optimize frequent polling costs for my workflows with Azure Functions?

To use this approach, your Logic Apps workflow should not use the Recurrence trigger; instead, it should be triggered by an HTTP request (the “When an HTTP request is received” trigger).

You can define the body (schema) of the incoming HTTP request as needed.
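In the workflow definition this is a Request trigger; the schema below, expecting a hypothetical `records` array, is just an example:

```json
"triggers": {
    "manual": {
        "type": "Request",
        "kind": "Http",
        "inputs": {
            "schema": {
                "type": "object",
                "properties": {
                    "records": { "type": "array" }
                }
            }
        }
    }
}
```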

Then, you need to create a Timer Trigger Azure Function (triggered every 10 seconds, as per the business requirement above) and, from there, call your Azure Logic Apps workflow only when needed (only when you have data to process).
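A minimal sketch of such a function in Python (v1 programming model, where the 10-second schedule `*/10 * * * * *` lives in `function.json`); the two endpoints and the `has_new_data` check are hypothetical placeholders:

```python
import json
import logging
import urllib.request

# Hypothetical endpoints -- replace with your own.
POLL_URL = "https://example.com/api/newdata"          # external system to poll
WORKFLOW_URL = "https://example.com/workflow-trigger"  # Logic Apps HTTP trigger callback URL


def has_new_data(payload: dict) -> bool:
    """Return True when the polled payload contains records to process."""
    return bool(payload.get("records"))


def main(timer) -> None:
    """Entry point bound to a timer trigger (schedule defined in function.json)."""
    with urllib.request.urlopen(POLL_URL) as resp:
        payload = json.load(resp)

    if has_new_data(payload):
        # Only now do we trigger (and pay for) a Logic Apps workflow run.
        req = urllib.request.Request(
            WORKFLOW_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
    else:
        logging.info("No new data; skipping the Logic Apps call.")
```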

In this way our Logic Apps workflow is triggered only when really needed (when you have data to process). Instead of always having 259200 executions per month (a workflow running every 10 seconds, every day), if only 10000 of those polls find data, the Azure Logic Apps workflow runs only 10000 times. And you will save a lot of money!
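As a rough check of the saving (again counting only trigger executions, and assuming the Azure Function executions stay within the monthly free grant):

```python
PRICE_PER_ACTION = 0.000024  # euro (Consumption plan, West Europe, at the time of writing)

always_polling = 6 * 60 * 24 * 30  # 259200 Recurrence-triggered runs per month
only_when_data = 10000             # HTTP-triggered runs only when data is present

print(round(always_polling * PRICE_PER_ACTION, 4))  # 6.2208
print(round(only_when_data * PRICE_PER_ACTION, 2))  # 0.24
```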

The point of this post: I agree that low-code is wonderful, and we would always like to use low-code technologies when designing cloud workflows, but this is not always the best choice.

2 Comments

  1. Why should I “NEVER DO THIS TYPE OF INTEGRATIONS USING AL AND JOB QUEUE”?

    Why should I use an external service, i.e. add an extra layer of complexity for both maintainability (separate documentation, logs, monitoring, repo, etc. for Azure Functions and AL code) and execution?

    Granted job queue functionality is less than excellent, but for something being run every 10 minutes it’s more than adequate. Imo.


    1. The word “never” in the article was sarcastic. Generally speaking, it depends on the integration scenario. If you have to import lots of records from an external system every 10 minutes, I always prefer to avoid doing that internally for many reasons, especially scalability, performance, impact on users, and the limits on concurrent tasks on SaaS. On SaaS, offloading work from the tenant is always a great choice.

