Dynamics 365 Business Central: handling files via FTP on SaaS

Some time ago I wrote a post on how to save a file from Dynamics 365 Business Central SaaS to an FTP server. The solution can be found here.

After that post I received lots of requests about how to do the opposite: if someone uploads a file to an FTP server, how can I retrieve the file, parse it and then save the data into Dynamics 365 Business Central SaaS?

There are certainly different ways of doing that, but (as promised to some of you in the past few days) in this post I want to describe what I think is one of the best solutions in terms of performance, scalability and reliability. The schema of the solution is the following:

Here, we’re using an Azure Logic App to connect to an FTP server (via the FTP connector). The Logic App retrieves the content of the uploaded files and passes it to an Azure Function. The Azure Function is responsible for efficiently parsing the files (the file can be a simple CSV or maybe an XML, a JSON or a more complex format like PEPPOL or EDI, maybe with tons of rows). Once parsed, the content of the file is saved into Dynamics 365 Business Central by using APIs.

In the scenario described in this post, an external application uploads to the FTP server a ZIP package containing a big CSV file that we need to parse in order to load its data into Dynamics 365 Business Central. In more detail:

  • The external application uploads a .zip file containing a comma-delimited CSV file to process (the file is big, which is why it’s packaged as a ZIP file)
  • The Azure Logic App retrieves the ZIP package from the FTP server and loads it into Azure Blob Storage
  • The Azure Function:
    • unpacks the ZIP package and extracts the CSV file
    • processes the CSV file
    • saves the data into Dynamics 365 Business Central via APIs

Designing the workflow with an Azure Logic App

To interact with the FTP server from the cloud environment, let’s start a new Azure Logic Apps project (here created with the Consumption model). In the Logic App Designer, search for the FTP connector and select When a file is added or modified (properties only) as the trigger:

This will trigger our workflow when a file is added to the FTP server.

Set the FTP connection parameters:

and then configure the FTP folder to check:

You can also specify the number of files to return and how often the workflow must check for newly added items in the FTP folder.

Now that we have defined a trigger for our workflow, we need to define the actions that we want to execute when a file is added or modified on the FTP server.

Select New Step, search for FTP and under the FTP connector select the Get file metadata action:

In the Get File Metadata action configuration, click inside the File box and select the List of Files Id dynamic property:

In this way, this action will retrieve the unique Id of the files added (or modified) on the FTP server since the last run.

Now add a new action and select Get file content:

In the Get file content action configuration pane, click inside the File box and select Id from the Dynamic content:

This action will retrieve the content of the file whose unique id is passed as input.

Now that we have retrieved the file from the FTP server, we need to store it in an Azure Blob Storage account, and from there we want to call our Azure Function to process the file.

For this example, I assume that the uploaded file we want to manage is a CSV file (an external system uses FTP to pass data files in CSV format to Dynamics 365 Business Central), but the same approach can be used for any type of file you want to manage (this is the power of this solution).

Let’s add a new action to our workflow for loading the file into an Azure Blob Storage instance (which we have previously created; I’ll skip those steps here because they are really simple). Search for the Azure Blob Storage connector and then select the Create Blob (V2) action:

Once selected, you need to define the connection to your Azure Blob Storage account. You can use the Access Key, or you can use more advanced connection settings like Azure AD or Managed Identity (this is up to you):

When the storage connection is defined, select the storage account folder where the files will be saved, then set the Blob name property to the Path dynamic property (which comes from the FTP connector) and the Blob content property to the File Content dynamic property from the FTP connector:

If all is OK, we can now test the workflow to check whether the file is retrieved from the FTP server and then saved into our Azure Blob Storage account.

Let’s load a file into the FTP:

and after a bit you’ll see the workflow start. You can monitor the execution directly from the Azure Portal, and if all actions are executed successfully (green check) you will have the file loaded into the Azure Storage account:

Now we want to trigger an Azure Function directly from the workflow; this function will be responsible for managing the loaded file (unpacking the ZIP file, extracting the CSV, parsing it and saving its values to Dynamics 365 Business Central).

To do that, you can select the Azure Functions action (one of the coolest features of Logic Apps):

and then set the Azure Function you want to trigger (with the appropriate request body for calling it). Here, for example, I have an Azure Function that expects the name of the package to process as a parameter (request body in JSON format), so I trigger it by passing the Path parameter from the FTP connector, and then I delete the file from the FTP server:
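For example, the request body could simply look like this (shown here with a sample file path; in the designer, the value is the Path dynamic token from the FTP connector):

```json
{
  "package": "/inbound/customers.zip"
}
```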

Designing the Azure Function for processing the file

Before starting, someone could ask: why use an Azure Function for the file parsing task?

Because handling large files with code is much more efficient, especially for complex types of files (in this way I’m able to parse not only CSV files but also XML, JSON and other more complex formats that are extremely difficult to handle with low-code approaches). And you don’t need premium or paid connectors for that.

You can use Logic Apps for parsing CSV files or other text files, but it’s not so easy and it’s also not so performant, especially with large files.

What about this Azure Function code?

In the real world, I have an Azure Function app that contains different functions for processing different types of files. To keep things simple here, I provide a skeleton of a function that processes CSV files.

The Azure Function is an HttpTrigger function (here called ParseCSVPackage) that supports only the POST HTTP method. The function is defined as follows:
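Here is a minimal sketch of how such a function could look (C#, in-process model). The function name, the POST-only restriction and the package parameter are the ones described in this post; the surrounding plumbing is simplified and illustrative.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

// "partial" so that the helper methods shown later in this post
// can live in the same class.
public static partial class ParseCSVPackage
{
    [FunctionName("ParseCSVPackage")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // Read the JSON request body and extract the "package" parameter
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        string package = data?.package;

        if (string.IsNullOrEmpty(package))
            return new BadRequestObjectResult("Please pass a 'package' parameter in the request body.");

        log.LogInformation($"Processing package {package}");
        await HandleCSVPackage(package, log);

        return new OkObjectResult($"Package {package} processed.");
    }
}
```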

The function retrieves the package parameter from the request body and then calls an internal function called HandleCSVPackage.

The HandleCSVPackage private function downloads the ZIP package (passed as input) from the Blob Storage, unpacks it and loads its data into a DataTable in memory. The code is as follows (in this sample the CSV contains Customer data and we create Customer records in Dynamics 365 Business Central):
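What follows is a sketch rather than the real production code: I’m assuming here the Azure.Storage.Blobs client library, a connection string stored in a StorageConnectionString application setting, a container called ftpfiles and a two-column CSV with a header row. Adapt these names to your environment.

```csharp
using System.Data;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Extensions.Logging;

public static partial class ParseCSVPackage
{
    private static async Task HandleCSVPackage(string package, ILogger log)
    {
        // Download the ZIP package from Azure Blob Storage
        // (container and setting names are illustrative)
        var blobClient = new BlobClient(
            System.Environment.GetEnvironmentVariable("StorageConnectionString"),
            "ftpfiles",
            package);

        using var zipStream = new MemoryStream();
        await blobClient.DownloadToAsync(zipStream);
        zipStream.Position = 0;

        // The DataTable that will hold the parsed CSV rows
        var table = new DataTable();
        table.Columns.Add("displayName");
        table.Columns.Add("phoneNumber");

        // Unpack the ZIP and parse every CSV file it contains
        using var archive = new ZipArchive(zipStream, ZipArchiveMode.Read);
        foreach (var entry in archive.Entries)
        {
            if (!entry.Name.EndsWith(".csv")) continue;

            using var reader = new StreamReader(entry.Open());
            await reader.ReadLineAsync(); // skip the header row (assumed)

            string line;
            while ((line = await reader.ReadLineAsync()) != null)
            {
                var fields = line.Split(',');
                if (fields.Length < 2) continue;
                table.Rows.Add(fields[0], fields[1]);
            }
        }

        await SaveDataToD365BC(table, log);
    }
}
```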

When the CSV package is parsed and its content is loaded into a DataTable, a method called SaveDataToD365BC is called. This method is responsible for making an API call to Dynamics 365 Business Central for every record in the DataTable. The code is as follows:
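A sketch of this method could look like the following: one POST against the standard Business Central customers API per DataTable row. The URL placeholders and the bearer token (read here from a BCAccessToken setting) are simplifications; in a real app you would acquire the token via OAuth (client credentials flow) and handle throttling and retries.

```csharp
using System.Data;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static partial class ParseCSVPackage
{
    private static readonly HttpClient httpClient = new HttpClient();

    private static async Task SaveDataToD365BC(DataTable table, ILogger log)
    {
        // Standard Business Central customers API endpoint:
        // replace the {placeholders} with your tenant, environment and company id.
        string url = "https://api.businesscentral.dynamics.com/v2.0/" +
                     "{tenantId}/{environmentName}/api/v2.0/companies({companyId})/customers";

        // Simplification: token read from an app setting.
        httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Bearer", System.Environment.GetEnvironmentVariable("BCAccessToken"));

        foreach (DataRow row in table.Rows)
        {
            var customer = new
            {
                displayName = row["displayName"].ToString(),
                phoneNumber = row["phoneNumber"].ToString()
            };

            var content = new StringContent(
                JsonConvert.SerializeObject(customer), Encoding.UTF8, "application/json");

            var response = await httpClient.PostAsync(url, content);
            if (!response.IsSuccessStatusCode)
                log.LogError($"Failed to create customer {customer.displayName}: {response.StatusCode}");
        }
    }
}
```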

Please note that this code is only for demo/idea purposes; it’s not the real full code we have in production (you need to handle exceptions and logging properly in a real production app).

Advantages of this solution:

  • Good performance
  • Scalability
  • Monitoring (you can use built-in monitoring tools and/or Application Insights)
  • Low cost
  • No licenses needed
  • Reliability
  • Redundancy support

I will never stop saying it: please don’t do every kind of task with the AL language!

As a conclusion of this post, someone could ask: why not use Power Automate for interacting with the FTP server?

I think the answer can be summarized in the above bullet points, but for a more exhaustive answer, please wait for the next post. 🙂
