I think that many of you, on your Dynamics NAV or Dynamics 365 Business Central projects, have tasks executed by console applications (.exe) scheduled with Windows Task Scheduler or other systems. This works well for on-premises environments (just take your .exe application, create a new schedule with Windows Task Scheduler and you're ready to go), but what about using these .exe applications in a SaaS environment?
If you want to periodically execute your custom .exe tasks in a cloud environment, using an Azure Virtual Machine for this purpose is a possible option, but (as you can imagine) it's neither the best nor the cheapest one: you pay for the VM all the time, and spinning up an Azure VM only to host console applications and trigger them with Windows Task Scheduler is hard to justify.
How to do that in a more efficient way? With Azure Functions, you can run your .exe applications in the cloud and pay only for the resources they consume while the workload is running (maybe a few seconds; the first million executions per month are free), without creating and maintaining environments (like Azure VMs) and (most importantly) without changing anything in your .exe code. And you can do it in a few minutes with just drag & drop. Let's see how.
As a first step, we create a new Function app via the Azure Portal, selecting PowerShell Core as the Runtime stack:

In the Hosting tab, select Plan type = Consumption (you pay only when the function is running) and then click Create to deploy your Function app:

When the Function app is created, you can see the new instance in the Function App panel of your Azure subscription. The instance is empty and we need to create a new Azure Function. We can do that manually by using the KUDU Console (link available under Platform features|Development Tools|Advanced Tools (KUDU), or by directly appending .scm to the function URL):

If you select CMD and navigate to the D:\home\site\wwwroot folder, you can see that the Function app is empty (only host.json is present):

Here we need to create a new Azure Function. An Azure Function is simply composed of a folder (named after your function) that contains a function.json file and your function code. So, just create a new folder (here called AzureFunctionConsoleApp) and inside this folder place 3 files (drag & drop from your local machine; see the layout sketch after this list):
- AzureFunctionConsoleApp.exe: the .exe file that you want to run in the cloud
- function.json: the function definition (bindings)
- run.ps1: the PowerShell script that is automatically executed when your Azure Function runs
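To make the expected structure clearer, here is a sketch of the resulting layout on the Function app file system (folder and file names are the ones used in this example):

D:\home\site\wwwroot
    host.json
    AzureFunctionConsoleApp
        AzureFunctionConsoleApp.exe
        function.json
        run.ps1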

The function.json file is defined as follows:
{ "bindings": [ { "name": "Timer", "type": "timerTrigger", "direction": "in", "schedule": "*/60 * * * * *" } ], "disabled": false }
We have defined a TimerTrigger Azure Function that runs every minute (the schedule field uses the standard six-field NCRONTAB format, with seconds as the first field).
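If you need a different schedule, the same field accepts any NCRONTAB expression ({second} {minute} {hour} {day} {month} {day-of-week}). A few illustrative examples (not part of this setup):

"schedule": "0 */5 * * * *"    (every 5 minutes)
"schedule": "0 30 9 * * 1-5"   (at 9:30 on weekdays)
"schedule": "0 0 2 * * *"      (every day at 02:00)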
The run.ps1 PowerShell file is defined as follows:
param($Timer, $TriggerMetadata)   # input bindings declared in function.json
Write-Output "Executing AzureFunctionConsoleApp.exe at: $(get-date)"
# Move to the function folder and launch the console application
Set-Location "D:\home\site\wwwroot\AzureFunctionConsoleApp"
.\AzureFunctionConsoleApp.exe
The script reads the input parameter binding (this is needed to avoid an error like "No parameter defined in the script or function for the input binding XXX") and then executes the AzureFunctionConsoleApp.exe file.
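As an optional refinement (a minimal sketch, assuming your .exe signals failure with a non-zero exit code), you can check $LASTEXITCODE after the call so that a failed run is reported in the function log:

.\AzureFunctionConsoleApp.exe
if ($LASTEXITCODE -ne 0) {
    # Fail the invocation so the error shows up in the Azure Function log
    throw "AzureFunctionConsoleApp.exe exited with code $LASTEXITCODE"
}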
When these files are deployed, the function is immediately executed in your Azure environment and your .exe application is started.
You can also use the Azure Function Application settings to declare variable parameters that you need to use in (or pass to) your .exe application (here, for example, I've declared a parameter called D365BCEndpoint):

This is quite useful when you need to manage parameters that could change in the future, because you can change them easily via the Azure Portal:

In your .exe application, you can directly read these Application settings parameters (they are exposed as environment variables). In my AzureFunctionConsoleApp.exe application, for example, I read the previously declared parameter with:
Console.WriteLine($"Connection to D365BC URL {Environment.GetEnvironmentVariable("D365BCEndpoint")}!");
In this way, your .exe file is executed in the cloud and you can monitor it with the standard Azure Function log:

or (better) with Application Insights (which gives you a Live Metrics Stream for real-time monitoring):

This is a quick and easy way to execute .exe applications in the cloud and pay only for the time they're running. If you have this need, I suggest you give it a try.
Hi Demiliani,
I have uploaded a zip to the Azure Function App and it runs fine, unzipping and untarring a .tar.gz file when it's uploaded to the function itself (in testing). However, if I run the function on files saved in a Blob container, it doesn't work and throws an error.
2020-09-11T16:06:42.721 [Error] ERROR: Program '7za.exe' failed to run: StandardOutputEncoding is only supported when standard output is redirected.
At D:\home\site\wwwroot\tools\run.ps1:9 char:1
+ .\7za.exe e $InputBlob
+ ~~~~~~~~~~~~~~~~~~~~~~
Exception type: System.Management.Automation.ApplicationFailedException
FullyQualifiedErrorId: NativeCommandFailed
ScriptName: D:\home\site\wwwroot\tools\run.ps1
Line: .\7za.exe e $InputBlob
Any thoughts on how to fix this? Thank you very much!
Can you please explain in detail what you're trying to do? Quite a strange error…
Hi Demiliani – thank you for the reply
I have many, many .tar.gz files that I need to load into an Azure SQL database. I need to decompress and unpack them in order to do the ETL in Azure Data Factory. ADF can decompress, but won't unpack the .tar files. So, I uploaded the 7-Zip standalone executable with the 7z.dll files to the Azure Function app and tested it on a .tar.gz file that I uploaded to the function itself. However, when I try to test the Function App on files stored in an Azure Blob container, I get the error above.
Here’s the content of run.ps1
Set-Location D:\home\site\wwwroot\PMHCpmBlob
# The command below works fine for the file I uploaded to the function
.\7za.exe x 1.tar.gz
# The command below throws the error
.\7za.exe x $InputBlob
Thank you again.
It seems that the $InputBlob variable has a value that cannot be found.
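If that's the case, a possible workaround (just a sketch; it assumes $InputBlob is the blob content delivered as a byte array by the binding, while 7za.exe needs a real file path) is to write the content to a temporary file first:

# Hypothetical file name: write the blob bytes to a temp file for 7za.exe
$tmp = Join-Path $env:TEMP "input.tar.gz"
[System.IO.File]::WriteAllBytes($tmp, $InputBlob)
.\7za.exe x $tmp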
Excellent article.
One question about it: what if the executable needs a library (a DLL, for example)? Can you upload it with the executable, in the same folder?
Yes, you can do that: just place the DLL in the same folder as the .exe.
What if I want to execute a 3rd party .exe like docfx.exe?
Is a Function App a feasible solution?
Yes, you can also run 3rd party .exe files as described.
My scenario is that I need to download a complete git project and run docfx.exe.
Instead of a VM, I'm thinking of using Blob storage to save my git repo and my docfx.exe files, and then run it using a Function App. Any inputs from your end?