Logic App Template Creator Updates January 2019

Updates to the Logic Apps Template Creator have been published:

A minor thing for day-to-day usage, but great for the consistency and quality of the generator: there is now a build and release pipeline set up in DevOps.

  • Added Build & Release via DevOps to increase quality of merged sprints and automate releases to the PowerShell Gallery LogicAppTemplate
  • Improved support for connectors to Storage Tables and Queues
  • Added a cmdlet to generate an ARM template for Integration Account maps

The LogicAppTemplate module is now updated more frequently, since I’ve added a Build and Release setup that publishes new releases to the PowerShell Gallery.

PS> Install-Module -Name LogicAppTemplate

Or update to the newest version

PS> Update-Module -Name LogicAppTemplate

Storage Connector Tables and Queues

These are now generated the same way as the Storage Blob connector, meaning that the key is collected at deployment time based on the storage account name, instead of having to be provided as a parameter. This makes deployments simpler and neater.

Three parameters are added:

    "azureblob_name": {
      "type": "string",
      "defaultValue": "azureblob"
    },
    "azureblob_displayName": {
      "type": "string",
      "defaultValue": "myDisplayName"
    },
    "azureblob_accountName": {
      "type": "string",
      "defaultValue": "myStorageAccountName",
      "metadata": {
        "description": "Name of the storage account the connector should use."
      }
    }

And they are later used in the connection to get the accountKey automatically during deployment.

    {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('azureblob_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/',subscription().subscriptionId,'/providers/Microsoft.Web/locations/',parameters('logicAppLocation'),'/managedApis/azureblob')]"
        },
        "displayName": "[parameters('azureblob_displayName')]",
        "parameterValues": {
          "accountName": "[parameters('azureblob_accountName')]",
          "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
        }
      }
    }

All the magic is in the listKeys part, which tells ARM to collect the key based on a reference to the storage account. (This also means that the account performing the resource group deployment needs access to the storage account.)

[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]

New cmdlet Get-IntegrationAccountMapTemplate

Another improvement is that we can now extract a map from an Integration Account into a directly deployable ARM template. It works similarly to the other extractions, and below is a sample of how to get an Integration Account map as an ARM template.

Get-IntegrationAccountMapTemplate -ArtifactName 'mapname' -IntegrationAccount 'ianame' -ResourceGroup 'myresourcegroup' -SubscriptionId 'guid()' -TenantName 'mattiaslogdberg.onmicrosoft.com'


Exporting Logic Apps via the Logic App Template Extractor simplifies the CI/CD side of using Logic Apps: develop inside the portal, extract the result, and set up the import to the next environment, without any extra manual work.

Posted in: | API Management  | Tagged: | LogicApps  | Integration  | ARM Templates  | ARM 

API Management Template Creator Updates December 2018

Updates to the API Management Template Creator have been published:

I’ve got a lot of help this time from some awesome contributors, a big thanks to them! NilsHedstrom, Geronius, joeyeng, Lfalk.

A minor thing for day-to-day usage, but great for the consistency and quality of the generator: there is now a build and release pipeline set up in DevOps.

  • Improved support for Identity Providers, Products and more
  • Added automation for Build & Release via DevOps to increase quality of merged sprints and automate releases to the PowerShell Gallery
  • Added to PowerShell Gallery: APIManagementTemplate
  • Split the template into one template per resource to follow API Management DevOps best practices

Now it’s much easier to get started; just install the APIManagementTemplate module from the PowerShell Gallery.

PS> Install-Module -Name APIManagementTemplate

Or update to the newest version

PS> Update-Module -Name APIManagementTemplate

Split the template into one template per resource

In order to follow the best practice from the Azure API Management DevOps example, there is now a new command, Write-APIManagementTemplates. It takes a template as input and splits it into one file per resource, making the files easier to manage and enabling more customized deployments with a linked template. Read more at GitHub

 Get-APIManagementTemplate -APIManagement MyApiManagementInstance -ResourceGroup myResourceGroup -SubscriptionId 80d4fe69-xxxx-4dd2-a938-9250f1c8ab03 | Write-APIManagementTemplates -OutputDirectory C:\temp\templates -SeparatePolicyFile $true


Exporting APIs via the API Management Template Extractor simplifies the CI side of API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. No extra manual work is needed; develop inside the portal, extract the result, and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 

Innovation enabled by Integration

Normally we focus on technical implementations and problem solving but from time to time we get the opportunity to expand the view and focus on innovative thinking. This is the time where we can show our customers the real value of applying good architecture and principles. Currently I’m having so much fun doing innovative prototyping work with one of my customers that I need to share it with you. We are using Power Apps and Flow for our prototyping in order to get prototypes done fast and easy, without spending too much time and effort.

Power Apps is a platform where you can easily build apps in a point-and-click approach, targeting multiple device types at the same time; read more at: https://powerapps.microsoft.com/

Flow is a lightweight integration platform that runs on top of Logic Apps; it’s designed to make it easy to create your own flows and turn repetitive tasks into automated workflows. Read more at https://emea.flow.microsoft.com/

These platforms are very easy to get started with, and there are several sample applications around Office 365 and common applications. But the real power for innovation becomes clear if you have a good API landscape that can be used when building these applications.

We have a customer we started working with almost a year ago, and now that all the hard work is done and we have a good integration landscape, we can harvest the benefits of our architecture. I started by showing the case described below, and now they really hunger for more of this kind of innovation, both for internal and external use. It’s wonderful to see how our hard work pays off, showing that we really are the “integration heroes”: we are the enablers making this possible.

Architecture that enables innovation?

In reality it’s quite simple: create a platform that contains APIs and flows that can be reused. Make sure to build domain-model-based APIs and function-driven APIs, and focus on making them simple to use and, most importantly, reusable. Here is a sample architecture:

Integration Architecture Sample

And in our API Management instance we have created a few domain model API’s:

API List Sample

A few months back we helped replace an internal system for product information; it was a system where you could enter a serial number and get basic information about the product. The end solution was a website that used a couple of APIs to retrieve the information needed. The fun part was that most of the APIs were already created in other projects for similar needs, just used from other systems and apps. All we did was fine-tune an API, leaving us with time to innovate. On a computer it makes sense to type in a serial number, but on a mobile phone we expect to be able to scan the serial. Since we had the time, we started to investigate the problems around this and whether it was solvable.

We started to look into this, and the first “problem” was that the serial plates on the machines had no barcodes, only plain text:

Serial Plate Sample

Luckily this capability exists in Azure Cognitive Services; read more at Azure Cognitive Services

We started to test this with a Power App that used a Flow to send the image to the Cognitive Service. The Power App solution is very simple: it uses the camera to take a picture and then sends the picture to a Flow that utilizes the Cognitive Service to identify the serial number from the plate. The result from the Flow is a URL with the serial number that we use to launch the website, making use of the web app. With this we had expanded the functionality with a serial plate scanner and provided a basis for the decision on investing in a mobile app.

Here is the flow:

Microsoft Flow Overview

Here is the Power App in the Power Apps Studio developed in my browser:

Microsoft Power App

As you can see, there is not much going on but a Camera item with an OnSelect statement that launches the web browser with the URL that comes back from our Flow “Get-Path-To-Product”.

The flow is also quite simple:

Microsoft Flow Solution

The tricky part was how to send the image to the Cognitive Services; in this case we created a Compose action with a dataUriToBinary function:

Microsoft Flow Action Compose: dataUriToBinary(triggerBody()['Skapa_Indata'])
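Under the hood that Compose step just decodes a data URI into raw bytes. A minimal Python sketch of the same transformation (the function name and sample payload are illustrative, not taken from the Flow):

```python
import base64

def data_uri_to_binary(data_uri: str) -> bytes:
    """Decode a data URI (like the one the Power Apps camera control produces) into raw bytes."""
    # A data URI looks like: data:image/jpeg;base64,<base64 payload>
    header, _, payload = data_uri.partition(",")
    if "base64" not in header:
        raise ValueError("expected a base64-encoded data URI")
    return base64.b64decode(payload)

# An illustrative payload standing in for a camera image
sample = "data:image/jpeg;base64," + base64.b64encode(b"\xff\xd8\xff").decode()
print(data_uri_to_binary(sample))  # b'\xff\xd8\xff'
```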

And in the HTTP action we send the binary data to the OCR service (in the sample below the OCR service is behind our API Management instance):

Microsoft Flow Action HTTP

All in all, the prototype was easy and fast to build, but most importantly it successfully gave us the power to demo the concept, providing a visual demo showing the value of an app that can scan the serial plate. This gave decision makers a good understanding of the benefits of investing in a native app, without spending too much time and resources.


The app is currently under development, and we have more of these innovative prototypes coming up, prototyping new or upgraded functionality in a variety of sizes. The best part of this approach is that we can create a prototype in a day or two, giving decision makers a visual and testable solution to review. All thanks to our hard work with APIs and workflows, and the amazing tools Power Apps and Flow!

Become the Integration Hero that you really can be and let that unused innovation power free!

Help Business Evolve

Posted in: | Innovation  | Power Apps  | Flow  | Tagged: | Innovation  | Integration  | Architecture  | Integration Architecture  | Power Apps  | Flow 

API Management Template Creator Updates August 2018

Updates to the API Management Template Creator were published a few weeks back:

A minor thing for usage but great for consistency and quality in the generator is that a lot of tests have been added.

  • Functions support added
  • Updated Logic App Support
  • Removed some name parameter generation

Functions support added

Finally, support for APIs generated from Functions is live. An API generated from Functions now gets the appropriate automatic generation of function keys via ARM functions, meaning that you don’t need to supply the function key in the parameter files; it is collected automatically via the listSecrets function in ARM.


    {
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/5b41934c6d0f59440d20c5ee",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_ibizmalo_name'),'/','5b41934c6d0f59440d20c5ee')]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "maloapimtest_HttpTriggerAdminKey_query_5b41934c6d0f59440d20c5ee",
        "value": "[listsecrets(resourceId(parameters('FunctionApp_maloapimtest_resourceGroup'),'Microsoft.Web/sites/functions', parameters('FunctionApp_maloapimtest_siteName'), parameters('operations_api-HttpTriggerAdminKey-post_name')),'2015-08-01').key]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }


    "parameters": {
      "INT2051functionkey_value": {
        "type": "securestring",
        "defaultValue": "Aze/1W8wCwItuP8JacdyHa2vDw8YScrOkbq6uHcTXiOasb2wi3kZoQ=="
      }
    }
	. . . 
    {
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int2051functionkey",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_cpidev_name'), '/' ,parameters('property_INT2051functionkey_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "INT2051functionkey",
        "value": "[parameters('INT2051functionkey_value')]",
        "tags": null,
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Updated Logic App Support

Two bigger items have been targeted to improve the export and later the deployment experience for APIM APIs that use Logic Apps.

Trigger name fix

A Logic App trigger can now have any name; previously it was hardcoded to ‘Manual’. Logic is now added to retrieve the information at extraction time: the extractor fetches the Logic App definition and finds the HTTP trigger to get the correct name.

The trigger name is extracted from the Logic App and added to the ARM functions (‘customtriggername’ in the sample below):

Logic App Trigger name

Will result in (note the ‘customtriggername’):

"[listCallbackUrl(resourceId(parameters('LogicApp_customtrigger_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_customtrigger_logicAppName'), 'customtriggername'), '2017-07-01').queries.sig]"

Add Logic App as an operation to an existing API fix

There has been different behavior between APIs generated from a Logic App and a Logic App imported as a new operation into an existing API. The main difference has been that the sig has been added as a “normal” parameter, with no ARM functionality added to generate it. This is now addressed, and the appropriate ARM functions are added and generated automatically for both types.

Properties will get value from the [ListCallBackUrl] function:

    {
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int7100-payment-to-o-test_manual-invoke_5b349f2a42974a989226cf33",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_apidev_name'), '/' ,parameters('property_int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33",
        "value": "[listCallbackUrl(resourceId(parameters('LogicApp_INT7100-Payment-To-o-DEV_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_INT7100-Payment-To-o-DEV_logicAppName'), 'request'),'2017-07-01').queries.sig]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Removed some name parameters

Parameters generated for the names of resources like policies, version sets, and operations have no real use case or benefit associated with changing the name, and are therefore removed to simplify the ARM templates.

Sample of parameters that are now removed:

    "versionset_5af9817ca656c6952416b779_name": {
      "type": "string",
      "defaultValue": "5af9817ca656c6952416b779"
    },
    "operations_GetSalesOrderById_name": {
      "type": "string",
      "defaultValue": "GetSalesOrderById"
    },
    "operations_policy_name": {
      "type": "string",
      "defaultValue": "policy"
    }


Exporting APIs via the API Management Template Extractor simplifies the CI side of API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. No extra manual work is needed; develop inside the portal, extract the result, and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 

API Management Template Creator Updates May 2018

Updates to the API Management Template Creator have been dragging, but now I’m pleased to have fixed the two most urgent items.

  • Support for Versions, with version sets
  • Schemas

Support for Versions, with Version Sets

If an extracted API has versions enabled, the current version set is also extracted and provided inside the ARM template. The version set states the version that is used and is needed by the API to be able to support versions; it contains the information about how versioning is handled for the specific API. The sample below uses an HTTP header named API-Version to set the version of the API; the version itself is then set on the API.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/api-version-sets

    {
      "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/api-version-sets/5ae6f90fc96f5f1090700732",
      "type": "Microsoft.ApiManagement/service/api-version-sets",
      "name": "[concat(parameters('service_dev_name'), '/' ,parameters('versionset_5ae6f90fc96f5f1090700732_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "Arkivet",
        "description": null,
        "versioningScheme": "Header",
        "versionQueryName": null,
        "versionHeaderName": "API-Version"
      },
      "resources": [],
      "dependsOn": []
    }

Support for Schema

If an API has schemas added to operations, these schemas are extracted and added to the ARM template; all operations also get a “dependsOn” on the specific schema to prevent errors when executing the ARM template. The sample below is a simple schema that is added in the ARM template and referenced in the operations section.
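As a sketch of what that dependency looks like (the operation name and empty properties are illustrative; the schema id is the one from the sample below), an operation resource references the schema in its dependsOn array:

```json
{
  "type": "Microsoft.ApiManagement/service/apis/operations",
  "name": "[concat(parameters('service_dev_name'),'/',parameters('api_arkivet_name'),'/post-operation')]",
  "apiVersion": "2017-03-01",
  "properties": { },
  "dependsOn": [
    "[resourceId('Microsoft.ApiManagement/service/apis/schemas', parameters('service_dev_name'), parameters('api_arkivet_name'), '5af038365b73730dd01453ad')]"
  ]
}
```

This ensures ARM deploys the schema before any operation that references it.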

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/apis/schemas

          {
            "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/apis/arkivet/schemas/5af038365b73730dd01453ad",
            "type": "Microsoft.ApiManagement/service/apis/schemas",
            "name": "[concat(parameters('service_dev_name'),'/',parameters('api_arkivet_name'), '/5af038365b73730dd01453ad')]",
            "apiVersion": "2017-03-01",
            "properties": {
              "contentType": "application/vnd.ms-azure-apim.swagger.definitions+json",
              "document": {
                "definitions": {
                  "Body": {
                    "type": "object",
                    "properties": {
                      "No": {
                        "type": "string"
                      },
                      "ReportedDate": {
                        "type": "string",
                        "format": "date-time"
                      }
                    }
                  },
                  "BodyExists": {
                    "type": "object",
                    "properties": {
                      "CivicNo": {
                        "type": "string"
                      }
                    }
                  },
                  "Post400ApplicationJsonResponse": {
                    "type": "object",
                    "properties": {
                      "message": {
                        "type": "string"
                      },
                      "description": {
                        "type": "string"
                      },
                      "errors": {
                        "type": "array",
                        "items": {
                          "type": "object",
                          "properties": {
                            "field": {
                              "type": "string"
                            },
                            "message": {
                              "type": "string"
                            }
                          }
                        }
                      },
                      "hasErrors": {
                        "type": "boolean"
                      }
                    }
                  },
                  "ExistsPost200ApplicationJsonResponse": {
                    "type": "object",
                    "properties": {
                      "arkivet": {
                        "type": "string"
                      }
                    }
                  }
                }
              }
            },
            "resources": [],
            "dependsOn": []
          }


Exporting APIs via the API Management Template Extractor simplifies the CI side of API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. No extra manual work is needed; develop inside the portal, extract the result, and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 

Access AAD Secured Web API's from API Management

Protecting Web Apps and Web APIs with the built-in Authentication and Authorization in Azure App Service is a great way to protect resources without adding code to handle the authorization. This means that the site or API is fully secured without implementing anything yourself, which is a great example of separation of concerns. Read more on how it works

What we then need is to access the API, and in this case we need to access it from API Management using service principals, since the client using our APIs in API Management is outside our organization and should not be bothered by our internal security mechanisms.

The scenario: the API is protected by Azure AD, and in order to access it we need to add a valid JWT token to the Authorization header in the request. First (1) we need to get the JWT token, and then (2) add it in the request to the API App.

Scenario Overview

In this post we have used the standard generated API App from Visual Studio and published it to a Web API instance in Azure named Protected.

Enabling the built-in Authentication and Authorization for the API

In the Azure Portal head to the API App and go in to the settings tab “Authentication/Authorization”. Enable the App Service Authentication by pressing the On button.

Enable Authentication/Authorization

Now we can enable several different providers, and in this demo we will focus on Azure Active Directory; press Active Directory.

Configure Azure AD Authentication

After choosing Azure Active Directory, we are presented with two options to configure the setup. Express gives some guidance and the possibility to create or select an existing AD application, while in Advanced we can enter the values needed: the AAD application client id and the issuer, normally https://sts.windows.net/{tenantguid}/ with the tenant guid filled in. In this setup we will use the Express option and create a new AD application.

Here we can choose to use an existing AD application or create a new one; I will create a new one named ProtectedAppAAD.

New Azure AD App

After this is installed we can now save and test that the API is protected. If you open a new browser or log out from the account you should be forced to login and the site is now secured.

Login Sign

So now the site is securely protected and we can access it when logged in. Next we need to be able to access the API via API Management.

Provide access to the API

The API is now protected, and we need to provide a way for API Management to access it. We will use another AAD application (App Registration) and grant it access to ProtectedAppAAD. So we create a new application, named APIManagementAADApp; the sign-in URL can be http://localhost.

Create a new AAD Application

After the AAD application is created, we need to give it access to ProtectedAppAAD; this is done via permissions. To assign permissions, go to Settings, press Required Permissions, and then Add.

AAD Application add permission

Under Select an API we need to search for ProtectedAppAAD, since at first the list only shows Microsoft standard APIs. Enter ProtectedAppAAD in the text box and press Select.

AAD Application add permission select API

Now we need to select the permission, and the one we want is Access; select it, press Select, and finish the setup by pressing Done.

AAD Application add permission select API

The last step is pressing Grant Permissions to enable the permissions (do not forget this!).

AAD Application add permission select API

API Management Policy for Acquiring a JWT Token

In order to expose this API we need to get a token from AAD using the application. This will be done inside a policy, and luckily for us the API Management team has provided a set of code snippets on GitHub, one of which does exactly that; get it here.

There are more of them in the repository on GitHub, and if you have a great snippet you want to share you can open a pull request; if the team approves it, it will end up there. Here is the GitHub repository: https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/.

The Get OAuth2 access token from AAD snippet looks like this.

<!-- The policy defined in this file provides an example of using OAuth2 for authorization between the gateway and a backend. -->
<!-- It shows how to obtain an access token from AAD and forward it to the backend. -->

<!-- Send request to AAD to obtain a bearer token -->
<!-- Parameters: authorizationServer - format https://login.windows.net/TENANT-GUID/oauth2/token -->
<!-- Parameters: scope - a URI encoded scope value -->
<!-- Parameters: clientId - an id obtained during app registration -->
<!-- Parameters: clientSecret - a URL encoded secret, obtained during app registration -->

<!-- Copy the following snippet into the inbound section. -->

<inbound>
    <base />
    <send-request ignore-error="true" timeout="20" response-variable-name="bearerToken" mode="new">
        <set-url>{{authorizationServer}}</set-url>
        <set-method>POST</set-method>
        <set-header name="Content-Type" exists-action="override">
            <value>application/x-www-form-urlencoded</value>
        </set-header>
        <set-body>
            @{
            return "client_id={{clientId}}&resource={{scope}}&client_secret={{clientSecret}}&grant_type=client_credentials";
            }
        </set-body>
    </send-request>
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])</value>
    </set-header>
    <!--  Don't expose APIM subscription key to the backend. -->
    <set-header exists-action="delete" name="Ocp-Apim-Subscription-Key" />
</inbound>
<backend>
    <base />
</backend>
<outbound>
    <base />
</outbound>
<on-error>
    <base />
</on-error>

The way it works is that two things happen before we send the request to the backend.

  1. First, a request in the send-request action is made to the AAD tenant, acquiring a token.
  2. The set-header action then adds the token to the Authorization header, extracting it from the result of the first request.

But before we can add this snippet to our API we need to add a few values to the Named Values section in our API Management instance.

  • authorizationServer - the tenant URL. My tenant ID is 1cb87777-3df4-428b-811a-86a0f215cd35, so the URL is https://login.microsoftonline.com/1cb87777-3df4-428b-811a-86a0f215cd35/oauth2/token
  • clientId - the application id of our AAD application, in this case APIManagementAADApp
    • AAD Application id APIManagementAADApp
  • clientSecret - the key of our AAD application, in this case APIManagementAADApp
    • Create a new key Create AAD Application key APIManagementAADApp
  • scope - the application id of the AAD application that is protecting our resource, in this case the id of ProtectedAppAAD
    • AAD Application id ProtectedAppAAD
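Assembled, the send-request in the policy posts a standard client-credentials request to the tenant's token endpoint. A Python sketch of the body it builds (the placeholder values are illustrative, not real ids or keys):

```python
import urllib.parse

def build_token_request_body(client_id: str, client_secret: str, scope: str) -> str:
    """Build the form-urlencoded body the policy sends to the AAD token endpoint."""
    return urllib.parse.urlencode({
        "client_id": client_id,
        "resource": scope,  # the app id of the AAD application protecting the backend
        "client_secret": client_secret,
        "grant_type": "client_credentials",
    })

body = build_token_request_body("<APIManagementAADApp id>", "<key>", "<ProtectedAppAAD id>")
print(body)  # client_id=...&resource=...&client_secret=...&grant_type=client_credentials
```

AAD answers with a JSON body whose access_token field is what the policy then puts in the Authorization header.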

After adding these, the Named Values section of our API Management instance will end up looking like this. Make sure to add them before adding the policy, since the policy cannot be saved if a referenced Named Value doesn’t exist.

AAD Application id ProtectedAppAAD

Now we just need to add the policy to our API and we will be able to access the protected API. After the policy is saved, test it and you should receive the result from the API.

AAD Application id ProtectedAppAAD


Today’s solutions almost always access other resources, and when building this landscape we need to make sure that all parts are protected and secure; often we need to combine a few mechanisms to be compliant with customer requirements. In this post I showed how to use the built-in security of Web Apps with AAD, which can be used standalone or as an extra layer on top of the “standard” Basic, OAuth, API keys, certificates etc. authentications. Another benefit is that it is not part of the code, so it cannot be “forgotten” or accidentally removed in a bug fix or similar.


As a bonus, the API Management snippets repository is a really nice initiative with pre-created advanced snippets, so make sure to check it out.


Posted in: | API Management  | Tagged: | API Management  | Integration  | Security 

Logic App Custom Connector WSDL Tips

So, continuing my journey with the Logic Apps Custom Connector, this time we are doing some work with the WSDL part of the Logic Apps Custom Connector, and I thought I would share some tips and tricks I’ve learned and use when importing and working with WSDL-based connectors. The only way to be a good user of a resource like the Custom Connector is to understand how to investigate behaviors, tweak and fix problems, and find debug logs.

Import Error

There are several reasons why there can be errors when importing a WSDL, and since it’s API Management functionality we can (until the documentation is up to speed) see the limits on the API Management site:


I encountered the “rare” error with recursive objects; apparently a “lazy” coder just referenced the whole entity in a parent-child situation even though the only needed field was the ID. Anyway, removing the recursive element solved the problem; alternatively, the representation could have been manipulated.

Runtime problems, unexpected behaviors and errors from the backend system

There are many reasons why errors can be sent from the backend system, but one of the common ones I’ve found is System.FormatException: Input string was not in a correct format, which is due to an element specified as int while the value sent in is null. How can that be a problem? Since the generation is done by the same logic as in API Management, we can import the WSDL into API Management and see what the actual Liquid template looks like.
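The failure mode is easy to reproduce: an element declared as int but rendered empty cannot be parsed by the backend. A tiny Python illustration of the same class of error (the element name is made up; .NET throws System.FormatException where Python raises ValueError):

```python
def parse_int_element(text: str) -> int:
    """Mimic a backend deserializing an element declared as xs:int."""
    return int(text)

print(parse_int_element("42"))  # a populated element parses fine
try:
    parse_int_element("")  # the template rendered something like <Weight></Weight>
except ValueError as err:
    print("parse failed:", err)
```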

Let’s look at an example.

For one system I was sent a link to the WSDL and a sample to make it easy to integrate with them. Testing the sample works fine, but importing the WSDL provides a lot more than the sample; as usual, there are more fields than we need. Sending in the test file via Postman works well, but the connector does not. Let’s take a look at why. Below is the sample, a fairly simple message with few elements.

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ProcessArticle xmlns="http://ongoingsystems.se/WSI">
      <ArticleName>Display Stand</ArticleName>
      <ArticleDescription>Display Stand</ArticleDescription>
    </ProcessArticle>
  </soap:Body>
</soap:Envelope>

But when using the Custom Connector, all the complex structures are mapped (this is generated via API Management):

<set-body template="liquid">
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns="http://ongoingsystems.se/WSI">
        <!-- full mapping of all complex structures omitted -->
    </soap:Envelope>
</set-body>

And we start running into problems: we are not sending in any more data than before, but the generated XML is much larger. Just look at the enormous representation in the GUI:

Unmodified Connector in GUI

Therefore we get some unwanted errors, since there are implications when sending in large complex structures. Since I can't really do much about the logic, I need to modify the data sent in, and I can do that by modifying the WSDL and reimporting it. After changing the WSDL to only contain the elements that I need in my request, it looks a lot better and the XML message sent now matches the sample:

Modified definition in the WSDL so the definition is minimal:

<s:complexType name="ArticleDefinition">
  <s:sequence>
    <s:element minOccurs="1" maxOccurs="1" name="ArticleOperation" type="tns:ArticleOperation" />
    <s:element minOccurs="1" maxOccurs="1" name="ArticleIdentification" type="tns:ArticleIdentificationType" />
    <s:element minOccurs="0" maxOccurs="1" name="ArticleNumber" type="s:string" />
    <s:element minOccurs="0" maxOccurs="1" name="ArticleName" type="s:string" />
    <s:element minOccurs="0" maxOccurs="1" name="ArticleDescription" type="s:string" />
    <s:element minOccurs="0" maxOccurs="1" name="ArticleUnitCode" type="s:string" />
    <s:element minOccurs="1" maxOccurs="1" name="IsStockArticle" nillable="true" type="s:boolean" />
    <s:element minOccurs="0" maxOccurs="1" name="Weight" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="NetWeight" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="Volume" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="Length" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="Width" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="Height" type="s:decimal" />
    <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPallet" type="s:int" />
    <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPackage" type="s:int" />
  </s:sequence>
</s:complexType>
Sample from the GUI:

Modified Connector in GUI

The request can now be sent to the backend without any problems. This approach can be used both to detect problems and to understand the behavior of the Custom Connector, and changing the WSDL can make the connector easier to use, even if the maintenance is heavier: we need to keep track of these changes and redo them if an updated WSDL is imported.


As stated, we need to find ways of understanding a resource before being good users of it, so I hope this will help out and give ideas around workarounds for some troublesome scenarios, and that more debug and customization properties are coming along the way. When more power is needed I would advise using API Management for now, until there are more customization properties, but most of the time it works like a charm!

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 

CI with Logic App Custom Connector

I've just had the fortune to work on some projects using the Logic App Custom Connector. It's a lot of fun now with all the new capabilities, on-premises connectivity and all, and as always we ended up setting up CI deployments via TFS.

That is when I found that the On-Premises Data Gateway ARM template is poorly documented and that the template generated by the automation script is missing some important properties. So I thought I'd gather the information needed here and point out some pitfalls.

The schema for the ARM template can be found at GitHub: https://github.com/Azure/azure-resource-manager-schemas/blob/master/schemas/2016-06-01/Microsoft.Web.json.

It's inside the Web schema and not that easy to see that you are in the correct definition, but go down to row 119 and you will find the customApis part, which is good for reference.

So let's see what a complete ARM template for a Logic App Custom Connector looks like. (I'm actually adding a fully working connector for https://billogram.com/.) The full representation of the swagger can also be found in my GitHub repository: https://github.com/MLogdberg/LogicAppCustomConnectors/tree/master/Billogram

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "customApis_name": {
      "defaultValue": "Billogram",
      "type": "String"
    },
    "location": {
      "defaultValue": "westeurope",
      "type": "String"
    },
    "serviceUrl": {
      "defaultValue": "https://sandbox.billogram.com/api/v2",
      "type": "String"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Web/customApis",
      "name": "[parameters('customApis_name')]",
      "apiVersion": "2016-06-01",
      "location": "[parameters('location')]",
      "properties": {
        "connectionParameters": {
          "username": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API User",
              "description": "The API User for this api",
              "tooltip": "Provide the API User",
              "constraints": {
                "tabIndex": 2,
                "clearText": true,
                "required": "true"
              }
            }
          },
          "password": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API Password",
              "description": "The API Password for this api",
              "tooltip": "Provide the API Password",
              "constraints": {
                "tabIndex": 3,
                "clearText": false,
                "required": "true"
              }
            }
          }
        },
        "backendService": {
          "serviceUrl": "[parameters('serviceUrl')]"
        },
        "swagger": {
          "swagger": "2.0",
          "info": {
            "description": "The Billogram API is built according to RESTful principles, which means it uses HTTP as an application protocol rather than just as a transport layer for a custom protocol, like SOAP does. In other words, HTTP features such as the various request types (GET, PUT, POST, DELETE), response codes (403 Forbidden, 404 Not Found, 500 Internal Server Error) and standard headers (Accept, Authorization, Cache-Control) are a part of the API.",
            "version": "1.0.0",
            "title": "Swagger Invoice/Billogram",
            "termsOfService": "http://swagger.io/terms/",
            "contact": {
              "email": "billogram@billogram.se"
            }
          },
          "host": "sandbox.billogram.com",
          "basePath": "/api/v2",
          "schemes": [
            "https"
          ],
          "consumes": [],
          "produces": [],
          "paths": {
            ...removed for simplicity display...
          },
          "parameters": {},
          "responses": {},
          "securityDefinitions": {
            "basic_auth": {
              "type": "basic"
            }
          },
          "security": [
            {
              "basic_auth": []
            }
          ],
          "tags": [
            {
              "name": "Billogram Invoice",
              "description": "Handling Billogram/Invoices",
              "externalDocs": {
                "description": "Billogram structure",
                "url": "https://billogram.com/api/documentation#billogram_structure"
              }
            }
          ],
          "externalDocs": {
            "description": "Find out more about Swagger",
            "url": "http://swagger.io"
          }
        },
        "description": "My ARM deployed Custom Connector",
        "displayName": "[parameters('customApis_name')]",
        "iconUri": "/Content/retail/assets/default-connection-icon.6296199fc1d74aa45ca68e1253762025.2.svg"
      },
      "dependsOn": []
    }
  ]
}

Make sure to add both the swagger and the backendService objects. The URL specified in serviceUrl is mandatory and is the URL the Custom Connector uses at runtime, not the one specified in the swagger, which makes it easier to manage dev/test/prod environments.
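Since serviceUrl is a template parameter, each environment can point the connector at its own backend at deployment time. A minimal sketch of a deployment (the resource group and template file names are placeholders; assumes the AzureRM PowerShell module, current at the time of writing):

```powershell
# Deploy the custom connector template, overriding serviceUrl per environment.
# Parameters not overridden here fall back to their defaultValue in the template.
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "my-integration-rg" `
    -TemplateFile ".\billogram-connector.json" `
    -serviceUrl "https://billogram.com/api/v2"
```

Pointing dev/test at the sandbox URL and production at the live API this way keeps a single template for all environments.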

The Custom Connector is a great addition to Logic Apps, and being able to easily manage and deploy connectors is crucial. So even though it is great to be able to manage my Custom Connector via ARM, I'm still missing the possibility to reference my swagger via a URL, since that would be the most suitable way of keeping the ARM template and the swagger separated.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 

Januari update Logic App Template Creator

A new year comes with new updates. As you might know, there is now a Custom Connector option for Logic Apps, and earlier this had some differences in resource id handling compared with the managed APIs. This has now been updated and fixed.

There were also some issues regarding names that could not contain . (dot) and other special characters; these have been fixed, and a new way of handling resource ids now prevents problems related to naming restrictions.

The last part added was an option to save the incoming body of the requested resources. This was done to make debugging easier during development; thanks to this, and a mock resource collector class that can replay the saved files, we can now create full test runs that replay the whole collection of resources.

The github repository is found here: https://github.com/jeffhollan/LogicAppTemplateCreator


  • Now supports extraction of Logic Apps containing Custom Connectors, generating working, importable Connection resources along with the Logic App.
  • Updates to resource id handling: it can now handle resource groups and resource names with . (dot) and other special characters that previously were not supported.
  • Added a resource handler that can print out the incoming requests, making it possible to build complete test suites with full replay of an extraction of a Logic App.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Template Generator 

Logic App Custom Connector via On Prem Gateway

In an ever-growing hybrid world, the need to expose services hosted on-premises is growing, and so are the limitations and requirements. The newest way of exposing HTTP-based services in Logic Apps is via a Logic App Custom Connector through the On-Premises Data Gateway.

Previously, calling on-premises, firewall-protected services from our Logic Apps wasn't possible without firewall openings or other resources to help with connectivity.

Task Overview

But now it’s possible and let’s solve it with the newly updated On Premise Data Gateway and a Logic App Custom Connector.


Install the Gateway

First off, make sure you have the latest version installed; follow the install instructions found here.

Remember the name of the installed gateway if you have many.

Create the API

In this scenario I will expose a swagger API endpoint: the simple standard Values API that is created by default in a new Web App project of type API App. After enabling Swagger we can browse the swagger definition, as follows:

Values Controller Swagger

Copy the URL to the swagger definition, shown in the middle box in the image above ( http://localhost:51921/swagger/docs/v1 ), and save the content as a text file. (The simplest way is to browse to it and copy the swagger definition into a text file.)
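For reference, the saved file is plain swagger 2.0 JSON. A trimmed sketch of roughly what the default Values API definition looks like (the title and operation id are illustrative; the exact output depends on your Swashbuckle configuration):

```json
{
  "swagger": "2.0",
  "info": { "version": "v1", "title": "MyApiApp" },
  "host": "localhost:51921",
  "paths": {
    "/api/Values": {
      "get": {
        "operationId": "Values_Get",
        "produces": [ "application/json" ],
        "responses": {
          "200": {
            "description": "OK",
            "schema": { "type": "array", "items": { "type": "string" } }
          }
        }
      }
    }
  }
}
```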

Create Custom Logic App Connector

Now we are prepared and can start of with creating the Logic App Custom Connector based on the swagger file, note that it’s also possible to use WSDL files as source.

In the Azure Portal create a Logic Apps Custom Connector:

Azure Portal Create Logic Apps Custom Connector

No definition is added at creation time, so we will have to edit the Custom Connector once it's created. Start editing the newly created Custom Connector and let's add the swagger definition:

Go to Resource

Press the Edit button.

Press the edit button

Upload the swagger file: Upload swagger file

Now the API Connection is generated based on the swagger file. Scroll down and make changes as you see fit, but don't forget to check the “Connect via on-premises data gateway” checkbox, and if needed change the URLs to match the server exposing the service (I'll just keep them the same since it's on the same machine for me).

Check the checkbox for on-premise data gateway

Now press Continue and move on with the process, once you are finished make sure to update the Logic Apps Custom Connector by pressing the check mark at the top:

Save the updates

After the Logic Apps Custom Connector is saved we can use it in our Logic Apps, so let's create a Logic App and start using our newly created Custom Connector. (Make sure to create the Logic App in the same location as your on-premises data gateway is installed, otherwise it won't be available in the Logic App.)

Inside the Logic App we can now find our Custom Connector among all the others, and if I start searching for it I can easily select the appropriate action I want to use:

Logic App browse the Custom Connector

After I pick my operation (I just took Get All), I need to create the API Connection that will be connected to my Custom Connector. Here is where I can pick the authentication method and also the gateway I want to use.

Logic App create the API Connection for the Custom Connector

Pressing Create will create the API Connection, and the Logic App is complete for this sample. Let's save the Logic App and test-run it; the result is that we get the “values” returned as promised:

Logic App run result

The flow is now complete. Since we have installed an On-Premises Data Gateway alongside our web service, we can use the Logic App Custom Connector definition in our calls to the on-premises web service. When creating the Logic App we create an API Connection, based on the Logic App Custom Connector, that holds the credentials and gateway information used when the Logic App run executes, and we can now connect to our firewall-protected on-premises web service!

Overview result


I think this is an amazing new feature that we have really wanted for a long time. It enables so much simpler exposure of on-premises services, in a secure and reliable way. No need for firewall openings, VPNs, Express Routes, or reverse proxies in a DMZ; we can just install the gateway and use it as our entry point.

Remember this is still in preview and I’ve encountered some issues when working with SOAP so there are some fixes needed, but it looks good so far!

Posted in: | LogicApps  | Tagged: | Logic Apps  | Integration  | Hybrid