Fixing Functions v2 add extension issues in the portal

I’m out doing trainings, and it’s both fun and a great chance for me to learn new things, since there is always something that goes south for no apparent reason.

So today I was out on one of these missions and we were doing some labs developing functions. When we added an extra input from a Table Storage, the Function App just crashed and we only got errors that the site was offline, and it didn’t come back up. Everybody encountered it, so I had to investigate.

The response was a 503 Service Unavailable and the body was the following HTML content (I removed the style since it just took a lot of space):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" >
<head>
    <title>Host Offline</title>
 </head>
<body>
    <div>&nbsp;</div>
    <div id="wrapper">
        
        <div id="page">
            <div id="content">

                <div class="box">
                    <h2>This Azure Functions app is down for maintenance</h2>
                    <p>
                        Host is offline
                    </p>
                </div>
            </div>
        </div>
    </div>
     
</body>
</html>

So what did we do to get here? Well, we just wanted to add an input from a Table Storage to a standard HTTP trigger, and we started from a blank new default function.

Here is my default HTTP function: Default Http Trigger

Then I went to the Integrate pane and started to add the input from my Azure Table Storage

Integrate Pane

After selecting the Azure Table Storage we get a warning that the extension is not installed, so let’s install it!

Install Storage Extension

It’s installing and we proceed as told in the message:

Install Storage Extension Message

So we just continued, filled in the other details and pressed Save.
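
Saving on the Integrate pane stores the input as an extra binding in the function’s function.json. A minimal sketch of what it could end up looking like is shown below (the table name, partition key and connection setting are placeholders, and the binding name is assumed to be inputTable to match the code later on):

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [ "get", "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "type": "table",
      "direction": "in",
      "name": "inputTable",
      "tableName": "mytable",
      "partitionKey": "mypartition",
      "connection": "AzureWebJobsStorage"
    }
  ]
}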

After that we went back to the function to add the input; it’s just a simple signature update from:

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)

To:

public static async Task<IActionResult> Run(HttpRequest req, Newtonsoft.Json.Linq.JArray inputTable, ILogger log)
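
For context, here is a minimal run.csx sketch of what the whole function could look like after the change; it simply logs and returns the rows delivered by the table input binding (the binding name inputTable is assumed to match the Integrate pane configuration):

#r "Newtonsoft.Json"

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static async Task<IActionResult> Run(HttpRequest req, JArray inputTable, ILogger log)
{
    // inputTable is populated by the Table Storage input binding before the code runs
    log.LogInformation($"Rows returned from the table input: {inputTable?.Count ?? 0}");

    // Echo the rows back so the result is visible in the portal's test pane
    return await Task.FromResult<IActionResult>(new OkObjectResult(inputTable));
}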

Now when we run the function, inputTable should be populated from the Azure Table Storage, but when we try to execute it we just get an HTTP 503 Service Unavailable and in the body I can see the text “Host Offline”.

503 Service Unavailable Host Offline

So we started to look into this and read that it could be due to the extension not being installed properly; to fix it we followed this guide: https://github.com/Azure/azure-functions-host/wiki/Updating-your-function-app-extensions

If you have the same problem, make sure that the extensions.csproj file exists and contains at least a package reference to Microsoft.Azure.WebJobs.Extensions.Storage; if not, create the file and/or add the reference. Sample below:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <WarningsAsErrors />
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage" Version="3.0.0" />
    <PackageReference Include="Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator" Version="1.0.*" />
  </ItemGroup>
</Project>  

If you are missing the file, it can easily be created in the command prompt with a simple command (then add the content shown above):

echo.>extensions.csproj

Run the build command (this takes a while to run so be patient):

dotnet build extensions.csproj -o bin --no-incremental --packages D:\home\.nuget

Restore extension build complete
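
If you want to double-check the build output before starting the app again, the bin folder should now contain an extensions.json file that lists the Storage extension, roughly like the sketch below (the exact typeName and version depend on the package, so treat it as illustrative):

{
  "extensions": [
    {
      "name": "Storage",
      "typeName": "Microsoft.Azure.WebJobs.Extensions.Storage.AzureStorageWebJobsStartup, Microsoft.Azure.WebJobs.Extensions.Storage"
    }
  ]
}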

So now the extension is installed; let’s verify whether this solves the issue completely. Start the Function App and test the function again.

If you are as unlucky as me and the function is still not working, we need to do one more thing. Let’s go back to Kudu and the CMD prompt again; the problem is the app_offline.htm file:

Restore extension build complete

Delete it, go back and test again, and it works again!

Restore extension build complete

Summary:

The problem is that the extension is not properly installed: when we add the binding in the configuration without the extension installed, the Function App crashes and produces the app_offline.htm file, and as long as that file is present in the folder the response defaults to 503 and Host Offline. By removing the file the Function App starts executing as normal, and if we have fixed the extensions no errors come up. This applies to all extensions.

By showing the problem and the repro scenario we can help improve the products! So I hope this helps someone and leads to a fix from the product team.

Posted in: | Azure Functions  | Tagged: | Azure Functions  | Integration  | Serverless 


Logic App Template Creator Updates January 2019

Updates to the Logic App Template Creator have been published:

A minor thing for usage but great for consistency and quality in the generator is that there is now a build and release pipeline setup in DevOps.

  • Added Build & Release via DevOps to increase quality in merged sprints and automate release to PowerShell Gallery LogicAppTemplate
  • Improved support for Connectors to Storage Tables and Queues.
  • Added a cmdlet to generate an ARM template for Integration Account Maps

Now the LogicAppTemplate module is updated more frequently, since I’ve added a Build and Release setup that publishes new releases to the PowerShell Gallery.

PS> Install-Module -Name LogicAppTemplate

Or update to the newest version

PS> Update-Module -Name LogicAppTemplate
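
Once installed, a typical extraction run looks roughly like the line below and can be piped straight to a file (the parameter names are as I recall them from the module’s readme, so adjust them to your own environment):

PS> Get-LogicAppTemplate -LogicApp MyLogicApp -ResourceGroup MyResourceGroup -SubscriptionId $subscriptionId -TenantName 'mytenant.onmicrosoft.com' | Out-File C:\temp\MyLogicApp.json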

Storage Connector Tables and Queues

These are now generated the same way as the Storage Blob Connector, meaning that the key will be collected at deployment time based on the storage account name instead of needing to be provided as a parameter. This makes deployments simpler and neater.

There are 3 parameters added:

"azureblob_name": {
      "type": "string",
      "defaultValue": "azureblob"
    },
    "azureblob_displayName": {
      "type": "string",
      "defaultValue": "myDisplayName"
    },
    "azureblob_accountName": {
      "type": "string",
      "defaultValue": "myStorageAccountName",
      "metadata": {
        "description": "Name of the storage account the connector should use."
      }
    }

And they are later used in the connection to get the accountKey automatically during deployment.

 {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('azureblob_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/',subscription().subscriptionId,'/providers/Microsoft.Web/locations/',parameters('logicAppLocation'),'/managedApis/azureblob')]"
        },
        "displayName": "[parameters('azureblob_displayName')]",
        "parameterValues": {
          "accountName": "[parameters('azureblob_accountName')]",
          "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
        }
      }
    }

All the magic is in this listKeys part, which tells ARM to collect the key based on a reference to the storage account (this also means that the account doing the resource group deployment needs access to the storage account).

[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]

New Cmdlet Get-IntegrationAccountMapTemplate

Another improvement is that we can now extract a map from an Integration Account into a directly deployable ARM template. It works similarly to the other extractions, and below is a sample of how to get an Integration Account map as an ARM template.

Get-IntegrationAccountMapTemplate -ArtifactName 'mapname' -IntegrationAccount 'ianame' -ResourceGroup 'myresourcegroup' -SubscriptionId 'guid()' -TenantName 'mattiaslogdberg.onmicrosoft.com'

Summary:

Exporting Logic Apps via the Logic App Template Extractor simplifies the CI/CD side of using Logic Apps: we can do the development inside the portal, then just extract the work and set up the import to the next environment, without manually adding extra work.

Posted in: | API Management  | Tagged: | LogicApps  | Integration  | ARM Templates  | ARM 


API Management Template Creator Updates December 2018

Updates to the API Management Template Creator have been published:

I’ve had a lot of help this time from some awesome contributors, a big thanks to them: NilsHedstrom, Geronius, joeyeng, Lfalk.

A minor thing for usage but great for consistency and quality in the generator is that there is now a build and release pipeline setup in DevOps.

  • Improved support for Identity Providers, Products and more
  • Added automation for Build & Release via DevOps to increase quality in merged sprints and automate release to PowerShell Gallery
  • Added to PowerShell Gallery APIManagementTemplate
  • Split the template into a template per resource to follow API Management DevOps best practices

Now it’s much easier to get started, just install the APIManagementTemplate module from the PowerShell Gallery.

PS> Install-Module -Name APIManagementTemplate

Or update to the newest version

PS> Update-Module -Name APIManagementTemplate

Split the Template to a template per resource

In order to follow the best practices from the Azure API Management DevOps example, we now have a new command, Write-APIManagementTemplates. This command takes a template as input and splits it into a file per resource, to make the files easier to manage and to enable more customized deployments with linked templates. Read more at GitHub.

 Get-APIManagementTemplate -APIManagement MyApiManagementInstance -ResourceGroup myResourceGroup -SubscriptionId 80d4fe69-xxxx-4dd2-a938-9250f1c8ab03 | Write-APIManagementTemplates -OutputDirectory C:\temp\templates -SeparatePolicyFile $true

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. We can do the development inside the portal and then just extract the work and set up the import to the next environment, without manually adding extra work.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


Innovation enabled by Integration

Normally we focus on technical implementations and problem solving but from time to time we get the opportunity to expand the view and focus on innovative thinking. This is the time where we can show our customers the real value of applying good architecture and principles. Currently I’m having so much fun doing innovative prototyping work with one of my customers that I need to share it with you. We are using Power Apps and Flow for our prototyping in order to get prototypes done fast and easy, without spending too much time and effort.

Power Apps is a platform where you can easily build apps in a point-and-click approach, targeting multiple device types at the same time; read more at: https://powerapps.microsoft.com/

Flow is a lightweight integration platform that runs on top of Logic Apps, it’s designed to make it easy to create your own flows and turn repetitive tasks into automated workflows, read more at https://emea.flow.microsoft.com/

These platforms are very easy to get started with, and there are several samples of applications for Office 365 and other common applications. But the real power for innovation becomes clear if you have a good API landscape that can be used when building these applications.

We have a customer that we started working with almost a year ago, and now that all the hard work is done and we have a good integration landscape, we can harvest the benefits of our good architecture. I started by showing the case described below, and now they really hunger for more of this kind of innovation, both for internal and external use. It’s so wonderful to see how our hard work pays off, showing that we really are the “integration heroes”: we are the enablers, making this possible.

Architecture that enables innovation?

In reality it’s quite simple: create a platform that contains APIs and flows that can be reused. Make sure to build domain-model-based APIs and function-driven APIs, and focus on making them simple to use and, most importantly, reusable. Here is a sample architecture:

Integration Architecture Sample

And in our API Management instance we have created a few domain model APIs:

API List Sample

A few months back we helped replace an internal system for product information; it was a system where you could enter a serial number and get basic information about the product. The end solution was a website that used a couple of APIs to retrieve the information needed. The fun part was that most of the APIs were already created in other projects for other or similar needs, but used from other systems/apps. All we did was fine-tune an API, leaving us with time to innovate. On a computer it makes sense to type in a serial number, but on a mobile phone we expect to be able to scan the serial. Since we had the time, we started to investigate the problems around this and whether it was solvable.

We started to look into this, and the first “problem” was that the serial plates on the machines had no barcodes, only pure text:

Serial Plate Sample

Luckily this functionality exists in the Cognitive Services in Azure; read more at Azure Cognitive Services.

We started to test this with a Power App that used a Flow to send the image to the Cognitive Service. The Power App solution is very simple: it uses the camera to take a picture and then sends the picture to a Flow that utilizes the Cognitive Service to identify the serial number from the plate. The result from the Flow is a URL with the serial number, which we use to launch the website, making use of the web app. With this we had expanded the functionality with a serial plate scanner and provided a basis for the decision on investing in a mobile app.

Here is the flow:

Microsoft Flow Overview

Here is the Power App in the Power Apps Studio developed in my browser:

Microsoft Power App

As you can see there is not much going on: just a Camera control with an OnSelect statement that launches the web browser with the URL that comes back from our Flow “Get-Path-To-Product”.
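
Roughly, that OnSelect formula boils down to a single call along these lines (the control name Camera1 and the output property url are illustrative assumptions, not taken from the actual app):

Launch('Get-Path-To-Product'.Run(Camera1.Photo).url)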

The flow is also quite simple:

Microsoft Flow Solution

The tricky part was how to send the image to the Cognitive Service; in this case we created a Compose action with a dataUriToBinary function:

Microsoft Flow Action Compose: dataUriToBinary(triggerBody()['Skapa_Indata'])

And in the HTTP action we send the binary data to the OCR service: (in the sample below the OCR service is behind our API Management instance)

Microsoft Flow Action HTTP

All in all, the prototype was easy and fast to build, but most importantly it gave us the power to demo the concept, providing a visual demo that shows the value of having an app that can scan the serial plate. This gave decision makers a good understanding of the benefits of investing in a native app, without spending too much time and resources.

Summary:

The app is currently under development, and we have more of these innovative prototypes coming up, covering new or upgraded functionality in a variety of sizes. The best part of this approach is that we can create a prototype in a day or two, giving decision makers a visual and testable solution to review. All thanks to our hard work with APIs and workflows, and the amazing tools Power Apps and Flow!

Become the integration hero that you really can be and set that unused innovation power free!

Help Business Evolve

Posted in: | Innovation  | Power Apps  | Flow  | Tagged: | Innovation  | Integration  | Architecture  | Integration Architecture  | Power Apps  | Flow 


API Management Template Creator Updates August 2018

Updates to the API Management Template Creator were published a few weeks back:

A minor thing for usage but great for consistency and quality in the generator is that a lot of tests have been added.

  • Functions support added
  • Updated Logic App Support
  • Removed some name parameter generation

Functions support added

Finally the support for APIs that are generated from Functions is live; an API generated from Functions now gets the appropriate automatic generation of function keys via ARM functions. This means that you don’t need to supply the function keys in the parameter files, they are collected automatically via the listsecrets function in ARM.

Now:

{
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/5b41934c6d0f59440d20c5ee",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_ibizmalo_name'),'/','5b41934c6d0f59440d20c5ee')]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "maloapimtest_HttpTriggerAdminKey_query_5b41934c6d0f59440d20c5ee",
        "value": "[listsecrets(resourceId(parameters('FunctionApp_maloapimtest_resourceGroup'),'Microsoft.Web/sites/functions', parameters('FunctionApp_maloapimtest_siteName'), parameters('operations_api-HttpTriggerAdminKey-post_name')),'2015-08-01').key]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Previously:

parameters: {
    "INT2051functionkey_value": {
      "type": "securestring",
      "defaultValue": "Aze/1W8wCwItuP8JacdyHa2vDw8YScrOkbq6uHcTXiOasb2wi3kZoQ=="
    }
 }
	. . . 
 {
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int2051functionkey",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_cpidev_name'), '/' ,parameters('property_INT2051functionkey_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "INT2051functionkey",
        "value": "[parameters('INT2051functionkey_value')]",
        "tags": null,
        "secret": true
      },
      "resources": [],
      "dependsOn": []
}

Updated Logic App Support

Two bigger things have been targeted to improve the export and later deployment experience for APIM APIs that use Logic Apps.

Trigger name fix

A Logic App trigger can now have any name; previously it was hardcoded to ‘Manual’. Logic has been added to retrieve the information at extraction time: the extractor fetches the Logic App definition and finds the HTTP trigger to get the correct name.

The trigger name is extracted from the Logic App and added to the ARM functions (customtriggername in the sample below):

Logic App Trigger name

Will result in (note the ‘customtriggername’):

"[listCallbackUrl(resourceId(parameters('LogicApp_customtrigger_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_customtrigger_logicAppName'), 'customtriggername'), '2017-07-01').queries.sig]"

Add Logic App as operation to existing API fix

There has been different behavior between APIs generated from a Logic App and Logic Apps imported as a new operation to an existing API. The main difference has been that the sig has been added as a “normal” parameter, with no ARM functionality added to generate it. This is now addressed and the appropriate ARM functions will be added and generated automatically for both types.

Properties will get their value from the listCallbackUrl function:

{
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int7100-payment-to-o-test_manual-invoke_5b349f2a42974a989226cf33",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_apidev_name'), '/' ,parameters('property_int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33",
        "value": "[listCallbackUrl(resourceId(parameters('LogicApp_INT7100-Payment-To-o-DEV_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_INT7100-Payment-To-o-DEV_logicAppName'), 'request'),'2017-07-01').queries.sig]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Removed some name parameters

Parameters generated for the names of resources like policies, version sets, operations etc. have no real use case or benefit associated with changing the name, and are therefore removed to simplify the ARM templates.

Sample of parameters that are now removed:

    "versionset_5af9817ca656c6952416b779_name": {
      "type": "string",
      "defaultValue": "5af9817ca656c6952416b779"
    },
    "operations_GetSalesOrderById_name": {
      "type": "string",
      "defaultValue": "GetSalesOrderById"
    },
    "operations_policy_name": {
      "type": "string",
      "defaultValue": "policy"
    }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. We can do the development inside the portal and then just extract the work and set up the import to the next environment, without manually adding extra work.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


API Management Template Creator Updates May 2018

Updates to the API Management Template Creator have been dragging, but now I’m pleased to have fixed the two most urgent ones.

  • Support for Versions, with version sets
  • Schemas

Support for Versions, with Version Sets

If an API that is extracted has versioning enabled, the current version set is also extracted and provided inside the ARM template. The version set is needed by the API to be able to support versions; it contains the information about how the version is handled for the specific API. The sample below uses an HTTP header named API-Version to set the version of the API; the version itself is then set on the API.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/api-version-sets

{
      "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/api-version-sets/5ae6f90fc96f5f1090700732",
      "type": "Microsoft.ApiManagement/service/api-version-sets",
      "name": "[concat(parameters('service_dev_name'), '/' ,parameters('versionset_5ae6f90fc96f5f1090700732_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "Arkivet",
        "description": null,
        "versioningScheme": "Header",
        "versionQueryName": null,
        "versionHeaderName": "API-Version"
      },
      "resources": [],
      "dependsOn": []
    }

Support for Schema

If an API has schemas added to operations, these schemas will be extracted and added to the ARM template; all operations will also get a “dependsOn” on the specific schema to prevent errors when executing the ARM template. The sample below is a simple schema that is added in the ARM template and referenced in the operations section.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/apis/schemas

{
          "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/apis/arkivet/schemas/5af038365b73730dd01453ad",
          "type": "Microsoft.ApiManagement/service/apis/schemas",
          "name": "[concat(parameters('service_dev_name'),'/',parameters('api_arkivet_name'), '/5af038365b73730dd01453ad')]",
          "apiVersion": "2017-03-01",
          "properties": {
            "contentType": "application/vnd.ms-azure-apim.swagger.definitions+json",
            "document": {
              "definitions": {
                "Body": {
                  "type": "object",
                  "properties": {
                    "No": {
                      "type": "string"
                    },
                    "ReportedDate": {
                      "type": "string",
                      "format": "date-time"
                    }
                  }
                },
                "BodyExists": {
                  "type": "object",
                  "properties": {
                    "CivicNo": {
                      "type": "string"
                    }
                  }
                },
                "Post400ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "message": {
                      "type": "string"
                    },
                    "description": {
                      "type": "string"
                    },
                    "errors": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "field": {
                            "type": "string"
                          },
                          "message": {
                            "type": "string"
                          }
                        }
                      }
                    },
                    "hasErrors": {
                      "type": "boolean"
                    }
                  }
                },
                "ExistsPost200ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "arkivet": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          },
          "resources": [],
          "dependsOn": []
        }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. We can do the development inside the portal and then just extract the work and set up the import to the next environment, without manually adding extra work.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


Access AAD Secured Web API's from API Management

Protecting Web Apps and Web APIs with the built-in Authentication and Authorization in Azure App Service is a great way to protect resources without adding code to handle the authorization. This means that the site or API is fully secured without the need to implement it yourself, which is a great example of separation of concerns. Read more on how it works.

What we then need is to access the API, and in this case we need to be able to access it from API Management using service principals, since the client using our APIs in API Management is outside our organization and should not be bothered by our internal security mechanisms.

The scenario: the API is protected by Azure AD, and in order to access it we need to add a valid JWT token to the Authorization header in the request. First (1) we need to get the JWT token and then (2) add it to the request to the API App.

Scenario Overview

In this post we have used the standard generated API App from Visual Studio and published it to a Web API instance in Azure named Protected.

Enabling the built-in Authentication and Authorization on the API

In the Azure Portal, head to the API App and go into the settings tab “Authentication/Authorization”. Enable App Service Authentication by pressing the On button.

Enable Authentication/Authorization

Now we can enable several different providers, and in this demo we will focus on Azure Active Directory; press Azure Active Directory.

Configure Azure AD Authentication

After choosing Azure Active Directory, we are presented with two options to configure the setup. Express gives some guidance and the possibility to create or select an existing AD application, while in Advanced we can enter the values needed: the AAD application client id and the issuer, normally https://sts.windows.net/ followed by the tenant guid, i.e. https://sts.windows.net/{tenantguid}/. In this setup we will use the Express option and create a new AD application.

Here we can choose to use an existing AD application or create a new one; I will create a new one named ProtectedAppAAD.

New Azure AD App

After this is set up we can save and test that the API is protected. If you open a new browser or log out from the account, you should be forced to log in; the site is now secured.

Login Sign

So now the site is securely protected and we can access it when logged in; next we need to be able to access the API via API Management.

Provide access to the API

The API is now protected and we need to provide a way for API Management to access it. We will need another AAD application (app registration) and grant it access to ProtectedAppAAD. So we create a new application, named APIManagementAADApp; the sign-in URL can be http://localhost.

Create a new AAD Application

After the AAD application is created, we need to provide access to ProtectedAppAAD. This is done via permissions; to assign permissions, go to Settings, press Required permissions and then Add.

AAD Application add permission

Under Select an API we need to search for ProtectedAppAAD, since at first the list only shows Microsoft standard APIs. Enter ProtectedAppAAD in the text box and press Select.

AAD Application add permission select API

Now we need to select the permission, and the one we want is Access; select it, press Select, and finish the setup by pressing Done.

AAD Application add permission select API

The last step is pressing Grant Permissions to enable the permissions (do not forget this!).

AAD Application add permission select API

API Management Policy for Acquiring a JWT Token

In order to expose this API we need to get a token from AAD using the application. This will be done inside a policy, and luckily for us the API Management team has provided a set of code snippets on GitHub, and one of them does exactly that; get it here.

There are more of them in the repository on GitHub, and if you have a great snippet that you want to share you can open a pull request; if the team approves it, it will end up there. Here is the GitHub repository: https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/.

The “Get OAuth2 access token from AAD” snippet looks like this (the {{...}} placeholders are references to Named Values that we will add later):

<!-- The policy defined in this file provides an example of using OAuth2 for authorization between the gateway and a backend. -->
<!-- It shows how to obtain an access token from AAD and forward it to the backend. -->

<!-- Send request to AAD to obtain a bearer token -->
<!-- Parameters: authorizationServer - format https://login.windows.net/TENANT-GUID/oauth2/token -->
<!-- Parameters: scope - a URI encoded scope value -->
<!-- Parameters: clientId - an id obtained during app registration -->
<!-- Parameters: clientSecret - a URL encoded secret, obtained during app registration -->

<!-- Copy the following snippet into the inbound section. -->

<policies>
  <inbound>
    <base />
      <send-request ignore-error="true" timeout="20" response-variable-name="bearerToken" mode="new">
        <set-url>{{authorizationServer}}</set-url>
        <set-method>POST</set-method>
        <set-header name="Content-Type" exists-action="override">
          <value>application/x-www-form-urlencoded</value>
        </set-header>
        <set-body>
          @{
          return "client_id=&resource=&client_secret=&grant_type=client_credentials";
          }
        </set-body>
      </send-request>

      <set-header name="Authorization" exists-action="override">
        <value>
          @("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])
	  </value>
      </set-header>

      <!--  Don't expose APIM subscription key to the backend. -->
      <set-header exists-action="delete" name="Ocp-Apim-Subscription-Key"/>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>

The way it works is that two things happen before we send the request to the backend.

  1. First, a request is made via the send-request action to the AAD tenant, acquiring a token.
  2. The set-header action then adds the token to the Authorization header, extracting it from the result of the first request.

But before we can add this snippet to our API we need to add a few values to the Named Values section in our API Management instance.

  • authorizationServer - the token endpoint of the tenant. My tenant ID is 1cb87777-3df4-428b-811a-86a0f215cd35, so the URL is https://login.microsoftonline.com/1cb87777-3df4-428b-811a-86a0f215cd35/oauth2/token
  • clientId - the application id of our AAD application, in this case APIManagementAADApp
    • AAD Application id APIManagementAADApp
  • clientSecret - the key of our AAD application, in this case APIManagementAADApp
    • Create a new key Create AAD Application key APIManagementAADApp
  • scope - the application id of the AAD application that is protecting our resource, in this case the id of ProtectedAppAAD
    • AAD Application id ProtectedAppAAD

After adding these, the Named Values section of our API Management will end up looking like this. Make sure to add them before adding the policy, since the policy cannot be saved if a referenced Named Value doesn’t exist.

AAD Application id ProtectedAppAAD

Now we just need to add the policy to our API and we will be able to access the protected API. After the policy is saved, test it and you should now receive the result from the API.

AAD Application id ProtectedAppAAD

Summary:

Today’s solutions almost always access other resources, and when building this landscape we need to make sure that all parts are protected and secure; often we even need to combine a few mechanisms to be compliant with customer requirements. In this post I showed how to use the built-in security from Web Apps with AAD, which can be used standalone or as an extra layer of security on top of the “standard” Basic, OAuth, API keys, certificates etc. authentications. Another benefit is that it’s not part of the code, so it cannot be “forgotten” or accidentally removed in a bug fix or similar.

AAD Application id ProtectedAppAAD

As a bonus, the API Management policy snippets repository is a really nice initiative with pre-created advanced snippets, so make sure to check it out.

https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/

Posted in: | API Management  | Tagged: | API Management  | Integration  | Security 


Logic App Custom Connector WSDL Tips

Continuing my journey with the Logic Apps Custom Connector, this time we are doing some work with the WSDL part of the Custom Connector, and I thought I would share some tips and tricks I’ve learned and use when importing and using connectors with WSDL. The only way to be a good user of a resource like the Custom Connector is to understand how we can investigate behaviors, tweak and fix problems, and find debug information and logs.

Import Error

There are several reasons why there can be errors when importing the WSDL, and since it’s API Management functionality under the hood we can (until the documentation is up to speed) see the limits on the API Management site:

https://docs.microsoft.com/en-us/azure/api-management/api-management-api-import-restrictions#wsdl

I encountered the “rare” error with recursive objects; apparently there was a “lazy” coder that just referenced the whole entity in a parent-child situation, even if the only “needed” field was the ID. Anyway, removing the recursive element solved the problem; alternatively, manipulating the representation could have been done.

Runtime problems, unexpected behaviors and errors from the backend system

There are many reasons why errors can be sent from the backend system, but one of the common ones I’ve found is System.FormatException: Input string was not in a correct format, which is due to an element specified as int where the value sent in is null; the generated template emits the element as empty and the backend then fails to parse it. Now how can that be a problem? Since the generation is done by the same logic as in API Management, we can import the WSDL in API Management and see what the actual liquid template looks like.

Let’s look at an example.

For one system I was sent a link to the WSDL and a sample message to make it easy to integrate with them. Testing the sample works fine, but importing the WSDL provides a lot more than the sample; as we are used to, there are more fields than we need. So sending in the test file via Postman works fine, but the connector does not. Let’s take a look at why; below is the sample, a fairly simple message with few elements.

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ProcessArticle xmlns="http://ongoingsystems.se/WSI">
      <GoodsOwnerCode>code1234</GoodsOwnerCode>
      <UserName>user1234</UserName>
      <Password>pass1234</Password>
      <art>
        <ArticleOperation>CreateOrUpdate</ArticleOperation>
        <ArticleIdentification>ArticleNumber</ArticleIdentification>
        <ArticleNumber>6553173735310</ArticleNumber>
		<ArticleName>Display Stand</ArticleName>
        <ArticleDescription>Display Stand</ArticleDescription>
        <ArticleUnitCode>St</ArticleUnitCode>
        <IsStockArticle>1</IsStockArticle>
      </art>
    </ProcessArticle>
  </soap:Body>
</soap:Envelope>

But when using the Custom Connector, all the complex structures are mapped (this is the template generated via API Management):

 <set-body template="liquid">
			<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns="http://ongoingsystems.se/WSI">
				<soap:Body>
					<ProcessArticle>
						<GoodsOwnerCode></GoodsOwnerCode>
						<UserName></UserName>
						<Password></Password>
						<art>
							<ArticleOperation></ArticleOperation>
							<ArticleIdentification></ArticleIdentification>
							<ArticleSystemId></ArticleSystemId>
							<ArticleNumber></ArticleNumber>
							<ArticleName></ArticleName>
							<ProductCode></ProductCode>
							<BarCode></BarCode>
							<SupplierArticleNumber></SupplierArticleNumber>
							<ArticleDescription></ArticleDescription>
							<ArticleUnitCode></ArticleUnitCode>
							<CountryOfOriginCode></CountryOfOriginCode>
							<StatisticsNumber></StatisticsNumber>
							<PurchaseCurrencyCode></PurchaseCurrencyCode>
							<Weight></Weight>
							<NetWeight></NetWeight>
							<Volume></Volume>
							<Length></Length>
							<Width></Width>
							<Height></Height>
							<QuantityPerPallet></QuantityPerPallet>
							<QuantityPerPackage></QuantityPerPackage>
							<OrderPoint></OrderPoint>
							<Price></Price>
							<CustomerPrice></CustomerPrice>
							<PurchasePrice></PurchasePrice>
							<IsStockArticle></IsStockArticle>
							<ArticleGroup>
								<ArticleGroupOperation></ArticleGroupOperation>
								<ArticleGroupIdentification></ArticleGroupIdentification>
								<ArticleGroupCode></ArticleGroupCode>
								<ArticleGroupName></ArticleGroupName>
							</ArticleGroup>
							<ArticleCategory>
								<TypeOperation></TypeOperation>
								<TypeIdentification></TypeIdentification>
								<Code></Code>
								<Name></Name>
							</ArticleCategory>
							<VatCode>
								<VatCodeOperation></VatCodeOperation>
								<VatCodeIdentification></VatCodeIdentification>
								<VatCode></VatCode>
								<VatPercent></VatPercent>
							</VatCode>
							<DangerousGoods>
								<UNNumber></UNNumber>
								<UNIsMarineHazard></UNIsMarineHazard>
								<UNIsDangerousGoods></UNIsDangerousGoods>
								<UNPackageType></UNPackageType>
								<UNTunnelCodes>
</UNTunnelCodes>
								<UNClassNumber></UNClassNumber>
								<UNProperShippingName>
									<Name></Name>
									<LanguageCode></LanguageCode>
								</UNProperShippingName>
								<UNLabelNumbers></UNLabelNumbers>
								<DangerousGoodsCoefficient></DangerousGoodsCoefficient>
								<EmSCode></EmSCode>
							</DangerousGoods>
							<ArticleNames>
</ArticleNames>
							<ArticleStructureSpecification>
</ArticleStructureSpecification>
							<MainSupplier>
								<SupplierIdentificationType></SupplierIdentificationType>
								<SupplierOperation></SupplierOperation>
								<SupplierNumber></SupplierNumber>
								<SupplierName></SupplierName>
								<Address>
									<Name></Name>
									<Address></Address>
									<Address2></Address2>
									<Address3></Address3>
									<PostCode></PostCode>
									<City></City>
									<TelePhone></TelePhone>
									<Remark></Remark>
									<Email></Email>
									<MobilePhone></MobilePhone>
									<IsEuCountry></IsEuCountry>
									<CountryStateCode></CountryStateCode>
									<CountryCode></CountryCode>
									<DeliveryInstruction></DeliveryInstruction>
									<IsVisible></IsVisible>
									<NotifyBySMS></NotifyBySMS>
									<NotifyByEmail></NotifyByEmail>
									<NotifyByTelephone></NotifyByTelephone>
								</Address>
								<comment></comment>
							</MainSupplier>
							<IsGSPCertified></IsGSPCertified>
							<MaxStockDays></MaxStockDays>
							<BarCodePackage></BarCodePackage>
							<LinkToPicture></LinkToPicture>
							<BarCodePallet></BarCodePallet>
							<QuantityPerLayer></QuantityPerLayer>
							<PalletHeight></PalletHeight>
							<TaricNumbers>
</TaricNumbers>
							<IsObsolete></IsObsolete>
							<MinDaysToExpiryDate></MinDaysToExpiryDate>
							<AdditionalStatisticsNumber></AdditionalStatisticsNumber>
							<CustomsExportConditions></CustomsExportConditions>
							<ArticleColor>
								<ColorCode></ColorCode>
								<ColorName></ColorName>
							</ArticleColor>
							<ArticleSize>
								<SizeCode></SizeCode>
								<SizeName></SizeName>
							</ArticleSize>
							<IsSerialNumberArticle></IsSerialNumberArticle>
							<IsBatchArticle></IsBatchArticle>
							<ArticleDefinitionClasses>
								<ArticleDefinitionClassesOperation></ArticleDefinitionClassesOperation>
								<Classes>
</Classes>
							</ArticleDefinitionClasses>
							<ArticleFreeDecimal1></ArticleFreeDecimal1>
							<ArticleFreeDecimal2></ArticleFreeDecimal2>
						</art>
					</ProcessArticle>
				</soap:Body>
			</soap:Envelope>
		</set-body>

And we start running into problems: we are not sending in more data than earlier, but the generated XML is much larger, and just look at the enormous representation in the GUI:

Unmodified Connector in GUI

Therefore we get some unwanted errors, since there are implications when sending in such complex structures. Since I can’t really do much about the generation logic, I need to modify the data sent in, and I can do that by modifying the WSDL and reimporting it. After changing the WSDL to only contain the elements that I need in my request, it looks a lot better and the XML message sent now matches the sample:

Modified definition in the WSDL so the definition is minimal:

<s:complexType name="ArticleDefinition">
        <s:sequence>
          <s:element minOccurs="1" maxOccurs="1" name="ArticleOperation" type="tns:ArticleOperation" />
          <s:element minOccurs="1" maxOccurs="1" name="ArticleIdentification" type="tns:ArticleIdentificationType" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleNumber" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleName" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleDescription" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleUnitCode" type="s:string" />
          <s:element minOccurs="1" maxOccurs="1" name="IsStockArticle" nillable="true" type="s:boolean" />
          <s:element minOccurs="0" maxOccurs="1" name="Weight" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="NetWeight" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Volume" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Length" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Width" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Height" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPallet" type="s:int" />
          <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPackage" type="s:int" />
        </s:sequence>
      </s:complexType>

Sample from the GUI:

Unmodified Connector in GUI

The request can now be sent to the backend without any problems. This approach can be used both to detect problems and to understand the behavior of the Custom Connector, and changing the WSDL can help us use the connector more easily, even if the maintenance is heavier since we need to keep track of these changes and redo them if an updated WSDL is imported.

Summary

As stated, we need to find ways of understanding a resource before we can be good users of it, so I hope this will help out and give ideas for workarounds in some troublesome scenarios, and that more debug and customization capabilities are coming along the way. When more power is needed I would advise using API Management for now, until there are more customization options, but most of the time it works like a charm!

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 


CI with Logic App Custom Connector

I’ve had the fortune to work on some projects using the Logic App Custom Connector; it’s a lot of fun now with all the new capabilities, on-prem connectivity and all, and as always we end up setting up CI deployments via TFS.

That is when I found that the Custom Connector ARM template is poorly documented and that the template generated via the Automation script is missing some important properties. So I thought I’d gather the information needed here, point out some pitfalls and so on.

The schema for the ARM template can be found at GitHub: https://github.com/Azure/azure-resource-manager-schemas/blob/master/schemas/2016-06-01/Microsoft.Web.json.

It’s inside the Web schema and it’s not that “easy” to tell that you are in the correct definition, but go down to row 119 and you will find the customApis part, which is good for reference.

So let’s see what a complete ARM template for a Custom Connector looks like (I’m actually adding a fully working connector for https://billogram.com/). The full representation of the swagger can also be found in my GitHub repository: https://github.com/MLogdberg/LogicAppCustomConnectors/tree/master/Billogram

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "customApis_name": {
      "defaultValue": "Billogram",
      "type": "String"
    },
    "location": {
      "defaultValue": "westeurope",
      "type": "String"
    },
    "serviceUrl": {
      "defaultValue": "https://sandbox.billogram.com/api/v2",
      "type": "String"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Web/customApis",
      "name": "[parameters('customApis_name')]",
      "apiVersion": "2016-06-01",
      "location": "[parameters('location')]",
      "properties": {
        "connectionParameters": {
          "username": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API USer",
              "description": "The API User for this api",
              "tooltip": "Provide the API User",
              "constraints": {
                "tabIndex": 2,
                "clearText": true,
                "required": "true"
              }
            }
          },
          "password": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API Password",
              "description": "The API Password for this api",
              "tooltip": "Provide the API Apssword",
              "constraints": {
                "tabIndex": 3,
                "clearText": false,
                "required": "true"
              }
            }
          }
        },
        "backendService": {
          "serviceUrl": "[parameters('serviceUrl')]"
        },
        "swagger": {
          "swagger": "2.0",
          "info": {
            "description": "The Billogram API is built according to RESTful principles, which means it uses HTTP as an application protocol rather than just as a transport layer for a custom protocol, like SOAP does. In other words, HTTP features such as the various request types (GET, PUT, POST, DELETE), response codes (403 Forbidden, 404 Not Found, 500 Internal Server Error) and standard headers (Accept, Authorization, Cache-Control) are a part of the API.",
            "version": "1.0.0",
            "title": "Swagger Invoice/Billogram",
            "termsOfService": "http://swagger.io/terms/",
            "contact": {
              "email": "billogram@billogram.se"
            }
          },
          "host": "sandbox.billogram.com",
          "basePath": "/api/v2",
          "schemes": [
            "https"
          ],
          "consumes": [],
          "produces": [],
          "paths": {
            ...removed for simplicity display...
              }
            }
          },
          "parameters": {},
          "responses": {},
          "securityDefinitions": {
            "basic_auth": {
              "type": "basic"
            }
          },
          "security": [
            {
              "basic_auth": []
            }
          ],
          "tags": [
            {
              "name": "Billogram Invoice",
              "description": "Handling Billogram/Invoices",
              "externalDocs": {
                "description": "Billogram structure",
                "url": "https://billogram.com/api/documentation#billogram_structure"
              }
            }
          ],
          "externalDocs": {
            "description": "Find out more about Swagger",
            "url": "http://swagger.io"
          }
        },
        "description": "My ARM deployed Custom Connector",
        "displayName": "[parameters('customApis_name')]",
        "iconUri": "/Content/retail/assets/default-connection-icon.6296199fc1d74aa45ca68e1253762025.2.svg"
      },
      "dependsOn": []
    }
  ]
}

Make sure to add both the swagger and the backendService object: the URL specified in serviceUrl is mandatory and is the URL that the Custom Connector uses at runtime, not the one specified in the swagger. This makes it easier to manage dev/test/prod environments.

The Custom Connector is a great add-on to Logic Apps, and being able to easily manage and deploy connectors is crucial. So even if it’s great to be able to manage my Custom Connector via ARM, I’m still missing the possibility to reference my swagger via a URL, since that would be the most suitable way of deploying while keeping the ARM template and the swagger separated.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 


January update Logic App Template Creator

A new year comes with new updates. As you might know, there is now a Custom Connector option for Logic Apps, which earlier had some differences in resource id handling compared with the managed APIs. This has now been updated and fixed.

There were also some issues regarding names that could not contain . (dot) and other special characters; these have been fixed, and a new way of handling resource ids now prevents problems related to naming restrictions.

The last part added was an option to save the incoming body of the requested resources. This was done to make it easier to debug when developing; thanks to this, and a mock resource collector class that can replay the saved files, we can now create full test runs that replay the whole collection of resources.

The github repository is found here: https://github.com/jeffhollan/LogicAppTemplateCreator

Updates

  • Now supports extraction of Logic Apps containing Custom Connectors, generating working, importable connection resources with the Logic App.
  • Updates to resource id handling; it can now handle resource groups and resource names with . (dot) and other special characters that previously were not supported.
  • Added a resource handler that can print out the incoming requests, making it possible to build complete test suites with full replay of an extraction of a Logic App.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Template Generator