Azure Functions logs in Application Insights

Azure Functions is a great tool in our toolbox, and like all tools it has its strengths and flaws. When it comes to logging and monitoring, Functions relies on Application Insights and, later on, Azure Monitor. A lot also depends on how you implement your solution, and there are some out-of-the-box features that are really amazing.

If you are unfamiliar with Functions and Application Insights, read more here: https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring

By just enabling Application Insights, without any further add-ons, all the logs that are sent to the ILogger object are forwarded to Application Insights. We also get some End-to-End tracking via supported SDKs and HttpClient requests. This is very useful for understanding what is happening, so let's look into how this manifests in a rather complex scenario.
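Before diving in, here is a minimal sketch of what "just works": the function name and message below are made up, but anything logged through the injected ILogger like this shows up as a trace in Application Insights with no extra configuration.

using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class PingFunction // hypothetical sample, not part of the scenario below
{
    [FunctionName("PingFunction")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        // This row ends up under traces in Application Insights, with {time} as a custom dimension
        log.LogInformation("Ping received at {time}", DateTime.UtcNow);
        return new OkObjectResult("pong");
    }
}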

The scenario: a message is received via API Management and sent to an Azure Function that publishes the message on a topic; a second Azure Function reads the published message and publishes it on a second topic; a third Azure Function receives the message and sends it over HTTP to a Logic App that represents a "3rd party system".

Scenario image

After connecting all functions to the same Application Insights instance, we send a message! Let's see how this looks in Application Insights; first we go to the Performance tab:

Performance tab

First, let's point out some small good-to-know things: the Operation Name in this view is actually the name of the function that you execute, so Function1 in the list of Operation Names below:

Operation Name

It has this signature in the code: FunctionName("Function1"). This is important: as your number of functions grows, make sure each function's name is unique so they are easy to find.

[FunctionName("Function1")]
[return: ServiceBus("topic1", Microsoft.Azure.WebJobs.ServiceBus.EntityType.Topic, Connection = "servicebusConnection")]
public static async Task<string> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    ILogger log)
{
    // ... function body, see the full sample further down
}

Then we pick one of the requests: press the samples link and pick one, and it loads the end-to-end overview.

End to end Overview

We can actually see all the steps from the first function to the last and the call to the Logic App, impressive! The message is being sent over Service Bus topics, but the correlation is all set up and we can follow the message all the way!

So this is truly awesome, and a lot of standard properties are set out of the box, but none specific to our request, so it's hard to understand what data is sent. Let's look at some ways of adding custom information to the logs.

Logs in Application Insights

Out of the box, logs are sent to Application Insights via the ILogger object, so let's take a short look at the interface. It contains three different operations: Log, which writes a log entry (there are different levels and extension methods on top of it to make it easier to use); IsEnabled, which checks whether the logger is enabled for a given level; and finally BeginScope, which creates a scope for our log entries. Read more

public interface ILogger
{
    //
    // Summary:
    //     Begins a logical operation scope.
    //
    // Parameters:
    //   state:
    //     The identifier for the scope.
    //
    // Returns:
    //     An IDisposable that ends the logical operation scope on dispose.
    IDisposable BeginScope<TState>(TState state);
    //
    // Summary:
    //     Checks if the given logLevel is enabled.
    //
    // Parameters:
    //   logLevel:
    //     level to be checked.
    //
    // Returns:
    //     true if enabled.
    bool IsEnabled(LogLevel logLevel);
    //
    // Summary:
    //     Writes a log entry.
    //
    // Parameters:
    //   logLevel:
    //     Entry will be written on this level.
    //
    //   eventId:
    //     Id of the event.
    //
    //   state:
    //     The entry to be written. Can be also an object.
    //
    //   exception:
    //     The exception related to this entry.
    //
    //   formatter:
    //     Function to create a string message of the state and exception.
    void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter);
}

Let's look at the simplest one: just adding a log text with LogInformation (an extension on Log) containing some unique data. Here we log our product number, so we can see which product number this request is processing:

	log.LogInformation("ProductNumber {productnumber}", productnumber);

Let's move to the collected telemetry to see this log entry in the Azure Portal:

Scenario image

And now we can find our log row:

ProductNumber log row

So now we know which product this request is processing. This is searchable in the Search menu, and a single click will take you to the End-to-End tracking.

ProductNumber log row

Create a log scope with BeginScope

Sometimes we need data that is added as properties to all upcoming log entries. This can be done via BeginScope: with the following line, all upcoming log entries will automatically have the market and product number added as properties, with a prefix of prop__.

	using (log.BeginScope("Function3 processed message for {market} with productnumber {productnumber}", market, productnumber))
	{
		log.LogInformation("Processing Lines");
	}

Upcoming log entries will have prop__market and prop__productnumber.

Properties added with Begin Scope

One good use is when looping over multiple items, since all entries will then have these properties added. All other properties added with the {} format in LogInformation, such as lineNumber and guid in the sample below, will also be added to customDimensions with the prefix prop__.

	 log.LogInformation("Processing {lineNumber} and random guid {guid}",i.ToString(),Guid.NewGuid().ToString());

And if we want to loop over multiple lines, we can add an extra, nested scope to make sure we log the unique data for each iteration. Below we log the lineNumber and a random guid, and as you can see we still have the productnumber and market from before, all with the prop__ prefix.

Properties added with Begin Scope

Code sample of the loop:

using (log.BeginScope("Function3 processed message for {market} with productnumber {productnumber}", market, productnumber))
{
    log.LogInformation("Processing Lines");
    for (int i = 0; i < 10; i++)
    {
        using (log.BeginScope("Function3 Processing line {lineNumber}", i.ToString()))
        {
            log.LogInformation("Processing {lineNumber} and random guid {guid}", i.ToString(), Guid.NewGuid().ToString());
        }
    }
}

Add Filterable Property to Request

Let's say that we don't just want to find a specific run; we want to understand overall performance or errors in wider terms, for example per market. This is done via the Performance or Failures tabs, but in order to narrow down the data presented we need to be able to filter it, as in the image below where a filter on Market is added.

ProductNumber log row

To get this working we need to add a property to our request log. (The request log is the first log event; see the telemetry above for more information.) The code to achieve this is super simple; we just need to add this small snippet:

System.Diagnostics.Activity.Current?.AddTag("Market", market);

The function now looks like:

public static class Function1
{
    [FunctionName("Function1")]
    [return: ServiceBus("topic1", Microsoft.Azure.WebJobs.ServiceBus.EntityType.Topic, Connection = "servicebusConnection")]
    public static async Task<string> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        string productnumber = (string)data.productnumber;
        string market = (string)data.marketnumber;

        log.LogInformation("ProductNumber {productnumber}", productnumber);
        System.Diagnostics.Activity.Current?.AddTag("Market", market);

        return JsonConvert.SerializeObject(data);
    }
}

This will now give us the following result:

In the Search section we can pick the Market in the filter to build our queries.

Search with add filter Market

In the Performance and Failures sections we need to do a bit more to get the filter working: we need to add customDimensions before Market, making the filter look like customDimensions.Market.

Search with add filter Market

And as you can see we get the filtered result after pressing the green confirm button.

Search with add filter Market

Adding more tags opens up more possibilities for search; note that a tag is attached only to the request log entry and not to the following traces. I would recommend logging both an id and a broader value, like market or similar, to find errors or performance issues that span a wider range.
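As a sketch of that recommendation (orderId here is a hypothetical unique identifier taken from the incoming message), tagging both a narrow and a broad value could look like this:

// Narrow tag: pinpoint one specific run
System.Diagnostics.Activity.Current?.AddTag("OrderId", orderId);
// Broad tag: slice Performance/Failures statistics per market
System.Diagnostics.Activity.Current?.AddTag("Market", market);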

Summary:

I love developing in Azure Functions, but in order to have a good operations experience there are a lot of things to consider in the logging strategy, to make sure it's easy to understand what is going on and to take advantage of the statistics provided by Application Insights.

This post shows how things can be done, and hopefully it can serve as guidance for creating a better logging and searching experience. We want to avoid the developer's "let me just attach my debugger" response to the question of what is wrong with a flow running in production.

Posted in: | Azure Functions  | Application Insights  | Tagged: | Azure Functions  | Integration  | Serverless  | Application Insights 


August 2019 update Logic App Template Creator

It has been some time since updates were announced on the Logic App Template Creator, but work has been ongoing thanks to contributors, so it's time to present the new updates.

There are some small changes and bug fixes as usual, and some reported issues have been fixed, but here is the new functionality.

The github repository is found here: https://github.com/jeffhollan/LogicAppTemplateCreator

Updates

  • ISE support, you can now extract Logic Apps in ISE for deployments
  • Extract custom Connector, you can now extract an ARM template for Custom Connectors
  • Extract Integration Account Schemas, you can now extract an ARM template for Integration Account Schemas.
  • Get the Deployed Logic App’s URL as an output parameter from the ARM template deploy (usable when URL is needed in nested ARM templates)
  • Improved Get-Help method to provide more valuable help and information
  • Extract Logic App that will be in disabled mode when deployed
  • Possibility for improved security around parameters and passwords generated in the ARM template and parameter files

Thanks to contributors!

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Template Generator 


Resource Group Deployments 800 Limit fix

There is a limit on how many historical deployments are saved for a resource group in Azure. Currently the limit is 800, and there is no out-of-the-box solution for deleting these historical deployments when the limit is closing in. So what happens is that the next deployment, the 801st, fails with error code DeploymentQuotaExceeded and a message as follows:

Creating the deployment 'cra3000_website-20190131-150417-7d5f' would exceed the quota of '800'. The current deployment count is '800', please delete some deployments before creating a new one. Please see https://aka.ms/arm-deploy for usage details.

To solve this we manually went into the resource group and deleted some deployments, but that isn't what we want to do in the long run. So I looked around, found some PowerShell scripts I could reuse for this purpose, and created a runbook in an Automation account.

I'm no pro on Automation accounts, but I can set them up and use them; there might be some of you who could give me a tip or two on this subject.

When creating the account we need to use the Azure Run As account, since this creates an account in AAD that will be used for access. By default the account will have Contributor access on the whole subscription (if you have the permissions to create it; otherwise you need someone who does).

Create Automation account

If we then look at the resource group we can find this account with Contributor access, which also means that if you only want this to apply to certain resource groups, all you need to do is restrict access for this user; when the script runs Get-AzureRmResourceGroup, only the resource groups this user has access to are returned (by default the whole subscription).

Automation account contributor

The script is simple, and for big environments it might need improvements such as parallel actions, but after running it on a schedule for a while that shouldn't be a big issue.

All in all, it collects all resource groups that the user has access to and starts collecting deployments. In this script I always skip resource groups with fewer than 100 deployments, since there is no pain in keeping those; it's only painful if we get over 800. If there are more, the script starts deleting deployments older than 90 days (by default) to keep some history; this can easily be changed in the script. It runs as a PowerShell runbook.

$daysBackToKeep = 90

$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection "
    $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName  

    "Logging in to Azure..."
    (Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint ) | out-null

    "Obtaining Subscription information..."
    $context = Get-AzureRmContext
    Write-Output ("Subscription Name: '{0}'" -f $context.SubscriptionName)
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found."
        throw $ErrorMessage
    } else {
        Write-Error -Message $_.Exception
        throw $_.Exception
    }
}

$ResourceGroups = Get-AzureRmResourceGroup

if ($ResourceGroupName)
{
    $ResourceGroups = $ResourceGroups | where { $_.ResourceGroupName -EQ $ResourceGroupName }
}

foreach ($resourceGroup in $ResourceGroups) {
    $rgname = $resourceGroup.ResourceGroupName
    Write-Output ("Resource Group: '{0}'" -f $rgname)

    $deployments = Get-AzureRmResourceGroupDeployment -ResourceGroupName $rgname
    Write-Output ("Deployments: '{0}'" -f $deployments.Count)

    if ($deployments.Count -gt 100) {
        $deploymentsToDelete = $deployments | where { $_.Timestamp -lt ((Get-Date).AddDays(-1 * $daysBackToKeep)) }
        Write-Output ("Deployments to delete: '{0}'" -f $deploymentsToDelete.Count)
        foreach ($deployment in $deploymentsToDelete) {
            Remove-AzureRmResourceGroupDeployment -ResourceGroupName $rgname -DeploymentName $deployment.DeploymentName -Force
            Write-Output ("Deployment deleted: '{0}'" -f $deployment.DeploymentName)
        }
        Write-Output ("Deployments deleted")
    }
    else {
        Write-Output ("Less than 100 deployments, only {0}; skipping" -f $deployments.Count)
    }
}

Running it will start deleting deployments. Here is the deployment history before a run:

Deployments history pre clear

And after, we can see that the history is deleted (for this run I just removed the 100 check):

Deployments history post clear

After running it the logs will look something like this:

Runbook logs

Summary:

This limit is annoying and, as I understand it, something that should be fixed over time, but until then we need these kinds of helper scripts in environments with a lot of deployments. Hopefully this helps out in the meantime.

Read more on the limits:

https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#resource-group-limits

Posted in: | ARM  | Tagged: | Azure  | ARM  | Azure Resource Manager  | Powershell  | Automation Account 


Fixing Function v2 add extension issues in portal

I'm out doing trainings, and it's both fun and a great chance for me to learn new stuff, since there is always something that for no apparent reason goes south.

Today I was out on one of these missions and we were doing some labs developing functions. When we added an extra input from a Table Storage, the Function App just crashed, and then we just got errors that the site was offline and it didn't come up again. Everybody encountered it, so I had to investigate.

The response was a 503 Service Unavailable and the body was the following HTML content (I removed the style since it just took a lot of space):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" >
<head>
    <title>Host Offline</title>
 </head>
<body>
    <div>&nbsp;</div>
    <div id="wrapper">
        
        <div id="page">
            <div id="content">

                <div class="box">
                    <h2>This Azure Functions app is down for maintenance</h2>
                    <p>
                        Host is offline
                    </p>
                </div>
            </div>
        </div>
    </div>
     
</body>
</html>

So what did we do to get here? We just wanted to add an input from a Table Storage to a standard HTTP trigger, starting from a blank new default function.

Here is my default HTTP function: Default Http Trigger

Then I went to the Integrate pane and started to add the input from my Azure Table Storage

Integrate Pane

After selecting the Azure Table Storage we get a warning that the extension is not installed, so let's install it!

Install Storage Extension

It’s installing and we proceed as told in the message:

Install Storage Extension Message

So we just continued, filled in the other details and pressed Save.

After that we went back to the function to add the input; it's just a simple signature update from:

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)

To:

public static async Task<IActionResult> Run(HttpRequest req,Newtonsoft.Json.Linq.JArray inputTable, ILogger log)
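(Side note: once the binding is healthy, the JArray contains one JSON object per table row; a minimal, hypothetical sketch of consuming it, assuming the default PartitionKey/RowKey columns, could look like this:)

foreach (var row in inputTable)
{
    // Each row is a JObject with the table's columns as properties
    log.LogInformation("Row {partitionKey}/{rowKey}", (string)row["PartitionKey"], (string)row["RowKey"]);
}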

Now when we run the function, inputTable should be populated from the Azure Table Storage, but when we try to execute it we just get an HTTP error code 503 Service Unavailable, and in the body I can see the text "Host Offline".

503 Service Unavailable Host Offline

We started to look into this and read that it could be due to the extension not being installed properly; to fix this we followed this guide: https://github.com/Azure/azure-functions-host/wiki/Updating-your-function-app-extensions

If you have the same problems, make sure that the extensions file exists and contains at least Microsoft.Azure.WebJobs.Extensions.Storage; if not, create the file and/or add Microsoft.Azure.WebJobs.Extensions.Storage to it. Sample below:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <WarningsAsErrors />
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage" Version="3.0.0" />
    <PackageReference Include="Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator" Version="1.0.*" />
  </ItemGroup>
</Project>  

If you are missing the file, it can easily be created in the command prompt with a simple command:

echo.>extensions.csproj

Run the build command (this takes a while to run so be patient):

dotnet build extensions.csproj -o bin --no-incremental --packages D:\home\.nuget

Restore extension build complete

Now the extension is installed; let's verify whether this solves the issue completely. Start the Function App and test the function again.

If you are as unlucky as me and the function is still not working, we need to do one more thing. Let's go back to Kudu and the CMD prompt again; the problem is the app_offline.htm file:

Restore extension build complete

Delete it, go back and test again, and it works!

Restore extension build complete

Summary:

The problem is that the extension is not properly installed: when we add the binding in the configuration without the extension installed, the Function App crashes and produces the app_offline.htm file, and as long as that file is present in the folder the response defaults to 503 and Host Offline. By removing the file the Function App starts executing as normal, and if we have fixed the extensions no errors come up. This works for all extensions.

By showing the problem and the repro scenario we can help improve the products! I hope this helps someone and leads to a fix from the product team.

Posted in: | Azure Functions  | Tagged: | Azure Functions  | Integration  | Serverless 


Logic App Template Creator Updates Januari 2019

Updates to the Logic App Template Creator have been published:

A minor thing for usage but great for consistency and quality in the generator is that there is now a build and release pipeline setup in DevOps.

  • Added Build & Release via DevOps to increase quality in merged sprints and automate release to PowerShell Gallery LogicAppTemplate
  • Improved support for Connectors to Storage Tables and Queues.
  • Added cmdlet to generate an ARM template for Integration Account Maps

The LogicAppTemplate module is now updated more frequently, since I've added a Build and Release setup to publish new releases to the PowerShell Gallery.

PS> Install-Module -Name LogicAppTemplate

Or update to the newest version

PS> Update-Module -Name LogicAppTemplate

Storage Connector Tables and Queues

These are now generated in the same way as the Storage Blob connector, meaning that the key is collected during deployment based on the storage account name, instead of needing to be provided as a parameter. This makes deployments simpler and neater.

There are 3 parameters added:

"azureblob_name": {
      "type": "string",
      "defaultValue": "azureblob"
    },
    "azureblob_displayName": {
      "type": "string",
      "defaultValue": "myDisplayName"
    },
    "azureblob_accountName": {
      "type": "string",
      "defaultValue": "myStorageAccountName",
      "metadata": {
        "description": "Name of the storage account the connector should use."
      }
    }

And they are later used in the connection to get the accountKey automatically during deployment.

 {
      "type": "Microsoft.Web/connections",
      "apiVersion": "2016-06-01",
      "location": "[parameters('logicAppLocation')]",
      "name": "[parameters('azureblob_name')]",
      "properties": {
        "api": {
          "id": "[concat('/subscriptions/',subscription().subscriptionId,'/providers/Microsoft.Web/locations/',parameters('logicAppLocation'),'/managedApis/azureblob')]"
        },
        "displayName": "[parameters('azureblob_displayName')]",
        "parameterValues": {
          "accountName": "[parameters('azureblob_accountName')]",
          "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
        }
      }
    }

All the magic is in the listKeys part, which tells ARM to collect the key based on a reference to the storage account. (This also means that the account doing the resource group deployment needs access to the storage account.)

[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('azureblob_accountName')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]

New Cmdlet Get-IntegrationAccountMapTemplate

Another improvement is that we can now extract a map from an Integration Account into a directly deployable ARM template. It works similarly to the other extractions, and below is a sample of how to get an Integration Account map as an ARM template.

Get-IntegrationAccountMapTemplate -ArtifactName 'mapname' -IntegrationAccount 'ianame' -ResourceGroup 'myresourcegroup' -SubscriptionId 'guid()' -TenantName 'mattiaslogdberg.onmicrosoft.com'

Summary:

Exporting Logic Apps via the Logic App Template Extractor simplifies the CI/CD story for Logic Apps: you develop inside the portal, extract the work, and set up the import to the next environment, without manually adding extra work.

Posted in: | API Management  | Tagged: | LogicApps  | Integration  | ARM Templates  | ARM 


API Management Template Creator Updates December 2018

Updates to the API Management Template Creator have been published:

I've had a lot of help this time from some awesome contributors, a big thanks to them: NilsHedstrom, Geronius, joeyeng, Lfalk.

A minor thing for usage but great for consistency and quality in the generator is that there is now a build and release pipeline setup in DevOps.

  • Improved support for Identity Providers, Products and more
  • Added automation for Build & Release via DevOps to increase quality in merged sprints and automate release to PowerShell Gallery
  • Added to PowerShell Gallery APIManagementTemplate
  • Split the template into a template per resource to follow API Management DevOps best practices

Now it's much easier to get started; just install the APIManagementTemplate module from the PowerShell Gallery.

PS> Install-Module -Name APIManagementTemplate

Or update to the newest version

PS> Update-Module -Name APIManagementTemplate

Split the Template to a template per resource

In order to follow the best practices from the Azure API Management DevOps example, there is now a new command, Write-APIManagementTemplates. This command takes a template as input and splits it into a file per resource, to make the files easy to manage and handle and to enable more customized deployments with a linked template. Read more at GitHub.

 Get-APIManagementTemplate -APIManagement MyApiManagementInstance -ResourceGroup myResourceGroup -SubscriptionId 80d4fe69-xxxx-4dd2-a938-9250f1c8ab03 | Write-APIManagementTemplates -OutputDirectory C:\temp\templates -SeparatePolicyFile $true

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. There is no need to manually add extra work; develop inside the portal, then just extract the work and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


Innovation enabled by Integration

Normally we focus on technical implementations and problem solving, but from time to time we get the opportunity to widen the view and focus on innovative thinking. This is when we can show our customers the real value of applying good architecture and principles. Currently I'm having so much fun doing innovative prototyping work with one of my customers that I need to share it with you. We are using Power Apps and Flow for our prototyping in order to get prototypes done fast and easily, without spending too much time and effort.

Power Apps is a platform where you can easily build apps in a point-and-click approach, targeting multiple device types at the same time; read more at: https://powerapps.microsoft.com/

Flow is a lightweight integration platform that runs on top of Logic Apps, it’s designed to make it easy to create your own flows and turn repetitive tasks into automated workflows, read more at https://emea.flow.microsoft.com/

These platforms are very easy to get started with and there are several samples of applications regarding office365 and common applications. But the real power for innovation comes clear if you have a good API landscape that could be used when building these applications.

We have a customer that we started working with almost a year ago, and now that all the hard work is done and we have a good integration landscape, we can harvest the benefits of our good architecture. I started by showing the case described below, and now they really hunger for more of this kind of innovation, both for internal and external use. It's wonderful to see how our hard work pays off, showing that we really are the "integration heroes": we are the enablers, making this possible.

Architecture that enables innovation?

In reality it's quite simple: create a platform that contains APIs and flows that can be reused. Make sure to build domain-model-based APIs and function-driven APIs, and focus on making them simple to use and, most importantly, reusable. Here is a sample architecture:

Integration Architecture Sample

And in our API Management instance we have created a few domain model APIs:

API List Sample

A few months back we helped replace an internal system for product information: a system where you could enter a serial number and get basic information about the product. The end solution was a website that used a couple of APIs to retrieve the information needed. The fun part was that most of the APIs had already been created in other projects for similar needs, but were used from other systems/apps. All we did was fine-tune an API, leaving us time to innovate. On a computer it makes sense to type in a serial number, but on a mobile phone we expect to be able to scan the serial. Since we had the time, we started to investigate the problems around this and whether it was solvable.

We started to look into this, and the first "problem" was that the serial plates on the machines had no barcodes, only pure text:

Serial Plate Sample

Luckily this function exists in the Cognitive Services in Azure; read more at Azure Cognitive Services.

We started to test this with a Power App that used a Flow to send the image to the Cognitive Service. The Power App solution is very simple: it uses the camera to take a picture and then sends the picture to a Flow that uses the Cognitive Service to identify the serial number on the plate. The result from the Flow is a URL with the serial number, which we use to launch the website, making use of the web app. With this we had extended the functionality with a serial plate scanner and provided a basis for the decision on investing in a mobile app.

Here is the flow:

Microsoft Flow Overview

Here is the Power App in the Power Apps Studio developed in my browser:

Microsoft Power App

As you can see, there is not much going on: just a Camera control with an OnSelect statement that launches the web browser with the URL that comes back from our Flow "Get-Path-To-Product".

The flow is also quite simple:

Microsoft Flow Solution

The tricky part was how to send the image to the Cognitive Services; in this case we created a Compose action with a dataUriToBinary function:

Microsoft Flow Action Compose: dataUriToBinary(triggerBody()['Skapa_Indata'])

And in the HTTP action we send the binary data to the OCR service: (in the sample below the OCR service is behind our API Management instance)

Microsoft Flow Action HTTP

All in all, the prototype was easy and fast to build, but most importantly it successfully gave us the power to demo the concept. Providing a visual demo showed the value of having an app that can scan the serial plate, and gave decision makers a good understanding of the benefits of investing in a native app, without spending too much time and resources.

Summary:

The app is currently under development, and we have more of these innovative prototypes coming up, prototyping new or upgraded functionality in a variety of sizes. The best part of this approach is that we can create a prototype in a day or two, giving decision makers a visual and testable solution to review. All thanks to our hard work with APIs, workflows and the amazing tools Power Apps and Flow!

Become the integration hero that you really can be and set that unused innovation power free!

Help Business Evolve

Posted in: | Innovation  | Power Apps  | Flow  | Tagged: | Innovation  | Integration  | Architecture  | Integration Architecture  | Power Apps  | Flow 


API Management Template Creator Updates August 2018

Updates to the API Management Template Creator were published a few weeks back:

A minor thing for usage, but great for consistency and quality in the generator, is that a lot of tests have been added.

  • Functions support added
  • Updated Logic App Support
  • Removed some name parameter generation

Functions support added

Finally, support for APIs that are generated from Functions is live. An API generated from Functions now gets the appropriate automatic generation of function keys via ARM functions, meaning that you don't need to supply the function keys in the parameter files; they are collected automatically via the listsecrets function in ARM.

Now:

{
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/5b41934c6d0f59440d20c5ee",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_ibizmalo_name'),'/','5b41934c6d0f59440d20c5ee')]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "maloapimtest_HttpTriggerAdminKey_query_5b41934c6d0f59440d20c5ee",
        "value": "[listsecrets(resourceId(parameters('FunctionApp_maloapimtest_resourceGroup'),'Microsoft.Web/sites/functions', parameters('FunctionApp_maloapimtest_siteName'), parameters('operations_api-HttpTriggerAdminKey-post_name')),'2015-08-01').key]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Previously:

parameters: {
    "INT2051functionkey_value": {
      "type": "securestring",
      "defaultValue": "Aze/1W8wCwItuP8JacdyHa2vDw8YScrOkbq6uHcTXiOasb2wi3kZoQ=="
    }
 }
	. . . 
 {
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int2051functionkey",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_cpidev_name'), '/' ,parameters('property_INT2051functionkey_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "INT2051functionkey",
        "value": "[parameters('INT2051functionkey_value')]",
        "tags": null,
        "secret": true
      },
      "resources": [],
      "dependsOn": []
}

Updated Logic App Support

Two bigger things have been targeted to improve the export and later deployment experience for APIM APIs that use Logic Apps.

Trigger name fix

A Logic App trigger can now have any name; previously it was hardcoded to 'Manual'. Logic has been added to retrieve the information at extraction time: the extractor fetches the Logic App definition and finds the HTTP trigger to get the correct name.

The trigger name is extracted from the Logic App and added to the ARM functions (customtriggername in the sample below):

Logic App Trigger name

Will result in (note the ‘customtriggername’):

"[listCallbackUrl(resourceId(parameters('LogicApp_customtrigger_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_customtrigger_logicAppName'), 'customtriggername'), '2017-07-01').queries.sig]"

Add Logic App as operation to existing API fix

There has been different behavior for APIs generated from a Logic App imported as a new operation on an existing API. The main difference was that the sig was added as a "normal" parameter, with no ARM functionality added to generate it. This is now addressed, and the appropriate ARM functions are added and generated automatically for both types.

Properties will get their value from the listCallbackUrl function:

{
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int7100-payment-to-o-test_manual-invoke_5b349f2a42974a989226cf33",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_apidev_name'), '/' ,parameters('property_int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33",
        "value": "[listCallbackUrl(resourceId(parameters('LogicApp_INT7100-Payment-To-o-DEV_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_INT7100-Payment-To-o-DEV_logicAppName'), 'request'),'2017-07-01').queries.sig]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Removed some name parameters

Parameters generated for the names of resources like policies, version sets, operations etc. have no real use case or benefit associated with changing the name, and have therefore been removed to simplify the ARM templates.

Sample of parameters that are now removed:

    "versionset_5af9817ca656c6952416b779_name": {
      "type": "string",
      "defaultValue": "5af9817ca656c6952416b779"
    },
    "operations_GetSalesOrderById_name": {
      "type": "string",
      "defaultValue": "GetSalesOrderById"
    },
    "operations_policy_name": {
      "type": "string",
      "defaultValue": "policy"
    }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. There is no need to manually add extra work; develop inside the portal, then just extract the work and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


API Management Template Creator Updates May 2018

Updates to the API Management Template Creator have been dragging, but now I'm pleased to have fixed the two most urgent ones.

  • Support for Versions, with version sets
  • Schemas

Support for Versions, with Version Sets

If an extracted API has versions enabled, the current version set is also extracted and provided inside the ARM template. The version set states the version that is used and is needed by the API to be able to support versions; it contains the information about how versioning is handled for the specific API. The sample below uses an HTTP header named API-Version to set the version of the API; the version itself is then set on the API.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/api-version-sets

{
      "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/api-version-sets/5ae6f90fc96f5f1090700732",
      "type": "Microsoft.ApiManagement/service/api-version-sets",
      "name": "[concat(parameters('service_dev_name'), '/' ,parameters('versionset_5ae6f90fc96f5f1090700732_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "Arkivet",
        "description": null,
        "versioningScheme": "Header",
        "versionQueryName": null,
        "versionHeaderName": "API-Version"
      },
      "resources": [],
      "dependsOn": []
    }

Support for Schema

If an API has schemas added to operations, these schemas will be extracted and added to the ARM template; all operations will also have a "dependsOn" to the specific schema to prevent errors when executing the ARM template. The sample below is a simple schema that is added to the ARM template and referenced in the operations section.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/apis/schemas

{
          "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/apis/arkivet/schemas/5af038365b73730dd01453ad",
          "type": "Microsoft.ApiManagement/service/apis/schemas",
          "name": "[concat(parameters('service_dev_name'),'/',parameters('api_arkivet_name'), '/5af038365b73730dd01453ad')]",
          "apiVersion": "2017-03-01",
          "properties": {
            "contentType": "application/vnd.ms-azure-apim.swagger.definitions+json",
            "document": {
              "definitions": {
                "Body": {
                  "type": "object",
                  "properties": {
                    "No": {
                      "type": "string"
                    },
                    "ReportedDate": {
                      "type": "string",
                      "format": "date-time"
                    }
                  }
                },
                "BodyExists": {
                  "type": "object",
                  "properties": {
                    "CivicNo": {
                      "type": "string"
                    }
                  }
                },
                "Post400ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "message": {
                      "type": "string"
                    },
                    "description": {
                      "type": "string"
                    },
                    "errors": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "field": {
                            "type": "string"
                          },
                          "message": {
                            "type": "string"
                          }
                        }
                      }
                    },
                    "hasErrors": {
                      "type": "boolean"
                    }
                  }
                },
                "ExistsPost200ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "arkivet": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          },
          "resources": [],
          "dependsOn": []
        }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI side of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc. There is no need to manually add extra work; develop inside the portal, then just extract the work and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


Access AAD Secured Web API's from API Management

Protecting Web Apps and Web APIs with the built-in Authentication and Authorization in Azure App Service is a great way to protect resources without adding code to handle the authorization. This means that the site or API is fully secured without you needing to implement it, which is a great example of separation of concerns. Read more on how it works

What we then need is to access the API, and in this case we need to access it via API Management using service principals, since the client using our APIs in API Management is outside our organization and should not be bothered by our internal security mechanisms.

The scenario: the API is protected by Azure AD, and in order to access it we need to add a valid JWT token to the Authorization header of the request. First (1) we need to get the JWT token, and then (2) add it to the request to the API App.

Scenario Overview

In this post we have used the standard generated API App from Visual Studio and published it to a Web API instance in Azure named Protected.

Enabling the built-in Authentication and Authorization on the API

In the Azure Portal, head to the API App and go into the settings tab "Authentication/Authorization". Enable App Service Authentication by pressing the On button.

Enable Authentication/Authorization

Now we can enable several different providers; in this demo we will focus on Azure Active Directory. Press Azure Active Directory.

Configure Azure AD Authentication

After choosing Azure Active Directory, we are presented with two options for configuring the setup. Express gives some guidance and the possibility to create or select an existing AD application, while in Advanced we can type in the values needed: the AAD application client id and the issuer, normally https://sts.windows.net/{tenantguid}/ where {tenantguid} is the tenant guid. In this setup we will use the Express option and create a new AD application.

Here we can choose to use an existing AD Application or create a new one, I will create a new one named ProtectedAppAAD.

New Azure AD App

After this is set up we can save and test that the API is protected. If you open a new browser or log out from your account, you should be forced to log in; the site is now secured.

Login Sign

So now the site is protected and we can access it when logged in; next we need to be able to access the API via API Management.

Provide access to the API

The API is now protected, and we need to provide a way for API Management to access it. We will use another AAD application (App Registration) and grant it access to ProtectedAppAAD. So we need to create a new application, named APIManagementAADApp; the sign-in URL can be http://localhost.

Create a new AAD Application

After the AAD application is created, we need to grant it access to ProtectedAppAAD. This is done via permissions: to assign permissions go to Settings, press Required permissions and then Add.

AAD Application add permission

Under Select an API we need to search for ProtectedAppAAD, since at first the list only shows Microsoft standard APIs. Enter ProtectedAppAAD in the text box and press Select.

AAD Application add permission select API

Now we need to select the permission, and the one we want is Access. Select it, press Select, and finish the setup by pressing Done.

AAD Application add permission select API

The last step is pressing Grant Permissions to enable the permissions (do not forget this!).

AAD Application add permission select API

API Management Policy for Acquiring a JWT Token

In order to expose this API we need to get a token from AAD using the application. This is done inside a policy, and luckily for us the API Management team has provided a set of code snippets on GitHub, one of which does exactly that; get it here.

There are more of them in the repository on GitHub, and if you have a great snippet that you want to share you can open a pull request; if the team approves it, it will end up there. Here is the GitHub repository: https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/.

The Get OAuth2 access token from AAD snippet looks like this:

<!-- The policy defined in this file provides an example of using OAuth2 for authorization between the gateway and a backend. -->
<!-- It shows how to obtain an access token from AAD and forward it to the backend. -->

<!-- Send request to AAD to obtain a bearer token -->
<!-- Parameters: authorizationServer - format https://login.windows.net/TENANT-GUID/oauth2/token -->
<!-- Parameters: scope - a URI encoded scope value -->
<!-- Parameters: clientId - an id obtained during app registration -->
<!-- Parameters: clientSecret - a URL encoded secret, obtained during app registration -->

<!-- Copy the following snippet into the inbound section. -->

<policies>
  <inbound>
    <base />
      <send-request ignore-error="true" timeout="20" response-variable-name="bearerToken" mode="new">
        <set-url>{{authorizationServer}}</set-url>
        <set-method>POST</set-method>
        <set-header name="Content-Type" exists-action="override">
          <value>application/x-www-form-urlencoded</value>
        </set-header>
        <set-body>
          @{
          return "client_id=&resource=&client_secret=&grant_type=client_credentials";
          }
        </set-body>
      </send-request>

      <set-header name="Authorization" exists-action="override">
        <value>
          @("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])
	  </value>
      </set-header>

      <!--  Don't expose APIM subscription key to the backend. -->
      <set-header exists-action="delete" name="Ocp-Apim-Subscription-Key"/>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>

The way it works is that two things happen before the request is sent to the backend.

  1. First, the send-request action makes a request to the AAD tenant, acquiring a token.
  2. The set-header action adds the token to the Authorization header, extracting it from the result of the first request.
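To make those two steps concrete outside of policy XML, here is a rough, hypothetical C# equivalent of what the policy does (the tenant id, application ids, secret and API URL are all placeholders):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class ClientCredentialsDemo
{
    static async Task Main()
    {
        var http = new HttpClient();

        // 1. Acquire a token from AAD using the APIManagementAADApp credentials
        var tokenResponse = await http.PostAsync(
            "https://login.microsoftonline.com/{tenant-id}/oauth2/token",
            new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["client_id"] = "{APIManagementAADApp-application-id}",
                ["client_secret"] = "{APIManagementAADApp-key}",
                ["resource"] = "{ProtectedAppAAD-application-id}",
                ["grant_type"] = "client_credentials"
            }));
        var accessToken = (string)JObject.Parse(
            await tokenResponse.Content.ReadAsStringAsync())["access_token"];

        // 2. Forward the token to the protected backend in the Authorization header
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        var apiResponse = await http.GetAsync("https://{protected-api}.azurewebsites.net/api/values");
        Console.WriteLine(apiResponse.StatusCode);
    }
}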

But before we can add this snippet to our API we need to add a few values to the Named Values section in our API Management instance.

  • authorizationServer - the token endpoint of the tenant. My tenant ID is 1cb87777-3df4-428b-811a-86a0f215cd35, so the URL is https://login.microsoftonline.com/1cb87777-3df4-428b-811a-86a0f215cd35/oauth2/token
  • clientId - the application id of our AAD application, in this case APIManagementAADApp
    • AAD Application id APIManagementAADApp
  • clientSecret - the key of our AAD application, in this case APIManagementAADApp
    • Create a new key Create AAD Application key APIManagementAADApp
  • scope - the application id of the AAD application that is protecting our resource, in this case the id of ProtectedAppAAD
    • AAD Application id ProtectedAppAAD

After adding these, the Named Values section of our API Management instance will end up looking like this. Make sure to add them before adding the policy, since the policy cannot be saved if a referenced Named Value doesn't exist.

AAD Application id ProtectedAppAAD

Now we just need to add the policy to our API and we will be able to access the protected API. After the policy is saved, test it and you should receive the result from the API.

AAD Application id ProtectedAppAAD

Summary:

Today's solutions almost always access other resources, and when building this landscape we need to make sure that all parts are protected and secure; often we even need to combine a few mechanisms to be compliant with customer requirements. In this post I showed how to use the built-in security of Web Apps with AAD, which can be used standalone or as an extra layer of security on top of the "standard" Basic, OAuth, API keys, certificates etc. authentications. Another benefit is that it is not part of the code, so it's not possible to "forget" it or accidentally remove it in a bug fix or similar.

AAD Application id ProtectedAppAAD

As a bonus, the API Management policy snippets repository is a really nice initiative with pre-created advanced snippets, so make sure to check it out.

https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/

Posted in: | API Management  | Tagged: | API Management  | Integration  | Security