Centralize secrets in Azure Key Vault

When working with usernames, passwords, or API keys, they need to be stored in a secure and manageable way. Usually I find that these are added to Application Settings and handled manually in several places. This is not a desirable way of working, and it may look something like this, with secrets spread out in all the areas marked with a red circle:

Normal setup

The first step is to centralize the values, and I find that Azure Key Vault is a superb place for storage: we get RBAC support for granular security, making it possible for a developer to access the values via code or when deploying via an ARM template, but not to see or edit them. With this we can make sure that passwords, usernames, and other secrets are not spread via email, stored in Dropbox, or kept in other “safe” places when distributed to the developers to add to the App Settings or to store in parameter files for our ARM templates. It also adds reusability: if a value is needed both in a Function and in a Logic App, we can make sure the same value is used and manage it in one place.

So let’s take this sample, where our Key Vault holds sqluser and sqlpassword, to be used both in a Function and when creating a Logic App Connector.

Key Vault sample secrets

Finding the Key Vault resource id: Key Vault resource id
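The resource id follows a fixed format, so it can also be composed from the subscription id, resource group name, and vault name; a quick shell sketch using the sample values from this article:

```shell
# Compose a Key Vault resource id from its parts (sample values from this article).
subscriptionId="fake-a4af-4bc9-a733-f88f0eaa4296"
resourceGroup="testenvironment"
vaultName="ibizmalotest"

vaultId="/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}/providers/Microsoft.KeyVault/vaults/${vaultName}"
echo "$vaultId"
```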

ARM Deployment Logic App Connector

Using Key Vault with an ARM deployment requires that the Key Vault has access enabled for Azure Resource Manager template deployment, and that the principal (AAD Application) used when deploying has permission to access the secrets.

Enable ARM Deployment with Key Vault

After this, secrets are accessible via the ARM template parameter file (only from the parameter file or from a wrapping ARM template). Here is a sample of how it can look in a parameter file:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": {
      "value": "INT001-SQL"
    },
    "location": {
      "value": "West Europe"
    },
    "sqlusername": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/fake-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/testenvironment/providers/Microsoft.KeyVault/vaults/ibizmalotest"
        },
        "secretName": "sqluser"
      }
    },
    "sqlpassword": {
      "reference": {
        "keyVault": {
          "id": "/subscriptions/fake-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/testenvironment/providers/Microsoft.KeyVault/vaults/ibizmalotest"
        },
        "secretName": "sqlpassword"
      }
    }
  }
}

When doing a Resource Group Deployment, the value is collected from Key Vault at deployment time and then stored in the API Connection. (Remember, deployment time means that if the value changes, a new deployment is needed to reflect the change in the API Connection.)

Now the secrets are stored in Key Vault, and the developer setting this up doesn’t know the actual values in test/prod, only the secrets’ names. The deployment flow looks as follows: during deployment, the secrets are collected from Key Vault (1) and used when creating/updating the API Connection (2). This also means that no secrets are stored in the ARM template parameter file, and that is great!

ARM Deployment with Key Vault

IMPORTANT: if the value in Key Vault is changed, a new deployment is required to update the API Connection.

Accessing Secrets from Functions/Web Apps or other C# applications

Any C# application can easily use the Key Vault SDK to retrieve secrets from Key Vault at runtime, but we need a few things to make this possible: from code we need an AAD Application in order to be granted access to the secrets. I will not go through how to create one here; read more here

When it’s created we need to get the Application ID (clientId) and one Key (clientSecret).

Get AAD Application information

Now that we have collected the information needed, we also need to make sure that our AAD Application has access to the secrets; that is configured on the Key Vault under “Access Policies”:

Set Access Policies

Press “Add new” and select the principal (AAD Application), in my case KeyVaultApp. Under Secret Permissions we only want and need Get; no more privileges should be given to the policy.

Set Access Policies Settings

Now to start coding we need some NuGet packages:

  • Microsoft.Azure.KeyVault
  • Microsoft.IdentityModel.Clients.ActiveDirectory

And a bit of code, sample below; for more in-depth coverage of how to use this in Functions, read Jeff Hollan’s blog post:

Code to retrieve the secrets:

public static class KeyVaultFunction
{

    private static string clientID = Environment.GetEnvironmentVariable("clientId", EnvironmentVariableTarget.Process);
    private static string clientSecret = Environment.GetEnvironmentVariable("clientSecret", EnvironmentVariableTarget.Process);       
    private static string keyvaultname = Environment.GetEnvironmentVariable("keyvaultname", EnvironmentVariableTarget.Process);    

    [FunctionName("KeyVaultFunction")]
    public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        string secretUri = $"https://{keyvaultname}.vault.azure.net/secrets/";

        var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));
        log.Info("Token acquired");
        var sqlusernameSecret = await kv.GetSecretAsync(secretUri + "sqluser");
        var sqlpasswordSecret = await kv.GetSecretAsync(secretUri + "sqlpassword");

        // do some work with the secrets here

        return req.CreateResponse(HttpStatusCode.OK, "SQL credentials collected ok");
    }

    public static async Task<string> GetToken(string authority, string resource, string scope)
    {
        var authContext = new AuthenticationContext(authority);
        ClientCredential clientCred = new ClientCredential(clientID, clientSecret);
        AuthenticationResult result = await authContext.AcquireTokenAsync(resource, clientCred);

        if (result == null)
            throw new System.Exception("Failed to obtain the JWT token");

        return result.AccessToken;
    }
}

First the settings are added to the Function App Application Settings:

private static string clientID = Environment.GetEnvironmentVariable("clientId", EnvironmentVariableTarget.Process);
private static string clientSecret = Environment.GetEnvironmentVariable("clientSecret", EnvironmentVariableTarget.Process);       
private static string keyvaultname = Environment.GetEnvironmentVariable("keyvaultname", EnvironmentVariableTarget.Process);
private static string secretname = Environment.GetEnvironmentVariable("secretname", EnvironmentVariableTarget.Process);

To handle the authentication we need a method to pass in to the AuthenticationCallback object. It takes three parameters: authority is the login authority, e.g. “https://login.windows.net/1cb87777-3df4-428b-811a-86a0f215cd35”; resource is the resource we are trying to access (Key Vault): “https://vault.azure.net”; and scope is the scope we are accessing with, normally an empty string.

public static async Task<string> GetToken(string authority, string resource, string scope)
{
    var authContext = new AuthenticationContext(authority);
    ClientCredential clientCred = new ClientCredential(clientID, clientSecret);
    AuthenticationResult result = await authContext.AcquireTokenAsync(resource, clientCred);

    if (result == null)
        throw new System.Exception("Failed to obtain the JWT token");

    return result.AccessToken;
}
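For context, AcquireTokenAsync issues a client-credentials token request against the authority. A rough shell sketch of the equivalent raw HTTP call, useful for troubleshooting (the clientId/clientSecret values below are placeholders, not from the article):

```shell
# Rough sketch of the client-credentials request ADAL sends (placeholder credentials).
tenant="1cb87777-3df4-428b-811a-86a0f215cd35"   # tenant id taken from the authority URL above
clientId="00000000-0000-0000-0000-000000000000" # placeholder AAD Application id
clientSecret="placeholder-key"                  # placeholder AAD Application key

# Note the resource is URL-encoded https://vault.azure.net
body="grant_type=client_credentials&client_id=${clientId}&client_secret=${clientSecret}&resource=https%3A%2F%2Fvault.azure.net"

# The access token comes back in the JSON response:
# curl -s -X POST "https://login.windows.net/${tenant}/oauth2/token" -d "$body"
echo "$body"
```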

Then the code part that actually collects the secret is the following: we need the URI to the secret, composed of the Key Vault name and the secret name, and the GetToken function to pass in to the AuthenticationCallback object.

string secretUri = $"https://{keyvaultname}.vault.azure.net/secrets/";
var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));
var sqlusernameSecret = await kv.GetSecretAsync(secretUri + "sqluser");

After this setup is completed, we are collecting the secrets from Key Vault at runtime; there are no more secrets stored in application settings or in code, and we have reused the same secrets in different areas.

Secrets accessed from KeyVault in code

Summary:

I really like to remove usernames and passwords from code and settings, since they are always a mess to manage and handle, and there are a few security aspects to consider as well. Being able to get a username/password or other secrets from Key Vault inside your Function or Web App, and at the same time use them when deploying ARM templates, like with Logic Apps API Connections or API Management, is really the true win, since we now only need to manage these in one place and can use them wherever needed.

In the end we are creating a solution like the image below. This gives us a centralized place for our secrets, and we only spread them when we really need to, like to an API Connection, and that is done integrated with ARM.

Secrets centralised in KeyVault

OBS! Note that if we access Key Vault via code, an update is reflected immediately, while when used via ARM, a new deployment is needed for the new value to reach, for example, the password of an API Connection, making it vital to have good build and release processes.

Posted in: | Key Vault  | Tagged: | Key Vault  | Integration  | Security 


Azure Functions npm install via VSTS

When setting up build and release cycles for different Azure resources we bump into different challenges. This post covers the challenge of running npm install of Node modules into a Function App when doing deployments.

When we are working in our development environment it’s okay to go in and run Kudu commands and so on to make sure our Function App has the correct packages installed, but as we add speed and agility to our processes, our release pipelines in VSTS have to automate more of these steps. It’s not that it’s hard to go into Kudu and run the command, but it’s a manual step that takes time and knowledge when doing a release. I prefer to have everything prepped so that I know things are working when the release is installed.

So let’s look into how we can run Kudu commands from VSTS. Since Kudu has a REST API, we can use it for these kinds of tasks; read more on it here: Kudu Rest API

When looking into the API documentation we easily find a function for executing commands:

POST /api/command
{
    "command": 'echo Hello World',
    "dir": 'site\\repository'
}

This means that we could create a message that looks like this for an npm install of the request package:

{
    "command": "npm install request",
    "dir": "site\\wwwroot"
}

So let’s start figuring out the rest: how do we execute this from VSTS?

As good as it gets, this can be executed via PowerShell, explained and sampled at the bottom of the API reference (I changed the sample in the following snippet to execute the command for npm install):

$username = "`$website"
$password = "pwd"
# Note that the $username here should look like `SomeUserName`, and **not** `SomeSite\SomeUserName`
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install ' + $npmpackage + '","dir": "site\\wwwroot"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"
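The same Basic-auth header can be composed in a plain shell, which is handy for testing the Kudu call with curl before wiring it into VSTS (the credentials below are the dummy values from the snippet above):

```shell
# Build the Kudu Basic-auth header and command body by hand (dummy credentials).
username='$website'
password='pwd'

auth=$(printf '%s' "${username}:${password}" | base64)
command='{"command": "npm install request", "dir": "site\\wwwroot"}'

# curl -X POST "https://<functionAppName>.scm.azurewebsites.net/api/command" \
#      -H "Authorization: Basic ${auth}" -H "Content-Type: application/json" -d "$command"
echo "$auth"
```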

The credentials are now needed, and that is somewhat messy, but since they are the same as the deployment credentials, we can get that information via the Invoke-AzureRmResourceAction command:

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

So with this information we can now build a more generic script that only needs three parameters to execute an npm install command on our Function App:

  • functionAppName: the name of the Function App
  • resourceGroup: the name of the resource group that contains the Function App
  • npmpackage: the npm package to install

param([string] $functionAppName, [string] $resourceGroup, [string] $npmpackage)

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

$username = $creds.Properties.PublishingUserName
$password = $creds.Properties.PublishingPassword


$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install ' + $npmpackage + '","dir": "site\\wwwroot"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"

Now all we have left is to execute this script in an Azure PowerShell task that installs the npm package during the release. The image below shows the release setup, and as you can see the parameters are added in the “Script Arguments” input area. I’ve also added the script to a shared repo and linked the build setup to it, to be able to share the script and manage it in one place.

VSTS Release Setup

If you are using package.json files, we can make sure all packages are installed via the npm install command. Let’s see how this could look with a package.json file:

package.json file in project
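The package.json itself isn’t shown here; a minimal sketch (the name and the request dependency are just illustrative) could look like this:

```json
{
  "name": "int001-function",
  "version": "1.0.0",
  "dependencies": {
    "request": "^2.81.0"
  }
}
```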

Modifying the PowerShell script will then give us the following:

  • functionAppName: the name of the Function App
  • resourceGroup: the name of the resource group that contains the Function App
  • functionfolder: the name of the folder that the function is in (same name as the function)

param([string] $functionAppName, [string] $resourceGroup, [string] $functionfolder)

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

$username = $creds.Properties.PublishingUserName
$password = $creds.Properties.PublishingPassword

$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install","dir": "site\\wwwroot\\'+ $functionfolder +'"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"

Summary:

I like to automate these tasks since it gives me an “easier” and more reliable release. But as of now it’s hard to verify that the package is installed, or whether it was installed previously, so a verification step checking that the function loads correctly might be needed.

The package.json approach is recommended since it shows the dependencies and allows you to easily add new packages, but keep in mind that it also adds some delay when running the npm install command, since it will install a lot of packages the first time.

Posted in: | Azure Functions  | Tagged: | Azure Functions  | Integration  | Deployments 


Azure Functions - Manage Application Settings via ARM

One of the most common tasks when working with deployments is handling application settings. This is where we add environment variables to the Azure Function App; these often have different values in different environments (Dev/Test/Prod), so we need to manage them to make sure they have the correct values.

These can be managed manually, since they rarely change, but that requires discipline, good release documentation, and sometimes quite advanced procedures to get the information; some of these cases are, for example, getting the URL of a Logic App or a value from Key Vault, or we might simply value quality and reliable deployments.

I prefer to manage these settings via ARM templates, since that gives me the possibility to get values from Key Vault or automate the process of getting values from other Azure resources, and at the same time gives me control and robustness, so I know that the settings are set and that the values are correct. A note on this: make sure that operations people are aware that settings are set from ARM template deployment, since if they make changes directly in the Azure Function’s Application Settings tab, those will be overwritten at the next deployment, and that might cause problems.

Let’s look into the ARM template; I’ve just copied this from the Azure portal (the automation option when creating a new Azure Function).

{
  "parameters": {
    "name": {
      "type": "string"
    },
    "storageName": {
      "type": "string"
    },
    "location": {
      "type": "string"
    }
  },
  "resources": [
    {
      "name": "[parameters('name')]",
      "type": "Microsoft.Web/sites",
      "dependsOn": [
        "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageName'))]",
        "[resourceId('microsoft.insights/components/', parameters('name'))]"
      ],
      "properties": {
        "siteConfig": {
          "appSettings": [
            {
              "name": "AzureWebJobsDashboard",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "AzureWebJobsStorage",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "FUNCTIONS_EXTENSION_VERSION",
              "value": "~1"
            },
            {
              "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "WEBSITE_CONTENTSHARE",
              "value": "[concat(toLower(parameters('name')), 'a217')]"
            },
            {
              "name": "WEBSITE_NODE_DEFAULT_VERSION",
              "value": "6.5.0"
            },
            {
              "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
              "value": "[reference(resourceId('microsoft.insights/components/', parameters('name')), '2015-05-01').InstrumentationKey]"
            }
          ]
        },
        "name": "[parameters('name')]",
        "clientAffinityEnabled": false
      },
      "apiVersion": "2016-03-01",
      "location": "[parameters('location')]",
      "kind": "functionapp"
    },
    {
      "apiVersion": "2015-05-01-preview",
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageName')]",
      "location": "[parameters('location')]",
      "properties": {
        "accountType": "Standard_LRS"
      }
    },
    {
      "apiVersion": "2015-05-01",
      "name": "[parameters('name')]",
      "type": "microsoft.insights/components",
      "location": "West Europe",
      "tags": {
        "[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', parameters('name'))]": "Resource"
      },
      "properties": {
        "ApplicationId": "[parameters('name')]",
        "Request_Source": "IbizaWebAppExtensionCreate"
      }
    }
  ],
  "$schema": "http://schema.management.azure.com/schemas/2014-04-01-preview/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0"
}

The interesting section is appSettings under siteConfig, shown below.

Here are the standard properties used when deploying an Azure Function, and here we can add new Application Settings. (Note that ARM functions are used to get the keys to the storage account.)

"properties": {
        "siteConfig": {
          "appSettings": [
            {
              "name": "AzureWebJobsDashboard",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "AzureWebJobsStorage",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "FUNCTIONS_EXTENSION_VERSION",
              "value": "~1"
            },
            {
              "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
              "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
            },
            {
              "name": "WEBSITE_CONTENTSHARE",
              "value": "[concat(toLower(parameters('name')), 'a217')]"
            },
            {
              "name": "WEBSITE_NODE_DEFAULT_VERSION",
              "value": "6.5.0"
            },
            {
              "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
              "value": "[reference(resourceId('microsoft.insights/components/', parameters('name')), '2015-05-01').InstrumentationKey]"
            }
          ]
        },

Adding a new property to the list is easy; adding a new key customkey looks like this. We also add a new ARM parameter so we can change the value between environments:

    {
      "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
      "value": "[reference(resourceId('microsoft.insights/components/', parameters('name')), '2015-05-01').InstrumentationKey]"
    },
    {
      "name": "customkey",
      "value": "[parameters('customkeyparam')]"
    }
  ]
},

Added parameter:

"location": {
  "type": "string"
},
"customkeyparam": {
  "type": "string"
}

And now we can use this with a parameter file. I copied the parameter file the same way, and now we can update it, adding the new parameter:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": {
      "value": "ibizmalotest"
    },
    "storageName": {
      "value": "ibizmalotest912c"
    },
    "location": {
      "value": "West Europe"
    },
    "customkeyparam": {
      "value": "MyCustomValueSetViaARM"
    }
  }
}

After doing a Resource Group Deployment, we can go in and see the new setting in the Application Settings of the Azure Function App and start using it.

Summary:

I like this approach since we can make sure that parameters are set and prepared before deployment without needing to change values in the Function App. I also love the ARM functions that we can use during deployments to automate processes, or just to get values from Azure Key Vault to make it easier to manage secrets (note that secrets are in plain text in the application settings for the users who have access).

The ARM functions can really speed up and automate the deployment processes even further. Here we can, as shown above, get keys to a storage account or get the URL of a Logic App at deployment time, which is otherwise a complex and time-consuming task; when using ARM functions there is also a guarantee that the Logic App is deployed and ready to be used.

Creating a good ARM template will make it easier to move between environments and it will add quality to your deployments.

But remember to make sure that changes made directly to the Function App Application Settings are reflected in the parameter files and ARM templates as well, to prevent changes from being overwritten.

Sample getting the URL of a Logic App (assumes added parameters for the resource group, Logic App name, and trigger name). (template file)

    {
      "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
      "value": "[reference(resourceId('microsoft.insights/components/', parameters('name')), '2015-05-01').InstrumentationKey]"
    },
    {
      "name": "LogicAppUrl",
      "value": "[listCallbackUrl(resourceId(parameters('logicApp_resourcegroup'),'Microsoft.Logic/workflows/triggers', parameters('logicApp_name'),parameters('logicApp_trigger')), providers('Microsoft.Logic', 'workflows').apiVersions[0]).value]"
    }
  ]
},

Sample Azure Key Vault: getting a specific value from Key Vault. (parameter file)

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": {
      "value": "ibizmalotest"
    },
    "storageName": {
      "value": "ibizmalotest912c"
    },
    "location": {
      "value": "West Europe"
    }
    "customkeyparam" :{
      "reference": {
        "keyVault": {
          "id": "/subscriptions/fake-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/testenvironment/providers/Microsoft.KeyVault/vaults/ibizmalotest"
        },
        "secretName": "customkeysecretname"
      }
    }
  }
}

Posted in: | Azure Functions  | Tagged: | Azure Functions  | Integration 


API Management ARM Template Creator

ARM support was released in the July release of API Management (read more here), and this is really a great improvement to the deployment experience of API Management. For samples of ARM templates released by the product group, go to the sample GitHub page

But there are some “blocking” things here: we are not using ARM templates when developing, so how do we get the ARM templates? First, you can download the entire API Management instance’s ARM specification directly in the portal, via Automation Script

But when starting to look into this I got a little disappointed. There are too many hardcoded values, like backend URL parameters, and there are, for me, unwanted things as well, such as Users, Groups, Subscriptions, etc. All that could be sorted out and scripted (a lot of work, but it could be done), but the real deal breaker with using the Automation Script was the missing request/response representations and the missing query parameters, which lead to crashing imports.

It’s important to understand the environment references and boundaries; we don’t want cross-environment communication that is not intended:

When doing a deployment we want as little manual work as possible, and above we have two issues: manual configuration equals downtime, and sample messages are one of the best things when onboarding new API consumers, so we can’t ignore this.

Let’s look at an example; this is how an operation looks (note the sample):

And when this API is exported via the Automation Script, the sample is not part of the returned object:

 {
    "comments": "Generalized from resource: '/subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/apis/58a3138ca9f54041296baffd/operations/58a313e40647c00eecf8d408'.",
    "type": "Microsoft.ApiManagement/service/apis/operations",
    "name": "[concat(parameters('service_ibizmalo_name'), '/', parameters('apis_58a3138ca9f54041296baffd_name'), '/', parameters('operations_58a313e40647c00eecf8d408_name'))]",
    "apiVersion": "2017-03-01",
    "scale": null,
    "properties": {
        "displayName": "Conversion Rate",
        "method": "POST",
        "urlTemplate": "/conversionrate",
        "description": "<br><b>Get conversion rate from one currency to another currency <b><br><p><b><font color='#000080' size='1' face='Verdana'><u>Differenct currency Code and Names around the world</u></font></b></p><blockquote><p><font face='Verdana' size='1'>AFA-Afghanistan Afghani<br>ALL-Albanian Lek<br>DZD-Algerian Dinar<br>ARS-Argentine Peso<br>AWG-Aruba Florin<br>AUD-Australian Dollar<br>BSD-Bahamian Dollar<br>BHD-Bahraini Dinar<br>BDT-Bangladesh Taka<br>BBD-Barbados Dollar<br>BZD-Belize Dollar<br>BMD-Bermuda Dollar<br>BTN-Bhutan Ngultrum<br>BOB-Bolivian Boliviano<br>BWP-Botswana Pula<br>BRL-Brazilian Real<br>GBP-British Pound<br>BND-Brunei Dollar<br>BIF-Burundi Franc<br>XOF-CFA Franc (BCEAO)<br>XAF-CFA Franc (BEAC)<br>KHR-Cambodia Riel<br>CAD-Canadian Dollar<br>CVE-Cape Verde Escudo<br>KYD-Cayman Islands Dollar<br>CLP-Chilean Peso<br>CNY-Chinese Yuan<br>COP-Colombian Peso<br>KMF-Comoros Franc<br>CRC-Costa Rica Colon<br>HRK-Croatian Kuna<br>CUP-Cuban Peso<br>CYP-Cyprus Pound<br>CZK-Czech Koruna<br>DKK-Danish Krone<br>DJF-Dijibouti Franc<br>DOP-Dominican Peso<br>XCD-East Caribbean Dollar<br>EGP-Egyptian Pound<br>SVC-El Salvador Colon<br>EEK-Estonian Kroon<br>ETB-Ethiopian Birr<br>EUR-Euro<br>FKP-Falkland Islands Pound<br>GMD-Gambian Dalasi<br>GHC-Ghanian Cedi<br>GIP-Gibraltar Pound<br>XAU-Gold Ounces<br>GTQ-Guatemala Quetzal<br>GNF-Guinea Franc<br>GYD-Guyana Dollar<br>HTG-Haiti Gourde<br>HNL-Honduras Lempira<br>HKD-Hong Kong Dollar<br>HUF-Hungarian Forint<br>ISK-Iceland Krona<br>INR-Indian Rupee<br>IDR-Indonesian Rupiah<br>IQD-Iraqi Dinar<br>ILS-Israeli Shekel<br>JMD-Jamaican Dollar<br>JPY-Japanese Yen<br>JOD-Jordanian Dinar<br>KZT-Kazakhstan Tenge<br>KES-Kenyan Shilling<br>KRW-Korean Won<br>KWD-Kuwaiti Dinar<br>LAK-Lao Kip<br>LVL-Latvian Lat<br>LBP-Lebanese Pound<br>LSL-Lesotho Loti<br>LRD-Liberian Dollar<br>LYD-Libyan Dinar<br>LTL-Lithuanian Lita<br>MOP-Macau Pataca<br>MKD-Macedonian Denar<br>MGF-Malagasy Franc<br>MWK-Malawi Kwacha<br>MYR-Malaysian 
Ringgit<br>MVR-Maldives Rufiyaa<br>MTL-Maltese Lira<br>MRO-Mauritania Ougulya<br>MUR-Mauritius Rupee<br>MXN-Mexican Peso<br>MDL-Moldovan Leu<br>MNT-Mongolian Tugrik<br>MAD-Moroccan Dirham<br>MZM-Mozambique Metical<br>MMK-Myanmar Kyat<br>NAD-Namibian Dollar<br>NPR-Nepalese Rupee<br>ANG-Neth Antilles Guilder<br>NZD-New Zealand Dollar<br>NIO-Nicaragua Cordoba<br>NGN-Nigerian Naira<br>KPW-North Korean Won<br>NOK-Norwegian Krone<br>OMR-Omani Rial<br>XPF-Pacific Franc<br>PKR-Pakistani Rupee<br>XPD-Palladium Ounces<br>PAB-Panama Balboa<br>PGK-Papua New Guinea Kina<br>PYG-Paraguayan Guarani<br>PEN-Peruvian Nuevo Sol<br>PHP-Philippine Peso<br>XPT-Platinum Ounces<br>PLN-Polish Zloty<br>QAR-Qatar Rial<br>ROL-Romanian Leu<br>RUB-Russian Rouble<br>WST-Samoa Tala<br>STD-Sao Tome Dobra<br>SAR-Saudi Arabian Riyal<br>SCR-Seychelles Rupee<br>SLL-Sierra Leone Leone<br>XAG-Silver Ounces<br>SGD-Singapore Dollar<br>SKK-Slovak Koruna<br>SIT-Slovenian Tolar<br>SBD-Solomon Islands Dollar<br>SOS-Somali Shilling<br>ZAR-South African Rand<br>LKR-Sri Lanka Rupee<br>SHP-St Helena Pound<br>SDD-Sudanese Dinar<br>SRG-Surinam Guilder<br>SZL-Swaziland Lilageni<br>SEK-Swedish Krona<br>TRY-Turkey Lira<br>CHF-Swiss Franc<br>SYP-Syrian Pound<br>TWD-Taiwan Dollar<br>TZS-Tanzanian Shilling<br>THB-Thai Baht<br>TOP-Tonga Pa'anga<br>TTD-Trinidad&amp;amp;Tobago Dollar<br>TND-Tunisian Dinar<br>TRL-Turkish Lira<br>USD-U.S. Dollar<br>AED-UAE Dirham<br>UGX-Ugandan Shilling<br>UAH-Ukraine Hryvnia<br>UYU-Uruguayan New Peso<br>VUV-Vanuatu Vatu<br>VEB-Venezuelan Bolivar<br>VND-Vietnam Dong<br>YER-Yemen Riyal<br>YUM-Yugoslav Dinar<br>ZMK-Zambian Kwacha<br>ZWD-Zimbabwe Dollar</font></p></blockquote>",
        "policies": null
    },
    "dependsOn": [
        "[resourceId('Microsoft.ApiManagement/service', parameters('service_ibizmalo_name'))]",
        "[resourceId('Microsoft.ApiManagement/service/apis', parameters('service_ibizmalo_name'), parameters('apis_58a3138ca9f54041296baffd_name'))]"
    ]
},

Compare this with the representation received when extracting the API via the REST interface:

{
  {
    "id": "/subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/apis/58a3138ca9f54041296baffd/operations/58a313e40647c00eecf8d408",
    "type": "Microsoft.ApiManagement/service/apis/operations",
    "name": "58a313e40647c00eecf8d408",
    "properties": {
      "displayName": "Conversion Rate",
      "method": "POST",
      "urlTemplate": "/conversionrate",
      "templateParameters": [],
      "description": "<br><b>Get conversion rate from one currency to another currency <b><br><p><b><font color='#000080' size='1' face='Verdana'><u>Differenct currency Code and Names around the world</u></font></b></p><blockquote><p><font face='Verdana' size='1'>AFA-Afghanistan Afghani<br>ALL-Albanian Lek<br>DZD-Algerian Dinar<br>ARS-Argentine Peso<br>AWG-Aruba Florin<br>AUD-Australian Dollar<br>BSD-Bahamian Dollar<br>BHD-Bahraini Dinar<br>BDT-Bangladesh Taka<br>BBD-Barbados Dollar<br>BZD-Belize Dollar<br>BMD-Bermuda Dollar<br>BTN-Bhutan Ngultrum<br>BOB-Bolivian Boliviano<br>BWP-Botswana Pula<br>BRL-Brazilian Real<br>GBP-British Pound<br>BND-Brunei Dollar<br>BIF-Burundi Franc<br>XOF-CFA Franc (BCEAO)<br>XAF-CFA Franc (BEAC)<br>KHR-Cambodia Riel<br>CAD-Canadian Dollar<br>CVE-Cape Verde Escudo<br>KYD-Cayman Islands Dollar<br>CLP-Chilean Peso<br>CNY-Chinese Yuan<br>COP-Colombian Peso<br>KMF-Comoros Franc<br>CRC-Costa Rica Colon<br>HRK-Croatian Kuna<br>CUP-Cuban Peso<br>CYP-Cyprus Pound<br>CZK-Czech Koruna<br>DKK-Danish Krone<br>DJF-Dijibouti Franc<br>DOP-Dominican Peso<br>XCD-East Caribbean Dollar<br>EGP-Egyptian Pound<br>SVC-El Salvador Colon<br>EEK-Estonian Kroon<br>ETB-Ethiopian Birr<br>EUR-Euro<br>FKP-Falkland Islands Pound<br>GMD-Gambian Dalasi<br>GHC-Ghanian Cedi<br>GIP-Gibraltar Pound<br>XAU-Gold Ounces<br>GTQ-Guatemala Quetzal<br>GNF-Guinea Franc<br>GYD-Guyana Dollar<br>HTG-Haiti Gourde<br>HNL-Honduras Lempira<br>HKD-Hong Kong Dollar<br>HUF-Hungarian Forint<br>ISK-Iceland Krona<br>INR-Indian Rupee<br>IDR-Indonesian Rupiah<br>IQD-Iraqi Dinar<br>ILS-Israeli Shekel<br>JMD-Jamaican Dollar<br>JPY-Japanese Yen<br>JOD-Jordanian Dinar<br>KZT-Kazakhstan Tenge<br>KES-Kenyan Shilling<br>KRW-Korean Won<br>KWD-Kuwaiti Dinar<br>LAK-Lao Kip<br>LVL-Latvian Lat<br>LBP-Lebanese Pound<br>LSL-Lesotho Loti<br>LRD-Liberian Dollar<br>LYD-Libyan Dinar<br>LTL-Lithuanian Lita<br>MOP-Macau Pataca<br>MKD-Macedonian Denar<br>MGF-Malagasy Franc<br>MWK-Malawi Kwacha<br>MYR-Malaysian 
Ringgit<br>MVR-Maldives Rufiyaa<br>MTL-Maltese Lira<br>MRO-Mauritania Ougulya<br>MUR-Mauritius Rupee<br>MXN-Mexican Peso<br>MDL-Moldovan Leu<br>MNT-Mongolian Tugrik<br>MAD-Moroccan Dirham<br>MZM-Mozambique Metical<br>MMK-Myanmar Kyat<br>NAD-Namibian Dollar<br>NPR-Nepalese Rupee<br>ANG-Neth Antilles Guilder<br>NZD-New Zealand Dollar<br>NIO-Nicaragua Cordoba<br>NGN-Nigerian Naira<br>KPW-North Korean Won<br>NOK-Norwegian Krone<br>OMR-Omani Rial<br>XPF-Pacific Franc<br>PKR-Pakistani Rupee<br>XPD-Palladium Ounces<br>PAB-Panama Balboa<br>PGK-Papua New Guinea Kina<br>PYG-Paraguayan Guarani<br>PEN-Peruvian Nuevo Sol<br>PHP-Philippine Peso<br>XPT-Platinum Ounces<br>PLN-Polish Zloty<br>QAR-Qatar Rial<br>ROL-Romanian Leu<br>RUB-Russian Rouble<br>WST-Samoa Tala<br>STD-Sao Tome Dobra<br>SAR-Saudi Arabian Riyal<br>SCR-Seychelles Rupee<br>SLL-Sierra Leone Leone<br>XAG-Silver Ounces<br>SGD-Singapore Dollar<br>SKK-Slovak Koruna<br>SIT-Slovenian Tolar<br>SBD-Solomon Islands Dollar<br>SOS-Somali Shilling<br>ZAR-South African Rand<br>LKR-Sri Lanka Rupee<br>SHP-St Helena Pound<br>SDD-Sudanese Dinar<br>SRG-Surinam Guilder<br>SZL-Swaziland Lilageni<br>SEK-Swedish Krona<br>TRY-Turkey Lira<br>CHF-Swiss Franc<br>SYP-Syrian Pound<br>TWD-Taiwan Dollar<br>TZS-Tanzanian Shilling<br>THB-Thai Baht<br>TOP-Tonga Pa'anga<br>TTD-Trinidad&amp;amp;Tobago Dollar<br>TND-Tunisian Dinar<br>TRL-Turkish Lira<br>USD-U.S. Dollar<br>AED-UAE Dirham<br>UGX-Ugandan Shilling<br>UAH-Ukraine Hryvnia<br>UYU-Uruguayan New Peso<br>VUV-Vanuatu Vatu<br>VEB-Venezuelan Bolivar<br>VND-Vietnam Dong<br>YER-Yemen Riyal<br>YUM-Yugoslav Dinar<br>ZMK-Zambian Kwacha<br>ZWD-Zimbabwe Dollar</font></p></blockquote>",
      "request": {
        "description": "ConversionRequest",
        "queryParameters": [],
        "headers": [],
        "representations": [
          {
            "contentType": "application/json",
            "sample": "{     \"fromCurrency\": \"SEK\",     \"toCurrency\": \"USD\" }",
            "schemaId": "322649e9-bfd5-4c93-8ad1-3fc1780e8045",
            "typeName": "ConversionRate"
          }
        ]
      },
      "responses": [
        {
          "statusCode": 200,
          "description": "ConversionResult",
          "representations": [
            {
              "contentType": "application/json",
              "sample": "{\r\n    \"rate\": 1.0\r\n}",
              "schemaId": "322649e9-bfd5-4c93-8ad1-3fc1780e8045",
              "typeName": "ConversionRateResponse"
            }
          ],
          "headers": []
        }
      ],
      "policies": null
    }
  }
}

We can clearly see that the REST object contains far more information. Since we work with DEV/TEST/PROD environment setups, it would be disappointing to have samples in DEV but not in TEST and PRODUCTION (where they make the most sense).

So this was the start of my new project: the API Management ARM Template Creator.

As some of you know I've been contributing a lot to the Logic App Template Creator: https://github.com/jeffhollan/LogicAppTemplateCreator so I thought I'd do something similar for API Management.

While starting this I added a few more requirements:

  1. Possible to extract only one specific API, but get all required linked attributes such as Named Values.
  2. Parameterized URLs (both API and backend).
  3. Parameterized Named Value values.
  4. Automated URL/Named Value handling for Logic Apps.
  5. Parameterized values on everything changeable that is needed for groups, products, authorization servers, etc.

The first release was made today and the project can be found on my GitHub site: https://github.com/MLogdberg/APIManagementARMTemplateCreator

How does it work?

Well, it's no magic: I've just combined the results from the REST APIs into ARM templates, and along the way added some ARM functionality to improve the development experience.

Let me show a sample with one of my APIs: I have a "simple" REST API that has two GET operations, with a policy for the API and one for each operation; each operation needs an authentication key that is stored in Named Values.

The key components that need to be changed are the URL and the authentication key used in the request to the backend.

The URL

The authentication key in the policy

The authentication key in Named Values

Even for a rather simple API there are some places that need to be changed, and in order to prevent downtime on your API these need to be changed before/during deployment. So using the PowerShell module I can extract an ARM template that represents this API.

Let’s look at some samples:

Manually created REST API

Parameterization of the URL: the API looks like this, and as you can see the serviceUrl is parameterized with [parameters('order_serviceUrl')]

{
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/apis/order",
      "type": "Microsoft.ApiManagement/service/apis",
      "name": "[concat(parameters('service_ibizmalo_name'), '/' ,parameters('api_order_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "Order",
        "apiRevision": "[parameters('order_apiRevision')]",
        "description": "Azure Logic App.",
        "serviceUrl": "[parameters('order_serviceUrl')]",
        "path": "api/v1/order",
        "protocols": [
          "https"
        ],

Parameterization of the authentication key: the Named Value (called Properties in ARM) also has its value parameterized with [parameters('trafficauthkey_value')]

{
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/trafficauthkey",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_ibizmalo_name'), '/' ,parameters('property_trafficauthkey_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "trafficauthkey",
        "value": "[parameters('trafficauthkey_value')]",
        "tags": null,
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Thanks to these parameters it's easy to change the values between environments and deliver robust, quality deployments.
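As a sketch of how this is used, a per-environment parameter file might look something like the following. The parameter names match the generated template above, but the values here are made-up placeholders:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "order_serviceUrl": {
      "value": "https://test.example.com/api/v1/order"
    },
    "trafficauthkey_value": {
      "value": "<authentication key for this environment>"
    }
  }
}
```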

Logic App

When generating the API from a Logic App, the things that need to change between environments are the resource group, the name of the Logic App and the trigger name. The rest is automated according to the standard generation in API Management for Logic Apps.

The standard policy generation for Logic App instances works like this: first there is a rewrite-uri that changes the URL to match the Logic App and adds the sv and sig values (authentication), stored in a Named Value (Property), into the URL. The second step sets the base URL of the call to match the Logic App URL.

Lastly, a comment section is added with the path to the original Azure resource (our Logic App) used when generating the policy. This is what we use to automate the URL and authentication values.

The policy is updated so that the policyContent uses the listCallbackUrl ARM function to retrieve the Logic App's base URL.

 {
  "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/apis/order/operations/59a6b4730d691205f068a8be/policies/policy",
  "type": "Microsoft.ApiManagement/service/apis/operations/policies",
  "name": "[concat(parameters('service_ibizmalo_name'), '/' ,parameters('api_order_name'), '/' ,parameters('operations_59a6b4730d691205f068a8be_name'), '/' ,parameters('policy_policy_name'))]",
  "apiVersion": "2017-03-01",
  "properties": {
    "policyContent": "[Concat('<policies>\r\n  <inbound>\r\n    <rewrite-uri id=\"apim-generated-policy\" template=\"?api-version=2016-06-01&amp;sp=/triggers/request/run&amp;\" />\r\n    <set-backend-service id=\"apim-generated-policy\" base-url=\"',listCallbackUrl(resourceId(parameters('logicApp_INT001-GetOrderInfo_resourcegroup'),'Microsoft.Logic/workflows/triggers', parameters('logicApp_INT001-GetOrderInfo_name'),parameters('logicApp_INT001-GetOrderInfo_trigger')), providers('Microsoft.Logic', 'workflows').apiVersions[0]).basePath,'\" />\r\n    <base />\r\n    <set-header name=\"Ocp-Apim-Subscription-Key\" exists-action=\"delete\" />\r\n  </inbound>\r\n  <outbound>\r\n    <base />\r\n  </outbound>\r\n  <backend>\r\n    <base />\r\n    <!-- { \"azureResource\": { \"type\": \"logicapp\", \"id\": \"/subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.Logic/workflows/INT001-GetOrderInfo/triggers/request\" } } -->\r\n  </backend>\r\n</policies>')]"
  },
  "resources": [],
  "dependsOn": [
    "[resourceId('Microsoft.ApiManagement/service/apis', parameters('service_ibizmalo_name') ,parameters('api_order_name'))]",
    "[resourceId('Microsoft.ApiManagement/service/apis/operations', parameters('service_ibizmalo_name'), parameters('api_order_name'), parameters('operations_59a6b4730d691205f068a8be_name'))]"
  ]
}

The Named Value (Property) will be automated the same way, using the listCallbackUrl ARM function.

{
  "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/59a6b478285385ab3dfbb752",
  "type": "Microsoft.ApiManagement/service/properties",
  "name": "[concat(parameters('service_ibizmalo_name'), '/' ,parameters('property_59a6b478285385ab3dfbb752_name'))]",
  "apiVersion": "2017-03-01",
  "properties": {
    "displayName": "orderrequest59a6b4783fb21a7984df42ae",
    "value": "[concat('sv=',listCallbackUrl(resourceId(parameters('logicApp_INT001-GetOrderInfo_resourcegroup'),'Microsoft.Logic/workflows/triggers', parameters('logicApp_INT001-GetOrderInfo_name'),parameters('logicApp_INT001-GetOrderInfo_trigger')), providers('Microsoft.Logic', 'workflows').apiVersions[0]).queries.sv,'&sig=',listCallbackUrl(resourceId(parameters('logicApp_INT001-GetOrderInfo_resourcegroup'),'Microsoft.Logic/workflows/triggers', parameters('logicApp_INT001-GetOrderInfo_name'),parameters('logicApp_INT001-GetOrderInfo_trigger')), providers('Microsoft.Logic', 'workflows').apiVersions[0]).queries.sig)]",
    "tags": [],
    "secret": true
  },
  "resources": [],
  "dependsOn": []
}

By using the listCallbackUrl function we can retrieve the values from the specific Logic App at deployment time, removing all manual handling of URLs and authentication values when working with APIs that expose Logic Apps.
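To make the two expressions above easier to read, the object returned by listCallbackUrl for a Logic App request trigger looks roughly like this (illustrative values only; the fields shown are the ones the template above actually uses):

```json
{
  "value": "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/request/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Frequest%2Frun&sv=1.0&sig=<signature>",
  "method": "POST",
  "basePath": "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/request/paths/invoke",
  "queries": {
    "api-version": "2016-06-01",
    "sp": "/triggers/request/run",
    "sv": "1.0",
    "sig": "<signature>"
  }
}
```

The policy reads basePath, and the Named Value concatenates queries.sv and queries.sig into the sv/sig query string.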

SOAP

When SOAP was introduced to API Management a new resource type was added: Backend. This is unfortunately not editable in the GUI yet, but we can change it with ARM deployments and the REST and PowerShell APIs. So in order to create a full ARM template for SOAP we need that extra resource, in addition to all the other information already described.

In my sample API Management instance my SOAP Currency Converter API is generated with a backend that has a parameter [parameters('f9211da9-8ce1-48c7-8e13-e38bc5dbcbcd_url')] used to change the URL.

{
  "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/backends/f9211da9-8ce1-48c7-8e13-e38bc5dbcbcd",
  "type": "Microsoft.ApiManagement/service/backends",
  "name": "[concat(parameters('service_ibizmalo_name'), '/' ,parameters('backend_f9211da9-8ce1-48c7-8e13-e38bc5dbcbcd_name'))]",
  "apiVersion": "2017-03-01",
  "properties": {
    "title": "CurrencyConverter Soap Backend",
    "description": "CurrencyConverter Soap Backend",
    "url": "[parameters('f9211da9-8ce1-48c7-8e13-e38bc5dbcbcd_url')]",
    "protocol": "soap",
    "properties": {},
    "tls": {}
  },
  "resources": []
},

Get Started

So how do you get started? First off, download the repo https://github.com/MLogdberg/APIManagementARMTemplateCreator, compile it, and use this PowerShell script (replace the values with appropriate ones from your subscription and API Management instance).

#If you have problems with execution policy, run this in a PowerShell window started as administrator:
#Set-ExecutionPolicy -ExecutionPolicy Unrestricted

Import-Module "C:\temp\APIManagementARMTemplateCreator\APIManagementTemplate\bin\Debug\APIManagementTemplate.dll"


#Set the name of the API Management instance
$apimanagementname = 'ibizmalo'

#Set the resource group 
$resourcegroupname = 'PreDemoTest' # 'app-tst-cs-int-Migrated'
#Set the subscription id 
$subscriptionid = 'c107df29-a4af-4bc9-a733-f88f0eaa4296'#'d4baa1e9-15f5-4c85-bb3e-1e108dc79b00'#
#Set the tenant to use when logging in, make sure it's the right tenant
$tenant = 'mattiaslogdbergibizsolution.onmicrosoft.com'

#optional set filter for a specific api (using standard REST filter, with path we can select api based on the API path)
#$filter = "path eq 'api/v1/order'"
#$filter = "path eq 'api/v1/currencyconverter'"

 
#Set the output filename
$filename = $apimanagementname + '.json'

Get-APIManagementTemplate -APIFilters $filter -APIManagement $apimanagementname -ResourceGroup $resourcegroupname -SubscriptionId $subscriptionid -TenantName $tenant -ExportPIManagementInstance $false | Out-File $filename

Upcoming

More functions and better support for authentication are on the todo list, as well as smarter/simpler product handling. Here are my bullet points:

  • Automated support for all Azure resources.
  • Operation schema support.
  • Retrieve authorization servers linked to an API, and parameterize them as needed.
  • Better support for authorization servers.
  • Support for products and linking APIs to products.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM  | Build & Release 


API Management concurrency control

As I mentioned in the last post on concurrency control for Logic Apps, in today's distributed world it's important to make sure we don't flood our backend services, since there are more restrictions in the shared and distributed landscape than in the good old days on-premises, where we had full control over all our services. That means we need to have control over our integration solutions so they don't spin away like crazy and cause unwanted flooding of our backend systems.

Historically we could control the requests towards our backend systems with rate limits and quotas, to make sure a consumer made at most a certain number of requests during a time period: minutes, hours, days or months.

But in the latest release of API Management the team added, amongst other things, concurrency control. This means that we can now also control/restrict the number of concurrent calls to our backend service.

Say we have a SaaS service that allows 20 concurrent calls, and all calls above that limit are declined. This used to be hard to control from API Management, and all the responsibility was left to the consumers. We had a case where a consumer pushed 1000 calls in a few milliseconds (the distributed world is lovely!), but the backend service couldn't handle it: most of the calls failed, a new batch was fired, and the story repeated itself. It ended with the consumer changing their solution (yes, I agree that was the best way), but we had no way of protecting the backend from the flood except rate limits, and that was not an optimal solution.

Now with concurrency control we can guarantee that the backend system never gets more than 20 concurrent calls; all calls beyond that are queued, up to a maximum limit, and processed as soon as one of the 20 concurrent calls finishes, making sure we don't waste resources on retries and idle time. Using concurrency control is simple; here is the policy (read more here):

<limit-concurrency key="expression" max-count="number" timeout="in seconds" max-queue-length="number">
        <!-- nested policy statements -->
</limit-concurrency>

So let's look at a sample. I've created a Logic App to act as a backend service; it doesn't have much logic, just a delay to simulate really heavy work (20 seconds).

Part 1: A simple flow, Logic App to Logic App with no restrictions:

And now let's see how this looks when consumed by another Logic App in a foreach loop (without any concurrency restrictions).

As expected, the execution takes a bit longer than 20 seconds:

Part 2: Adding concurrency control by exposing the Backend Logic App via API Management

The Logic App is imported to API Management and the policy Concurrency Control is added:

And the policy looks like:

<policies>
	<inbound>
		<base />
		<rewrite-uri id="apim-generated-policy" template="?api-version=2016-06-01&amp;sp=/triggers/request/run&amp;" />
		<set-backend-service id="apim-generated-policy" base-url="https://prod-40.westeurope.logic.azure.com/workflows/24b778e6805344e686536a203ee47bce/triggers/request/paths/invoke" />
		<set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" />
	</inbound>
	<backend>
		<limit-concurrency key="constantstring" max-count="2" timeout="120" max-queue-length="100">
			<forward-request timeout="120" />
		</limit-concurrency>
		<!--{ "azureResource": { "type": "logicapp", "id": "/subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/Concurrency/providers/Microsoft.Logic/workflows/INT0002-SingelInstance/triggers/request" } }-->
	</backend>
	<outbound>
		<base />
	</outbound>
	<on-error>
		<base />
	</on-error>
</policies>

To demonstrate the functionality a new Logic App is now used, with the same logic as before but going through API Management to access the backend Logic App.

Executing the Logic App will now take about 4 minutes, since the concurrency is set to 2!
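A quick back-of-the-envelope check (a sketch, assuming exactly 20 items, 20 seconds of backend work per call and no other overhead) shows why the total lands around 4 minutes:

```python
import math

items = 20        # records in the list being looped over
run_seconds = 20  # simulated "heavy work" per backend call
concurrency = 2   # limit-concurrency max-count in the policy

# With at most 2 concurrent calls, the items are processed in waves.
waves = math.ceil(items / concurrency)
total_seconds = waves * run_seconds
print(total_seconds)  # 200 seconds, just over 3 minutes; queueing and
                      # retry overhead pushes it towards 4 minutes
```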

So that's great, we have now prevented too many concurrent calls to our backend system. But is it all good? Let's look into the log of our "backend system".

We can see that there are 2 failed runs and 22 runs in total, but as you remember the list had 20 records, so we only wanted 20 requests to our "backend system".

The failed ones timed out, which made the calling Logic App execute a retry, ending with a total of 22 calls to the backend.

Summary: This is truly a nice feature that we will use a lot, but we also need to understand the nature of the API and how to configure it correctly to prevent unwanted behavior. In a flow where resending the information is not allowed, the queue length should be set as low as possible to prevent clients' retry behavior from resending the information even when it's not wanted. Or we might want to use other techniques for that kind of flow.

All in all a great and very useful feature, but as always, make sure you understand the feature and the requirements of the protected API when configuring it.

Posted in: | API Management  | Tagged: | API Management  | Integration 


Logic Apps concurrency control

In a distributed world with an increasing number of moving parts we need to make sure not to flood our services, since there are more restrictions in the shared and distributed landscape than there were in the good old days when we had full control over all our services. That means we need to have control over our solutions so they don't spin away like crazy.

On this topic Logic Apps has had problems earlier: when looping over an array in a foreach loop and calling a connector, we could choose between 20 concurrent actions or calling it sequentially, one by one. This meant that if 20 parallel actions were flooding the destination, we had to call the destination sequentially, which also took a lot longer. When working with batches this is not always possible since it might take too long, so sometimes we had to create two or more parallel flows in Logic Apps to make sure we could keep within the time limit.

This made the flow complex and "ugly"-looking:

Newly released concurrency control!

But fortunately there is now a newly released property on the foreach action that helps us. It's only available in code view for now, but that works just fine.

The new property looks like:

"runtimeConfiguration": {
    "concurrency": {
        "repetitions": 2
    }
},

Setting repetitions to 2 is more or less equivalent to the solution above in execution, but the Logic App is now much easier to maintain and work with.

The foreach now looks like this (the body of the function call is removed for clarity):

"For_each": {
    "actions": {
        "INT0020-Update-Fuse-Users": {
            "inputs": {
                "body": {
                   ...
                },
                "function": {
                    "id": "/subscriptions/fakeee-15f5-4c85-bb3e-1e108dc79b00/resourceGroups/rgroup/providers/Microsoft.Web/sites/appname/functions/INT0020-Update-Fuse-Users-Compare"
                }
            },
            "runAfter": {},
            "type": "Function"
        }
    },
    "foreach": "@body('INT0020-Split-Array')",
    "runAfter": {
        "GET_FUSE_TOKEN": [
            "Succeeded"
        ]
    },
    "runtimeConfiguration": {
        "concurrency": {
            "repetitions": 2
        }
    },
    "type": "Foreach"
},

Logic App in the designer will now look like:

Triggers also have concurrency control

We can also set this on triggers; the sample below, for a recurrence trigger (or a polling trigger without splitOn), looks like:

"trigger": {
    "type": "http",
    "recurrence": {},
    "runtimeConfiguration": {
        "concurrency": {
            "runs": 10
        }
    }
}

For a more detailed explanation of trigger handling I suggest you read this great post from Toon Vanhoutte.

I'm really happy that the product group has finally released this, since it's such an important part of our new world of distributed landscapes and our "new" responsibility for making sure our solutions don't flood the destinations. We have the power, and now we have the ability to restrain it to the appropriate amount.

Posted in: | LogicApps  | Tagged: | Logic Apps  | Integration 


OMS and Non Events

When it comes to monitoring integration flows, there are several types of monitoring we need to cover. Just the other day I got the requirement to make sure that a Logic App has run at least once during a 24-hour period.

To solve this I started to look at possible solutions, turning first to the Alert section of Logic Apps diagnostics, since it can create an alarm based on inputs like the number of runs, number of failed runs, etc. Unfortunately for me, the maximum time window there was 6 hours. In this solution we are using Log Analytics and OMS as the monitoring tool, and with the new Logic Apps gallery it's super nice (guess I need another post on this later on). Anyway, there is alarm functionality inside the OMS portal, so I started to look into that.

Alarms in OMS are easily created based on a search, so first we need to create one. This is done in the search area, and the easiest way is to click your way to a starting point for the search and then change the last parts manually. I needed to check that a workflow was executed and ended successfully, so my query ended up like this (easy to reuse, just change resource_workflowName_s to the name of your Logic App):

search * | where ( Type == "AzureDiagnostics" ) | where OperationName == "Microsoft.Logic/workflows/workflowRunCompleted" | where status_s == "Succeeded"   | where ( resource_workflowName_s == "INT002_Update_Work_Order2" )  

This query returns the successful runs, and we will be able to use it in our alarm.

Setting up the alarm is rather simple: just Add Alert Rule and fill in some information about the alarm. The important parts are Search Query, where you choose the saved query above; Time Window, which is how far back we look, so in our case 24 hours; and Alert Frequency, which is how often the rule is checked, in our case every hour. But in order not to get a new alert every hour (it seems unnecessary to get 7 alarm emails during the night), we can also specify Suppress alerts, which makes sure an alarm is not sent out more often than every 6 hours.

So in reality we can now be sure that, at most 25 hours after the last run, we will be alerted that there have been no more runs, and we will be alerted again every 6 hours until a run completes successfully.
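The 25-hour figure is simply the alert rule's time window plus its evaluation frequency; a minimal sketch of that worst case:

```python
time_window_hours = 24     # how far back each evaluation of the rule looks
alert_frequency_hours = 1  # how often the rule is evaluated

# Worst case: the last run happened just after an evaluation, so it stays
# inside the 24-hour window until the evaluation one hour after it expires.
worst_case_alert_hours = time_window_hours + alert_frequency_hours
print(worst_case_alert_hours)  # 25
```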

It's possible to invoke webhooks, runbooks or ITSM actions, but in our case an email is good enough, so we will use that. The email sent out doesn't contain that much information, but it's enough for us: we know that the flow has not been working, and most likely the sending system has failed to send its file.

The email:

Monitoring for events that should happen but don't is always tricky and has historically been hard to solve, so often colleagues or other people in your organisation are given tasks to do daily checks of some sort. It can be small things, like checking that a folder is empty or that a log has a row with today's date in it. Even if the task itself is easy, it burdens the people doing it, and as more of these "small" tasks are added they consume more and more time; time these people should have spent on more important work. By monitoring predicted events and raising an alarm, we can provide a service of great value to these people: we notify them if anything has stopped, and they can focus on their work. When they can rely on the things we build, we are not only building robust integrations, we are also building trust and a service of great value.

It's often the small things that matter the most.

Posted in: | OMS  | Tagged: | Logic Apps  | OMS  | Monitoring  | Integration 


New blog

Now I thought it was time to create my own blog.

As you might know, I've been blogging a lot on my work blog, http://blog.ibiz-solutions.se/.

Now I feel that I'm ready for my own blog. I'll keep the same line, blogging about integration, and most of it will surely be about integration in Azure.

I'll cover updates on the open source projects I contribute to, i.e. the Logic App Template Creator, and some other fun new projects, so stay tuned and poke me if I'm keeping too low a pace on my blog.

Hope to see you around!

Cheers!

Posted in: | Thinking  | Tagged: | Thinking  | Integration