MSI with Service Bus

Using MSI with Service Bus is a great alternative to the SAS key, which is the standard authentication mechanism.

What is MSI (Managed Service Identity)? Read more in my previous post. In short, it's a way of giving your resources (a Logic App, a Function App etc.) an identity and assigning permissions to it.

When using MSI with Service Bus, our internal services can access Service Bus queues, topics or topic subscriptions with their own identities.
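To make this concrete, here is a minimal sketch (not from the original post) of such an internal .NET service receiving messages from a queue with its managed identity, assuming the Azure.Messaging.ServiceBus SDK; the namespace and queue name are placeholders:

using System;
using Azure.Identity;
using Azure.Messaging.ServiceBus;

// Placeholders for your namespace and queue.
var fullyQualifiedNamespace = "<your-namespace>.servicebus.windows.net";
var queueName = "Messages";

// DefaultAzureCredential uses the managed identity when running in Azure
// and falls back to your own developer account when running locally.
await using var client = new ServiceBusClient(fullyQualifiedNamespace, new DefaultAzureCredential());

ServiceBusReceiver receiver = client.CreateReceiver(queueName);
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();

if (message != null)
{
    Console.WriteLine(message.Body.ToString());
    // Completing the message requires the Azure Service Bus Data Receiver role on the queue.
    await receiver.CompleteMessageAsync(message);
}

No connection string or SAS key is involved; the identity running the code only needs the appropriate role assignment described below.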

The access roles that are available on Service Bus are the following (Azure Service Bus Roles):

  • Azure Service Bus Data Owner: Grants access to both send and receive
  • Azure Service Bus Data Receiver: Grants access to read messages from queues or topic subscriptions
  • Azure Service Bus Data Sender: Grants access to send messages to queues or topics.

When assigning roles there are some things to think about:

  • What is the current service supposed to be able to access? All queues or only specific queues?
  • If you use Functions or Logic Apps as receivers of messages from either a queue or a topic subscription, you must assign the Azure Service Bus Data Receiver role on the specific queue or topic subscription. It will not work to assign it on the whole namespace.

Let's take an example of reading messages from a queue with a Logic App. We create a consumption Logic App, and the first step is to assign an identity to it. Assign Identity to Logic Apps

Now we need to assign a role for the Logic App's identity on the queue; remember that it has to be on the queue in order to work for receiving messages.

Go to the Service Bus namespace and then to the queue, in my case Messages, and add the role assignment Azure Service Bus Data Receiver for our Logic App.

Assign Permissions to Logic Apps

Pick the role and then select the managed identity that is connected to your Logic App.

Pick identity of Logic Apps

Now we can create a trigger in the Logic Apps Designer as follows:

Create Trigger in Logic Apps

If you have assigned the role on the queue you are now ready to start receiving messages.

Removing access with the SAS key

There is an option to remove access with SAS keys. This is preferable if the Service Bus namespace is only used by resources that can use a managed identity or a service principal when authenticating to the namespace. If SAS key access is removed, the whole namespace is only accessible with a valid Azure AD identity. This is much more secure and moves the access control to Azure AD rather than a Service Bus specific authentication mechanism.

To turn off the access keys, go to the namespace, press the Local Authentication: Enabled link and change it to Disabled. After this only access via Azure AD roles will work.

Disable Local Auth

Troubleshooting

If no messages are received, check the trigger history and verify that there are no errors.

Trigger History

Summary

Using a managed identity or a service principal with Service Bus is a great feature; it's both easier and more secure. No static keys that can wander off and no unknown users. Developers accessing the namespace have to have valid Azure AD accounts with assigned permissions. I know it can be trickier in the beginning, but in the long run it will help out a lot.

Posted in: | Development  | Tagged: | Azure  | Service Bus  | MSI 


MSI what and why?

MSI, or Managed Service Identity as it stands for, has been around for some time, but quite recently a new set of resources has been integrated with it, and we need to start thinking of using MSI a lot more, since it increases security and streamlines the authorization process: one place to grant access, the same account and the same authorization process. Sadly most of Microsoft's documentation around this is still for virtual machines, and I want to highlight it a lot more for iPaaS services like Logic Apps, Functions, API Management, Web Apps, Data Factory, Service Bus and Event Grid.

So MSI is an identity for a resource, just like any other user in Azure AD. The beauty of this is that we can assign access rights to that identity, and the resource then authenticates by itself (with credentials no one else knows) and uses those credentials to access other resources. I usually like to describe it as a virtualization of the service for authorization purposes. In the image below we can see an Azure Function making requests to Cosmos DB. This is allowed since we have assigned the role Cosmos DB Built-in Data Contributor to the identity of the Function in a scope containing the Cosmos resource (it could be the resource itself, a resource group granting access to all Cosmos resources in that group, or a subscription granting access to all Cosmos resources in that subscription).
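To illustrate the scenario, here is a minimal sketch (not from the original post) of what the Function's call to Cosmos could look like with the Microsoft.Azure.Cosmos SDK once the role is assigned; the endpoint, database, container and item id are placeholders:

using System;
using Azure.Identity;
using Microsoft.Azure.Cosmos;

// Placeholder endpoint for your Cosmos DB account.
var cosmosEndpoint = "https://<your-account>.documents.azure.com:443/";

// No keys are used: the client authenticates with the resource's managed identity,
// or with your own account when running locally.
var client = new CosmosClient(cosmosEndpoint, new DefaultAzureCredential());
var container = client.GetContainer("<database>", "<container>");

// Allowed because the identity holds the Cosmos DB Built-in Data Contributor role.
var response = await container.ReadItemAsync<dynamic>("<item-id>", new PartitionKey("<partition-key>"));
Console.WriteLine(response.Resource);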

Result in VS Code

What resources can use MSI? (There are more, but these are integration focused.)

  • Logic Apps
  • Azure Functions
  • API Management
  • Web Apps
  • DataFactory
  • Event Grid

Read the full list here.

What resources can we authenticate to with MSI?

  • Azure SQL
  • Cosmos
  • Service Bus
  • Event Hub
  • Azure Functions (if you use Azure AD as provider in Authentication)
  • Web Apps (if you use Azure AD as provider in Authentication)
  • Logic Apps (if you use OAuth Authorization)
    • In Logic Apps we can use MSI to connect to resources like Event Grid for creating subscriptions.

Is this not what we accomplished before?

No. Before, we created a SQL account, took keys from Cosmos, Service Bus etc., or created a service principal or a service user in AD to accomplish other access. All of these could potentially be simplified and changed to only give the resource the appropriate access rights. Note that not all resources support this yet, but this is an area where Microsoft has made some great gains in the last six months.

But isn’t this making my development process a pain?

No, on the contrary: as a developer you don't need to gather all these credentials either, since development and testing can be done with your own account. That makes it easier to get started, and another great benefit is that you cannot accidentally keep credentials and access after a completed assignment or project. With good documentation it's also easier to get a new teammate up to speed. Another good thing is that we know credentials are not wandering away when members leave the team.

What should I think of?

As a developer you could usually grant yourself access to different resources. When using MSI this is no longer the case, since access is now granted in Azure AD, and Azure AD access is often prohibited for developers. Authorization now needs to be given by a user with the Owner or User Access Administrator role; to overcome this you need to create a plan for your team on how this should be done.

A tip here is to create an Azure AD group, add all developers that need specific access to it, and remove them when they no longer need it. An extra step would be creating a PIM group for it.

Read more:

Posted in: | Development  | Tagged: | Azure AD  | MSI 


Using MSI in APIM to Cosmos

Quite recently Microsoft finally gave us RBAC roles for data plane access to Cosmos DB, which means that we can now use MSI to access data in Cosmos.

So what's it all about? Well, the best part is that we can now grant access to resources rather than using keys. This increases security and accessibility, since we don't need to manage the key any more and only specific resources can access our data. When we have switched to RBAC as the authentication method we can switch off the possibility to access data with the primary/secondary keys, read more here.

So how do we do this? Here is a guide from Microsoft on how to use MSI with an Azure Function. In this post we will use the same technique, but from API Management, to show the potential.

Prerequisites:

  • API Management instance
  • Identity created on your API Management instance
  • Cosmos DB
  • Azure CLI

First we need to grant the specific role to our API Management identity. The role we want to assign is the Cosmos DB Built-in Data Reader. This role has read access, which matches our least-privilege thinking: since we only want to expose read operations we should only have read access.

Adding this role is not as straightforward as I would like; these roles cannot be set via the Azure Portal yet, so we need to do some scripting. Fortunately all scripts needed can be found in the Microsoft docs on how to set up RBAC for Cosmos DB. I recommend looking in there, since they also show how to create your own custom role definitions.

Anyway, let's get started; we need this script:

$resourceGroupName = "<myResourceGroup>"
$accountName = "<myCosmosAccount>"
$readOnlyRoleDefinitionId = "<roleDefinitionId>" # as fetched above
$principalId = "<aadPrincipalId>"
New-AzCosmosDBSqlRoleAssignment -AccountName $accountName `
    -ResourceGroupName $resourceGroupName `
    -RoleDefinitionId $readOnlyRoleDefinitionId `
    -Scope "/" `
    -PrincipalId $principalId

Let's get the parameters. First we get the Cosmos values: the resource group and the account name of the Cosmos instance ($resourceGroupName, $accountName), both marked in circles. Get Cosmos resource group and account name

Then, as we saw in the built-in role definitions, the role named Cosmos DB Built-in Data Reader has the id 00000000-0000-0000-0000-000000000001.

Last we need the principal id of our API Management instance (this could be any identity created for any resource). We go to Managed Identity in our API Management instance, where we can collect the Object (principal) ID. Get APIM Managed Identity Principal Id

So the script ends up like this, here using the az CLI equivalent (make sure to run the latest az version):

$resourceGroupName = "microservice"
$accountName = "mlogdberg"
$principalId = "c7b89cc3-1b27-474b-9575-cdcbc51dffcc"
$readOnlyRoleDefinitionId = "00000000-0000-0000-0000-000000000001" # as fetched from definition list
az cosmosdb sql role assignment create --account-name $accountName --resource-group $resourceGroupName --scope "/" --principal-id $principalId --role-definition-id $readOnlyRoleDefinitionId

After this we are ready to start querying. I did a simple request that collects all documents in a specific partition. This can of course be expanded, but this is the bare minimum needed in the API Management policy.

Some good to know things:

  • authentication-managed-identity is the step that gets the token; make sure to set the resource to the path of your Cosmos instance.
  • x-ms-documentdb-partitionkey is which partition the query is targeting.
  • rewrite-uri is the path to your database and collection: /dbs/{dbname}/colls/{collectionname}/docs/
  • The url on the API is set to the path of my Cosmos instance (https://mlogdberg.documents.azure.com).
    <inbound>
        <base />
        <set-variable name="requestDateString" value="@(DateTime.UtcNow.ToString("r"))" />
        <authentication-managed-identity resource="https://mlogdberg.documents.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
        <set-header name="Authorization" exists-action="override">
            <value>@("type=aad&ver=1.0&sig=" + context.Variables["msi-access-token"])</value>
        </set-header>
        <set-header name="x-ms-date" exists-action="override">
            <value>@(context.Variables.GetValueOrDefault<string>("requestDateString"))</value>
        </set-header>
        <set-header name="x-ms-version" exists-action="override">
            <value>2018-12-31</value>
        </set-header>
        <set-header name="x-ms-documentdb-query-enablecrosspartition" exists-action="override">
            <value>false</value>
        </set-header>
        <set-header name="x-ms-documentdb-partitionkey" exists-action="override">
            <value>["1"]</value>
        </set-header>
        <rewrite-uri template="/dbs/staging/colls/order/docs/" copy-unmatched-params="false" />
    </inbound>

Now we should be able to send a request.

Result request

Troubleshooting tips:

Request blocked by Auth

Before the setup is completed you will get this kind of response when trying to call the Cosmos REST API.

HTTP/1.1 401 Unauthorized
content-location: https://mlogdberg.documents.azure.com/dbs/staging/colls/order/docs
content-type: application/json
    {
    "code": "Unauthorized",
    "message": "Request blocked by Auth mlogdberg : No AAD tenant is trusted by this database account. Please create atleast one Role Assignment prior to making an AAD-based request.\r\nActivityId: 31a968bf-7652-4434-abd9-5113ea0655ae, Microsoft.Azure.Documents.Common/2.14.0"
}

Partition key 1 is invalid

The header x-ms-documentdb-partitionkey is important, and when badly configured this kind of error is sent back. The expected format is a JSON array of strings: if your partition key is "1" the header should have the value ["1"] (like in my example). You can find your partition key by browsing your collection.

Browsing the collection to find the partition key

HTTP/1.1 400 Bad Request
content-location: https://mlogdberg.documents.azure.com/dbs/staging/colls/order/docs
content-type: application/json
    {
    "code": "BadRequest",
    "message": "Partition key 1 is invalid.\r\nActivityId: 6bccbf25-657b-4a71-af92-b3e998f0490a, \r\nRequestStartTime: 2021-11-21T18:24:18.9692549Z, RequestEndTime: No response recorded; Current Time: 2021-11-21T18:24:18.9792812Z,  Number of regions attempted:1\r\n{\"systemHistory\":[{\"dateUtc\":\"2021-11-21T18:23:09.7385423Z\",\"cpu\":8.902,\"memory\":58190802944.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.012,\"availableThreads\":32761,\"minThreads\":40,\"maxThreads\":32767}},{\"dateUtc\":\"2021-11-21T18:23:19.7586326Z\",\"cpu\":7.062,\"memory\":56609513472.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.0161,\"availableThreads\":32764,\"minThreads\":40,\"maxThreads\":32767}},{\"dateUtc\":\"2021-11-21T18:23:29.7687307Z\",\"cpu\":5.752,\"memory\":56766377984.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.0173,\"availableThreads\":32765,\"minThreads\":40,\"maxThreads\":32767}},{\"dateUtc\":\"2021-11-21T18:23:49.7689519Z\",\"cpu\":5.634,\"memory\":57133617152.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.0145,\"availableThreads\":32765,\"minThreads\":40,\"maxThreads\":32767}},{\"dateUtc\":\"2021-11-21T18:23:59.7790729Z\",\"cpu\":5.298,\"memory\":57189658624.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.0125,\"availableThreads\":32765,\"minThreads\":40,\"maxThreads\":32767}},{\"dateUtc\":\"2021-11-21T18:24:09.7891858Z\",\"cpu\":8.835,\"memory\":56394321920.000,\"threadInfo\":{\"isThreadStarving\":\"False\",\"threadWaitIntervalInMs\":0.0193,\"availableThreads\":32764,\"minThreads\":40,\"maxThreads\":32767}}]}\r\n, Microsoft.Azure.Documents.Common/2.14.0"
}

Summary:

The usage of MSI (Managed Service Identity) is a great feature on Cosmos: no need for access keys. It's a win-win, increased simplicity and also increased security. No credentials are stored, and because of that they cannot be leaked. Forcing components to use MSI when connecting to Cosmos can be done by setting disableLocalAuth to true: "disableLocalAuth": true, read more

Posted in: | Development  | Tagged: | Azure AD  | Api Management  | MSI 


AAD Personal Token in requests with VS Code

Two weeks ago I did a post about how I use Postman to generate AAD OAuth tokens for service principals. Soon after it was released I started to get questions, THANKS A LOT to all of you who read and comment. One thing led to another and an interesting discussion arose with one of you; from that discussion I had two questions ringing in my head. First, is it a good idea to give all developers client credentials to access services? They take time to create, and these credentials can be shared with others, even outside the organisation. And second, can't we as developers just use our own credentials and access?

If we could use our own account instead of a service principal we would improve security, startup time and flexibility all at once.

So I started to look around and I found a good option for this, and that is what I’m going to share today.

I found that VS Code has a nice extension that I will use in this case: the Rest Client.

So first install the extension in VS code.

VsCode Rest Client Extension

After this, let's create a new workspace and start testing this out. As a standard I create a folder and add the requests there; it's easier to reuse and modify them for new requests.

I've created a folder called D365 and added a file GetQuotes.http. The .http ending tells the Rest Client extension in VS Code that it's an HTTP request, and it therefore adds a Send Request button at the top that will trigger the request.

The structure and semantics of this file are very easy; the tactic is to just write the request in raw format and then press send.

Let's go through some small examples. A simple GET example to get comment 1: write the method, URL and HTTP protocol to use, so a GET to https://example.com/comments/1 using standard HTTP/1.1 is written as:

GET https://example.com/comments/1 HTTP/1.1

A more advanced example: a POST to create a new comment. POST to https://example.com/comments using standard HTTP/1.1, add the JSON body raw and don't forget the content-type header:

POST https://example.com/comments HTTP/1.1
content-type: application/json

{
    "name": "sample",
    "time": "Wed, 21 Oct 2015 18:27:50 GMT"
}

As you can see we just write what we want to send, but enough of the basics, let's get going.

When we want to use our own credentials to call an AAD protected resource like Dynamics D365, we can use the built-in variable $aadToken in the Rest Client, read more here.

It is simple to use; we just need to add it to our Authorization header. Let's look at an example that gets Sales Quotation Headers from D365.

GET https://.cloudax.dynamics.com/data/SalesQuotationHeaders?cross-company=true&$top=3 HTTP/1.1
Authorization: {{$aadToken}}

When we then press Send Request as shown in the image: VS Code http file

A login prompt will pop up like the one below, press Sign in.

VS Code prompt

A browser will open and you need to paste the code from the previous prompt into the code box (it's in your clipboard at this point, so just paste it).

Browser Login

Follow the login procedure and get back to VS Code to see that the request has been sent and view the data.

Result in VS Code

This request is then sent to D365 with a token generated for you. As with Postman or other tools, you can now start building your own collections of requests in VS Code to share, check in with your code, or just use yourself. Read more on the Rest Client page to get details on advanced request building.

Summary:

As developers we often need to query systems like D365. When using the "normal" way with service principals we are using credentials anyone can use; we can share them amongst colleagues and they will be stored on developers' computers. With the described approach we can stop that and start pushing developers to use their own credentials, increasing security and control over who has access to our systems and data.
So not only is it awesome for a developer to get started quicker and easier without the need of a service principal, it's also better practice security-wise.

There are probably many more ways to solve this, but this is one way and my thoughts on it.

Posted in: | Development  | Tagged: | Azure AD  | Oauth  | VS Code  | Token generation 


Oauth Token generation with Postman

Beginning to develop solutions where OAuth is used as the authentication mechanism is often easy, due to a lot of frameworks sorting out the heavy lifting. But I prefer to go in and see the raw data and make small changes/new requests to the APIs to get data, in order to understand what's going on. In those cases I use Postman and build up collections to easily get the data I want and need. The last few days I've gotten questions from multiple people about how I do it, so I felt I should share a blog post about it since more might be interested.

First of all, this is the way I'm doing it, so if you have a better one please fill in the chat, since there is more than one way to solve this problem. I create a new workspace "Token generation demo" in Postman (a workspace is a container for e.g. a project, where I gather all the requests and settings I need when working in this project).

Postman Workspace start

Let's go through the basics needed; if you have never worked with Postman, these are the areas we will use. 1) Collections: here are all our collections that contain the requests we want to group together, like "Sales Orders" or "D365". 2) Request area: here we will work with our requests. 3) Environment variables: these will be used to safely store credentials and also to quickly change between environments.

Postman Workspace info

There are some built-in mechanisms, but I always find myself in some area of trouble, spend a lot of time troubleshooting and just come out confused, so the approach I'm using feels easier and you get more control over what's going on, call me old school if you want :).

So let's start off by creating a collection, I name it Token Sample. After that we are ready to add our first request, the one for creating the token, and we are doing this towards Azure AD. 1) Press the New button and pick 'HTTP Request'. 2) Add the url; for Azure it's always https://login.microsoftonline.com/{{tenantId}}/oauth2/token, only switch the {{tenantId}} to your tenant id. 3) Add the body, since this is where we will send in the credentials and all other information needed.

grant_type:client_credentials
client_id:{{client_id}}
client_secret:{{client_secret}}
resource:{{audience}}

For the grant type, since we use client credentials we pick client_credentials, and then the id and secret of your Azure AD app. Last is the audience we are targeting; in this sample it's the Azure Management API, but it could be any resource that you have in your AD.

4) Name the request so you can find it easily. 5) Save it in your collection.

New Request

Now we have added a bunch of variables; each place that looks like {{name}} is a variable, with the name inside the curly brackets. Let's add an environment to add these variables to: click on the eye and then the Add button for environments. Start populating the variables we need:

  • tenantId - your tenant id, which can be found in the targeted AD
  • client_id - the client id of the application you are authenticating with
  • client_secret - the client secret of the application you are authenticating with
  • audience - the targeted audience, meaning where do I want to have access; in my case the Management API.

New Environment

If we now go back to the request we can generate a token; make sure to pick the environment at the top right (see image):

Test Get Token Request

Now we need that token when we do requests. There are several ways to do this, but my favourite is to add a variable for it and get it updated automatically. Under the Tests tab we can add scripts, read more on scripts. We will add a simple script that verifies that we get a JSON body back, picks out the bearer token in the response and updates the environment variable named bearerToken.

var json = JSON.parse(responseBody);
tests["Get Azure AD Token"] = !json.error && responseBody !== '' && responseBody !== '{}' && json.access_token !== '';
postman.setEnvironmentVariable("bearerToken", json.access_token);

Postman Test's tab

And after the run we now have the token in the variable bearerToken as we can see under the environment by clicking on the eye.

Token stored to Environment variables

Now we can start using it, and it's very simple: let's start off by checking what subscriptions this service principal has access to via the Azure Management API.

1) Create a new request. 2) Name it Get Subscriptions. 3) Set the operation to GET and the url to https://management.azure.com/subscriptions?api-version=2014-04-01-preview. 4) Add the Authorization header with the Bearer type and the token from the variable. 5) Send the request.

Test the setup

As we can see, a list of subscriptions is returned, and we can now easily set up new environments and reuse the same requests; just change the environment and the results will be based on the new credentials. This will quickly get you up and running with whatever fun things you are doing.
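As a side note (not part of the original post), if you ever need to script the same token call outside Postman, a minimal C# sketch against the same endpoint could look like this, with placeholder values:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;

// Placeholders for your tenant and app registration.
var tenantId = "<tenantId>";
var form = new Dictionary<string, string>
{
    ["grant_type"] = "client_credentials",
    ["client_id"] = "<client_id>",
    ["client_secret"] = "<client_secret>",
    ["resource"] = "https://management.azure.com/"
};

using var http = new HttpClient();
var response = await http.PostAsync(
    $"https://login.microsoftonline.com/{tenantId}/oauth2/token",
    new FormUrlEncodedContent(form));

// The bearer token comes back in the access_token property of the JSON body.
using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(json.RootElement.GetProperty("access_token").GetString());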

Summary:

We often need the flexibility that running our own requests gives us, and this is how you can use Postman for that. I often end up building request collections in Postman and distributing them across the team, or using them for testing and retesting things. As I usually do, I have prepared a collection for you if you want to speed up the implementation, download it here.

Posted in: | Development  | Tagged: | Azure AD  | Oauth  | Postman  | Token generation 


Troubleshooting Azure API Management

We often encounter situations where we need to troubleshoot our APIs, and in this session I will go through the different options we have. The following are the three different places where we can troubleshoot; here is a quick overview of the three options, and here is a link to a video using all of them. Troubleshooting in Azure API Management

Built in Analytics

In the built-in analytics we can easily see a good overview and status of our APIs. We can see how requests are going, succeeded or failed and so on; we can also see data transferred and get insight into response times. Filtering can be done on APIs, products and even subscriptions to get an insight into how a specific consumer experiences our APIs. Using the Requests tab we can even see the requests made, very useful if you want to find issues like 404 responses or similar.

Analytics startpage

Application Insights

In addition we can add other tools to get deeper insight into what is happening, e.g. why does the consumer get a 401? Or to find the error text of that 500 response that is returned. If our backend resource also uses Application Insights you get the full end-to-end experience, where you can track all relevant information connected to the request, like dependencies, logged events and so forth. In Application Insights we can see all failures under the Failures tab.

App Insights Failures

Clicking through we can see the timeline and also the specific error text; in my case there is a TokenClaimValueMismatch in the policy section validate-jwt, and if you watch the video we can see that this is exactly why we get the 401: the token provided is missing a permission.

App Insights Failures

Live Debugging in VSCode

If we need to go deeper in our troubleshooting of an API Management policy we can use the Azure API Management VS Code extension. In this extension we can edit the policies for our API Management instance and upload them back when done.

VSCode Policy

We can start a debug session where we can send a request and debug the policy step by step! This is awesome and so useful! (Only that request will be captured, so this can safely be done even in production.) Right click the operation and select "Start Policy Debugging".

VSCode Policy Debugging

The request is created and you can add your settings, payload etc. and then press "Send Request".

When using versioning, please verify the URL; there is currently no support for versioning, so in my case the version segment is lost. My URL in the sample below is https://.azure-api.net/Orders/orders/my but due to versioning it should be https://.azure-api.net/Orders/v1/orders/my.

VSCode Send Request

For more details and a walkthrough, see the video: Troubleshooting in Azure API Management

Summary:

Being a great API publisher is about building robust APIs, taking care of our API consumers and giving them a great experience on all fronts, especially when there are issues. We want to be able to find them, fix them and report back to the consumer, with either help on what to do to fix it on the consumer side or a "we fixed it" from the API publisher side.

Posted in: | API Management  | Tagged: | API Management  | Integration  | Serverless 


API Management and Auth0 Consumer Friendly API's

Today we will extend our setup with API Management even further, focusing on consumer friendly APIs, which in this setup means knowing our user better. In this example we will add the customer id to the consumer representation in Auth0 and use this id to give the consumer an operation called My orders. There is no need to know, use or add an id in the request: the id will be added to the token from Auth0, then extracted in API Management and sent downstream to filter the result set.

An identity provider is a service that a user/application signs in to (just like Azure AD), and this provider has the functionality to provide needed information and grant access to the requested resources that the IDP handles. Just like the fact that you have access to a resource or resource group inside your subscription in Azure.

In API Management we have a trust set up as of the previous post Setup Auth0 with API Management. Then we added RBAC permissions in the second post RBAC with API Management, where we added functionality to provide granular control over who has access to what operation. Now we will continue and focus on extending this setup by adding a customerid to our consumer representation in Auth0 and then extracting that id in API Management to enforce that the consumer can only access that customer's orders, providing an operation to get just my orders without any additional parameters. This is achieved via the validate-jwt policy, reading the custom claim from the JWT token and storing it in a variable for later use.

<set-variable name="customerid" value="@(((Jwt)context.Variables["token"]).Claims.GetValueOrDefault("https://mlogdberg.com/customerid","-1"))" />

The end scenario will look like the illustration below, where we extract the customerid from the token and use it later down the path.

Scenario image

I've created a video that goes through all of this; a link is provided below.

Azure API Management and OAuth with Auth0

Links used in the video:

Summary:

Being a great API publisher is not as easy as it first looks. Always striving for better and easier-to-use APIs is a good thing, and here we show how to move more logic inside the setup rather than leaving it to the consumer. If we can prevent the consumer from needing to filter data and/or manage identifiers and so forth, we can make it easier for the app builders that consume our APIs and provide speed and agility to those teams. At the same time we can create more secure and robust APIs with more automated testing. There are more ways to use this, and I'm interested to see what you will find most useful.

Posted in: | API Management  | Tagged: | API Management  | Integration  | Serverless  | Auth0  | IDP  | OAuth 


RBAC with API Management

We are extending our setup with API Management and Auth0 to create RBAC control over the operations in our APIs. The idea is to make sure that consumers can be granted access to specific operations in an API and not only to the whole API. This can be achieved in a number of ways, and in this post we will use Auth0 as the identity provider to help out with this, but you could use any other of your choice.

An identity provider is a service that a user/application signs in to (just like Azure AD), and this provider has the functionality to provide needed information and grant access to the requested resources that the IDP handles. Just like the fact that you have access to a resource or resource group inside your subscription in Azure.

In API Management we have a trust set up as of the previous post Setup Auth0 with API Management. Then we add permissions to our representation of the API Management instance in Auth0 and grant the consumer only the permissions that we want to give. After the permissions are set up and granted, we will enforce them in our API Management instance via the validate-jwt policy.

The end scenario will look like the illustration below, where we can grant access to some or all operations in our API.

Scenario image

I've created a video that goes through all of this; a link is provided below.

Azure API Management and OAuth with Auth0

Links used in the video:

Summary:

There are a lot of times where extra granularity is needed in the APIs we expose. It helps with creating easier-to-use and more natural APIs. It can be a good selling point if used to show that there are more features for premium customers. Regardless, adding RBAC to your APIs will increase security and ease of use. It will also give you the possibility to move the approval of new customers out of the API Management instance and make it more of a business decision, to increase new consumer adoption. The setup used in Auth0 is very powerful and can be applied to end users as well, which can be very helpful when adding more lightweight consumers with single purposes.

Posted in: | API Management  | Tagged: | API Management  | Integration  | Serverless  | Auth0  | IDP  | OAuth 


Setup Auth0 with API Management

API Management is an awesome API gateway with functionality to really excel in exposing APIs to consumers. When it comes to security there are several options, and today we will look into OAuth. In order to do this we need an IDP (identity provider) that we can configure a trust relationship with.

An identity provider is a service that a user/application signs in to (just like Azure AD), and this provider has the functionality to provide needed information and grant access to the requested resources that the IDP handles. Just like the fact that you have access to a resource or resource group inside your subscription in Azure.

In API Management, setting up a trust to an IDP and validating the JWT provided by the IDP is done easily via the access restriction policy called validate-jwt.

Let's go through what the setup looks like; we will need to set up a trust between your API Management instance and your Auth0 instance.

Scenario image

I've created a video that goes through all of this; a link is provided below.

Azure API Management and OAuth with Auth0

Links used in the video:

  • Validate JWT Token
  • Auth0 Openid Configuration url: https://YOUR_AUTH0_DOMAIN/.well-known/openid-configuration

Summary:

Adding a second security layer like this increases security and, as you will see later on, flexibility. It's an awesome start in order to build a nice consumer experience for your APIs. In API Management it's very easy to attach any IDP, so you can pick and choose your favourite and the setup will be somewhat similar.

Posted in: | API Management  | Tagged: | API Management  | Integration  | Serverless  | Auth0  | IDP  | OAuth 


Azure Functions logs in Application Insights part 2

In part one I covered the ILogger interface and what it provides for us, but sometimes we want more control of our logging, want to enrich our logging, or have already implemented a lot of logging with TelemetryClient and just want to connect that logging to our end-to-end experience.

We use the same scenario: a message is received via API Management and sent to an Azure Function that publishes the message on a topic, a second Azure Function reads the published message and publishes it on a second topic, and a third Azure Function receives the message and sends it over HTTP to a Logic App that represents a "third party system".

Scenario image

Read more about the scenario background and the standard ILogger interface in the previous post Azure Functions logs in Application Insights.

Connecting custom logs from TelemetryClient

If we need more custom logging options than the standard ILogger provides, or we already have a lot of logging with TelemetryClient that is not connected to the end-to-end scenario, we need to attach these logs to our end-to-end logging experience via the TelemetryClient to take advantage of the more advanced logging features.
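The snippets below assume that the function class has a telemetryClient instance; one minimal way to create it (a sketch, assuming the classic Application Insights SDK and the standard APPINSIGHTS_INSTRUMENTATIONKEY application setting) could be:

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

public static class Function1
{
    // The instrumentation key is read from the Function App's application settings.
    private static readonly TelemetryClient telemetryClient = new TelemetryClient(
        new TelemetryConfiguration(Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY")));

    // ...the Run methods shown below use this telemetryClient instance...
}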

In order to achieve this we need to add some context properties to our instance. Here is one way to do this via Service Bus, where we receive the Message object.

From the Message object we can extract the Activity; this is an extension method found in the Microsoft.Azure.ServiceBus.Diagnostics namespace. From the result of this operation we can get the RootId and ParentId, which are the values for the current operation id and the parent id. Read more on how the operation id is built and how the hierarchy is designed.

public string Run([ServiceBusTrigger("topic1", "one", Connection = "servicebusConnection")]Message message, ILogger log)
{
    string response = "";
    var activity = message.ExtractActivity();

    telemetryClient.Context.Operation.Id = activity.RootId;
    telemetryClient.Context.Operation.ParentId = activity.ParentId;

    telemetryClient.TrackTrace("Received message");

This can also be done from an HTTP call, read more.

// If there is a Request-Id received from the upstream service, set the telemetry context accordingly.
if (context.Request.Headers.ContainsKey("Request-Id"))
{
    var requestId = context.Request.Headers.Get("Request-Id");
    // Get the operation ID from the Request-Id (if you follow the HTTP Protocol for Correlation).
    telemetryClient.Context.Operation.Id = GetOperationId(requestId);
    telemetryClient.Context.Operation.ParentId = requestId;
}
	
//GetOperationId method
public static string GetOperationId(string id)
{
    // Returns the root ID from the '|' to the first '.' if any.
    int rootEnd = id.IndexOf('.');
    if (rootEnd < 0)
        rootEnd = id.Length;

    int rootStart = id[0] == '|' ? 1 : 0;
    return id.Substring(rootStart, rootEnd - rootStart);
}

Now we will also add this small piece of code that tracks an event.

var evt = new EventTelemetry("Function called");
evt.Context.User.Id = "dummyUser";
telemetryClient.TrackEvent(evt);

And this can now be found in the end-to-end tracing.

Event in Overview Event in Telemetrylist

Adding an Extra “Operation”

This is probably my favourite thing in all of this. When processing messages, some parts of the process are bigger and/or more important to understand; this could be a complex transformation or just a set of steps that we want to group. Then we can use StartOperation in TelemetryClient. This will start the scope for the operation, and it will stay open until we execute StopOperation; it has a status, and we can set some properties and get the execution time of the process.

An operation is either a dependency operation, intended to mark a dependency on other resources, or a request operation, meant for marking the creation of a request. I will use a dependency operation in my case since it's the closest match for my purpose; I use it to mark that some more complex logic has been executed in my function and what the result of it was.

I use this in several ways: one is when I process a lot of items in my Function, to mark that they are executed, like processing a record in a list; another is when performing a complex transformation, and so on.

In my example below I'll demonstrate the usage of an operation by doing a simple for loop, simulating the processing of a line and adding some valuable metadata. As a tip, set the Data property to the object that is going to be processed, to make it easier to understand what data is processed.

for (int i = 0; i < 10; i++)
{
    var op = telemetryClient.StartOperation<DependencyTelemetry>(string.Format("Function2_line_{0}", i.ToString()));

    op.Telemetry.Data = string.Format("Processing {0} and random guid {1}", i.ToString(), Guid.NewGuid().ToString());
    op.Telemetry.Type = "LineProcessor";
    op.Telemetry.Success = i < 9;
    op.Telemetry.Sequence = i.ToString();
    op.Telemetry.Context.GlobalProperties.Add("LineID", i.ToString());
	
	//do some heavy work!
	
    telemetryClient.StopOperation(op);                    
}

This will result in a much more detailed log in the end-to-end overview.

Operation in Overview

Custom properties are easily set on each operation for traceability or information; below is how we can add LineID as a custom property.

op.Telemetry.Context.GlobalProperties.Add("LineID", i.ToString());

Operation and custom Property

We can also set the operation as failed even if we continue the run, which is good for highlighting partial failures while still completing the rest of the run.

Operation and custom Property

Then we can find it under failures and from there start working on how to fix it.

Operation and failed operations

Read more:

Summary:

In this second part we have covered more advanced logging with TelemetryClient, and how to connect your current TelemetryClient logging to the end-to-end experience provided by Application Insights.

StartOperation is really powerful for highlighting what is happening, but I would like to have a third option for an operation, since using Dependency sounds wrong when it's a section of code in the same process, but it works!

This post is about how things can be done, and hopefully it can be guidance in creating a better logging experience; we want to avoid the developer's "let me just attach my debugger" response to the question of what the problem is with a flow running in production.

Posted in: | Azure Functions  | Application Insights  | Tagged: | Azure Functions  | Integration  | Serverless  | Application Insights