Innovation enabled by Integration

Normally we focus on technical implementations and problem solving, but from time to time we get the opportunity to widen the view and focus on innovative thinking. These are the moments where we can show our customers the real value of applying good architecture and principles. Currently I'm having so much fun doing innovative prototyping work with one of my customers that I need to share it with you. We are using Power Apps and Flow for our prototyping in order to get prototypes done quickly and easily, without spending too much time and effort.

Power Apps is a platform where you can easily build apps with a point-and-click approach, targeting multiple device types at the same time. Read more at: https://powerapps.microsoft.com/

Flow is a lightweight integration platform that runs on top of Logic Apps. It's designed to make it easy to create your own flows and turn repetitive tasks into automated workflows. Read more at https://emea.flow.microsoft.com/

These platforms are very easy to get started with, and there are several sample applications covering Office 365 and other common applications. But the real power for innovation becomes clear when you have a good API landscape that can be used when building these applications.

We have a customer that we started working with almost a year ago, and now that all the hard work is done and we have a good integration landscape, we can harvest the benefits of our architecture. I started by showing the case described below, and now they really hunger for more of this kind of innovation, both for internal and external use. It's so wonderful to see how our hard work pays off, showing that we really are the "integration heroes": we are the enablers, making this possible.

Architecture that enables innovation?

In reality it's quite simple: create a platform that contains APIs and flows that can be reused. Make sure to build domain model-based APIs and function-driven APIs, and focus on making them simple to use and, most importantly, reusable. Here is a sample architecture:

Integration Architecture Sample

And in our API Management instance we have created a few domain model APIs:

API List Sample

A few months back we helped replace an internal system for product information: a system where you could enter a serial number and get basic information about the product. The end solution was a website that used a couple of APIs to retrieve the information needed. The fun part was that most of the APIs were already created in other projects for similar needs, but used from other systems and apps. All we did was fine-tune an API, leaving us with time to innovate. On a computer it makes sense to type a serial number, but on a mobile phone we expect to be able to scan it. Since we had the time, we started to investigate the problems around this and whether it was solvable.

When we started looking into it, the first "problem" was that the serial plates on the machines had no barcodes, only plain text:

Serial Plate Sample

Luckily this functionality exists in Cognitive Services in Azure; read more at Azure Cognitive Services.

We started to test this with a Power App that used a Flow to send the image to the Cognitive Service. The Power App solution is very simple: it uses the camera to take a picture and then sends the picture to a Flow that utilizes the Cognitive Service to identify the serial number on the plate. The result from the Flow is a URL with the serial number, which we use to launch the website, making use of the web app. With this we had expanded the functionality with a serial plate scanner and provided a basis for the decision on investing in a mobile app.

Here is the flow:

Microsoft Flow Overview

Here is the Power App in Power Apps Studio, developed in my browser:

Microsoft Power App

As you can see there is not much going on, just a Camera control with an OnSelect statement that launches the web browser with the URL that comes back from our Flow "Get-Path-To-Product".
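As a minimal sketch, the OnSelect formula could look something like the line below, assuming the camera control is named Camera1 and the Flow response contains a field named url (both names are illustrative, not taken from the actual app):

Launch('Get-Path-To-Product'.Run(Camera1.Photo).url)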

The flow is also quite simple:

Microsoft Flow Solution

The tricky part was how to send the image to the Cognitive Services. In this case we created a Compose action with a dataUriToBinary function:

Microsoft Flow Action Compose

dataUriToBinary(triggerBody()['Skapa_Indata'])
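In the underlying workflow definition, that Compose action would look roughly like this (a sketch; the action name Skapa_Binary is illustrative, while the trigger body property Skapa_Indata comes from the Power App):

"Skapa_Binary": {
  "type": "Compose",
  "inputs": "@dataUriToBinary(triggerBody()['Skapa_Indata'])"
}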

And in the HTTP action we send the binary data to the OCR service (in the sample below, the OCR service is behind our API Management instance):

Microsoft Flow Action HTTP
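A sketch of what the HTTP action could contain, assuming the OCR service is exposed behind API Management at a hypothetical URL and secured with a subscription key; the body references the output of the Compose action above:

"HTTP": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://our-apim-instance.azure-api.net/ocr/recognize",
    "headers": {
      "Content-Type": "application/octet-stream",
      "Ocp-Apim-Subscription-Key": "<subscription key>"
    },
    "body": "@outputs('Skapa_Binary')"
  }
}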

All in all, the prototype was easy and fast to build, but most importantly it successfully gave us the power to demo the concept, providing a visual demo to show the value of having an app that can scan the serial plate. This gave decision makers a good understanding of the benefits of investing in a native app, without spending too much time and resources.

Summary:

The app is currently under development, and we have more of these innovative prototypes coming up, prototyping new or upgraded functionality in a variety of sizes. The best part of using this approach is that we can create a prototype in a day or two, giving decision makers a visual and testable solution to review. All thanks to our hard work with APIs and workflows, and the amazing tools Power Apps and Flow!

Become the Integration Hero that you really can be, and set that unused innovation power free!

Help Business Evolve

Posted in: | Innovation  | Power Apps  | Flow  | Tagged: | Innovation  | Integration  | Architecture  | Integration Architecture  | Power Apps  | Flow 


API Management Template Creator Updates August 2018

Updates to the API Management Template Creator were published a few weeks back:

A minor thing for usage, but great for consistency and quality in the generator: a lot of tests have been added.

  • Functions support added
  • Updated Logic App Support
  • Removed some name parameter generation

Functions support added

Finally, support for APIs that are generated from Functions is live. An API generated from Functions now gets the appropriate automatic generation of function keys via ARM functions, meaning that you don't need to supply the function key in the parameter files; the keys are collected automatically via the listsecrets function in ARM.

Now:

{
      "comments": "Generated for resource /subscriptions/c107df29-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/PreDemoTest/providers/Microsoft.ApiManagement/service/ibizmalo/properties/5b41934c6d0f59440d20c5ee",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_ibizmalo_name'),'/','5b41934c6d0f59440d20c5ee')]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "maloapimtest_HttpTriggerAdminKey_query_5b41934c6d0f59440d20c5ee",
        "value": "[listsecrets(resourceId(parameters('FunctionApp_maloapimtest_resourceGroup'),'Microsoft.Web/sites/functions', parameters('FunctionApp_maloapimtest_siteName'), parameters('operations_api-HttpTriggerAdminKey-post_name')),'2015-08-01').key]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Previously:

"parameters": {
    "INT2051functionkey_value": {
      "type": "securestring",
      "defaultValue": "Aze/1W8wCwItuP8JacdyHa2vDw8YScrOkbq6uHcTXiOasb2wi3kZoQ=="
    }
 }
	. . . 
 {
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int2051functionkey",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_cpidev_name'), '/' ,parameters('property_INT2051functionkey_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "INT2051functionkey",
        "value": "[parameters('INT2051functionkey_value')]",
        "tags": null,
        "secret": true
      },
      "resources": [],
      "dependsOn": []
}

Updated Logic App Support

Two bigger things have been targeted to improve the export and, later, the deployment experience for APIM APIs that use Logic Apps.

Trigger name fix

A Logic App trigger can now have any name; previously it was hardcoded to 'Manual'. Logic has been added to retrieve the information at extraction time: the extractor fetches the Logic App definition and finds the HTTP trigger to get the correct name.

The trigger name is extracted from the Logic App and added to the ARM functions (customtriggername in the sample below):

Logic App Trigger name

Will result in (note the ‘customtriggername’):

"[listCallbackUrl(resourceId(parameters('LogicApp_customtrigger_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_customtrigger_logicAppName'), 'customtriggername'), '2017-07-01').queries.sig]"

Add Logic App as operation to existing API fix

There has been different behavior between APIs generated from a Logic App and Logic Apps imported as a new operation to an existing API. The main difference has been that the sig has been added as a "normal" parameter, with no ARM functionality added to generate the sig. This is now addressed, and the appropriate ARM functions will be added and generated automatically for both types.

Properties will get their value from the [listCallbackUrl] function:

{
      "comments": "Generated for resource /subscriptions/1fake145-d7f4-4d0f-b406-7394a2b64fb4/resourceGroups/Api-Default-West-Europe/providers/Microsoft.ApiManagement/service/apidev/properties/int7100-payment-to-o-test_manual-invoke_5b349f2a42974a989226cf33",
      "type": "Microsoft.ApiManagement/service/properties",
      "name": "[concat(parameters('service_apidev_name'), '/' ,parameters('property_int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "int7100-payment-to-o-dev_manual-invoke_5b349f2a42974a989226cf33",
        "value": "[listCallbackUrl(resourceId(parameters('LogicApp_INT7100-Payment-To-o-DEV_resourceGroup'), 'Microsoft.Logic/workflows/triggers', parameters('LogicApp_INT7100-Payment-To-o-DEV_logicAppName'), 'request'),'2017-07-01').queries.sig]",
        "tags": [],
        "secret": true
      },
      "resources": [],
      "dependsOn": []
    }

Removed some name parameters

Parameters generated for the names of resources like policies, version sets, operations etc. have no real use case, and no benefit is associated with changing the names, so they have been removed to simplify the ARM templates.

Sample of parameters that are now removed:

    "versionset_5af9817ca656c6952416b779_name": {
      "type": "string",
      "defaultValue": "5af9817ca656c6952416b779"
    },
    "operations_GetSalesOrderById_name": {
      "type": "string",
      "defaultValue": "GetSalesOrderById"
    },
    "operations_policy_name": {
      "type": "string",
      "defaultValue": "policy"
    }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI part of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc., without any extra manual work. This makes it possible to do the development inside the portal and then just extract the result and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


API Management Template Creator Updates May 2018

Updates to the API Management Template Creator have been dragging, but now I'm pleased to have fixed the two most urgent ones.

  • Support for Versions, with version sets
  • Schemas

Support for Versions, with Version Sets

If an extracted API has versioning enabled, the current version set is also extracted and provided inside the ARM template. The version set states the version that is used and is needed by the API to be able to support versions; it contains the information about how the versions are handled for the specific API. The sample below uses an HTTP header named API-Version to set the version of the API; the version itself is then set on the API.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/api-version-sets

{
      "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/api-version-sets/5ae6f90fc96f5f1090700732",
      "type": "Microsoft.ApiManagement/service/api-version-sets",
      "name": "[concat(parameters('service_dev_name'), '/' ,parameters('versionset_5ae6f90fc96f5f1090700732_name'))]",
      "apiVersion": "2017-03-01",
      "properties": {
        "displayName": "Arkivet",
        "description": null,
        "versioningScheme": "Header",
        "versionQueryName": null,
        "versionHeaderName": "API-Version"
      },
      "resources": [],
      "dependsOn": []
    }

Support for Schemas

If an API has schemas added to operations, these schemas will be extracted and added to the ARM template, and all operations will have a "dependsOn" to the specific schema to prevent errors when executing the ARM template. The sample below is a simple schema that is added in the ARM template and referenced in the operations section.

Read more in the documentation: https://docs.microsoft.com/en-us/azure/templates/microsoft.apimanagement/service/apis/schemas

{
          "comments": "Generated for resource /subscriptions/fake439-770d-43f3-9e4a-8b910457a10c/resourceGroups/API/providers/Microsoft.ApiManagement/service/dev/apis/arkivet/schemas/5af038365b73730dd01453ad",
          "type": "Microsoft.ApiManagement/service/apis/schemas",
          "name": "[concat(parameters('service_dev_name'),'/',parameters('api_arkivet_name'), '/5af038365b73730dd01453ad')]",
          "apiVersion": "2017-03-01",
          "properties": {
            "contentType": "application/vnd.ms-azure-apim.swagger.definitions+json",
            "document": {
              "definitions": {
                "Body": {
                  "type": "object",
                  "properties": {
                    "No": {
                      "type": "string"
                    },
                    "ReportedDate": {
                      "type": "string",
                      "format": "date-time"
                    }
                  }
                },
                "BodyExists": {
                  "type": "object",
                  "properties": {
                    "CivicNo": {
                      "type": "string"
                    }
                  }
                },
                "Post400ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "message": {
                      "type": "string"
                    },
                    "description": {
                      "type": "string"
                    },
                    "errors": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "field": {
                            "type": "string"
                          },
                          "message": {
                            "type": "string"
                          }
                        }
                      }
                    },
                    "hasErrors": {
                      "type": "boolean"
                    }
                  }
                },
                "ExistsPost200ApplicationJsonResponse": {
                  "type": "object",
                  "properties": {
                    "arkivet": {
                      "type": "string"
                    }
                  }
                }
              }
            }
          },
          "resources": [],
          "dependsOn": []
        }

Summary:

Exporting APIs via the API Management Template Extractor simplifies the CI part of using API Management: we can select a specific API and export only that API, with operations, version sets, schemas, properties etc., without any extra manual work. This makes it possible to do the development inside the portal and then just extract the result and set up the import to the next environment.

Posted in: | API Management  | Tagged: | API Management  | Integration  | ARM Templates  | ARM 


Access AAD Secured Web APIs from API Management

Protecting Web Apps and Web APIs with the built-in Authentication and Authorization in Azure App Service is a great way to protect resources without adding code to handle the authorization. This means that the site or API is fully secured without the need to implement it yourself, which is a great example of separation of concerns. Read more on how it works.

What we then need is to access the API, and in this case we need to be able to access it from API Management using service principals, since the client using our APIs in API Management is outside our organization and should not be bothered by our internal security mechanisms.

The scenario: the API is protected by Azure AD, and in order to access it we need to add a valid JWT token to the Authorization header of the request. First (1) we need to get the JWT token, and then (2) add it to the request to the API App.

Scenario Overview

In this post we have used the standard generated API App from Visual Studio and published it to a Web API instance in Azure named Protected.

Enabling the built-in Authentication and Authorization for the API

In the Azure Portal, head to the API App and go into the settings tab "Authentication/Authorization". Enable App Service Authentication by pressing the On button.

Enable Authentication/Authorization

Now we can enable several different providers; in this demo we will focus on Azure Active Directory, so press Active Directory.

Configure Azure AD Authentication

After choosing Azure Active Directory, we are presented with two options for configuring the Azure Active Directory setup. Express gives some guidance and the possibility to create or select an existing AD Application, while in Advanced we can write in the values needed: the AAD Application client id and the issuer, normally https://sts.windows.net/ followed by the tenant guid, i.e. https://sts.windows.net/{tenantguid}/. In this setup we will use the Express option and create a new AD Application.

Here we can choose to use an existing AD Application or create a new one; I will create a new one named ProtectedAppAAD.

New Azure AD App

After this is set up we can save and test that the API is protected. If you open a new browser or log out from the account, you should be forced to log in, and the site is now secured.

Login Sign

So now the site is securely protected and we can access it when logged in; next we need to be able to access the API via API Management.

Provide access to the API

So the API is now protected, and we need to provide a way for API Management to access it. We will use another AAD Application (App Registration) and give it access to ProtectedAppAAD. So we need to create a new application, named APIManagementAADApp; the sign-in URL can be http://localhost.

Create a new AAD Application

After the AAD application is created, we need to give it access to ProtectedAppAAD. This is done via permissions: to assign permissions, go to Settings, press Required Permissions and then Add.

AAD Application add permission

Under Select an API we need to search for ProtectedAppAAD, since at first the list only shows Microsoft standard APIs. Enter ProtectedAppAAD in the text box and press Select.

AAD Application add permission select API

Now we need to select the permission, and the one we want is Access. Select it, press Select, and finish the setup by pressing Done.

AAD Application add permission select API

The last step is pressing Grant Permissions to enable the permissions (do not forget this!).

AAD Application add permission select API

API Management Policy for Acquiring a JWT Token

In order to be able to expose this API we need to get a token from AAD using the application. This will be done inside a policy, and luckily for us the API Management team has provided a set of code snippets on GitHub, and one of them does exactly that; get it here.

There are more of them in the GitHub repository, and if you have a great snippet that you want to share you can add a pull request; if the team approves it, it will end up there. Here is the GitHub repository: https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/.

The "Get OAuth2 access token from AAD" snippet looks like this:

<!-- The policy defined in this file provides an example of using OAuth2 for authorization between the gateway and a backend. -->
<!-- It shows how to obtain an access token from AAD and forward it to the backend. -->

<!-- Send request to AAD to obtain a bearer token -->
<!-- Parameters: authorizationServer - format https://login.windows.net/TENANT-GUID/oauth2/token -->
<!-- Parameters: scope - a URI encoded scope value -->
<!-- Parameters: clientId - an id obtained during app registration -->
<!-- Parameters: clientSecret - a URL encoded secret, obtained during app registration -->

<!-- Copy the following snippet into the inbound section. -->

<policies>
  <inbound>
    <base />
      <send-request ignore-error="true" timeout="20" response-variable-name="bearerToken" mode="new">
        <set-url>{{authorizationServer}}</set-url>
        <set-method>POST</set-method>
        <set-header name="Content-Type" exists-action="override">
          <value>application/x-www-form-urlencoded</value>
        </set-header>
        <set-body>
          @{
          return "client_id=&resource=&client_secret=&grant_type=client_credentials";
          }
        </set-body>
      </send-request>

      <set-header name="Authorization" exists-action="override">
        <value>
          @("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])
	  </value>
      </set-header>

      <!--  Don't expose APIM subscription key to the backend. -->
      <set-header exists-action="delete" name="Ocp-Apim-Subscription-Key"/>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>

The way it works is that two things happen before we send the request to the backend.

  1. First, a request is made in the send-request action to the AAD tenant, acquiring a token.
  2. The set-header action then adds the token to the Authorization header, extracting it from the result of the first request.

But before we can add this snippet to our API we need to add a few values to the Named Values section in our API Management instance.

  • authorizationServer - the tenant URL. My tenant ID is 1cb87777-3df4-428b-811a-86a0f215cd35, so the URL is https://login.microsoftonline.com/1cb87777-3df4-428b-811a-86a0f215cd35/oauth2/token
  • clientId - the Application ID of our AAD Application, in this case APIManagementAADApp
    • AAD Application id APIManagementAADApp
  • clientSecret - a key of our AAD Application, in this case APIManagementAADApp
    • Create a new key Create AAD Application key APIManagementAADApp
  • scope - the Application ID of the AAD Application that is protecting our resource, in this case the ID of ProtectedAppAAD
    • AAD Application id ProtectedAppAAD

After adding these, the Named Values section of our API Management instance will end up looking like this. Make sure to add them before adding the policy, since the policy cannot be saved if a referenced Named Value doesn't exist.

AAD Application id ProtectedAppAAD

Now we just need to add the policy to our API, and we will be able to access the protected API. After the policy is saved, test it and you should now receive the result from the API.

AAD Application id ProtectedAppAAD

Summary:

Today's solutions almost always access other resources, and when building this landscape we need to make sure that all parts are protected and secure; often we even need to combine a few mechanisms to be compliant with requirements from customers. In this post I showed how to use the built-in security of Web Apps with AAD, which can be used standalone or as an extra layer of security on top of the "standard" Basic, OAuth, API keys, certificates etc. authentications. Another benefit is that it is not part of the code, so it can't be "forgotten" or accidentally removed in a bug fix or similar.

AAD Application id ProtectedAppAAD

As a bonus, the API Management snippets repository is a really nice initiative with pre-created advanced snippets, so make sure to check it out.

https://github.com/Azure/api-management-policy-snippets/blob/master/Snippets/

Posted in: | API Management  | Tagged: | API Management  | Integration  | Security 


Logic App Custom Connector WSDL Tips

So, continuing my journey with the Logic Apps Custom Connector: this time we are doing some work with the WSDL part of the Logic Apps Custom Connector, and I thought I would share some tips and tricks I've learned and use when importing and using connectors with WSDL. The only way to be a good user of a resource like the Custom Connector is to understand how we can investigate behaviors, tweak and fix problems, and find debug information and logs.

Import Error

There are several reasons why there will be errors when importing the WSDL, and since it's API Management functionality we can (until the documentation is up to speed) see the limits on the API Management site:

https://docs.microsoft.com/en-us/azure/api-management/api-management-api-import-restrictions#wsdl

I encountered the "rare" error with recursive objects: apparently a "lazy" coder had just referenced the whole entity in a parent-child situation, even though the only field needed was the ID. Anyway, removing the recursive element solved the problem; alternatively, manipulating the representation could have been done.
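As a hypothetical illustration (not the actual WSDL), a recursive reference looks something like this: a complex type whose child element references the parent type itself instead of just the ID:

<s:complexType name="Order">
  <s:sequence>
    <s:element minOccurs="0" maxOccurs="1" name="Id" type="s:string" />
    <!-- recursive: references the whole parent type instead of just the Id -->
    <s:element minOccurs="0" maxOccurs="1" name="ParentOrder" type="tns:Order" />
  </s:sequence>
</s:complexType>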

Runtime problems, unexpected behaviors and errors from the backend system

So there are many reasons why errors can be sent from the backend system, but one of the common ones I've found is System.FormatException: Input string was not in a correct format, and that is due to an element specified as int while the value sent in is null. Now how can that be a problem? Since the generation is done by the same logic as in API Management, we can import the WSDL in API Management and see how the actual liquid template looks.
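A minimal illustration of the problem, using the QuantityPerPallet element from the WSDL further down (declared as s:int): when no value is supplied, the generated template still emits the element as empty, and the backend fails to parse the empty string as an int:

<s:element minOccurs="0" maxOccurs="1" name="QuantityPerPallet" type="s:int" />

<!-- sent as an empty element when no value is supplied, triggering System.FormatException -->
<QuantityPerPallet></QuantityPerPallet>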

Now let's look at a full example.

With one system, I was sent a link to the WSDL and a sample to make it easy to integrate with them. Testing the sample works fine, but importing the WSDL provides a lot more than the sample; as we are used to, there are more fields than we need. So sending in the test file via Postman works fine, but the connector does not. Let's take a look at why; below is the sample, a fairly simple message with few elements.

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ProcessArticle xmlns="http://ongoingsystems.se/WSI">
      <GoodsOwnerCode>code1234</GoodsOwnerCode>
      <UserName>user1234</UserName>
      <Password>pass1234</Password>
      <art>
        <ArticleOperation>CreateOrUpdate</ArticleOperation>
        <ArticleIdentification>ArticleNumber</ArticleIdentification>
        <ArticleNumber>6553173735310</ArticleNumber>
		<ArticleName>Display Stand</ArticleName>
        <ArticleDescription>Display Stand</ArticleDescription>
        <ArticleUnitCode>St</ArticleUnitCode>
        <IsStockArticle>1</IsStockArticle>
      </art>
    </ProcessArticle>
  </soap:Body>
</soap:Envelope>

But when using the Custom Connector, all the complex structures are mapped (this is generated via API Management):

 <set-body template="liquid">
			<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns="http://ongoingsystems.se/WSI">
				<soap:Body>
					<ProcessArticle>
						<GoodsOwnerCode></GoodsOwnerCode>
						<UserName></UserName>
						<Password></Password>
						<art>
							<ArticleOperation></ArticleOperation>
							<ArticleIdentification></ArticleIdentification>
							<ArticleSystemId></ArticleSystemId>
							<ArticleNumber></ArticleNumber>
							<ArticleName></ArticleName>
							<ProductCode></ProductCode>
							<BarCode></BarCode>
							<SupplierArticleNumber></SupplierArticleNumber>
							<ArticleDescription></ArticleDescription>
							<ArticleUnitCode></ArticleUnitCode>
							<CountryOfOriginCode></CountryOfOriginCode>
							<StatisticsNumber></StatisticsNumber>
							<PurchaseCurrencyCode></PurchaseCurrencyCode>
							<Weight></Weight>
							<NetWeight></NetWeight>
							<Volume></Volume>
							<Length></Length>
							<Width></Width>
							<Height></Height>
							<QuantityPerPallet></QuantityPerPallet>
							<QuantityPerPackage></QuantityPerPackage>
							<OrderPoint></OrderPoint>
							<Price></Price>
							<CustomerPrice></CustomerPrice>
							<PurchasePrice></PurchasePrice>
							<IsStockArticle></IsStockArticle>
							<ArticleGroup>
								<ArticleGroupOperation></ArticleGroupOperation>
								<ArticleGroupIdentification></ArticleGroupIdentification>
								<ArticleGroupCode></ArticleGroupCode>
								<ArticleGroupName></ArticleGroupName>
							</ArticleGroup>
							<ArticleCategory>
								<TypeOperation></TypeOperation>
								<TypeIdentification></TypeIdentification>
								<Code></Code>
								<Name></Name>
							</ArticleCategory>
							<VatCode>
								<VatCodeOperation></VatCodeOperation>
								<VatCodeIdentification></VatCodeIdentification>
								<VatCode></VatCode>
								<VatPercent></VatPercent>
							</VatCode>
							<DangerousGoods>
								<UNNumber></UNNumber>
								<UNIsMarineHazard></UNIsMarineHazard>
								<UNIsDangerousGoods></UNIsDangerousGoods>
								<UNPackageType></UNPackageType>
								<UNTunnelCodes></UNTunnelCodes>
								<UNClassNumber></UNClassNumber>
								<UNProperShippingName>
									<Name></Name>
									<LanguageCode></LanguageCode>
								</UNProperShippingName>
								<UNLabelNumbers></UNLabelNumbers>
								<DangerousGoodsCoefficient></DangerousGoodsCoefficient>
								<EmSCode></EmSCode>
							</DangerousGoods>
							<ArticleNames></ArticleNames>
							<ArticleStructureSpecification></ArticleStructureSpecification>
							<MainSupplier>
								<SupplierIdentificationType></SupplierIdentificationType>
								<SupplierOperation></SupplierOperation>
								<SupplierNumber></SupplierNumber>
								<SupplierName></SupplierName>
								<Address>
									<Name></Name>
									<Address></Address>
									<Address2></Address2>
									<Address3></Address3>
									<PostCode></PostCode>
									<City></City>
									<TelePhone></TelePhone>
									<Remark></Remark>
									<Email></Email>
									<MobilePhone></MobilePhone>
									<IsEuCountry></IsEuCountry>
									<CountryStateCode></CountryStateCode>
									<CountryCode></CountryCode>
									<DeliveryInstruction></DeliveryInstruction>
									<IsVisible></IsVisible>
									<NotifyBySMS></NotifyBySMS>
									<NotifyByEmail></NotifyByEmail>
									<NotifyByTelephone></NotifyByTelephone>
								</Address>
								<comment></comment>
							</MainSupplier>
							<IsGSPCertified></IsGSPCertified>
							<MaxStockDays></MaxStockDays>
							<BarCodePackage></BarCodePackage>
							<LinkToPicture></LinkToPicture>
							<BarCodePallet></BarCodePallet>
							<QuantityPerLayer></QuantityPerLayer>
							<PalletHeight></PalletHeight>
							<TaricNumbers></TaricNumbers>
							<IsObsolete></IsObsolete>
							<MinDaysToExpiryDate></MinDaysToExpiryDate>
							<AdditionalStatisticsNumber></AdditionalStatisticsNumber>
							<CustomsExportConditions></CustomsExportConditions>
							<ArticleColor>
								<ColorCode></ColorCode>
								<ColorName></ColorName>
							</ArticleColor>
							<ArticleSize>
								<SizeCode></SizeCode>
								<SizeName></SizeName>
							</ArticleSize>
							<IsSerialNumberArticle></IsSerialNumberArticle>
							<IsBatchArticle></IsBatchArticle>
							<ArticleDefinitionClasses>
								<ArticleDefinitionClassesOperation></ArticleDefinitionClassesOperation>
								<Classes></Classes>
							</ArticleDefinitionClasses>
							<ArticleFreeDecimal1></ArticleFreeDecimal1>
							<ArticleFreeDecimal2></ArticleFreeDecimal2>
						</art>
					</ProcessArticle>
				</soap:Body>
			</soap:Envelope>
		</set-body>

And we start running into problems, due to the fact that we are not sending in more data than earlier but the generated XML is much larger; just look at the enormous representation in the GUI:

Unmodified Connector in GUI

Therefore we get some unwanted errors, since there are some implications added when sending in these large, complex structures. Since I can't really do much about the logic, I need to modify the data sent in, but I can modify the WSDL and reimport it. After changing the WSDL to only contain the elements that I needed in my request, it looks a lot better, and the XML message sent now matches the sample:

Here is the modified definition in the WSDL, reduced to the minimal set of elements:

<s:complexType name="ArticleDefinition">
        <s:sequence>
          <s:element minOccurs="1" maxOccurs="1" name="ArticleOperation" type="tns:ArticleOperation" />
          <s:element minOccurs="1" maxOccurs="1" name="ArticleIdentification" type="tns:ArticleIdentificationType" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleNumber" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleName" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleDescription" type="s:string" />
          <s:element minOccurs="0" maxOccurs="1" name="ArticleUnitCode" type="s:string" />
          <s:element minOccurs="1" maxOccurs="1" name="IsStockArticle" nillable="true" type="s:boolean" />
          <s:element minOccurs="0" maxOccurs="1" name="Weight" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="NetWeight" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Volume" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Length" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Width" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="Height" type="s:decimal" />
          <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPallet" type="s:int" />
          <s:element minOccurs="0" maxOccurs="1" name="QuantityPerPackage" type="s:int" />
        </s:sequence>
      </s:complexType>

Sample from the GUI:

Unmodified Connector in GUI

The request can now be sent to the backend without any problems. This approach can be used both to detect problems and to understand the behavior of the Custom Connector, and changing the WSDL can help us use the connector more easily, even if the maintenance is heavier: we need to keep track of these changes and redo them if an updated WSDL is imported.

Summary

As stated, we need to find ways of understanding a resource before being good users of it, so I hope this will help out and give ideas around workarounds for some troublesome scenarios, and that more debug and customization properties are coming along the way. When more power is needed I would advise using API Management for now, until there are more customization properties, but most of the time it works like a charm!

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 


CI with Logic App Custom Connector

I've just had the fortune to work on some projects using the Logic App Custom Connector. It's a lot of fun now, with all the new capabilities, on-prem connectivity and all, and as always we ended up setting up CI deployments via TFS.

That is when I found that the On Prem Data Gateway ARM template is poorly documented and that the template generated from the automation script is missing some important properties. So I thought I'd gather the information needed here and point out some pitfalls along the way.

The schema for the ARM template can be found on GitHub: https://github.com/Azure/azure-resource-manager-schemas/blob/master/schemas/2016-06-01/Microsoft.Web.json.

It's inside the Web schema and not that "easy" to tell that you are in the correct definition, but go down to row 119 and you will find the customApis part; this is good for reference.

So let's see how an ARM template for the Custom Connector looks when complete. (I'm actually adding a fully working connector for https://billogram.com/.) The full representation of the swagger can also be found in my GitHub repository: https://github.com/MLogdberg/LogicAppCustomConnectors/tree/master/Billogram

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "customApis_name": {
      "defaultValue": "Billogram",
      "type": "String"
    },
    "location": {
      "defaultValue": "westeurope",
      "type": "String"
    },
    "serviceUrl": {
      "defaultValue": "https://sandbox.billogram.com/api/v2",
      "type": "String"
    }
  },
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Web/customApis",
      "name": "[parameters('customApis_name')]",
      "apiVersion": "2016-06-01",
      "location": "[parameters('location')]",
      "properties": {
        "connectionParameters": {
          "username": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API USer",
              "description": "The API User for this api",
              "tooltip": "Provide the API User",
              "constraints": {
                "tabIndex": 2,
                "clearText": true,
                "required": "true"
              }
            }
          },
          "password": {
            "type": "securestring",
            "uiDefinition": {
              "displayName": "API Password",
              "description": "The API Password for this api",
              "tooltip": "Provide the API Apssword",
              "constraints": {
                "tabIndex": 3,
                "clearText": false,
                "required": "true"
              }
            }
          }
        },
        "backendService": {
          "serviceUrl": "[parameters('serviceUrl')]"
        },
        "swagger": {
          "swagger": "2.0",
          "info": {
            "description": "The Billogram API is built according to RESTful principles, which means it uses HTTP as an application protocol rather than just as a transport layer for a custom protocol, like SOAP does. In other words, HTTP features such as the various request types (GET, PUT, POST, DELETE), response codes (403 Forbidden, 404 Not Found, 500 Internal Server Error) and standard headers (Accept, Authorization, Cache-Control) are a part of the API.",
            "version": "1.0.0",
            "title": "Swagger Invoice/Billogram",
            "termsOfService": "http://swagger.io/terms/",
            "contact": {
              "email": "billogram@billogram.se"
            }
          },
          "host": "sandbox.billogram.com",
          "basePath": "/api/v2",
          "schemes": [
            "https"
          ],
          "consumes": [],
          "produces": [],
          "paths": {
            ...removed for simplicity display...
              }
            }
          },
          "parameters": {},
          "responses": {},
          "securityDefinitions": {
            "basic_auth": {
              "type": "basic"
            }
          },
          "security": [
            {
              "basic_auth": []
            }
          ],
          "tags": [
            {
              "name": "Billogram Invoice",
              "description": "Handling Billogram/Invoices",
              "externalDocs": {
                "description": "Billogram structure",
                "url": "https://billogram.com/api/documentation#billogram_structure"
              }
            }
          ],
          "externalDocs": {
            "description": "Find out more about Swagger",
            "url": "http://swagger.io"
          }
        },
        "description": "My ARM deployed Custom Connector",
        "displayName": "[parameters('customApis_name')]",
        "iconUri": "/Content/retail/assets/default-connection-icon.6296199fc1d74aa45ca68e1253762025.2.svg"
      },
      "dependsOn": []
    }
  ]
}

Make sure to add both the swagger and the backendService object, since the URL specified in serviceUrl is mandatory and is the URL that the Custom Connector uses at runtime, not the one specified in the swagger; this is good for easier management of dev/test/prod environments.
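As a sketch, deploying the template with Azure PowerShell could look like this (the resource group and file name are hypothetical; template parameters can be passed as dynamic parameters):

New-AzureRmResourceGroupDeployment -ResourceGroupName "connectors-rg" `
    -TemplateFile ".\billogram-connector.json" `
    -customApis_name "Billogram" `
    -serviceUrl "https://sandbox.billogram.com/api/v2"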

The Custom Connector is a great add-on to Logic Apps, and being able to easily manage and deploy them is crucial. So even if it's great to be able to manage my Custom Connector via ARM, I'm still missing the possibility to reference my swagger via a URL, since that would be the most suitable way of deploying and keeping the ARM template and the swagger separated.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Custom Connector 


January update Logic App Template Creator

A new year comes with new updates. As you might know, there is now a Custom Connector option for Logic Apps, and earlier this had some changes in the resource id handling compared with the managed APIs. This has now been updated and fixed.

There were also some issues regarding names that could not contain . (dot) and other special characters; these have been fixed, and a new way of handling resource ids now prevents problems related to naming restrictions.

The last part added was an option to save the incoming body of the requested resources. This was done to make debugging easier when developing; thanks to this, and a mock resource collector class that can replay the saved files, we can now create full test runs that replay the whole collection of resources.

The GitHub repository is found here: https://github.com/jeffhollan/LogicAppTemplateCreator

Updates

  • Now supports extraction of Logic Apps containing Custom Connectors, generating working, importable Connection resources with the Logic App.
  • Updates to resource id handling: resource groups and resource names with . (dot) and other special characters that previously were not supported can now be handled.
  • Added a resource handler able to print out the incoming requests, making it possible to build complete test suites with full replay of an extraction of a Logic App.

Posted in: | LogicApps  | Tagged: | Logic Apps  | ARM Template  | Logic Apps Template Generator 


Logic App Custom Connector via On Prem Gateway

In an ever-growing hybrid world, the need to expose services hosted on-premises is growing, and so are the limitations and requirements. The newest way of exposing HTTP-based services in Logic Apps is via the Logic App Custom Connector through the On Premise Data Gateway.

Previously, when we had the task of calling on-premise, firewall-protected services from our Logic Apps, it wasn't possible without firewall openings or other resources to help with the connection.

Task Overview

But now it's possible, so let's solve it with the newly updated On Premise Data Gateway and a Logic App Custom Connector.

Prerequisites

Install the Gateway

First of all, make sure you have the latest version installed; follow the install instructions found here.

Remember the name of the installed gateway if you have many.

Create the API

In this scenario I will expose a swagger API endpoint: the simple standard Values API that is created by default in a new Web App project of type API App. After enabling Swagger we can browse the swagger definition, as follows:

Values Controller Swagger

Copy the URL to the swagger definition, shown in the middle box in the image above (http://localhost:51921/swagger/docs/v1), and save the content as a text file. (The simplest way is to browse to it and copy the swagger definition into a text file.)
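For reference, the saved swagger file for the default Values API looks roughly like this (a sketch; the title and operation ids depend on your project):

{
  "swagger": "2.0",
  "info": { "version": "v1", "title": "ValuesApi" },
  "host": "localhost:51921",
  "schemes": [ "http" ],
  "paths": {
    "/api/Values": {
      "get": {
        "operationId": "Values_Get",
        "produces": [ "application/json" ],
        "responses": {
          "200": {
            "description": "OK",
            "schema": { "type": "array", "items": { "type": "string" } }
          }
        }
      }
    }
  }
}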

Create Custom Logic App Connector

Now we are prepared and can start by creating the Logic App Custom Connector based on the swagger file; note that it's also possible to use WSDL files as a source.

In the Azure Portal create a Logic Apps Custom Connector:

Azure Portal Create Logic Apps Custom Connector

No definition is added at creation time, so we will have to edit the Custom Connector once it's created. Start editing the newly created Custom Connector and let's add the swagger definition:

Go to Resource

Press the Edit button.

Press the edit button

Upload the swagger file: Upload swagger file

Now the definition is generated based on the swagger file. Scroll down and make changes as you see fit, but don't forget to check the "Connect via on-premises data gateway" checkbox and, if needed, change the URLs to match the server exposing the service (I'll just keep them the same, since it's on the same machine for me).

Check the checkbox for on-premise data gateway

Now press Continue and move on with the process. Once you are finished, make sure to update the Logic Apps Custom Connector by pressing the check mark at the top:

Save the updates

After the Logic Apps Custom Connector is saved we can use it in our Logic Apps, so let's create a Logic App and start using our newly created Custom Connector. (Make sure to create the Logic App in the same location as your on-premises data gateway is installed in, otherwise it won't be available in the Logic App.)

Inside the Logic App we can now find our Custom Connector among all the others, and if I start searching for it I can easily select the appropriate action I want to use:

Logic App browse the Custom Connector

After I pick my operation (I just took Get All), I need to create the API Connection that will be connected to my Custom Connector; here is where I can pick the authentication method and also the gateway I want to use.

Logic App create the API Connection for the Custom Connector

Pressing Create will create the API Connection, and the Logic App is complete for this sample. Let's save the Logic App and test run it; the result is that we get the "values" returned, as promised:

Logic App run result

The flow is now complete. Since we have installed an On Premise Data Gateway on the machine hosting our web service, we can use the Logic App Custom Connector definition in our calls to our on-premise web service. When creating the Logic App we create an API Connection based on the Logic App Custom Connector; it holds the credentials and gateway information used when executing the Logic App run, and we can now connect to our firewall-protected on-premise web service!

Overview result

Summary:

I think this is an amazing new feature that we have really wanted for a long time. It enables much simpler exposure of on-premises services, in a secure and reliable way. No need for firewall openings, VPNs, express routes or reverse proxies in a DMZ; we can just install the gateway and use it as our entry point.

Remember, this is still in preview and I've encountered some issues when working with SOAP, so some fixes are needed, but it looks good so far!

Posted in: | LogicApps  | Tagged: | Logic Apps  | Integration  | Hybrid 


Centralize secrets in Azure Key Vault

When working with usernames, passwords or API keys, these need to be stored in a secure and manageable way. Usually I find that they are added to Application Settings and manually handled in several places. This is not a desirable way of working, and it may look something like this, with secrets spread out in all the areas circled in red:

Normal setup

The first step is to centralize the values, and there I find that Azure Key Vault is a superb place for storage. We get RBAC support for granular security, making it possible for a developer to access the values via code, or when deploying via ARM template, but not to see or edit them. With this we can make sure that passwords, usernames and other secrets are not spread via email, stored in Dropbox or other "safe" places when distributed to the developers to add them to App Settings or store them in parameter files for our ARM templates. It also adds reusability: if a value is needed in a Function and in a Logic App, we can make sure it's the same value that is used, and we can manage it in one place.

So let's take this sample in our Key Vault, where sqluser and sqlpassword should be used both in a Function and when creating a Logic App connector.

Key Vault sample secrets

Finding the Key Vault resource id: Key Vault resource id

ARM Deployment Logic App Connector

Using Key Vault with ARM deployment requires that the Key Vault has enabled access for Azure Resource Manager template deployment, and that the principal (AAD Application) used when deploying has permission to access the secrets.

Enable ARM Deployment with Key Vault
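Hypothetically, enabling both parts via PowerShell could look like this (the vault name matches the sample resource id below; the service principal id is a placeholder for the AAD Application used when deploying):

Set-AzureRmKeyVaultAccessPolicy -VaultName "ibizmalotest" -EnabledForTemplateDeployment
Set-AzureRmKeyVaultAccessPolicy -VaultName "ibizmalotest" -ServicePrincipalName "<deployment principal appId>" -PermissionsToSecrets get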

After this, secrets are accessible via the ARM template parameter file (only from the parameter file or from a wrapping ARM template). Here is a sample of how it would look in a parameter file:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": {
      "value": "INT001-SQL"
    },
    "location": {
      "value": "West Europe"
    },
    "sqlusername" :{
      "reference": {
        "keyVault": {
          "id": "/subscriptions/fake-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/testenvironment/providers/Microsoft.KeyVault/vaults/ibizmalotest"
        },
        "secretName": "sqluser"
      }
    },
	"sqlpassword" :{
      "reference": {
        "keyVault": {
          "id": "/subscriptions/fake-a4af-4bc9-a733-f88f0eaa4296/resourceGroups/testenvironment/providers/Microsoft.KeyVault/vaults/ibizmalotest"
        },
        "secretName": "sqlpassword"
      }
    }
  }
}

When doing a resource group deployment, the value is collected from Key Vault at deployment time and then stored in the API Connection. (Remember, deployment time means that if the value is changed, a new deployment is needed to reflect the change in the API Connection.)
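A sketch of such a deployment, with hypothetical template and parameter file names matching the INT001-SQL sample above:

New-AzureRmResourceGroupDeployment -ResourceGroupName "testenvironment" `
    -TemplateFile ".\INT001-SQL.json" `
    -TemplateParameterFile ".\INT001-SQL.parameters.json"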

Now secrets are stored in Key Vault, and the developer setting this up doesn't know the actual values in test/prod, just what the secrets' names are. The deployment flow looks as follows: during deployment the secrets are collected from Key Vault (1) and used when creating/updating the API Connection (2). This also means that no secrets are stored in the ARM template parameter file, and that is great!

ARM Deployment with Key Vault

IMPORTANT: if the value in Key Vault is changed, a new deployment is required to update the API Connection.

Accessing Secrets from Functions/Web Apps or other C# applications

Any C# application can easily use the Key Vault SDK to retrieve secrets from Key Vault at runtime, but we need a few things in order to make this possible: from code we need an AAD Application in order to be granted access to the secrets. I will not go through how to create one here; read more here

When it's created, we need to get the Application ID (clientId) and one key (clientSecret).

Get AAD Application information

Now that we have collected the information needed, we also need to make sure that our AAD Application has access to the secrets. That is something we add to the Key Vault under "Access Policies":

Set Access Policies

Press "Add new" and select the principal (the AAD Application), in my case KeyVaultApp. Under Secret Permissions we only want and need Get; no more privileges should be given to the policy.

Set Access Policies Settings

Now to start coding we need some NuGet packages:

  • Microsoft.Azure.KeyVault
  • Microsoft.IdentityModel.Clients.ActiveDirectory

And a bit of code, sampled below. For more in-depth information about how to use this in Functions, read more at Jeff Hollan's blog post:

Code to retrieve the secrets:

// Usings needed for this sample (HTTP-triggered Function v1, Key Vault SDK and ADAL):
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class KeyVaultFunction
{

    private static string clientID = Environment.GetEnvironmentVariable("clientId", EnvironmentVariableTarget.Process);
    private static string clientSecret = Environment.GetEnvironmentVariable("clientSecret", EnvironmentVariableTarget.Process);       
    private static string keyvaultname = Environment.GetEnvironmentVariable("keyvaultname", EnvironmentVariableTarget.Process);    

    [FunctionName("KeyVaultFunction")]
    public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        string secretUri = $"https://{keyvaultname}.vault.azure.net/secrets/";

        var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));
        log.Info("Token aquired");
        var sqlusernameSecret = await kv.GetSecretAsync(secretUri + "sqluser");
		var sqlpasswordSecret = await kv.GetSecretAsync(secretUri + "sqlpassword");
		
		//do some code here

        return req.CreateResponse(HttpStatusCode.OK, "SQL credentials collected ok");
    }

    public static async Task<string> GetToken(string authority, string resource, string scope)
    {
        var authContext = new AuthenticationContext(authority);
        ClientCredential clientCred = new ClientCredential(clientID, clientSecret);
        AuthenticationResult result = await authContext.AcquireTokenAsync(resource, clientCred);

        if (result == null)
            throw new System.Exception("Failed to obtain the JWT token");

        return result.AccessToken;
    }
}

First the settings are added to the Function App Application Settings:

private static string clientID = Environment.GetEnvironmentVariable("clientId", EnvironmentVariableTarget.Process);
private static string clientSecret = Environment.GetEnvironmentVariable("clientSecret", EnvironmentVariableTarget.Process);       
private static string keyvaultname = Environment.GetEnvironmentVariable("keyvaultname", EnvironmentVariableTarget.Process);
private static string secretname = Environment.GetEnvironmentVariable("secretname", EnvironmentVariableTarget.Process);

To handle the authentication we need a method to send in to the AuthenticationCallback object. It takes three parameters: authority is the login authority, i.e. "https://login.windows.net/1cb87777-3df4-428b-811a-86a0f215cd35"; resource is the resource we are trying to access (Key Vault): "https://vault.azure.net"; and scope is the scope we are accessing with, normally an empty string.

public static async Task<string> GetToken(string authority, string resource, string scope)
{
    var authContext = new AuthenticationContext(authority);
    ClientCredential clientCred = new ClientCredential(clientID, clientSecret);
    AuthenticationResult result = await authContext.AcquireTokenAsync(resource, clientCred);

    if (result == null)
        throw new System.Exception("Failed to obtain the JWT token");

    return result.AccessToken;
}

Then the part of the code that actually collects the secret is the following: we need the URI to the secret, which we compose from the Key Vault name and the secret name, and the GetToken function to send in to the AuthenticationCallback object.

string secretUri = $"https://{keyvaultname}.vault.azure.net/secrets/";
var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetToken));
var sqlusernameSecret = await kv.GetSecretAsync(secretUri + "sqluser");

After this setup is completed, we are collecting the secrets from Key Vault at runtime; therefore there are no more secrets stored in application settings or in code, and we have reused the same secrets in different areas.

Secrets accessed from KeyVault in code

Summary:

I really like removing usernames and passwords from code and settings, since they are always a mess to manage and handle, and there are also a few security aspects to consider. Making it possible to get usernames, passwords or other secrets from Key Vault inside your Function or Web App, and at the same time use them when deploying ARM templates, like with Logic App API Connections or API Management, is the real win, since we now only need to manage these in one place and can use them wherever needed.

In the end we are creating a solution like the image below, which gives us a centralized place for our secrets; we only spread them if we really need to, like to an API Connection, and that is done integrated with ARM.

Secrets centralised in KeyVault

Note! If we access Key Vault via code, an update is reflected immediately, while when it's used via ARM a new deployment is needed for the new value to be deployed to e.g. the password of an API Connection, making it vital to have good build and release processes.

Posted in: | Key Vault  | Tagged: | Key Vault  | Integration  | Security 


Azure Functions npm install via VSTS

When setting up build and release cycles for different Azure resources we bump into different challenges; this post will cover the challenge of running npm install of Node modules in a Function App when doing deployments.

When we are working with our development environment it's okay to go in and run Kudu commands and so on to make sure our Function App has the correct packages installed, but as we add speed and agility to our processes, our release pipelines in VSTS have to automate more of these steps. It's not that it's hard to go into Kudu and run the command, but it's a manual step that takes time and knowledge when doing a release; I prefer to have everything prepped so that I know things are working when the release is installed.

So let's look into how we can run Kudu commands from VSTS. Since Kudu has a REST API, we can use it for these kinds of tasks; read more on it here: Kudu Rest API

Looking into the API documentation, we easily find a function for executing commands:

POST /api/command
{
    "command": 'echo Hello World',
    "dir": 'site\\repository'
}

This means that we could create a message like this for an npm install of the request package:

{
    "command": "npm install request",
    "dir": "site\\wwwroot"
}

So let's figure out the rest: how do we execute this from VSTS?

As good as it gets, this can be executed via PowerShell, explained and sampled at the bottom of the API reference (I changed the sample in the following code snippet to execute the command for npm install):

$username = "`$website"
$password = "pwd"
# Note that the $username here should look like `SomeUserName`, and not `SomeSite\SomeUserName`
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install ' + $npmpackage + '","dir": "site\\wwwroot"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"

The credentials are now needed, and that is somewhat messy, but since they are the same as the deployment credentials we can add a command to get that information via the Invoke-AzureRmResourceAction cmdlet:

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

So with this information we can now build a more generic script that only needs three (3) parameters to execute an npm install command on our Function App:

  • functionAppName: the name of the Function App
  • resourceGroup: the name of the resource group that contains the Function App
  • npmpackage: the npm package to install

param([string] $functionAppName, [string] $resourceGroup, [string] $npmpackage)

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

$username = $creds.Properties.PublishingUserName
$password = $creds.Properties.PublishingPassword


$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install ' + $npmpackage + '","dir": "site\\wwwroot"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"

Now all we have left is to execute this script in an Azure PowerShell task that installs the npm package during the release. The image below shows the release setup, and as you can see the parameters are added in the "Script Arguments" input area; an example is shown after the image. I've also added the script to a shared repo and linked the build setup to it, to be able to share the script and manage it in one place.

VSTS Release Setup
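As an example, the "Script Arguments" input could look like this (the VSTS variable names are hypothetical):

-functionAppName "$(FunctionAppName)" -resourceGroup "$(ResourceGroup)" -npmpackage "request"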

If you are using package.json files, we can make sure all packages are installed via the npm install command. Let's see how this could look with a package.json file:

package.json file in project

Modifying the PowerShell script will then give us the following:

  • functionAppName: the name of the Function App
  • resourceGroup: the name of the resource group that contains the Function App
  • functionfolder: the name of the folder that the function is in (same name as the function)

param([string] $functionAppName, [string] $resourceGroup, [string] $functionfolder)

$creds = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroup -ResourceType Microsoft.Web/sites/config `
            -ResourceName $functionAppName/publishingcredentials -Action list -ApiVersion 2015-08-01 -Force

$username = $creds.Properties.PublishingUserName
$password = $creds.Properties.PublishingPassword

$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))

$userAgent = "powershell/1.0"
$apiUrl = "https://$functionAppName.scm.azurewebsites.net/api/command"
$command = '{"command": "npm install","dir": "site\\wwwroot\\'+ $functionfolder +'"}'

Invoke-RestMethod -Uri $apiUrl -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -UserAgent $userAgent -Method POST -Body $command -ContentType "application/json"

Summary:

I like to automate these tasks since it gives me an "easier" and more reliable release. But as of now it's hard to verify that the package is installed, or whether it was installed previously, so a verification step checking that the function is loaded correctly might be needed.

Recommended is to use the package.json approach, since it shows the dependencies and allows you to easily add new packages; but keep in mind that it also adds some delay when running the npm install command, since it will install a lot of packages the first time.

Posted in: | Azure Functions  | Tagged: | Azure Functions  | Integration  | Deployments