Category Archives: Function Apps

Multi-environment deployments for Compiled C# Azure Functions with VSTS Release Management

This post covers an approach you can use to deploy compiled C# Functions using the tooling available in Visual Studio 2017 and various Build and Release Management Tasks contained in Visual Studio Team Services (VSTS).

Note that this post discusses deploying to the v1 Functions runtime platform.

I was lucky enough to speak with Damian Brady on the DevOps Labs show on Channel 9 and cover the first part of this blog. If you’ve watched that, or you’ve come here via the Github repository for the solution we used, then we’ll go down to the next level and really look at how you can recreate this setup in your environment.

1. Pre-requisites

There are a few moving pieces we need to get into place first in order to complete the configuration. Let’s take a look at those.

a. Connecting environments

Note that in order to complete these steps you may require elevated privileges in one or more of the mentioned services. If you do not have access to an admin-level account in any of the services you will likely need to ask someone to configure these for you.

> Github to VSTS

In our demonstration we’re using a Github repository as our source repository and allowing VSTS’ Build capability to perform Continuous Integration Builds when a commit occurs on the master branch. Microsoft has documented how to configure Github to connect with VSTS already, so go and take a read and head back here when you’re done setting up the integration.

> VSTS to Azure

We use the Azure Resource Manager Service Endpoint option in VSTS to configure our connection into Azure from VSTS. There is documentation from Microsoft around how you set up the connection, including steps to create a custom Service Principal (or use a pre-existing one your Azure admin has created for you). Once again, go have a read and once you have a working Service Endpoint in VSTS head back over here.

b. Configure SendGrid

If you’d like to run the Functions once deployed you will need to configure SendGrid so you can use the binding in the Functions being deployed. You can follow the official Azure documentation on setting up a (free) SendGrid account and then make sure to set the API key value for the AzureWebJobsSendGridApiKey App Setting for your deployed Functions.

c. Create target Azure Resources

Go ahead and also create a Function App in the Subscription you want to deploy to (ensure that the Service Principal you setup previously has Contributor-level access to, at minimum, the Resource Group that will contain the Function).

You can use the process we document here to deploy to either a Service Plan or Consumption Plan, though there is a minor difference we will see later in the post.

The Azure resources to deploy should include:

  • Application Insights
  • Cosmos DB Account – Add Database “quotedemo” with Collections “quotes” and “leases”
  • Function App (can be Consumption or Service Plan)
  • Storage Account (can be created at same time as Function App)

Before we move on, make sure to capture the following:

  • Application Insights Telemetry Key (shown on the ‘Essentials’ part of the Application Insights instance)
  • Cosmos DB Account URL and Access Key
  • Function App:
    • AzureWebJobsStorage (Service Plan);
    • WEBSITE_CONTENTAZUREFILECONNECTIONSTRING (Consumption Plan);
    • WEBSITE_CONTENTSHARE (Consumption Plan);
    • FUNCTIONS_EXTENSION_VERSION (most likely set to “~1”).

d. Configure Application Insights for Release Annotations

In this scenario we will need to enable the Application Insights API in order to support Release Annotations, which make it possible to see new release markers on timelines.

Once again, Microsoft has this well documented, including the Task you need to add in VSTS to allow you to create Annotations.

OK! Now we are ready to configure the Build and Release Management steps in VSTS.

2. Configure the Build

In the Team Project in VSTS that you wish to use to host the Build and Release Management Definitions, go ahead and create a new Build.

Remember to select Github as your source repository.

Setup Github as source

When you are prompted for a template, select the ASP.Net Core (.Net Framework) build.

Select Build Template

If you want a Continuous Integration (CI) build then ensure you set the Trigger as required.

CI trigger

At this point we have a build that produces a packaged web application that can be pushed to the Azure App Service hosting the Function App. We could add more Tasks to the Build to do this, but we want to support multiple environments so this is where Release Management comes into play.

I recommend you run the Build to ensure it’s functional and to produce an artefact we can use in our next step.

3. Configure Release Management

Now that we have a build that produces an artefact, we can use VSTS Release Management (RM) to deploy and configure that artefact into any environment we can reach.

Let’s go ahead and choose to create a new Release Management Definition. When you have the option, select the “Azure App Service Deployment” template.

Release Management Template

The resulting Definition will be very vanilla and contain a single Task. We need to make some changes to deploy our service exactly as we’d like.

a. Add the Build Artefact

First we need to tell Release Management what we want to deploy, so let’s go ahead and add our existing Build by clicking on the Artefacts box and selecting our Build as shown below.

Add Artefact

If you wish to enable Continuous Deployment (CD) into Environments you can click on the lightning bolt on the Artefact and enable the CD trigger. Note that you can still stop automated deployments by putting in approvals or making deployments manual – by creating a Release you always have a build artefact to deploy.

CD Trigger

b. Configure RM Tasks

This is where the configuration you completed earlier (the Azure Service Endpoint and the target resources in Azure) comes into play.

Click on the Tasks tab and the RM Definition will open.

Clicking Task

Once open click on the Environment at the top of the Task list. As we are going to deploy all assets into a single Subscription we can set up a few items that will apply to all Tasks in the RM Definition.

The first thing we will do is to select the Service Endpoint we previously setup (below it is named “Service Principal for Demo”, but you can name it anything meaningful).

Setup Environment

Once you select the method of connection to the Azure Subscription, change the “App type” field to be “Function App” and then, from the final picker, select the Function App instance you set up earlier. If you don’t see it, it could be that you placed it in another Subscription or that the Service Endpoint does not have sufficient rights to list the Function Apps in the Subscription.

Your setting should look something like the below.

Environment Configuration

We could deploy the sample code now, but it would fail to run because it is missing configuration.

c. Deploying configuration

You will notice that up until now we’ve not dealt with any of the runtime configuration settings for the Function. When you develop locally the Functions Tools in Visual Studio will generate a “local.settings.json” file, but it will be blocked from commit via the gitignore included in the project type. It’s recommended you don’t change this, and even if you do it won’t help you on deployment anyway (so… y’know, why bother to change the ignore file?)
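For reference, the generated file looks something like the sketch below – the values here are placeholders only, and your local copy will hold your own development connection strings:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "CosmosConnection": "<local Cosmos DB emulator connection string>",
    "AzureWebJobsSendGridApiKey": "<your SendGrid API key>"
  }
}
```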

For this Task we are going to need to pull in a free Marketplace Task – the Azure WebApp Configuration from Pascal Naber (Xpirit). This Task is a wrapper around some Azure Cmdlets, but it does a great job of removing your overhead in managing that 🙂

You will need to be a VSTS admin in order to install Marketplace Tasks (if you aren’t you can still request an admin to install them).

Once installed you can now add the Task to your Release Management Definition after the App Service Deployment (as shown below).

Release Tasks

The Task expects any App Settings you need to deploy to be added as Variables to the Definition. So, for example, if we want to control the value we set for the ‘APPINSIGHTS_INSTRUMENTATIONKEY’ App Setting in our target Function we would create a Variable in our Release called ‘appsetting.APPINSIGHTS_INSTRUMENTATIONKEY’ and set the value to be the Telemetry Key we captured earlier in the post.

The beauty of this approach is that you can one-way save secrets and they won’t show up (or be recoverable) via the Variables tab again. There is also an option to write them to Azure Key Vault if you want.

Below is a sample of the Variables once setup.

RM Variables

The eagle-eyed amongst you might spot that I am also setting default Function values (FUNCTIONS_EXTENSION_VERSION, AzureWebJobsStorage, AzureWebJobsDashboard for a Service Plan).

This is because I force the Application Settings Task to overwrite all existing values in the App Service. This is on purpose – it ensures no manual fixes are ever safe in Azure and our Release Management Definition is the source of truth for both the Artefact and the Configuration.

Note: for Consumption Plans make sure you set WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and WEBSITE_CONTENTSHARE in addition to the above values. If you don’t, deployments after the first one will fail (this is the source of the error in the video).

The full set of Variables for our sample is listed below.

Common Variables

  • AppInsightsApiKey – API key you configured earlier for Application Insights (*not* Telemetry Key).
  • AppInsightsApp – Application ID configured earlier for Application Insights (also not Telemetry Key).
  • appsetting.APPINSIGHTS_INSTRUMENTATIONKEY – use telemetry key from Application Insights.
  • appsetting.AzureWebJobsDashboard – use existing value from Function (before first deployment).
  • appsetting.AzureWebJobsSendGridApiKey – use SendGrid API key you setup earlier (should start ‘SG.’).
  • appsetting.AzureWebJobsStorage – use existing value from Function (before first deployment).
  • appsetting.CosmosConnection – use Connection String from Cosmos Account you setup earlier.
  • appsetting.FUNCTIONS_EXTENSION_VERSION – use existing value from Function (before first deployment).
  • appsetting.NotificationsSender – use an email address you control (to be used as From: in emails).

Service Plan deployment

  • None: above list is all you need

Consumption Plan deployment

  • appsetting.WEBSITE_CONTENTAZUREFILECONNECTIONSTRING – use existing value from Function (before first deployment).
  • appsetting.WEBSITE_CONTENTSHARE – use existing value from Function (before first deployment).

Once you’ve configured the Variables you should now be able to save the Release Management definition and create a Release to test out your deployment.

I’ve recorded a quick video (see below) that shows this end-to-end and also has an additional bonus step of hitting an HTTP Triggered endpoint on the Function as a post-deployment confirmation step (you will need to copy a Host key from the Function App, save it as a Variable named ‘VersionApiKey’ to use when calling the API, then add the Smoke Web Test Task from the Marketplace).

So what should the demo Function do? It should trigger an email to a recipient when a record is added to Cosmos DB. The recipient is listed in the record that is inserted, samples of which are included in the Github project. If you can’t get it running make sure to leave a comment and I’ll help you out!
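If you’re curious what such a Function looks like, the compiled C# sketch below gives the general shape. Note this is a hypothetical illustration rather than the actual code in the repository: the ‘recipientEmail’ property name is assumed, and depending on the version of the SendGrid extension you use the message type may be SendGridMessage rather than Mail.

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using SendGrid.Helpers.Mail;

public static class QuoteNotifier
{
    [FunctionName("QuoteNotifier")]
    public static void Run(
        // Fires when documents are added or changed in the 'quotes' collection.
        [CosmosDBTrigger("quotedemo", "quotes",
            ConnectionStringSetting = "CosmosConnection",
            LeaseCollectionName = "leases")] IReadOnlyList<Document> newQuotes,
        // SendGrid output binding - picks up the AzureWebJobsSendGridApiKey App Setting.
        [SendGrid] ICollector<Mail> messages,
        TraceWriter log)
    {
        foreach (var quote in newQuotes)
        {
            // 'recipientEmail' is an assumed property name on the inserted record.
            var recipient = quote.GetPropertyValue<string>("recipientEmail");

            var mail = new Mail(
                new Email(System.Environment.GetEnvironmentVariable("NotificationsSender")),
                "A new quote has been added",
                new Email(recipient),
                new Content("text/plain", $"Quote {quote.Id} was added to Cosmos DB."));

            messages.Add(mail);
            log.Info($"Queued notification email for {recipient}.");
        }
    }
}
```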


Speaking: Azure Functions at MUG Strasbourg – 28 September

I’m really excited about this opportunity to share the power of Azure with the developer and IT Pro community in France that is soon to gain local Azure Regions in which to build their solutions.

If you live in the surrounding areas I’d love to see you there. More details available via Meetup.


A Year with Azure Functions

“The best journeys answer questions that in the beginning you didn’t even think to ask.” – Jeff Johnson – 180° South

I thought with the announcement at Build 2017 of the preview of the Functions tooling for Visual Studio 2017 that I would take a look back on the journey I’ve been on with Functions for the past 12 months. This post is a chance to cover what’s changed and capture some of the lessons I learned along the way – hopefully you can take something away from them too.

In the beginning

I joined the early stages of a project team at a customer and was provided with access to their Microsoft Azure cloud environment as a way to stand up rapid prototypes and deliver ongoing solutions.

I’ve been keen for a while to do away with as much traditional infrastructure as possible in my solutions, primarily as a way to make ongoing management a much easier proposition (plus it aligns with my developer sensibilities). Starting with this environment I was determined to apply this where I could.

In this respect the (at the time) newly announced Function Apps fit the bill!

The First Release

I worked on a couple of early prototype systems for the customer that leveraged Azure Functions in their pre-GA state, edited directly in the Browser. I’m a big C# fan, so all our Functions have been C# flavoured.

After hacking away in the browser I certainly found out how much I missed Visual Studio’s Intellisense, though my C# improved after a few years away from deep use of the language!

We ran some field tests with the system, and out of this took some lessons away.

I blogged about Functions with KeyVault and also on how to deliver emails using Functions with SendGrid.

At this point I also learnt a key lesson: Functions are awesome, but may not be best suited to time-critical operations (where the trigger speed is measured in milliseconds or low seconds). This was particularly the case with uncompiled Function Apps (which at the time was the only option).

I wrote a Function to push a synthetic transaction into our pipeline periodically to offset some of this behaviour. Even after multiple revisions we’re still doing this for certain activities such as retrieving and caching Graph API access tokens.

Another pain point for us at this juncture was the lack of continuous deployment options. Our workaround was a basic copy / paste into a Web project held in a Git repository in VSTS.

Around the end of our initial field trials Functions hit GA which was perfect for us!

The Second Release

I now had a production-ready system running Functions at its core. At this stage we had all our Functions in a single App because the load wasn’t sufficient to break them out.

Our service was yet to launch when the Functions team announced a CI / CD experience using the ‘Deployment Options’ setting in the Azure Portal.

We revved to a second release, using the new CI / CD deployment option to enforce deployments to a production App Service Plan. We were still living with on-demand compilation which we found impactful at deployment time as everything had to be restored and recompiled.

The Third Release (funproj all round!)

Visual Studio 2015 tooling was announced!

Yep, we used it – welcome to funproj Revision (3) of the codebase!

About now, one of my earlier blogs on sending email came back to haunt me as I began to (and still do periodically) receive emails from people clearly trying out my sample code!

Ah… the dangers of writing demonstrations that have your real email address in the To: field.

One item we’ve done with each major revision of the Functions code is to publish a new App Service Plan and use this as our deployment target. We parameterise as much as we can so we can pre-populate things like Service Bus connection strings. Our newly deployed Functions are disabled and then manually cut over one at a time until we’re satisfied that the new deployment is in a happy state. This n-1 Function App will live for a couple of weeks after initial deployment with its Functions disabled. This was all decided before deployment slot support, which may change this design 🙂

We have a Function we can run that checks that all the required environment variables are set up in a new App Plan – hopefully in future we’ll get this automated away as well!

Additionally, if we have a major piece of code changing, but not sufficient for a major revision, we’ll do an Azure-level backup of the App before we run the deployment so we have a solid recovery point to return to if need be.

Function runtime versioning

We had one particularly bad day where our Functions started playing up out-of-the-blue. It turned out that an unintended breaking change had been made by the Functions team when they did a new release.

The key learning from this experience was to make sure our production Function Apps always run a known-good version of the runtime, pinned by setting FUNCTIONS_EXTENSION_VERSION to a specific version such as ‘1.0.10576’ rather than ‘~1’ (which means ‘latest non-breaking v1 release’).

We run our development App Services using ‘~1’ however, so we can periodically decide to update our production runtime to the latest known-good version and not lag too far behind the good work the Functions team is doing.

Our other main issue during this revision was that our App Plan stopped being able to read the x509 cert store for some reason and our KeyVault client started failing. I never got to the bottom of it, but it was fixed through deploying to a new Plan.

Talking Functions

I was lucky enough to get to talk a bit about the solution we’d been building in early February.

The video quality varies, but here it is if you’re interested.

Hello Compiled Functions… oooooh Application Insights! (Rev 4!)

Compiled Functions? Sure, why not?!

The benefit here is the reduced startup times produced by the pre-compiled nature of the solution. The only downside we’ve found is deployments can cause transient thread-abort exceptions in some libraries, but these are not substantial and we deal with them gracefully in our solution.

As an example, we had previously seen Function cold start times of up to 15 seconds. We now rarely see any impact, partly as a result of lessons we’ve learned along the way, coupled with the compiled codebase and the hard work the Functions team is doing to continuously improve the platform.

Early on in the life of our solution we had looked at Application Insights as a way to track our Function App performance and trace logging. We abandoned it because we ended up sometimes writing more App Insights code than app code! The newly introduced App Insights support, however, does away with all of this and it works a treat:

Functions App Insights

Unfortunately as part of the move here we also dropped Visual Studio 2015 support, which meant we dropped the ‘funproj’ project layout. As Compiled Functions now give us local design-time benefits we didn’t get before, it’s a good trade-off to my mind.

So… revision 5 anyone?!

We’re not there yet, though once the Visual Studio 2017 tooling hits GA we’ll probably look seriously at moving to it.

Ideally we’ll also move to a more fully testable solution and one that supports VSTS Release Management which we use for many other aspects of our overall solution.

Gee… sounds like a lot of work!

This might sound like a lot of busy work, but to be honest, the microservices we’re running using Functions are so small as to be easy to port from one revision to the next. We make conscious decisions around the right time to move, when we feel that the benefits to be realised are worth the small hit to our productivity.

Ultimately our goal is as shown below (minus the flames, of course!)

That’s a wrap!

The Functions team is open in their work – ask them questions on Stack Overflow http://stackoverflow.com/questions/tagged/azure-functions

Raise your issues and submit your PRs here: https://github.com/Azure/Azure-Functions

Happy Days 🙂


Azure Functions: Build an ecommerce processor using Braintree’s API

In this blog I am continuing with my series covering useful scenarios for using Azure Functions – today I’m going to cover how you can process payments by using Functions with Braintree’s payment gateway services.

I’m not going to go into the details of setting up and configuring your Braintree account, but what I will say is that the model you should be applying to make this scenario work is one where your Function plays the Server role as documented in Braintree’s configuration guide.

My sample below is pretty basic, and for the Function to be truly useful (and secure) to use you will need to consider a few things:

  1. Calculate total monetary amount to charge elsewhere and pass to the Function as an argument (my preference is via a message on a Service Bus Queue or Topic). Don’t do the calculation here – make the Function do precisely one thing – post a payment to Braintree and handle the response.
  2. The payment method token (or nonce) should also come from upstream from an end-user’s authorisation to pay. This is at the core of how Braintree securely processes payments and you should understand it by reading the Braintree documentation.
  3. Don’t, whatever you do, put your Braintree configuration details inline as plain-text. In all environments I always use Azure Key Vault for these sorts of details, coupled with my Key Vault client code*.

Note: I did get contacted by someone who advised that heavy workloads resulting in lots of calls to Key Vault will most likely result in port exhaustion on your Key Vault and your calling code receiving errors. You should consider this in your design – it’s not something I have had to work around just yet and I do have some ideas to solve in a relatively secure fashion which I hope to blog about in future.

Now we have the fundamentals let’s get into it!

Firstly we need to import the Braintree nuget package that brings all the goodness we’ll need to create and send requests and process responses from them. Add the following entry to your project.json file and when you save it the package will be restored.
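Something like the entry below should do it – the version shown is indicative only, so use the current Braintree release:

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Braintree": "3.2.0"
      }
    }
  }
}
```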

Once we’ve done this we now have the power of the API at our fingertips and can craft requests as required.

In my simplified example below I am going to process a Transaction Sale for $10 (the currency will depend on your merchant account setup) and use a hardcoded Braintree Customer Identity that maps to an existing Customer entity that I’ve previously created in the Braintree Vault associated with my trial Merchant account.
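A minimal run.csx sketch of that call might look like the following. The merchant credentials, Customer Id and trigger payload are placeholders only – in a real Function the credentials come from Key Vault:

```csharp
using Braintree;

public static void Run(string paymentRequest, TraceWriter log)
{
    // Placeholder credentials - in a real Function these come from Key Vault, never plain-text.
    var gateway = new BraintreeGateway
    {
        Environment = Braintree.Environment.SANDBOX,
        MerchantId = "your_merchant_id",
        PublicKey = "your_public_key",
        PrivateKey = "your_private_key"
    };

    var request = new TransactionRequest
    {
        Amount = 10.00M,
        // Hard-coded Customer previously created in the Braintree Vault (demo only).
        CustomerId = "existing_customer_id",
        Options = new TransactionOptionsRequest
        {
            SubmitForSettlement = true
        }
    };

    Result<Transaction> result = gateway.Transaction.Sale(request);

    if (result.IsSuccess())
    {
        log.Info($"Transaction {result.Target.Id} submitted for settlement.");
    }
    else
    {
        log.Error($"Payment failed: {result.Message}");
    }
}
```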

This is a fairly contrived example: in reality you’d pass the paymentMethodToken to the Function as part of the data supplied and, as I noted above, you’d not leave your Merchant details lying around in plain-text (would you now?)

That, folks, is pretty much all there is to it! Bake this into a set of Microservice Functions, pass in the total you wish to settle and off you go.

The Braintree SDK has a bunch of other service endpoints you can utilise to perform actions against other objects like Customers or creating recurring subscriptions so don’t limit your use solely to paying for things on checkout.

Happy days 🙂


Azure Functions: Send email using SendGrid

Prior to Azure Functions announcing their General Availability (GA) I had previously used SendGrid as an output binding in order to send email messages.

Since GA, however, the ability to use SendGrid remains undocumented (I assume to give the Functions team time to test and document the binding properly) and the old approach I was using no longer seems valid.

As I needed to use this feature I spent some time digging into getting this working with the GA release of Azure Functions (version ~1). Thankfully as Functions is an abstraction over WebJobs I had plenty of information on how to do it right now thanks to the WebJobs documentation and extensibility :).

Here’s how you can get this working too:

1. Register your SendGrid API key in Application Settings: you must utilise the documented approach of setting your API key in an App Setting called “AzureWebJobsSendGridApiKey”. Without this your Function won’t be able to send mail successfully.

2. Import the SendGrid nuget package into your Function by creating a project.json file that contains the following:
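Something along these lines should work – the version is indicative of the v8 SendGrid library that was current at the time:

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Sendgrid": "8.0.5"
      }
    }
  }
}
```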

3. Create an output binding on your function that will allow you to send the message without needing to create client code in your Function:
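The binding is declared in your function.json. In the sketch below I’m assuming a simple Queue trigger as the input, but any trigger will work:

```json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "myQueueItem",
      "queueName": "email-requests",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "sendGrid",
      "direction": "out",
      "name": "message"
    }
  ]
}
```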

4. Add a reference and using statement in your run.csx to ensure you have the right packages included. You can see this in the run.csx below that has all you need to create and send a simple email to a single recipient.
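Here’s a minimal sketch – the sender and recipient addresses are placeholders:

```csharp
#r "SendGrid"

using SendGrid.Helpers.Mail;

public static void Run(string myQueueItem, out Mail message, TraceWriter log)
{
    log.Info($"Sending notification for: {myQueueItem}");

    // Placeholder addresses - swap in your own sender and recipient.
    var from = new Email("sender@example.com");
    var to = new Email("recipient@example.com");
    var content = new Content("text/plain", $"Queue item received: {myQueueItem}");

    message = new Mail(from, "Hello from Azure Functions", to, content);
}
```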

If you want to do a lot more customisation of the email that is sent you can simply refer to the SendGrid C# Library on Github which covers features such as sending using templates.

Once the Functions team publishes an updated approach to using SendGrid I’ll make sure to link to it from here. In the meantime… happy mailing!


Azure Functions: Access KeyVault Secrets with a Cert-secured Service Principal

Azure Functions is one of those services in Azure that is seeing a massive amount of uptake. People are using it for so many things, some of which require access to sensitive information at runtime.

At time of writing this post there is a pending Feature Request for Functions to support storing configuration items in Azure KeyVault. If you can’t wait for that Feature to drop here’s how you can achieve this today.

Step 1: Create a KeyVault and Register Secrets

I’m not going to step through doing this in detail as the documentation for KeyVault is pretty good, especially for adding Secrets or Keys. For our purposes we are going to store a password in a Secret in KeyVault and have the most recent version of it be available from this URI:

https://mytestvault.vault.azure.net/secrets/remotepassword

Step 2: Setup a Cert-secured Service Principal in Azure AD

a. Generate a self-signed certificate

This certificate will be used for our Service Principal to authorise itself when calling into KeyVault. You’ll notice that I’m putting a -1 day “start of” validity period into this certificate. This allows us to deal with the infrastructure running at UTC (which my location isn’t) and avoid not being able to access the certificate until UTC matches our local timezone.

b. Create Service Principal with Cert Authentication

This step requires you to log into an Azure Subscription that is tied to the target Azure AD instance in which you wish to register the Service Principal. Your user must also have sufficient privileges to create new users in Azure AD – if it doesn’t this step will fail.

At this point we now have a Vault, a Secret, and a Service Principal that has permissions to read Secrets from our Vault.

Step 3: Add Cert to App Service

In order for our Function App(s) to utilise this Service Principal and its certificate to access KeyVault we need to upload the PFX file we created in 2.a above into the App Service in which our Functions live. This is just as you would do if this App Service was running a Web App but without the need to bind it to anything. The official Azure documentation on uploading certs is good so I won’t duplicate the instructions here.

Watch out – Gotcha!

Once you’ve uploaded your certificate you need to do one more thing to ensure that your Function code can read the certificate from the store. You do this by adding an Application Setting “WEBSITE_LOAD_CERTIFICATES” and either specifying just the thumbprint of your certificate or putting “*” to make any certificate held in the store available.

Step 4: Function App KeyVault and Service Principal Setup

a. Nuget Packages

Accessing KeyVault with a Service Principal in Functions requires us to load some Nuget packages that contain the necessary logic to authenticate with Azure AD and to call KeyVault. We do this by adding the following to our Function App’s project.json.
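As a sketch (package versions are indicative only), the project.json pulls in the KeyVault client library and ADAL:

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.Azure.KeyVault": "2.0.6",
        "Microsoft.IdentityModel.Clients.ActiveDirectory": "3.13.8"
      }
    }
  }
}
```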

b. KeyVault Client CSX

Now let’s go ahead and drop in our KeyVault “client” that wraps all code for accessing KeyVault in a single CSX (note that this is mostly inspired by other code that shows you how to do this for Web Apps).
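The sketch below shows one way to structure such a keyvault.csx – the method names are my own, and it assumes the certificate was uploaded as described in Step 3 (so it is available in the CurrentUser\My store):

```csharp
#r "System.Security"

using System;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

private static ClientAssertionCertificate _assertionCert;

// Create a KeyVaultClient that authenticates as the Service Principal using the uploaded cert.
public static KeyVaultClient GetKeyVaultClient(string clientId, string certThumbprint)
{
    _assertionCert = new ClientAssertionCertificate(clientId, GetCertificate(certThumbprint));
    return new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(GetAccessTokenAsync));
}

// Retrieve the current value of a Secret by its full URI.
public static async Task<string> GetSecretAsync(KeyVaultClient client, string secretUri)
{
    var secret = await client.GetSecretAsync(secretUri);
    return secret.Value;
}

private static X509Certificate2 GetCertificate(string thumbprint)
{
    // WEBSITE_LOAD_CERTIFICATES makes uploaded certificates available in the CurrentUser\My store.
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    try
    {
        var matches = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (matches.Count == 0)
        {
            throw new InvalidOperationException($"Certificate with thumbprint {thumbprint} was not found.");
        }
        return matches[0];
    }
    finally
    {
        store.Close();
    }
}

// ADAL callback invoked by the KeyVaultClient to obtain an access token using the cert.
private static async Task<string> GetAccessTokenAsync(string authority, string resource, string scope)
{
    var context = new AuthenticationContext(authority, TokenCache.DefaultShared);
    var result = await context.AcquireTokenAsync(resource, _assertionCert);
    return result.AccessToken;
}
```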

Step 5: Use in a Function

As we’ve encapsulated everything to do with KeyVault into a CSX we can retrieve a secret from KeyVault in a Function using a single call once we’ve imported our client code.
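For example, an HTTP-triggered Function could fetch the password like this (the App Setting names holding the Service Principal details are my own convention):

```csharp
#load "keyvault.csx"

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Client Id of the Service Principal and thumbprint of its certificate, held in App Settings.
    var client = GetKeyVaultClient(
        System.Environment.GetEnvironmentVariable("ServicePrincipalClientId"),
        System.Environment.GetEnvironmentVariable("KeyVaultCertThumbprint"));

    var password = await GetSecretAsync(client,
        "https://mytestvault.vault.azure.net/secrets/remotepassword");

    log.Info("Secret retrieved from KeyVault.");
    return req.CreateResponse(HttpStatusCode.OK, "Secret retrieved.");
}
```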

Happy (Secure) Days!
