Bulk insert entities into Cosmos DB using Python

I’ve been working a lot with Python over the last few months and one piece of work I’ve spent quite a lot of time on is interfacing Python solutions with Cosmos DB.

Microsoft has shipped a great Cosmos DB Python SDK that provides almost everything a developer could need to work with Cosmos DB. In one of my use cases I needed to batch insert multiple entities at the same time, which is something the Cosmos DB SDK doesn’t support in any language. This meant I immediately looked at the Cosmos DB Bulk Executor library, only to find it supports just Java and .Net right now 🤨.

Luckily, having done a fair bit of Cosmos DB work over the last few years I was aware this bulk insert pattern had been solved previously using Cosmos DB Stored Procedures, so I went and dug up the sample bulk import stored procedure from GitHub:

https://github.com/Azure/azure-cosmosdb-js-server/blob/master/samples/stored-procedures/BulkImport.js

Job done.

Well… not quite!

Thankfully I have a fairly good head of hair, because I tore a fair bit of it out getting it all to work.

The first challenge I hit was a general lack of complex “Calling Cosmos DB Stored Procedures from Python” samples. Yes, there are simple examples floating around, but *all* of them consist of calling a stored procedure with no parameters, or calling a stored procedure with a single parameter. While I was passing only one parameter, that parameter was an array rather than a single value.

No matter what I did, all I could get back from Cosmos DB was this response (pasted here in case others search for a fix online):


Exception: HTTPFailure: Status code: 400 Sub-status: 400

{ "code":"BadRequest", "message":"Message: {\"Errors\":[\"Encountered exception while executing Javascript. Exception = Error: The document body must be an object or a string representing a JSON-serialized object. Stack trace: Error: The document body must be an object or a string representing a JSON-serialized object. at createDocument (bulkImport.js:636:21) at tryCreate (bulkImport.js:32:9) at bulkImport (bulkImport.js:24:5) at __docDbMain (bulkImport.js:61:5) at Global code (bulkImport.js:1:2)\"]} ActivityId: 95af001e-XXXX-XXXX-XXXX-6aca4d356c10, Request URI: /apps/0ee1f647-XXXX-XXXX-XXXX-04d92606fd10/services/53e603e8-XXXX-XXXX-XXXX-399602af32ec/partitions/09ebc044-XXXX-XXXX-XXXX-729efb2fd8ae/replicas/131869045473958276p/, RequestStats: \r\nRequestStartTime: 2018-11-18T22:26:58.2676364Z, Number of regions attempted: 1\r\n, SDK: Microsoft.Azure.Documents.Common/2.1.0.0" }


Clear as mud … quelle horreur! I even found myself about to ask on Stack Overflow.

Thankfully before I could post a question that would invariably be closed by some 30 million reputation-holding moderator and then down-voted into oblivion, I did a bit more digging and came across an answer to this question:

https://stackoverflow.com/questions/47748684/documentdb-bulkimport-stored-proc-getting-400-error-on-array-json-issue

This answer in particular:

https://stackoverflow.com/a/49450200/2899711

Thanks Aram…upvoted with a comment linking here 😊.

Which led me to the solution shown below. The key element here is the line:

client.ExecuteStoredProcedure(sproc_link, [new_docs])

The ‘new_docs’ array is itself enclosed in an array!
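Putting it together, here is a minimal sketch of the call. The `ExecuteStoredProcedure` method is the one from the post; the `as_sproc_params` helper, the document contents and the `bulk_insert` wrapper are my own illustrations, not SDK names:

```python
import json

def as_sproc_params(docs):
    """Wrap the list of documents in an outer array.

    The stored procedure declares a single parameter (an array of
    documents), so the SDK's parameter list must contain that array
    as its one and only element: [[doc1, doc2, ...]].
    """
    return [docs]

def bulk_insert(client, sproc_link, docs):
    # 'client' is the Cosmos DB client instance from the SDK; this call
    # matches the line shown above.
    return client.ExecuteStoredProcedure(sproc_link, as_sproc_params(docs))

# Each document must be a JSON-serializable dict (or a JSON string).
new_docs = [
    {"id": "1", "name": "Alpha"},
    {"id": "2", "name": "Beta"},
]

# The payload the sproc actually receives -- note the double nesting.
params = as_sproc_params(new_docs)
print(json.dumps(params))
```

Without the outer array the SDK spreads your documents across the stored procedure's parameter list, which is what triggers the "document body must be an object" error above.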

Happy days! 😎

P.S. – You can learn about working with Cosmos DB completely for free on Microsoft Learn. (The added bonus is no Azure subscription or entry of credit card details is required to get going!)


Azure Functions for Java developers

I had great fun yesterday presenting at the Sydney Serverless Meetup on the currently in-preview support for Java on Azure Functions. Rather than throw the slides up somewhere online I thought I’d write a blog that covers the content with the added bonus of giving you the option to dig into items in more detail.

First some background

I started with a bit of an overview of Azure Functions and why it is we have two runtime versions (because two is better than one, right?!) Actually, there are two versions because the new v2 runtime is designed from the ground up to be cross-platform as a runtime and developer experience, along with providing much better language (.Net, Java, Node, etc) and Binding extensibility. Azure Functions offers great productivity benefits, but you need to understand a bit about the basics to get going. Key items covered included:
  • Open Source – you can find everything the Functions team does on GitHub. Have a bug or feature request? File an issue (or even better… submit a PR 😍)
  • Triggers – control how Functions are invoked (on a schedule, on an HTTP request, on a new queue message).
  • Bindings – remove integration complexity by using Bindings to declaratively connect to input and output data sources.
  • Proxies – build clean REST APIs over Functions or other APIs without the need to set up and run a separate API gateway service.
  • Local Developer Experience – Functions development work (v2 runtime) can be done on Linux, Mac and Windows using whatever IDE you are used to. Java developers can build Java-based Functions locally using Eclipse, IntelliJ or Visual Studio Code
  • CI/CD – Functions support a rich set of CI and CD solutions
  • Monitoring – Application Insights is your go-to default experience but you have other options for your business logic use, particularly in v2 with the flexibility of the Logger design in .Net Core.

The Java story for Functions

At the time of the presentation (and this blog) the Functions team was actively working towards the General Availability (GA) of Java support for Functions. This work consists of two items: (1) the Java Worker, which does the heavy lifting with the Functions Host of running the actual Java Functions you write; (2) the Maven Plugin, which enables developers to quickly create, package, debug and deploy Java-based Functions. You can see from the Worker repository how the team is working at the very core of Java to enable the language as a first-class citizen on the Functions runtime. This is not simply invoking java.exe 😜.

I demonstrated how you can create Functions using a Maven Archetype and then develop and test your business logic locally in IntelliJ with Maven. The main Docs site has the full details on how you can do this, so I won’t reproduce it here – go and try it out for yourself. If you need to import libraries as part of your solution you can add them to the pom.xml as you would in any standard modern Java project.

We talked about scenarios where Java Functions might make sense in your world. Scenarios discussed included:
  • Modernise cron / batch / background jobs – you may have handed these to the ops folks in the past to write or run – now you can bring them into your development team’s SDLC.
  • Move @Scheduled Spring Boot or ExecutorService workloads out of other code into a stand-alone execution context.
  • Simplified event-based programming – remove the need to have polling logic in your code.
  • Develop “Microservices”.

But that’s not all…

If you order now you get a free set of steak knives! No… wait… wrong context! But… actually… *almost*. Microsoft has been working with Azul Systems to provide support for Java on Azure using Azul’s Enterprise builds of the OpenJDK. The benefit of this, as announced recently, is that Java workloads on Azure using Azul will benefit from free LTS support after January 2019.

Developing your own Bindings

I rounded out the evening by talking about how you can extend Functions capabilities by leveraging the new extensibility model for Bindings. Unsurprisingly this is very well documented on GitHub by the team, along with examples. We covered that these extensions are developed using .Net but can be consumed by any language bolted into the Functions runtime (which is awesome as a Binding provider – write once!) I even threw together a very simple example of writing a text file to a cloud storage container that used an output binding to upload a blob where the Binding did the heavy lifting of connecting to the provider and writing the file! No more plumbing code mixed in with my business logic 😎!

Useful links

Here is the raw set of useful links from my slides:

Audience questions

I managed to get through most of the questions on the evening – for those that I didn’t here are the answers.
  • Gradle support: unofficial right now – I’m waiting on official support advice.
  • Concurrency limits for Functions: official documentation on parallel execution.
  • Max execution time in containerised Functions: on a Consumption-style Plan in Azure: 10 minutes; otherwise indefinitely!
So, there we have it – what a great story for Java developers wanting to do more event-based programming! Happy days! 😎

Understanding Tenants, Subscriptions, Regions and Geographies in Azure

If you are getting started working with Azure you might come across a few key terms that it’s important to have a good understanding of. In this post I’m going to cover what I think are four of the key ones.

Tenant

A Tenant, as it relates to Azure, refers to a single instance of Azure Active Directory, or, as it is often called “Azure AD”. Azure AD is a key piece of Microsoft’s cloud platform as it provides a single place to manage users, groups and the permissions they hold in relation to applications published in Azure AD.

Key Microsoft applications that Azure AD provides access to include Office 365, Dynamics 365 and Azure. Yes, you read that right, Azure is treated as an ‘application’. You can also use Azure AD to control access to many other third-party applications such as Salesforce and even the AWS admin console. As an application developer you can register your own applications in Azure AD for the purpose of allowing users access.

Azure AD Tenants are globally unique and are scoped using a domain that ends with ‘onmicrosoft.com’ (e.g. myazuread.onmicrosoft.com), and each has a ‘Tenant ID’ in the form of a UUID/GUID. Some customers choose to connect their internal Active Directory environment to Azure AD to allow single or same sign-on for their staff and will also use a custom domain instead of the default ‘onmicrosoft.com’.

When you access the Azure Portal, or leverage one of the command-line tools to manage Azure resources in a Subscription, you will always be authenticated at some point via the Azure AD Tenant associated with the Subscription you want to access. The actions you can take will depend on the Role you have been assigned in the target Subscription.

Finally, Azure AD Tenants can be associated with multiple Subscriptions (typically in larger organisations), but a Subscription can only ever be associated with a single Azure AD Tenant at any time.

Dev Tip: if you want to develop an application that uses Azure AD but don’t have permissions to register applications in your company’s Azure AD Tenant (or you want a ‘developer’ Azure AD Tenant) you can choose to create a new Azure AD Tenant in the Azure Portal. Make sure in your application that you can easily change Azure AD Tenant details to allow you to redeploy as required. Azure AD has a free tier that should be suitable for most development purposes.
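One simple way to keep Tenant details easily changeable is to read them from configuration rather than hard-coding them. A minimal sketch in Python (the environment variable names are my own convention, not anything Azure mandates; the login.microsoftonline.com authority format is the standard Azure AD endpoint):

```python
import os

# Tenant details pulled from the environment, so redeploying against a
# different Azure AD Tenant only needs new settings, not a code change.
# The variable names and fallback values here are illustrative.
tenant_domain = os.environ.get("AAD_TENANT_DOMAIN", "myazuread.onmicrosoft.com")
client_id = os.environ.get("AAD_CLIENT_ID", "00000000-0000-0000-0000-000000000000")

# The Azure AD authority is scoped by tenant domain (or Tenant ID).
authority = f"https://login.microsoftonline.com/{tenant_domain}"
print(authority)
```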

IT Pro Tip: you can change the display name for your Tenant – something I strongly recommend, particularly as Azure AD B2B will mean others will see your Directory name if they are invited and may be confused if the display name is something unclear. Note that you are *not* able to change the default onmicrosoft.com domain.

Subscription

A Subscription in Azure is a logical container into which any number of resources (Virtual Machines, Web Apps, Storage Accounts, etc) can be deployed. It can also be used for coarse-grained access control to these resources, though the correct approach these days is to leverage Role Based Access Control (RBAC) or Management Groups. All incurred costs of the resources contained in the Subscription will also roll-up at this level (see a sample below).

Subscription costs view

As noted above, a Subscription is only ever associated with a single Azure AD Tenant at any time, though it is possible to grant users outside of this Tenant access. You can also choose to change the Azure AD Tenant for a Subscription. This feature is useful if you wish to transfer, say, a Pay-As-You-Go (PAYG) Subscription into an existing Enterprise Enrolment. Subscriptions have both a display name (which you can change) and a Subscription ID (UUID/GUID) which you can’t change.

Subscriptions are not tied to an Azure Region and as a result can contain resources from any number of Regions. This doesn’t mean that you will have access to all Regions, as some Geographies and Regions are restricted from use – we’ll talk more about this next.

Resources contained in a Subscription but deployed to different Regions will still incur cross-Region costs (where applicable) for the resource.

People sometimes use the word ‘Tenant’ instead of ‘Subscription’ or vice-versa. Hopefully you can now see what the difference is between the two.

Regions and Geographies

Azure differs from the other major cloud providers in its approach to providing services close to the customer. As a result, at the time of writing (August 2018), Azure offers 42 operational Regions with 12 more announced or under development.

A Region is a grouping of data centres that together form a deployment location for workloads. Apart from geo-deployed services like Azure AD or Azure Traffic Manager you will always be asked what Region you wish to deploy a workload to.

Regions are named based on a general geography rather than after exactly where the data centres are. So, for example, in Australia we have four Regions – Australia East, Australia Southeast, Australia Central 1 and Australia Central 2.

A Geography, as it relates to Azure, can be used to describe a specific market – typically a country (Australia), though sometimes a geographic region (Asia, Europe). Normally within a Geography you will find two Regions which will be paired to provide customers with high availability options. Can anyone spot the one Region that doesn’t have its pair in the same Geography?

There are a few special Regions that aren’t open to everyone – US Government Regions, the entire German Geography and China. In Australia, you must undergo a whitelisting process in order to access Australia Central 1 and 2.

When you replicate data or services between Regions you will pay an increased charge for either data transfer between Regions and / or duplicated hosting costs in the secondary Region. Some services such as Azure Storage and Azure SQL Database provide geo-redundant options where you pay an incremental cost to have your data replicated to the secondary Region. In other cases you will need to design your own replication approach based on your application and its hosting infrastructure.

Once you have deployed a service to a Region you are unable to move it – you have to re-provision it if you need the primary location to be somewhere else.

As a final note, while there is a Regional availability model (replication of services between Regions), Microsoft has also introduced the concept of Availability Zones. Availability Zones are still being rolled out globally, and are more than just a logical overlay over Regions. Interesting times!

So there we have it, a quick overview of some of the key terms you should be familiar with when getting started with Azure. As always, if anything’s unclear or you have any questions feel free to drop a comment below.

😎


DDD Sydney 2018 – Super sold-out Saturday!

Microsoft has always been a company by, of, and for developers, and the renewed Microsoft commitment to meet the Australian developer community wherever they are continued this past weekend when we were at DDD Sydney.

DDD Sydney, like Perth before it, didn’t disappoint in its slick organisation and community-selected content. The unique spice to the Sydney event though was the dedicated junior developer track – a concept to be loudly applauded and, I hope, replicated elsewhere over time.

DDD Sydney Badge

As this was my home city event I asked my oldest son if he’d like to be a paid attendee for the day – something he quickly said ‘yes’ to. As he’s only just started high school I didn’t expect him to understand all the content, but I thought it would be a good experience for him and hopefully he’d pick up some tips along the way. Well… I think he did pretty well out of the day based on the below Twitter post…

… and now I’ve had to install Python on his computer and order a Raspberry Pi for him! (A big thanks to Damian Brady from the Microsoft CDA team for spending the time with my son!)

Carrying on though…

At the Microsoft stand we had a great time discussing the Azure platform with developers and getting to understand how we can help them do more with it. Some of the discussions I had covered:

– What’s the difference between an Azure Tenant, Subscription and Region? This is a good question, and I’ll publish a follow-up blog post to cover it, as understanding these three items is fundamental to working with Azure, particularly if you are working with many customers who each own a Subscription.

– How about extending Visual Studio Team Services (VSTS) to do Domain-Driven Design (DDD)? Good idea! VSTS has an extensibility model you can tap into if you want to add to the platform’s capabilities or customise it to suit your requirements.

– We had a bunch of people sign-up and play our Azure Cognitive Services-based game “Where’s Bit?”. We’ll eventually open this up for everyone to have access to how we built the game, but for now keep an eye out for it at the upcoming events we’ll be at.

Two lucky winners of “Where’s Bit?” will receive invites to an upcoming Xbox triple A title launch event in Sydney in September (I can’t tell you what it is.. but it’s pretty damn cool.. wish I was going!)

Overall I was really happy with how the day went, and really pleased that we were able to support the organisers in their first sell-out year! Sydney developers care deeply about their personal development and being engaged with others in their community and the buzz on the day was hard to come down from once the day was over!

I’m always happy to take feedback or questions around Azure for developers – feel free to leave a comment below or to hit me up on Twitter if you have any.

We’ll be back in Sydney for NDC in September and YOW! in November – hope to see you there then!

😎

Keynote with Damian Brady on AI and ML.

Keynote

Closing session – excess food was donated to OzHarvest.

Closing panorama


DDD Perth 2018 – the value of a strong developer community

This past weekend I was in Perth to attend DDD Perth as part of Microsoft’s sponsorship and thought I’d write up my experience.

For those of you unfamiliar with DDD (Developer! Developer! Developer!), it is a community-organised event that started in the United Kingdom in 2005 and has run in various cities in Australia over the last few years.

If you take a look through the published agenda for Perth for 2018 you will see a fantastic variety of content and speakers – ranging from deep tech through to personal wellbeing – all voted for by the community. This is a clear sign that developers are looking far beyond what could be considered traditional interest areas and making the most of personal growth opportunities available at events like DDD.

I didn’t get a chance to take any photographs, but I’m sure the DDD Perth blog will post a bunch when the committee has recovered from running the event for 400+ people! What I can talk a bit about here though is the types of discussions I had at the Microsoft booth.

  • Does Azure have a free tier? Yes. You can easily get started at https://azure.microsoft.com/en-us/free/.
     
  • How can I learn more about Azure? There are a bunch of different ways, but a good general platform overview is delivered at Azure Discovery Days. You can find the list of upcoming Australian events at https://www.microsoft.com/en-au/azurelearningpathways/upcoming-event.
     
  • Are Azure Functions production-ready? Yes, they are. The confusion here stemmed from the ‘v1’ (production-ready) and the ‘v2’ (in-preview) versions of the runtime. This is a reminder to me that people not close to a platform often don’t get the subtleties and simply hear “not production ready” as the message.
     
  • How do I migrate between Azure Regions? Well… I could write an entire (never ending) series on the number of ways to achieve this! I will leave it at “it depends”….
     
  • What do I need to do to get a t-shirt, book or sticker? (Hint: play one of our games, give us feedback, join our OSS4Good project)

There were a lot of other discussions on the day with the team on the booth, and we had our two games up and running (I’ll keep those under wraps for now as a surprise for others). We also launched our Open Source for Good project, which you can head over to GitHub to learn more about (if you’d like to get involved feel free to leave a comment below and we’ll invite you).

I tweeted how amazing this day was, and the organisers need to be proud to have built such an amazing and accessible event! If you didn’t get a ticket for this year, then I’d certainly keep an eye out for next year’s event.

If you live in the following cities then you’re in luck because your 2018 DDD is yet to happen. Sydney is up next on 18 August, so make sure to get your tickets soon!

I’ll be at these events so please drop by and say hello!

😎


Empowering Australian Developers to do more with Azure

The first IT job I had was training mature age students at college how to use PCs. While I was doing this I also wrote and delivered a course on how to build sites for the (new at the time) World Wide Web.

My work on the WWW course got me noticed by a business in London that was building websites for their customers. After joining them, one of the largest projects I worked on was the first website for Marks & Spencer which we developed using Active Server Pages (ASP) and hosted on Windows NT 4. We even ran a celebrity text chat session for M&S using Exchange 5.5’s chat service!

Clearly there have been a few years (my son: “yeah, like 10 billion years”) between the above and today, and I’ve done a lot of things in the interim: development, operations, delivery management and community organisation. But at my core I’ve always been a developer.

If you’ve read this far I am sure you’ve figured out that today I’m not solving a technical problem for you. 🙂

So why am I reminiscing about times past? Well, really, I’m just calling back to where I started my career: helping people do more with technology, and particularly with new or unfamiliar technology.

Which leads me to the reason for this blog post.

I’m super excited to let everyone know that from August 2018 I’ll be joining Microsoft here in Australia as the Azure Pro Developer Lead, giving me the opportunity to return to my roots. I will be supporting developers in building their solutions on Azure using the platform’s wide range of innovative features, helping them move beyond Virtual Machines!

Azure enables rapid realisation of ideas you have and I want Australian developers to be empowered to deliver their ideas using Azure as their accelerator. I’m looking forward to starting and will see you at an event, at your office or over a coffee.

Happy Days 😎

Azure and Bit

Azure and Bit artwork from the talented Ashley McNamara. Thanks Ashley!

I also recently saw the below tweet which resonated with me around why I think Microsoft Azure is the best place for developers.


Fix Provider error in Cloud Shell when using AKS in a new Azure Region

Given the recent announcement of the GA of Azure Kubernetes Service I thought I would take it for a spin in one of the new Regions it is now available in. I had previously deployed AKS in East US using the Azure Cloud Shell, so I didn’t expect to run into any issues. However, I hit a minor snag, which I’m documenting here in case you come across it too.

az group create --name rg-aks-01 --location westus2

az aks create --resource-group rg-aks-01 --name testaks01 --node-count 1 --generate-ssh-keys

The second command failed with:

The subscription is not registered for the resource type 'managedClusters' in the location 'westus2'. Please re-register for this provider in order to have access to this location.

And this is the fix.

az provider register --namespace Microsoft.ContainerService

Registering is still on-going. You can monitor using 'az provider show -n Microsoft.ContainerService'

Then a short while later I ran the ‘show’ command and could see the service is now available in all the new Regions for GA (snippet shown below).

"locations": [
  "UK West",
  "East US",
  "West Europe",
  "Central US",
  "Canada East",
  "Canada Central",
  "UK South",
  "West US",
  "West US 2",
  "Australia East",
  "North Europe"
]

Happy Days! 😎


Read Tags from Azure Resource Groups and track using Table Storage

If you run in an environment where you need to track changes to Tags on Resource Groups in Azure then you may find this PowerShell snippet useful as code to drop into a Runbook.

The snippet will enumerate all Resource Groups in a Subscription (we assume you are already logged into the Subscription you want to use) and then extract all Tags from each Resource Group and write the details to Azure Table Storage.

Once you run this snippet you will be able to use the data for reporting purposes, where each Resource Group’s Resource ID will be used as the Partition Key, with the Tag Name (Key) and the current Date / Time used as a Row Key. You now have a reporting source you can use in the likes of Power BI.
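The original embedded snippet isn't reproduced here, so the following is a minimal PowerShell sketch of the approach described above. Cmdlet names are from the AzureRM and AzureRmStorageTable modules of the era, and `$storageContext` and the table name are assumed to exist – treat this as illustrative and check against your module versions:

```powershell
# Sketch only: assumes you are already logged in, $storageContext points at
# your Storage Account, and the AzureRmStorageTable module is installed.
$table = Get-AzureStorageTable -Name "tagaudit" -Context $storageContext

foreach ($rg in Get-AzureRmResourceGroup) {
    if ($rg.Tags) {
        foreach ($tag in $rg.Tags.GetEnumerator()) {
            # Partition Key = the Resource Group's Resource ID ('/' is not
            # valid in a key, so replace it); Row Key = Tag name plus the
            # current date/time, as described above.
            $partitionKey = $rg.ResourceId -replace "/", "_"
            $rowKey = "$($tag.Key)-$((Get-Date).ToString('yyyyMMddHHmmss'))"
            Add-StorageTableRow -table $table -partitionKey $partitionKey `
                -rowKey $rowKey `
                -property @{ "TagName" = $tag.Key; "TagValue" = $tag.Value }
        }
    }
}
```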

Happy Days 😎


Are Free Tier Cloud Services Worth The Cost?

Over the years that I’ve been talking with public groups on cloud services, and Azure in particular, I will typically have at least one person in every group make a statement like this:

“Azure’s good, but the free tier isn’t as good as AWS.”

I’ve discussed this statement with groups enough times that I thought it would be good to capture my perspective on where Azure stands, provide some useful resources, and pose a question to those whose starting point is free services.

You want The Free? You can’t handle The Free!

You want the free?!

When people talk about how a free tier isn’t that useful, what they are saying typically translates into one of two scenarios:

  1. The timeframe the free tier is offered for is not long enough for the person to achieve an acceptable learning outcome based on their time investment;
     
  2. More commonly, the service limits are too low, meaning the person cannot achieve an acceptable learning outcome before their credit runs out (regardless of time).

The reality is that beyond basic scenarios (running a Virtual Machine, creating a database), and where someone doesn’t have sufficient continuous time to allocate to their cloud environment, they are likely to receive minimal value from free tier services.

Effective use of free tiers

So how to minimise these outcomes?

  1. Be clear about what you want to achieve before you start a free tier subscription. If you don’t know *what* you want to do in advance you are likely to fritter away that free credit before you get to your eventual end goal. Additionally, if you know what you want to achieve then review the required cloud services you will use and determine if a free tier is going to provide you with sufficient resources to reach your goal.
     
  2. Start with pre-built environments or quickstarts – find labs or similar that give you access to existing environments. Attend events that include credits as part of attendance and use those to achieve a goal. Look at tutorials and samples to find automation scripts / templates that can get you up and running quickly (but remember the previous tip – if you try to provision a ten node Kubernetes cluster will that actually succeed in a free tier? Would a one node cluster suffice to allow you to learn?)

All the cloud platforms will provide you with time-limited free tiers, with some services being offered as “always free” at certain low usage levels.

Azure has had free service trials or tiers in one way or another for some time. Traditionally, however, it hasn’t offered a 12 month period, though fairly recently that’s changed and there is now an extended 12 month Free tier offering for Azure.

One Azure cloud… many ways to get ongoing credits

Where Azure does differ substantially from AWS in particular is in the number of offerings that get you access to Azure credits on an ongoing basis, lifting you out of having to use just free tier services:

  • Microsoft Developer Network (MSDN) Azure Benefits – available as an add-on to existing MSDN subscribers (note: your organisation might not have access to this benefit depending on your licensing). This is an ongoing benefit while you pay for an MSDN subscription.
  • Azure Starter for Students (formerly Dreamspark). This is an ongoing benefit while a student.
  • BizSpark Benefits – available to those who are leveraging the BizSpark programme for their business. Ongoing benefit while you are in the BizSpark program.
  • Azure for non-profits – go through the process to prove your status and gain access to Azure Credits.
  • Microsoft Azure Passes – when Microsoft runs training courses for Azure, attendees will typically be provided with Azure credits in the form of an Azure Pass. We gave these away at the Global Azure Bootcamp this year. These are time-limited offers (one to three months).

Investing in yourself or your idea

The reality of free tier services is they will only get you so far, whether the use you make of the cloud is to learn new concepts or to try an idea you have.

My take is this: if you aren’t prepared to invest your own money (i.e. I just want more free stuff) then you don’t put much value on your own education or idea.

If we had the cloud computing services we have now when the dotcom boom was happening we may well have seen a massively different outcome.

Startups wouldn’t have spent massive amounts of their funding on infrastructure and wasted months waiting for services to be provisioned before they even got to serving the first request.

Imagine if you had on-demand services when you were at school (maybe you still are) – the quality of your education would be improved by access to these sorts of services.

We are at a pivotal moment where we now have access to on-demand resources that a generation ago would have been unimaginable. If you are serious about an idea or personal development put your money where your brain is.

But I’m not an accountant!

Congratulations. Now you are! Didn’t hurt a bit either, did it?

It’s unavoidable for many of us that at some point it will come down to cost. I know there will be more than a few of you sitting there having previously paid a larger than expected cloud hosting bill. I bet you now manage those resources like a hawk. While this is a painful way to learn, you will have identified a key factor in how you design and run cloud native services.

Also, welcome to how businesses work – specifically how to control costs so they can remain viable. This is why your last request to the ops team for 10 servers was rejected, or why you had to finesse your design to fit into existing infrastructure constraints. 🙂

So, where to next?

I highly recommend spending time familiarising yourself with services in the cloud too – avoid anti-patterns that will likely be where you will unexpectedly spend more money than you thought.

You can find good examples of ways to configure services from the likes of Scott Hanselman and content like his “Penny Pinching in the Cloud” posts, or Troy Hunt’s posts on how “Have I been pwned” performs on Azure (pricing at the bottom of the post).

So, did I solve your problem? Make more of the Free? Unlikely I suspect.

Ultimately you need to consider that free tiers and services are designed as a taster, to get you thinking about how you could use those services for other things. While there are “always free” services, the reality is you are unlikely to build the next Atlassian with them, but I’m pretty sure you can use them to pass exams or to get educated on cloud technology.

Happy Days 😎


Developer toolkit for working with Azure AD B2C JWT-protected APIs

I’ve blogged in the past about Azure Active Directory B2C and how you can use it as a secure turnkey consumer identity platform for your business.

In this post I’m going to walk through how you can debug JWT-protected APIs where those JWTs are being issued by AAD B2C. Note that much of what I write here applies to any scenario where you are working with JWTs – AAD B2C is standards compliant, so the advice can be applied elsewhere.

We aren’t going to get into the Identity Experience Framework (IEF) here because it’s a whole universe of detail beyond the basic policy engine we’ll cover 🙂

Your toolkit

Here are the tools you’ll need to get started with debugging.

Required tools:

  • A test AAD B2C tenant – a very strong recommendation *not* to use your production one!
  • An API testing tool like Postman. The B2C team has published how you can use Postman to test protected APIs.
  • Your API source code in a debug environment. The API must be registered in the test AAD B2C tenant you are using!
  • A test client application – I’ve been using a customised version of the WPF sample client app from the B2C team.

Optional, but recommended:

  • jwt.ms (there is also jwt.io if you prefer)
  • Mailinator or any number of alternatives.
  • Create a B2C Profile Edit Policy even if you never roll it out to customers. This policy can be invoked via the Azure Portal to allow you to initialise new profile attributes.

Use standard OAuth libraries in your clients

Microsoft has great first-party support for B2C with the Microsoft Authentication Library (MSAL) across multiple platforms, but as B2C is designed to be an OAuth2 compliant service, any library that supports the specification should work with it. Microsoft provides samples that show how libraries like AppAuth can be used.
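As an illustration, here is a minimal Python sketch of acquiring a token with MSAL for Python (`pip install msal`). The tenant name, policy name, client ID and scopes are all hypothetical placeholders – substitute your own test tenant values.

```python
# Sketch only: assumes a hypothetical B2C tenant and user flow (policy) name.
def b2c_authority(tenant: str, policy: str) -> str:
    """Build the b2clogin.com authority URL that MSAL expects for a B2C user flow."""
    return f"https://{tenant}.b2clogin.com/{tenant}.onmicrosoft.com/{policy}"


def acquire_test_token(tenant: str, policy: str, client_id: str, scopes: list):
    """Interactively sign in a test user and return the MSAL token response.

    Requires a browser and the msal package; imported lazily so the
    URL helper above stays usable without msal installed.
    """
    import msal

    app = msal.PublicClientApplication(
        client_id, authority=b2c_authority(tenant, policy)
    )
    return app.acquire_token_interactive(scopes=scopes)
```

For example, `b2c_authority("contoso", "B2C_1_signupsignin")` yields the authority URL you would also paste into Postman’s OAuth2 token settings.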

Rolling out custom attributes

There is currently a limitation with B2C around rolling out new custom attributes. Until an attribute is referenced in at least one policy in your tenant, the attribute isn’t available to applications that utilise the Graph API. This is why I always create a profile edit policy that I can add new custom attributes to and then invoke via the Azure Portal to initialise the attribute.

Testing APIs

Create test users

This is where a service like Mailinator comes in handy – you can create multiple test users and easily access the email notifications sent by B2C to perform actions like initial account validation or password reset.

Note: free services like Mailinator may be good for simple testing, but you may have security or compliance requirements that mean it can’t be used. In that case consider moving to a paid tier or other services that provide secured mailboxes (a service like outlook.com).

Request Tokens – Test Client Application

Once you have one or more test users you can then use one of the following approaches to obtain test tokens to use when calling APIs.

If you aren’t using Postman to retrieve tokens to supply in API calls then you can use the test client application above (assuming you are developing on Windows – or at some future point when we get WPF ported to .Net Core :)) to request tokens for users in your test tenant.

B2C Test Tool

Once you have the tokens you can copy them out and use them in Postman to make requests against your API by setting Authorization in Postman to use Bearer Tokens and then copying the value from the test tool into the ‘Token’ field.

Postman using a Token
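If you’d rather script the call than use Postman, the equivalent is just a GET request with a `Bearer` Authorization header. A stdlib-only Python sketch (the URL and token are placeholders you supply):

```python
import urllib.request


def bearer_headers(token: str) -> dict:
    """Build the Authorization header Postman sets when you pick 'Bearer Token'."""
    return {"Authorization": f"Bearer {token}"}


def call_protected_api(url: str, token: str) -> bytes:
    """Call a JWT-protected API endpoint and return the raw response body."""
    req = urllib.request.Request(url, headers=bearer_headers(token))
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()
```

A 401 response here corresponds exactly to the rejection scenarios discussed below.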

If you’re having issues with tokens being rejected by your API you can use jwt.ms to review the contents of the token and see why. A sample is shown below.

jwt.ms sample
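If you’d prefer not to paste a token into a website, you can decode the claims locally the same way jwt.ms does. A small Python sketch – note this deliberately does *not* verify the signature, it only makes the payload readable:

```python
import base64
import json


def decode_jwt_claims(token: str) -> dict:
    """Decode (without verifying!) the payload segment of a JWT to inspect its claims."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore the stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))
```

This is handy for eyeballing `iss`, `aud`, `exp` and any custom B2C claims, but never use it in place of real signature validation in your API.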

If you have access to the target API source code make sure to debug that at the same time to see if you can identify why the token is being rejected.

As a guide, the common failure reasons include:

  • Token expired (or not yet valid)
  • Incorrect scopes (if used)
  • Incorrect issuer (misconfiguration of the client or API so they are not from the same B2C tenant)
  • Invalid client or audience ID
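Those checks are easy to run against a decoded claims dictionary while debugging. A hedged Python sketch (the expected issuer and audience values are placeholders for your tenant’s values):

```python
import time


def check_claims(claims: dict, expected_issuer: str, expected_audience: str, now=None):
    """Return a list of the common rejection reasons found in a decoded token's claims."""
    now = now if now is not None else time.time()
    problems = []
    if claims.get("exp", 0) <= now:
        problems.append("token expired")
    if claims.get("nbf", 0) > now:
        problems.append("token not yet valid")
    if claims.get("iss") != expected_issuer:
        problems.append("incorrect issuer")
    if claims.get("aud") != expected_audience:
        problems.append("incorrect audience")
    return problems
```

Running this over the claims you decoded (or copied from jwt.ms) usually points you straight at the misconfiguration.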

So there we are! I hope you found this post useful in debugging B2C APIs – I certainly wish that I’d had something to reference when I started developing with B2C! Now I do! 😉
