Straight talking 64

by Tim Anderson

Hands on with a Microsoft Azure project – Tim Anderson finds out how easy it is to develop for Redmond’s cloud platform.

HardCopy Issue: 64 | Published: October 30, 2014

Microsoft’s Azure cloud platform got off to a poor start when it launched back in 2009, but since 2011, following some management changes (including the arrival of Vice President Scott Guthrie), its progress has been remarkable. The improvements began with an HTML5 management portal that was a model of clarity and usability, and continued with features including stateful virtual machines and Azure Web Sites, which scales from a free tier to multiple load-balanced instances.

Each month has seen the announcement of new services. For example, in August 2014 Guthrie used his blog to announce a new NoSQL document database called DocumentDB, a new search service, WebJobs for Azure Web Sites, and more. The recently announced Azure Machine Learning, which grew out of work at Microsoft Research, looks like a remarkable resource for data analytics.

Azure has also matured fast on the infrastructure side. There is now excellent global coverage in terms of datacentres, including two in Europe. Services like Azure Backup and Azure Site Recovery offer enterprise-quality resilience even to relatively small businesses.

Virtual Machines (VMs) on Azure have made developing on Microsoft’s platform radically easier and more efficient: you can go from a Visual Studio project to a prototype that the client can view on the internet in just a few minutes. Microsoft is also beginning to catch up with Amazon in terms of the range of VMs on offer. New D-series VMs come with solid-state disks ranging from 50GB to 800GB, enabling applications that use SSD caching for performance. SQL Server 2014 has this built in through a feature called Buffer Pool Extensions.

 

Azure in practice

Azure’s preview portal, showing the global network of datacentres.

Azure’s rapid progress meant that Microsoft’s cloud looked like a natural fit for a small company I work with, which needed a web application to support its customers, including the upload and download of large files. The company uses Office 365, which means all its internal users are on Azure Active Directory (AD). Visual Studio 2013 has a template for an ASP.NET MVC project with Azure AD authentication, so combining this with Azure Blob Storage (Microsoft’s equivalent of Amazon’s S3 internet storage) looked like a quick solution.

The project was successful, but turned out to be more challenging than it first appeared (a common experience for developers). That moment when you get past the glossy promises of the high-level features and into the nitty-gritty of implementing them is often painful, and so it proved.

The first issue was with Azure AD. With ASP.NET MVC, the URL in the browser maps to controller classes and methods. For example, a URL like http://someproject/admin/manage would typically execute a method called Manage in a class called AdminController. A great feature of the framework is that you can secure these classes and methods by decorating them with attributes. The attributes are placed on the code you want to protect, rather than elsewhere in configuration files, making this a robust technique for securing your code.
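
Securing an action really is as simple as decorating it. Here is a minimal sketch (the AdminController class and the "Admin" role name are illustrative, not taken from the actual project):

    using System.Web.Mvc;

    public class AdminController : Controller
    {
        // Only authenticated users in the Admin role reach the method body;
        // anyone else is turned away before a line of it executes.
        [Authorize(Roles = "Admin")]
        public ActionResult Manage()
        {
            return View();
        }
    }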

If you are managing users in your own database, it is easy to add roles to your application, and assign them to users, but what if you are using Azure AD? It turns out that the project template in Visual Studio only gets as far as allowing users to log in. There is no provision for role management, nor for the obvious step of querying Azure AD to discover the security groups to which the user belongs.

Further research revealed that a REST API called the Graph API lets you query Azure AD to get the information you need. An intricate post by Microsoft identity expert Vittorio Bertocci explains how to use this to integrate role-based authorisation in ASP.NET with Azure AD, by writing a custom authentication manager. Good news, except that the post is peppered with warnings like “this is pretty scrappy code” or “those claims are very likely to change, hence the above will no longer be valid” and “remember, we are still in developer preview”. That last is worrying, but the post was written in January 2013. Presumably all this is fully baked by now?

Unfortunately it turns out that the libraries for using Azure AD have been in a constant state of flux. Bertocci refers to the Azure Authentication Library (AAL), but this has been replaced by the Active Directory Authentication Library (ADAL), and the Graph Helper has given way to the GraphClient, with an incompatible API. Search for help on the subject and you find sample projects which come either with warnings such as “do not use this, it is deprecated”, or with a ton of dependencies on preview code that is likely to change.

I patched together some code and it worked, but it was not straightforward; the sketch below gives the flavour of the approach.
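
As a rough illustration, the following queries the Graph API over plain REST and turns the group names it returns into role claims, so that the Authorize attribute shown earlier can work against Azure AD. Be warned that the endpoint shape, API version and JSON structure here are my assumptions from the preview documentation of the time, and may well have changed since:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Security.Claims;
    using System.Threading.Tasks;
    using Newtonsoft.Json.Linq;

    public static class GraphRoles
    {
        // Hypothetical helper: ask the Graph API for the groups the
        // signed-in user belongs to, and map each group's display name
        // onto a role claim that [Authorize(Roles = ...)] can check.
        public static async Task AddRoleClaimsAsync(ClaimsIdentity identity,
            string tenant, string userObjectId, string accessToken)
        {
            var url = string.Format(
                "https://graph.windows.net/{0}/users/{1}/memberOf?api-version=1.5",
                tenant, userObjectId);

            using (var http = new HttpClient())
            {
                http.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", accessToken);
                var json = await http.GetStringAsync(url);

                // Each entry in "value" is a directory object; groups
                // carry a displayName we can use as the role name.
                foreach (var group in JObject.Parse(json)["value"])
                {
                    identity.AddClaim(new Claim(
                        ClaimTypes.Role, (string)group["displayName"]));
                }
            }
        }
    }

In Bertocci’s approach a lookup of this kind runs inside a custom authentication manager at sign-in, so the role claims are attached to the user once rather than fetched on every request.
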
Time, though, for the next task: working with the Azure Blob Storage API.

The Azure Blob Storage REST API seems to me excellent and (so far) reliable, allowing files to be transferred block by block so you can code large file uploads and downloads with retry logic in case a block fails. Most of the sample code, though, even in official Microsoft samples, is poor. There are samples that cannot work with large files – that is, files measured in GB rather than MB – because they take the simple approach of first posting the file to the web server, and then uploading it to Azure Blob Storage in a single shot with CloudBlob.UploadFile. There are several problems with this technique: a large file will choke the web server and the upload will fail; unless the browser provides a progress bar, the user gets no indication of progress during a long upload; and if the internet connection is at all unreliable, the user faces the frustration of a failed upload halfway through.

The solution is to use repeated calls to CloudBlockBlob.PutBlock to upload the file piece by piece, with an AJAX call to update a progress bar after each successful block. This does pose a problem for versions of Internet Explorer before 10, since you need the JavaScript File API to grab specified portions of a file for upload, rather than just the whole thing. For those older browsers you have to fall back on Java, Silverlight or Adobe Flash, or else write a desktop client application.

Another issue is that for optimum speed you should upload blocks in parallel, rather than waiting for each block to complete before sending the next one.
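
Putting those pieces together, here is a minimal sketch of the server-side upload using the .NET storage client, sending blocks in parallel and then committing them as a single blob. The 4MB block size and the limit of four concurrent uploads are arbitrary choices, and the retry logic and progress reporting discussed above are left out for brevity:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class BlockUploader
    {
        public static async Task UploadAsync(CloudBlockBlob blob, Stream source)
        {
            const int BlockSize = 4 * 1024 * 1024;
            var blockIds = new List<string>();
            var tasks = new List<Task>();
            var throttle = new SemaphoreSlim(4);

            for (int i = 0; ; i++)
            {
                // For simplicity, assume Read fills the buffer except at
                // the end of the stream (true of FileStream).
                var buffer = new byte[BlockSize];
                int read = source.Read(buffer, 0, BlockSize);
                if (read == 0) break;

                // Block IDs must be base64 strings of equal length.
                string blockId = Convert.ToBase64String(BitConverter.GetBytes(i));
                blockIds.Add(blockId);

                await throttle.WaitAsync();
                tasks.Add(Task.Run(async () =>
                {
                    try
                    {
                        using (var ms = new MemoryStream(buffer, 0, read))
                        {
                            await blob.PutBlockAsync(blockId, ms, null);
                        }
                    }
                    finally { throttle.Release(); }
                }));
            }

            await Task.WhenAll(tasks);

            // Nothing is visible until the block list is committed, so a
            // failed block can be retried without corrupting the blob.
            await blob.PutBlockListAsync(blockIds);
        }
    }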

The final application actually worked nicely, but I was surprised not to find high-quality samples for what seems to me the most common of requirements: transferring large files to and from Azure Blob Storage.

Another potentially painful area is Entity Framework (EF), Microsoft’s object-relational mapping library. You do not have to use EF, but many of the samples and libraries expect it. EF can save time, but equally it can add complexity, and it is not always obvious how to accomplish tasks that are easy with a lower-level SQL-based approach. Performance can also be an issue.
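
One useful escape hatch is that EF will happily run hand-written SQL where LINQ gets in the way. A hypothetical example (the Invoices table and its columns are invented for illustration):

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;

    public class InvoiceSummary
    {
        public int CustomerId { get; set; }
        public decimal Total { get; set; }
    }

    public static class ReportQueries
    {
        // An aggregate that is awkward to express in LINQ-to-Entities
        // can be run as plain SQL through EF itself, with the rows
        // materialised into a simple class.
        public static List<InvoiceSummary> TotalsByCustomer(DbContext db)
        {
            return db.Database.SqlQuery<InvoiceSummary>(
                "SELECT CustomerId, SUM(Amount) AS Total " +
                "FROM Invoices GROUP BY CustomerId").ToList();
        }
    }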

 

Ready for business?

If software development were straightforward there would be no need for skilled developers, and the kinds of problems I encountered – changing APIs, inadequate samples and frustrating documentation – are no different from those coders regularly meet on other platforms. Even so, the experience illustrates that, despite its impressive progress, the Azure development platform is not as mature as it first appears.

The link between Office 365 and Azure, via Azure AD, seems to me a strategic one. It makes sense for businesses to have a single directory with which to manage both Office 365 and custom applications, so basing an application on Azure AD adds real business value. If an employee leaves, disabling a single account blocks access to all those applications, protecting the organisation’s data.

Fundamentally, Azure does seem to be well engineered, and in the end that counts for more than any development pain. At the same time, I am waiting impatiently for the development libraries and documentation to mature. Developing for Azure makes sense, but allow plenty of contingency time for working through unexpected obstacles.
