Monday, 22 December 2014

Encrypting and decrypting data in an Azure service using a certificate

I recently had a requirement to encrypt some data stored in a Web.config file for an Azure hosted service that’s accessed over HTTPS.

To help secure information in configuration files, ASP.NET provides a feature called protected configuration, which enables the encryption of sensitive data in a configuration file. The recommended approach is to protect configuration using either the DpapiProtectedConfigurationProvider class or the RsaProtectedConfigurationProvider class, both of which are included in the .NET Framework.
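For example, on a traditional IIS deployment a configuration section can be encrypted in place with the Aspnet_regiis.exe tool. A typical invocation looks like the following (the section name and physical path are illustrative):

aspnet_regiis -pef "connectionStrings" "C:\MyApp" -prov "RsaProtectedConfigurationProvider"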

Unfortunately, these protected configuration providers do not work in Azure. The DpapiProtectedConfigurationProvider class uses a machine-specific key that cannot be transferred to Azure. The RsaProtectedConfigurationProvider enables an RSA key pair to be transferred to different machines in an XML file and imported into a key container, but the XML file is meant to be deleted from the machine once the key has been imported. On Azure, the account running the web role doesn't have permission to delete files in the web root, so the XML file cannot be removed.

I needed a solution that would:

  1. Allow data to be decrypted when running the service locally, and when running the service in Azure.
  2. Allow data to be encrypted and decrypted on any machine in our organization.
  3. Not be so onerous as to prevent the encrypted data from being changed periodically.

The recommended approach for Azure is to use the Pkcs12 custom protected configuration provider and the Aspnet_regiis.exe tool to encrypt sections of the configuration file. The PKCS #12 format enables certificates and their corresponding private keys to be transferred from one machine to another. This provider is similar to the RsaProtectedConfigurationProvider, the difference being that instead of transferring the RSA key pair in an XML file, the transfer occurs using a certificate in .PFX format. This approach has been used by P&P in the Autoscaling Application Block. While this provider works with the built-in ASP.NET tooling that reads protected configuration automatically, it is an onerous solution for configuration that may change periodically.
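For reference, a custom protected configuration provider of this kind is registered in the configProtectedData section of the configuration file. The provider name, thumbprint attribute, and type below are illustrative, based on the shape of the P&P Pkcs12 provider, rather than exact values:

<configProtectedData>
  <providers>
    <!-- The name, thumbprint attribute, and type below are illustrative -->
    <add name="Pkcs12Provider"
         thumbprint="<thumbprint goes here>"
         type="Pkcs12ProtectedConfigurationProvider, Pkcs12CertificateEncryption" />
  </providers>
</configProtectedData>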

This blog post details my solution, which was to use the service's SSL certificate to encrypt data once on a local machine, and then to use the same certificate to decrypt the data both locally and in Azure. The advantage of this approach is that for a service delivered over HTTPS, Azure already has the SSL certificate in its certificate store, and so no additional key transfer is required.

Implementation

To encrypt a piece of data you must first retrieve the SSL certificate from its location in the certificate store, and then encrypt the data using the certificate. The following code example shows this process.

// Requires System.Linq, System.Security.Cryptography.Pkcs,
// System.Security.Cryptography.X509Certificates, and System.Text
private static string Encrypt(string plainText)
{
    // Thumbprint value for the SSL certificate
    var thumb = "<thumbprint goes here>";

    var plainTextBytes = Encoding.UTF8.GetBytes(plainText);
    var contentInfo = new ContentInfo(plainTextBytes);
    var env = new EnvelopedCms(contentInfo);
    X509Store store = null;
    string cipherText = null;

    try
    {
        // Open the local machine's personal certificate store read-only
        store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadOnly);

        // Retrieve the SSL certificate by its thumbprint
        var cert = store.Certificates.Cast<X509Certificate2>().Single(xc => xc.Thumbprint == thumb);

        // Encrypt the enveloped CMS/PKCS #7 message for the certificate's
        // recipient, then base64 encode the result
        env.Encrypt(new CmsRecipient(cert));
        cipherText = Convert.ToBase64String(env.Encode());
    }
    finally
    {
        if (store != null)
            store.Close();
    }
    return cipherText;
}

The X509Store class provides access to the X.509 store, which is the physical store where certificates are persisted and managed. Once the store has been opened in read-only mode, the SSL certificate is retrieved by searching for its thumbprint value. The data to be encrypted is stored in a CMS/PKCS #7 enveloped data structure, and is encrypted using the EnvelopedCms.Encrypt method. It's then base64 encoded before being returned from the method.

Decrypting the data simply reverses the process. The same SSL certificate is retrieved from the certificate store, and is then used to decrypt the data. The following code example shows this process.

// Requires System.Linq, System.Security.Cryptography.Pkcs,
// System.Security.Cryptography.X509Certificates, and System.Text
private static string Decrypt(string cipherText)
{
    // Thumbprint value for the SSL certificate
    var thumb = "<thumbprint goes here>";
    X509Store store = null;
    string plainText = null;

    try
    {
        // Open the local machine's personal certificate store read-only
        store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadOnly);

        // Retrieve the SSL certificate by its thumbprint
        var cert = store.Certificates.Cast<X509Certificate2>().Single(xc => xc.Thumbprint == thumb);

        // Decode the base64 cipher text into the enveloped CMS/PKCS #7 message,
        // then decrypt it using the certificate's private key
        var bytes = Convert.FromBase64String(cipherText);
        var env = new EnvelopedCms();
        env.Decode(bytes);
        env.Decrypt(new X509Certificate2Collection(cert));
        plainText = Encoding.UTF8.GetString(env.ContentInfo.Content);
    }
    finally
    {
        if (store != null)
            store.Close();
    }
    return plainText;
}

The X509Store class again provides access to the X.509 store, with the constructor arguments indicating which part of the certificate store should be opened. Once the store has been opened in read-only mode, the SSL certificate is retrieved by searching for its thumbprint value. The data to be decrypted is converted from its base64 representation back to bytes, before being decoded and decrypted by the EnvelopedCms class. The plain text is then returned from the method.
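Putting the two methods together, a round trip looks like this:

// Encrypt once on a local machine, decrypt at runtime
var cipherText = Encrypt("my secret setting");
var original = Decrypt(cipherText);   // original == "my secret setting"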

This approach to encryption and decryption enables you to store sensitive information in your configuration file in an encrypted form, which can then be decrypted both when running the service locally and when running it in Azure. It offers the advantage that it's not necessary to transfer additional keys to Azure in order to perform decryption, and it's not too onerous a task to change the encrypted data periodically. It can be further strengthened through the use of additional security techniques, including using the SecureString class at appropriate places in the code, as sketched below.
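For example, here's a minimal sketch of moving a decrypted value into a SecureString, so that the plain text copy can be disposed of when it's no longer needed (requires the System.Security namespace):

private static SecureString ToSecureString(string plainText)
{
    var secure = new SecureString();
    foreach (var c in plainText)
        secure.AppendChar(c);

    // Prevent further modification of the secured copy
    secure.MakeReadOnly();
    return secure;
}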

Summary

This blog post has demonstrated how to encrypt and decrypt data using an SSL certificate. My requirement was to encrypt/decrypt data stored in a configuration file, but the technique may equally be used with other data. The advantage of this approach is that it enables you to decrypt data both when running a service locally and when running it in Azure, without the onerous task of copying additional encryption key data to Azure.

Monday, 15 December 2014

Using basic authentication in an Azure Cloud Service

I recently had a requirement to use transport security with basic authentication in a web service hosted in Azure. Basic authentication is a mechanism for an HTTP user agent to provide credentials when making a request to the server, and is supported by all major browsers and servers. It doesn't require cookies, session identifiers, or login pages. Instead, it uses a static, standard HTTP header, which means that no handshakes need to be performed.

IIS provides basic authentication against Windows accounts on the server or through Active Directory, a situation that is further complicated for services hosted in Azure. The following configuration shows how transport security with basic authentication can be specified in a Web.config file.

<bindings>
  <basicHttpsBinding>
    <binding name="TransportSecurity">
      <security mode="Transport">
        <transport clientCredentialType="Basic"/>
      </security>
    </binding>
  </basicHttpsBinding>
</bindings>

However, when you run an Azure cloud service with this configuration you’ll receive the following error message:

The authentication schemes configured on the host ('Anonymous') do not allow those configured on the binding 'BasicHttpsBinding' ('Basic').  Please ensure that the SecurityMode is set to Transport or TransportCredentialOnly.  Additionally, this may be resolved by changing the authentication schemes for this application through the IIS management tool, through the ServiceHost.Authentication.AuthenticationSchemes property, in the application configuration file at the <serviceAuthenticationManager> element, by updating the ClientCredentialType property on the binding, or by adjusting the AuthenticationScheme property on the HttpTransportBindingElement.

The initial problem is that basic authentication is unavailable by default for Azure web roles. It can be enabled by turning on RDP for the virtual machine that the service runs on, remoting in, and then installing and enabling basic authentication in IIS. Alternatively, you can write a PowerShell script that installs basic authentication, and configure it from your Visual Studio solution to run when the web role starts up (a sketch of how such a startup task is registered follows). You then need to create a Windows account on the virtual machine to authenticate against.
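If you do take the startup script route, the script is registered as an elevated startup task in the ServiceDefinition.csdef file. The role and script names below are hypothetical:

<WebRole name="MyWebRole">
  <Startup>
    <!-- install-basic-auth.cmd is a hypothetical script that installs and enables IIS basic authentication -->
    <Task commandLine="install-basic-auth.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
  ...
</WebRole>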

This solution was not ideal. Moving forward, the service could have many different users, and I didn't like the thought of having to create a Windows account for each one. Furthermore, I'm a firm believer in keeping web services as provider agnostic as possible, in order to reduce problems if a service ever needs to be moved to another provider.

An alternative solution would be to use a third-party basic authentication module. This is also not ideal, as it involves the additional effort of identifying a suitable module, and then spending considerable time thoroughly testing it.

In this blog post I’ll outline my solution to this problem, which is to implement your own basic authentication mechanism. Basic authentication uses a simple protocol:

  1. The “username:password” format is used to combine username and password into one string.

  2. The resulting string is then base64 encoded.

  3. The encoded string is sent to the server in an Authorization header with the web request (illustrated in the snippet after this list):
Authorization: Basic <base64 encoded username:password goes here>
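
The following snippet illustrates the convention, using a made-up credential pair:

// "alice:secret" is a made-up example
var headerValue = "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes("alice:secret"));
// headerValue is now "Basic YWxpY2U6c2VjcmV0"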

Implementation

My solution is in three parts:

  1. Configure the service to use transport security but no authentication.
  2. In the service, intercept all web requests and parse out the Authorization header that contains the basic authentication credentials. The extracted credentials can then be compared against the actual credentials.
  3. In the client, intercept all web requests and add an appropriate Authorization header using the convention specified for basic authentication.

The following code shows how to configure the service to use transport security but not authentication.

<bindings>
  <basicHttpsBinding>
    <binding name="TransportSecurity">
      <security mode="Transport">
        <transport clientCredentialType="None"/>
      </security>
    </binding>
  </basicHttpsBinding>
</bindings>

To intercept web requests to the service, I created a class called BasicAuthenticationManager that derives from the ServiceAuthorizationManager class, which provides authorization access checking for service operations. This class overrides the CheckAccessCore method, which checks authorization for the given operation context. In this method you can obtain the headers for the web request, and then parse out the Authorization header. The following code example shows this.

protected override bool CheckAccessCore(OperationContext operationContext)
{
    var authHeader = WebOperationContext.Current.IncomingRequest.Headers["Authorization"];

    if (!string.IsNullOrWhiteSpace(authHeader))
    {
        if (authHeader.StartsWith("Basic"))
        {
            // Strip the "Basic " prefix and decode the base64 "username:password" pair
            var credentials = Encoding.ASCII.GetString(
                Convert.FromBase64String(authHeader.Substring(6))).Split(':');

            // Compare credentials[0] (the username) and credentials[1] (the password)
            // against stored encrypted credentials
            // If equal return true, otherwise false
        }
    }

    // Deny access when the header is missing or the credentials are invalid
    return false;
}

The logic is straightforward: the Authorization header is extracted from the web request, and the credentials are then extracted from the Authorization header. The credentials can be validated using your chosen approach, such as comparing them against encrypted credentials stored in configuration (a sketch of this follows). If the credentials are valid, return true. Otherwise return false, or throw the exception of your choosing, to prevent the service operation from being executed.
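As a sketch of one such validation approach, assuming the encrypted credentials are stored in appSettings (the key names, the AreCredentialsValid helper, and the Decrypt method from the previous post are all assumptions; requires System.Configuration):

// Hypothetical helper; EncryptedUserName and EncryptedPassword are assumed appSettings keys,
// and Decrypt is the certificate-based decryption method from the previous post
private static bool AreCredentialsValid(string[] credentials)
{
    var userName = Decrypt(ConfigurationManager.AppSettings["EncryptedUserName"]);
    var password = Decrypt(ConfigurationManager.AppSettings["EncryptedPassword"]);

    return credentials.Length == 2 &&
           credentials[0] == userName &&
           credentials[1] == password;
}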

The service must then be configured to use the BasicAuthenticationManager class. This can be accomplished by adding a serviceAuthorization element to your Web.config. Note that the class is specified as the fully qualified type name followed by the assembly name (Namespace.ClassName, AssemblyName).

<?xml version="1.0"?>
<configuration>
  <system.serviceModel>
    ...
    <behaviors>
      <serviceBehaviors>
        <behavior>
          ...
          <serviceAuthorization serviceAuthorizationManagerType="FullyQualifiedTypeName, AssemblyName" />
          ...
        </behavior>
      </serviceBehaviors>
    </behaviors>
  </system.serviceModel>
</configuration>

The client that invokes the web service must then be updated to create the Authorization header for every service operation. The following code example shows this.

using (var client = new Proxy.Client())
{
    var userName = "<username goes here>";
    var password = "<password goes here>";

    // Create the authorization header
    var httpRequestProperty = new HttpRequestMessageProperty();
    httpRequestProperty.Headers[HttpRequestHeader.Authorization] = "Basic " +
        Convert.ToBase64String(Encoding.ASCII.GetBytes(userName + ":" + password));

    using (new OperationContextScope(client.InnerChannel))
    {
        // Add the authorization header to every outgoing message
        OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = httpRequestProperty;

        // Make web requests
    }
}

The HttpRequestMessageProperty class provides access to the HTTP request, with its Headers property providing access to the request's HTTP headers. It's then easy to add an Authorization header comprising "Basic " followed by the base64 encoded "username:password" string. The OperationContextScope class is then used to add the Authorization header to every outgoing message.

Summary

This blog post has demonstrated how to use basic authentication in an Azure cloud service, without having to expose the underlying virtual machine that the service runs on, and without having to undertake messy configuration of that virtual machine. It offers more flexibility than IIS basic authentication, as you can specify the credentials in your service, instead of having to rely on basic authentication against a Windows account.