Putting custom data in Azure Table Storage from Episerver

It's completely normal to feel a bit uneasy when thinking about Entity Framework migrations or Episerver's Dynamic Data Store.

If your Episerver site runs on Azure and uses Blob Storage, either self-serviced or on the DXP, Azure Table Storage can be a good alternative that's already available to you.

To illustrate, here is my simple string service example.

public interface ISimpleStorageService
{
  string Get(string partitionKey, string rowKey);

  void Insert(string partitionKey, string rowKey, string value);
}

When implementing it with Azure Table Storage, all the required packages are already installed in your solution (they're dependencies of the EPiServer.Azure package), and the connection string name is fixed, so the class below should work out of the box as a simple key-value store.

using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class AzureSimpleStorageService : ISimpleStorageService
{
  public string Get(string partitionKey, string rowKey)
  {
    var entity = GetEntity(partitionKey, rowKey);
    return entity?.Value;
  }

  public void Insert(string partitionKey, string rowKey, string value)
  {
    if (string.IsNullOrWhiteSpace(partitionKey))
    {
      throw new ArgumentException("PartitionKey must have a value", nameof(partitionKey));
    }

    if (string.IsNullOrWhiteSpace(rowKey))
    {
      throw new ArgumentException("RowKey must have a value", nameof(rowKey));
    }

    var table = GetSimpleStorageTable();
    var updateEntity = GetEntity(partitionKey, rowKey);

    if (updateEntity != null)
    {
      updateEntity.Value = value;
      var updateOperation = TableOperation.Replace(updateEntity);
      table.Execute(updateOperation);
    }
    else
    {
      var storageItem = new SimpleStorageItem { PartitionKey = partitionKey, RowKey = rowKey, Value = value };
      table.Execute(TableOperation.Insert(storageItem));
    }
  }

  private static CloudStorageAccount GetStorageAccount()
  {
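    // EPiServerAzureBlobs is the fixed connection string name used by the EPiServer.Azure package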
    return CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["EPiServerAzureBlobs"].ConnectionString);
  }

  private static CloudTable GetSimpleStorageTable()
  {
    var storageAccount = GetStorageAccount();
    var tableClient = storageAccount.CreateCloudTableClient();

    // Create the table if it doesn't exist
    var table = tableClient.GetTableReference("SimpleStorageItems");
    table.CreateIfNotExists();

    return table;
  }

  private static SimpleStorageItem GetEntity(string partitionKey, string rowKey)
  {
    try
    {
      var table = GetSimpleStorageTable();
      // Use the generic Retrieve so the result is deserialized as SimpleStorageItem
      var retrieveOperation = TableOperation.Retrieve<SimpleStorageItem>(partitionKey, rowKey);
      var retrievedResult = table.Execute(retrieveOperation);
      return (SimpleStorageItem)retrievedResult.Result;
    }
    catch (StorageException ex)
    {
      // With this client a missing entity normally yields a null Result rather
      // than an exception, but guard against 404 responses anyway.
      if (ex.RequestInformation.HttpStatusCode == (int)System.Net.HttpStatusCode.NotFound)
      {
        return null;
      }

      throw;
    }
  }

  private class SimpleStorageItem : TableEntity
  {
    // The .NET client supports other simple property types as well,
    // but not complex types.
    public string Value { get; set; }
  }
}
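
As a side note, here is a minimal sketch of how the service could be registered and consumed. The ServiceConfiguration registration and the PageCommentStore consumer are my own illustration, not part of the original solution.

using EPiServer.ServiceLocation;

// Hypothetical registration; this could also be done in a ConfigurableModule.
[ServiceConfiguration(typeof(ISimpleStorageService), Lifecycle = ServiceInstanceScope.Singleton)]
public class AzureSimpleStorageService : ISimpleStorageService
{
  // ...implementation as above...
}

// Hypothetical consumer showing one way to choose partition and row keys.
public class PageCommentStore
{
  private readonly ISimpleStorageService _storage;

  public PageCommentStore(ISimpleStorageService storage)
  {
    _storage = storage;
  }

  public void SaveComment(string pageId, string userName, string comment)
  {
    // PartitionKey groups related rows; RowKey must be unique within the partition.
    _storage.Insert(pageId, userName, comment);
  }

  public string GetComment(string pageId, string userName)
  {
    return _storage.Get(pageId, userName);
  }
}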

To see what's going on and which tables exist, download Azure Storage Explorer and connect using the connection string; in my experience that usually works better than browsing through your Azure AD authentication. This access is only available for the Integration environment. The code works fine in Preproduction and Production too, but you'll need views or other tooling of your own to look at the data there.

Note that even with the new Deployment API available, I think you still need to create a support ticket if you want to copy table data between environments.
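
If you want to script such a copy yourself, a crude version using the same client library is sketched below. This is only a sketch, and it assumes you can obtain connection strings for both storage accounts (sourceConnectionString and targetConnectionString here), which for Preproduction and Production will likely involve that support ticket anyway.

// Rough sketch: copy every entity from one SimpleStorageItems table to another.
var sourceTable = CloudStorageAccount.Parse(sourceConnectionString)
  .CreateCloudTableClient()
  .GetTableReference("SimpleStorageItems");

var targetTable = CloudStorageAccount.Parse(targetConnectionString)
  .CreateCloudTableClient()
  .GetTableReference("SimpleStorageItems");

targetTable.CreateIfNotExists();

// ExecuteQuery transparently pages through all entities in the source table.
foreach (var entity in sourceTable.ExecuteQuery(new TableQuery<DynamicTableEntity>()))
{
  targetTable.Execute(TableOperation.InsertOrReplace(entity));
}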

This has been a slightly secretive option, but it has been around since the start of the DXC Service, and it has served us well for data we didn't want taking up space in our SQL database.

Published and tagged with these categories: Episerver, Development