Put Stuff in Your Windows Azure Junk Trunk – Windows Azure Worker Role and Storage Queue

Click on Part 1 and Part 2 of this series to review the previous examples and code. Before starting, have the existing code base from those two entries open and ready in Visual Studio 2010. With that in place, I’ll dive right in.

In the JunkTrunk.Storage Project add the following class file and code. This wraps everything the application needs from the queue perspective.

[sourcecode language=”csharp”]
public class Queue : JunkTrunkBase
{
public static void Add(CloudQueueMessage msg)
{
Queue.AddMessage(msg);
}

public static CloudQueueMessage GetNextMessage()
{
return Queue.PeekMessage() != null ? Queue.GetMessage() : null;
}

public static List<CloudQueueMessage> GetAllMessages()
{
var count = Queue.RetrieveApproximateMessageCount();
return Queue.GetMessages(count).ToList();
}

public static void DeleteMessage(CloudQueueMessage msg)
{
Queue.DeleteMessage(msg);
}
}
[/sourcecode]

Once that is done, open up the FileBlobManager.cs file in the Models directory of the JunkTrunk ASP.NET MVC Web Application. In the PutFile() method, add a line toward the very end that enqueues a message. The method, with the added line, should look like this.

[sourcecode language=”csharp”]
public void PutFile(BlobModel blobModel)
{
var blobFileName = string.Format("{0}-{1}", DateTime.Now.ToString("yyyyMMdd"), blobModel.ResourceLocation);
var blobUri = Blob.PutBlob(blobModel.BlobFile, blobFileName);

Table.Add(
new BlobMeta
{
Date = DateTime.Now,
ResourceUri = blobUri,
RowKey = Guid.NewGuid().ToString()
});

Queue.Add(new CloudQueueMessage(blobUri + "$" + blobFileName));
}
[/sourcecode]

Now that we have something adding messages to the queue, we want to process those messages. Open up the JunkTrunk.WorkerRole project and make sure it has the following references.

Windows Azure References

Next create a new class file called PhotoProcessing.cs. First add a method titled ThumbnailCallback to the class with the following code.

[sourcecode language=”csharp”]
public static bool ThumbnailCallback()
{
return false;
}
[/sourcecode]

Next add another method named AddThumbnail that takes a blobUri string and a fileName string as parameters, and give it the following body.

[sourcecode language=”csharp”]
private static void AddThumbnail(string blobUri, string fileName)
{
try
{
var stream = Repository.Blob.GetBlob(blobUri);

if (blobUri.EndsWith(".jpg"))
{
var image = Image.FromStream(stream);
var myCallback = new Image.GetThumbnailImageAbort(ThumbnailCallback);
var thumbnailImage = image.GetThumbnailImage(42, 32, myCallback, IntPtr.Zero);

// Write the thumbnail into its own stream rather than back over the
// source stream, then upload it under a "thumbnail-" prefixed name.
using (var thumbnailStream = new MemoryStream())
{
thumbnailImage.Save(thumbnailStream, ImageFormat.Jpeg);
thumbnailStream.Position = 0;
Repository.Blob.PutBlob(thumbnailStream, "thumbnail-" + fileName);
}
}
else
{
Repository.Blob.PutBlob(stream, fileName);
}
}
catch (Exception ex)
{
Trace.WriteLine(ex.ToString(), "Error");
}
}
[/sourcecode]

The last method to add to the class is the Run() method.

[sourcecode language=”csharp”]
public static void Run()
{
var queueMessage = Repository.Queue.GetNextMessage();

while (queueMessage != null)
{
var message = queueMessage.AsString.Split('$');
if (message.Length == 2)
{
AddThumbnail(message[0], message[1]);
}

Repository.Queue.DeleteMessage(queueMessage);
queueMessage = Repository.Queue.GetNextMessage();
}
}
[/sourcecode]

Now open up the WorkerRole.cs file, add the following code to the existing methods, and add the additional event handler method below.

[sourcecode language=”csharp”]
public override void Run()
{
Trace.WriteLine("Junk Trunk Worker entry point called", "Information");

while (true)
{
PhotoProcessing.Run();

Thread.Sleep(60000);
Trace.WriteLine("Working", "Junk Trunk Worker Role is active and running.");
}
}

public override bool OnStart()
{
ServicePointManager.DefaultConnectionLimit = 12;
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
RoleEnvironment.Changing += RoleEnvironmentChanging;

CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
{
configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
RoleEnvironment.Changed += (sender, arg) =>
{
if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
.Any((change) => (change.ConfigurationSettingName == configName)))
{
if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
{
RoleEnvironment.RequestRecycle();
}
}
};
});

Storage.JunkTrunkSetup.CreateContainersQueuesTables();

return base.OnStart();
}

private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
if (!e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange)) return;

Trace.WriteLine("Working", "Environment Change: " + e.Changes.ToList());
e.Cancel = true;
}
[/sourcecode]

At this point everything needed to kick off photo processing using Windows Azure Storage Queue as the tracking mechanism is ready. I’ll be following up these blog entries with some additional entries regarding refactoring and streamlining what we have going on. I might even go all out and add some more functionality or some such craziness! So I hope that was helpful, and keep reading. I’ll have more bits of rambling and other trouble coming down the blob pipeline soon! Cheers!

Put Stuff in Your Windows Azure Junk Trunk – ASP.NET MVC Application

If you haven’t read Part 1 of this series (Part 3 is linked here as well), you’ll need to in order to follow along with the JunkTrunk repository. Open the solution if you haven’t already and navigate to the Models folder within the ASP.NET MVC JunkTrunk Project. In that folder add two class files titled FileItemModel.cs and BlobModel.cs. Add the following properties to the FileItemModel.

[sourcecode language=”csharp”]
public class FileItemModel
{
public Guid ResourceId { get; set; }
public string ResourceLocation { get; set; }
public DateTime UploadedOn { get; set; }
}
[/sourcecode]

Add the following property to the BlobModel and inherit from the FileItemModel Class.

[sourcecode language=”csharp”]
public class BlobModel : FileItemModel
{
public Stream BlobFile { get; set; }
}
[/sourcecode]

Next add a new class file titled FileBlobManager.cs and add the following code to the class.

[sourcecode language=”csharp”]
public class FileBlobManager
{
public void PutFile(BlobModel blobModel)
{
var blobFileName = string.Format("{0}-{1}", DateTime.Now.ToString("yyyyMMdd"), blobModel.ResourceLocation);
var blobUri = Blob.PutBlob(blobModel.BlobFile, blobFileName);

Table.Add(
new BlobMeta
{
Date = DateTime.Now,
ResourceUri = blobUri,
RowKey = Guid.NewGuid().ToString()
});
}

public BlobModel GetFile(Guid key)
{
var blobMetaData = Table.GetMetaData(key);
var blobFileModel =
new BlobModel
{
UploadedOn = blobMetaData.Date,
BlobFile = Blob.GetBlob(blobMetaData.ResourceUri),
ResourceLocation = blobMetaData.ResourceUri
};
return blobFileModel;
}

public List<FileItemModel> GetBlobFileList()
{
var blobList = Table.GetAll();

return blobList.Select(
metaData => new FileItemModel
{
ResourceId = Guid.Parse(metaData.RowKey),
ResourceLocation = metaData.ResourceUri,
UploadedOn = metaData.Date
}).ToList();
}

public void Delete(string identifier)
{
Table.DeleteMetaDataAndBlob(Guid.Parse(identifier));
}
}
[/sourcecode]

Now that the repository, manager, and models are all complete, the focus can turn to the controller and the views of the application. At this point the breakdown of each data element within the data transfer objects, and how that data moves back and forth, becomes very important to the overall architecture. One thing to remember is that the application should not pass raw data such as URIs or other long, easily tampered-with strings between the browser and the server. Instead, pass Guids (or, if necessary, integer identifiers) that identify the data being created, updated, or deleted, and resolve them server-side. This simplifies the UI and decreases the chance of various injection or tampering attacks. The next step is to open up the HomeController and add code to complete each of the functional steps for the site.

[sourcecode language=”csharp”]
[HandleError]
public class HomeController : Controller
{
public ActionResult Index()
{
ViewData["Message"] = "Welcome to the Windows Azure Blob Storing ASP.NET MVC Web Application!";
var fileBlobManager = new FileBlobManager();
var fileItemModels = fileBlobManager.GetBlobFileList();
return View(fileItemModels);
}

public ActionResult About()
{
return View();
}

public ActionResult Upload()
{
return View();
}

public ActionResult UploadFile()
{
foreach (string inputTagName in Request.Files)
{
var file = Request.Files[inputTagName];

if (file.ContentLength > 0)
{
var blobFileModel =
new BlobModel
{
BlobFile = file.InputStream,
UploadedOn = DateTime.Now,
ResourceLocation = Path.GetFileName(file.FileName)
};

var fileBlobManager = new FileBlobManager();
fileBlobManager.PutFile(blobFileModel);
}
}

return RedirectToAction("Index", "Home");
}

public ActionResult Delete(string identifier)
{
var fileBlobManager = new FileBlobManager();
fileBlobManager.Delete(identifier);
return RedirectToAction("Index", "Home");
}
}
[/sourcecode]

The view for the Upload action hasn’t been created just yet, so that part of the site won’t work if you run the application now. But before I add a view for this action, I’ll cover what has been created for the controller.

The Index action I’ve changed moderately so it retrieves a list of the blobs stored in Windows Azure Blob Storage. The list is pulled from the manager class created earlier and passed into the view for rendering. I also, purely for cosmetic reasons, changed the default message passed into the ViewData so the home page displays something more relevant to the application.

The About action I just left as is. The Upload action simply returns the view we’ll create shortly.

The UploadFile Action checks for files within the request, builds up the model and then puts the model into storage via the manager.

The last method is the Delete action, which instantiates the manager and calls a delete against storage. That call traces back through the repository, finds the table entity and the blob related to the specific item, and deletes both from the respective Windows Azure Table and Blob storage.

The next step is to get the various views updated or added to enable the upload and deletion of the blob items.

Add a view titled Upload.aspx to the Home Folder of the Views within the JunkTrunk Project.

Upload View

First change the inherits value for the view from System.Web.Mvc.ViewPage to System.Web.Mvc.ViewPage. After that add the following HTML to the view.

[sourcecode language=”html”]
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
Upload an Image
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
<h2>
Upload</h2>
<% using (Html.BeginForm("UploadFile", "Home", FormMethod.Post,
new { enctype = "multipart/form-data" }))
{%>
<%: Html.ValidationSummary(true) %>
<fieldset>
<legend>Fields</legend>

<div class="editor-label">
Select file to upload to Windows Azure Blob Storage:
</div>
<div class="editor-field">
<input type="file" id="fileUpload" name="fileUpload" />
</div>
<p>
<input type="submit" value="Upload" />
</p>
</fieldset>
<% } %>
<div>
<%: Html.ActionLink("Back to List", "Index") %>
</div>
</asp:Content>
[/sourcecode]

After adding the HTML, change the HTML in the Index.aspx view to add an action link for navigating to the upload page and to show the list of uploaded blobs. First change the inherits value from System.Web.Mvc.ViewPage to System.Web.Mvc.ViewPage<IEnumerable<FileItemModel>>. The rest of the changes are listed below.

[sourcecode language=”html”]
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
Home Page
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
<h2>
<%: ViewData["Message"] %></h2>
<p>
<%: Html.ActionLink("Upload", "Upload", "Home") %>
a file to Windows Azure Blob Storage.
</p>
Existing Files:<br />
<table>
<tr>
<th>
</th>
<th>
FileName
</th>
<th>
DownloadedOn
</th>
</tr>
<% foreach (var item in Model)
{ %>
<tr>
<td>
<%: Html.ActionLink("Delete", "Delete",
new { identifier = item.ResourceId })%>
</td>
<td>
<%: item.ResourceLocation %>
</td>
<td>
<%: String.Format("{0:g}", item.UploadedOn) %>
</td>
</tr>
<% } %>
</table>
</asp:Content>
[/sourcecode]

Make sure the Windows Azure project is set as the startup project and press F5 to run the application. The following page should display first.

The Home Page o' Junk Trunk

Click the Upload link to go to the upload page.

Selecting an Image to Put in The Junk Trunk

On the upload page select an image to upload and then click Upload. This uploads the image and redirects back to the home page.

The Image in the Junk Trunk

On the home page the list should now show the uploaded blob image. Click Delete to remove the image; both the table entity and the blob itself will be removed from Windows Azure Storage. To confirm the data and image are being uploaded, open up the Server Explorer within Visual Studio 2010.

Visual Studio 2010 Server Explorer

View the data by opening up the Windows Azure Storage tree. Double click on either of the storage mediums to view table or blob data.

Windows Azure Storage

Put Stuff in Your Windows Azure Junk Trunk – Repository Base

Alright, so the title is rather stupid, but hey, it’s fun!  🙂

I set up this project to provide some basic functionality with Windows Azure Storage. I wanted to use each of the three storage mediums: Table, Blob, and Queue, and this example will cover each of them. The application will upload and store images, provide a listing, do some worker processing, and handle deletion of the images and associated metadata. This entry is Part 1 of the series, with the following subsequent entries:

Part 2: Put Stuff in Your Windows Azure Junk Trunk – ASP.NET MVC Application
Part 3: Put Stuff in Your Windows Azure Junk Trunk – Windows Azure Worker Role and Storage Queue

Title aside, schedule laid out, description of the project completed, I’ll dive right in!

Putting Stuff in Your Junk Trunk

Create a new Windows Azure Project called PutJunkInIt.  (Click any screenshot for the full size, and also note some of the text may be off – I had to recreate a number of these images)

Windows Azure PutJunkInIt

Next select the ASP.NET MVC 2 Web Application and also a Worker Role and name the projects JunkTrunk and JunkTrunk.WorkerRole.

Choosing Windows Azure Projects

In the next dialog choose to create the unit test project and click OK.

Create Unit Test Project

After the project is created the following projects are set up within the PutJunkInIt Solution. There should be a JunkTrunk project, a JunkTrunk.WorkerRole project, the PutJunkInIt Windows Azure deployment project, and a JunkTrunk.Tests project.

Solution Explorer

Next add a Windows Class Library Project and title it JunkTrunk.Storage.

Windows Class Library

Add references to the Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.StorageClient assemblies to the JunkTrunk.Storage Project. Rename the Class1.cs file and class to JunkTrunkBase. Now open up the JunkTrunkBase.cs file in the JunkTrunk.Storage Project and add the following fields and static constructor to the class.

[sourcecode language=”csharp”]
public const string QueueName = "metadataqueue";
public const string BlobContainerName = "photos";
public const string TableName = "MetaData";
static JunkTrunkBase()
{
CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
{
configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
RoleEnvironment.Changed
+= (sender, arg) =>
{
if (!arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
.Any(change => (change.ConfigurationSettingName == configName)))
return;
if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
{
RoleEnvironment.RequestRecycle();
}
};
});
}
[/sourcecode]

After that add the following blob container and reference methods.

[sourcecode language=”csharp”]
protected static CloudBlobContainer Blob
{
get { return BlobClient.GetContainerReference(BlobContainerName); }
}
private static CloudBlobClient BlobClient
{
get
{
return Account.CreateCloudBlobClient();
}
}
[/sourcecode]

Now add code for the table & queue client and reference methods.

[sourcecode language=”csharp”]
protected static CloudQueue Queue
{
get { return QueueClient.GetQueueReference(QueueName); }
}
private static CloudQueueClient QueueClient
{
get { return Account.CreateCloudQueueClient(); }
}
protected static CloudTableClient Table
{
get { return Account.CreateCloudTableClient(); }
}
protected static CloudStorageAccount Account
{
get
{
return
CloudStorageAccount
.FromConfigurationSetting("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
}
}
[/sourcecode]

This class now provides the basic underpinnings needed to retrieve the storage account connection information from configuration. The base class can then provide that connection information for connecting to the table, queue, and blob mediums.
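
To make that concrete, here is a minimal sketch (mine, not from the original posts) of the kind of account and clients the base class resolves at runtime. The "UseDevelopmentStorage=true" value is the standard development storage connection string, and the three factory calls are the same ones the Blob, Queue, and Table properties wrap.

[sourcecode language=”csharp”]
// Hypothetical illustration only: resolve a storage account the same way the
// base class does, but pointed at the local development storage emulator.
var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");

// The same factory methods the base class properties use.
var blobClient = account.CreateCloudBlobClient();   // base URI http://127.0.0.1:10000/devstoreaccount1
var queueClient = account.CreateCloudQueueClient(); // base URI http://127.0.0.1:10001/devstoreaccount1
var tableClient = account.CreateCloudTableClient(); // base URI http://127.0.0.1:10002/devstoreaccount1

Console.WriteLine(blobClient.BaseUri);
[/sourcecode]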

The next step is to create some initialization code that creates the container, queue, and table if they don’t exist in Windows Azure. Add a new class file titled JunkTrunkSetup.cs to the JunkTrunk.Storage Project.

JunkTrunkSetup

[sourcecode language=”csharp”]
public class JunkTrunkSetup : JunkTrunkBase
{
public static void CreateContainersQueuesTables()
{
Blob.CreateIfNotExist();
Queue.CreateIfNotExist();
Table.CreateTableIfNotExist(TableName);
}
}
[/sourcecode]
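
The setup method needs to run once at startup. In Part 3 the worker role calls it from OnStart(); for the web application, one reasonable spot (this is a sketch of mine, not from the original posts, and the JunkTrunk.Storage namespace qualifier is an assumption based on the project name) is Application_Start in the MVC project’s Global.asax.cs.

[sourcecode language=”csharp”]
public class MvcApplication : System.Web.HttpApplication
{
    // Default route registration from the ASP.NET MVC 2 project template.
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }

    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();
        RegisterRoutes(RouteTable.Routes);

        // Make sure the blob container, queue, and table exist before the
        // application handles its first request.
        JunkTrunk.Storage.JunkTrunkSetup.CreateContainersQueuesTables();
    }
}
[/sourcecode]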

Next add the System.Data.Services.Client Assembly to the project.  After adding the assembly add two new classes and name them BlobMeta.cs and Table.cs. Add the following code to the Table.cs Class.

[sourcecode language=”csharp”]
public class Table
{
public static string PartitionKey;
}
[/sourcecode]

Next add another class file and name it BlobMetaContext.cs and add the following code.

[sourcecode language=”csharp”]
public class BlobMetaContext : TableServiceContext
{
public BlobMetaContext(string baseAddress, StorageCredentials credentials)
: base(baseAddress, credentials)
{
IgnoreResourceNotFoundException = true;
}
public IQueryable<BlobMeta> Data
{
get { return CreateQuery<BlobMeta>(JunkTrunkBase.TableName); }
}
public void Add(BlobMeta data)
{
data.RowKey = data.RowKey.Replace("/", "_");
BlobMeta original = (from e in Data
where e.RowKey == data.RowKey
&& e.PartitionKey == Table.PartitionKey
select e).FirstOrDefault();
if (original != null)
{
Update(original, data);
}
else
{
AddObject(JunkTrunkBase.TableName, data);
}
SaveChanges();
}
public void Update(BlobMeta original, BlobMeta data)
{
original.Date = data.Date;
original.ResourceUri = data.ResourceUri;
UpdateObject(original);
SaveChanges();
}
}
[/sourcecode]

Now add the following code to the BlobMeta Class.

[sourcecode language=”csharp”]
public class BlobMeta : TableServiceEntity
{
public BlobMeta()
{
PartitionKey = Table.PartitionKey;
}
public DateTime Date { get; set; }
public string ResourceUri { get; set; }
}
[/sourcecode]

At this point, everything should build. Give it a go to be sure nothing got keyed in wrong (or copied in wrong). Once assured the build is still solid, add the Blob.cs Class to the project.

[sourcecode language=”csharp”]
public class Blob : JunkTrunkBase
{
public static string PutBlob(Stream stream, string fileName)
{
var blobRef = Blob.GetBlobReference(fileName);
blobRef.UploadFromStream(stream);
return blobRef.Uri.ToString();
}
public static Stream GetBlob(string blobAddress)
{
var stream = new MemoryStream();
Blob.GetBlobReference(blobAddress)
.DownloadToStream(stream);
// Rewind the stream so callers can read the blob from the beginning.
stream.Position = 0;
return stream;
}
public static Dictionary<string, string> GetBlobList()
{
var blobs = Blob.ListBlobs();
var blobDictionary =
blobs.ToDictionary(
listBlobItem => listBlobItem.Uri.ToString(),
listBlobItem => listBlobItem.Uri.ToString());
return blobDictionary;
}
public static void DeleteBlob(string blobAddress)
{
Blob.GetBlobReference(blobAddress).DeleteIfExists();
}
}
[/sourcecode]
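
Before moving on, here is a quick usage sketch of the Blob wrapper (my own illustration; the file path and blob name are made up). It simply round-trips a file through the PutBlob and GetBlob methods above.

[sourcecode language=”csharp”]
// Hypothetical usage of the Blob class above.
using (var fileStream = File.OpenRead(@"C:\temp\sample.jpg"))
{
    // Upload the file; the returned URI is what gets stored in table storage.
    string blobUri = Blob.PutBlob(fileStream, "sample.jpg");

    // Download it again; GetBlob returns a MemoryStream rewound to the start.
    using (var downloaded = Blob.GetBlob(blobUri))
    {
        Console.WriteLine("Stored {0} ({1} bytes)", blobUri, downloaded.Length);
    }
}
[/sourcecode]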

Next, finalize the Table class with the following changes and additions.

[sourcecode language=”csharp”]
public class Table : JunkTrunkBase
{
public const string PartitionKey = "BlobMeta";
public static void Add(BlobMeta data)
{
Context.Add(data);
}
public static BlobMeta GetMetaData(Guid key)
{
return (from e in Context.Data
where e.RowKey == key.ToString() &&
e.PartitionKey == PartitionKey
select e).SingleOrDefault();
}
public static void DeleteMetaDataAndBlob(Guid key)
{
var ctxt = new BlobMetaContext(
Account.TableEndpoint.AbsoluteUri,
Account.Credentials);
var entity = (from e in ctxt.Data
where e.RowKey == key.ToString() &&
e.PartitionKey == PartitionKey
select e).SingleOrDefault();
ctxt.DeleteObject(entity);
Repository.Blob.DeleteBlob(entity.ResourceUri);
ctxt.SaveChanges();
}
public static List<BlobMeta> GetAll()
{
return (from e in Context.Data
select e).ToList();
}
public static BlobMetaContext Context
{
get
{
return new BlobMetaContext(
Account.TableEndpoint.AbsoluteUri,
Account.Credentials);
}
}
}
[/sourcecode]

The final file to add is the Queue.cs Class File. Add that and then add the following code to the class.

[sourcecode language=”csharp”]
public class Queue : JunkTrunkBase
{
public static void Add(CloudQueueMessage msg)
{
Queue.AddMessage(msg);
}
public static CloudQueueMessage GetNextMessage()
{
return Queue.PeekMessage() != null ? Queue.GetMessage() : null;
}
public static List<CloudQueueMessage> GetAllMessages()
{
var count = Queue.RetrieveApproximateMessageCount();
return Queue.GetMessages(count).ToList();
}
public static void DeleteMessage(CloudQueueMessage msg)
{
Queue.DeleteMessage(msg);
}
}
[/sourcecode]

That now gives us a fully functional set of classes that utilize the Windows Azure SDK. In Part 2 I’ll start building on top of them using the ASP.NET MVC 2 Web Project. Part 2 will be published tomorrow, so stay tuned.

Windows Azure (w/ AWS) Presentation Coming Up

I have a presentation coming up next week on the 10th. If you’re interested in cloud computing, specifically around storage, then you should tune in. I’ll be covering the basics and some of the architectural ideas, uses, and more around Windows Azure Storage and the comparable Amazon Web Services storage services. I’ll also be noting a few of my ongoing projects that you might, if you’re into cloud bits, get a kick out of or want to join.

To tune in to the presentation swing over to the https://www.clicktoattend.com/invitation.aspx?code=147809 link. There is registration information on the page. The presentation will start at 1:00 PM PST on the 10th and run until about 1:45 PM. We’ll make the meeting live around 12:45 PM for early arrivals, and after about 1:45 PM there will be a question and answer session. I hope to have a good bit of conversation afterward discussing the uses, architectures, and patterns around storage use with cloud services.

I hope you’ll join me.  -Adron