Put Stuff in Your Windows Azure Junk Trunk – Windows Azure Worker Role and Storage Queue

See Part 1 and Part 2 of this series to review the previous examples and code. Before starting, have the existing code base created in those two entries open and ready in Visual Studio 2010. With that in place, let's dive right in.

In the JunkTrunk.Storage Project add the following class file and code. This provides everything the application needs from the queue perspective.

[sourcecode language=”csharp”]
public class Queue : JunkTrunkBase
{
    public static void Add(CloudQueueMessage msg)
    {
        Queue.AddMessage(msg);
    }

    public static CloudQueueMessage GetNextMessage()
    {
        return Queue.PeekMessage() != null ? Queue.GetMessage() : null;
    }

    public static List<CloudQueueMessage> GetAllMessages()
    {
        var count = Queue.RetrieveApproximateMessageCount();
        return Queue.GetMessages(count).ToList();
    }

    public static void DeleteMessage(CloudQueueMessage msg)
    {
        Queue.DeleteMessage(msg);
    }
}
[/sourcecode]

Once that is done open up the FileBlobManager.cs file in the Models directory of the JunkTrunk ASP.NET MVC Web Application. In the PutFile() method add a Queue.Add call toward the very end of the method. The method, with the added line of code, should look like this.

[sourcecode language=”csharp”]
public void PutFile(BlobModel blobModel)
{
    var blobFileName = string.Format("{0}-{1}", DateTime.Now.ToString("yyyyMMdd"), blobModel.ResourceLocation);
    var blobUri = Blob.PutBlob(blobModel.BlobFile, blobFileName);

    Table.Add(
        new BlobMeta
        {
            Date = DateTime.Now,
            ResourceUri = blobUri,
            RowKey = Guid.NewGuid().ToString()
        });

    Queue.Add(new CloudQueueMessage(blobUri + "$" + blobFileName));
}
[/sourcecode]

Now that we have something adding messages to the queue, we want to process those messages. Open up the JunkTrunk.WorkerRole Project and make sure the following references are included in the project.

Windows Azure References
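The reference list isn't spelled out here, but based on the code below the worker role needs the Windows Azure ServiceRuntime, StorageClient, and Diagnostics assemblies, System.Drawing for the thumbnail work, and a project reference to JunkTrunk.Storage. A sketch of the using directives the photo-processing class below ends up needing (the Repository alias is an assumption about how the storage project's namespace is pulled in):

[sourcecode language="csharp"]
using System;
using System.Diagnostics;
using System.Drawing;                        // Image, GetThumbnailImage
using System.Drawing.Imaging;                // ImageFormat
using System.IO;                             // MemoryStream
using Microsoft.WindowsAzure.StorageClient;  // CloudQueueMessage
using Repository = JunkTrunk.Storage;        // assumed namespace for the Part 1 repository classes
[/sourcecode]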

Next create a new class file called PhotoProcessing.cs. First add a method to the class titled ThumbnailCallback with the following code.

[sourcecode language=”csharp”]
public static bool ThumbnailCallback()
{
return false;
}
[/sourcecode]

Next add another method with a blobUri string and filename string as parameters. Then add the following code block to it.

[sourcecode language=”csharp”]
private static void AddThumbnail(string blobUri, string fileName)
{
    try
    {
        var stream = Repository.Blob.GetBlob(blobUri);

        if (blobUri.EndsWith(".jpg"))
        {
            var image = Image.FromStream(stream);
            var myCallback = new Image.GetThumbnailImageAbort(ThumbnailCallback);
            var thumbnailImage = image.GetThumbnailImage(42, 32, myCallback, IntPtr.Zero);

            // Save the thumbnail into a fresh stream rather than back over the
            // original image stream, then rewind it before uploading.
            var thumbnailStream = new MemoryStream();
            thumbnailImage.Save(thumbnailStream, ImageFormat.Jpeg);
            thumbnailStream.Position = 0;
            Repository.Blob.PutBlob(thumbnailStream, "thumbnail-" + fileName);
        }
        else
        {
            Repository.Blob.PutBlob(stream, fileName);
        }
    }
    catch (Exception ex)
    {
        Trace.WriteLine(ex.ToString(), "Error");
    }
}
[/sourcecode]

The last method to add to the class is the Run() method.

[sourcecode language=”csharp”]
public static void Run()
{
    var queueMessage = Repository.Queue.GetNextMessage();

    while (queueMessage != null)
    {
        var message = queueMessage.AsString.Split('$');
        if (message.Length == 2)
        {
            AddThumbnail(message[0], message[1]);
        }

        Repository.Queue.DeleteMessage(queueMessage);
        queueMessage = Repository.Queue.GetNextMessage();
    }
}
[/sourcecode]

Now open up the WorkerRole.cs file, add the following code to the existing methods, and add the additional event handler method shown below.

[sourcecode language=”csharp”]
public override void Run()
{
    Trace.WriteLine("Junk Trunk Worker entry point called", "Information");

    while (true)
    {
        PhotoProcessing.Run();

        Thread.Sleep(60000);
        Trace.WriteLine("Working", "Junk Trunk Worker Role is active and running.");
    }
}

public override bool OnStart()
{
    ServicePointManager.DefaultConnectionLimit = 12;
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
    RoleEnvironment.Changing += RoleEnvironmentChanging;

    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
        RoleEnvironment.Changed += (sender, arg) =>
        {
            if (arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                .Any((change) => (change.ConfigurationSettingName == configName)))
            {
                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    RoleEnvironment.RequestRecycle();
                }
            }
        };
    });

    Storage.JunkTrunkSetup.CreateContainersQueuesTables();

    return base.OnStart();
}

private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
    if (!e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange)) return;

    Trace.WriteLine("Working", "Environment Change: " + e.Changes.ToList());
    e.Cancel = true;
}
[/sourcecode]

At this point everything needed to kick off photo processing, using the Windows Azure Storage Queue as the tracking mechanism, is ready. I'll be following up these blog entries with some additional entries on refactoring and streamlining what we have going on. I might even go all out and add some more functionality or some such craziness! I hope that was helpful, and keep reading. I'll have more bits of rambling and other trouble coming down the blob pipeline soon! Cheers!

Put Stuff in Your Windows Azure Junk Trunk – ASP.NET MVC Application

If you haven't read Part 1 of this series, you'll need to in order to follow along with the JunkTrunk Repository. Open the solution up if you haven't already and navigate to the Models folder within the ASP.NET MVC JunkTrunk Project. In that folder add two class files titled FileItemModel.cs and BlobModel.cs. Add the following properties to the FileItemModel Class.

[sourcecode language=”csharp”]
public class FileItemModel
{
    public Guid ResourceId { get; set; }
    public string ResourceLocation { get; set; }
    public DateTime UploadedOn { get; set; }
}
[/sourcecode]

Add the following property to the BlobModel and inherit from the FileItemModel Class.

[sourcecode language=”csharp”]
public class BlobModel : FileItemModel
{
    public Stream BlobFile { get; set; }
}
[/sourcecode]

Next add a new class file titled FileBlobManager.cs and add the following code to the class.

[sourcecode language=”csharp”]
public class FileBlobManager
{
    public void PutFile(BlobModel blobModel)
    {
        var blobFileName = string.Format("{0}-{1}", DateTime.Now.ToString("yyyyMMdd"), blobModel.ResourceLocation);
        var blobUri = Blob.PutBlob(blobModel.BlobFile, blobFileName);

        Table.Add(
            new BlobMeta
            {
                Date = DateTime.Now,
                ResourceUri = blobUri,
                RowKey = Guid.NewGuid().ToString()
            });
    }

    public BlobModel GetFile(Guid key)
    {
        var blobMetaData = Table.GetMetaData(key);
        var blobFileModel =
            new BlobModel
            {
                UploadedOn = blobMetaData.Date,
                BlobFile = Blob.GetBlob(blobMetaData.ResourceUri),
                ResourceLocation = blobMetaData.ResourceUri
            };
        return blobFileModel;
    }

    public List<FileItemModel> GetBlobFileList()
    {
        var blobList = Table.GetAll();

        return blobList.Select(
            metaData => new FileItemModel
            {
                ResourceId = Guid.Parse(metaData.RowKey),
                ResourceLocation = metaData.ResourceUri,
                UploadedOn = metaData.Date
            }).ToList();
    }

    public void Delete(string identifier)
    {
        Table.DeleteMetaDataAndBlob(Guid.Parse(identifier));
    }
}
[/sourcecode]

Now that the repository, management, and models are all complete, the focus can turn to the controller and the views of the application. At this point the breakdown of each data element within the data transfer object, and how the data moves back and forth, becomes very important to the overall architecture. One thing to remember is that the application should not pass data such as URIs or other long, easy-to-hack strings back and forth. This is a good place to use Guids, or if necessary integer values, to identify the data being created, updated, or deleted. That simplifies the UI and decreases the chance of various injection attacks. The next step is to open up the HomeController and add code to complete each of the functional steps for the site.

[sourcecode language=”csharp”]
[HandleError]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        ViewData["Message"] = "Welcome to the Windows Azure Blob Storing ASP.NET MVC Web Application!";
        var fileBlobManager = new FileBlobManager();
        var fileItemModels = fileBlobManager.GetBlobFileList();
        return View(fileItemModels);
    }

    public ActionResult About()
    {
        return View();
    }

    public ActionResult Upload()
    {
        return View();
    }

    public ActionResult UploadFile()
    {
        foreach (string inputTagName in Request.Files)
        {
            var file = Request.Files[inputTagName];

            if (file.ContentLength > 0)
            {
                var blobFileModel =
                    new BlobModel
                    {
                        BlobFile = file.InputStream,
                        UploadedOn = DateTime.Now,
                        ResourceLocation = Path.GetFileName(file.FileName)
                    };

                var fileBlobManager = new FileBlobManager();
                fileBlobManager.PutFile(blobFileModel);
            }
        }

        return RedirectToAction("Index", "Home");
    }

    public ActionResult Delete(string identifier)
    {
        var fileBlobManager = new FileBlobManager();
        fileBlobManager.Delete(identifier);
        return RedirectToAction("Index", "Home");
    }
}
[/sourcecode]

The view hasn’t been created for the Upload just yet, so the method will cause a build error at this point. But before I add a view for this action, I’ll cover what has been created for the controller.

The Index Action I've changed moderately to return a list of the blobs stored in Windows Azure Blob Storage. That list is pulled from the manager class created earlier and passed into the view for rendering. I also, just for cosmetic reasons, changed the default message passed into the ViewData so that the application displays something more relevant.

The About action I just left as is. The Upload action simply returns the view we'll create shortly.

The UploadFile Action checks for files within the request, builds up the model and then puts the model into storage via the manager.

The last method is the Delete Action, which instantiates the manager and calls a delete against storage. That call in turn traces back through, finds the table entity and the blob related to the specific identifier, and deletes both from the respective Windows Azure Table and Blob storage.

The next step is to get the various views updated or added to enable the upload and deletion of the blob items.

Add a view titled Upload.aspx to the Home Folder of the Views within the JunkTrunk Project.

Upload View

First change the inherits value for the view from the default System.Web.Mvc.ViewPage to a strongly typed System.Web.Mvc.ViewPage if the view needs a model; the upload form below posts files directly, so the default works as well. After that add the following HTML to the view.

[sourcecode language=”html”]
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
Upload an Image
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
<h2>
Upload</h2>
<% using (Html.BeginForm("UploadFile", "Home", FormMethod.Post,
new { enctype = "multipart/form-data" }))
{%>
<%: Html.ValidationSummary(true) %>
<fieldset>
<legend>Fields</legend>

<div class="editor-label">
Select file to upload to Windows Azure Blob Storage:
</div>
<div class="editor-field">
<input type="file" id="fileUpload" name="fileUpload" />
</div>
<p>
<input type="submit" value="Upload" />
</p>
</fieldset>
<% } %>
<div>
<%: Html.ActionLink("Back to List", "Index") %>
</div>
</asp:Content>
[/sourcecode]

After adding the HTML, change the HTML in the Index.aspx View to have an action link for navigating to the upload page and to display the list of uploaded blobs. First change the inherits value from System.Web.Mvc.ViewPage to System.Web.Mvc.ViewPage<IEnumerable<FileItemModel>>, since the controller passes the view a list of FileItemModel objects. The rest of the changes are listed below.

[sourcecode language=”html”]
<asp:Content ID="Content1" ContentPlaceHolderID="TitleContent" runat="server">
Home Page
</asp:Content>
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server">
<h2>
<%: ViewData["Message"] %></h2>
<p>
<%: Html.ActionLink("Upload", "Upload", "Home") %>
a file to Windows Azure Blob Storage.
</p>
Existing Files:<br />
<table>
<tr>
<th>
</th>
<th>
FileName
</th>
<th>
UploadedOn
</th>
</tr>
<% foreach (var item in Model)
{ %>
<tr>
<td>
<%: Html.ActionLink("Delete", "Delete",
new { identifier = item.ResourceId })%>
</td>
<td>
<%: item.ResourceLocation %>
</td>
<td>
<%: String.Format("{0:g}", item.UploadedOn) %>
</td>
</tr>
<% } %>
</table>
</asp:Content>
[/sourcecode]
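For reference, the resulting Page directive at the top of Index.aspx should look roughly like this (a sketch — the JunkTrunk.Models namespace on the model type is an assumption based on the project layout):

[sourcecode language="html"]
<%@ Page Language="C#" MasterPageFile="~/Views/Shared/Site.Master"
    Inherits="System.Web.Mvc.ViewPage<IEnumerable<JunkTrunk.Models.FileItemModel>>" %>
[/sourcecode]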

Make sure the Windows Azure Project is set as the startup project and click on F5 to run the application. The following page should display first.

The Home Page o' Junk Trunk

Click through on it to go to the upload page.

Selecting an Image to Put in The Junk Trunk

On the upload page select an image to upload and then click on Upload. This will upload the image and redirect back to the home page.

The Image in the Junk Trunk

On the home page the list should now show the uploaded blob image. Click Delete to delete the image; when deleted, the table entry and the blob itself are removed from Windows Azure Storage. To see that the data & image are being uploaded, open up the Server Explorer within Visual Studio 2010.

Visual Studio 2010 Server Explorer

View the data by opening up the Windows Azure Storage tree. Double click on either of the storage mediums to view table or blob data.

Windows Azure Storage

Put Stuff in Your Windows Azure Junk Trunk – Repository Base

Alright, so the title is rather stupid, but hey, it’s fun!  🙂

This project I set up to provide some basic functionality with Windows Azure Storage. I wanted to use each of the three storage mediums: Table, Blob, and Queue, and this example will cover each of them. The application will upload and store images, provide a listing, do some worker processing, and delete the images & associated metadata. This entry is part 1 of the series, with the subsequent entries following shortly.

Title aside, schedule laid out, description of the project completed, I’ll dive right in!

Putting Stuff in Your Junk Trunk

Create a new Windows Azure Project called PutJunkInIt.  (Click any screenshot for the full size, and also note some of the text may be off – I had to recreate a number of these images)

Windows Azure PutJunkInIt

Next select the ASP.NET MVC 2 Web Application and also a Worker Role and name the projects JunkTrunk and JunkTrunk.WorkerRole.

Choosing Windows Azure Projects

In the next dialog choose to create the unit test project and click OK.

Create Unit Test Project

After the project is created the following projects are set up within the PutJunkInIt Solution. There should be a JunkTrunk Project, a JunkTrunk.WorkerRole Project, the Windows Azure deployment project, and a JunkTrunk.Tests Project.

Solution Explorer

Next add a Windows Class Library Project and title it JunkTrunk.Storage.

Windows Class Library

Add references to the Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.StorageClient assemblies in the JunkTrunk.Storage Project. Rename the Class1.cs file and class to JunkTrunkBase. Now open up the JunkTrunkBase.cs file in the JunkTrunk.Storage Project and first add the following fields and constructor to the class.

[sourcecode language=”csharp”]
public const string QueueName = "metadataqueue";
public const string BlobContainerName = "photos";
public const string TableName = "MetaData";

static JunkTrunkBase()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
        RoleEnvironment.Changed
            += (sender, arg) =>
            {
                if (!arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
                    .Any(change => (change.ConfigurationSettingName == configName)))
                    return;

                if (!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
                {
                    RoleEnvironment.RequestRecycle();
                }
            };
    });
}
[/sourcecode]
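If the compiler complains about missing types at this point, these are the using directives the base class and the rest of the storage project rely on (a sketch based on the 1.x SDK assemblies referenced above):

[sourcecode language="csharp"]
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Microsoft.WindowsAzure;                 // CloudStorageAccount
using Microsoft.WindowsAzure.ServiceRuntime;  // RoleEnvironment and the change events
using Microsoft.WindowsAzure.StorageClient;   // CloudBlobClient, CloudQueueClient, CloudTableClient
[/sourcecode]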

After that add the following blob container and reference methods.

[sourcecode language=”csharp”]
protected static CloudBlobContainer Blob
{
    get { return BlobClient.GetContainerReference(BlobContainerName); }
}

private static CloudBlobClient BlobClient
{
    get
    {
        return Account.CreateCloudBlobClient();
    }
}
[/sourcecode]

Now add code for the table & queue client and reference methods.

[sourcecode language=”csharp”]
protected static CloudQueue Queue
{
    get { return QueueClient.GetQueueReference(QueueName); }
}

private static CloudQueueClient QueueClient
{
    get { return Account.CreateCloudQueueClient(); }
}

protected static CloudTableClient Table
{
    get { return Account.CreateCloudTableClient(); }
}

protected static CloudStorageAccount Account
{
    get
    {
        return
            CloudStorageAccount
                .FromConfigurationSetting("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
    }
}
[/sourcecode]

This class now provides the basic underpinnings needed to retrieve the appropriate information from the configuration.  This base class can then provide that connection information to connect to the table, queue, or blob mediums.
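The connection string name used above has to exist in the role's service configuration for FromConfigurationSetting to find it. The Diagnostics plugin import normally declares it for you; in ServiceConfiguration.cscfg the value ends up looking roughly like this (a sketch — UseDevelopmentStorage=true points at the local storage emulator):

[sourcecode language="xml"]
<!-- ServiceConfiguration.cscfg, inside the <Role> element for the web or worker role -->
<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="UseDevelopmentStorage=true" />
</ConfigurationSettings>
[/sourcecode]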

The next step is to create some initialization code to get the containers created if they don't exist in Windows Azure. Add a new class file, JunkTrunkSetup.cs, to the JunkTrunk.Storage Project.

JunkTrunkSetup

[sourcecode language=”csharp”]
public class JunkTrunkSetup : JunkTrunkBase
{
    public static void CreateContainersQueuesTables()
    {
        Blob.CreateIfNotExist();
        Queue.CreateIfNotExist();
        Table.CreateTableIfNotExist(TableName);
    }
}
[/sourcecode]

Next add the System.Data.Services.Client Assembly to the project.  After adding the assembly add two new classes and name them BlobMeta.cs and Table.cs. Add the following code to the Table.cs Class.

[sourcecode language=”csharp”]
public class Table
{
    public static string PartitionKey;
}
[/sourcecode]

Next add another class file and name it BlobMetaContext.cs and add the following code.

[sourcecode language=”csharp”]
public class BlobMetaContext : TableServiceContext
{
    public BlobMetaContext(string baseAddress, StorageCredentials credentials)
        : base(baseAddress, credentials)
    {
        IgnoreResourceNotFoundException = true;
    }

    public IQueryable<BlobMeta> Data
    {
        get { return CreateQuery<BlobMeta>(JunkTrunkBase.TableName); }
    }

    public void Add(BlobMeta data)
    {
        data.RowKey = data.RowKey.Replace("/", "_");

        BlobMeta original = (from e in Data
                             where e.RowKey == data.RowKey
                                   && e.PartitionKey == Table.PartitionKey
                             select e).FirstOrDefault();

        if (original != null)
        {
            Update(original, data);
        }
        else
        {
            AddObject(JunkTrunkBase.TableName, data);
        }

        SaveChanges();
    }

    public void Update(BlobMeta original, BlobMeta data)
    {
        original.Date = data.Date;
        original.ResourceUri = data.ResourceUri;
        UpdateObject(original);
        SaveChanges();
    }
}
[/sourcecode]

Now add the following code to the BlobMeta Class.

[sourcecode language=”csharp”]
public class BlobMeta : TableServiceEntity
{
    public BlobMeta()
    {
        PartitionKey = Table.PartitionKey;
    }

    public DateTime Date { get; set; }
    public string ResourceUri { get; set; }
}
[/sourcecode]

At this point, everything should build. Give it a go to be sure nothing got keyed in wrong (or copied in wrong). Once assured the build is still solid, add the Blob.cs Class to the project.

[sourcecode language=”csharp”]
public class Blob : JunkTrunkBase
{
    public static string PutBlob(Stream stream, string fileName)
    {
        var blobRef = Blob.GetBlobReference(fileName);
        blobRef.UploadFromStream(stream);
        return blobRef.Uri.ToString();
    }

    public static Stream GetBlob(string blobAddress)
    {
        var stream = new MemoryStream();
        Blob.GetBlobReference(blobAddress)
            .DownloadToStream(stream);
        // Rewind the stream so callers can read the downloaded content from the beginning.
        stream.Position = 0;
        return stream;
    }

    public static Dictionary<string, string> GetBlobList()
    {
        var blobs = Blob.ListBlobs();
        var blobDictionary =
            blobs.ToDictionary(
                listBlobItem => listBlobItem.Uri.ToString(),
                listBlobItem => listBlobItem.Uri.ToString());
        return blobDictionary;
    }

    public static void DeleteBlob(string blobAddress)
    {
        Blob.GetBlobReference(blobAddress).DeleteIfExists();
    }
}
[/sourcecode]

After that finalize the Table Class with the following changes and additions.

[sourcecode language=”csharp”]
public class Table : JunkTrunkBase
{
    public const string PartitionKey = "BlobMeta";

    public static void Add(BlobMeta data)
    {
        Context.Add(data);
    }

    public static BlobMeta GetMetaData(Guid key)
    {
        return (from e in Context.Data
                where e.RowKey == key.ToString() &&
                      e.PartitionKey == PartitionKey
                select e).SingleOrDefault();
    }

    public static void DeleteMetaDataAndBlob(Guid key)
    {
        var ctxt = new BlobMetaContext(
            Account.TableEndpoint.AbsoluteUri,
            Account.Credentials);

        var entity = (from e in ctxt.Data
                      where e.RowKey == key.ToString() &&
                            e.PartitionKey == PartitionKey
                      select e).SingleOrDefault();

        ctxt.DeleteObject(entity);
        Repository.Blob.DeleteBlob(entity.ResourceUri);
        ctxt.SaveChanges();
    }

    public static List<BlobMeta> GetAll()
    {
        return (from e in Context.Data
                select e).ToList();
    }

    public static BlobMetaContext Context
    {
        get
        {
            return new BlobMetaContext(
                Account.TableEndpoint.AbsoluteUri,
                Account.Credentials);
        }
    }
}
[/sourcecode]

The final file to add is the Queue.cs Class File. Add that and then add the following code to the class.

[sourcecode language=”csharp”]
public class Queue : JunkTrunkBase
{
    public static void Add(CloudQueueMessage msg)
    {
        Queue.AddMessage(msg);
    }

    public static CloudQueueMessage GetNextMessage()
    {
        return Queue.PeekMessage() != null ? Queue.GetMessage() : null;
    }

    public static List<CloudQueueMessage> GetAllMessages()
    {
        var count = Queue.RetrieveApproximateMessageCount();
        return Queue.GetMessages(count).ToList();
    }

    public static void DeleteMessage(CloudQueueMessage msg)
    {
        Queue.DeleteMessage(msg);
    }
}
[/sourcecode]
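Before wrapping up, here's a rough sketch of how these pieces fit together once they're called from a role (a hypothetical round trip, assuming the JunkTrunk.Storage namespace and that the code runs inside a role or the compute emulator so RoleEnvironment is available; the file path is just an example):

[sourcecode language="csharp"]
// Hypothetical smoke test for the repository classes; run this from a role's OnStart or Run.
JunkTrunkSetup.CreateContainersQueuesTables();          // make sure the container, queue, and table exist

using (var file = File.OpenRead(@"C:\temp\sample.jpg")) // any local image will do
{
    // Store the blob, record its metadata, and queue it up for later processing.
    var blobUri = Blob.PutBlob(file, "sample.jpg");
    Table.Add(new BlobMeta
    {
        Date = DateTime.Now,
        ResourceUri = blobUri,
        RowKey = Guid.NewGuid().ToString()
    });
    Queue.Add(new CloudQueueMessage(blobUri + "$" + "sample.jpg"));
}

// Later, a worker can drain the queue.
var message = Queue.GetNextMessage();
if (message != null)
{
    Trace.WriteLine(message.AsString, "Information");
    Queue.DeleteMessage(message);
}
[/sourcecode]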

That now gives us a fully functional set of classes that utilize the Windows Azure SDK. In Part 2 I'll start building on top of them using the ASP.NET MVC 2 Web Project. Part 2 will be published tomorrow, so stay tuned.

Gritty Technical Info on Windows Azure Worker Roles

In the last blog entry, “Gritty Technical Info on Windows Azure Web Roles“, I covered the creation and startup of a web role within the Windows Azure Development Fabric and observing the web role with the Windows Azure Compute Emulator.  In this blog entry I’ll cover the worker role.

Open the Windows Azure Web Role Sample Solution. Right click on the Windows Azure project and select New Worker Role Project.

New Worker Role Project...

Once the worker role project SampleWorkerRole is added, the solution explorer will display the project just like the web role, albeit with fewer files.

Solution Explorer

Next right click on the SampleWorkerRole instance in the Windows Azure Web Role Sample and select properties.  Now set the instance count to 2 and the VM size to extra large.

SampleWorkerRole Properties

Press F5 to run the application. Now when the application executes, the 6 web role instances and the 2 worker role instances will start.

Windows Azure Compute Emulator

Examine the first worker role instance.

SampleWorkerRole Instance Status

The worker role instance displays a number of new diagnostic messages in a similar way to the web role.  The first half of the trace diagnostics are configuration and instance messages.  The second half of the trace diagnostics are status messages that are printed from the worker role running.

Open up the code in the WorkerRole.cs file in the SampleWorkerRole Project.  As a comparison open the WebRole.cs file in the SampleWebRole Project.

[sourcecode language=”csharp”]
using System.Diagnostics;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace SampleWorkerRole
{
    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            Trace.WriteLine("SampleWorkerRole entry point called", "Information");

            while (true)
            {
                Thread.Sleep(10000);
                Trace.WriteLine("Working", "Information");
            }
        }

        public override bool OnStart()
        {
            ServicePointManager.DefaultConnectionLimit = 12;
            return base.OnStart();
        }
    }
}
[/sourcecode]

In the WorkerRole.cs file the WorkerRole Class inherits from RoleEntryPoint. In that class the Run and OnStart Methods are overridden to provide some basic trace information and to set the default connection limit.

The Run method has a basic while loop that updates every 10000 milliseconds, which displays on the Windows Azure Compute Emulator as “Information: Working”.

[sourcecode language=”csharp”]
using Microsoft.WindowsAzure.ServiceRuntime;

namespace SampleWebRole
{
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            return base.OnStart();
        }
    }
}
[/sourcecode]

In the code for the WebRole.cs file there is very little actually going on.  Take a closer look at the OnStart method override.  Technically this code doesn’t even need to be in the generated file and can be deleted, but provides a good starting point to add any other code needed in the start of the web role.

Next I’ll add some code in the worker role to provide a telnet prompt that responds with worker role information.  To work through this exercise completely download a telnet client like Putty (http://www.chiark.greenend.org.uk/~sgtatham/putty/).

If Visual Studio 2010 is no longer open, launch it and open the Windows Azure Web Role Sample Solution. Right click on the SampleWorkerRole Role in the Windows Azure Web Role Sample Project and select Properties. Click on the Endpoints tab of the properties window, click Add Endpoint, and name it TelnetServiceEndpoint.

Endpoint
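Behind that dialog the endpoint ends up in ServiceDefinition.csdef; the entry should look roughly like this (a sketch — the tcp protocol and port 1234 match the emulator output shown further down):

[sourcecode language="xml"]
<!-- ServiceDefinition.csdef, inside the <WorkerRole name="SampleWorkerRole"> element -->
<Endpoints>
  <InputEndpoint name="TelnetServiceEndpoint" protocol="tcp" port="1234" />
</Endpoints>
[/sourcecode]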

Add a private member and override the Run method with the following code.

[sourcecode language=”csharp”]
private readonly AutoResetEvent _connectionWait = new AutoResetEvent(false);

public override void Run()
{
    Trace.WriteLine("Starting Telnet Service…", "Information");

    TcpListener listener;
    try
    {
        listener = new TcpListener(
            RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TelnetServiceEndpoint"].IPEndpoint) { ExclusiveAddressUse = false };
        listener.Start();

        Trace.WriteLine("Started Telnet Service.", "Information");
    }
    catch (SocketException se)
    {
        Trace.Write("Telnet Service could not start: " + se.Message, "Error");
        return;
    }

    while (true)
    {
        listener.BeginAcceptTcpClient(HandleAsyncConnection, listener);
        _connectionWait.WaitOne();
    }
}
[/sourcecode]

After adding this code, add the following code for the role information to write to a stream.

[sourcecode language=”csharp”]
private static void WriteRoleInformation(Guid clientId, StreamWriter writer)
{
    writer.WriteLine("---- Current Client ID, Date & Time ----");
    writer.WriteLine("Current date: " + DateTime.Now.ToLongDateString() + " " + DateTime.Now.ToLongTimeString());
    writer.WriteLine("Connection ID: " + clientId);
    writer.WriteLine();

    writer.WriteLine("---- Current Role Instance Information ----");
    writer.WriteLine("Role ID: " + RoleEnvironment.CurrentRoleInstance.Id);
    writer.WriteLine("Role Count: " + RoleEnvironment.Roles.Count);
    writer.WriteLine("Deployment ID: " + RoleEnvironment.DeploymentId);
    writer.WriteLine();

    writer.WriteLine("---- Instance Endpoints ----");

    foreach (KeyValuePair<string, RoleInstanceEndpoint> instanceEndpoint in RoleEnvironment.CurrentRoleInstance.InstanceEndpoints)
    {
        writer.WriteLine("Instance Endpoint Key: " + instanceEndpoint.Key);

        RoleInstanceEndpoint roleInstanceEndpoint = instanceEndpoint.Value;

        writer.WriteLine("Instance Endpoint IP: " + roleInstanceEndpoint.IPEndpoint);
        writer.WriteLine("Instance Endpoint Protocol: " + roleInstanceEndpoint.Protocol);
        writer.WriteLine("Instance Endpoint Type: " + roleInstanceEndpoint);
        writer.WriteLine();
    }
}
[/sourcecode]

Now add a handle method for the asynchronous call.

[sourcecode language=”csharp”]
private void HandleAsyncConnection(IAsyncResult result)
{
    var listener = (TcpListener)result.AsyncState;
    var client = listener.EndAcceptTcpClient(result);
    _connectionWait.Set();

    var clientId = Guid.NewGuid();
    Trace.WriteLine("Connection ID: " + clientId, "Information");

    var netStream = client.GetStream();
    var reader = new StreamReader(netStream);
    var writer = new StreamWriter(netStream);
    writer.AutoFlush = true;

    var input = string.Empty;
    while (input != "3")
    {
        writer.WriteLine(" 1) Display Worker Role Information");
        writer.WriteLine(" 2) Recycle");
        writer.WriteLine(" 3) Quit");
        writer.Write("Enter your choice: ");

        input = reader.ReadLine();
        writer.WriteLine();

        switch (input)
        {
            case "1":
                WriteRoleInformation(clientId, writer);
                break;
            case "2":
                RoleEnvironment.RequestRecycle();
                break;
        }

        writer.WriteLine();
    }

    client.Close();
}
[/sourcecode]

Finally override the OnStart() method and set up the RoleEnvironmentChanging event handler.

[sourcecode language=”csharp”]
public override bool OnStart()
{
    ServicePointManager.DefaultConnectionLimit = 12;

    DiagnosticMonitor.Start("DiagnosticsConnectionString");

    RoleEnvironment.Changing += RoleEnvironmentChanging;

    return base.OnStart();
}

private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
    if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
    {
        e.Cancel = true;
    }
}
[/sourcecode]

Now run the role by hitting F5. When the application runs, open up the Windows Azure Compute Emulator to check for the endpoint and verify the instances of the role are running.

Endpoint displayed in the Windows Azure Compute Emulator

The Service Name should be Windows_Azure_Web_Role, with an Interface Type of SampleWorkerRole running on the tcp://*:1234 URL, and the IP should be 127.0.0.1:1234.

SampleWorkerRole

Click on one of the instances, which should be green, and verify that each has started up appropriately.

PuTTY

Start up a telnet application, such as PuTTY, and enter the information as shown in the screenshot above.

Telnet

Start the telnet session connecting to the Windows Azure Worker Role. The prompt with the three choices will display. Choose recycle and then display the worker role information a few times, just to make sure all the information is available and the worker role telnet service is working. Select 3 to exit, and the prompt should close while the role continues to run in the development fabric.

Gritty Technical Info on Windows Azure Web Roles

This is a follow-up to the previous blog entry I wrote pertaining to Windows Azure Roles. I wanted to cover the bases on the various technical aspects of creating a Windows Azure Web Role & Worker Role in Visual Studio 2010. Without further interruption, let's dive right in. Start Visual Studio 2010 and start a new project: File, New, Project opens the New Project dialog.

Windows Azure Project

Select a cloud template type and name your project. Click OK and the New Windows Azure Project dialog will appear with the role types to choose from.

Windows Azure Project Templates

Select an ASP.NET MVC Web Application, name it appropriately, and then click OK.  When prompted for a test project select yes and click OK.  When the solution is finished generating from the chosen templates there will be a SampleWebRole ASP.NET MVC Web Application, the test project titled SampleWebRole.Tests, and a Windows Azure Project titled Windows Azure Web Role Sample.

Solution

After that run the application to ensure that the Development Fabric & other parts of the web application start up appropriately.

With the web application still running, click on the Development Fabric icon in the status bar of Windows 7 and select Show Compute Emulator UI.

Show Compute Emulator UI

The Windows Azure Compute Emulator will display. Click through the Service Deployments tree until you can see each individual instance (the little green lights should be showing). The screenshot below shows this tree opened with one of the instances selected to view the status trace.

Windows Azure Compute Emulator

Press Shift+F5 to stop the web application. In the Solution Explorer right click on the SampleWebRole under the Windows Azure Web Role Sample Project and select Properties.

Properties for SampleWebRole

Under the configuration tab of the SampleWebRole Properties set the Instance Count to 6 and the VM Size to Extra Large.

Windows Azure Instance Properties
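Those two properties simply write through to the service model files; the result should look roughly like this (a sketch of just the relevant fragments):

[sourcecode language="xml"]
<!-- ServiceDefinition.csdef: the VM size lives on the role element -->
<WebRole name="SampleWebRole" vmsize="ExtraLarge">
  ...
</WebRole>

<!-- ServiceConfiguration.cscfg: the instance count lives on the matching role element -->
<Role name="SampleWebRole">
  <Instances count="6" />
  ...
</Role>
[/sourcecode]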

Now select F5 to run the web application again in the Windows Azure Development Fabric.  The Windows Azure Compute Emulator (if it is closed right click back on the status icon to launch it again) will now display each of the 6 instances launching under the SampleWebRole.

Windows Azure Compute Emulator

Click on one of the green lights to show that specific instance status in the primary window area.

Windows Azure Compute Instance 2

When you select a specific instance the status of that instance is displayed. The instance shown in the screenshot above has a number of events being recorded by the diagnostics, MonAgentHost, and the runtime. This particular instance had gone through a rough start. During the lifecycle of a Windows Azure Web, Worker, or CGI Role, a number of events similar to these can occur.

Read through the first few lines. These lines show that another agent was running, which could be any number of things that conflicted with this web role starting up cleanly. Eventually the web role was able to start up appropriately, as shown in the runtime lines stating that OnStart() is called and then completes, with Run() executing next.

Reading further through the diagnostics the web role eventually requests a shutdown and then prepares for that shutdown pending the exit of the parent process 6924.
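That startup-and-shutdown sequence maps directly onto the RoleEntryPoint overrides. If you want to hook the shutdown side of the lifecycle yourself, an OnStop override is the place to do it (a hypothetical sketch, not part of the generated sample):

[sourcecode language="csharp"]
public override void OnStop()
{
    // Called when the role is being shut down; there is only a limited window
    // to finish in-flight work before the process is taken down.
    Trace.WriteLine("SampleWebRole OnStop called", "Information");

    // Flush or clean up anything in progress here, then let the base class finish.
    base.OnStop();
}
[/sourcecode]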

These types of events are commonplace when reviewing the actions a web role goes through; generally, don't get too alarmed by any particular set of messages. As long as the role has green lights on the instances, things are going swimmingly. When the lights change to purple or red, it is important to start paying close attention to the diagnostics.

Windows Azure Worker Roles

In the next blog entry (Part II) I want to show how to add a worker role and how to analyze the activities within the role. The worker role is somewhat different from a web role. The primary difference is that a worker role is built around providing background compute work, while a web role is built around serving web requests. Think of the worker role as something similar to a Windows Service, which runs continuously to execute jobs & other processes, often backend-type processes. A web role is built to host Silverlight and web applications such as ASP.NET or ASP.NET MVC.

Part II published on Monday the 17th.
