A SQL Server .NET ASP.NET MVC RESTful Web Services Facade – Part I

Did I get enough of the acronyms and key words in the header?  It looks like soup!  :O

This is a somewhat messy project to build a prototype layer around SQL Server. The reason for this, shockingly, is to allow a SQL Server to be used by frameworks and systems that normally don’t or can’t access the database directly. In my particular scenario we’re working on getting Ruby on Rails running with JRuby in a Windows Environment. Because we will need to utilize a lot of SQL Server Databases, it seemed like a great idea to build out a layer over the SQL Server (or Servers) so that a Ruby on Rails Web App, ASP.NET MVC, or even a PHP or pure Javascript Application could access the data in the database. What better way to do that than to create a RESTful Web Services Facade over the database.

Some of you might be thinking “Why not use RIA Services?!?!?! Are you mad!!” Well, there is a big problem: RIA Services doesn’t work against SQL Server 2000 or SQL Server 2005, which is the database technology that this particular requirement dictated. Well, now that you have context, I’ll dig straight in to what I did building this prototype out.
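To make the idea concrete before digging in, here is a rough sketch of what a single read endpoint of such a facade could look like in ASP.NET MVC. The controller name, connection string, and query are placeholders for illustration, not the actual prototype code:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Web.Mvc;

public class PersonController : Controller
{
    // GET /Person/Index returns the Person table as JSON, so a Rails, PHP,
    // or JavaScript client can consume the data without touching SQL Server.
    public JsonResult Index()
    {
        var results = new List<Dictionary<string, object>>();

        // Placeholder connection string - pull from config in real code.
        using (var connection = new SqlConnection(
            "Data Source=.;Initial Catalog=SomeExistingOrMigratedDatabase;Integrated Security=True"))
        using (var command = new SqlCommand("SELECT Id, Name, DateOfBirth FROM Person", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var row = new Dictionary<string, object>();
                    for (var i = 0; i < reader.FieldCount; i++)
                        row[reader.GetName(i)] = reader.GetValue(i);
                    results.Add(row);
                }
            }
        }

        return Json(results, JsonRequestBehavior.AllowGet);
    }
}
```

A POST/PUT/DELETE set of actions over the same tables rounds out the facade, and since it’s plain HTTP + JSON, any of the client stacks mentioned above can hit it.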

Kick Out a SQL Server Database Project

I need some data, and a database, with just some of the standard junk you’d expect in a production database. One of the best ways to throw together a database in a really short amount of time, with data, is to use a SQL Server Database Project.

New Database Project (Click for larger image)

You might see this and think, “But you said that the facade is against a SQL Server 2000 or 2005 database!” Well, it is, but to get a database running locally and have this project type work, I’m using my local SQL Server 2008 Express installation. However, I’m limiting myself to data types primarily available to SQL Server 2000 and 2005. So no worries, this works just fine against those archaic databases.  😛

First I ran the following script to create the database and some sample tables with various data types.

[sourcecode language="sql"]
DROP DATABASE SomeExistingOrMigratedDatabase
GO
CREATE DATABASE SomeExistingOrMigratedDatabase
GO
USE SomeExistingOrMigratedDatabase
GO
IF EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Person_Village]') AND parent_object_id = OBJECT_ID(N'[dbo].[Person]'))
ALTER TABLE [dbo].[Person] DROP CONSTRAINT [FK_Person_Village]
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Person]') AND type in (N'U'))
DROP TABLE [dbo].[Person]
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[SomeFlatDenormalizedDataTable]') AND type in (N'U'))
DROP TABLE [dbo].[SomeFlatDenormalizedDataTable]
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Village]') AND type in (N'U'))
DROP TABLE [dbo].[Village]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Village]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[Village](
[Id] [uniqueidentifier] NOT NULL,
[Village] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
CONSTRAINT [PK_Village] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
END
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[SomeFlatDenormalizedDataTable]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[SomeFlatDenormalizedDataTable](
[Id] [uniqueidentifier] NOT NULL,
[StarzDate] [datetime] NOT NULL,
[Numerals] [int] NULL,
[Numberals] [int] NULL,
[Monies] [decimal](14, 4) NOT NULL,
[Day] [int] NOT NULL,
[Month] [int] NOT NULL,
[Year] [int] NOT NULL,
[BigNonsense] [ntext] COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Flotsam] [float] NULL,
[Jetsam] [float] NULL,
[SmallishText] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[BiggishText] [nvarchar](2999) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
CONSTRAINT [PK_SomeFlatDenormalizedDataTable] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
END
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Person]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[Person](
[Id] [uniqueidentifier] NOT NULL,
[Name] [nvarchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[DateOfBirth] [datetime] NOT NULL,
[VillageId] [uniqueidentifier] NULL,
CONSTRAINT [PK_Person] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
END
GO
IF NOT EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Person_Village]') AND parent_object_id = OBJECT_ID(N'[dbo].[Person]'))
ALTER TABLE [dbo].[Person] WITH CHECK ADD CONSTRAINT [FK_Person_Village] FOREIGN KEY([VillageId])
REFERENCES [dbo].[Village] ([Id])
GO
IF EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Person_Village]') AND parent_object_id = OBJECT_ID(N'[dbo].[Person]'))
ALTER TABLE [dbo].[Person] CHECK CONSTRAINT [FK_Person_Village]
[/sourcecode]

Once the database and tables are created, import the database into the database project. To do this, right click on the Database Project and select “Import Database Objects and Settings…” from the context menu.

Import Database Objects and Settings...

Select the database just created and click Start. Once the script generation is done, navigate into the project directories and you will see that the following scripts have been created.

Generated Scripts (click for larger image)

Next create a new data generation plan in the Data Generation Plans folder (notice I already cheated and have one in the above image).

Creating a Data Generation Plan

Open up the file this creates (I called mine BuildSomeData.dgen). In the file, note I selected the relationship between the Village and Person tables, and set the ratio to 60:1. When you change the amount of data in the Village table, it then automatically updates how much data will be generated for the Person table.

Data Generation Plan

When all that is done, hit F5, select the database and the data will be generated. That gets us a database with data to use as an existing source. From here I’ll jump into creating the actual Facade Layer.

NOTES: Once you generate data, depending on how much you decided to generate, you may want to see how big your database is with the sp_helpdb stored procedure (e.g. EXEC sp_helpdb 'SomeExistingOrMigratedDatabase'), which is available at least as far back as SQL Server 2000.

Code for this project is available here: https://github.com/Adron/ExistingSqlServerProject

Windows Azure SDK Unit Testing Dilemma — F5DD Plz K Thx Bye

I’m a huge advocate for high quality code. I will admit I don’t always get to write, or am always able to write high quality code. But day in and out I make my best effort at figuring out the best way to write solid, high quality, easy to maintain, easy to read code.

Over the last year or so I’ve been working with Windows Azure (Amazon Web Services and other Cloud/Utility Platforms & Infrastructure also). One of the largest gaps that I’ve experienced when working with Windows Azure is the gross disregard for unit testing and especially unit testing in a Test Driven Development style way. The design of the SDK doesn’t make unit testing a high priority, and instead focuses mostly on what one might call F5 & Run Development.

I’ll be the first to stand up and point out that F5 Driven Development (for more on this, check out Jeff Schumacher‘s Blog Entry) is one of the slowest & most distracting ways to build high quality code. I’d also be one to admit that F5 Development encourages poor design and development. A developer has to juggle far too many things to waste time hitting F5 every few seconds to assure that the build is running and code changes, additions, or deletions have been made correctly. If a developer avoids running the application when forced into F5 Development, the tendency is to produce a lot of code, most likely not refactored or tested, during each run of the application. The list of reasons not to develop this way can get long pretty quick. A developer needs to be able to write a test, implement the code, and run the test without a framework launching the development fabric, or worse, being forced to skip the test and run code that launches the whole development fabric framework.

Now don’t get me wrong, the development fabric is freaking AWESOME!! It is one of the things that really sets Windows Azure apart from other platforms and infrastructure models that one can develop to. But the level of work and effort makes effectively, cleanly, and intelligently unit testing code against Windows Azure with the development fabric almost impossible.

But with that context, I’m on a search to find some effective ways, with the current SDK limitations and frustrations, to write unit tests and encourage test driven design (TDD) or behaviour driven design (BDD) against Windows Azure, preferably using the SDK.

So far I’ve found the following methods of doing TDD against Windows Azure.

  • Don’t use the SDK. The easiest way to go TDD or BDD against Windows Azure without being tightly bound to the SDK & Development Fabric is to ignore the SDK altogether and use regular service calls against the Windows Azure service end points. The problem with this, however, is that it basically requires one to rewrite all the things that the SDK wraps (albeit with better design principles). This is very time consuming but truly gives one absolute control over what they’re writing and also releases one from the issues/nuances that the Windows Azure SDK (1.3 comes to mind) has had.
  • Abstract, abstract, and abstract with a lot of stubbing, mocking, more stubbing, and some more abstractions underneath all of that to make sure the development fabric doesn’t kick off every time the tests are run.  I don’t want to abstract something just to fake, stub, or mock it.  The level of indirection needed gets a bit absurd because of the design issues with the SDK.  The big problem with this design process to move forward with TDD and BDD is that it requires the SDK to basically be rewritten as a whole virtual stubbed, faked, and mocked layer. Reminds me of many of the reasons the Entity Framework is so difficult to work with for testing (has the EF been cleaned up, opened up, and those nasty sealed classes removed yet??)
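As a rough sketch of what that second approach ends up looking like, here’s the kind of hand-rolled indirection I mean: an interface over the one slice of RoleEnvironment the code actually needs, a stub for tests, and the class under test written against the interface. Every name here is mine, not part of the SDK; in production a thin adapter would forward these members to RoleEnvironment.IsAvailable and RoleEnvironment.GetConfigurationSettingValue.

```csharp
using System;

// The only slice of the Azure environment this code cares about.
public interface IRoleEnvironment
{
    bool IsAvailable { get; }
    string GetConfigurationSetting(string name);
}

// Test stub - constructing and using it never touches the development fabric.
public class StubRoleEnvironment : IRoleEnvironment
{
    public bool IsAvailable { get { return true; } }

    public string GetConfigurationSetting(string name)
    {
        return name == "DiagnosticsConnectionString"
            ? "UseDevelopmentStorage=true"
            : null;
    }
}

// The class under test depends on the interface, never on the SDK directly.
public class DiagnosticsConfig
{
    private readonly IRoleEnvironment _environment;

    public DiagnosticsConfig(IRoleEnvironment environment)
    {
        _environment = environment;
    }

    public string ConnectionString
    {
        get
        {
            if (!_environment.IsAvailable)
                throw new InvalidOperationException("Not running in a role instance.");
            return _environment.GetConfigurationSetting("DiagnosticsConnectionString");
        }
    }
}
```

A unit test then just news up the stub and asserts on DiagnosticsConfig — fast and fabric-free — which is exactly the point, and also exactly the layer of busywork the SDK forces you to write yourself.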

Now I’ll admit, sometimes I miss the obvious things and maybe there is a magic “build tests real easy right here” button for Windows Azure, but I haven’t found it.  I’d love to hear what else people are doing to enable good design principles around Windows Azure’s SDK. Any thoughts, ideas, or things I ought to try would be absolutely great – I’d love to read them. Please do comment!

Gritty Technical Info on Windows Azure Worker Roles

In the last blog entry, “Gritty Technical Info on Windows Azure Web Roles“, I covered the creation and startup of a web role within the Windows Azure Development Fabric and observing the web role with the Windows Azure Compute Emulator.  In this blog entry I’ll cover the worker role.

Open the Windows Azure Web Role Sample Solution.  Right click on the Windows Azure project and select New Worker Role Project.

New Worker Role Project...

Once the worker role project SampleWorkerRole is added, the solution explorer will display the project just like the web role, albeit with fewer files.

Solution Explorer

Next right click on the SampleWorkerRole instance in the Windows Azure Web Role Sample and select properties.  Now set the instance count to 2 and the VM size to extra large.

SampleWorkerRole Properties

Press F5 to run the application.  Now when the application executes, the 6 web role instances and the 2 worker role instances will start.

Windows Azure Compute Emulator

Examine the first worker role instance.

SampleWorkerRole Instance Status

The worker role instance displays a number of new diagnostic messages in a similar way to the web role.  The first half of the trace diagnostics are configuration and instance messages.  The second half of the trace diagnostics are status messages that are printed from the worker role running.

Open up the code in the WorkerRole.cs file in the SampleWorkerRole Project.  As a comparison open the WebRole.cs file in the SampleWebRole Project.

[sourcecode language="csharp"]
using System.Diagnostics;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace SampleWorkerRole
{
public class WorkerRole : RoleEntryPoint
{
public override void Run()
{
Trace.WriteLine("SampleWorkerRole entry point called", "Information");

while (true)
{
Thread.Sleep(10000);
Trace.WriteLine("Working", "Information");
}
}

public override bool OnStart()
{
ServicePointManager.DefaultConnectionLimit = 12;
return base.OnStart();
}
}
}
[/sourcecode]

In the WorkerRole.cs file the WorkerRole class inherits from RoleEntryPoint. In the WorkerRole class the Run and OnStart methods are overridden to provide some basic trace information and set the default connection limit.

The Run method has a basic while loop that sleeps for 10,000 milliseconds (10 seconds) on each pass and then writes a message, which displays in the Windows Azure Compute Emulator as “Information: Working”.

[sourcecode language="csharp"]
using Microsoft.WindowsAzure.ServiceRuntime;

namespace SampleWebRole
{
public class WebRole : RoleEntryPoint
{
public override bool OnStart()
{
return base.OnStart();
}
}
}
[/sourcecode]

In the code for the WebRole.cs file there is very little actually going on.  Take a closer look at the OnStart method override.  Technically this code doesn’t even need to be in the generated file and can be deleted, but it provides a good starting point to add any other code needed during the start of the web role.

Next I’ll add some code in the worker role to provide a telnet prompt that responds with worker role information.  To work through this exercise completely, download a telnet client like PuTTY (http://www.chiark.greenend.org.uk/~sgtatham/putty/).

If Visual Studio 2010 is no longer open, launch it and open the Windows Azure Web Role Sample Solution.  Right click on the SampleWorkerRole role in the Windows Azure Web Role Sample Project.  Click on the Endpoints tab of the properties window, click Add Endpoint, and call it TelnetServiceEndpoint.

Endpoint

Add a private member and create a run method with the following code.

[sourcecode language="csharp"]
// Requires: using System.Net.Sockets; using System.Threading;
// using System.Diagnostics; using Microsoft.WindowsAzure.ServiceRuntime;
private readonly AutoResetEvent _connectionWait = new AutoResetEvent(false);

public override void Run()
{
Trace.WriteLine("Starting Telnet Service...", "Information");

TcpListener listener;
try
{
listener = new TcpListener(
RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TelnetServiceEndpoint"].IPEndpoint) { ExclusiveAddressUse = false };
listener.Start();

Trace.WriteLine("Started Telnet Service.", "Information");
}
catch (SocketException se)
{
Trace.WriteLine("Telnet Service could not start: " + se.Message, "Error");
return;
}

while (true)
{
listener.BeginAcceptTcpClient(HandleAsyncConnection, listener);
_connectionWait.WaitOne();
}
}
[/sourcecode]

After adding this code, add the following code for the role information to write to a stream.

[sourcecode language="csharp"]
private static void WriteRoleInformation(Guid clientId, StreamWriter writer)
{
writer.WriteLine("---- Current Client ID, Date & Time ----");
writer.WriteLine("Current date: " + DateTime.Now.ToLongDateString() + " " + DateTime.Now.ToLongTimeString());
writer.WriteLine("Connection ID: " + clientId);
writer.WriteLine();

writer.WriteLine("---- Current Role Instance Information ----");
writer.WriteLine("Role ID: " + RoleEnvironment.CurrentRoleInstance.Id);
writer.WriteLine("Role Count: " + RoleEnvironment.Roles.Count);
writer.WriteLine("Deployment ID: " + RoleEnvironment.DeploymentId);
writer.WriteLine();

writer.WriteLine("---- Instance Endpoints ----");

foreach (KeyValuePair<string, RoleInstanceEndpoint> instanceEndpoint in RoleEnvironment.CurrentRoleInstance.InstanceEndpoints)
{
writer.WriteLine("Instance Endpoint Key: " + instanceEndpoint.Key);

RoleInstanceEndpoint roleInstanceEndpoint = instanceEndpoint.Value;

writer.WriteLine("Instance Endpoint IP: " + roleInstanceEndpoint.IPEndpoint);
writer.WriteLine("Instance Endpoint Protocol: " + roleInstanceEndpoint.Protocol);
writer.WriteLine("Instance Endpoint Type: " + roleInstanceEndpoint);
writer.WriteLine();
}
}
[/sourcecode]

Now add a handle method for the asynchronous call.

[sourcecode language="csharp"]
private void HandleAsyncConnection(IAsyncResult result)
{
var listener = (TcpListener)result.AsyncState;
var client = listener.EndAcceptTcpClient(result);
_connectionWait.Set();

var clientId = Guid.NewGuid();
Trace.WriteLine("Connection ID: " + clientId, "Information");

var netStream = client.GetStream();
var reader = new StreamReader(netStream);
var writer = new StreamWriter(netStream);
writer.AutoFlush = true;

var input = string.Empty;
while (input != "3")
{
writer.WriteLine(" 1) Display Worker Role Information");
writer.WriteLine(" 2) Recycle");
writer.WriteLine(" 3) Quit");
writer.Write("Enter your choice: ");

input = reader.ReadLine();
writer.WriteLine();

switch (input)
{
case "1":
WriteRoleInformation(clientId, writer);
break;
case "2":
RoleEnvironment.RequestRecycle();
break;
}

writer.WriteLine();
}

client.Close();
}
[/sourcecode]

Finally, override the OnStart() method and set up the RoleEnvironment.Changing event.

[sourcecode language="csharp"]
public override bool OnStart()
{
ServicePointManager.DefaultConnectionLimit = 12;

DiagnosticMonitor.Start("DiagnosticsConnectionString");

RoleEnvironment.Changing += RoleEnvironmentChanging;

return base.OnStart();
}

private static void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
if (e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
{
e.Cancel = true;
}
}
[/sourcecode]

Now run the role by hitting F5. When the application runs open up the Windows Azure Compute Emulator to check for the end point and verify the instances of the role are running.

Endpoint displayed in the Windows Azure Compute Emulator

The Service Name should be Windows_Azure_Web_Role, with an Interface Type of SampleWorkerRole running on the tcp://*:1234 URL and the IP 127.0.0.1:1234.

SampleWorkerRole

Click on one of the instances, which should be green, and assure that each has started up appropriately.

PuTTY

Start up a telnet application such as PuTTY and enter the information as shown in the screenshot above.

Telnet

Start the telnet prompt connecting to the Windows Azure Worker Role. The prompt with the three choices will display. Select recycle and then display the worker role information a few times, just to make sure all the information is available and the worker role telnet application service is working. Select 3 to exit and the prompt should close while the role continues to run on the development fabric.

Gritty Technical Info on Windows Azure Web Roles

This is a follow up to the previous blog entry I wrote pertaining to Windows Azure Roles.  I wanted to cover the bases on the various technical aspects of creating a Windows Azure Web Role & Worker Role in Visual Studio 2010.  Without interruption let’s just dive right in.  Start Visual Studio 2010 and select File, New, then Project to open the New Project dialog.

Windows Azure Project

Select a cloud template type and name your project.  Click OK and the New Windows Azure Project Dialog will appear to select the role types you can choose from.

Windows Azure Project Templates

Select an ASP.NET MVC Web Application, name it appropriately, and then click OK.  When prompted for a test project select yes and click OK.  When the solution is finished generating from the chosen templates there will be a SampleWebRole ASP.NET MVC Web Application, the test project titled SampleWebRole.Tests, and a Windows Azure Project titled Windows Azure Web Role Sample.

Solution

After that, run the application to assure that the Development Fabric & other parts of the web application start up appropriately.

With the web application still running, click on the Development Fabric icon in the status bar of Windows 7 and select Show Compute Emulator UI.

Show Compute Emulator UI

The Windows Azure Compute Emulator will display. Click on the Service Deployments tree until you can see each individual instance (the little green lights should be showing). The screenshot below shows this tree opened with one of the instances selected to view the status trace.

Windows Azure Compute Emulator

Press Shift+F5 to stop the web application from running.  In the Solution Explorer right click on the SampleWebRole under the Windows Azure Web Role Sample Project and select Properties.

Properties for SampleWebRole

Under the configuration tab of the SampleWebRole Properties set the Instance Count to 6 and the VM Size to Extra Large.

Windows Azure Instance Properties

Now press F5 to run the web application again in the Windows Azure Development Fabric.  The Windows Azure Compute Emulator (if it is closed, right click on the status icon to launch it again) will now display each of the 6 instances launching under the SampleWebRole.

Windows Azure Compute Emulator

Click on one of the green lights to show that specific instance status in the primary window area.

Windows Azure Compute Instance 2

When you select the specific instance the status of that instance is displayed. The instance displayed here has a number of events being recorded by the diagnostics, MonAgentHost, and the runtime. This particular instance had gone through a rough start. During the lifecycle of a Windows Azure Web, Worker, or CGI Role there are a number of events similar to these that can occur.

Read through the first few lines. These lines show that another agent was running, which could be a number of things that conflicted with this web role starting up cleanly. Eventually the web role was able to startup appropriately as shown in the runtime lines stating that the OnStart() is called and then complete, with the Run() executing next.

Reading further through the diagnostics the web role eventually requests a shutdown and then prepares for that shutdown pending the exit of the parent process 6924.

These types of events are commonplace when reviewing the actions a web role goes through; generally, don’t get too alarmed by any particular set of messages. As long as the role has green lights on the instances, things are going swimmingly. When the lights change to purple or red, then it is important to really start paying attention to the diagnostics.

Windows Azure Worker Roles

In the next blog entry (Part II) what I want to show is how to add a worker role and how to analyze the activities within the role. The worker role is somewhat different from a web role. The primary difference is that a worker role is built around providing background compute work, while a web role is built around serving web content. Think of the worker role as something similar to a Windows Service, which runs ongoing to execute jobs & other, often backend, processes. A web role is what is built to host Silverlight and web applications such as ASP.NET or ASP.NET MVC.

Part II published on Monday the 17th.


Windows Azure Web, Worker, and CGI Roles – How They Work

This is a write up I’ve put together of how the roles in Windows Azure work.  As far as I know, this is all correct – but if there are any Windows Azure Team Members out there that wouldn’t mind providing some feedback about specifics or adding to the details I have here – please do add comments!  🙂

Windows Server 2008 and Hyper-V

Windows Azure is built on top of Windows Server 2008 & Hyper-V. Hyper-V provides virtualization for the various instance types and allocation of resources to those instances. Windows Server 2008 provides the core operating system functionality for those systems and for the Windows Azure Platform roles and storage.

The hypervisor that a Hyper-V installation implements does a few unique things compared to many of the other virtualization offerings in the industry. Xen (the open source virtualization software that Amazon Web Services uses) & VMware both use a shared resource model for utilization of physical resources within a system. This allows more virtualized instances to be started per physical machine, but can sometimes allow hardware contention. Hyper-V, on the other hand, pins a particular amount of resources to a virtualized instance, which decreases the number of instances allowed on a physical machine but enables Hyper-V to prevent hardware contention. Both designs have their pluses and minuses, and in cloud computing these design choices are rarely evident. The context however is important to know when working with high end computing within the cloud.

Windows Azure Fabric Controller

The Windows Azure Fabric Controller is kind of the magic glue that holds all the pieces of Windows Azure together. The Fabric Controller automates all of the load balancing, switches, networking, and other network configuration. Usually within an IaaS environment you’d have to set up the load balancer, static IP addresses, the internal DNS that allows connection and routing from the external DNS, the switch configurations, the DMZ, and a host of other configuration, plus the ongoing maintenance for all of it. With the Windows Azure Platform and the Fabric Controller, all of that is taken care of entirely. Maintenance for these things goes to zero.

The Windows Azure Fabric Controller has several primary tasks: networking; hardware and operating system management; service modeling; and life cycle management of systems.

The low level hardware that the Windows Azure Fabric Controller manages includes switches, load balancers, nodes, and other network elements. In addition it manipulates the appropriate internal DNS and other routing needed for communication within the cloud so that each URI is accessed seamlessly from the outside.

The service modeling that the Fabric Controller provides maps the topology of services, port usage, and, as mentioned before, the internal communication within the cloud. All of this is done by the Fabric Controller without any interaction other than creating an instance or storage service within Windows Azure.

The operating system management from the Fabric Controller involves patching the operating system to assure that security, memory and storage, and other integral operating system features are maintained and optimized. This allows the operating system to maintain uptime and application performance characteristics that are optimal.

Finally the Fabric Controller has responsibility for the service life cycle. This includes updates and configuration changes across upgrade domains and fault domains. The Fabric Controller does so in a way that maintains uptime for the services.

Each role has at least one instance running. A role however can have multiple instances, with a theoretically limitless number. In this way, if an instance stops responding, the Fabric Controller recycles it and a new instance takes over. This can sometimes take several minutes, and is a core reason the uptime SLA requires two instances within a role to be running. In addition, the instance that is recycled is rebuilt from scratch, destroying any data stored on the role instance itself. This is where Windows Azure Storage plays a pivotal role in maintaining Windows Azure cloud applications.

Web Role

The Windows Azure Web Role is designed as a simple-to-deploy IIS web site or services hosting platform. The Windows Azure Web Role can provide hosting for any .NET related web site such as ASP.NET, ASP.NET MVC, MonoRail, and more.

The Windows Azure Web Role provides this service hosting with a minimal amount of maintenance required. No routing or load balancing setup is needed; everything is handled by the Windows Azure Fabric Controller.

Uses: Hosting ASP.NET, ASP.NET MVC, MonoRail, or other .NET related web sites in a managed, high uptime, highly resilient, controlled environment.

Worker Role

A worker role can be used to host any number of things that need to pull, push, or run continuously without any particular input. A worker role can also be used to set up a schedule or other type of service. This provides a role dedicated to what could closely be compared to a Windows Service. The options and capabilities of a Worker Role, however, vastly exceed those of a simple Windows Service.

CGI Role

This service role is designed to allow execution of technology stacks such as Ruby on Rails, PHP, Java, and other non-Microsoft options.

Windows Azure Storage

Windows Azure Storage is broken into three distinct features within the service. Windows Azure provides tables, blobs, and queues for storage needs. Any of the Windows Azure Roles can also connect to storage to maintain data across service lifecycle reboots, refreshes, and any temporary loss of a Windows Azure Role.

A note about Windows Azure Storage compared to most cloud storage providers: none of the Azure Storage services are “eventually consistent”. When a write is done, it is instantly visible to all subsequent readers. This simplifies coding but makes the data storage mechanisms slower than eventually consistent data architectures.
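To illustrate what that strong consistency buys you, here’s a small sketch using the 1.x StorageClient library (the container and blob names are made up for the example). A write followed immediately by a read returns the new value; there’s no need for the polling/retry loop an eventually consistent store would require:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Against development storage; in the cloud, parse a real connection string
// with CloudStorageAccount.Parse(...) instead.
var account = CloudStorageAccount.DevelopmentStorageAccount;
var client = account.CreateCloudBlobClient();

var container = client.GetContainerReference("consistency-demo");
container.CreateIfNotExist();

var blob = container.GetBlobReference("example.txt");
blob.UploadText("written just now");

// Immediately visible to this and every other reader - no convergence delay.
var text = blob.DownloadText();
```

The same read-after-write guarantee holds for tables and queues, which is why role instances can safely lean on storage to survive the instance recycling described earlier.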
