# Wednesday, 30 December 2009

Occam's Razor is the principle that "entities must not be multiplied beyond necessity" and, by extension, that the simplest explanation or strategy tends to be the best one. (See Wikipedia.)

The simplest constructs for cloud computing are storage and processing. These 2 basic services are consumed by any number of Internet-connected clients, often described as the '3 screens' of PC, Mobile, and TV.

In Azure, the storage construct is sub-categorized into its simplest components of Table, Blob (file), and Queue storage. Processing is sub-categorized as either a 'Web Role' or a 'Worker Role'.

Compositions may utilize these native constructs in any combination. A web role may work independently or utilize table storage. A client may directly access cloud storage or a processing service.

These 5 basic constructs can be arranged in 120 different orderings (5! = 120). When designed to be stateless, basic compositions can be scaled by adding additional storage or processing resources on-demand.
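The arithmetic can be sanity-checked in a couple of lines of JavaScript (the construct names below just echo the list above):

```javascript
// Count the orderings of the 5 basic constructs: 5! = 120
function factorial(n) { return n <= 1 ? 1 : n * factorial(n - 1); }

var constructs = ["Table", "Blob", "Queue", "Web Role", "Worker Role"];
console.log(factorial(constructs.length)); // 120
```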

Wednesday, 30 December 2009 12:28:55 (Pacific Standard Time, UTC-08:00)
# Tuesday, 29 December 2009

Every year during the holiday break I look forward to digging into a fun software project and learning something new. This year I developed a simple application called "Up or Down" using the Microsoft Windows Azure cloud platform. Windows Azure represents a huge bottom-up R&D effort by Microsoft to support highly scalable Internet-connected applications across a number of devices and platforms.

(Jump to the 5 minute video walkthrough)
Download the source code from Codeplex.

Up or Down is a simple service monitoring application that periodically polls a number of websites and reports their status as "Up" or "Down". This application has 2 basic roles: a web role for serving web pages and a worker role that runs in the background polling services. Both the web and worker roles have access to the same Azure table storage service.
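As a rough illustration of the polling logic (sketched in JavaScript; the real worker role is C#, and the exact status-code rule is my assumption since the post doesn't spell it out), a poll result might be classified like this:

```javascript
// Classify a poll result as "Up" or "Down" from the HTTP status code.
// Assumption: any 2xx or 3xx HEAD response counts as "Up".
function classify(statusCode) {
  return statusCode >= 200 && statusCode < 400 ? "Up" : "Down";
}

console.log(classify(200)); // "Up"
console.log(classify(503)); // "Down"
```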

Project Structure
The web and worker roles each have their own separate project spaces. Because the Azure table storage is shared by both roles, a 3rd library project represents the data model and is referenced by both roles.

This project was developed using Visual Studio 2010 Beta 2 with the November 2009 Azure SDK.

Azure Table Storage
2 tables are defined: one for storing information about the websites to be monitored (ServiceRequestDefinition) and one for storing ping responses (ServiceResponses).

The service definition table is very simple and contains just a few rows in this example. The partition key is based on the HTTP request verb used to monitor a service. Up or Down currently only makes HEAD requests for content, but can be expanded to do full page GET requests and conduct string matching on the response.

The response table is designed to be more scalable. Monitoring 10 websites once per minute will create 10 x 60 x 24 = 14,400 response records per day and over 5 million records per year. The partition key is in the format YYYY-MM for the fast retrieval of response records by month; for example "2009-12". The response table also captures the response time in milliseconds for each request.
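The partition key format and the record-volume arithmetic can be sketched as follows (monthPartitionKey is an illustrative helper, not the project's actual code):

```javascript
// Build the YYYY-MM partition key described above for a given date.
function monthPartitionKey(date) {
  var month = date.getUTCMonth() + 1; // getUTCMonth() is zero-based
  return date.getUTCFullYear() + "-" + (month < 10 ? "0" + month : month);
}

// 10 sites polled once per minute => responses per day and per year.
var perDay = 10 * 60 * 24;

console.log(monthPartitionKey(new Date(Date.UTC(2009, 11, 29)))); // "2009-12"
console.log(perDay);       // 14400
console.log(perDay * 365); // 5256000 -- over 5 million per year
```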

User Interface
Up or Down is a simple one-page application that communicates with the server via jQuery AJAX calls.

  • jQuery UI is used for the tabs and style sheet.
  • Click events are handled using jQuery.
  • The "Add New Service" tab is a simple web form for defining a service name and URL.
  • Each monitored service has a toolbar of options to view a report, edit, or delete the service.
  • Anonymous users can view dashboard status and drill down to recent response history.
  • Simple authentication is supported, requiring a password to update/delete data.
  • The current status dashboard can be refreshed.
  • Report sparklines are generated using the Google charting API.
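As a sketch of the sparkline idea (sparklineUrl is an illustrative helper, not the project's actual code), the Google charting API renders a sparkline from a URL where cht=ls selects the sparkline chart type and chd=t: carries the data points:

```javascript
// Build a Google Chart API sparkline URL from a list of response times.
function sparklineUrl(responseTimes) {
  return "http://chart.apis.google.com/chart?cht=ls&chs=100x30&chd=t:" +
         responseTimes.join(",");
}

console.log(sparklineUrl([120, 95, 210]));
// "http://chart.apis.google.com/chart?cht=ls&chs=100x30&chd=t:120,95,210"
```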

Data Access : Server side AJAX Handler
An ASMX web service named db.asmx is included in the web role project for handling browser requests, performing CRUD operations against the Azure table storage, and formatting JSON responses back to the browser. Each C# model class has a ToJSON() method for serializing responses.

  • The web service declares the [System.Web.Script.Services.ScriptService] attribute to support calls from JavaScript in the browser.
  • Web service methods are declared with the [ScriptMethod(ResponseFormat = ResponseFormat.Json)] attribute to allow JSON to be passed back to the browser.
  • The open source JSON2 parse and stringify library is used by the browser client to serialize JavaScript objects to JSON for passing to db.asmx.
  • The Json.NET library is used server-side to format JSON responses from Azure tables.

The jQuery AJAX method asynchronously handles both the success and error responses from the server and in turn updates the user interface.
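The round trip can be sketched like this (the built-in JSON object stands in for the JSON2 library, and the field names are illustrative, not the project's actual schema):

```javascript
// Serialize a service definition on the way out to db.asmx,
// then parse the response the way a jQuery success handler would.
var service = { Name: "Example", Url: "http://example.com", Verb: "HEAD" };

var payload = JSON.stringify(service); // what JSON2's stringify would send
var echoed = JSON.parse(payload);      // what the success handler would parse

console.log(payload);
console.log(echoed.Verb); // "HEAD"
```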

Configuration and Initialization
The polling frequency is externalized to the service definition file. When the worker role starts, it looks for this configuration setting and otherwise uses a default polling frequency of 60 seconds.
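The fallback logic amounts to something like this (a sketch: config is a plain object standing in for the Azure service configuration lookup, and the setting name is illustrative):

```javascript
// Read the polling interval from configuration, defaulting to 60 seconds
// when the setting is missing or invalid.
function pollingIntervalSeconds(config) {
  var value = parseInt(config["PollingFrequency"], 10);
  return isNaN(value) || value <= 0 ? 60 : value;
}

console.log(pollingIntervalSeconds({ PollingFrequency: "30" })); // 30
console.log(pollingIntervalSeconds({}));                         // 60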

The current state of this project is just functional enough to prove some basic concepts. Some likely subsequent features to be added:

  • Multi-tenancy: Modify the partition keys to include an OrgId or ClientId to support multiple unique accounts.
  • Profile and permissions model: Support multiple profiles with varying access permissions per account.
  • Subscription and Email notification. Notify profiles when sites go down.
  • Average response time: Use the history of response times to update performance metrics, such as average response time.
  • Additional reports: Add ability to query response history by date range.


5 Minute Video Walkthrough

This 5 minute video walks through some of the high level points of "Up or Down".

Tuesday, 29 December 2009 16:24:38 (Pacific Standard Time, UTC-08:00)
# Friday, 25 December 2009

Have a happy holiday!

From Mike, Susi, Kaci, and Cody (clockwise)
Friday, 25 December 2009 23:58:17 (Pacific Standard Time, UTC-08:00)
# Wednesday, 16 December 2009
The canonical Fibonacci series algorithm can now be represented in C# as simply:

Func<int, int> fib = null;
fib = n => n > 1 ? fib(n - 1) + fib(n - 2) : n;

If this algorithm were running on a local client application, the results would appear almost instantly.

However, in cloud computing, the slightest inefficiency in an algorithm becomes magnified once high concurrency is introduced. In this case, the fib(int) function will be repeatedly called with the same input value as it recurses through the series.
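A quick way to see the blow-up (sketched here in JavaScript) is to count the invocations:

```javascript
// The naive recursion calls fib() with the same inputs over and over.
var calls = 0;
function fib(n) { calls++; return n > 1 ? fib(n - 1) + fib(n - 2) : n; }

fib(10);
console.log(calls); // 177 invocations just to compute fib(10) = 55
```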

Memoization supports the retention of previously calculated values in a "memo", so that if fib(5) has already been calculated, its cached value is returned instead of being recalculated.

C# supports closures through lambda expressions, and an extension method allows us to wrap the fib() function in a container with a Dictionary memo.

public static Func<A, R> Memoize<A, R>(this Func<A, R> f)
{
  var map = new Dictionary<A, R>();
  return a =>
  {
      R value;
      if (map.TryGetValue(a, out value))
        return value;
      value = f(a);
      map.Add(a, value);
      return value;
  };
}
The fib function is then bound to the memo wrapper through a call to the extension method.
Func<int, int> fib = null;
fib = n => n > 1 ? fib(n - 1) + fib(n - 2) : n;
fib = fib.Memoize();
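For comparison, the same wrapper can be sketched in JavaScript, where closures are native; memoize() hides the Dictionary-style cache inside a closure:

```javascript
// Wrap any single-argument function in a closure over a private cache.
function memoize(f) {
  var cache = {};
  return function (a) {
    if (!(a in cache)) cache[a] = f(a);
    return cache[a];
  };
}

var fib = function (n) { return n > 1 ? fib(n - 1) + fib(n - 2) : n; };
fib = memoize(fib); // recursive calls now go through the memo

console.log(fib(40)); // 102334155, with only 41 distinct evaluations
```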

Please don't misconstrue this as a need to optimize for performance early in the development lifecycle. As always, agile development should optimize for performance later.

This memoization just demonstrates the elegant decoupling of functions using some new C# tricks historically only available to functional programming languages.

Wednesday, 16 December 2009 11:35:57 (Pacific Standard Time, UTC-08:00)
# Monday, 14 December 2009
I've been using the phrase "IF happens" as a personal reminder to myself to be more vigilant in how code gets written. It refers to the use of "IF" statements in source code and is based on a similar existential observation.

It is more of a "style" than a "science". "IF happens" sits somewhere between functional and object-oriented programming.

I could use big words like "Cyclomatic Complexity" to describe why IF statements are inherently bad, but I'm pursuing the less dogmatic and more pragmatic path.

When IF statements happen, they should ideally refer to a natural language function or property.

For example, this is pretty bad:

if(product.PurchaseDate < (TODAY - 30)){
   //do something
}
This is better, referring to a natural language property (the property name here is illustrative):

if(product.IsPastReturnWindow){
   //do something
}

Nested IF statements are definitely bad. Please don't invite me to a pair programming session if the code even remotely resembles the following:

   if(somethingElse || thatHappened){
      //do this
   }
   else if(thisHappened){
      //do this
      if(someCondition){
         //do this
      }
      else if(thisVersion2FeatureStateExists && someVar != null){
         //do this (if an exception hasn't already been thrown)
      }
   }
Yes, we've all written code like the example above and it's probably buried in the core of every project. After all, IF happens.

When IF happens, consider moving the code into a factory class if the purpose is to construct other objects, using a switch/case block instead, or writing an apology in the comments with a future-dated reminder to come back and refactor.
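The factory idea can be sketched like this (in JavaScript; the notifier types are purely illustrative): the branching lives in one place that constructs the right object, instead of leaking nested ifs throughout the code.

```javascript
// A factory: one switch/case decides which object to build,
// and callers never see the branching again.
function createNotifier(kind) {
  switch (kind) {
    case "email": return { send: function (msg) { return "email: " + msg; } };
    case "sms":   return { send: function (msg) { return "sms: " + msg; } };
    default:      return { send: function (msg) { return "log: " + msg; } };
  }
}

console.log(createNotifier("sms").send("site is down")); // "sms: site is down"
```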

A New Year's resolution I had in 2009 was to fully embrace the functional programming aspects of JavaScript. I mostly accomplished that goal through the use of jQuery and anonymous functions, but did not fully utilize an FP library like underscore.js.

On the server side, C# now has lambda expressions and LINQ to promote more functional programming. F# was another language on my 2009 resolution list (and in fact was also on my TODO list in 2008) that I unfortunately made 0 progress on. I hope that changes in 2010.

"IF" can happen a lot in unit tests. Assert is really just an abstraction to using IF. The long term pursuit should be more towards Design-by-Contract using something like Spec#.

Oh well... "IF happens". But there's always room for improvement :-)
Monday, 14 December 2009 12:32:16 (Pacific Standard Time, UTC-08:00)
# Sunday, 13 December 2009

Well, almost everything ;-)

A basic iteration with Azure consists of:

  • Develop
  • Debug and Test
  • Deploy to Staging
  • Deploy to Production
Here it is compressed into 5 minutes.

Sunday, 13 December 2009 18:25:13 (Pacific Standard Time, UTC-08:00)
Update Jan 2012: This article is way out of date. I've since moved to a MacPro for development, using VMWare Fusion to run some Windows apps.

What would be your ultimate machine for developing applications in the cloud?

I've been mulling over this question for a few weeks and finally got around to putting a solution together over a weekend. There are both hardware and software components:

  • Something in between a netbook and laptop, around 4 pounds.
  • 8+ hour battery life
  • 1" thin (Easy to tote)
  • SSD (Solid State Disk 256 GB+)
  • 4 GB RAM
  • 2 GHz CPU+
  • ~$700
I settled on a new Acer 4810 Timeline, which met most of my requirements. The exceptions: the Acer has a 1.3 GHz CPU and doesn't have an SSD. I wanted the SSD primarily for a speedy boot time, but some tuning of the Win7 sleep options along with the Acer's 8+ hour battery life means the laptop is rarely ever turned off and can quickly recover from sleep mode.


Next up was the software required to fully embrace the cloud. My list of essential cloud tools is nowhere near as prolific as Scott Hanselman's tool list. But hey, this is a nascent craft and we're just getting started. These tools are essential, in my opinion, for doing development on the leading cloud platforms.

Windows 7
Whoa! I can already hear my Mac toting friends clicking away from this blog post. "WTF? Why Windows 7 for a cloud development machine?" Well, there are several reasons:
  • First of all, even if you do develop on a Mac you likely have a Windows VM or dual boot configured for Windows anyway.
  • Windows has been running on x86 architecture for years. Mac only made the jump a few years ago and is still playing catch up on peripheral device and driver support.
  • Even Google, a huge cloud player, consistently develops, tests, and releases all versions of Chrome and Google App Engine tools on Windows before any other platform. Windows developers typically get access to these tools months in advance of Mac users.
  • Eclipse is another tool that is well supported on Windows, above all other platforms.
  • Silverlight support. This is my RIA environment of choice going forward.

Visual Studio 2010 Beta 2
By the time you read this blog, perhaps this version of Visual Studio will be outdated, but it is the first release of VS designed with complete support for Azure.

Windows Azure
You'll need an Azure account to upload your application to the cloud for hosting.

Azure SDK
Great article here for getting started on installing/configuring your machine for the latest Azure bits.

Azure Storage Explorer
Azure Storage Explorer is a useful GUI tool for inspecting and altering the data in your Azure cloud storage projects including the logs of your cloud-hosted applications.

Eclipse
Eclipse is a Java-based IDE that requires the Java runtime.

Rather than downloading the latest and greatest version of Eclipse, I recommend downloading whatever version is currently supported by the next essential cloud development tool (below)...

Force IDE
Eclipse Plug-in that supports development and management of Apex/Visualforce code on Salesforce.com's platform (aka Force.com)

Google App Engine
Currently, there are both Python and Java development environments for GAE. Like Azure, GAE supports development and debugging on localhost before uploading to the cloud, so the installation package provides a nice local cloud emulation environment.

Subversion (SVN)
I have a need for both the Windows Explorer Tortoise shell plug-in and the Eclipse plug-in. You may need only one or the other. All the Force.com open source projects are accessible via SVN.

Git on Windows (msysgit)
It seems GitHub is becoming the de facto standard for managing the source code for many web frameworks and projects. Excellent article here on how to install and configure Git on Windows.

Amazon Web Services (AWS) Elastic Fox
Nice Firefox plug-in for managing EC2 and S3 instances on AWS. I mostly just use it for setting up temporary RDP whitelist rules on EC2 instances as I connect remotely from untrusted IP addresses (like airports/hotels/conferences).

I highly recommend using the browser based AWS console for all other provisioning and instance management. There's also an AWS management app for Android users called decaf.

If you're doing real world web development, then you already know the drill. Download and install Internet Explorer 8, Firefox, and Chrome at minimum. In addition to the Elastic Fox plug-in (above) you'll want to install Firebug. IE 8 now has Firebug-like functionality built-in, called the Developer Toolbar. Just hit F12 to access it (see comparison with Firebug here).

I personally use jQuery for all web development that requires DOM manipulation, to handle cross-browser incompatibilities and anomalies.

There are 6-10 other utilities and tools I installed on this machine that aren't specific to the cloud. Installing everything on this list took about 90 minutes (plus a couple hours to pull down all my SVN project folders for Passage and other related projects I manage).

Given that cloud development is all about distributing resources over servers and clients, I like to take a minimalist approach and reduce the surface area and drag of my local environment as much as possible. This improves OS and IDE boot time as well as eliminates a lot of common debugging issues as a result of version incompatibilities.

What about you? What hardware/tools/applications are essential to your cloud development projects?

Sunday, 13 December 2009 12:01:04 (Pacific Standard Time, UTC-08:00)
# Sunday, 06 December 2009

Just a reminder that effective January 2010, you will start receiving a bill for Azure hosting, although the amount due for January's bill will be $0. This will provide you with a line item breakdown of how your application consumes hosting and storage resources and will help in capacity planning and forecasting exercises.

Promoting your CTP apps to production will apparently require some manual intervention, so there should be no possibility of suddenly getting billed for an idle app you haven't been using.

Actual billing for Azure doesn't start until February, so before the December holiday break, take advantage of the free month of hosting and sign up here for your Azure account.

You can still continue to develop/run your Azure apps on a localhost Windows machine using the free development tools (check out the complete list of system requirements here).

Sunday, 06 December 2009 21:40:20 (Pacific Standard Time, UTC-08:00)
# Saturday, 05 December 2009

Can't get any simpler than this (from Steve Marx's blog)

Saturday, 05 December 2009 17:11:42 (Pacific Standard Time, UTC-08:00)

I'm playing with Cool Sites this weekend. Hope to be out of Beta very soon, but one idea begets another and pretty soon my ambition turns this into a full-fledged product. www.CubicCompass.com is currently running on Cool Sites Beta 1.

Here's a sneak peek of the latest console (below).
Note: Cool Sites is a web Content Management System (CMS) running on the Force.com platform.

Saturday, 05 December 2009 11:52:15 (Pacific Standard Time, UTC-08:00)