# Tuesday, 26 January 2010
(Note: This blog post is based on an idea I have for setting up a non-profit dedicated to promoting the Creative Commons licensing model for music using cloud infrastructure. I may revisit and modify this post periodically as the model gets fleshed out.)

As a musician growing up in the '80s, I found the "hip" thing to do was to sample other works and incorporate them into new arrangements. It was, and still is, a very creative and rewarding process that produces what are commonly referred to today as "mashups". I typically used jazz samples. Others, like Vanilla Ice, used more recognizable samples from artists like Queen and David Bowie.

The artists whose work was sampled rarely received royalties for the redistribution of their work. Eventually, litigation started, and new precedents were established that prevented the creation of new works from existing works without the owner's consent.

I had no problem with the concept of attributing other people's work or sharing in any commercial success that resulted from repurposing it into my own. I got in the habit of working with the Harry Fox Agency before pursuing such projects and budgeting appropriately for using samples. But still, artists (and most of their publishers) had to be "pulled" into remixing opportunities.

Some visionary artists, like Peter Gabriel, went out of their way to re-mix and "push" their samples to us, handing over control of the faders so we could remix their songs into new and unimagined soundscapes.

Lawrence Lessig's book "Remix" touches the very heart of this issue, and several questions are being raised about the future of music in light of the entertainment "cloud".

Isn't Danger Mouse's Grey Album (probably not his real name) a brilliant piece of creative work and a masterpiece in its own right, even though it is a mashup of existing digital media?

Should a parent be penalized for adding background music to the video of their child's 5th birthday and publishing it to YouTube?

Would a song like Chris Brown's "Forever" have achieved such stellar commercial success had it not been viewed on YouTube over 40 million times in non-commercial use?

There is a solution to this problem. The Creative Commons was established as a nonprofit corporation dedicated to making it easier for people to share and build upon the work of others, consistent with the rules of copyright.

They provide free licenses and other legal tools to mark creative work with the freedom the creator wants it to carry, so others can share, remix, use commercially, or any combination thereof.

To truly embrace the cloud and the Creative Commons involves:
  • Artists and labels rethinking publication and undergoing a "2.0" release of their works as reusable samples for use in new works.
  • Artists allowing non-commercial use of their works and defining the terms of commercial use.
  • Mashup artists attributing works to the original artists and participating in the re-distribution of works through both commercial and non-commercial channels.
There are those who oppose this model.
  • Performance Rights Organizations (PROs) like BMI and ASCAP are entrenched in securing performance royalties on behalf of commercial artists.
  • The ability for artists to allocate ownership percentages to studio musicians and other contributors threatens the traditional 50/50 publisher/artist split in common use today and would displace traditional royalty accounting systems.
  • Commercial music labels are wedded to decades-old formulas that control the 360-degree image and distribution of an artist's work.

To make this work:
  • New publishing and accounting systems are required that allow artists to offer the equivalent of fully vested "stock options" on works.
  • New mashup publishing systems are needed that allow original and mashup artists to negotiate the percentage distribution of commercial redistribution revenue. For example, an artist may use 5 samples in a song, giving each sampled artist 10% and keeping 50% for the new work; the other 5 artists should have a voice in accepting the proposed terms.
  • Artists must consider bypassing traditional "record deals" and work with progressive publishers.
  • Artists must write off the non-commercial use of their music as a marketing expense.
  • Artists must package their work for convenient use by mashup artists and editors.
  • Consumers and mashup artists must attribute use of all works to the original artist.
  • It must be convenient for a consumer or mashup artist to register their new works for commercial use, and revenue must be quickly, easily, and fairly redistributed.
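The revenue-split example above can be sketched as a simple calculation. Everything here is illustrative (the function name, cent-based amounts, and artist labels are not part of any proposed system):

```javascript
// Hypothetical sketch: split commercial revenue for a mashup that
// uses samples from 5 source artists (10% each), with the mashup
// artist keeping the remaining 50%.
function splitRevenue(totalCents, sampleArtists, sampleShare, mashupShare) {
  const payouts = {};
  for (const artist of sampleArtists) {
    payouts[artist] = Math.round(totalCents * sampleShare);
  }
  payouts["mashup"] = Math.round(totalCents * mashupShare);
  return payouts;
}

// $100.00 of revenue: each sampled artist gets $10.00, mashup artist $50.00
const payouts = splitRevenue(10000, ["a1", "a2", "a3", "a4", "a5"], 0.10, 0.50);
```

The interesting part isn't the arithmetic, of course; it's giving the sampled artists a voice in accepting the proposed shares before publication.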

Tuesday, 26 January 2010 16:03:30 (Pacific Standard Time, UTC-08:00)
# Monday, 18 January 2010

Apex is a programming language supported by the Force.com platform.

A fluent interface is a way of implementing an object-oriented API that aims to produce more readable code.

An example of a fluent interface looks like:

Contact c = new Contact();
ITransaction trx = new FluentTransaction()
		.Initialize(c, 5.0);

Transactions are a good use case for implementing a fluent interface because they often execute the repeated steps of initializing and validating input values, executing, and logging data. An API designer can expressly define these methods in an interface using Apex (very similar to Java and C# interfaces). When implementing a transaction object, the compiler will check to ensure the proper interfaces are implemented.

A fluent interface is normally implemented by using method chaining to relay the instruction context to a subsequent call. Generally, the context is defined through the return value of a called method being self-referential (each method returns this), so the new context is equivalent to the previous one.

public class MethodChaining {
	public interface ITransaction {
		ITransaction Initialize(Contact c, double d);
		ITransaction Validate();
		ITransaction Execute();
		ITransaction Log();
		ITransaction Finish();
		boolean IsValid();
		boolean IsSuccessful();
	}
	public class FluentTransaction implements ITransaction {
		private Contact m_contact;
		private double m_amount;
		private boolean m_isSuccessful = false;
		private boolean m_isValid = false;
		public ITransaction Initialize(Contact c, double d){
			m_contact = c;
			m_amount = d;
			return this;
		}
		public ITransaction Validate(){
			//Validate the m_contact and m_amount inputs
			//Setting this to false will instruct other execution chain methods to halt
			m_isValid = true;
			return this;
		}
		public ITransaction Execute(){
			// Execute the transaction here
			m_isSuccessful = true;
			return this;
		}
		public ITransaction Log(){
			//Add any transaction audit logging here
			return this;
		}
		public ITransaction Finish(){
			//Finalize method. Manage any required cleanup
			return this;
		}
		public boolean IsValid(){
			return m_isValid;
		}
		public boolean IsSuccessful(){
			return m_isSuccessful;
		}
	}
	public MethodChaining(){
		Contact c = new Contact();
		ITransaction trx = new FluentTransaction()
			.Initialize(c, 5.0);
			//Do something
			//Do something else
	}
}

Monday, 18 January 2010 15:06:41 (Pacific Standard Time, UTC-08:00)
# Saturday, 09 January 2010
With the economy being what it is, I can think of no better resolution for a small business owner/entrepreneur than to focus on creating jobs.
Ideally these jobs will be created in the Portland, OR metro area, but we're accustomed to operating with distributed teams and are happy to continue doing so.

Web Application Design / Development
With our shift away from .NET and onto the Force.com platform, a person in this role will need to be competent with:
  • Salesforce API and custom/native objects
  • Apex Controllers and Classes / Object modeling
  • Visualforce Pages / Eclipse
  • JavaScript / jQuery / AJAX
Possible job creation triggers:
We're actively involved in re-architecting the i-Dialogue platform to run on Force using the above technologies. This product will be launched on the AppExchange with a Freemium business model. As we start to see adoption, the long-term strategy of the product will be to develop and offer add-on premium application modules.
Demand for custom module development, such as custom shopping carts or product configurators, will also spark growth.

Professional Services Developer / Project Manager
The Pro Service Developer is competent with what's available "out of the box" from Salesforce and our core products, and offers last mile customization and development services in direct response to customer and project requirements.

Responsibilities include:
  • Custom Visualforce template design and branding
  • Custom layout
  • Configuration and implementation services
  • Weekly progress meetings. Customer facing support and requirements gathering.
Desired skills:
  • JQuery UI / Themeroller / Custom Themes
  • HTML / Javascript
  • Visualforce (UI emphasis - no custom controller development)
Possible job creation triggers:
  • Clients and customers requiring custom branding of Salesforce Sites and Portal templates.
  • Custom website, landing page, and portal development on Force.com
  • Multi-channel campaigns executed across email and landing page sites
  • Integration projects involving Sites, Customer/Partner Portal, and Ideas

Account / Sales Manager
The AppExchange, coupled with our concept of a Site Gallery, will automate much of the sales and marketing process; however, an Account/Sales Manager will be required to help current and prospective clients assemble the right mix of software and services for their solution and provide quotes.

Possible job creation triggers:
Once the free version is released, we'll shift to offering premium add-on modules within 2 months.

Risks in 2010
The freemium model is somewhat new to us. We're giving away significant value and functionality in the free version, so it's very likely that only 5-20% of all customers will require our premium services, which in turn will enable new job growth. However, this model leverages the scalability and distribution of the cloud and requires no additional effort on our part to provision new computing resources.

The Web Application Design and Development position requires a Software Engineering approach to using Javascript. This is a common skill found in Silicon Valley, but not so much in Portland, OR. It may be difficult to find just the right person(s) for this role.

Most support functions are handled by Pro Service and Account Managers today, but there may be a need for a specific support role in the future.


The actual number of jobs in each position may vary, but these are the 3 primary job functions we'll seek to create in 2010. The products and features we have planned in 2010 are embracing the cloud in ways unimaginable a couple years ago and I'm very excited to wake up each day in pursuit of these solutions. For me, software development has always been about the journey, and surrounding myself with creative, innovative, and passionate individuals on this journey is important.

If past success is any indicator of the future, then I think our new direction will be successful. Of the 60,000+ customers on Salesforce, many are always seeking to gain more value from their CRM investment by deploying Sites and Portals. By running natively and offering services directly on the Force.com platform, rather than at arms length in a hybrid configuration, we're now able to offer much richer applications and solutions.

If you know of anyone in the Portland, OR area that has these particular skill sets, please have them contact me at mike@cubiccompass.com.

Saturday, 09 January 2010 14:11:55 (Pacific Standard Time, UTC-08:00)
# Wednesday, 30 December 2009

Occam's Razor is the principle that "entities must not be multiplied beyond necessity" and the conclusion thereof, that the simplest explanation or strategy tends to be the best one. (See Wikipedia)

The simplest constructs for cloud computing are storage and processing. These 2 basic services are consumed by any number of Internet-connected clients, often described as the '3 screens' of PC, Mobile, and TV.

In Azure, the storage construct is sub-categorized into its simplest components: Table, Blob (file), and Queue storage. Processing is sub-categorized as either a "Web Role" or a "Worker Role".

Compositions may utilize these native constructs in any combination. A web role may work independently or utilize table storage. A client may directly access cloud storage or a processing service.

These 5 basic constructs can be arranged in 120 different orderings (5! = 120). When designed to be stateless, basic compositions can be scaled by adding additional storage or processing resources on demand.
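The arrangement count is easy to verify with a quick illustrative check (nothing here comes from the Azure SDK):

```javascript
// Count the orderings of the 5 basic Azure constructs: 5! = 120
function factorial(n) {
  return n <= 1 ? 1 : n * factorial(n - 1);
}

const constructs = ["Table", "Blob", "Queue", "Web Role", "Worker Role"];
const arrangements = factorial(constructs.length); // 120
```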

Wednesday, 30 December 2009 12:28:55 (Pacific Standard Time, UTC-08:00)
# Tuesday, 29 December 2009

Every year during the holiday break I look forward to digging into a fun software project and learning something new. This year I developed a simple application called "Up or Down" using the Microsoft Windows Azure cloud platform. Windows Azure represents a huge bottom-up R&D effort by Microsoft to support highly scalable Internet-connected applications across a number of devices and platforms.

(Jump to the 5 minute video walkthrough)
Download the source code from Codeplex.

Up or Down is a simple service-monitoring application that periodically polls a number of websites and reports their status as "Up" or "Down". The application has 2 basic roles: a web role for serving web pages and a worker role that runs in the background polling services. Both the web and worker roles have access to the same Azure table storage service.

Project Structure
The web and worker roles each have their own separate project spaces. Because the Azure table storage is shared by both roles, a 3rd library project is created for representing the data model and is referenced by both roles.

This project was developed using Visual Studio 2010 Beta 2 with the November 2009 Azure SDK.

Azure Table Storage
2 tables are defined: one for storing information about the websites to be monitored (ServiceRequestDefinition) and one for storing ping responses (ServiceResponses).

The service definition table is very simple and contains just a few rows in this example. The partition key is based on the HTTP request verb used to monitor a service. Up or Down currently only makes HEAD requests for content, but can be expanded to do full page GET requests and conduct string matching on the response.

The response table is designed to be more scalable. Monitoring 10 websites once per minute will create 10 x 60 x 24 = 14,400 response records per day and over 5 million records per year. The partition key is in the format YYYY-MM (for example, "2009-12") for fast retrieval of response records by month. The response table also captures the response time in milliseconds for each request.

User Interface
Up or Down is a simple one-page application that communicates with the server via jQuery AJAX calls.

  • jQuery UI is used for the tabs and style sheet.
  • Click events are handled using jQuery.
  • The "Add New Service" tab is a simple web form for defining a service name and URL.
  • Each monitored service has a toolbar of options to view a report, edit, or delete the service.
  • Anonymous users can view dashboard status and drill down to recent response history.
  • Simple authentication is supported, requiring a password to update/delete data.
  • The current status dashboard can be refreshed.
  • Report sparklines are generated using the Google charting API.

Data Access : Server side AJAX Handler
An ASMX web service named db.asmx is included in the web role project for handling browser requests, performing CRUD operations against Azure table storage, and formatting JSON responses back to the browser. Each C# model class has a ToJSON() method for serializing responses.

  • The web service declares the [System.Web.Script.Services.ScriptService] attribute to support calls from JavaScript in the browser.
  • Web service methods are declared with the [ScriptMethod(ResponseFormat = ResponseFormat.Json)] attribute to allow JSON to be passed back to the browser.
  • The open source JSON2 parse and stringify library is used by the browser client to serialize JavaScript objects to JSON for passing to db.asmx.
  • The Json.NET library is used server-side to format JSON responses from Azure tables.

The jQuery AJAX method asynchronously handles both the success and error responses from the server and in turn updates the user interface.
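A sketch of that call pattern follows. The web method name and response shape are assumptions for illustration (ASP.NET ScriptServices wrap JSON payloads in a `.d` property):

```javascript
// Pure helper: format service records for display
function renderStatus(services) {
  return services.map(s => s.name + ": " + (s.isUp ? "Up" : "Down"));
}

// Illustrative jQuery wiring: handle both success and error responses
function loadDashboard($) {
  $.ajax({
    url: "db.asmx/GetServiceStatus", // hypothetical web method name
    type: "POST",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (data) {
      // ASMX ScriptServices return the payload under data.d
      $("#dashboard").html(renderStatus(data.d).join("<br/>"));
    },
    error: function (xhr, status) {
      $("#dashboard").text("Unable to load status: " + status);
    }
  });
}
```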

Configuration and Initialization
The polling frequency is externalized to the service definition file. When the worker role starts, it looks for this configuration setting; otherwise it uses a default polling frequency of 60 seconds.
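That fallback logic might look something like this (the setting name is an assumption; shown in JavaScript for brevity, though the worker role itself is C#):

```javascript
// Read the polling frequency from configuration, falling back to
// 60 seconds when the setting is absent or invalid.
function pollingFrequencySeconds(config) {
  const value = parseInt(config["PollingFrequencySeconds"], 10);
  return Number.isNaN(value) || value <= 0 ? 60 : value;
}
```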

The current state of this project is just functional enough to prove some basic concepts. Some likely subsequent features to be added:

  • Multi-tenancy: modify the partition keys to include an OrgId or ClientId to support multiple unique accounts.
  • Profile and permissions model: support multiple profiles with varying access permissions per account.
  • Subscription and email notification: notify profiles when sites go down.
  • Average response time: use the history of response times to update performance metrics, such as average response time.
  • Additional reports: add the ability to query response history by date range.


5 Minute Video Walkthrough

This 5 minute video walks through some of the high level points of "Up or Down".

Tuesday, 29 December 2009 16:24:38 (Pacific Standard Time, UTC-08:00)
# Friday, 25 December 2009

Have a happy holiday!

From Mike, Susi, Kaci, and Cody (clockwise)
Friday, 25 December 2009 23:58:17 (Pacific Standard Time, UTC-08:00)
# Wednesday, 16 December 2009
The canonical Fibonacci series algorithm can now be represented in C# as simply:

Func<int, int> fib = null;
fib = n => n > 1 ? fib(n - 1) + fib(n - 2) : n;

If this algorithm were running on a local client application, the results would appear almost instantly.

However, in cloud computing, the slightest inefficiency in an algorithm becomes magnified once high concurrency is introduced. In this case, the fib(int) function will be repeatedly called with the same input value as it recurses through the series.

Memoization supports the retention of previously calculated values in a "memo", so that if fib(5) has already been calculated, its cached value is returned instead of being re-calculated.

C# supports closures, and extension methods allow us to wrap the fib() function in a container with a Dictionary memo.

public static Func<A, R> Memoize<A, R>(this Func<A, R> f)
{
  var map = new Dictionary<A, R>();
  return a =>
  {
      R value;
      if (map.TryGetValue(a, out value))
        return value;
      value = f(a);
      map.Add(a, value);
      return value;
  };
}

The fib function is then bound to the memo wrapper through a call to the method extension.
Func<int, int> fib = null;
fib = n => n > 1 ? fib(n - 1) + fib(n - 2) : n;
fib = fib.Memoize();
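For comparison, the same technique can be sketched in JavaScript, where closures make the memo wrapper equally concise. This is an illustrative analogue, not part of the C# code above:

```javascript
// Wrap a single-argument function so previously computed results
// are returned from a cache instead of being re-calculated.
function memoize(f) {
  const memo = new Map();
  return function (n) {
    if (memo.has(n)) return memo.get(n);
    const value = f(n);
    memo.set(n, value);
    return value;
  };
}

let fib = n => (n > 1 ? fib(n - 1) + fib(n - 2) : n);
fib = memoize(fib); // recursive calls now go through the cache
```

Because `fib` is rebound to the memoized wrapper, the recursive calls inside the lambda also hit the cache, just as in the C# version.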

Please don't misconstrue this as a need to optimize for performance early in the development lifecycle. As always, agile development should optimize for performance later.

This memoization just demonstrates the elegant decoupling of functions using some new C# tricks historically only available to functional programming languages.

Wednesday, 16 December 2009 11:35:57 (Pacific Standard Time, UTC-08:00)
# Monday, 14 December 2009
I've been using the phrase "IF happens" as a personal reminder to myself to be more vigilant in how code gets written. It refers to the use of "IF" statements in source code and is based on a similar existential observation.

It is more of a "style" than a "science". "IF happens" sits somewhere between functional and object-oriented programming.

I could use big words like "Cyclomatic Complexity" to describe why IF statements are inherently bad, but I'm pursuing the less dogmatic and more pragmatic path.

When IF statements happen, they should ideally refer to a natural language function or property.

For example, this is pretty bad:

if(product.PurchaseDate < (TODAY - 30)){
   //do something
}

This is better:

if(product.IsExpired()){
    // do something
}

Nested IF statements are definitely bad. Please don't invite me to a pair programming session if the code even remotely resembles the following:

   if(somethingElse || thatHappened){
      //do this
   }
   else if(thisHappened){
      //do this
      if(somethingNested){
         //do this
      }
      else if(thisVersion2FeatureStateExists && someVar != null){
         //do this (if an exception hasn't already been thrown)
      }
   }

Yes, we've all written code like the example above and it's probably buried in the core of every project. After all, IF happens.

When IF happens, consider moving the code into a factory class if its purpose is to construct other objects, using a switch/case block, or writing an apology in the comments with a future-dated reminder to come back and refactor.
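The factory suggestion can be sketched like this, with made-up type names and rates: a run of construction IFs collapses into a single switch inside a factory function:

```javascript
// Illustrative factory: the branching lives in one place instead of
// being scattered as IF statements through the calling code.
function shippingCalculatorFactory(region) {
  switch (region) {
    case "US":
      return { rate: () => 5.0 };
    case "EU":
      return { rate: () => 7.5 };
    default:
      // Apology: flat-rate fallback. TODO(2010): refactor per-region.
      return { rate: () => 10.0 };
  }
}
```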

A New Year's resolution I had in 2009 was to fully embrace the functional programming aspects of JavaScript. I mostly accomplished that goal through the use of jQuery and anonymous functions, but did not fully utilize an FP library like underscore.js.

On the server side, C# now has Lambda expressions and LINQ to promote more functional programming. F# was another language on my 2009 resolution list (and in fact was also on my TODO list in 2008) that I unfortunately made 0 progress on. I hope that changes in 2010.

"IF" can happen a lot in unit tests. Assert is really just an abstraction over IF. The long-term pursuit should be more towards Design-by-Contract using something like Spec#.

Oh well... "IF happens". But there's always room for improvement :-)
Monday, 14 December 2009 12:32:16 (Pacific Standard Time, UTC-08:00)
# Sunday, 13 December 2009

Well, almost everything ;-)

A basic iteration with Azure consists of:

  • Develop
  • Debug and Test
  • Deploy to Staging
  • Deploy to Production
Here it is compressed into 5 minutes.

Sunday, 13 December 2009 18:25:13 (Pacific Standard Time, UTC-08:00)
Update Jan 2012: This article is way out of date. I've since moved to a MacPro for development, using VMWare Fusion to run some Windows apps.

What would be your ultimate machine for developing applications in the cloud?

I've been mulling over this question for a few weeks and finally got around to putting a solution together over a weekend. There are both hardware and software components:

  • Something in between a netbook and laptop, around 4 pounds.
  • 8+ hour battery life
  • 1" thin (Easy to tote)
  • SSD (Solid State Disk 256 GB+)
  • 4 GB RAM
  • 2 GHz CPU+
  • ~$700
I settled on a new Acer 4810 Timeline, which met most of my requirements. The exceptions: the Acer has a 1.3 GHz CPU and doesn't have an SSD. I wanted the SSD primarily for a speedy boot time, but some tuning of the Win7 sleep options, along with the Acer's 8+ hour battery life, means the laptop is rarely ever turned off and can quickly recover from sleep mode.


Next up was the software required to fully embrace the cloud. My list of essential cloud tools is nowhere near as prolific as Scott Hanselman's tool list. But hey, this is a nascent craft and we're just getting started. These tools are essential, in my opinion, for doing development on the leading cloud platforms.

Windows 7
Whoa! I can already hear my Mac toting friends clicking away from this blog post. "WTF? Why Windows 7 for a cloud development machine?" Well, there are several reasons:
  • First of all, even if you do develop on a Mac you likely have a Windows VM or dual boot configured for Windows anyway.
  • Windows has been running on x86 architecture for years. Mac only made the jump a few years ago and is still playing catch up on peripheral device and driver support.
  • Even Google, a huge cloud player, consistently develops, tests, and releases all versions of Chrome and Google App Engine tools on Windows before any other platform. Windows developers typically get access to these tools months in advance of Mac users.
  • Eclipse is another tool that is well supported on Windows, above all other platforms.
  • Silverlight support. This is my RIA environment of choice going forward.

Visual Studio 2010 Beta 2
By the time you read this blog, perhaps this version of Visual Studio will be outdated, but it is the first release of VS designed with complete support for Azure.

Windows Azure
You'll need an Azure account to upload your application to the cloud for hosting.

Azure SDK
Great article here for getting started on installing/configuring your machine for the latest Azure bits

Azure Storage Explorer
Azure Storage Explorer is a useful GUI tool for inspecting and altering the data in your Azure cloud storage projects including the logs of your cloud-hosted applications.

Eclipse
Eclipse is a Java-based IDE that requires the Java runtime.

Rather than downloading the latest and greatest version of Eclipse, I recommend downloading whatever version is currently supported by the next essential cloud development tool (below)...

Force IDE
Eclipse Plug-in that supports development and management of Apex/Visualforce code on Salesforce.com's platform (aka Force.com)

Google App Engine
Currently, there are both Python and Java development environments for GAE. Like Azure, GAE supports development and debugging on localhost before uploading to the cloud, so the installation package provides a nice local cloud emulation environment.

Subversion (SVN)
I have a need for both the Windows Explorer Tortoise shell plug-in and the Eclipse plug-in. You may need only one or the other. All the Force.com open source projects are accessible via SVN.

Git on Windows (msysgit)
It seems GitHub is becoming the de facto standard for managing the source code for many web frameworks and projects. There's an excellent article here on how to install and configure Git on Windows.

Amazon Web Services (AWS) Elasticfox
A nice Firefox plug-in for managing EC2 and S3 instances on AWS. I mostly just use it for setting up temporary RDP whitelist rules on EC2 instances as I connect remotely from untrusted IP addresses (like airports/hotels/conferences).

I highly recommend using the browser based AWS console for all other provisioning and instance management. There's also an AWS management app for Android users called decaf.

If you're doing real-world web development, then you already know the drill: download and install Internet Explorer 8, Firefox, and Chrome at minimum. In addition to the Elasticfox plug-in (above), you'll want to install Firebug. IE 8 now has Firebug-like functionality built in, called the Developer Toolbar. Just hit F12 to access it (see a comparison with Firebug here).

I personally use jQuery for all web development that requires DOM manipulation, to handle cross-browser incompatibilities and anomalies.

There are 6-10 other utilities and tools I installed on this machine that aren't specific to the cloud. Installing everything on this list took about 90 minutes (plus a couple hours to pull down all my SVN project folders for Passage and other related projects I manage).

Given that cloud development is all about distributing resources over servers and clients, I like to take a minimalist approach and reduce the surface area and drag of my local environment as much as possible. This improves OS and IDE boot time as well as eliminates a lot of common debugging issues as a result of version incompatibilities.

What about you? What hardware/tools/applications are essential to your cloud development projects?

Sunday, 13 December 2009 12:01:04 (Pacific Standard Time, UTC-08:00)