# Monday, 26 July 2010
I missed the discussions on "open cloud" and "cloud standards" at OSCON 2010, but indirectly got the gist of them. Standards themselves aren't bad, but it's rarely a good idea to create a standard just for the sake of creating one.

I've observed 3 fundamental perspectives on cloud standards, each with its likely motivator:

1) Vendor: Invested $XXX million in software stack Y. Wants it proclaimed "standard" (for obvious reasons)
2) Developer: Wants "write once run everywhere" in the name of "portability" (doesn't want to learn 5 different APIs)
3) Consumer: Doesn't care. Wants it to "just work"

If you look across all cloud stacks today, what do you see that is common amongst them and could even remotely be called a "standard"? I see TCP/IP running on Intel x86 CPU architecture.

If someone had told you 10 years ago, "In order to democratize access to computing resources and enable ubiquitous access to information, we all need to standardize on the Intel x86 architecture and deliver services over IP," most people would have scoffed. It's incredible how far cloud computing has come on that de facto "standard".

There is an art to minimizing standards in order to maximize adoption.

Look at electricity. For years there were debates over whether DC or AC power was better. The US eventually settled on a single plug type delivering 120 V AC at 60 Hz. 3 very simple data points that led to an explosion in adoption.

But look at the rest of the world. Different combinations of plugs, voltages, and frequencies abound! Fortunately these standards are all "open" and simple enough to allow for converters and transformers to fill the gaps. Fundamentally, the world really only agreed to standardize on alternating current (AC) at a range of voltages between 110-240 Volts. But for consumers, it "just works".

That's the way cloud computing is evolving. Building on the Internet (TCP/IP), running on servers (Intel x86), and bridging the gaps using a few basic tools (JSON, XML, SOAP, REST, CSV).
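In practice, "bridging the gaps" between services often amounts to trivial format conversion. A minimal sketch (the data and field names are hypothetical) re-emitting a JSON response as CSV for a service that only speaks CSV:

```python
import csv
import io
import json

# A hypothetical payload, as one cloud service might return it in JSON.
payload = json.loads(
    '[{"city": "Portland", "temp_f": 72}, {"city": "Seattle", "temp_f": 65}]'
)

# Re-emit the same records as CSV for a downstream service or spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["city", "temp_f"])
writer.writeheader()
writer.writerows(payload)
csv_text = buf.getvalue()
```

The converter is a few lines precisely because the "standards" involved are so thin; that is the electricity-adapter model applied to data.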

Check out Nicholas Carr's book "The Big Switch" for the back story on how industrial America moved towards standardized and centralized power utilities and how that parallels what's going on in cloud computing.

The Internet is popularly called "the web" for a reason: the name refers to its inherently chaotic data structure. Imposing standards to make sense of it won't help; it will only inhibit adoption. The web and cloud computing must be allowed to sprawl organically, uninhibited.

Final thoughts:
1) Vendors: Explain to customers which specific primitive cloud services you offer (Windows/Linux, HTTP/SSL/SMTP/FTP) and preempt the discussion on integration by providing APIs to raw computing resources.
2) Developers: Sorry. The days of learning one language or standard are over. "Embrace the cloud!" The next killer app exists at the convergence of several cloud services (think mashups blending location, maps, social graph, consumer reviews, and merchant offers... all in one).
3) Customers: Sorry about the noise. We'll get this sorted out as soon as possible and ensure the cloud "just works". Thank you for your patience :-)
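The "convergence of several cloud services" point above can be sketched in a few lines. This is a toy mashup with all service responses stubbed inline (the venues, ratings, and offers are invented for illustration); real code would fetch each piece over HTTP from a different provider:

```python
# Stubbed responses from three hypothetical cloud services.
location = {"lat": 45.52, "lon": -122.68}        # from a geolocation API
reviews = {"Cafe X": 4.5, "Cafe Y": 3.8}         # from a consumer-reviews API
offers = {"Cafe X": "10% off espresso"}          # from a merchant-offers API

# The mashup's value is in the join: the top-rated venue near the user,
# together with any active offer it has.
best = max(reviews, key=reviews.get)
result = {"near": location, "venue": best, "offer": offers.get(best)}
```

Each upstream service exposes only primitive data; the application-level value comes entirely from combining them.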

Monday, 26 July 2010 19:58:53 (Pacific Daylight Time, UTC-07:00)
Wednesday, 28 July 2010 02:10:57 (Pacific Daylight Time, UTC-07:00)
Some excellent food for thought, Mike. It's difficult to actually identify where a standard might sit before it's called a paradigm standard. Does it sit within a language used in that paradigm, or is it the language itself? Could a group of languages or approaches be called standards, or is it necessarily just one of these? I think we need a standard to govern questions before we can proceed ;)
Saturday, 07 August 2010 23:40:39 (Pacific Daylight Time, UTC-07:00)
A related topic is an overview of Cloud Computing Standards