# Monday, 22 February 2010

I'm presenting "Developing In The Cloud" at the Portland JavaScript Admirers Group this month. If you're in Portland this Wednesday, February 24th, 2010, come check it out 7pm-10pm!

The talking points are below. There'll be several visual examples and live coding demonstrations.

Monday, 22 February 2010 10:40:29 (Pacific Standard Time, UTC-08:00)
# Sunday, 21 February 2010

There are several bank-specific rules for validating credit cards, but all card numbers can be validated using the Luhn algorithm. The Luhn algorithm, also known as the Luhn formula or "modulus 10" algorithm, is a simple checksum formula used to validate a variety of identification numbers.

It is not intended to be a cryptographically secure hash function; it was designed to protect against accidental errors, not malicious attacks. Most credit cards and many government identification numbers use the algorithm as a simple method of distinguishing valid numbers from collections of random digits.

Programmers accustomed to using '%' as a modulus operator need only shift gears slightly and use the Math.mod(int1, int2) library function when working with Apex.
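
For example, where C-style syntax would read sum % 10, the equivalent in Apex is Math.mod(sum, 10), as in this trivial sketch:

Integer remainder = Math.mod(17, 10);	// 7; Apex has no '%' operator
System.assertEquals(7, remainder);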

public static boolean isValidCheckSum(List<Integer> numbers){
	Integer sum = 0;
	Integer len = numbers.size();
	
	// Walk the digits from right to left (the rightmost digit is the check digit).
	for (Integer i = len - 1; i >= 0; i--)
	{
		if (Math.mod(i, 2) == Math.mod(len, 2))
		{
			// Double every second digit, starting with the digit to the left
			// of the check digit, and add the digits of the product to the sum.
			Integer n = numbers[i] * 2;
			sum += (n / 10) + Math.mod(n, 10);
		}
		else
			sum += numbers[i];
	}
	// Valid numbers sum to a multiple of 10.
	return Math.mod(sum, 10) == 0;
}
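
For completeness, here is one way the method might be called from the same class, assuming the card number arrives as a string. The isValidCardNumber helper and the sample input are purely illustrative, not part of the original class:

// Illustrative helper: strip non-digits, collect the digits left to right, then validate.
public static boolean isValidCardNumber(String cardNumber){
	String digitsOnly = cardNumber.replaceAll('[^0-9]', '');
	List<Integer> digits = new List<Integer>();
	for(Integer i = 0; i < digitsOnly.length(); i++){
		digits.add(Integer.valueOf(digitsOnly.substring(i, i + 1)));
	}
	return isValidCheckSum(digits);
}

// From a test method or anonymous Apex:
// '4111 1111 1111 1111' is a widely used Visa test number that passes the mod 10 check.
System.assert(isValidCardNumber('4111 1111 1111 1111'));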
Sunday, 21 February 2010 12:15:00 (Pacific Standard Time, UTC-08:00)
# Tuesday, 16 February 2010

I spent a few hours tackling this problem, so hopefully this blog post will spare someone else the difficulty of using Salesforce Spring '10 features before the release is broadly available.

Spring '10, aka version 18, is a big release for Salesforce developers. The current version of the Force.com IDE for Eclipse defaults to the version 16 API.

Once Spring '10 was installed on my org (NA1), I committed to using the v18 features on a couple of new projects. Unfortunately, the Spring '10 release hit a snag, so the release is now being staggered over 30-45 days and the new Force.com IDE release has been put on hold.

Fortunately, there are a couple of workarounds for using the v18 API and features today.

Option 1)
a) Create an Apex class as you normally would in Eclipse and accept version 16 as the default.
b) Eclipse will create a second file next to the class with a '-meta.xml' suffix.
c) Edit the -meta.xml file by changing the apiVersion to 18.0, then save (a sample is shown below).
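
For reference, the -meta.xml file is just a small block of XML. After the edit it should look something like this (the status value may differ for your class):

<?xml version="1.0" encoding="UTF-8"?>
<ApexClass xmlns="http://soap.sforce.com/2006/04/metadata">
    <apiVersion>18.0</apiVersion>
    <status>Active</status>
</ApexClass>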

Option 2)
a) Create an Apex class through the Salesforce browser interface but do not accept the default version 18. Instead, select version 16.
b) Open the Salesforce project in Eclipse and confirm the Apex class is available in the IDE (the current IDE apparently won't automatically sync classes above v16; at least that was my experience).
c) Back in the Salesforce browser, change the version from 16 to 18.

Finally, depending on which approach you used, you'll need to synchronize the Eclipse IDE with the Salesforce servers using either the "Deploy to" or "Refresh from" server option: right-click the class and choose it from the Force.com menu (Option 1 requires "Deploy to"; Option 2, "Refresh from").

Hope this helps!

Tuesday, 16 February 2010 18:28:19 (Pacific Standard Time, UTC-08:00)
# Tuesday, 02 February 2010

Every once in a while I run into a requirement in Apex where I think there must be an obvious way to do something, only to hit a wall and decide to develop the functionality myself.

When this happens, the result ends up being either:
a) I later discover there actually is a faster/better way to do it and I feel stupid for taking the NIH (Not Invented Here) path.
b) The functionality really is missing and the effort invested turns out to be well spent.

Let's just hope the time spent on this wrapper class is in the latter category :-)

In this use case, a custom database object is wrapped in an Apex class that abstracts the underlying object properties and provides some methods. One of the properties (a custom database field) is a multipicklist stored in semicolon-delimited format, but it is better represented in object-oriented terms as a List. Methods for adding and removing items from the list are also needed.

public class Foo{
	private Foo__c m_record = null;
	public Foo(Foo__c record){
		m_record = record;
	}

	private MultiSelectProperty m_categories = new MultiSelectProperty();
	public List<string> Categories{
		get{return m_categories.ToList(m_record.Category__c);}
	}
	public void AddCategory(string category){
		m_record.Category__c = m_categories.Add(m_record.Category__c, category);
	}
	public void RemoveCategory(string category){
		m_record.Category__c = m_categories.Remove(m_record.Category__c, category);
	}
}
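
As a rough sketch of how the wrapper reads from calling code (assuming the Foo__c custom object and Category__c multipicklist field used above; the category values are arbitrary samples):

Foo__c record = new Foo__c();
Foo foo = new Foo(record);

foo.AddCategory('Blue');
foo.AddCategory('Green');
System.assertEquals(2, foo.Categories.size());	// 'Blue' and 'Green'

foo.RemoveCategory('Blue');
System.assertEquals(1, foo.Categories.size());	// only 'Green' remains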

The MultiSelectProperty class is defined below. The unit tests should help tell the story of how it can be used and the assumptions it makes.

//Manages semicolon-delimited multipicklist data. 
global class MultiSelectProperty {
	
	public List<string> ToList(string storedValue){
		List<string> values = new List<string>();
		if(storedValue == null || storedValue == '')
			return values;
		else{			
			string[] splitValues = storedValue.split(';');
			for(string v : splitValues){
				if(v != null && v != '')
					values.add(v);
			}
			return values;
		}
	}
	
	public boolean Contains(string field, string value){
		List<string> values = this.ToList(field);
		for(string v : values){
			if(v==value){
				return true;
			}
		}
		return false;
	}
	
	public string Add(string field, string value){
		if(field == null)
			field = '';
		
		if(value == null || value == '')
			return '';
		
		if(this.Contains(field, value))
			return field;
		
		if(field.endsWith(';')){
			return field + value;
		}
		else
			return field + ';' + value;
	}
	
	public string Remove(string field, string value){
		if(field == null || field == '')
			return '';
		
		if(value == null || value == '')
			return field;
		
		if(!this.Contains(field, value))
			return field;
		
		List<string> values = this.ToList(field);
		string formattedFields = '';
		for(string v : values){
			if(v == value)
				continue;
			formattedFields += v + ';'; 
		}
		return formattedFields;		
	}
	
	@IsTest public static void tests(){
		final string SINGLE_VALUE_NULL = null;
		final string SINGLE_VALUE_EMPTY = '';
		final string SINGLE_VALUE = 'first option;';
		final string SINGLE_VALUE_EMPTY_TRAIL = 'first option';
		final string DOUBLE_VALUE = 'first option;second option;';
		final string TRIPLE_VALUE = 'first option;second option;third option;';
		
		MultiSelectProperty categories = new MultiSelectProperty();
		System.assert(categories.ToList(SINGLE_VALUE_NULL).size() == 0);
		System.assert(categories.ToList(SINGLE_VALUE_EMPTY).size() == 0);
		System.assert(categories.ToList(SINGLE_VALUE).size() == 1);
		System.assert(categories.ToList(SINGLE_VALUE_EMPTY_TRAIL).size() == 1);
		System.assert(categories.ToList(DOUBLE_VALUE).size() == 2);
		System.assert(categories.ToList(TRIPLE_VALUE).size() == 3);
		
		System.assert(categories.Contains(SINGLE_VALUE, 'first option') == true);
		System.assert(categories.Contains(SINGLE_VALUE, 'second option') == false);
		System.assert(categories.Contains(DOUBLE_VALUE, 'second option') == true);
		System.assert(categories.Contains(DOUBLE_VALUE, 'third option') == false);
		System.assert(categories.Contains(TRIPLE_VALUE, 'first option') == true);
		System.assert(categories.Contains(TRIPLE_VALUE, 'second option') == true);
		System.assert(categories.Contains(TRIPLE_VALUE, 'third option') == true);
		
		string fields = categories.Add(SINGLE_VALUE, null);	
		System.assert(categories.ToList(fields).size() == 0);
		fields = categories.Add(null, null);
		System.assert(categories.ToList(fields).size() == 0);
		fields = categories.Add(null, '');
		System.assert(categories.ToList(fields).size() == 0);
		fields = categories.Add(SINGLE_VALUE, '');
		System.assert(categories.ToList(fields).size() == 0);
		
		fields = categories.Add(SINGLE_VALUE, 'second option');
		System.assert(categories.ToList(fields).size() == 2);
		System.assert(categories.Contains(fields, 'second option') == true);
		
		fields = categories.Add(SINGLE_VALUE_EMPTY_TRAIL, 'second option');
		System.assert(categories.ToList(fields).size() == 2);
		System.assert(categories.Contains(fields, 'second option') == true);
		
		fields = categories.Add(DOUBLE_VALUE, 'second option');
		System.assert(categories.ToList(fields).size() == 2);
		System.assert(categories.Contains(fields, 'second option') == true);
		
		fields = categories.Remove(null, '');
		System.assert(categories.ToList(fields).size() == 0);
		
		fields = categories.Remove(null, null);
		System.assert(categories.ToList(fields).size() == 0);
		
		fields = categories.Remove(SINGLE_VALUE, '');		
		System.assert(categories.ToList(fields).size() == 1);
		
		fields = categories.Remove(SINGLE_VALUE, null);		
		System.assert(categories.ToList(fields).size() == 1);
		
		fields = categories.Remove(SINGLE_VALUE, 'second option');
		System.assert(categories.ToList(fields).size() == 1);
		System.assert(categories.Contains(fields, 'second option') == false);
		
		fields = categories.Remove(SINGLE_VALUE, 'first option');
		System.assert(categories.ToList(fields).size() == 0);
		System.assert(categories.Contains(fields, 'first option') == false);
		
		fields = categories.Remove(TRIPLE_VALUE, 'first option');
		System.debug('results from remove ' + fields);
		System.assert(categories.ToList(fields).size() == 2);
		System.assert(categories.Contains(fields, 'first option') == false);
		System.assert(categories.Contains(fields, 'second option') == true);
		System.assert(categories.Contains(fields, 'third option') == true);
	}
}
Tuesday, 02 February 2010 12:23:21 (Pacific Standard Time, UTC-08:00)
# Saturday, 30 January 2010

Have you ever wondered about the "priority" field on support Cases? It makes sense for internal use, but I've never seen a priority other than "High" when the option is given on customer portal Web-to-Case forms. :-)

Saturday, 30 January 2010 16:27:09 (Pacific Standard Time, UTC-08:00)
# Tuesday, 26 January 2010
(Note: This blog post is based on an idea I have for setting up a non-profit dedicated to promoting the Creative Commons licensing model for music using cloud infrastructure. I may revisit and modify this post periodically as the model gets fleshed out.)

As a musician growing up in the '80s, I found that the "hip" thing to do was to sample other works and incorporate them into new arrangements. It was, and still is, a very creative and rewarding process that produces what are commonly referred to today as "mashups". I typically used jazz samples. Others, like Vanilla Ice, used more recognizable samples from artists like Queen and David Bowie.

The artists whose work was sampled rarely received royalties for the redistribution of their work. Eventually, litigation started and a new precedent was established that prevented the creation of new works from existing works without the owner's consent.

I had no problem with the concept of attributing other people's work or sharing in any commercial success that resulted from repurposing their work into my own. I got in the habit of working with the Harry Fox Agency before pursuing such projects and budgeting appropriately for using samples. But still, artists (and most of their publishers) had to be "pulled" into remixing opportunities.

Some visionary artists, like Peter Gabriel, went out of their way to "push" their samples to us and gave us control of the faders to remix their songs into new and unimagined soundscapes.

Lawrence Lessig's work "Remix" touched on the very heart of this issue, and several questions are being raised about the future of music in light of the entertainment "cloud".

Isn't Danger Mouse's (probably not his real name) Grey Album a brilliant piece of creative work and a masterpiece in its own right, even though it is a mashup of existing digital media?

Should a parent be penalized for adding background music to the video of their child's 5th birthday and publishing it to YouTube?

Would a song like Chris Brown's 'Forever' have achieved such stellar commercial success had it not been viewed on YouTube over 40 million times in non-commercial uses?

There is a solution to this problem. The Creative Commons was established as a nonprofit corporation dedicated to making it easier for people to share and build upon the work of others, consistent with the rules of copyright.

They provide free licenses and other legal tools to mark creative work with the freedom the creator wants it to carry, so others can share, remix, use commercially, or any combination thereof.

To truly embrace the cloud and the Creative Commons involves:
  • Artists and labels rethinking the publication of their works and issuing a "2.0" release of them as reusable samples for use in other works.
  • Artists allowing non-commercial use of their works and defining the terms of commercial use.
  • Mashup artists attributing works to other artists and participating in the re-distribution of works through both commercial and non-commercial channels.
There are those who oppose this model.
  • Performance Rights Organizations (PROs) like BMI and ASCAP are entrenched in securing performance royalties on behalf of commercial artists.
  • The ability for artists to allocate ownership percentages to studio musicians and other contributors threatens the traditional 50/50 publisher/artist split in common use today and would displace traditional royalty accounting systems.
  • Commercial music labels are wedded to decades-old formulas that control the 360-degree image and distribution of an artist's work.

To make this work:
  • New publishing and accounting systems are required that allow artists to offer the equivalent of fully vested 'stock options' on works.
  • New mashup publishing systems that allow original and mashup artists to negotiate the percentile distribution of commercial redistribution revenue (for example, an artist may use 5 samples in a song, giving each artist 10% and keeping 50% for the new work. The other 5 artists should have a voice in accepting proposed terms).
  • Artists must consider bypassing traditional "record deals" and work with progressive publishers.
  • Artists must write off the non-commercial use of their music as a marketing expense.
  • Artists must package their work for convenient use by mashup artists and editors.
  • Consumers and Mashup Artists must attribute use of all works to the original artist.
  • It must be convenient for a Consumer or Mashup Artist to register their new works for commercial use and revenue must be quickly, easily, and fairly redistributed.

Tuesday, 26 January 2010 16:03:30 (Pacific Standard Time, UTC-08:00)
# Monday, 18 January 2010

Apex is a programming language supported by the Force.com platform.

A fluent interface is a way of implementing an object-oriented API that aims to produce more readable code.

An example of a fluent interface looks like:

Contact c = new Contact();
ITransaction trx = new FluentTransaction()
		.Initialize(c, 5.0)
		.Validate()
		.Execute()
		.Log()
		.Finish();

Transactions are a good use case for implementing a fluent interface because they often execute the repeated steps of initializing and validating input values, executing, and logging data. An API designer can expressly define these methods in an interface using Apex (very similar to Java and C# interfaces). When a transaction object is implemented, the compiler checks that the declared interfaces are fully implemented.

A fluent interface is normally implemented using method chaining to relay the instruction context to subsequent calls. Generally, each method defines the context through its return value: by returning a self-reference (return this), the new context is the same object as the last, so calls can be chained together.

public class MethodChaining {
	
	public interface ITransaction{		
		ITransaction Initialize(Contact c, double d);
		ITransaction Validate();
		ITransaction Execute();		
		ITransaction Log();
		ITransaction Finish();
		boolean IsValid();
		boolean IsSuccessful();
	}
	
	public class FluentTransaction implements ITransaction{
		private Contact m_contact;
		private double m_amount;
		private boolean m_isSuccessful = false;		
		private boolean m_isValid = false;
		
		public ITransaction Initialize(Contact c, double d){
			m_contact = c;
			m_amount = d;
			return this;
		}
		
		public ITransaction Validate(){
			//Validate the m_contact and m_amount inputs
			//Setting this to false will instruct other execution chain methods to halt
			m_isValid = true; 
			return this;
		}
		
		public ITransaction Execute(){
			if(IsValid()){
				// Execute the transaction here
				m_isSuccessful = true;
			}
			return this;
		}
				
		public ITransaction Log(){
			if(IsValid()){
				//Add any transaction audit logging here
			}
			return this;
		}
		
		public ITransaction Finish(){
			if(IsValid()){
				//Finalize method. Manage any required cleanup
			}
			return this;
		}
		
		public boolean IsValid(){
			return m_isValid;
		}
				
		public boolean IsSuccessful(){
			return m_isSuccessful;
		}		
	}
	
	public MethodChaining(){
		Contact c = new Contact();
		ITransaction trx = new FluentTransaction()
			.Initialize(c, 5.0)
			.Validate()
			.Execute()
			.Log()
			.Finish();
		
		if(trx.IsSuccessful()){
			//Do something
		}
		else{
			//Do something else
		}
	}
}

Monday, 18 January 2010 15:06:41 (Pacific Standard Time, UTC-08:00)
# Saturday, 09 January 2010
With the economy being what it is, I can think of no better resolution for a small business owner/entrepreneur than to focus on creating jobs.
Ideally these jobs will be created in the Portland, OR metro area, but we're accustomed to operating with distributed teams and are happy to continue doing so.

Web Application Design / Development
With our shift away from .NET and onto the Force.com platform, a person in this role will need to be competent with:
  • Salesforce API and custom/native objects
  • Apex Controllers and Classes / Object modeling
  • Visualforce Pages / Eclipse
  • Javascript / JQuery / AJAX
Possible job creation triggers:
We're actively involved in re-architecting the i-Dialogue platform to run on Force using the above technologies. This product will be launched on the AppExchange with a Freemium business model. As we start to see adoption, the long-term strategy of the product will be to develop and offer add-on premium application modules.
Demand for custom module development, such as custom shopping carts or product configurators, will also spark growth.

Professional Services Developer / Project Manager
The Pro Service Developer is competent with what's available "out of the box" from Salesforce and our core products, and offers last mile customization and development services in direct response to customer and project requirements.

Responsibilities include:
  • Custom Visualforce template design and branding
  • Custom layout
  • Configuration and implementation services
  • Weekly progress meetings. Customer facing support and requirements gathering.
Desired skills:
  • JQuery UI / Themeroller / Custom Themes
  • HTML / Javascript
  • Visualforce (UI emphasis - no custom controller development)
Possible job creation triggers:
  • Clients and customers requiring custom branding of Salesforce Sites and Portal templates.
  • Custom website, landing page, and portal development on Force.com
  • Multi-channel campaigns executed across email and landing page sites
  • Integration projects involving Sites, Customer/Partner Portal, and Ideas

Account / Sales Manager
The AppExchange, coupled with our concept of a Site Gallery, will automate much of the sales and marketing process; however, an Account/Sales Manager will be required to assist current and prospective clients in assembling the right mix of software and services for their solution and to provide quotes.

Possible job creation triggers:
Once the free version is released, we'll shift to offering premium add-on modules within 2 months.


Risks in 2010
The freemium model is somewhat new to us. We're giving away significant value and functionality in the free version, so it's very likely that only 5-20% of all customers will require our premium services, which in turn will enable new job growth. However, this model leverages the scalability and distribution of the cloud and requires no additional effort on our part to provision new computing resources.

The Web Application Design and Development position requires a Software Engineering approach to using Javascript. This is a common skill found in Silicon Valley, but not so much in Portland, OR. It may be difficult to find just the right person(s) for this role.

Most support functions are handled by Pro Service and Account Managers today, but there may be a need for a specific support role in the future.


Conclusion

The actual number of jobs in each position may vary, but these are the 3 primary job functions we'll seek to create in 2010. The products and features we have planned in 2010 are embracing the cloud in ways unimaginable a couple years ago and I'm very excited to wake up each day in pursuit of these solutions. For me, software development has always been about the journey, and surrounding myself with creative, innovative, and passionate individuals on this journey is important.

If past success is any indicator of the future, then I think our new direction will be successful. Of the 60,000+ customers on Salesforce, many are always seeking to gain more value from their CRM investment by deploying Sites and Portals. By running natively and offering services directly on the Force.com platform, rather than at arms length in a hybrid configuration, we're now able to offer much richer applications and solutions.

If you know of anyone in the Portland, OR area that has these particular skill sets, please have them contact me at mike@cubiccompass.com.

Saturday, 09 January 2010 14:11:55 (Pacific Standard Time, UTC-08:00)
# Wednesday, 30 December 2009

Occam's Razor is the principle that "entities must not be multiplied beyond necessity" and, by extension, that the simplest explanation or strategy tends to be the best one. (See Wikipedia)

The simplest constructs for cloud computing are storage and processing. These 2 basic services are consumed by any number of Internet-connected clients, often described as the '3 screens' of PC, Mobile, and TV.

In Azure, the storage construct is sub-categorized into its simplest components of Table, Blob (file), and Queue storage. Processing is sub-categorized as either a 'Web Role' or a 'Worker Role'.

Compositions may utilize these native constructs in any combination. A web role may work independently or utilize table storage. A client may directly access cloud storage or a processing service.

These five basic constructs can be arranged in 120 different orderings (5! = 120). When designed to be stateless, basic compositions can be scaled by adding additional storage or processing resources on demand.

Wednesday, 30 December 2009 12:28:55 (Pacific Standard Time, UTC-08:00)
# Tuesday, 29 December 2009

Every year during the holiday break I look forward to digging into a fun software project and learning something new. This year I developed a simple application called "Up or Down" using the Microsoft Windows Azure cloud platform. Windows Azure represents a huge bottom-up R&D effort by Microsoft to support highly scalable Internet-connected applications across a number of devices and platforms.

(Jump to the 5 minute video walkthrough)
Download the source code from Codeplex.

"Up or Down" is a simple service-monitoring application that periodically polls a number of websites and reports their status as "Up" or "Down". The application has two basic roles: a web role for serving web pages and a worker role that runs in the background polling services. Both the web and worker roles have access to the same Azure Table storage service.

Project Structure
The web and worker roles each have their own project spaces. Because the Azure table storage is shared by both roles, a third library project represents the data model and is referenced by both roles.

This project was developed using Visual Studio 2010 Beta 2 with the November 2009 Azure SDK.

Azure Table Storage
Two tables are defined: one for storing information about the websites to be monitored (ServiceRequestDefinition) and one for storing ping responses (ServiceResponses).

The service definition table is very simple and contains just a few rows in this example. The partition key is based on the HTTP request verb used to monitor a service. Up or Down currently only makes HEAD requests for content, but can be expanded to do full page GET requests and conduct string matching on the response.

The response table is designed to be more scalable. Monitoring 10 websites once per minute will create 10 x 60 x 24 = 14,400 response records per day and over 5 million records per year. The partition key is in the format YYYY-MM for the fast retrieval of response records by month; for example "2009-12". The response table also captures the response time in milliseconds for each request.

User Interface
Up or Down is a simple one page application that communicates with the server via JQuery AJAX calls.

  • JQuery UI is used for the tabs and style sheet.
  • Click events are handled using JQuery.
  • The "Add New Service" tab is a simple web form for defining a service name and URL.
  • Each monitored service has a toolbar of options to view a report, edit, or delete the service.
  • Anonymous users can view dashboard status and drill down to recent response history.
  • Simple authentication requires a password to update or delete data.
  • The current status dashboard can be refreshed.
  • Report sparklines are generated using the Google charting API.

Data Access : Server side AJAX Handler
An ASMX web service named db.asmx is included in the web role project to handle browser requests, perform CRUD operations against Azure table storage, and format JSON responses back to the browser. Each C# model class has a ToJSON() method for serializing responses.

  • The web service declares the [System.Web.Script.Services.ScriptService] attribute to support calls from javascript on the browser.
  • Web service methods are declared with the [ScriptMethod(ResponseFormat = ResponseFormat.Json)] attribute to allow JSON to be passed back to the browser.
  • The open source JSON2 parse and stringify library is used by the browser client to serialize javascript objects to JSON for passing to db.asmx.
  • The Json.NET library is used server-side to format JSON responses from Azure tables.

The jQuery AJAX method asynchronously handles both the success and error responses from the server and in turn updates the user interface.

Configuration and Initialization
The polling frequency is externalized to the service definition file. When the worker role starts, it looks for this configuration setting and otherwise falls back to a default polling frequency of 60 seconds.
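
A sketch of how that might be wired up in the Azure service model files, assuming a worker role named WorkerRole and a setting named PollingFrequencyInSeconds (these names, and the UpOrDown service name, are illustrative rather than taken from the actual project): the setting is declared in ServiceDefinition.csdef and given its value in ServiceConfiguration.cscfg.

<!-- ServiceDefinition.csdef: declare the setting on the worker role (names are illustrative) -->
<ServiceDefinition name="UpOrDown" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WorkerRole name="WorkerRole">
    <ConfigurationSettings>
      <Setting name="PollingFrequencyInSeconds" />
    </ConfigurationSettings>
  </WorkerRole>
</ServiceDefinition>

<!-- ServiceConfiguration.cscfg: supply the value (60 seconds here) -->
<ServiceConfiguration serviceName="UpOrDown" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WorkerRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="PollingFrequencyInSeconds" value="60" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>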

Evolution
The current state of this project is just functional enough to prove some basic concepts. Some features that will likely be added next:

  • Multi-tenancy: Modify the partition keys to include an OrgId or ClientId to support multiple unique accounts.
  • Profile and permissions model: Support multiple profiles with varying access permissions per account.
  • Subscription and email notification: Notify profiles when sites go down.
  • Average response time: Use the history of response times to update performance metrics, such as average response time.
  • Additional reports: Add the ability to query response history by date range.


5 Minute Video Walkthrough

This 5-minute video walks through some of the high-level points of "Up or Down".

Tuesday, 29 December 2009 16:24:38 (Pacific Standard Time, UTC-08:00)