This isn’t the forum to develop a complete explication of how to compare the analytics from an A/B split, but I do want to use an example to point out some of the weird and interesting results an A/B test can tease out.

I’m sometimes called upon to come up with subject lines and copy for campaigns, a task I enjoy and at which I fancy myself pretty darn good. But for one campaign we ran an A/B split test pitting the subject heading “Half-off for the Holidays” against my “Everything half-off (even the partridge in the pear tree)”. I was pretty confident I had the winner, but in comparing the results we noted that while the open rate was higher for mine, the blander heading held a slight but clear advantage in clicks. Worse yet, the blander heading had clearly resulted in more conversions to sales.

So what had happened?  There are many ways to interpret the data, but here are just a couple:

1.) People preferred the bland subject heading (I don’t believe it, but it is the simplest interpretation and we like to shave with Occam’s razor here at Sailthru.)

2.) People on that particular list are the type that likes simpler subject lines (i.e. I didn’t know my audience. It’s quite possible, but given the hip nature of the company sending the email, I still don’t believe it.)

3.) Perhaps the clever subject heading got some recipients to open the email who otherwise wouldn’t have. Ah! Now we’re on to something. (I’m not just saying this because it makes me sound better, I swear.)

Another way of phrasing this third interpretation is that those who clicked through and purchased were among the site’s most engaged users, and weren’t heavily influenced by the cleverness of the subject line (or lack thereof). If that’s so, then why did the simpler version result in more sales? That’s not clear, though it’s possible that, given the list size, the results were skewed by one or two heavy buyers.
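That’s the kind of question a quick significance check can help answer. Here’s a minimal sketch of a two-proportion z-test (the counts below are made up for illustration, not our actual campaign numbers) showing how a conversion gap that looks decisive can be indistinguishable from noise on a smallish list:

```python
# A minimal sketch, with hypothetical counts, of a two-proportion z-test:
# is the gap between two conversion rates bigger than chance would explain?
from math import sqrt, erf

def two_proportion_z_test(conv_a, sent_a, conv_b, sent_b):
    """Return (z, two-sided p-value) for the difference between two rates."""
    p_a, p_b = conv_a / sent_a, conv_b / sent_b
    pooled = (conv_a + conv_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Normal approximation to the two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split: the bland heading converts 60 of 5,000 sends,
# the clever one 45 of 5,000. Real difference, or list-size noise?
z, p = two_proportion_z_test(60, 5000, 45, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p ≈ 0.14 here: too close to call
```

With a p-value well above the usual 0.05 cutoff, a gap like that could easily be the fingerprint of a couple of heavy buyers rather than a genuine preference.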

However, since our goal for that particular email was conversion to sales, my rationalizations couldn’t hold sway and we went with the simpler heading. But if our focus had been building the size of our active list, perhaps we would have run the other campaign; it had shown a small but decisive advantage in getting people to open the email. Of course, none of this takes into account the monumental influence that relevant and interesting content has on behavior once the email has been opened. But that’s a topic for another posting. Just remember, when running an A/B test, take a close look at your analytics; they may reveal some interesting and unexpected behavior. Even if it’s your own.
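To make that goal-metric point concrete, here’s an equally minimal sketch (again with made-up numbers) of how the very same split crowns different winners depending on which metric your campaign is optimizing for:

```python
# A minimal sketch, with hypothetical numbers, of letting the campaign goal
# pick the winner: the same results favor different variants per metric.
variants = {
    "Half-off for the Holidays": {"sent": 5000, "opens": 900, "clicks": 110, "convs": 60},
    "Everything half-off (partridge)": {"sent": 5000, "opens": 1100, "clicks": 95, "convs": 45},
}

def winner(goal):
    # Rate per email sent; goal is "opens", "clicks", or "convs"
    return max(variants, key=lambda name: variants[name][goal] / variants[name]["sent"])

for goal in ("opens", "clicks", "convs"):
    print(f"Optimizing for {goal}: {winner(goal)}")
# opens          -> the clever heading wins (list-building goal)
# clicks, convs  -> the bland heading wins (sales goal)
```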