Thursday, December 20, 2007

Some ElfYourself statistics

Following on from the previous ElfYourself post, I found an article from Promo Magazine about the 2006/2007 Christmas ElfYourself viral campaign. The article sings the praises of the little elves - lots of mainstream press coverage, increased website traffic, lots of buzz, etc. These are some of the specific highlights:

At its peak, this site generated an average of 41,000 elves per hour or 11 elves per second. Revelers created 11 million elves in all, and the site drew 36 million visitors in the five weeks it was up, from Nov. 27 through Jan. 1. Better yet, it helped boost traffic at OfficeMax.com by 20%.
So there is no doubt a lot of elves were created. Yet did any of this translate into real worth for OfficeMax? A 20% rise in site traffic seems good, but if you look at Alexa (and yes, Alexa has issues with its sample), www.officemax.com had similar increases in the three years prior to running the campaign - it's called the 'holiday shopping season'.


If you look at Google Trends for 'Office Max' searches, you get similar results - the years without the campaign look pretty good in comparison - although there is a slight upward trend recently.


Besides, some increase is to be expected. If you have a viral campaign that generates tens of millions of page views, you would expect, by sheer chance alone, x% of those people to click through to OfficeMax just curious to see what else is happening. Are they motivated to purchase once they are there? Maybe. Given that it's a web campaign, it shouldn't be hard to figure out, yet I can only find site traffic statistics, not incremental sales, as a success metric for last year's effort.

When you do a brand campaign in a non-direct-response medium such as TV or radio, you accept the limitations of measurement. Surveying and tracking awareness and changing impressions of your brand is about the best you can do (not always, but usually). There is no such excuse on the Net. The medium is demonstrably measurable. If you set it up right, you should be able to measure the incremental sales effect of an effort such as this, as well as the brand effect.

Maybe OfficeMax did measure it this way? If they did, you would think they would be shouting sales metrics from the rooftops if the campaign was successful.

I think marketing in general would be considerably better off if its practitioners were more transparent about success and failure. The worst is to tout victory for efforts like this with scant proof that they do indeed work.

Fred Wilson, a prominent NYC VC, posted about this campaign on his blog - 'expect to see more efforts like this' was his summation. He rightly points out that the CPM (cost per thousand impressions) is far superior to a TV effort. That's true, but it doesn't lessen the need for the return to prove profitable. Making nothing for less is still making nothing. Kudos to Matthew Reinbold for pointing this out in the first comment.
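The CPM point is simple arithmetic, and so is its limitation. A back-of-the-envelope sketch (the dollar figures and the TV impression count here are made-up assumptions - only the 36 million visitor figure comes from the stats quoted above) shows both why the CPM looks great and why that alone proves nothing:

```python
# Back-of-the-envelope CPM comparison. The campaign cost and the TV
# impression count are invented for illustration; only the 36 million
# visitor figure is from the quoted campaign stats.

def cpm(total_cost, impressions):
    """Cost per thousand impressions."""
    return total_cost / (impressions / 1000)

viral = cpm(500_000, 36_000_000)  # hypothetical spend vs. reported visitors
tv = cpm(500_000, 2_000_000)      # same spend on a hypothetical TV buy

print(f"viral CPM: ${viral:.2f}, TV CPM: ${tv:.2f}")
# A lower CPM only means the impressions were cheaper. If none of them
# convert to sales, the return is still zero regardless of the CPM.
```

The cheaper impressions make the campaign look like a bargain, but CPM is a cost metric, not a return metric - which is exactly the gap the missing sales figures leave open.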

It's not hard to REACH people with an idea. It's not even that hard to make the idea MEANINGFUL to them. It's damn hard to translate that meaning into VALUE for your brand or your bottom line.

Despite all this, I still made an elf - with the caveat that I did click through to the OfficeMax site, stayed for 10 seconds, bought absolutely nothing, and probably won't go back until next year when the call of the elf dance echoes across the web once more.

Merry Christmas everyone!

PS: I don't think I make a good elf, lol.



Monday, December 17, 2007

Elf Yourself - could have been so much more!

Many of you would have come across the ElfYourself site from OfficeMax - I think it is ranked in the top 20 for holiday internet traffic right now.

It's a cute site. You can upload a picture of yourself and up to three friends/family members, crop the pictures to just the heads and then put them on the top of dancing elf bodies. All very viral stuff.

I sent one to my wife, who loved it. I asked her afterwards what else she remembered about watching it. Not much came to mind. She had no idea it was from OfficeMax. I asked her what she might do now that she knew it came from OfficeMax - she didn't really know. Yes, it's a sample of one, but based on my experience probably a very common reaction.

OfficeMax just created one of the best viral efforts of the season, had millions of people sending and receiving their effort, yet none (or very little) of what was sent or received concerned the brand. Yes, they would have generated a bit of awareness for the holiday season - which is good, as top-of-mind choice is important in retail - but it could have been so much more.

Where is the daily competition for the best elf that wins you a $20 OfficeMax voucher - redeemable at any store, so it drives online traffic or foot traffic? Where is the picture of the elves working away hard in their shop, with OfficeMax equipment strewn all around so people can click on the products the elves use? Where is the ability to print your dancing elf free with any purchase at OfficeMax before December 25th?

The list could go on. Just because you have a great viral idea doesn't mean there aren't ways to promote your brand and drive people to action at the same time.

10/10 for the idea, 2/10 for the execution.


Thursday, December 13, 2007

US Firm Size

I was just looking around the IBM Many Eyes site and found that you can now embed visualizations into blogs!

One that I have always been fascinated with is the distribution of revenue in US businesses.

Below is the number of non-employer and employer firms in the US - the vast majority are small, one-person outfits. Try playing with the selector on the bottom right; it changes whether you are looking at the number of firms, employees or revenue.

It's amazing how much revenue is tied up in such a small number of entities. Food for thought.


Sunday, December 9, 2007

UPDATE: The 'squeaky wheel' wireless card

Well, the Linksys wireless USB card came in the mail today - the one I bought despite the number of bad reviews it got on Amazon.

So far so good. It gets a little quirky if you pull it out while the machine is running, but other than one driver reinstall when that happened, no problems. It connects fast and hasn't dropped the connection yet.

When I was looking back at the reviews for this product, one caught my eye. It was from a poster called mhk1999. It caught my eye because I remembered the handle from somewhere. Sure enough, after looking back at other sites selling the same product, this reviewer had pretty much spammed the same negative review everywhere:

Amazon
PC World
Epinions
Shopping.com
Pricegrabber

When this product works, it works. However it will occasionally shut down and then the only way to reconnect is to shut down the whole computer. If you are in the middle of several things like working on photos, graphics and the internet, you have to shut down all those programs, restart the computer to get the internet to work and then pull up all your work again. Simply unplugging the adapter will not do it and there is no way to "repair" connection except shutting down the computer. I've had it with this adapter.

Now, whoever this person is, they are definitely entitled to do this. However, from a fellow consumer's point of view, it heavily skews the overall picture of this product toward the negative.

This is one person's view from one trial of one product, posted AT LEAST 5 times on separate sites they definitely didn't all buy from.

Just another example of how self selecting samples can be dangerous.


Saturday, December 8, 2007

The WSJ "numbers" guy

The Wall Street Journal (WSJ) has a "numbers" guy who just posted an article on how numbers can be very misleading.

He makes a good point, despite coming close to arguing himself out of a job :).

The good point is that the term 'statistical significance' has taken on such an aura in popular culture and business that anything claiming 'significance' from data is assumed to be true and important.

It's generally not - on either account. Large sets of data can produce all sorts of strange results and small sets all sorts of wrong conclusions.

Just something to keep in mind.


Wednesday, December 5, 2007

The "squeaky wheel" syndrome

All this talk about Facebook and the value of networks and recommendations reminded me of an issue I have had for a long time with Amazon reviews.

The problem is the 'squeaky wheel' syndrome - the 'squeaky wheel' is the only one you hear. Or put another way, you tend not to care when things go right, only when they go wrong.

If you work with service companies, you see this phenomenon reflected in customer feedback. I've seen industries with anywhere from 10-to-1 complaints-to-compliments, to over 30-to-1! People are more likely to give you feedback (negative feedback) when something goes wrong. You can bet your house on it.

Amazon collects reviews from millions of people every day. If you take a look at the ratio of good reviews to bad reviews, it is nowhere near 30-to-1 or even 10-to-1. But this isn't surprising. Amazon promotes reviewing as a way to help people make choices. People are glad to help, so they post positive as well as negative reviews.

However, I guarantee that if you look at the distribution of reviews (the good-to-bad ratio), it is still a poor indication of how likely you are to be happy with a product - just as the 30-to-1 complaints-to-compliments ratio of a service company tells you nothing about how good the service is (that particular company had less than 1% of all customers complaining).

Case in point: this is the review distribution on Amazon for a cheap wireless USB adapter.


If you read the 8 reviews that gave it a "1", you would probably never buy this product. They are scathing. And those 8 reviews are 30% of all reviews given! If you take all 27 reviews as representative of the 'average' experience with this product, you've got a 30% chance of ending up with something you will hate. Not great odds.

But it's all a big lie. There is no way Linksys is going to release a product to market that fails, completely, in 30% of cases. It's just not going to happen.
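A toy model shows how the gap arises. Suppose (all numbers below are invented for illustration - this is not real Linksys data) the adapter genuinely fails for only 3% of buyers, but unhappy buyers are far more likely to write a review than happy ones:

```python
# Toy model of review self-selection. The failure rate and review
# propensities below are made-up assumptions, not real data.

def negative_review_share(failure_rate, p_review_unhappy, p_review_happy):
    """Fraction of posted reviews that are negative, given that unhappy
    buyers review at a different rate than happy buyers."""
    negative = failure_rate * p_review_unhappy
    positive = (1 - failure_rate) * p_review_happy
    return negative / (negative + positive)

# 3% true failure rate; 20% of unhappy buyers review vs 1.5% of happy ones.
share = negative_review_share(0.03, 0.20, 0.015)
print(f"{share:.0%} of posted reviews are negative")  # prints "29% ..."
```

A product that works for 97% of buyers can still rack up roughly 30% one-star reviews - which is exactly the pattern in the distribution above.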

What has happened is that Amazon has drastically improved the ratio of bad to good reviews, but not to the point that you can be confident they are representative of the 'average' experience.

This may sound a little nit-picky, but it's important to know. It's the same phenomenon at work when CNN posts the results of opt-in polls - a self-selecting group giving its opinion, something vastly different from actually polling a population.

It's kind of an obvious point, but it's amazing to see how persuasive Amazon reviews (indeed, customer reviews in general) are in people's purchase decisions, when most of them tell you little about your chances of receiving a dud (they do have other uses, of course).

I ended up buying that adapter despite the bad reviews. I will be big about it and post my experience (good or bad) when it arrives and I get it working.


Tuesday, December 4, 2007

We live most of our lives between 0 and 100

I just came across this nifty applet that lets you look at the popularity of every number between 0 and 100,000 - based on the results of a prominent search engine.


Not surprisingly, we live most of our lives happily ensconced between 0 and 100.

I don't think there is anything earth shattering about this - it's pretty much common sense. But it's nice to see it displayed - that 'aesthetic' value of information again.

It also reminded me of a use of this type of distribution that is not commonly known. Apparently you can tell if someone is cheating on their tax forms by looking at the distribution of digits they use. If it's more uniformly random than it should be (too many digits occurring with the same frequency), they might be cheating. Genuine tax returns should have skewed distributions - more leading 1s than 7s, for instance. When we cheat, we tend to be a bit too random - some important advice there.
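The digit check described above can be sketched in a few lines. This is a minimal illustration of comparing leading-digit frequencies against the logarithmic distribution (Benford's Law, as a commenter later identified it); real fraud detection is considerably more involved:

```python
import math
from collections import Counter

def benford_expected(d):
    """Benford's Law: P(leading digit = d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

def leading_digit(amount):
    """First digit of an amount, e.g. 2045 -> 2."""
    return int(str(abs(int(amount)))[0])

def first_digit_profile(amounts):
    """Observed frequency of each leading digit 1-9 in a list of amounts."""
    counts = Counter(leading_digit(a) for a in amounts if int(a) != 0)
    total = sum(counts.values())
    return {d: counts.get(d, 0) / total for d in range(1, 10)}

# Genuine financial figures skew heavily: about 30% of amounts should
# start with a 1, while only about 6% should start with a 7.
print(f"P(1): {benford_expected(1):.3f}, P(7): {benford_expected(7):.3f}")
```

If the observed profile from a return is much flatter than the expected one - the "too random" pattern - that's the red flag.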

I can't find a link for this, but I know this type of phenomenon has a name... need to do some digging.


PS: I don't condone tax fraud. I love the IRS. No, really, I do.

PPS: Isabel came to the rescue and commented that the phenomenon is called Benford's Law - thanks Isabel, I needed that. It was driving me crazy.


Saturday, December 1, 2007

Lies, damned lies and Magazine Subscriptions

Chris Anderson over at the Long Tail just posted this on magazine subscription offers. I'd highly recommend reading it if you are in the business of trying to persuade a potential customer to subscribe to your service.

He talks about how magazine subscription offers are full of lies and half-truths to try and hook you in. We've all received many of these in the past. I especially like the ones that come three weeks after a 'free trial issue' and warn you about an impending lawsuit if you don't pay your overdue subscription. Great scare tactics.

In the past, a positive ROI from these tactics was an indication that they were working. Not anymore. Where Chris might once have complained to a few friends, his wife, or his bowling team (ok, I really don't know if he bowls), these days he blogs about it. To thousands of people.

Blogs, Web 2.0ish technologies and online social networks have introduced very real consequences for marketers who try these tactics.

Please stop.
