Friday, November 16, 2012

Eye Tracking Revisited

A few years ago, I looked at Tobii's cool eye tracking technology as a possible means of evaluating the effectiveness of paint color merchandisers. I ended up getting a new job before I could complete the project, which was a crying shame given the phenomenally stupid metrics the company was using at the time to judge the display's effectiveness, such as the number of color chips pulled per year. Like discrete choice research or any of the other "real life" simulation tools gaining in popularity (has anyone seen the growth of Affinova lately?), eye tracking opened the potential for us to figure out what the consumer really wanted to see rather than what we wanted the consumer to see.

So I was excited to see that one of the Next Gen Market Research 2012 award winners was a company I had never heard of called Eye Track Shop. They claim to have perfected the ability to perform eye tracking with a regular webcam rather than the expensive equipment Tobii requires. If market researchers on the client side got the tiniest bit creative with this technology, and it really works, this drop in cost could spark a revolution in a huge number of businesses.

Even in our business making industrial hardware, the user interface is critical. We now have the potential to borrow a handful of users over the Web for short stretches of time to get reactions to early prototypes before we spend millions on tooling for a product that would never gain user acceptance. We could also easily test iterations of our asset management console to see which improvements made it more user-friendly. We could even present prospects with versions of our trade show displays to determine what grabbed the most attention.

Imagine the possibilities! What about A/B testing on physical packaging without ever having to ship the package? Store display pre-testing for seasonal merchandising? Improved impact testing of direct mail calls to action? All now possible with inexpensive eye tracking.

Makes me want to start a market research firm. Stay tuned.

Thursday, November 8, 2012

Simple Modeling

For all you people who thought I was going to talk about supermodels, you can stop reading now.

Today's post is about the kind of model you use to forecast sales or to gauge the effect of a future rebate or a new product introduction. I have been thinking a lot about this kind of modeling lately because of Nate Silver, the statistics genius who accurately predicted the election results two nights ago. Today, the Guardian published an awesome explanation of the likely contents of Nate Silver's model, which is worth reading in its entirety.

Although Silver apparently uses an advanced statistical technique called hierarchical modeling to perform his analysis, a manager needn't have a degree in statistics to use something more basic but still useful. I put together a similar but simpler model at Strategic Energy using Crystal Ball, an Excel plug-in (now owned by Oracle) that runs Monte Carlo simulations. The software let me define the inputs that affected energy prices and then run a series of simulations describing what would happen to electricity prices as those inputs fluctuated. I chose how each input would vary over time (for example, natural gas prices might follow a normal distribution within plus or minus 10%), and the model told me the statistical likelihood that our electricity price would fall into the range where we could compete against the regulated utility price.
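For readers without Crystal Ball handy, here is a minimal sketch of the same idea in Python. The distributions, coefficients, and the toy price formula are all illustrative assumptions I've made up for this post, not the actual Strategic Energy model:

```python
import numpy as np

# Minimal Monte Carlo sketch of the Crystal Ball exercise described above.
# Every number and the toy price formula are illustrative assumptions.

rng = np.random.default_rng(42)
n_trials = 10_000

# Baseline inputs (all hypothetical)
base_gas_price = 4.00     # $/MMBtu natural gas price
base_load_factor = 0.85   # demand/utilization input
utility_price = 62.0      # $/MWh regulated utility price to beat

# Let each input fluctuate around its baseline, e.g. gas price
# varies along a normal distribution with ~10% standard deviation.
gas_price = rng.normal(base_gas_price, 0.10 * base_gas_price, n_trials)
load_factor = rng.normal(base_load_factor, 0.05 * base_load_factor, n_trials)

# Toy formula translating the inputs into an electricity price ($/MWh).
heat_rate = 9.0      # MMBtu per MWh, hypothetical
fixed_adder = 20.0   # $/MWh for capacity, transmission, margin, etc.
electricity_price = heat_rate * gas_price + fixed_adder / load_factor

# The output that matters: how often do we land below the utility price?
prob_competitive = (electricity_price < utility_price).mean()
print(f"Chance of beating the regulated utility price: {prob_competitive:.1%}")
```

Crystal Ball wraps exactly this loop in a spreadsheet interface: you pick the distributions for each input cell, it re-runs the workbook thousands of times, and it reports the resulting distribution of the output cell.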

It's relatively easy to use this kind of modeling in all sorts of applications. I used it again at PPG to help forecast exterior paint sales, using simple inputs we knew affected our sales, such as temperature, rebates, competitor rebates, advertising, and price competition. That analysis showed how unprofitable our existing rebate program was and how dramatically temperature spikes increased our paint sales; both findings led to savings and better on-shelf inventory at our retail customers.
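A similarly simple sketch shows how that kind of rebate read can fall out of the simulation. Again, every coefficient and baseline below is an invented placeholder, not PPG's actual data:

```python
import numpy as np

# Toy version of the paint-sales model: all numbers are invented for illustration.

rng = np.random.default_rng(7)
n_trials = 10_000

base_units = 100_000                        # hypothetical weekly gallons
temp_anomaly = rng.normal(0, 5, n_trials)   # degrees F above/below normal
rebate = 3.0                                # $/gallon rebate being evaluated
margin = 8.0                                # $/gallon gross margin (hypothetical)

# Assumed sensitivities: warm weeks lift sales, the rebate lifts sales a little.
temp_lift = 0.02 * temp_anomaly             # +2% of sales per degree F
rebate_lift = 0.03 * rebate                 # +3% of sales per rebate dollar

units_no_rebate = base_units * (1 + temp_lift)
units_with_rebate = base_units * (1 + temp_lift + rebate_lift)

# Incremental profit = margin on the extra gallons minus rebate paid on every gallon sold.
incremental_profit = (
    (units_with_rebate - units_no_rebate) * margin
    - units_with_rebate * rebate
)

print(f"Average incremental profit from the rebate: ${incremental_profit.mean():,.0f}")
print(f"Probability the rebate loses money: {(incremental_profit < 0).mean():.1%}")
```

With these made-up sensitivities the rebate pays out on every gallon but only moves a few percent of volume, which is exactly the kind of result that makes a rebate program look far less attractive once you model it instead of eyeballing it.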

Amidst all this usefulness, I'm constantly amazed when managers prefer to rely on experience and judgment rather than data to make decisions. Crystal Ball costs all of $995. Why leave your decisions up to chance when you can get fairly accurate help from a fairly simple model at a fairly cheap price (or free, if you're willing to learn the R statistics package)? Alternatively, you could spend hundreds of millions of dollars and just ignore the models like this guy did. Good luck with that.