Saturday, August 15, 2009

How to create a great analytical report

"Joy in looking and comprehending is nature's most beautiful gift" -- Albert Einstein

How many times have you thought, or been told, that "we need to understand A", and automatically assumed that this means "we need a report on A"? After this conclusion, things usually get into gear - people get together to come up with metrics, they task analytical (code word for "data puller") or IT people with producing the numbers, and eventually put it all in a nice regular email report. Sometimes things work out, but sometimes the report becomes just one of those efforts that is never used afterwards. In my particular place of work, not only does the process spit out something ugly and often completely useless, it also consumes a Herculean effort along the way. The end result is usually hailed as a great success, and one of these days you get an email stating that the great new report you have been asking for (no, that was not me) has been created and published, but we can't send it over email because it is more than 10 MB zipped, so go pull it yourself. Oh, and by the way, the report does not include some of the data because we could not trace it properly in the setup we had created, so it is pretty much not useful anyway.

How do you keep this situation from happening? Here are some of my thoughts.
  1. Find out what the true question is. The assumptions about what should be in the report most often come from two camps - the executives and the data pullers. Whenever an executive asks for a report that is too specific, like "show the percentage of customers whose discount expires next month", beware. Chances are that what the executive thinks will answer the question is not what will answer it best, and often it will not answer the question at all. My personal take is to reframe the inquiry and simply go back to the root - ask the executives what question they are trying to answer. Most times, you will find that the answer lies in a totally different cut of the data than was initially assumed. The second camp of people I have encountered is the data people, particularly the IT people in the organization I am currently working for (this was not the case in my other jobs, so I have to make that caveat for them). For some reason, they tend to skip all of the initial exploration steps and jump to a conclusion about what kind of data will answer the question, without much regard to the question itself. Sometimes it is not their fault - "they are just doing their jobs" - but I have yet to see a great report built upon their assumptions.
  2. Research. I can't stress this enough, since skipping it is the most common error made in the process, especially if you are creating something new and not particularly well understood. Before you create a report, you must know what measures to put in it. Sometimes the measures are pretty simple, and sometimes they are not - there are a million ways to slice and dice the data, so if you are determined to spend your company's time and money on creating the report, it will pay a hundred-fold to come up with the best slicing and dicing technique. A good deep-dive research project should have the following properties: 1) it should explain how things work - i.e., what impacts what in that particular area, and by how much; 2) it should compare several metrics for the same process and look at them from different angles - by store/franchise/DMA/division/region, by day/week/month/year, by product type, month over month/year over year, and so on (see the slicing sketch after this list); 3) ideally, it should contain that best cut which you can simply lift from the presentation into the report, confident that tracking this measure (or measures) will tell you all you need to know about the issue going forward in the most efficient manner, meaning with the least number of measures possible.
  3. Revisit your decision about a regular report. Review your research project with peers and executives and decide if a structured, regularly updated report is truly needed. Even though it seems implausible, understanding most business processes does not require a regular report. Remember, the conclusion that "we need a report" was just a conclusion, and it may have been a wrong one. A deep-dive research project may be able to answer most if not all of the questions, and then the regular report is not needed. The decision may be to refresh certain parts of the deep dive at some later time, but not to turn it into a regular thing. There may be other reasons not to create a report, including: 1) the data quality is not good; 2) you have not been able to answer the question any better than existing reports do; 3) the data is too stable or too volatile, and tracking it will not be insightful (will it do you any good to know that 20% of your orders come from type A customers every month, and the number never budges even a little?); 4) there is insight in the research, but not enough to warrant regular reporting. Remember, a weekly report is 52 times the work of an annual report, and a daily report is 5 to 7 times the work of a weekly report. Too many regular reports will not promote learning but rather clutter the mailboxes, so be selective about which reports are produced on a regular basis.
  4. Listen to the data - it has a mind of its own. When you finally lift the chart that you want to recreate in a regular report, you have to evaluate it for appropriateness for regular reporting. The first question is how hard it is to pull. Sometimes the very best and most meaningful cut of the data is 10 times harder to pull than a cut that is just a hair lower in quality. Multiply the difference in difficulty by the frequency of the report, and see if it makes sense to go for the harder-to-get number. The second consideration is the amount of maintenance the report will require. What can be pulled once for a deep dive cannot always be turned into a daily email without daily maintenance of the code. Any regular data pull must be very stable, i.e., not impacted by customary changes in the information or operational systems. The other side of the maintenance consideration is how much you can automate the report, which should be tied to its frequency: daily reports must be "set it and forget it" automated (see the automation sketch after this list), while a monthly report may have some copying and pasting in it.
  5. Put metrics in context. One of the biggest mistakes I see made over and over is forgetting about the proper setting of the metric. Let's say your metric is total sales by region. Now, how do you know if those sales are good or bad? The regions may be of different sizes and have different sales trends or inherent idiosyncrasies, so comparing them to each other is not always meaningful. You may compare to the previous time period, but what if your business is seasonal? That's where we usually get to the year-over-year comparison. Now let's say your sales this month are 20% down compared to the sales a year ago (automakers in 2008-2009 are a good example); that must be bad - until you notice that in the previous months they were 30% down, so this month is actually an improvement. Proper trending is one of the most important settings in which to show your metrics. The regular report should be able to tell you whether the results are good or bad, otherwise it is not very meaningful, and nothing puts metrics in perspective like trending (see the trending sketch after this list). This also impacts the execution of the report, because it is one thing to pull the data for one month, and quite another to pull it for 15 or 24 months. In an ideal situation, the timeframe and the properties of the metric were well defined during the research stage.
  6. Now, finally, it is time to build and execute the report. First comes the format. I have always maintained that a pretty chart works magic - so do use pretty charts (a bare-bones charting sketch follows below). Second, think about the report delivery, the distribution list, putting it out on the website and all that pleasant stuff. Your analytical part is now over, but polishing your splendid work and delivering it correctly is the cherry on the sundae. You are working in marketing, after all. Good luck!
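
To make a few of the steps above concrete, here are some sketches. First, the slicing and dicing from step 2. This is only a minimal illustration in Python/pandas - the file name and the columns (store, region, product_type, order_date, sales) are made up for the example, so substitute whatever your own data looks like.

```python
import pandas as pd

# Hypothetical deep-dive input: one row per order, with the dimensions
# mentioned in step 2 (store, region, product type, order date).
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# The same measure (sales) cut several different ways, so the cuts can be compared.
by_region = orders.groupby("region")["sales"].sum()
by_product = orders.groupby("product_type")["sales"].sum()
by_store_month = (orders
                  .groupby(["store", orders["order_date"].dt.to_period("M")])["sales"]
                  .sum())

# Time-based cuts: monthly totals per region, then month-over-month
# and year-over-year changes.
monthly = (orders
           .set_index("order_date")
           .groupby("region")["sales"]
           .resample("M")
           .sum()
           .unstack("region"))
mom_change = monthly.pct_change()      # month over month
yoy_change = monthly.pct_change(12)    # year over year, for monthly data
```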
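
For the "set it and forget it" automation in step 4, here is one possible shape of a daily pull. Again, only a sketch: the database, table, and column names are hypothetical, and the point is simply that the query is stable, the output lands somewhere predictable, and failures are logged loudly instead of being discovered weeks later.

```python
import logging
import sqlite3          # stand-in for whatever operational database you pull from
from datetime import date

import pandas as pd

logging.basicConfig(filename="daily_report.log", level=logging.INFO)

# A stable query: it names its columns explicitly and does not depend on
# table layout details that routine operational changes are likely to break.
QUERY = """
    SELECT region, product_type, SUM(sale_amount) AS sales
    FROM orders
    WHERE order_date = ?
    GROUP BY region, product_type
"""

def run_daily_pull(db_path: str, out_path: str) -> None:
    run_date = date.today().isoformat()
    try:
        with sqlite3.connect(db_path) as conn:
            df = pd.read_sql_query(QUERY, conn, params=(run_date,))
        df.to_csv(out_path, index=False)
        logging.info("Pulled %d rows for %s", len(df), run_date)
    except Exception:
        # Fail loudly: a daily report you have to babysit is not automated.
        logging.exception("Daily pull failed for %s", run_date)
        raise

if __name__ == "__main__":
    run_daily_pull("warehouse.db", f"daily_sales_{date.today():%Y%m%d}.csv")
```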
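
The year-over-year trending from step 5 might look like this, assuming a hypothetical file of monthly sales by region (again, the names are my own placeholders):

```python
import pandas as pd

# Hypothetical monthly sales by region (say, 24 months of history) - for example,
# the "monthly" table produced in the slicing sketch above.
monthly = pd.read_csv("monthly_sales.csv", index_col="month", parse_dates=["month"])

# Year-over-year change: each month compared to the same month a year earlier.
yoy = monthly.pct_change(12)

# The trailing 12 months of year-over-year figures, so this month's -20% can be
# read against the -30% of the preceding months instead of in isolation.
trend = (yoy.tail(12) * 100).round(1)
print(trend)
```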
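
And for step 6, a bare-bones "pretty chart" of that trend. I use matplotlib here, but any charting tool will do; the file and column names are placeholders.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical trended metric: monthly year-over-year sales change by region,
# e.g. the "trend" table from the previous sketch saved to a file.
trend = pd.read_csv("yoy_trend.csv", index_col="month", parse_dates=["month"])

fig, ax = plt.subplots(figsize=(8, 4))
trend.plot(ax=ax, marker="o")
ax.axhline(0, color="grey", linewidth=0.8)   # reference line: flat year over year
ax.set_title("Sales, % change year over year, by region")
ax.set_ylabel("% change vs. same month last year")
ax.set_xlabel("")
fig.tight_layout()
fig.savefig("sales_yoy_trend.png", dpi=150)
```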
