The adage “Show, don’t tell” is probably familiar advice. It’s a trope of undergraduate writing instruction, capturing the intuition that it’s better to experience a demonstration of something for ourselves, rather than to just hear about it second-hand. I’ve written before about the importance of communication skills in the analyst’s toolkit, and today I want to discuss a particular challenge that I personally struggle with as a professional analyst: when do you show, and when do you tell?
I’ll tell you upfront: virtually all junior analysts start out by showing too much instead of telling. The junior instinct is to follow a logical, chronological narrative thread: we start by explaining the background and context, why we even began to work on this particular problem. Then we get into the data we looked at, explaining the different steps we took to get a clean-enough dataset to work with. We follow this with an explanation of the analysis and modeling, the actual technical nuts and bolts of what we had to do. Finally, we arrive at some results from the analysis, and perhaps some actionable takeaways or recommendations.
The junior analyst has taken their audience on a journey, woven a narrative that shows – not tells – us what they found and how they found it. It’s a carefully crafted, coherent explanation showing their work – very much like how you might show your mathematics teacher the work you did to arrive at your final answer. This is exactly how academia teaches the budding analyst to communicate, and for all practical purposes, this approach is basically wrong.
The most common piece of advice I give analysts looking for feedback on communication – and ironically, one of the most common pieces of feedback I’ve personally received on my own professional presentations – is to show less and tell more. We data people have to work hard to overcome this instinct – nurtured by our schooling, and perhaps also partly inborn in our skeptical nature – to show the work we did, to prove the rigor of our analysis.
Bottom Line Up Front
The US military originated the slogan “Bottom line up front,” or “BLUF,” which is meant to reinforce the importance of starting any message with the intended conclusion and takeaway. Asserting your conclusion before explaining how you reached it runs completely counter to the analyst’s natural mode of communication. But most of the time in managing a business, even if executed imperfectly, putting the bottom line up front is more effective than the chronological academic style that we instinctively prefer.
When you’ve done your job well and your audience trusts you, they don’t want to hear you prove yourself. Getting into all the reasons they should believe you is, quite literally, a waste of their time if they already trust you and your work.
In academia, who you are and what you’ve done before is, in principle, irrelevant: you need to establish the credibility of your research in and of itself to get anyone to take you seriously. We need this rigor to search for truth in an academic setting – but it is not how most management teams or businesses operate.
When you’re trying to get things done rather than prove a theory, and you’ve established that someone is a reliable source of useful information, it’s pointless to force them to prove themselves over and over as if you didn’t know whether to trust them.
This doesn’t mean that the business audience doesn’t care about how you reached your conclusions, or what any caveats might be. Putting the bottom line up front means there’s still a back to the message. Instead of “show, don’t tell,” my advice would be “tell, and then show.”
If you have some management consulting experience, or have worked with ex-consultants, this advice may sound familiar. It’s not much different from Barbara Minto’s famous Pyramid Principle, now often taught to junior consultants as a simple framework for outlining presentations and memos.
In my experience, the main reasons BLUF seems to work so well in management contexts are:
The most important people in the room are often the busiest and have the least context; thus, they are the least able to appreciate fine details. Putting the conclusion first ensures they hear your key findings before they have a chance to zone out.
The non-technical, non-data people are typically the ones who can actually do something to shift the business based on the analysis – but they don’t always need to know what analysis was done in order to act (and arguably rarely do); they just need to know the results.
Even technically-fluent people eager to dive into the details of the work benefit from knowing its outcome upfront – they can exercise their skepticism more effectively as they review how the work was done, if they know where it ultimately leads.
If you’ve been paying close attention, you might have noticed that I put the bottom line up front in this very section, telling you my conclusion before showing you my reasoning. Would I have grabbed your attention as effectively if I’d followed the more traditional chronological narrative, showing you my thought process before stating the conclusion I believe you need to take away?
Know Your Audience and Cater to Them
You might have noticed a thread running through the three major reasons I listed for why I like BLUFfing so much: despite being a simplistic, one-size-fits-all recommendation, putting the bottom line up front is actually a great way to ensure your communications serve as broad and varied an audience as possible. “Know your audience” is another cliché of basic public speaking advice, but it’s not something the data person fresh out of school usually knows how to apply when presenting their analysis.
There are many ways to tailor a message for different recipients, but one simple practice that ensures the same message is received well by diverse groups is putting the bottom line up front. The academic explanation of technical analysis followed by the results is well-suited to a crowd interested in how the work was done, but very ill-suited to non-technical groups who just want to know what this work means for them. Meanwhile, telling everyone about the results first, and then showing the work to anyone who wants to stick around for the details, ensures a much wider group of stakeholders will find the message relatable and actually remember it.
That said, “show, don’t tell” is going to be more acceptable and occasionally even more effective for very particular audiences. I find that when sharing interim updates or looking to provoke open-ended discussion, especially within a community of technical peers, it’s totally fine and sometimes preferable to just show the chronological order of what you’ve been doing and where it’s led so far.
But when preparing something that will be seen by a broad audience outside one’s immediate team or those in a position to give detailed tactical feedback on the work, I find it’s best to tell them upfront what you’ve found, and then show how you got there. And when presenting the work live to a general audience – or especially a senior audience – you’ll want the bulk, if not all, of the presentation to be focused on your conclusions, not how you got there.
Sell the Audience on Your Results
When there’s an analysis worth sharing, that’s usually because there’s something the analyst thinks the business should do. Sometimes there’s a concrete recommendation; sometimes it’s just a vague strategic direction to consider; and sometimes it’s “there’s really nothing data can do for you here right now, so please stop asking your analysts to dig into this.”
Regardless of what the analyst hopes the business will do with their work, that hoped-for action is what makes communicating their findings imperative. In the educational setting, the point of an analysis report is to show whoever’s reading it (usually a teacher of some kind) that we did the work, and that we actually knew how to do it. The only action we hope for from our audience is to give us the grade that we want.
If we analysts were to stay in academia, the point of the research papers we’d write would be to enable other researchers to reproduce our work. We’d need to outline our research methods in enough detail that someone else could follow the same steps and achieve the same results: this is the action that we’re hoping for from our most skeptical, engaged readers.
Neither of these academic outcomes is what we’re after when we present findings to a business stakeholder. The chronological outline of research methods is generally not going to convince a non-technical executive to do what you want them to do. Ideally, they trust your word as the analytical expert.
The business wants you neither to make them question your methods (which boring them with your detailed analysis outline is likely to invite), nor to waste their time re-establishing your credibility before getting to what the data actually means for them. If your boss hired you and allowed you to go in front of a business leader, it’s taken for granted that you have the credibility to tell them what the data says.
It can still be helpful to sell the business on our analytical credibility before we tell them our results, especially early in our careers. But the business is rarely in a position to assess the technical choices we made or the soundness of our approach. If you feel the need to reassure a business audience of your credibility, the most effective way to do so is typically to borrow the credibility of other analysts they already know and trust – your own management, your peers.
So, you might well wind up showing your work to others in the data community, walking them through your methods much like academia trained you to do. Successfully leading your peers through a review of your work can sometimes be a prerequisite for selling business leaders on it. This kind of peer review can benefit from the traditional approach of showing your methods, rather than telling your conclusions.
But when the work has passed your peers’ review, there’s even less need to show non-technical stakeholders the work you did. If you can tell the business that the inner workings of your modeling and analysis have passed those technical folks’ review, this is typically more than enough to satisfy any technical skepticism from the business.
Surviving an academic review of your work is not the litmus test you need to pass for successfully shipping an analysis project – at least not in the businesses I’ve been in. Ultimately, the point of doing the analysis was to get the business to do something with it. And when sharing these findings with the business, I’ve found it’s virtually always best to start by telling everyone upfront what you’ve found.