An interesting paradox of the analyst profession is that in theory we are supposed to be objective and factual: our job is to help decision-makers understand what’s happening to the business by taking empirical measurements. This job is not about having and staking out opinions; the whole raison d’être of the analyst is to be the dispassionate one who can cut through the haze of subjective opinions with the cold clarity of facts and numbers.

For analysts early in their career, this is fine: the core foundation of the job is knowing how to take measurements and learning the technical tools of the trade. But the more one advances as an analyst, the more I find it helps to have an opinion, and to voice it – even though this often goes against the very instincts that got us the job in the first place. Today, I thought I’d share some reflections on how wrestling with opinions and human judgment can shape a data professional’s career.

Analysts Just Want to Call the Balls and Strikes

I once worked for a COO who loved to say that my job was to “call the balls and strikes” – basically comparing the analyst to the umpire in baseball. Many analysts – as well as our stakeholders – see the job as simply following where the numbers lead. When it comes to numbers, we are the business’s objective truth-tellers.

I agree this is a big part of our role, but I also think it sells us short; it underplays the massive amount of judgment often required to do the analyst’s job. As I’ve previously written, a large part of how analysts earn our keep is by being excellent communicators, because our job is to help the non-numbers people in the business understand what the numbers mean. Since humans understand the world through stories, the analyst’s job is to use numbers to tell a story – or, put differently, to tell a story that reflects what the numbers seem to be saying.

This means analysts and data scientists are not mere dumb weighing scales; we are not just objective reflectors of reality. Especially when explaining complex matters, analysts inject our subjective perceptions and our own judgment (backed by our professional experience) into our work.

This is not just my opinion; it’s backed by research. Famously, one study gave 29 teams of scientists the exact same dataset and research question: do soccer referees tend to give dark-skinned players more red cards than light-skinned players? Despite being allowed to compare their analysis plans (but not interim findings) before finalizing their results, no two teams arrived at the exact same numerical result, and they split 20–9 on the question of whether referees show a statistically significant bias towards awarding more red cards to dark-skinned players.

This has little to do with personal beliefs infecting analysis: the study found no significant effect of the teams’ prior beliefs about the research question on the findings they ultimately reached. There is simply a vast amount of subjectivity and professional judgment required to perform the task of analysis: the study found 21 unique combinations of variables across the 29 teams’ analyses, and a large amount of the variation in the teams’ results was driven by whether they had chosen a logistic rather than a linear model to answer the question.

You don’t need to know what logistic or linear regressions are to see that the question of which model to use is itself a subjective judgment call that data scientists must make based on their professional experience; there is rarely a strictly right or wrong answer on which model to use, or which set of variables to build it with.
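To make that judgment call concrete, here is a minimal sketch – with entirely made-up data, not the study’s – of how the same binary outcome can be fit with either a linear probability model or a logistic model. Both fits are defensible, yet they produce different coefficients on different scales, which is exactly the kind of fork in the road the 29 teams navigated differently.

```python
import numpy as np

# Hypothetical data for illustration only (not the study's dataset).
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 1, n)                      # a made-up predictor
p_true = 1 / (1 + np.exp(-(-2.0 + 1.0 * x)))  # assumed true probabilities
y = rng.binomial(1, p_true)                   # 0/1 outcome (e.g. "red card given")

X = np.column_stack([np.ones(n), x])          # design matrix with intercept

# Option 1: linear probability model -- ordinary least squares on the 0/1 outcome.
beta_linear = np.linalg.lstsq(X, y, rcond=None)[0]

# Option 2: logistic regression -- fit here by plain gradient ascent
# on the log-likelihood (a library would normally do this for you).
beta_logit = np.zeros(2)
for _ in range(20000):
    p_hat = 1 / (1 + np.exp(-X @ beta_logit))
    beta_logit += 0.5 * X.T @ (y - p_hat) / n

# The two models answer the question on different scales
# (probability vs. log-odds), so their coefficients differ.
print("linear probability model:", beta_linear)
print("logistic model:          ", beta_logit)
```

Neither output is “the” answer; choosing between them, and deciding which covariates belong in `X`, is professional judgment rather than measurement.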

Data Analysis Is Often as Much Art as It Is Science

The artisanal craft of analysis is what makes me a little skeptical that AI or LLMs are coming for my job, at least in the very near future. The end-to-end process of talking to the business about their vague, loosely specified questions, turning those questions into something specific enough to be tractably answered with data, and then actually performing that analysis all requires a lot of human judgment.

Even if you ask an LLM to do this for you today, you will need a professional analyst to evaluate whether you can trust the LLM’s response; how would you know whether the AI made the right call to use a logistic instead of a linear regression model?

Despite all the intangible value analysts create in the un-automatable part of our job – the part that isn’t just calling balls and strikes – it’s easy for both analysts and stakeholders to fall into the trap of selling this work short. The farther you are from the mess of the data, the easier it is to think that analysts are simply taking straightforward measurements of objective reality rather than making judgment calls about what to measure and how it can even be measured.

And the type of person drawn to the world of data is often one who finds the concept of “data-driven decisions” appealing. Analysts often forget that leaving a decision up to what the numbers say doesn’t mean we haven’t made some decisions ourselves – precisely because the analyst personality is comforted by the aura of objectivity that a quantitative decision-making process gives off.

Think of the type of person drawn to the quantitative fields in school. We stereotype these people as robots: stoic, emotionally unaffected (something my wife and friends only half-jokingly call me). We are all human, and even the most analytical among us can feel deeply held emotion – but the kind of person drawn to analytics and data is still likely to be someone who prefers to make decisions by cold calculation rather than gut feeling.

This makes it unnatural at best, and deeply discomforting at worst, for the analyst personality to grapple with the inherent subjectivity and judgment in our work. All experienced analysts and data scientists know what it’s like to have our work critiqued by fellow analysts, or to review the work of fellow practitioners, and to realize there are many different, valid ways to skin a cat. But despite the practical knowledge that there are so many ways to do the same analysis, we are still reluctant to admit that our job sometimes relies more on gut feeling – or at least gut-informed judgment – than on there being one singular, objectively correct way to do the numbers.

But because there are so many possible ways to do an analysis, and therefore so many judgment calls the data scientist must make, it behooves them both to be aware of which subjective calls they are making, and to make those calls in line with how the business intends to operate. A big multinational bank will have a completely different risk tolerance than a startup that sells celebrity video messages, which means the risk tolerance analysts bring to their analyses at these companies has to differ too. The people in charge of analysis at these companies are not just “calling the balls and strikes” as impartial umpires; they are making subjective decisions based on what they think their business needs.

The Pretence of Cold Quantitative Objectivity Weakens the Analyst

Because analysts are often seen as just the “numbers people,” businesses – even ones that embrace analysts as profit centers – typically don’t involve analysts much in the decision-making process once the numbers have been presented and reviewed. Everyone knows, after all, that more usually goes into a decision than cold, hard quantitative analysis. Analysts may know how many users have clicked a button, but they won’t and shouldn’t pretend to know why users are clicking (or not clicking) it. That’s why other functions like Design and User Research are usually involved in the product development process at a technology company.

And even if you have some ideas as to why users are doing something, that doesn’t mean you know what to do about it. You often need an array of perspectives from product, marketing, operations, sales, and strategy (even if all these hats might be worn by only one or two people if you’re a small startup) in order to evaluate your options. Product might suggest moving the button into a different part of the user journey; marketing might suggest informing customers about it via email and push notifications; sales and operations might point out there’s an opportunity to talk about this new button in their agents’ scripts. Analysts rightfully won’t and shouldn’t be representing these perspectives when there are other people in the room who know much more about them.

But analysts who hide in this silo and refrain forever from participating in decision-making will hit a cap on their career progression. If your brain is truly empty of ideas about why users are doing something unexpected, you will never think of new analyses you could pursue to confirm or rule out hypotheses where the data might have something to show.

If you never have ideas about actions that the company could take – or should not take – to achieve its objectives, you’re never going to think of analyses that you could do to examine whether similar actions have been successful in the past, or might succeed in the future. Cutting analysts out of the conversation about what to do next leaves them poorly positioned to build on their work.

This is especially dangerous in a startup environment. In big, mature companies, every function I’ve named above – and more (Content, Compliance, Finance) – is going to be well represented at the table in every decision, down to the finest pixel. There’s less need for a vocal or opinionated analyst there; it’s still beneficial to have an analyst thinking about what comes next, but not mission-critical. In a startup big enough to collect meaningful data about its product, though, it’s essential for experienced analysts to speak up.

The analyst’s opinions won’t always – or even often – be right. But every single function at a startup is under-resourced, and every person in the room needs to be comfortable wearing multiple hats and thinking holistically about the business. By virtue of their proximity to both decision-makers and data on the health of the business, analysts often have a generalist perspective that is unrivaled.

And once they’ve done multiple tours of duty across different businesses and different economic environments, analysts will have seen many ideas that worked – and many that did not. At a startup, this experience is invaluable, because every single function there is comparatively immature, and so every function needs to pitch in ideas and think proactively about how to contribute to the decision process beyond their role’s minimum viable scope.

Opinions: An Analyst’s Nightmare, but Also Potential Superpower

The idea of having opinions – or ideas that don’t come directly from the data – is scary for most analysts. We are at sea when we have something to say that we can’t conclusively prove with data. The exact same analyst who has a hunch about something, and can even explain why they think the data points in that direction, can also explain all the reasons their hunch could be wrong. And they aren’t likely to share that hunch with you until they’ve tortured the data enough to figure out whether it’s right.

This instinct is, I think, the analyst’s superpower. By being able to see the quantitative pros and cons of various options, and also instinctively being drawn to question all the ways in which each approach might be flawed, analysts are uniquely able to frame up the empirical basis for any decision – and also call out all the subjective, non-empirical factors they can see or have heard about which might sway the decision another way.

Analysts are often well-positioned to frame and drive a decision-making process for a team. Because of their role, analysts often have the best quantitative information and have heard a lot about the qualitative factors influencing a decision. And so as long as they have enough structured thinking and communication skills, they can play a critical role in marshalling a team towards a decision by summarizing what they’ve seen and heard. 

Even though it’s the product manager’s job, not the analyst’s, I’ve never met a PM who didn’t appreciate their analyst going out of their way to produce an executive summary; PMs are too busy, and this task is often an easy one for an analyst to pick up.

On most cross-functional teams, the product manager or business lead should be driving the decision process. But while they may own that process and the ultimate decision, they realistically cannot devote the time to do everything an informed decision demands – or to explain the process of arriving at that decision to others.

An analyst who has already thought through the pros and cons of a decision, and who can put that thinking down into a clear document or slide deck, saves their product manager precious time and capacity. On a ship, the captain owns the decision of where the ship should go – but the helmsperson and navigator still play critical roles in aiding that decision.

Still, it’s rare for anyone to ask, let alone expect, an analyst to directly aid in driving the decision process. Most analysts won’t step up to do this proactively, or even ask whether they should. Most stakeholders won’t request it either. Every now and then you might get a manager from the data team who suggests it (I’ve often been that manager). But even then, it’s not common for the analyst or data scientist career ladder – the framework at a company that outlines which skills and competencies you need to get promoted – to call out structured problem-solving or building decision-making frameworks as core skills required for advancement. Most teams I’ve been on nod to this in some way, but typically as a bonus or ancillary skill.

And I think that is probably wise. Because most analysts instinctively don’t want to own or drive decisions, it seems inefficient to train them out of this instinct if you don’t need to in order to get what you need from them. Most – arguably all – startups don’t plan to stay small forever, and big companies definitely don’t need analysts to behave this way, even though they’d still benefit from it and should ideally welcome it. So why go out of your way to make your analysts uncomfortable by setting an expectation they don’t need to meet in order to do a decent job?

As a people manager, while I’ve often encouraged my team members to take the proverbial wheel in framing a decision for their stakeholders, I rarely push them to do this as forcefully as I – or many of the other incredibly talented analysts and data scientists I’ve worked with – would. Analysts are worth more than the sum of our parts when we can blend our knowledge of data with qualitative knowledge of the business, and can thoughtfully share both the facts and our opinions in a mature way. But that’s a lot to ask of the typical analyst.

When the Analyst Has an Opinion, They Ought to Own It

For my money, what is non-negotiable as an analyst is owning your opinion. It’s ok to shy away from aggressively pushing it. But it’s critical that you know what your opinion is – if you have one. Even analysts who are content to wait for marching orders benefit from being self-aware enough to recognize that they are human enough to have opinions, and from reflecting on why they came to hold them. When you truly don’t have an opinion, being aware of that is useful too: it could mean you should learn more about the area, or it could be a sign that this is a decision data shouldn’t be driving in the first place.

The benefit of actively participating in the cut-and-thrust of a decision-making process is that it forces everyone’s opinions – and the reasoning behind them – out into the open. Analysts, because the core of our role really is to “call the balls and strikes,” don’t get this practice of thinking about and sharing our opinions. Our personality type tends to avoid injecting opinions into conversation, and other functions and leaders rarely invite us to share much beyond the numbers. I think this tendency leaves us worse off as analysts – though luckily for our careers, it is far from fatal to success as an analyst or data scientist!

The tough news, though, is that the reactive analyst who shies away from sharing opinions or proactively shaping decisions is probably the type most immediately threatened by the rise of AI. I’ve used – and seen colleagues use – AI enough in the field to think the threat it poses to analyst jobs in the short term is overblown. But the type of analyst that current LLMs come closest to replacing is the one who waits to be told what to do, and who avoids engaging with business decisions.

Whether the robots eventually come for our jobs, or AI just becomes another generalist business tool like spreadsheets and SQL, there will still be a spread among analysts – there will always be some who want to shape business decisions more actively than just by pulling the numbers. Many analysts with this inclination eventually leave the field for another discipline – product, marketing, finance. Some, like myself, wind up building extensive data careers on the back of being the uncommon data scientist who proactively thinks about the business.

Of course, both paths are valid for a data professional; one sometimes leads to the other. In future posts, I’ll have more to say about how analysts with this bent can pursue these paths – though I hope for the professional analyst community’s sake that taking these paths continues to be a choice, rather than a necessity forced upon us by AI!
