
Let’s open with a question.  Have you ever been to a data show?  I call it Data Theater.

This is an afternoon business meeting where the feature presentation is a highly competent team member or vendor driving Tableau or performing some similar data visualization exhibition.  Such meetings are often held in response to a broad strategic ask.  Then, after several weeks of analysis, key influencers are corralled back together to witness the results in real time.

I admit, I too get sucked into demonstrations like this because they are fun.  Someone asks a relevant question, and with just a few mouse clicks the data projectionist changes the screen, and new answers appear as if by magic.  But then another, less specific question gets asked.  And another.  Again, these tangential results are just as easily displayed, and down the rabbit hole we go.

Here’s the telltale phrase to watch out for: “It would be interesting to see…”

It is with this opening that the trouble begins.  And it typically ends when the meeting time has run out, the projector has been shut down, no one is leaving with an action item, and some folks are wondering: “what just happened?”

Data analysis is both a science and an art.  It is also, very much, a profession requiring expertise.  So, where do we as leaders go wrong in influencing data analysis activities?  And what can we do about it?

First and foremost, we can work to understand our biases.  Whoa, what [record scratch sound effect]?!?

Surely you expected a different answer.  But a key part of our job as Chiefs, VPs, or leaders in any other significant role is to ask the right questions.  Unfortunately, and all too often, what we ask and how we ask our questions has been influenced by one type of bias or another.  While it is never our intention to act with bias, the fact is we often don’t even realize it is happening.

The following are a few common biases related to data analysis and decision making:

Availability Bias is the tendency for people to rely on information that comes readily to mind when making decisions.

Confirmation Bias happens as we are constantly on the lookout for evidence that supports our prior beliefs.

Distance Bias, which can be in terms of physical space, time, or other domains, reflects our instinct to prioritize what is closer at hand.

Experience Bias is when we assume our view of a given problem or situation constitutes the whole truth.

Outlier Bias hides the effect of outliers and anomalies.  Overreliance on averages, for example, can skew our observations, as can letting the exception “prove the rule.”

Similarity Bias occurs because humans are highly motivated to see themselves and those who are similar in a favorable light (also called Affinity Bias).  We gravitate toward people and answers that align with our own self views.

Just remember: the next time a Chief or VP asks to “see the data themselves,” they are probably gearing up for an exercise in Confirmation Bias—looking through the data to find the answers they already know to be true.

As data leaders, we must be accountable for using our data to create actionable insights, not just data shows.  Data can tell stories, but it must also lead to clear recommended actions.  Too often, all we are doing is engaging in a series of interesting “did-you-knows.”  But like every good story, our data stories need to end by resolving a conflict.  And, in the case of credit unions, they should end with a difficult business decision being made.
