1:05 Jackson: So, before we even crack open the software—whether that’s Tableau or Power BI—where do we actually start? Because my instinct is always to just start dragging fields into a workspace to see what happens.
1:18 Nia: That’s the classic trap, Jackson. It’s what I call "experimental clicking," and it’s how cluttered dashboards are born. The real pros tell us that great dashboards are won or lost before you ever touch a tool. You have to start with the "Why." Why does this specific dashboard exist? Is it to replace a manual monthly report, or is it a high-level health check for an executive?
1:40 Jackson: Right, because a CEO looking at a health check is going to want totally different things than a regional manager tracking daily inventory.
1:48 Nia: Precisely. You have to design for a specific audience, not "everyone." An executive wants those high-level KPIs and trends—the "big boxes with big bold numbers" as some users put it—whereas an operational user needs granular detail. If you try to serve everyone with one screen, you end up serving no one. One of the best frameworks for this is the "C-cubed" model—Clarity, Context, and Continuity.
2:12 Jackson: I like that. C-cubed. So, Clarity is about making the metrics obvious, but what about Context?
2:19 Nia: Context is huge. A number like "Revenue: eight hundred thousand dollars" means nothing by itself. Is that good? Is it bad? You need a benchmark, a target, or a trendline. Without that, you’re just giving them a raw number and asking them to do the mental math. And the final "C," Continuity, means keeping the visual language the same across the whole organization so users don't have to relearn how to read a chart every time they open a new report.
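[Editor's note: Nia's point about pairing a raw number with a target and a trend can be sketched in code. This is a minimal, hypothetical helper; the metric names, targets, and figures are illustrative, not from the episode.]

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    value: float
    target: float   # the benchmark that makes the number meaningful
    prior: float    # same metric from the prior period, for trend context

    def summary(self) -> str:
        # Variance to target answers "is this good or bad?" for the viewer,
        # so they don't have to do the mental math themselves.
        vs_target = (self.value - self.target) / self.target
        trend = "up" if self.value >= self.prior else "down"
        return (f"{self.name}: ${self.value:,.0f} "
                f"({vs_target:+.1%} vs target, trending {trend})")

revenue = KPI("Revenue", value=800_000, target=750_000, prior=780_000)
print(revenue.summary())
# → Revenue: $800,000 (+6.7% vs target, trending up)
```

The same "$800,000" now carries its own context: the viewer sees at a glance whether it beats the target and which way it's moving.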
2:45 Jackson: So it’s basically about reducing "cognitive load." I’ve heard that term a lot—the mental effort it takes to understand what you’re looking at.
2:53 Nia: Exactly. Cognitive load is the silent killer of dashboards. Classic research on working memory, Miller's "magical number seven, plus or minus two," suggests we can only hold about seven items in mind at once. So, if you’ve got fifteen different KPIs on one screen, the human brain just starts checking out. You want to aim for five to seven key metrics per view. Anything more than that should probably be on a different page or hidden behind a drill-down.
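[Editor's note: the 5-to-7 guideline is easy to enforce mechanically. Below is a hypothetical "KPI budget" lint check, not a feature of Tableau or Power BI; the view names are made up.]

```python
MAX_KPIS_PER_VIEW = 7  # upper end of the five-to-seven guideline

def check_kpi_budget(views: dict[str, list[str]]) -> list[str]:
    """Return a warning for each dashboard view that exceeds the budget."""
    warnings = []
    for view, kpis in views.items():
        if len(kpis) > MAX_KPIS_PER_VIEW:
            excess = len(kpis) - MAX_KPIS_PER_VIEW
            warnings.append(
                f"'{view}' has {len(kpis)} KPIs; move {excess} "
                f"to a drill-down or a second page"
            )
    return warnings

dashboard = {
    "exec_overview": ["revenue", "margin", "churn", "nps", "pipeline",
                      "headcount", "cac", "ltv", "burn_rate"],  # 9: too many
    "ops_daily": ["stockouts", "on_time_pct", "returns", "backlog"],  # 4: fine
}
for warning in check_kpi_budget(dashboard):
    print(warning)
```

A check like this could run in CI against a dashboard config file, catching "fifteen KPIs on one screen" before it ever ships.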
3:14 Jackson: That makes total sense. It's like a battle plan—you wouldn't put every single soldier’s name on a strategic map; you’d show the units and the objectives. So, if we’re limiting ourselves to seven things, how do we decide which seven make the cut?
3:29 Nia: You map every single element to a user decision. This is a rule I live by: if a metric doesn't trigger an action, it doesn't belong on the dashboard. Ask yourself, "What will the user do differently after seeing this number?" If the answer is "nothing," then it’s just decorative noise. You’re building a decision-support engine, not a piece of art.
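[Editor's note: Nia's "no action, no slot" rule amounts to a filter over a metric-to-decision mapping. The mapping below is a hypothetical example to show the mechanic.]

```python
# Each candidate metric maps to the decision it triggers, or None if
# seeing it changes nothing -- in which case it's decorative noise.
candidate_metrics = {
    "daily_revenue": "adjust today's pricing or promotions",
    "stockout_rate": "trigger a reorder for affected SKUs",
    "total_page_views": None,  # interesting, but drives no decision
}

# Only metrics with an attached action earn a place on the dashboard.
keep = [metric for metric, action in candidate_metrics.items() if action]
print(keep)
# → ['daily_revenue', 'stockout_rate']
```

Writing the decision down next to each metric makes the ruthless cut Jackson describes an explicit, reviewable step rather than a judgment call in the moment.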
3:49 Jackson: That’s a tough standard! But I can see how it forces you to be ruthless. You’re not just showing data because you have it; you’re showing it because it matters for the next ten minutes of that person’s workday.