From Data Overload to Data Storytelling
Districts share enormous volumes of data with boards and communities, but volume is not the same as clarity. When we present forty data points in twenty minutes, the likely outcome is confusion — not confidence. Data storytelling is not spin. It is the discipline of organizing information so a non-specialist audience can follow a coherent line from context to finding to implication.
There is a moment in almost every board meeting or cabinet discussion where a well-intentioned administrator shares a spreadsheet, a data table, or a set of charts — and the room goes quiet. Not because the data is alarming or revelatory, but because no one is entirely sure what they are supposed to take from it. The numbers are there. The formatting is clean. But the meaning is not self-evident, and the silence that follows is the sound of people trying to figure out what they are looking at.
I have been on both sides of that silence — as the person presenting the data and as the person trying to interpret it. And I have come to believe that the core problem in how most districts communicate data is not that we share too little. It is that we share too much, with too little attention to what the audience actually needs to understand.
The Overload Default
Districts generate enormous volumes of data. Assessment results, attendance patterns, enrollment figures, discipline records, intervention outcomes, demographic breakdowns, survey responses, budget allocations — the list is extensive, and every data point has value in the right context. The challenge is that value and volume are not the same thing, and our default tendency as administrators is to err on the side of more.
This tendency is understandable. When we present data to a board, a community group, or a state agency, we want to be thorough. We want to demonstrate that we are paying attention to all of it. We worry that leaving something out will invite the question we did not prepare for. And so we build thirty-slide decks, reports with twelve tables, and dashboards with every metric we can surface, because comprehensive feels responsible.
But comprehensive is not the same as clear. And in the experience of most people who sit through data presentations in education, the effect of comprehensiveness is not confidence. It is confusion. When a board member is shown forty data points in twenty minutes, the likely outcome is not that they understand the district's performance. It is that they remember the one number that concerned them and forget the rest.
We have confused the act of sharing data with the act of communicating meaning. They are not the same thing.
What Storytelling Actually Means
I want to be careful with the word "storytelling," because in education it can sound like spin. It is not. Data storytelling is not about making numbers say what we want them to say. It is about organizing information so that a non-specialist audience can follow a coherent line of reasoning from context to finding to implication.
A data story answers three questions in sequence. First: what are we looking at, and why does it matter? This is the context — the goal, the problem, the question the district is trying to answer. Second: what does the data show? This is the finding, presented simply and without requiring the audience to do interpretive work. Third: what does this mean for what we do next? This is the implication — the decision, the adjustment, the direction.
Most district data presentations skip the first question and the third. They go straight to the data — here are the numbers — and leave the audience to construct the context and the meaning on their own. When we present assessment results without first establishing what we were trying to achieve and what success looks like, we are asking the audience to do cognitive work that we should be doing for them.
This is not a matter of dumbing anything down. It is a matter of respecting the audience's time and attention by doing the interpretive work in advance and presenting a coherent narrative rather than a collection of numbers.
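The three-question sequence above is, at bottom, a fixed structure: every data story carries a context, a finding, and an implication, in that order. A minimal sketch of that structure as a Python data shape follows; the class name and the sample strings are illustrative inventions, not anything from a real district report.

```python
from dataclasses import dataclass

@dataclass
class DataStory:
    context: str      # What are we looking at, and why does it matter?
    finding: str      # What does the data show?
    implication: str  # What does this mean for what we do next?

    def render(self) -> str:
        # Present the three answers in sequence, context first,
        # so the audience never has to construct the meaning themselves.
        return "\n".join([self.context, self.finding, self.implication])

story = DataStory(
    context="Goal: raise third-grade reading proficiency by five points.",
    finding="Proficiency rose four points; the Tier 2 gap narrowed.",
    implication="Next step: expand the intervention protocol to fourth grade.",
)
print(story.render())
```

The point of the sketch is the ordering constraint, not the code: a presentation that supplies only the middle field is the "here are the numbers" pattern the section describes.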
The Difference in Practice
Consider two approaches to presenting literacy assessment data to a school board.
In the first approach, the administrator shares a table showing proficiency rates by grade level, subgroup, and assessment window. There are forty-eight cells in the table. Some are highlighted. The administrator walks through the rows, noting which subgroups improved and which declined. Board members nod. A few ask clarifying questions. The slide is archived.
In the second approach, the administrator opens with a single statement: last year, we set a goal to increase third-grade reading proficiency by five percentage points, with particular attention to students receiving Tier 2 interventions. Then the administrator shows one chart — third-grade proficiency over three years, with a line for Tier 2 students and a line for all students. The gap is narrowing. The administrator explains what changed: a new intervention protocol was adopted in September, fidelity was monitored monthly, and the results suggest it is working. Then the administrator names the next step: expanding the protocol to fourth grade in the fall.
The second approach uses less data. It communicates more. The board leaves with a clear understanding of what the district is working toward, whether it is making progress, and what comes next. The first approach shared more information but communicated less meaning.
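The reduction the second approach performs can be made concrete with arithmetic. The proficiency figures below are invented for illustration only; the technique is the point: collapse the forty-eight-cell table to one number per year, the gap between Tier 2 students and all students, and let the trend carry the story.

```python
# Hypothetical three-year proficiency rates (illustrative, not real data).
years = [2022, 2023, 2024]
all_students = [0.62, 0.65, 0.68]  # all third graders
tier2 = [0.41, 0.49, 0.58]         # students receiving Tier 2 interventions

# The story is one number per year: the gap, and whether it is narrowing.
gaps = [round(a - t, 2) for a, t in zip(all_students, tier2)]
print(gaps)  # -> [0.21, 0.16, 0.1]
```

Three numbers, moving in one direction, communicate more than the full table because the interpretive work has already been done for the audience.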
Why This Shift Matters for Trust
The connection between data communication and community trust is not abstract. When stakeholders — board members, parents, community members — encounter district data that they can understand, the effect is not just informational. It is relational. It signals that the district respects them enough to communicate clearly. It signals that the administration knows what matters and can articulate it without hiding behind complexity.
Conversely, when data is presented in ways that are difficult to parse, the effect is not neutral. It creates distance. A parent who opens a district dashboard and sees twelve charts with no context does not think "this district is thorough." They think "this was not designed for me." A board member who receives a forty-page data report before a meeting does not feel informed. They feel overwhelmed, and they begin to wonder whether the volume is intentional — whether the data is being shared in a way that discourages scrutiny rather than inviting it.
I do not believe this is typically the intent. But the effect is real, and it is worth taking seriously.
Making the Shift
Moving from data overload to data storytelling does not require new technology or additional staff. It requires a change in orientation — from "what data do we have?" to "what do our stakeholders need to understand?"
That shift begins with a simple discipline: before preparing any data for an external audience, identify the one or two things that audience needs to walk away understanding. Not the twelve things we could show them. The one or two things that matter most. Everything else is supporting detail, available on request but not leading the conversation.
It also requires us to think differently about what a "complete" data presentation looks like. Complete does not mean exhaustive. Complete means the audience received enough context to understand the data, enough evidence to believe it, and enough direction to know what happens next. If those three elements are present, the presentation is complete — regardless of how many data points it contains.
This is a challenging shift for administrators who have been trained to value thoroughness. But I would argue that the most valuable data presentations I have witnessed in my career were not the most comprehensive. They were the clearest. And clarity, in the end, is what builds the trust that our communities are looking for.
CongratsGrad builds embeddable, public-facing dashboards for K-12 school districts.