It’s never been more important for education systems to demonstrate impact from their professional learning programs. The policy shift from test-based outcomes to more flexible, comprehensive success standards has brought new possibilities — and challenges — to every level of operations, and that includes professional learning.
Join KickUp as we examine six of the most common roadblocks to thriving data cultures. With practical tips, guiding questions, and real-life examples of districts excelling through data, you can push your professional learning program to new levels of effectiveness.
Inertia is a fact of life. No matter how detailed or well-designed your data plan may be, it won’t flourish without consistent attention. But it’s not always feasible to add dedicated data meetings to your team’s already-cramped schedule. Instead, consider what opportunities for small but frequent reviews already exist in your system.
Today: creating consistent data checks that hone your skills and build your team’s momentum.
TACKLING COMMON DATA ISSUES | PROBLEM 6
Never underestimate the power of habit, particularly when it comes to good data practices. The number one danger for a fledgling program is an initial burst of enthusiasm for collecting data that then goes unused, wasting your time and killing the project’s momentum.
This doesn’t mean you need to review every piece of information you’re tracking every day; quite the opposite. Aim for a single key metric or an important (but limited) area where predictive data can provide immediate, valuable benefits. Then use the results to build credibility, excitement, and drive. Our top six tips for creating a consistent data culture:
Given the busy schedule of an instructional coaching team, adding new data review meetings is sometimes simply not feasible. Instead, look for review opportunities in existing calendar slots. Weekly team check-ins, one-on-one sessions, and quarterly all-district meetings all present a chance to go over the numbers as a group, even if it’s just five minutes at the start.
Building a data habit from scratch is no small task. Rather than crafting a full dashboard, select a point or two to focus on at the start. But how do you decide what those points should be?
As a rule of thumb, the more frequent the review, the more granular that single data point can be. A small group of instructional coaches that meets once a week has the familiarity needed to draw conclusions from one metric. A twice-yearly all-district staff meeting, on the other hand, is better served by a walkthrough of high-level trends.
Let’s say you’re tracking how new teachers rate their proficiency with the district’s PBIS framework. By the end of the first semester, your goal (based on previous years’ experience) is that 85% of the teachers will rate themselves as “proficient” or “highly proficient” in the framework.
Someone whose work doesn’t involve PBIS may be concerned if they check in halfway through the semester and find that only 28% of the teachers have met the goal. But someone who works in PBIS full-time knows from past years’ experience that there’s usually a sharp uptick in November, as the new teachers climb the learning curve and gain confidence in their skills.
The first person in this example doesn’t work with the project, so they only need a high-level view, such as the overall numbers for the year. The second person, though, has a driving need to keep up with the data on a week-by-week basis. 28% might not be a cause for concern at the halfway point, but 8% might be — and catching the problem early means fixing it in time.
Above: Our own Carlye Norton speaks to this idea in our recent conversation on implementation fidelity with Youngstown City School District and Central Rivers Area Education Agency. Watch the video or get the recap here.
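To make that week-by-week view concrete, here’s a minimal sketch in Python. It assumes survey responses sit in a CSV with hypothetical “week” and “rating” columns (one row per response); those names are placeholders for illustration, not anything from a specific tool.

```python
import pandas as pd

PROFICIENT = {"proficient", "highly proficient"}
GOAL = 0.85  # the 85% end-of-semester target from the example above

# Hypothetical export: one row per survey response
responses = pd.read_csv("pbis_ratings.csv")

# Share of respondents at "proficient" or above, week by week
weekly = (
    responses["rating"].str.lower().isin(PROFICIENT)
    .groupby(responses["week"])
    .mean()
)

for week, share in weekly.items():
    print(f"Week {week}: {share:.0%} proficient or higher (goal: {GOAL:.0%})")
```

A PBIS lead watching this output each week can tell an expected 28% midpoint from a worrying 8% one; the semester-end summary alone can’t.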
Speaking of group size, smaller and more frequent reviews also allow for a deeper level of discussion, and even disagreement, about what the data implies. Celebrate critical thinking, curiosity, and the desire to question the numbers, even if that means the conversation spans a few weeks.
Leadership plays an important role in setting the tone: by beginning the conversation with open-ended questions, you create space for practitioners to question and make their own meaning from the data.
The SMART philosophy (Specific, Measurable, Attainable, Relevant, and Time-Based) is usually applied to goal-setting, but the frequency of your data reviews is as important a goal as any for your team.
If you’re just starting to focus on data as a program leader, you probably don’t have much in the way of historical context. Here’s where you look to your logic model: which short-term goals are the most critical to your larger mission? Of those, which have associated data points that are easiest to check and monitor at frequent intervals?
When reviewing data with your team, give them the opportunity to do their own exercises with the data as well as sit-and-get takeaways. This doesn’t need to be a lengthy production: it could be as simple as asking for their conclusions after a few minutes’ access to your data dashboard, or having them brainstorm ideas on future questions they’d like to see answered by data.
As a bonus, this kind of user testing can reveal weaknesses in your strategy — perhaps the wrong metric or a missed collection opportunity — early enough to course-correct before the bad data makes its way to the full set.
Consider creating a scorecard or data dashboard that handles the raw analysis, so the team can focus on interpretation. People are visual thinkers: it’s much easier to spot trends and differences in a column chart than in a row of spreadsheet numbers. A little time spent on setup at the beginning of the school year can turn into hours saved by the end. Plus, you’ll have a clean historical data set for deeper insights in the future.
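As one possible starting point, here’s a minimal scorecard sketch, again in Python. The file name and the “metric”, “month”, and “value” columns are assumptions for illustration; the point is that the aggregation is scripted once, so every review meeting starts at interpretation rather than arithmetic.

```python
import pandas as pd

# Hypothetical export; swap in whatever your survey or coaching-log tool produces
raw = pd.read_csv("coaching_log.csv")

# One row per metric, one column per month: the raw analysis happens here, once
scorecard = raw.pivot_table(
    index="metric", columns="month", values="value", aggfunc="mean"
).round(2)

print(scorecard)
scorecard.to_html("scorecard.html")  # a simple visual artifact to share with the team
```

Rebuilding this file on a schedule also leaves you with the historical record mentioned above, at no extra effort.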
Belton’s improvement plan is a living, breathing document. After defining the high-level goals, Assistant Superintendent of School Improvement Lorenzo Rizzi worked with school leaders to break out adult and student actions into quarterly outcomes.
During monthly one-on-one meetings, Dr. Rizzi and his principals review their action plan, progress monitor student and teacher data, and address related questions and concerns. This structure keeps meetings focused on the most pressing work to be done. It also ensures everyone is swimming in the same direction. Their standing agenda:
1: Review School Improvement Plan (SIP) action steps
2: Progress monitor – Student data
3: Progress monitor – Teacher data
4: Attendance data
5: Questions, concerns, discussion
Schedule a demo with one of our friendly team members.