How to include data in your everyday decision-making

“Data-driven” is one of the most overused phrases in education, and while many administrators know they want to promote a data-informed culture, it’s hard to know how to put this into practice beyond scheduled data meetings. This is especially true for school district staff trying to use data to personalize their professional development in ways that are both meaningful and sustainable.

Below, we’ve compiled a number of tips and tricks for incorporating professional development data into planning, decision-making, and implementation.

1. Capture and review data formatively, not just at the end of the year.

Waiting until the end of the year to measure the effectiveness of your professional learning strategies is like basing a class around a single final exam grade: it may tell you whether your students mastered the objectives, but it doesn’t give you the information you need to support them along the way. Improvement takes time, but building in opportunities to collect and review data during the year lets you address areas of weakness as soon as they start to appear.

Put it into practice: Capturing formative data doesn’t have to mean over-surveying. Informal measures, like coaches dropping in for check-ins with their teacher cohorts or quick polls after a PL event, can be powerful too.

Identify a clear cadence (such as quarterly or monthly) for your data review meetings so that your team can identify areas of weakness, then plan opportunities for focused support.

2. Zoom both in and out.

Looking at high-level data to find trends across schools, grades, and subjects can point you toward areas of need or factors affecting your teachers’ success. Don’t forget to sweat the small stuff, though: consider which schools are seeing success and which need extra attention, then tailor interventions so that they’re personalized and targeted.

Put it into practice: Filtering data is powerful, and KickUp’s filters make it easy, letting you sort data visualizations by attributes like grade, subject, years of experience, or school to uncover trends.

Users like instructional coaches or principals can also filter their data to show only their own teachers, then break it down by teacher interests, strengths, and learning styles. Practice using a single data visualization (like the heatmap) and sorting it by different attributes, writing down any trends you see as you go. What takeaways can you draw? Are there groups that could serve as teacher leads, or others that may need subject-specific supports?
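If your survey data also lives outside KickUp, say as a spreadsheet export, the same zoom-out/zoom-in idea can be sketched in a few lines of analysis. The file name, column names, and school name below are hypothetical, and this is only an illustration of filtering and grouping, not a KickUp feature:

```python
import pandas as pd

# Hypothetical export of a PL needs-assessment survey; column names are assumptions.
responses = pd.read_csv("needs_assessment.csv")
# Expected columns: school, grade, subject, years_experience, self_rating

# Zoom out: district-wide trend by subject.
by_subject = responses.groupby("subject")["self_rating"].mean().sort_values()
print(by_subject)

# Zoom in: one school, broken down by grade, to target support.
one_school = responses[responses["school"] == "Lincoln Elementary"]
print(one_school.groupby("grade")["self_rating"].agg(["mean", "count"]))
```

The point isn’t the tooling; it’s the habit of looking at the same data at two levels of detail before deciding where to intervene.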

3. Build access so that data is shared up, down, and laterally.

Being transparent with the data you use to make decisions builds investment, so create opportunities to share with different stakeholders. It may already feel natural to use data to defend your resource allocations to a funder or school board, but it’s just as powerful to share data down the pipeline to those affected by the decisions. Allowing people to see the data and form their own insights is more impactful than simply providing high-level takeaways.

Put it into practice: When presenting professional learning plans, highlight the feedback/data you used to make your decisions. Be specific by including relevant percentages or visualizations. This not only encourages you to justify your own decisions but also makes PL feel purposeful and reflective to your audience.

4. Insist on using data to guide discussion and make decisions.

If you’ve invested in gathering good data, you should also invest in using that data. It requires discipline, but insist that all major decisions be mapped back to some type of data.

Put it into practice: If you manage a team of principals or instructional coaches, push team members to justify their PL decisions by backing planned interventions or supports with at least one piece of data. Questions like “Who are you spending time with this week? Why?” or “What is our data telling you?” build decision-making muscle.
Remember, “data” can come in a wide range of formats. Qualitative data can provide insight into specific areas of need, so don’t limit yourself to test scores. By tagging feedback in KickUp, you can also quantify it to highlight trends.
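To see how tagged qualitative feedback becomes a quantitative trend, here is a minimal sketch of the general idea. The responses and tags are invented for illustration and don’t reflect how KickUp works internally:

```python
from collections import Counter

# Hypothetical open-ended responses that a reviewer has already tagged.
tagged_feedback = [
    {"response": "I need more planning time for small groups", "tags": ["planning time", "small groups"]},
    {"response": "The pacing guide moves too fast", "tags": ["pacing"]},
    {"response": "More modeling of small-group rotations, please", "tags": ["small groups", "modeling"]},
]

# Count how often each tag appears to surface the most common areas of need.
tag_counts = Counter(tag for item in tagged_feedback for tag in item["tags"])
for tag, count in tag_counts.most_common():
    print(f"{tag}: {count}")
```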

5. Hold yourself accountable, but allow for context.

Real life doesn’t always translate into a perfectly sloped line graph. Progress isn’t always straightforward. For instance, many of our KickUp clients see a small drop in self-assessments around new initiatives between the initial needs assessment and a mid-year check-in. Often, this is because teachers assess themselves more honestly once they’re more familiar with these new instructional practices. They may also be feeling less optimistic in the middle of the year than they might have felt at a back-to-school PD. These reasons are valid and deserve attention.

Put it into practice: Qualitative data can help ensure you have the context necessary to understand progress. Use follow-up questions to determine the source of any stalled progress and identify roadblocks that require support. Even if your planned data collection doesn’t explain why you aren’t seeing growth, informal conversations with teachers who are clearly demonstrating progress, as well as those who are not, can offer invaluable insight into planning next steps.

Need more support? Your client success manager can help you build on these best practices, establish a calendar of data reviews, and brainstorm ways to share your data with more members of your team.

Not a client yet? Schedule a demo to see KickUp in action. KickUp’s client success managers work with you to build custom data collection instruments that reflect your strategic plan and track progress formatively. The KickUp dashboard makes finding insights and trends easy, and the client success team brings expertise to data review meetings so you can be sure data translates into action.
