Running a Data Analytics Program: 9 Red Flags to Watch For

Posted by Erin Surprise on Feb 19, 2019 7:12:12 AM


Launching a data analytics program is only half the battle. Watch out for these red flags to ensure yours delivers as anticipated.


When leveraged to its greatest potential, data analytics has the power to increase productivity, reduce costs, and improve results for organizations in every industry. In this day and age, CEOs and superintendents alike are actively investing in analytics resources with the belief that they can help transform their organization.


Unfortunately, many organizations fail to effectively scale their efforts to become data-driven and, as a result, they don’t see a notable ROI. A recent McKinsey report explores this challenge, identifying common failure patterns across organizations of all types, industries, and sizes, and pinpointing nine red flags that indicate failure may be on the horizon.


While these insights weren’t gleaned from school districts, their application within an education context has the potential to inform the way that district leaders build and support data infrastructure, as many of the pitfalls that prevent businesses from becoming data-driven are equally challenging for educators. Here, we’ll explore what educators can learn from others’ failure to properly leverage data analytics, and uncover what district leaders can do to keep these red flags from impeding their district’s data analytics program before it even gets off the ground.


Red flag #1: Leadership doesn’t have a clear vision for the new data program.


According to McKinsey, when executives lack a clear understanding of how data analytics can help their companies, the programs often fall flat. Companies may invest significant resources into pilot data programs, but if management doesn’t have a grasp on the operational context in which data analytics might be applied, or, worse, hasn’t defined specific performance indicators that are both realistic and productive, any application will likely be half-baked.


This happens in schools, too. When district and school-level leadership don’t have a clear vision of how a data-driven culture can tangibly improve student outcomes, data initiatives are less likely to be effective. This can be addressed by having district leadership undergo training and coaching to become data literate prior to the implementation of a new program. This coaching may take the form of workshops, recurring weekly sessions, or online resources, but all training should be a team effort — collaborative learning will ensure that the entire leadership team is on the same page.


Additionally, providing tools that help individuals identify goals, develop action plans, track data, and examine results can help make the often overwhelming amount of information tangible and actionable. By walking through job-based scenarios in this manner, educators can grasp the value and retain the information more readily.


Luckily, when a data-driven culture starts at the top, it’s likely to trickle down; these training and development opportunities can serve as a template for bringing the remaining district staff up to speed after the rollout. Trained administrators can work with their teams, ensuring that the vision of a data-literate staff is clear to all stakeholders.


Red flag #2: Initial use cases weren’t chosen thoughtfully and strategically.


Once a new tool or solution has been selected to assist in the rollout of an organization’s data analytics strategy, it may be tempting to implement it across departments, teams, and business divisions. But in order for data analytics initiatives to be both effective and measurable, leadership must think through specific, feasible use cases that have the potential to create value. By starting small, leadership can more easily understand what worked and what didn’t, and will be better equipped to solicit buy-in for the strategy’s continued rollout.


Likewise, it’s not practical to roll out a new data program across an entire district at the same time. Not only is it costly, but this kind of blanket approach can slow down the process and hinder how quickly you start to see results. Districts tend to have far greater success with a gradual rollout that launches in relation to a clearly defined initiative, as this makes it easier to identify successes, challenges, and best practices for future initiatives. However, having a longer-term strategy beyond one use case is also key (see red flag #3).


Red flag #3: The data strategy only extends to a few use cases.


While the initial rollout should only focus on a few use cases, the overarching vision for the new data program shouldn’t end there. Building a truly data-driven culture means, over time, creating a new digital ecosystem and fundamentally changing the way everyone in the organization approaches problem-solving.


District leadership should plan its data program implementation strategically and in advance, just as it would with any major initiative. Just because data can improve some situations more clearly than others doesn’t mean that it should be applied, like a Band-aid, only where it’s most desperately needed.


Having a more universal and long-term strategy helps with many of the other red flags mentioned here as well. If leadership and staff know that what they learn for one initiative could have a broader impact on their role, then training and development become more important. Additionally, processes and resources can efficiently be put in place to support a more systemic initiative.


Red flag #4: Analytics roles are poorly defined.


It takes a village to successfully launch and sustain a district-wide data analytics initiative — and before implementing a new program, districts should have resources allocated to hire and train analytics talent.


Part of this process involves defining dedicated roles in detail while leaving room for evolution and growth as new needs become apparent, as well as considering whether the roles are best filled by external hires, through partnerships, or by data-savvy teachers who can take on some of the duties alongside their existing roles. Failing to plan ahead for these roles can result in wasted funds and a stalled data initiative. If you are rolling out the initiative over time but have a long-term strategy, you will be positioned to gradually make these decisions as you gain a better understanding of the role requirements.


Red flag #5: The district lacks data-savvy liaisons.


According to McKinsey, one of the most important roles organizations must assign is that of “analytics translator,” or someone on the business side of an organization who can help identify data use cases and then “translate” those business needs to data scientists, data engineers, and other tech experts so they can create a solution.


In districts, this is often the initial project champion: someone who understands how data can be leveraged to impact student outcomes. But it shouldn’t end there. Through training and incentives, district administrators, school administrators, and teachers can be developed to fill this role, acting as liaisons between educators, tech teams, and strategic leadership to ensure that all parties are heard and understood. The more data-savvy educators and staff a district can develop, the better data can be used every day and in every role to drive improved student outcomes.


Red flag #6: The tech team and curriculum team can’t bridge the gap.


With full plates and differing priorities, district leadership and the information technology team can easily fall out of sync. But often, schools’ success depends on districts’ ability to bridge the gap between these two vital groups — districts that can break down silos separating these departments will be better equipped to leverage educational data for positive change.

 

The districts that are most successful in bridging this gap often facilitate collaboration by including tech team members in strategy planning sessions and administrators in technology meetings. If everyone can understand each other’s point of view, the district’s data analytics initiatives will be a much greater success.


Read our longer blog post on this specific issue, Connecting District Leadership and the Tech Team: How to Overcome Challenges.


Red flag #7: Too much time and money is wasted on data cleansing.


According to the McKinsey report, some data initiatives can fail because too much emphasis is put on scrubbing all existing data clean. While we agree that bottlenecking a project before it really begins is a red flag to watch out for, we know that having clean data is critical, and we have seen this issue hold up school districts’ and education agencies’ projects. It goes back to the cliché: junk in, junk out.


For this reason, we recommend putting a data cleansing action plan in place that accounts for the time and processes necessary to address data cleanliness. It should not be an afterthought once a project is already underway.


Here at Hoonuit, we have a team of seasoned data scientists, engineers, and solution architects whose job it is to help us understand exactly what data quality checks to run to clean up data at the beginning of a new implementation, as well as what processes and business rules to put in place to monitor data cleanliness going forward.
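
To make this concrete, below is a minimal sketch of what a few such checks might look like in Python with pandas, run against a hypothetical student enrollment extract. The file name, column names, and the specific checks are illustrative assumptions for this post, not Hoonuit’s actual quality rules.

```python
# A minimal, illustrative data quality check for a hypothetical student
# enrollment extract. File name, column names, and checks are
# assumptions for demonstration only, not Hoonuit's actual rules.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Count a few common data quality problems before analysis begins."""
    return {
        # A student ID appearing more than once often signals a bad merge.
        "duplicate_student_ids": int(df["student_id"].duplicated().sum()),
        # Rows with no school code can't roll up to building-level reports.
        "missing_school_code": int(df["school_code"].isna().sum()),
        # Attendance rate is expected to be a proportion between 0 and 1.
        "attendance_out_of_range": int((~df["attendance_rate"].between(0, 1)).sum()),
    }

if __name__ == "__main__":
    enrollment = pd.read_csv("enrollment_extract.csv")  # hypothetical nightly extract
    for check, count in run_quality_checks(enrollment).items():
        print(f"{check}: {count}")
```

In practice, checks like these could run automatically each time new data lands, with any non-zero counts flagged for review before the data feeds dashboards or reports.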


Red flag #8: A district isn’t using a tool designed especially for educators.


While there are countless data analytics solutions on the market, a majority of them aren’t designed with the specific needs of the education community in mind. And because they aren’t built for educators, they often don’t integrate with the most popular educational tools or provide out-of-the-box analysis that meets educators’ needs, making it harder for school districts to offer the streamlined user experience that supports an analytics program’s continued use.


Fortunately, tools like Hoonuit are built with educators in mind. That’s why we’ve built partnerships with many leading educational technology vendors, and why we make ease of use a top priority. Our intuitive dashboards offer unparalleled data visualization functionality, making it simple to turn information into actionable insights to improve student outcomes.


Red flag #9: It’s hard to measure ROI for data analytics.


While there’s no question that data analytics is valuable, many organizations struggle to pinpoint precisely how and where it’s helping. That’s because the ROI of data analytics often isn’t a one-to-one process: it completely transforms the way a school district operates, while also providing greater value and more useful insights.


What’s more, in educational settings, ROI isn’t simply a money-in, money-out equation; not all student outcomes can be measured purely quantitatively. Educational data can support the work of teachers and administrators as they improve efficiency, meet reporting requirements, and, most importantly, improve student performance. To that end, district administrators should look to get more value out of the data their schools produce, and leverage that data as a vehicle for positive change in their communities.


At Hoonuit, we understand that every new school year brings new challenges, opportunities, and goals. We can help you use data analytics to support it all.

 

Learn more about our end-to-end data management and analytics solution here.
