Attention CIOs: Many will fail in the data science game
If you talk to company CIOs today about their internal data science programs, you’ll likely get an enthusiastic response about the potential business gains, or a worried look from a CIO who just made a major investment and really hopes to have something to show the board this quarter. Or a combination of both.
In my regular conversations with CIOs, I see this push-pull dynamic unfolding at business after business. Domino Data Lab recently conducted a survey with Wakefield Research on executives’ opinions of data science initiatives, and 97% of executives expect data science programs to increase revenue. I also read an Accenture survey showing that 75% of executives believe their company will go out of business if it can’t successfully scale data science over the next five years.
So you have a wave of excitement about the potential gains of artificial intelligence alongside an extinction-level fear of getting data science wrong. It’s a difficult position to be in. Especially when 82% of data leaders say they’re under pressure to make dramatic short-term investments so they can show a quick win.
How can a CIO, board, and management team avoid falling into the trap of splashy investments that don’t last? You need to look at your metrics and make sure you’re measuring the right KPIs. To quote Peter Drucker, “If you can’t measure it, you can’t improve it.”
The KPIs you choose should point toward building a sustainable machine capable of producing a steady stream of highly profitable models. You want to avoid short-sighted measures that won’t hold up over time, such as promising consistent quarter-over-quarter growth. And you want to avoid the gold-rush mistake of diving into the mine before you’ve built a framework that will support your program for the long haul.
I’m going to share a set of basic principles that I’ve found the most successful data science programs have in common. These should guide your choice of metrics. Then I’ll share specific metric ideas that apply both to your data science program and to actual data science outcomes. Combined, these measures can help generate sustainable long-term gains.
Fundamentals of Data Science
For companies that are starting to develop their data science programs, here are four principles to keep in mind when thinking about how to measure the long-term impact of your program:
- Iteration speed. How fast does your team work through ideas and models? Speed matters more than big breakthroughs. You want to set your team up for long-term success. That means building a model-producing machine that will justify your initial investment over time and deliver consistent results.
- Reusable knowledge. Capturing the intelligence and experience of your team matters more than producing an immediate answer. You need to create reusable assets. This means prioritizing a searchable, shareable knowledge base that can be a catalyst for future research and development.
- Tool agility. With the pace of innovation in analytics, new tools appear all the time. Success requires agility and flexibility in how you use your tools and how quickly the team can ramp up on new software. Don’t put all your tech eggs in one basket. This applies to infrastructure, frameworks, programming languages, and tools.
- Process and culture. Building a successful analytics flywheel takes more than technology: you need a team whose members help each other in their work, and a culture of growth and learning. Senior leaders know that building the right team is their most important goal and the biggest factor in success or failure. Give your team the infrastructure they need to do great work.
If you keep these strategic goals front and center, you’ll be on the right path to long-term success. The next step is to consider how to build and assess a successful team.
Building the Foundation
When starting a new data science program or expanding an existing one, senior leaders need to look at current expenses, the amount of knowledge the team creates over time, and how quickly new members of the team add value. There are three key areas to consider here:
- Program operating costs. There will be recurring costs for essential tools and data management to consider, but don’t forget to measure your IT team’s support expenses. If you configure your infrastructure so that data scientists can take a self-service approach, the number of support tickets should decrease over time.
- Contributions to the knowledge base. You want to reward data scientists for sharing their ideas and contributing to the company’s knowledge base, as well as for producing valuable new models. To quantify and recognize collaboration, track the number of contributions per person and engagement with your knowledge management platform to measure individual success, and track the overall contribution rate over time to assess the team as a whole.
- Onboarding. Look for ways to configure your data science programs to speed up the onboarding process. You want new hires to add value quickly without slowing down the more experienced people on the team by asking them where everything is. The fastest way to get people productive is to make it easy for them to find information on their own.
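As a concrete illustration, the contribution metrics above can be computed from a simple event log. This is a minimal sketch with hypothetical data and metric definitions (contributions per person, team participation rate, average contributions per member); your knowledge management platform and team roster would be the real inputs.

```python
from collections import Counter
from datetime import date

# Hypothetical contribution log: (author, date) pairs pulled from your
# knowledge base (wiki edits, shared notebooks, reusable code commits).
contributions = [
    ("ana", date(2021, 1, 5)),
    ("ana", date(2021, 2, 9)),
    ("ben", date(2021, 1, 20)),
    ("ben", date(2021, 3, 2)),
    ("carla", date(2021, 3, 15)),
]

team_size = 4  # includes one member with no contributions yet

# Contributions per person, to recognize individual collaboration.
per_person = Counter(author for author, _ in contributions)

# Overall rates: share of the team contributing at all, and average
# contributions per team member, tracked over time.
participation = len(per_person) / team_size
avg_per_member = len(contributions) / team_size

print(dict(per_person))   # {'ana': 2, 'ben': 2, 'carla': 1}
print(participation)      # 0.75
print(avg_per_member)     # 1.25
```

Tracking these numbers quarter over quarter shows whether knowledge sharing is actually becoming a team habit or remains concentrated in a few people.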
Once a data science program is up and running, leaders need to determine how to speed up the process to achieve better results over time. Data science moves quickly, but you shouldn’t have to reinvent the wheel with every new project. Leaders need to consider how the broader team can reuse and build on previous work to kick off the next project and get results quickly.
At Domino Data Lab, we call this “model velocity.” It describes the time it takes to create a new model, deploy it to production, update it, and retrain it on a regular cadence. Model velocity measures the speed of your data science flywheel and how fast you deliver model-based products.
- Model creation. Start tracking the raw time it takes to build a new model, from initial planning to production deployment. If you build a knowledge base and create collaborative workflows, you may be able to cut the time to create a new model from 180 days to 14 days. Use the experience from each project to help build the next model.
- Deployment to production. Once you’ve developed a model, measure how long it takes for a validated model to go into production. At too many businesses, each deployment requires unique infrastructure changes and adjustments to handle incoming data. If you create a documented, repeatable process for your IT team, you can get a model into production in a single day instead of months.
- Regular updates. Once a model is in production, it needs ongoing care and feeding to maintain its viability. When a problem arises or the data changes, examine your approach and procedures to determine the root cause. Make sure you have a defined procedure for correcting or updating models, and a regular cadence of model updates to proactively address drift and discrepancies in the incoming data.
Applying the Key Principles
I’ve shared different ways to measure the success of your data science team and make sure you can show constant improvement and business impact over time. I’ve seen companies discover unique customer insights through data science. But I’ve also seen programs at top companies fail when everything is treated as a prototype and the companies don’t build a long-term program.
By building a data science machine based on iteration and continuous improvement, your teams can achieve better results and reduce the time it takes to generate new insights. The questions and principles above give you a starting point for the discussion and for determining specific actions for your business, so you can track your long-term success.