What is a data strategy?
A data strategy is a plan that describes what data your organisation needs to run and grow the business, how that data will be collected and maintained reliably, and how it will be used to support decisions at every level from operational management to the board.
It answers three questions that most mid-market organisations have not answered clearly:
- What data do we actually need? Not what data we could collect, but what data the business requires to make the decisions that matter.
- How do we know we can trust it? Where does the data come from, who is responsible for its accuracy, and what happens when it is wrong?
- What are we going to do with it? What decisions will this data improve, what operational processes will it change, and what outcomes will be different as a result?
A data strategy that does not answer all three questions is not a strategy - it is a data architecture plan or a reporting roadmap dressed up as something more strategic than it is.
"Our CFO had been producing the same board pack for four years. Every month there was a different number for the same metric depending on which system you pulled from. The data strategy didn't start with technology - it started with agreeing what the number should be, who owned it, and where it lived. We were six months in before we touched a dashboard."
- CEO, £50m services business
What a data strategy is not
The term is used loosely enough that it is worth being precise about what a data strategy is not:
| What people call a data strategy | What it actually is |
|---|---|
| A plan to implement a data warehouse or data lake | A technology architecture project |
| A roadmap for building dashboards and reports | A reporting improvement plan |
| A plan to hire data scientists or analysts | A capability and resourcing plan |
| A GDPR compliance review | A data governance / legal compliance project |
| An AI or machine learning initiative | A use case development project |
None of these are wrong in themselves. But treating any of them as a data strategy produces a plan that solves an operational or technical problem without answering the underlying question of what data the business needs to perform better and how it will be governed reliably.
Signs you do not have a working data strategy
For most mid-market businesses, the evidence that a data strategy is needed - or that the existing one is not working - is visible in day-to-day leadership behaviour:
- Board meetings spend significant time debating which numbers are correct rather than what to do about them
- Different departments produce different versions of the same metric using different data sources
- Senior leaders have lost confidence in the management accounts or the board pack and make decisions based on gut feel instead
- The finance team spends significant time each month reconciling data from multiple systems to produce a single view
- A system implementation was completed but MI from the new system is not trusted or is not being used
- Data requests to the technology or data team take weeks because there is no self-service capability
- The organisation has invested in BI tooling (Power BI, Tableau, etc.) but the dashboards are not used because people don't trust what they show
- Leadership is making significant commercial decisions - pricing, capacity, investment - without data-backed analysis
What a data strategy actually contains
A working data strategy for a mid-market organisation typically contains six components:
1. Decision inventory
A structured view of the decisions the organisation makes - at operational, management, and strategic level - and what data each decision requires. This sounds elementary, but most organisations have never done it. The result is that data investment accumulates around what is easy to collect rather than what the business actually needs to decide.
2. Data asset map
A clear picture of what data the organisation currently holds, where it lives, who owns it, and how reliable it is. This is not a full data catalogue - it is a high-level map of the assets that matter for the decisions identified in step one, and an honest assessment of their current quality.
3. Data quality and ownership framework
For each data asset that matters, a clear definition of what "good" looks like, who is responsible for maintaining it, and what the remediation process is when quality falls short. Without this, data quality improvement is a project with no owner that regresses the moment the project closes.
4. MI and reporting architecture
A design for how data flows from source systems to the people who need it - what the single version of each key metric is, where it is calculated, and how it reaches the board pack, the management reports, and the operational dashboards. This is where tooling decisions (data warehouse, BI platform, etc.) are made - but only after the requirements are clear.
5. Data governance model
The operating model for how data is managed on an ongoing basis - data ownership, change control for definitions and calculations, the process for resolving disputes about numbers, and how data quality is monitored. Governance is what prevents the strategy from decaying.
6. Prioritised roadmap
A sequenced plan of what to do first, second, and third - based on commercial value and practical dependency. Not a three-year technology roadmap; a 12-18 month operational plan with clear milestones and defined outcomes at each stage.
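The quality and ownership framework (component 3) lends itself to being written down as code rather than left in a document. The sketch below is illustrative only - the asset, owner, and rule are hypothetical - but it shows the shape of the idea: every check names the data asset, what "good" looks like, and who is accountable when it fails.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    asset: str          # the data asset being checked
    owner: str          # who is accountable when the check fails
    description: str    # what "good" looks like, in plain language
    check: Callable[[list[dict]], bool]  # returns True if quality is acceptable

def run_rules(rules: list[QualityRule], data_by_asset: dict) -> list[tuple]:
    """Run each rule against its asset; report failures with a named owner."""
    failures = []
    for rule in rules:
        rows = data_by_asset.get(rule.asset, [])
        if not rule.check(rows):
            failures.append((rule.asset, rule.owner, rule.description))
    return failures

# Hypothetical rule: every customer record must carry a billing account ID
rules = [
    QualityRule(
        asset="customers",
        owner="Head of Finance Operations",
        description="every customer has a billing account ID",
        check=lambda rows: all(r.get("billing_account_id") for r in rows),
    ),
]

data = {"customers": [{"id": 1, "billing_account_id": "A-100"},
                      {"id": 2, "billing_account_id": None}]}

print(run_rules(rules, data))
# → [('customers', 'Head of Finance Operations',
#     'every customer has a billing account ID')]
```

The point of the structure, not the tooling, is what matters: a failed check is not an anonymous data problem - it lands on a named owner's desk with a remediation path.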
Why most data strategies fail
The most common reasons data strategy projects do not deliver their intended outcomes:
They start with technology, not decisions
Selecting a data platform before defining what the business needs to know produces expensive infrastructure in search of a use case. The right sequencing is: define what decisions need better data, then design the data assets required, then select the technology that can support them.
They have no clear business owner
Data strategies owned by the technology function tend to produce technically sophisticated solutions that nobody uses. Data strategy needs a business owner at director level - typically the CFO, COO, or a Chief Data Officer - who is accountable for the commercial outcomes, not just the technical delivery.
They underestimate the people and process change required
Improving data quality requires people to change how they enter, manage, and use data in day-to-day operations. This is a change management challenge, not a technology challenge. Data strategies that do not include a credible plan for changing operational behaviours will produce better data infrastructure running on the same unreliable processes.
They confuse outputs with outcomes
Dashboards built, data models deployed, and analysts hired are outputs. The outcomes that matter are decisions made with greater confidence, time saved on reconciliation, and commercial performance that improved because of better information. Strategies that measure their own progress in outputs rather than outcomes lose sight of why they exist.
The difference between MI and data strategy
Management information (MI) is what leadership teams receive - reports, dashboards, and analysis - to enable them to manage the business. A data strategy is the plan that determines what MI the business should have, what data it should be built on, and how that data is governed.
The practical distinction matters because many mid-market organisations have an MI problem when what they actually need is a data strategy. The MI is unreliable not because of how the reports are built but because the underlying data is inconsistent, unowned, and held in systems that were not designed to work together. Rebuilding dashboards on unreliable data produces better-looking reports of the same bad numbers.
The diagnostic question is: if we gave the same underlying data to three different analysts and asked them to build the monthly board pack, would they produce the same numbers? If the answer is no, the problem is data, not MI.
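A minimal illustration of why the answer is usually no: two analysts applying slightly different, undocumented definitions of "revenue" to the same raw data will report different numbers. The invoice data and both definitions below are hypothetical.

```python
# Same raw invoice data, two undocumented "revenue" definitions.
invoices = [
    {"amount": 1000, "status": "paid"},
    {"amount": 500,  "status": "issued"},
    {"amount": 200,  "status": "credited"},  # credit note
]

# Analyst A: everything invoiced, net of credit notes (accrual view)
revenue_a = (sum(i["amount"] for i in invoices if i["status"] != "credited")
             - sum(i["amount"] for i in invoices if i["status"] == "credited"))

# Analyst B: paid invoices only (cash view)
revenue_b = sum(i["amount"] for i in invoices if i["status"] == "paid")

print(revenue_a, revenue_b)  # → 1300 1000 - same data, different "revenue"
```

Neither analyst is wrong; each has silently chosen a definition. The fix is not a better dashboard - it is one written definition, calculated in one place, that every report draws from.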
Where to start if your data is unreliable
For most mid-market organisations, the pragmatic starting point is not a full data strategy - it is a focused diagnostic on the two or three metrics that matter most to the leadership team and that are currently unreliable.
The questions to answer for each metric:
- What is the agreed definition of this metric? Is it written down, and do all parts of the business use the same definition?
- Where does the data that feeds this metric originate? Which system(s), which processes, which people are responsible for its accuracy?
- What are the known reliability problems? Where does data get lost, duplicated, or manually adjusted in ways that introduce error?
- Who is responsible for the accuracy of this metric? If it is wrong, whose problem is it to fix?
Answering these questions for the two or three metrics that matter most to the board will typically surface the root causes of data unreliability more quickly than a full data strategy exercise - and will provide the evidence base for prioritising what to fix first.
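One lightweight way to capture the answers is a "metric card" per metric - a single record holding the definition, sources, owner, and known problems. The structure and field names below are illustrative, not a prescribed schema; the example metric and its problems are hypothetical.

```python
# One record per metric under diagnosis; all field names are illustrative.
metric_card = {
    "metric": "Gross margin %",
    "agreed_definition": "(Revenue - direct cost of delivery) / Revenue, monthly",
    "definition_written_down": True,
    "used_consistently": False,          # e.g. operations uses a different cost base
    "source_systems": ["ERP", "timesheets"],
    "accuracy_owner": "Financial Controller",
    "known_problems": [
        "manual journal adjustments at month end",
        "timesheet data arrives up to 10 days late",
    ],
}

def is_reliable(card: dict) -> bool:
    """A metric is only as reliable as its weakest foundation: a written
    definition, consistent use, a named owner, and no open quality issues."""
    return (card["definition_written_down"]
            and card["used_consistently"]
            and card["accuracy_owner"] is not None
            and not card["known_problems"])

print(is_reliable(metric_card))  # → False
```

Three or four completed cards give the board something a 200-page strategy document rarely does: a concrete, per-metric statement of what is broken and whose job it is to fix it.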
Build, buy, or outsource?
Mid-market organisations face a real choice about whether to build internal data capability, buy tooling and use it with internal resource, or engage an external partner to provide data strategy and MI on an ongoing basis.
The honest answer depends on the nature of the organisation's data challenges and its realistic capacity to hire and retain data talent:
- Build is right if the organisation has sufficient scale to justify a dedicated data function and the commercial ambition to differentiate on data over time. For most mid-market organisations below £100m revenue, building a full internal data function is premature.
- Buy tooling with internal resource works well if the primary challenge is MI rather than data architecture - where the data is largely reliable but the reporting layer is inadequate. Modern BI tools (Power BI, Tableau, Looker) can be adopted relatively quickly by a commercially-minded finance or operations team.
- External data strategy support is appropriate when the root cause of unreliable MI is data quality, governance, or architecture that the internal team does not have the capacity or expertise to address. The engagement should have a defined exit - leaving the organisation with improved data infrastructure and the internal capability to maintain it.
How long does a data strategy take to produce?
A data strategy for a mid-market organisation can be produced in 4-6 weeks if it is scoped correctly and the right people are available. The output should be a decision-ready document that the leadership team can use to prioritise investment and assign accountability - not a 200-page technical specification.
The key inputs required are: access to the leadership team to understand the decisions that matter; access to the data owners in finance, operations, and technology; and an honest assessment of the current state of data quality and governance. With those inputs, a competent external data strategy practitioner can produce a board-ready recommendation within that 4-6 week window.
Where data strategy projects take 6-12 months, the usual reason is that the scope expanded from strategy to implementation, or that the organisation did not have the internal clarity to define what decisions the strategy needed to support. Both of these problems are avoidable with the right scoping upfront.
"We had been talking about our data strategy for two years. A consultant had produced a 180-page report that nobody had read past page 20. What we actually needed was someone to tell us which three numbers were wrong, why they were wrong, and what to do about it. We had that in six weeks."
- CFO, professional services business
MI you can trust, in weeks not months
A Business Review can identify the root cause of data and MI unreliability in 14 days - and produce a prioritised improvement plan your board can act on.