Data as a strategic asset
By the end of this module you will be able to:
- Apply the DCAM maturity framework to assess an organisation's data capability
- Identify data monetisation patterns and explain how to estimate data's financial value
- Design a data literacy programme and define metrics for measuring data strategy success

Real-world transformation · 2019 onwards
Starbucks Deep Brew: using data to predict which stores to open, three years before construction
Starbucks' Atlas system combines hundreds of data signals: population density and growth, traffic patterns, competitor proximity, neighbourhood demographic trends, income data, proximity to transit, and the historical performance of nearby Starbucks locations. Machine learning models trained on this data predict the likely revenue of a proposed new store location before a single brick is laid.
The business impact is substantial. Opening a store costs $300,000 to $600,000 in fit-out alone, and a poorly chosen location may take years to become profitable or may never recover its investment. Atlas improves selection accuracy by reducing the rate of underperforming new stores. The data asset - accumulated transaction history, customer behaviour, and location performance across 35,000 stores over 30 years - is what makes this possible.
Starbucks' experience illustrates the core argument for treating data as a strategic asset: a competitor who opens a store next to a Starbucks tomorrow will not have access to Starbucks' 30-year geospatial performance dataset. That asymmetry is not replicable with capital investment alone. The data, accumulated systematically over time, is a durable competitive advantage.
Starbucks operates 35,000 stores globally. Rather than relying on intuition or simple demographic data to choose new locations, it built Atlas, a system that ingests geospatial, demographic, competitive, and historical performance data. How does treating data as a strategic asset change capital allocation decisions?
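To make the idea concrete, here is a minimal, purely illustrative sketch of signal-based site scoring. Every feature name, weight, and revenue figure below is invented for illustration; Starbucks' actual Atlas models are far richer and are trained on decades of historical store performance rather than hand-set weights.

```python
# Illustrative only: a toy linear model for scoring candidate store sites.
# Features, weights, and revenue figures are invented; a real system would
# learn these relationships from historical store performance data.

# Hypothetical revenue contribution (USD) per unit of each normalised signal
WEIGHTS = {
    "population_density": 420_000,
    "traffic_index": 310_000,
    "competitor_proximity": -180_000,  # nearby competitors reduce revenue
    "transit_proximity": 150_000,
    "nearby_store_performance": 260_000,
}
BASELINE_REVENUE = 500_000  # hypothetical baseline annual revenue (USD)


def predict_annual_revenue(signals: dict) -> float:
    """Estimate annual revenue for a candidate site from signals scaled 0-1."""
    return BASELINE_REVENUE + sum(
        WEIGHTS[name] * value for name, value in signals.items()
    )


candidate = {
    "population_density": 0.8,
    "traffic_index": 0.6,
    "competitor_proximity": 0.4,
    "transit_proximity": 0.9,
    "nearby_store_performance": 0.7,
}
print(f"Estimated annual revenue: ${predict_annual_revenue(candidate):,.0f}")
```

The point of even this toy version is the asymmetry discussed above: the weights are worthless without the historical data needed to estimate them.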
With the learning outcomes established, the module begins with an in-depth look at the DCAM maturity framework.
26.1 The DCAM maturity framework
The Data Management Capability Assessment Model (DCAM), developed by the EDM Council, is the leading framework for assessing and developing organisational data management capability. DCAM defines eight capability domains: Data Management Programme, Data Governance, Data Architecture, Data Quality, Data Lineage, Data Operations, Data Technology, and Data Culture.
Each domain is assessed at one of five maturity levels: Level 1 (Initial): no defined processes, ad hoc activity; Level 2 (Managed): repeatable processes defined for specific areas; Level 3 (Defined): enterprise-wide standards and processes documented and trained; Level 4 (Quantitatively Managed): performance measured and tracked with KPIs; Level 5 (Optimising): continuous improvement processes in place, benchmarked against industry peers.
Most organisations assessed under DCAM land at Level 2 to 3. Reaching Level 4 requires not just process definition but measurement infrastructure: data quality dashboards, pipeline SLAs tracked in monitoring systems, and data product health metrics. Reaching Level 5 requires external benchmarking and a culture of data-driven improvement of the data management function itself.
The DCAM assessment process identifies capability gaps and prioritises investment. A financial services firm with strong data governance (Level 4) but weak data culture (Level 2) has a different remediation roadmap than a retail firm with strong operational data (Level 3) but weak data architecture (Level 1). The framework makes these trade-offs visible.
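As a rough sketch of how an assessment drives prioritisation, a DCAM result can be represented as a mapping from domain to maturity level, with a gap analysis surfacing the domains furthest below a target level. The domain names follow the module text; the scores are invented examples.

```python
# Illustrative sketch: a DCAM assessment and its capability gap analysis.
# Domain names are from the module; the level scores are invented.

TARGET_LEVEL = 3  # e.g. aim for Level 3 (Defined) across all domains

assessment = {
    "Data Management Programme": 3,
    "Data Governance": 4,
    "Data Architecture": 1,
    "Data Quality": 2,
    "Data Lineage": 2,
    "Data Operations": 3,
    "Data Technology": 3,
    "Data Culture": 2,
}


def capability_gaps(scores: dict, target: int) -> list:
    """Return (domain, gap) pairs below the target level, largest gap first."""
    gaps = [(d, target - lvl) for d, lvl in scores.items() if lvl < target]
    return sorted(gaps, key=lambda pair: pair[1], reverse=True)


for domain, gap in capability_gaps(assessment, TARGET_LEVEL):
    print(f"{domain}: {gap} level(s) below target")
```

In this invented example, Data Architecture (two levels below target) would head the remediation roadmap, echoing how the framework makes investment trade-offs visible.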
With an understanding of the DCAM maturity framework in place, the discussion turns to data monetisation patterns and valuation, which build directly on these foundations.
“Data is the new oil. Like oil, data is valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals to create a valuable entity that drives profitable activity.”
Clive Humby, the mathematician and data scientist behind the Tesco Clubcard scheme, speaking in 2006
Humby's analogy is often quoted but frequently misunderstood. The point is not that data is infinitely valuable but that raw data, like crude oil, requires significant processing (cleaning, integration, modelling, governance) before it delivers value. Organisations that invest heavily in data collection but not in processing, governance, and analysis infrastructure are sitting on crude oil without a refinery.
26.2 Data monetisation patterns and valuation
Data monetisation takes three forms. Direct monetisation sells data or data products to external parties: weather companies selling historical data to insurers, financial data providers selling market feeds to hedge funds, credit bureaus selling risk scores to lenders. This requires careful legal due diligence on consent, data sharing agreements, and regulatory permissions.
Indirect monetisation improves internal decisions and products using data without directly selling it. Amazon's recommendation engine (driven by purchase and browsing data) generates an estimated $35 billion per year in additional revenue - not from selling data, but from using it to improve the customer experience. Netflix's content commissioning decisions (House of Cards was commissioned based on viewership pattern analysis) are indirect monetisation of viewing data.
Risk reduction monetisation uses data to avoid losses: insurance fraud detection, credit default prediction, cybersecurity anomaly detection, and predictive maintenance. The value is the cost of losses avoided. Rolls-Royce's TotalCare programme (where airlines pay per engine flying hour rather than per engine) is underpinned by sensor data from every engine, allowing Rolls-Royce to predict maintenance needs and reduce unplanned downtime. The data is the basis for the entire business model.
Valuing data as an asset is methodologically challenging because data has no standard accounting treatment (it does not appear on balance sheets in most jurisdictions). Three approaches exist: cost approach (what did it cost to collect and maintain?), market approach (what would comparable data sell for?), and income approach (what revenue or cost saving does it generate?). The income approach is most useful for strategic decision-making but requires attributing revenue or savings to specific data assets, which requires instrumentation.
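The income approach reduces to a discounted cash-flow calculation over the benefits attributed to the data asset. A minimal sketch, with invented figures and an assumed discount rate:

```python
# Illustrative income-approach valuation: the present value of the annual
# benefit (revenue uplift plus cost savings) attributed to a data asset.
# The benefit figure, discount rate, and horizon are all assumptions.

def income_approach_value(annual_benefit: float, discount_rate: float, years: int) -> float:
    """Present value of a constant annual benefit over a fixed horizon."""
    return sum(
        annual_benefit / (1 + discount_rate) ** t for t in range(1, years + 1)
    )


# Hypothetical: $2M/year attributed benefit, 10% discount rate, 5-year horizon
value = income_approach_value(annual_benefit=2_000_000, discount_rate=0.10, years=5)
print(f"Estimated asset value: ${value:,.0f}")
```

The hard part in practice is not the arithmetic but the attribution: the `annual_benefit` input only exists if revenue or savings have been instrumented back to specific data assets.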
With data monetisation patterns and valuation covered, the discussion turns to data literacy and measuring data strategy success.
Common misconception
“Data strategy is primarily a technology problem: buy the right platform and data management will follow.”
Technology is necessary but insufficient. Gartner's research consistently shows that 80% of data and analytics initiatives fail to meet their business objectives, with the primary causes being cultural resistance, lack of clear ownership, inadequate data literacy, and misaligned incentives - not technology limitations. A first-class lakehouse platform populated by data that no one trusts, documented by no one, and owned by no one produces no value. The human and governance dimensions of data strategy consistently determine success more than the technology stack.
Common misconception
“If we collect enough data, insights will emerge naturally.”
Data volume without curation produces noise, not insight. The UK Government Data Quality Hub found that poor-quality data costs the UK public sector an estimated £15 billion annually. Strategic value comes from curated, well-governed datasets with clear ownership, not from raw accumulation.
26.3 Data literacy and measuring data strategy success
Data literacy - the ability to read, work with, analyse, and argue with data - is the organisational capability that unlocks the value of data investments. Without it, dashboards go unread, models produce outputs that decision-makers distrust, and data products are built for analysts rather than for the people who need to act on insights.
A data literacy programme has three tiers. For all employees: reading charts, understanding basic statistical concepts (average, variance, the difference between correlation and causation), and knowing when to ask for help. For data users (managers, analysts): querying data, interpreting model outputs, assessing data quality, and designing simple analyses. For data professionals: the full technical and governance curriculum covered in this course.
Measuring data strategy success requires a hierarchy of metrics. Input metrics measure programme activity: number of data products in the catalogue, percentage of datasets with documented ownership, number of staff completing data literacy training. Process metrics measure governance effectiveness: data quality scores by domain, percentage of pipelines with SLAs, mean time to resolve data quality incidents. Outcome metrics measure business impact: revenue attributable to data-driven decisions, cost reduction from predictive maintenance, reduction in compliance incidents.
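The three tiers can be represented directly, and a useful sanity check on any data strategy review is whether a tier, particularly outcomes, is empty. The metric names and values below are invented examples.

```python
# Illustrative sketch of the input / process / outcome metric hierarchy.
# Metric names and values are invented examples.

metrics = {
    "input": {"datasets_in_catalogue": 200, "staff_completed_literacy_training": 450},
    "process": {"datasets_above_quality_threshold_pct": 78},
    "outcome": {},  # no quantified business impact yet: a common gap
}

missing_tiers = [tier for tier, values in metrics.items() if not values]
if missing_tiers:
    print(f"Measurement gap: no metrics for tier(s): {', '.join(missing_tiers)}")
```

A review that reports only input and process metrics, as in this example, cannot answer the board's question about value generated.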
The Chief Data Officer (CDO) role, established to lead enterprise data strategy, typically reports to the CFO or CEO. Successful CDOs focus on proving value early (quick wins that build credibility), creating visible data products (dashboards used by leadership), and linking data investment to business outcomes (the income approach to data valuation). CDOs who focus primarily on governance and compliance without demonstrating business value are frequently replaced within two years.
A DCAM assessment finds that an organisation's Data Governance domain scores Level 3 (Defined: enterprise-wide standards documented and trained) but its Data Culture domain scores Level 1 (Initial: ad hoc, no defined processes). What does this gap most likely indicate, and what should the CDO prioritise?
A logistics company processes 2 million parcel tracking events per day and has 15 years of historical delivery data. An insurance company approaches them to purchase anonymised on-time delivery performance data by postcode for pricing commercial delivery insurance. Which data monetisation pattern does this represent, and what must the logistics company verify before proceeding?
A CDO presents a data strategy review to the board. They show: 200 datasets onboarded to the catalogue (input metric), 78% of datasets with quality scores above threshold (process metric), but cannot quantify the business impact. The board asks: 'How much value has this programme generated?' What is the CDO's fundamental measurement gap, and how should it be addressed?
Key takeaways
- DCAM measures data management capability across eight domains at five maturity levels. Most organisations sit at Level 2-3. Reaching Level 4 requires instrumented measurement of data quality, pipeline performance, and governance effectiveness - not just documented processes.
- Data monetisation takes three forms: direct (selling data or data products), indirect (improving products and decisions using data internally), and risk reduction (using data to avoid losses). The income approach - quantifying business value attributable to specific data assets - is the most useful valuation method for strategic decision-making.
- Data strategy failures are primarily cultural and governance failures, not technology failures. 80% of initiatives fail due to cultural resistance, lack of ownership, and inadequate data literacy - not platform limitations.
- A complete data strategy measurement framework includes input metrics (programme activity), process metrics (governance effectiveness), and outcome metrics (business impact). Without outcome metrics, data investment cannot be justified to executives or boards.
Standards and sources cited in this module
EDM Council, Data Management Capability Assessment Model (DCAM)
The authoritative source for the DCAM maturity framework and its eight capability domains.
Data monetisation patterns and competitive advantage from data assets with sector-specific examples.
Qlik, 'Data Literacy Index' (2022)
Research on organisational data literacy gaps and the correlation between data literacy and business performance.
Thomas Redman, 'Data Quality: The Field Guide' (2001)
Foundation text for data quality measurement and the cost of poor data quality. Source for the principle that data quality is a management problem, not a technology problem.
Starbucks Technology Blog: Atlas - Geospatial Analytics for Site Selection
Context for the Starbucks Atlas system and the application of geospatial ML to capital allocation decisions.
Module 26 of 26 · Practice & Strategy · Course complete
