The Digital Future: Seven Transformation Programmes
By the end of this module you will be able to:
- Describe the seven transformation programmes (DSI, FMAR, SDR, CCS, MHHS, VWAN, and the LTDS CIM transition) with their timelines
- Explain how AI and digital twins are transforming grid planning
- Assess the 2030 vision for GB energy data infrastructure

Think about it
Seven programmes, one deadline: can GB digitalise its energy system by 2030?
GB's energy data landscape is being reshaped by seven major transformation programmes running in parallel. Each addresses a specific capability gap: data discovery, flexibility visibility, consumer data access, consent management, half-hourly metering, communications infrastructure, and standardised network models (the LTDS CIM transition, covered in Module 14). Together, they represent the most ambitious digitalisation effort in the history of GB energy markets.
The challenge is not any individual programme. It is coordination. These programmes are governed by different organisations, funded through different mechanisms, and operating on different timelines. This module maps each programme, examines the emerging role of AI and digital twins, and assesses whether the 2030 vision is achievable.
Each transformation programme addresses a different piece of the puzzle, but they must all interoperate. What happens when MHHS generates 500 billion readings per year but the DSI is still in MVP? When FMAR registers 12 GW of flexibility but the consent framework is not yet ready?
With the learning outcomes established, the module begins by mapping the seven programmes in depth.
12.1 The seven programmes mapped
DSI: Data Sharing Infrastructure
The Data Sharing Infrastructure is led by NESO and represents the most fundamental change to how energy data is discovered and accessed. The DSI is not a data warehouse — it does not store energy data. Instead, it functions as a discovery and trust layer: a catalogue of what data exists, who holds it, under what terms it can be accessed, and what consent is required. Think of it as a search engine for energy data, combined with an identity and consent management system.
The programme timeline runs from an MVP in 2028 to full operation by 2030. The MVP will deliver core discovery capabilities: a searchable catalogue of energy datasets, standardised metadata, and basic consent management. Full operation adds advanced features including automated data access agreements, real-time consent verification, and integration with all major energy data platforms.
The DSI matters because it addresses the fundamental problem identified in Module 11: GB's energy data is fragmented across multiple platforms with no unified way to discover what exists or request access. Without the DSI, every new data-driven innovation requires bespoke bilateral agreements with each data holder. With the DSI, innovators can discover available data, understand access terms, and manage consent through a single interface.
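A discovery-and-trust layer of this kind can be sketched as a searchable metadata catalogue that never holds the underlying data. The sketch below is illustrative only: the field names, entries, and search behaviour are assumptions for teaching purposes, not the DSI's published schema.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    # Field names are assumptions, not the DSI's actual metadata standard
    dataset_id: str
    title: str
    holder: str          # organisation that actually stores the data
    access_terms: str    # e.g. "open", "licensed", "consent-required"
    tags: list = field(default_factory=list)

def discover(catalogue, keyword):
    """Return catalogue entries matching the keyword. A trust layer would
    also verify identity and consent before brokering access; the data
    itself never passes through the catalogue."""
    kw = keyword.lower()
    return [e for e in catalogue
            if kw in e.title.lower() or any(kw in t.lower() for t in e.tags)]

catalogue = [
    CatalogueEntry("ds-001", "Half-hourly demand by grid supply point",
                   "NESO", "licensed", ["demand", "settlement"]),
    CatalogueEntry("ds-002", "LV feeder loading", "Example DNO",
                   "consent-required", ["network", "monitoring"]),
]
print([e.dataset_id for e in discover(catalogue, "demand")])  # ['ds-001']
```

The design point is what the function does not do: it returns metadata and access terms, never the readings themselves, which is what distinguishes a trust layer from a data warehouse.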
FMAR: Flexibility Market Asset Register
FMAR launches in 2027 with an initial target of registering 2.5 GW of flexibility assets, scaling to 12 GW by 2030. The register will provide NESO and DNOs with visibility of distributed flexibility resources — batteries, demand response aggregations, vehicle-to-grid installations, and other assets that can adjust their consumption or injection in response to price signals or dispatch instructions.
Currently, NESO has reasonable visibility of large-scale generation and transmission-connected assets, but limited awareness of the rapidly growing fleet of distributed resources connected to the distribution network. A 50 MW battery farm connected at 132 kV is well understood. Ten thousand domestic batteries aggregated by a virtual power plant operator are largely invisible to system operation. FMAR addresses this gap by creating a standardised register of flexibility assets, their technical capabilities, and their availability for dispatch.
SDR: Smart Data Repository
The SDR is expected to become operational by the end of 2026 and is modelled explicitly on the Open Banking framework. It provides the technical infrastructure for the consumer data access rights established by the Data (Use and Access) Act 2025. The SDR defines standardised APIs through which energy companies must share customer data with authorised third parties, subject to explicit consumer consent.
The Open Banking parallel is instructive. When Open Banking launched in the UK in 2018, it faced scepticism about whether consumers would actually use it. Eight years later, millions of consumers use Open Banking-powered services for budgeting, switching, and credit assessment. The SDR aims to replicate this trajectory for energy data, enabling a new generation of consumer-facing energy services built on standardised data access rather than proprietary integrations.
CCS: Centralised Consent Service
RECCo is developing the Centralised Consent Service as the consent management framework that underpins the SDR and other data sharing initiatives. The CCS provides a standardised mechanism for consumers to grant, revoke, and manage consent for their energy data to be shared with third parties. It addresses the current problem where consent is managed inconsistently across different platforms, making it difficult for consumers to understand who has access to their data and how to withdraw that access.
The consent framework is not a trivial problem. Energy data consent must handle granular permissions (which data types, for what purpose, for how long), delegated consent (a landlord consenting on behalf of tenants), and revocation that propagates across all downstream systems. The CCS is designed to handle all of these scenarios through a centralised service that market participants integrate with, rather than implementing their own consent mechanisms.
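The revocation-propagation requirement is the part that forces centralisation. A minimal sketch, with invented class and method names since the CCS interface is not specified at this level of detail:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Consent:
    consumer: str
    recipient: str        # third party being granted access
    data_types: tuple     # granular: which data types are covered
    purpose: str          # granular: for what purpose
    expires: date         # granular: for how long

class ConsentService:
    """Toy centralised consent service; names are assumptions."""
    def __init__(self):
        self._active = {}         # (consumer, recipient) -> Consent
        self.subscribers = []     # downstream systems notified on revocation

    def grant(self, consent):
        self._active[(consent.consumer, consent.recipient)] = consent

    def revoke(self, consumer, recipient):
        # Revocation must propagate to every downstream system,
        # not just be recorded locally
        self._active.pop((consumer, recipient), None)
        for notify in self.subscribers:
            notify(consumer, recipient)

    def is_permitted(self, consumer, recipient, data_type, on_date):
        c = self._active.get((consumer, recipient))
        return bool(c and data_type in c.data_types and on_date <= c.expires)
```

Delegated consent (a landlord acting for tenants) would add an authority check at grant time. The key design choice is that market participants integrate with one service rather than each maintaining their own consent store, so a single revoke call reaches every downstream holder.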
MHHS: Market-wide Half-Hourly Settlement
MHHS is the most operationally complex of the seven programmes. The cutover is scheduled for July 2027, at which point all electricity meter points in GB will be settled on a half-hourly basis. This is a step change from the current system where most domestic meters are settled using static profiles that estimate consumption patterns rather than using actual half-hourly readings.
The data scale is staggering: MHHS will generate approximately 500 billion meter readings per year. Every electricity meter point contributes 48 readings per day, 365 days per year, or roughly 17,500 readings annually. Across GB's roughly 30 million electricity meter points (the widely quoted 57 million figure includes gas meters, which are not settled half-hourly), that amounts to around 500 billion individual data points a year. The infrastructure required to collect, validate, aggregate, and settle this volume is fundamentally different from the current batch-processing approach, which handles a fraction of it.
MHHS matters for settlement accuracy and for creating the price signals that drive demand-side flexibility. Under profile settlement, a domestic consumer who shifts their washing machine and EV charging to off-peak hours receives no financial benefit because the profile assumes average behaviour. Half-hourly settlement rewards actual demand shifting, creating the economic signals needed for a flexible, decarbonised system. The transition from estimated to actual settlement is one of the most consequential data infrastructure changes in GB energy history.
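The incentive difference can be shown with a toy calculation. The prices and consumption figures below are invented for illustration; real settlement uses 48 periods per day and regression-derived profile classes, not a two-period split.

```python
# Assumed illustrative prices, not real tariffs or settlement prices
peak_price, offpeak_price = 0.35, 0.12   # £/kWh

# Two consumers with identical total daily consumption (kWh), timed differently
flat_day    = {"peak": 5.0, "offpeak": 5.0}   # no load shifting
shifted_day = {"peak": 1.0, "offpeak": 9.0}   # EV and washing moved off-peak

def half_hourly_cost(day):
    # Actual consumption in each period is billed at that period's price
    return day["peak"] * peak_price + day["offpeak"] * offpeak_price

def profiled_cost(day):
    # Profile settlement assumes an average shape regardless of behaviour,
    # so both consumers pay the same for the same total kWh
    avg_price = (peak_price + offpeak_price) / 2
    return sum(day.values()) * avg_price

print(profiled_cost(flat_day) == profiled_cost(shifted_day))       # True: shifting unrewarded
print(half_hourly_cost(shifted_day) < half_hourly_cost(flat_day))  # True: shifting rewarded
```

Under the profiled calculation the demand shifter saves nothing; under half-hourly settlement the same behaviour cuts the bill, which is precisely the price signal the paragraph above describes.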
VWAN: Virtual Wide Area Network
VWAN is the communications infrastructure programme, transitioning from the DCC's dedicated smart metering network to broadband-based communication by 2026. This represents a shift from a purpose-built, dedicated network to one that leverages existing public broadband infrastructure. The advantages include higher bandwidth, lower marginal costs per meter point, and the ability to support richer data flows including near-real-time readings and over-the-air firmware updates.
The transition is not without risk. The DCC's dedicated network was designed specifically for smart metering traffic with guaranteed quality of service. Moving to shared broadband infrastructure introduces dependencies on commercial broadband providers and the public internet. However, the cost savings and bandwidth improvements are considered to outweigh the reliability trade-offs, particularly as broadband coverage in GB continues to improve under the Project Gigabit programme.
What is the primary function of the DSI (Data Sharing Infrastructure)?
“MHHS will generate approximately 500 billion meter readings per year. The data infrastructure required for this volume is fundamentally different from batch processing.”
Elexon, MHHS Programme Plan (2024)
This scale — half a trillion readings per year — transforms settlement from a batch process into a continuous data management challenge. The shift from estimated profiles to actual readings requires infrastructure capable of near-real-time validation and aggregation at a scale unprecedented in GB energy markets.
The seven transformation programmes create the data infrastructure. Section 12.2 examines the AI and digital twin applications that will run on top of that infrastructure once it is in place.
12.2 AI and digital twins in grid planning
Beyond the seven formal programmes, two technology trends are transforming how the energy system uses data: artificial intelligence and digital twins. Neither is a formal programme with a governance board and a delivery timeline. Both are being adopted organically by system operators, network companies, and market participants as the data foundations improve.
Demand forecasting and carbon intensity
AI-powered demand forecasting is already operational in GB. NESO uses machine learning models to forecast national demand, wind output, and solar generation. The National Grid ESO Carbon Intensity API provides 30-minute granularity carbon intensity data across 14 GB regions, with 96-hour forward forecasts. These forecasts use neural networks trained on historical generation mix, weather data, and interconnector flows to predict the carbon intensity of electricity at each half-hour interval.
The practical impact is already significant. A data centre operator using the Carbon Intensity API can shift computational workloads to periods of low carbon intensity, reducing both emissions and costs. An EV charging network can schedule overnight charging to coincide with high wind output. A smart heating system can pre-heat a building when carbon intensity is low and coast through high-carbon periods. These applications already exist and are growing rapidly as the API becomes more widely adopted across industries.
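Workload shifting against a carbon forecast reduces, at its core, to finding the lowest-intensity contiguous window. The intensity values below are invented; a real application would fetch the regional half-hourly forecast from the Carbon Intensity API rather than hard-coding it.

```python
def greenest_window(forecast, duration_periods):
    """Return the start index of the contiguous run of half-hour periods
    with the lowest total forecast carbon intensity (gCO2/kWh)."""
    best_start, best_total = 0, float("inf")
    for start in range(len(forecast) - duration_periods + 1):
        total = sum(forecast[start:start + duration_periods])
        if total < best_total:
            best_start, best_total = start, total
    return best_start

# A 2-hour workload (4 half-hour periods) against a 12-period forecast
intensity = [210, 180, 150, 90, 60, 55, 70, 120, 190, 230, 250, 240]
print(greenest_window(intensity, 4))  # 3: the slot covering the 90-60-55-70 dip
```

The same one-pass minimisation serves the data centre, EV charging, and smart heating examples above; only the workload duration and the forecast region change.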
Predictive maintenance
Network companies are deploying machine learning for predictive maintenance of distribution assets. By analysing patterns in transformer loading, cable temperature, partial discharge measurements, and historical fault data, these models can predict equipment failures before they occur. The value is not just in avoided outages and improved customer service but in optimised capital expenditure: replacing assets based on condition rather than age reduces the total cost of maintaining the distribution network.
The data dependency is crucial. Predictive maintenance models are only as good as the sensor data and operational history they are trained on. As CIM standardisation improves the quality and consistency of network model data across all 14 DNOs, and as smart meter data provides higher-resolution demand information at each connection point, these models become progressively more accurate. The virtuous cycle between better data and better AI is already visible in early deployments.
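That data dependency can be seen even in a toy version of the idea. The z-score rule below is a deliberately simple stand-in for the ML models described above, using invented per-unit loading values:

```python
from statistics import mean, stdev

def flag_anomalies(loadings, z_threshold=3.0):
    """Flag readings that deviate strongly from the historical pattern.
    Real deployments combine many sensor streams and fault history;
    this sketch uses a single series and a z-score cut-off."""
    mu, sigma = mean(loadings), stdev(loadings)
    return [i for i, x in enumerate(loadings)
            if sigma > 0 and abs(x - mu) / sigma > z_threshold]

history = [0.62, 0.64, 0.61, 0.63, 0.65, 0.62, 0.97, 0.63]  # per-unit loading
print(flag_anomalies(history, z_threshold=2.0))  # [6]
```

The point of the sketch is the dependency itself: if the history is noisy or incomplete, the baseline statistics are wrong and the flags are meaningless, which is exactly why CIM standardisation and smart meter coverage make these models progressively more useful.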
Digital twins
A digital twin of the electricity network is a virtual model that mirrors the physical network in near-real-time. It ingests live data from SCADA systems, smart meters, weather stations, and market systems to create a dynamic representation of the network state. Planners can then simulate scenarios — connecting a new wind farm, closing a substation for maintenance, managing a cluster of heat pumps in a residential area — without affecting the real network.
NESO and several DNOs are developing digital twin capabilities. The LTDS CIM models (described in Module 14) provide the structural foundation: a standardised representation of every network element, its connectivity, and its electrical parameters. The digital twin adds live operational data on top of this structural model, creating a dynamic representation that updates continuously as conditions change.
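Conceptually, the twin is exactly that layering: a static structural model overlaid with live measurements. The sketch below is minimal and hypothetical; the element names, ratings, and message shapes are invented, and a real twin would ingest thousands of SCADA and smart meter streams with power-flow calculation on top.

```python
# Structural model, as would be derived from CIM network data (invented values)
structural = {
    "TX_001":  {"type": "transformer", "rating_mva": 90,  "voltage_kv": 132},
    "LINE_17": {"type": "line",        "rating_mva": 120, "voltage_kv": 132},
}

live_state = {}   # element id -> latest measurement (the dynamic overlay)

def ingest(measurement):
    """Overlay a live reading on the structural model."""
    live_state[measurement["element"]] = measurement

def loading_percent(element_id):
    """Current loading relative to the structurally declared rating."""
    rating = structural[element_id]["rating_mva"]
    return 100 * live_state[element_id]["mva"] / rating

ingest({"element": "TX_001", "mva": 72, "source": "SCADA"})
print(loading_percent("TX_001"))  # 80.0
```

Scenario simulation then amounts to running the same calculations against a copied state with a proposed change applied, leaving the real network untouched.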
The 2030 vision is a comprehensive digital twin of the entire GB electricity system from 400 kV transmission to low-voltage distribution. This would enable integrated planning across voltage levels, real-time congestion management, and automated response to system events. Whether this vision is achievable by 2030 depends entirely on the quality and availability of the underlying data — which is precisely what the seven transformation programmes are designed to deliver.
AI and digital twins are already emerging where data quality is sufficient. Section 12.3 assesses whether the 2030 vision — all seven programmes operational, all dependencies resolved — is achievable on the current timelines.
12.3 The 2030 vision assessed
The converging timelines of all seven programmes create a vision of what GB energy data infrastructure could look like by 2030. In this vision, all meter points are settled half-hourly (MHHS, operational from July 2027). All flexibility assets above a minimum threshold are registered and visible to system operators (FMAR, scaled to 12 GW). Consumers can access and share their energy data through standardised APIs (SDR, modelled on Open Banking). Consent is managed centrally and transparently (CCS). Data discovery and access are unified through a trust layer (DSI, fully operational). Communications infrastructure supports near-real-time data flows (VWAN, broadband-based).
The critical question is interdependency. The SDR needs the CCS for consent management. FMAR needs the DSI for asset discovery across system boundaries. MHHS needs the communications infrastructure that VWAN is modernising. Digital twins need the CIM-standardised network models that the LTDS programme is delivering. AI applications need consistent, high-quality data from all of these sources. If any single programme fails or is significantly delayed, the others are affected through these dependency chains.
The Clean Power 2030 target — decarbonising GB's electricity system by 2030 — depends on data infrastructure that does not yet fully exist. It requires knowing where flexibility assets are (FMAR), settling them accurately (MHHS), managing consent for data sharing (CCS), enabling consumer participation through data access (SDR), and coordinating across a complex multi-actor system (DSI). The data infrastructure is not a nice-to-have for decarbonisation. It is a prerequisite.
The risk is not that any individual programme fails outright. The risk is that programmes deliver on slightly different timescales, with slightly different interface specifications, and that the integration gaps between them persist longer than planned. In the meantime, the market participants who need to use these systems will continue to build workarounds, creating technical debt that becomes progressively harder to unwind. The programme coordination function — ensuring that DSI interfaces match CCS APIs, that FMAR data flows into digital twins, that MHHS data is discoverable through the DSI — is where the real risk lies.
“Digitalisation of the energy system is not a goal in itself — it is the infrastructure that makes decarbonisation possible at the speed and scale required.”
Energy Data Taskforce, A Strategy for a Modern Digitalised Energy System (2019)
This framing from the Energy Data Taskforce establishes data infrastructure as a prerequisite for clean power, not an optional enhancement. The seven transformation programmes — MHHS, DSI, FMAR, CCS, SDR, VWAN, and the LTDS CIM transition — are not digitalisation for its own sake; they are the operational foundation that enables a flexible, decarbonised electricity system.
Common misconception
“AI will solve all energy system data challenges by 2030.”
AI is a powerful tool but it depends entirely on the quality and availability of underlying data. A machine learning model trained on inconsistent, fragmented, or incomplete data will produce inconsistent, fragmented, or incomplete predictions. The seven transformation programmes are necessary precisely because they create the data foundations that AI requires. Without MHHS, CIM standardisation, and the DSI, AI applications will remain limited to the domains where clean data already exists.
Approximately how many meter readings per year will MHHS generate when fully operational?
Key takeaways
- Seven transformation programmes are running in parallel: DSI (discovery, 2028-2030), FMAR (flexibility register, 2027), SDR (consumer data, end 2026), CCS (consent, RECCo), MHHS (half-hourly settlement, July 2027), VWAN (broadband comms, 2026), and the LTDS CIM transition (standardised network models, Module 14).
- MHHS will generate ~500 billion meter readings per year, requiring fundamentally different data infrastructure from current batch processing.
- AI applications including demand forecasting, carbon intensity prediction (14 regions, 96-hour forecast), and predictive maintenance are already operational and growing.
- Digital twins of the GB electricity network are emerging incrementally, built on CIM-standardised network models with live operational data overlays.
- The Clean Power 2030 target depends on data infrastructure: the energy transition is as much a data challenge as an engineering one. Programme interdependency is the primary risk.
Standards and sources cited in this module
NESO. Data Sharing Infrastructure Programme Overview, 2024
Programme scope, timeline, and MVP specification
Primary source for the DSI programme description, the discovery and trust layer architecture, and the 2028-2030 delivery timeline.
Elexon. Market-wide Half-Hourly Settlement: Programme Plan, 2024
Cutover timeline, data volumes, and settlement impact
Source for the July 2027 cutover date, the 500 billion readings per year estimate, and the transition from profile to actual settlement.
National Grid ESO. Carbon Intensity API Documentation
API specification, regional granularity, and forecast methodology
Source for the 14-region, 30-minute granularity carbon intensity data and 96-hour forecast capability that demonstrates operational AI in GB energy.
Module 12 of 15 in Energy System Data