
Improving Scope 3 data quality
We explore the chicken-and-egg problem: one company's operational emissions are another's Scope 3. Firms rely on each other for high-quality data to drive decarbonisation - without it, everyone’s stuck. Can the GHG Protocol improve trust in Scope 3 data?
Low-quality data undermines Scope 3 actionability
A key challenge for the GHG Protocol is to help make Scope 3 data more transparent, useful and actionable.
According to a recent SBTi survey, lack of access to reliable Scope 3 data is preventing companies from setting credible baselines, tracking the impact of their decarbonisation initiatives, and achieving targets.
Last month, we discussed the SBTi’s proposals to enhance Scope 3 target setting by encouraging firms to focus their efforts on carbon-intensive activities in their footprint and on the emission sources where they have the most influence.

According to the SBTi, only 6% of companies use supplier-specific emission factors. Moreover, these emission factors are themselves often derived from a combination of the suppliers’ operational emissions and activity or spend-based estimates of the suppliers’ upstream value chain emissions.

In its stakeholder consultation on the Scope 3 Standard and Technical Guidance, the GHG Protocol received feedback casting doubt on the reliability and comparability of Scope 3 inventories, as well as the degree of certainty, accuracy and relevance of results. Alas, many companies are stuck spending more resources on collecting data than on taking action.
“In God We Trust, All Others Must Bring Data”
- W. Edwards Deming
The heart of the problem is that emissions data is not created in isolation. Companies are both preparers and users of this data - forming a carbon data value chain. For this value chain to function effectively, companies need to:
- Collect raw data, which involves sourcing and integrating company-specific and third-party data (whether reported emissions, primary or secondary activity data or emission factors)
- Transform this raw data into a company-specific emissions footprint using appropriate methodologies
- Extract insights to drive decarbonisation, and
- Disclose their data as an input for others to do the same thing.
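The four steps above can be sketched as a minimal data pipeline. This is purely illustrative: the types, field names and figures below are our own assumptions, not definitions from the GHG Protocol.

```python
from dataclasses import dataclass

@dataclass
class ActivityRecord:
    source: str             # e.g. "supplier report", "spend ledger" (illustrative)
    quantity: float         # activity amount (kWh, km, GBP spent, ...)
    emission_factor: float  # kgCO2e per unit of activity

def collect() -> list[ActivityRecord]:
    """Step 1: source company-specific and third-party raw data."""
    return [ActivityRecord("supplier report", 1000.0, 0.2),
            ActivityRecord("spend ledger", 5000.0, 0.05)]

def transform(records: list[ActivityRecord]) -> float:
    """Step 2: convert raw activity data into a footprint (kgCO2e)."""
    return sum(r.quantity * r.emission_factor for r in records)

def extract_insight(footprint: float) -> str:
    """Step 3: derive a decision-ready signal (threshold is hypothetical)."""
    return "prioritise supplier engagement" if footprint > 400 else "maintain"

def disclose(footprint: float) -> dict:
    """Step 4: publish the result as an input for others' Scope 3 inventories."""
    return {"scope3_kgco2e": footprint}

fp = transform(collect())  # 1000*0.2 + 5000*0.05 = 450.0 kgCO2e
print(disclose(fp), extract_insight(fp))
```

The point of the sketch is that each company's `disclose` output becomes another company's `collect` input, which is why weaknesses at any step propagate along the value chain.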

Each step in the process can face practical constraints, such as:
- Vocabulary constraints: using different words or metrics to describe the same things, or the same terms having different meanings. For example, different units of activity data, mixing of estimates and measurement approaches, different naming conventions and vocabulary for emission sources and categories.
- Validity constraints: unverified or unverifiable sourcing and manipulation of data due to a lack of resources or clarity on how to collect data. This can lead to concerns about accuracy of primary data, use of black box emissions engines, or lack of metadata on assumptions.
- Methodological constraints: compiling data based on the most readily applicable approach rather than methods that produce the most accurate data. This could entail overuse of spend estimates, unstandardised emission factors, or proxies to cover missing data.
- Awareness constraints: when information isn’t made public or when reports containing it are hard to find. For example, hidden data controlled by landlords, or patchy or incomplete secondary data.

A way forward - breaking Scope 3 down by data quality
To address some of these challenges, the GHG Protocol is looking at ways for companies to more effectively present and communicate the quality of their Scope 3 inventories.
Based on our reading of ongoing technical working group discussions, companies may be required to disaggregate their Scope 3 emissions according to different tiers of data quality.
For example, total Scope 3 emissions in each category would need to be reported separately for each tier, based on the quality of the underlying data and calculation process. Each tier could also be summed across categories to provide a breakdown of the total Scope 3 footprint by data quality.
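A minimal sketch of what such a disaggregation could look like in practice. The tier labels, category names and figures are our illustrative assumptions, not requirements from the GHG Protocol.

```python
from collections import defaultdict

# Hypothetical tier labels, ordered from highest to lowest data quality
TIERS = ("specific", "average", "spend-based")

def disaggregate(line_items: list[dict]) -> dict:
    """Roll up emissions (tCO2e) per Scope 3 category, split by tier."""
    by_category = defaultdict(lambda: {t: 0.0 for t in TIERS})
    for item in line_items:
        by_category[item["category"]][item["tier"]] += item["tco2e"]
    return dict(by_category)

def tier_totals(by_category: dict) -> dict:
    """Sum each tier across categories: the footprint broken down by data quality."""
    totals = {t: 0.0 for t in TIERS}
    for tiers in by_category.values():
        for tier, value in tiers.items():
            totals[tier] += value
    return totals

# Illustrative inventory line items
inventory = [
    {"category": "1. Purchased goods",    "tier": "spend-based", "tco2e": 1200.0},
    {"category": "1. Purchased goods",    "tier": "specific",    "tco2e": 300.0},
    {"category": "4. Upstream transport", "tier": "average",     "tco2e": 150.0},
]

print(tier_totals(disaggregate(inventory)))
# → {'specific': 300.0, 'average': 150.0, 'spend-based': 1200.0}
```

In this example, a user can see at a glance that most of the footprint (1,200 of 1,650 tCO2e) rests on spend-based estimates, which is exactly the kind of signal the tiered presentation is meant to surface.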

The GHG Protocol believes this approach would help provide increased transparency in distinguishing between levels of data accuracy, making Scope 3 data more actionable. It also hopes to encourage use of more accurate data over time by incentivising companies to move from lower to higher tiers of data quality.
Defining a data quality hierarchy
The challenge for the GHG Protocol is to create a simple data quality hierarchy that is easy for preparers to implement and for users to interpret. Over the years there have been various attempts to produce such hierarchies by different frameworks and organisations - one of the more successful is the PCAF data quality score used for financed emissions.
It’s too early to draw firm conclusions on where this will land. But our understanding is that the GHG Protocol is leaning towards requiring companies to classify Scope 3 inventory quality based on whether the underlying data is (i) specific, (ii) average, or (iii) spend-based.
There are still details that need to be worked out - for example, how to classify hybrid approaches and certain downstream Scope 3 categories. It also seems that “top tier” specific data will need to use both source-specific activity data (i.e. data that’s been metered, counted or physically modelled) and fuel- or source-specific emission factors.
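The rule described above can be expressed as a simple classifier: data only earns the “specific” tier when both the activity data and the emission factor are source-specific. The function name, field values and tier labels are hypothetical, not GHG Protocol definitions.

```python
def classify_tier(activity_data_source: str, emission_factor_source: str) -> str:
    """Assign a data-quality tier to one emission source (illustrative logic)."""
    # "Specific" requires BOTH source-specific activity data AND a
    # fuel- or source-specific emission factor
    if activity_data_source == "source-specific" and emission_factor_source == "source-specific":
        return "specific"
    # Spend data falls to the bottom tier regardless of the factor used
    if activity_data_source == "spend":
        return "spend-based"
    # Everything else, e.g. measured activity with industry-average factors
    return "average"

print(classify_tier("source-specific", "source-specific"))   # → specific
print(classify_tier("spend", "secondary"))                   # → spend-based
print(classify_tier("source-specific", "industry-average"))  # → average
```

Note the third case: even metered activity data drops out of the top tier if it is paired with a generic emission factor, which is the asymmetry the working group discussions appear to be converging on.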
In addition, the GHG Protocol is considering whether to require companies to disaggregate data depending on whether it has been verified. It is also debating whether to make measuring uncertainty a requirement (as opposed to a recommendation) - though this itself remains “uncertain” at the time of writing...
The Minimum Line
There’s always a risk of reading too much into ongoing discussions. Nothing is agreed until it’s agreed. This is also just one aspect of a wider review of the Scope 3 standard. And there is still a way to go before the GHG Protocol’s revisions see the light of day in 2027.
Even so, we can see a direction of travel emerging towards requiring companies to give more detailed information on the quality of the data in their Scope 3 footprint.
It’s too soon to know whether this will also lead to more prescriptive requirements for companies to apply certain levels of minimum data quality. Throughout this revision process, the GHG Protocol is balancing demands for increased prescriptiveness to improve comparability against the flexibility needed to ensure completeness.
What is becoming clear is that companies increasingly need a detailed carbon inventory management plan documenting the methodologies, assumptions and estimates used to compile their carbon footprint.
Interested in building future-proof carbon inventories with Minimum?
Subscribe to our monthly newsletter or email info@minimum.com.