Getting a Grip on Scope 3 Data Quality

September 30, 2025
Written by  
Nick Greenwood
Climate Innovation Lead

We dive into measuring data quality. The GHG Protocol is exploring ways to boost scope 3 data quality by requiring companies to disclose how specific their data is. But there are real questions about feasibility.

Defining data quality

Making scope 3 data more actionable is a key focus of the GHG Protocol’s revision to the Scope 3 Standard. Data quality is often insufficient for companies to move beyond reporting to action.

A key challenge is a lack of specificity. According to SBTi, only 6% of companies use supplier-specific emission factors. This means most firms are relying on average-data or spend-based methods, which makes it difficult to distinguish between suppliers and act accordingly.

Figure 1. Survey on differences in GHG accounting approaches used for baseline emissions (SBTi)

But data quality isn’t only about specificity. Data can be specific but lack accuracy - as the GHG Protocol explains (see Figure 2). For example, metered fuel data recorded with an incorrectly calibrated device is highly specific but inaccurate.

Specificity vs. Accuracy

Even though the supplier-specific and hybrid methods are more specific to the individual supplier than the average-data and spend-based methods, they may not produce results that are a more accurate reflection of the product’s contribution to the reporting company’s scope 3 emissions.

In fact, data collected from a supplier may actually be less accurate than industry-average data for a particular product. Accuracy derives from the granularity of the emissions data, the reliability of the supplier’s data sources, and which, if any, allocation techniques were used. The need to allocate the supplier’s emissions to the specific products it sells to the company can add a considerable degree of uncertainty, depending on the allocation methods used.

Figure 2. GHG Protocol Technical Guidance for Calculating Scope 3 Emissions

More broadly, data quality encompasses several aspects: the extent to which data is complete, reliable, and representative (geographically, temporally, and technologically). Different dimensions can affect overall data quality, such as the data source, the methods used to calculate emissions, the degree of uncertainty, and specificity.

Which dimension matters often depends on the use case - measuring the overall size of emissions, reporting on achieved reductions, or driving action.

Communicating data quality

As we discussed in our April newsletter, a key challenge with scope 3 is communication. Companies are both preparers and users of emissions data. One company’s emissions are another’s scope 3 emissions, which in turn form part of someone else’s scope 3 emissions, and so on.

For emissions data to flow across the value chain, companies need to:

  1. Collect raw data, which involves sourcing and integrating company-specific and third-party data (whether reported emissions, primary or secondary activity data, or emission factors)
  2. Transform this raw data into an organisational or product emissions footprint using appropriate methodologies
  3. Extract insights to drive decarbonisation, and
  4. Disclose their data as an input for others to do the same thing.
Figure 3. Carbon value chain data flow (author)

A key challenge for the GHG Protocol is devising a way to communicate data quality that enables this information to flow across the value chain, so that it is:

  • Easy to interpret for data users: so firms can easily ascertain whether the data they are using for their scope 3 footprint is of sufficient quality to measure progress and inform action
  • Easy to implement for data preparers: so firms can straightforwardly assess and communicate the quality of their scope 3 footprint
  • Not subjective: avoiding excessive interpretation and judgement.

Given the multiple dimensions of data quality, meeting these criteria is no easy task.


A data quality hierarchy based on specificity

After considering different options, the GHG Protocol appears to have converged on a proposal to disaggregate scope 3 according to a data quality hierarchy based on specificity.

Companies would be required to disaggregate each scope 3 category into three tiers: (i) specific; (ii) non-specific (or average); and (iii) EEIO (environmentally-extended input-output) or spend. A possible fourth, “unknown”, tier would apply when firms lack information. The precise category names are still under discussion.

While disaggregating by specificity supports actionability, as noted above, it doesn’t necessarily convey information about accuracy and precision.

The GHG Protocol is therefore also proposing that companies indicate whether the data in their scope 3 footprint has been verified. It may also require large companies to conduct an uncertainty assessment (pending development of a standardised method).

As a result, in the future, companies might be required to report their scope 3 emissions in a table that looks something like the following:

Figure 4. Illustrative format under discussion in GHG Protocol technical working group

Getting specific

While the proposals should help users of scope 3 data form a better understanding of its quality, it’s less clear how feasible this will be for preparers to implement.

The proposals will require companies to determine the specificity of emissions data (whether it is specific, average, or EEIO) based on how specific both (i) the underlying activity data and (ii) the applied emission factors are to the value chain activity being calculated.

Activity data would be considered specific if it meets all of the following conditions:

  • Time period: the measurement period corresponds to the reporting period
  • Location: the data is collected from a company’s own premises or provided by value chain partners for the specific site and technology/process/product/waste fraction
  • Allocation: if allocation is used, it is applied consistently across all outputs to avoid under- or over-reporting of emissions
  • Activity-specific rules: e.g. fuel, energy and material consumption must be measured in physical units.
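Since activity data counts as specific only when every condition holds, the check is effectively a logical AND across the criteria. As an illustrative sketch (the field names and structure below are our own, not the GHG Protocol's), it might look like:

```python
from dataclasses import dataclass


@dataclass
class ActivityData:
    """Hypothetical record describing activity data for one value chain activity."""
    period_matches_reporting_period: bool  # Time period: measurement period == reporting period
    site_specific: bool                    # Location: own premises or partner-provided for the site/process
    allocation_used: bool                  # Whether the supplier allocated emissions to this output
    allocation_consistent: bool            # If allocation is used, applied consistently across outputs
    measured_in_physical_units: bool       # Activity-specific rule: e.g. fuel measured in litres, not spend


def activity_data_is_specific(ad: ActivityData) -> bool:
    """Activity data is 'specific' only if all conditions hold (allocation only if used)."""
    allocation_ok = (not ad.allocation_used) or ad.allocation_consistent
    return (
        ad.period_matches_reporting_period
        and ad.site_specific
        and allocation_ok
        and ad.measured_in_physical_units
    )
```

A single failed condition - say, data measured outside the reporting period - would push the activity data out of the specific classification, regardless of how it performs on the other criteria.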

Emission factors would be classified as specific if they comply with GHG Protocol corporate standards, use the latest IPCC Assessment Report, and meet type-specific criteria (see below).

Figure 5. Possible definition of specific emission factors (GHG Protocol technical working group)

By contrast, non-specific emission factors would be those modelled with secondary datasets.

The final classification of scope 3 data into tiers would depend on the combined specificity of the activity data and emission factors used in the calculation.

Figure 6. Illustrative classification by data specificity tier (GHG Protocol technical working group)

Scope 3 emissions would be classified into tiers according to whether they use:

  • Specific (blue): specific activity data and specific emission factors
  • Non-specific/average (yellow): any average, secondary, or proxy emission factor, regardless of activity data specificity
  • EEIO/spend (magenta): any EEIO or spend-based emission factor, regardless of activity data specificity.

A similar, though not identical, approach is being explored for some downstream categories (9-12), with a stronger emphasis on certainty levels for activity data.

The Minimum Line

Given the multi-faceted nature of data quality, it’s no easy task to summarise it in a single straightforward metric. Over the years a number of approaches to measuring data quality have been proposed by different schemes and standard-setters.

The focus of the current proposals on specificity suggests the GHG Protocol is prioritising making scope 3 data more actionable. The proposed disaggregation is potentially helpful for users of scope 3 data. In fact, Corporate Standard working groups are now considering whether elements of this approach could extend to scope 1.

But companies that use scope 3 data are also preparers themselves. And this raises concerns about feasibility. These proposals could materially increase reporting costs and exceed the capabilities of some existing data management systems.

This is a live topic of debate in the working group, with options on the table to extend implementation timelines or soften requirements into recommendations.
