Meta has developed an AI-based approach to improve the quality of Scope 3 emissions estimates across its IT hardware supply chain. The method combines machine learning and generative models to classify hardware components and infer missing product carbon footprint (PCF) data.
Presented at the 2025 Open Compute Project summit in May, the work contributes to broader efforts to standardize emissions reporting and improve the quality of data used in procurement and decarbonization planning. It also reflects a growing focus on applying AI techniques not only to measure environmental impact, but to help manage and reduce it across digital infrastructure.
At the heart of Meta’s approach is a hybrid AI pipeline designed to improve data coverage and consistency. The system uses machine learning to estimate PCFs by identifying components with similar specifications. In parallel, a generative AI model classifies hardware into a shared taxonomy for Scope 3 reporting.
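To make the spec-similarity idea concrete, the sketch below estimates a missing PCF as a distance-weighted average over the nearest components that do carry supplier-reported values. It illustrates the general technique rather than Meta’s actual pipeline; the feature vectors, units, and weighting scheme are assumptions.

```python
import numpy as np

# Illustrative only: toy spec vectors (e.g. capacity, power draw, mass),
# normalised to comparable scales. Features and values are invented.
known_specs = np.array([
    [0.8, 0.5, 0.3],   # SSD model A
    [0.9, 0.6, 0.3],   # SSD model B
    [0.2, 0.9, 0.7],   # NIC model C
])
known_pcfs = np.array([45.0, 52.0, 18.0])  # kgCO2e, supplier-reported

def estimate_pcf(query_spec, k=2):
    """Estimate a missing PCF as a distance-weighted average of the
    k most similar components that have reported footprints."""
    dists = np.linalg.norm(known_specs - query_spec, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-9)  # closer specs count more
    return float(np.average(known_pcfs[nearest], weights=weights))

# A new SSD with specs close to models A and B gets a PCF between theirs.
print(estimate_pcf(np.array([0.85, 0.55, 0.3])))
```

In practice the feature space would cover far richer specification data, but the core operation, borrowing footprints from the most similar disclosed parts, stays the same.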
The taxonomy model addresses a key friction in Scope 3 reporting: inconsistent naming and categorization of components across suppliers. By using generative AI to interpret part descriptions and unify them under a shared schema, Meta enables more consistent PCF assignment and reduces redundant disclosures for similar hardware.
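In outline, that classification step amounts to constraining a generative model to answer with a node from the shared taxonomy and rejecting anything outside it. The sketch below assumes a hypothetical `complete` callable standing in for whichever model is used, and the category labels are illustrative rather than the open-sourced taxonomy itself.

```python
# Illustrative sketch of taxonomy classification with a generative model.
# `complete` is a stand-in for any LLM client; it is not a real API.
TAXONOMY = ["compute.cpu", "compute.gpu", "storage.ssd", "storage.hdd",
            "network.nic", "memory.dimm"]

PROMPT = """You map supplier part descriptions to one category.
Allowed categories: {categories}
Part description: "{description}"
Answer with exactly one category from the list."""

def classify_part(description: str, complete) -> str:
    """Ask the generative model for a category, then validate the answer
    against the taxonomy so unparseable output is caught rather than stored."""
    answer = complete(PROMPT.format(categories=", ".join(TAXONOMY),
                                    description=description)).strip()
    if answer not in TAXONOMY:
        raise ValueError(f"Model returned an out-of-taxonomy label: {answer!r}")
    return answer

# Example use: classify_part("1.92TB NVMe U.2 datacenter SSD", complete=my_llm_call)
```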
The result is a more complete and standardized dataset across Meta’s fleet. The methodology supports open standards efforts through the iMasons Climate Accord and the Open Compute Project, and forms part of Meta’s roadmap toward net-zero emissions by 2030. Meta has also open-sourced the taxonomy model used in this process, aiming to encourage adoption across the supply chain and reduce duplication in emissions disclosures.
While Meta’s work focuses on upstream emissions data, AI is hardly new to infrastructure performance optimization. Google’s early work with DeepMind showed how reinforcement learning could reduce data center cooling demand. More recently, the company reported a 12% drop in data center emissions in 2024, citing ongoing AI-driven optimizations in cooling, workload distribution, and hardware utilization.
Microsoft has also positioned AI as a core part of its sustainability strategy, applying machine learning to areas like power forecasting, grid-aware workload scheduling, and emissions monitoring across Azure data centers.
In contrast to these large-scale, AI-driven systems, earlier efforts have often relied on open source tools to surface emissions data in a more static or advisory capacity. Projects like the Carbon Aware SDK and Cloud Carbon Footprint enable developers to estimate energy use and schedule workloads based on grid carbon intensity. However, these tools depend on predefined rules rather than adaptive learning. The shift underway is from passive visibility to active discovery and optimization, where AI systems make continuous decisions in response to real-world environmental signals.
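The difference is easy to see in code: a rule-based tool picks the lowest-carbon slot from a forecast and stops there, with no feedback loop. The snippet below is a generic sketch of that pattern, not the Carbon Aware SDK or Cloud Carbon Footprint APIs, and the forecast values are invented.

```python
from datetime import datetime

# Illustrative hourly grid-carbon-intensity forecast (gCO2e/kWh); values invented.
forecast = {datetime(2025, 6, 1, h): g
            for h, g in enumerate([420, 390, 350, 280, 260, 300, 380, 450])}

def pick_greenest_slot(forecast, deadline):
    """Rule-based choice: run the deferrable job at the hour with the lowest
    forecast intensity before the deadline. No learning or adaptation involved."""
    eligible = {t: g for t, g in forecast.items() if t <= deadline}
    return min(eligible, key=eligible.get)

deadline = datetime(2025, 6, 1, 6)
print(pick_greenest_slot(forecast, deadline))  # -> 04:00 slot, 260 gCO2e/kWh
```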
Whether applied to hardware or software, these approaches depend on access to high-quality, machine-readable emissions data. The Open Compute Project’s new schema for reporting product carbon footprints is one example, aimed at standardizing how vendors disclose hardware emissions across the supply chain. Meta’s work feeds directly into this framework, providing a mechanism to classify and infer missing data points at scale.
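In practice, "machine-readable" means each disclosure carries the same typed fields so tools like Meta’s can ingest it directly. The record below sketches the kind of structure such a schema implies; the field names are assumptions for illustration and do not reproduce the OCP schema.

```python
from dataclasses import dataclass

@dataclass
class PCFRecord:
    """Illustrative product-carbon-footprint disclosure record.
    Field names are assumptions, not the actual OCP schema."""
    manufacturer: str
    part_number: str
    taxonomy_category: str      # e.g. "storage.ssd", from the shared taxonomy
    pcf_kg_co2e: float          # cradle-to-gate footprint
    methodology: str            # e.g. "PAIA", "ISO 14067"
    reporting_year: int

record = PCFRecord("ExampleVendor", "SSD-1234", "storage.ssd",
                   48.5, "ISO 14067", 2024)
```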
