by Roel Dobbe (Postdoctoral Researcher) and Meredith Whittaker (Co-Founder), AI Now Institute
On September 20th, workers from 12 tech companies joined the global climate strike, highlighting tech’s role in climate change and demanding “zero carbon emissions by 2030, zero contracts with fossil fuel companies, zero funding of climate denial lobbying or other efforts, and zero harm to climate refugees and frontline communities”.
This might have surprised some people, as tech’s contribution to the climate crisis is rarely acknowledged. Indeed, industry marketing often highlights green policies, sustainability initiatives, and futures in which artificial intelligence (AI) and other advanced technologies provide solutions to climate problems.
But there is much more to the story. In this post we outline what we do and don’t know about the role of AI and the tech industry in climate change, and discuss ways that policy makers can address this outsized impact.
The Climate Cost of Global Computation
The tech industry faces criticism for the significant energy used to power its computing infrastructure. In response, the major tech companies have made data centers more efficient, and worked to ensure they’re powered at least in part by renewable energy. They’re not shy about these changes, announcing them with marketing and public fanfare.
These changes are a step in the right direction, but don’t come close to tackling the problem. Most large tech companies continue to rely heavily on fossil fuels, and when they do commit to efficiency goals, these are not open to public scrutiny and validation.
Researchers Lotfi Belkhir and Ahmed Elmeligi estimate that the tech sector will contribute 3.0–3.6% of global greenhouse emissions by 2020, more than double what the sector produced in 2007 (Belkhir and Elmeligi, 2018). The estimated 2020 global footprint is comparable to that of the aviation industry, and larger than that of Japan, which is the fifth biggest polluter in the world. Data centers will make up 45% of this footprint (up from 33% in 2010) and network infrastructure 24%.
To take a couple of concrete examples: Greenpeace’s 2017 Clicking Clean report showed that the major streaming companies Amazon Prime, HBO, and Netflix each power less than 22% of their operations with renewable energy. And Northern Virginia, which is home to the greatest concentration of data centers in the world, is powered by a utility company with only 1% of its electricity sourced from renewables (Cook et al., 2017).
And these numbers are likely to get worse. Assuming a continuation of exponential growth in data traffic and storage, which we have seen for the last three decades, Belkhir and Elmeligi estimate that the tech industry’s carbon footprint could increase to 14% by 2040, “accounting for more than half of the current relative contribution of the whole transportation sector” and more than the current relative contribution of the US.
With the emergence of wasteful cryptocurrency mining and 5G networks aiming to realize the “internet of things”, the increased acceleration of data collection and traffic is already underway (Hazas et al., 2016). In addition to 5G antennas consuming far more energy than their 4G predecessors, the introduction of 5G is poised to fuel a proliferation of carbon-intensive technologies, including autonomous driving and telerobotic surgery.
AI Makes Tech Dirtier
In the AI field there is a dominant belief that “bigger is better.” In other words, AI models that leverage mass computation are assumed to be “better” and more accurate. Rich Sutton, Distinguished Research Scientist at Alphabet’s DeepMind, puts it this way: “methods that leverage computation are ultimately the most effective, and by a large margin.”
While this narrative is inherently flawed, its assumptions drive the use of increased computation in the development of AI models across the industry. Last year, OpenAI reported that “[s]ince 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.5 month doubling time (by comparison, Moore’s Law had an 18 month doubling period).” Their observations show developers “repeatedly finding ways to use more chips in parallel, and being willing to pay the economic cost of doing so”.
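A quick illustration of what these doubling times imply, using only the figures cited above (a 3.5-month doubling for AI training compute versus the 18-month doubling of Moore’s Law); the per-year framing is ours, not OpenAI’s:

```python
# Compare the annual growth implied by the two doubling times cited above.
MONTHS_PER_YEAR = 12

ai_growth_per_year = 2 ** (MONTHS_PER_YEAR / 3.5)    # ~10.8x per year
moore_growth_per_year = 2 ** (MONTHS_PER_YEAR / 18)  # ~1.6x per year

print(f"AI training compute: ~{ai_growth_per_year:.1f}x per year")
print(f"Moore's Law:         ~{moore_growth_per_year:.1f}x per year")
```

In other words, on OpenAI’s numbers the compute used in the largest training runs has been growing roughly an order of magnitude per year, far outpacing hardware efficiency gains.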
And as AI relies on more compute, its carbon footprint increases, with significant consequences. A recent study out of the University of Massachusetts, Amherst estimated the carbon footprint of training a large natural language processing model. Emma Strubell and her co-authors reported that training this one AI model produced 300,000 kilograms of carbon dioxide emissions (Strubell et al., 2019). That’s roughly the equivalent of 125 round trip flights from NYC to Beijing.
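A rough sanity check of that comparison, using the article’s figures; the per-flight factor of roughly 2,400 kg of CO2 per round trip is implied by the two numbers above, not stated in the study:

```python
# Emissions for one large NLP training run, as reported by Strubell et al. (2019).
TRAINING_EMISSIONS_KG = 300_000
# Assumed per-passenger round-trip NYC-Beijing footprint (implied by the article).
FLIGHT_EMISSIONS_KG = 2_400

equivalent_flights = TRAINING_EMISSIONS_KG / FLIGHT_EMISSIONS_KG
print(f"~{equivalent_flights:.0f} round-trip flights")  # ~125
```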
AI and The Fossil Fuel Industry
Adding to their already sizeable climate impact, big AI companies are aggressively marketing their (carbon intensive) AI services to oil & gas companies, offering to help optimize and accelerate oil production and resource extraction. Amazon is luring potential customers in the oil and gas industry with programs like “Predicting the Next Oil Field in Seconds with Machine Learning.” Microsoft held an event called “Empowering Oil & Gas with AI”, and Google Cloud has its own Energy vertical dedicated to working with fossil fuel companies. And C3 IoT, an AI company originally started to help the transition to a society fueled by renewable energy, now helps large oil & gas companies, including Royal Dutch Shell, Baker Hughes and Engie, to expedite their extraction of fossil fuel.
And recently, the Guardian examined the role of big tech in sustaining the market for fossil fuel, illuminating the massive amounts of money tech companies invest in organizations that actively campaign against climate legislation, and promote climate change denial.
Opacity and Obfuscation
When researchers and policymakers attempt to account for tech’s climate footprint, it is immediately clear how little information is available. They are left to rely on voluntary company disclosures, without access to the information they would need to make a thorough accounting of tech’s true energy use.
Belkhir and Elmeligi, in the study cited above, explore this frustrating lack of access and information, and its consequences for research (Belkhir and Elmeligi, 2018). There is simply very little public data available, and there are few incentives for tech companies to release it. Without the information necessary to reach robust conclusions, Belkhir and Elmeligi had to estimate 2018 data center energy consumption using data from 2008. It was all they had to work with, even though, in the past ten years, both the scale of computation and the technologies powering it have changed radically.
The authors of Greenpeace’s report make similar observations, stating that while efficiency metrics have been eagerly adopted by the industry, “very few companies report under newer metrics […] that could shed any light on the basic question: how much dirty energy is being used, and which companies are choosing clean energy to power the cloud?”
In its report, Greenpeace asked tech companies to release straightforward information about data center energy impact, including the size of data center facilities and the percentage of renewables they used. They noted that the “continued lack of transparency by many companies […] remains a significant threat to the sector’s long-term sustainability.” The report also shows that many shortcuts are still employed to make corporate sustainability numbers look better than they should (Cook et al., 2017; see page 39).
Amazon Web Services (AWS) serves close to half of the global market for cloud services or “infrastructure as a service”, meaning millions of businesses and tech companies rely on the platform for their online operations (including data storage and computation). Despite their obvious responsibility, the Greenpeace report states that the company has remained “almost completely non-transparent about the energy footprint of its massive operations.” This, in turn, makes it impossible for the millions of organizations relying on AWS infrastructure to assess and report their own energy and carbon footprint. Such opacity doesn’t just prevent us from holding large tech companies accountable. It forms a critical barrier to meaningful energy accounting across all sectors and organizations that rely on digital technology.
Given the tech industry’s significant contribution to climate change, it is clear that policymakers would do well to pay more attention to tech’s climate impact. Which brings us to a key question: how can climate policy better take tech into account?
We have drafted seven recommendations that we believe provide an initial path toward tech-aware climate policy, and climate-aware tech policy.
1. Mandate transparency
Rather than relying on Greenpeace and other advocates, regulators should require companies to provide full energy and carbon transparency.
Information about a tech company’s total climate impact should be available publicly, and should be calculated in ways that inform customers and platform users about the impact of their own use of APIs, compute cycles, and other infrastructural resources.
We agree with AI researchers who propose “making [energy] efficiency an evaluation criterion for research alongside accuracy and related measures.” These researchers argue that floating point operations (the actual number of computations needed) be brought back as a central metric to compare models and (training) methods (Schwartz et al., 2019). This would be a welcome measure that complements the typically reported metrics of hardware use and time needed to run an algorithm.
Computing actual energy use is straightforward, as is accounting for carbon emissions across various units of computation (per operation, per training cycle, per cloud customer or user, per facility, etc.). Providing such information to regulators, the public, and as part of interfaces for AI programming languages and cloud services is a first step towards building broader awareness, informing regulation, and incentivizing researchers and developers to make more conscious and accountable decisions.
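A minimal sketch of what such per-unit carbon accounting can look like. All the numbers here are hypothetical, chosen for illustration only; the function name and the specific inputs (IT energy, PUE, grid carbon intensity) are our assumptions, not a standard from the article:

```python
def carbon_kg(it_energy_kwh: float, pue: float, grid_kg_per_kwh: float) -> float:
    """Estimate CO2 emissions for one unit of computation.

    it_energy_kwh:    energy drawn by the IT equipment itself
    pue:              power usage effectiveness (total facility energy / IT energy)
    grid_kg_per_kwh:  carbon intensity of the local electricity mix
    """
    return it_energy_kwh * pue * grid_kg_per_kwh

# Example: a training cycle drawing 1,000 kWh in a facility with PUE 1.5,
# on a grid emitting 0.5 kg of CO2 per kWh.
print(carbon_kg(1_000, 1.5, 0.5))  # 750.0 kg CO2
```

The point is that once providers disclose the inputs, the accounting itself is simple multiplication; the barrier is transparency, not arithmetic.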
2. Account for the “full-stack supply chain”
Understanding the energy use required to create AI and produce large scale technical systems is only a first step. To account for tech’s true environmental impact, we need a much broader view of the entire tech ecosystem — one that includes what economists often refer to as “externalities.”
In “Anatomy of an AI System,” a 2018 essay and large-scale map, AI Now cofounder Kate Crawford and Professor Vladan Joler examined a single Amazon Echo, illuminating the environmental and labor resources required to develop, produce, maintain, and finally dispose of this sleek and seemingly simple object (Crawford and Joler, 2018). Their map shows the global supply chains that are required to produce the Echo, including all of its extractive effects on the environment. This extends well beyond fossil fuel extraction, and includes mineral mining for chips, exploitative human labor for labeling training datasets, and the significant waste produced by consumer gadgets designed for planned obsolescence.
Without an understanding of this “full stack supply chain,” we will not be able to fully account for tech’s global climate impact.
3. Don’t settle for efficiency, and be wary of rebound effects
It has long been known that efficiency gains in a given process or service can lead to growth in our reliance on this same process or service — to such an extent that the growth cancels out the gains. This phenomenon is called the rebound effect.
For example, researchers in the UK showed substantial rebound effects in app-based ride-sharing, indicating that more efficient ridesharing leads to more use of cars, “cancelling out 68% to 77% of CO2 emission reductions and 52% to 73% of aggregated social benefits (including congestion, air quality, CO2 emissions, noise) expected from ridesharing” (Coulombel et al., 2019).
We are right to be wary that efficiency efforts in computing could lead not to climate gains, but to increased reliance on computation. In fact, we have seen evidence of this for decades. From 1997 to 2017, global internet traffic increased by a factor of 1.7 million, equivalent to more than doubling each year, while microchips have only become twice as efficient every two years (Moore’s Law).
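A back-of-the-envelope check of those figures. This is a deliberately crude model that treats energy per operation as inversely proportional to chip efficiency, which is an assumption of ours, not a measurement:

```python
# Figures cited above: traffic grew 1.7 million-fold over the 20 years
# from 1997 to 2017, while chip efficiency doubled every two years.
traffic_growth = 1.7e6
years = 20

annual_traffic_growth = traffic_growth ** (1 / years)  # ~2.05x per year
efficiency_growth = 2 ** (years / 2)                   # ten doublings -> 1,024x

print(f"traffic grew ~{annual_traffic_growth:.2f}x per year")
print(f"net growth outpacing efficiency: ~{traffic_growth / efficiency_growth:.0f}x")
```

On these assumptions, two decades of Moore’s Law efficiency gains (roughly a thousand-fold) were swamped by traffic growth more than a thousand times larger still, which is the rebound dynamic in miniature.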
We know that carbon emissions must be curbed to prevent major climate collapse. While efficiency percentages are essential, real energy accounting requires absolute numbers. This means understanding the rebound effects of more efficient technical infrastructures, and taking steps to ensure this doesn’t lead to an increase in fossil fuel consumption.
4. Make “non-energy policy” analysis standard practice
Currently, AI is being introduced across nearly every policy area as a strategic priority, without accounting for its climate impact. Here we see the importance of so-called non-energy policy — a term describing policies that on their surface don’t engage climate or energy use, but nonetheless have profound climate implications.
In the UK, a recent study across thirteen non-energy sectors examined the extent to which these policies account for energy impact. The authors found that policy proposals in non-energy sectors, such as agriculture or transportation, often don’t calculate climate impact. Across all policy documents for “Industry, business and innovation”, they found that “only one conducts dedicated analysis of the impact of non-energy policies on energy systems” (Cox et al., 2016).
AI and large-scale technical infrastructure have significant environmental costs. As such, when AI is integrated in a non-energy policy domain, its energy and climate impacts should be calculated as a standard part of policy practice.
5. Integrate tech regulation and green new deal policy making
As we write this, the new European Commission and Parliament are considering regulation of tech companies, and looking at how to implement ambitious climate agreements. To our knowledge, these two policy areas are discussed in siloed policy teams and committees. Given tech’s climate impact, an integration of tech and climate policy is urgent and overdue. The European Commissioners Timmermans and Vestager have an opportunity to make this important connection when they begin drafting their 100 day plans for both climate and tech regulation.
6. Curb the use of AI to accelerate fossil fuel extraction
AI provided by major tech companies is being used to accelerate oil & gas discovery and excavation, effectively speeding up climate change.
As researchers McGlade and Ekins state in their groundbreaking 2015 Nature article, “a third of oil reserves, half of gas reserves and over 80 percent of current coal reserves should remain unused from 2010 to 2050 in order to meet the target of 2 °C.” In other words, to ensure we meet the minimum target set by the Paris Climate Agreement, fossil fuel reserves must remain in the ground.
Given the urgency of this target, it is clear that policy makers have a role to play in stopping tech companies from accelerating oil exploration and excavation. Regulation curbing the use of AI for fossil fuel extraction is necessary.
7. Address the use of AI to harm and exclude climate refugees
The best climate science, as agreed upon by the Intergovernmental Panel on Climate Change (IPCC), estimates that we have only 8 and a half years before we burn through the remaining carbon budget sitting between us and irreversible ecological collapse.
Meanwhile, climate disaster is already unfolding for millions of people globally. In the first half of 2019 alone, a record 7 million people were displaced due to extreme weather. People of color, poor people, and those living in developing countries are on the front lines of climate change. These are the communities that will need shelter and assistance from rich countries and individuals who bear the most responsibility for carbon emissions.
Instead, we see a pattern of exclusion emerging, aided by the tech industry. Amazon, Palantir, and other companies are selling technology that is being used to police, surveil, and track vulnerable communities in the US, and people seeking shelter and asylum at the US border. These technologies are a core part of an infrastructure designed to exclude those in need, and shield those most responsible from accountability.
An honest and inclusive approach to keeping the tech industry accountable for issues of climate change would include, and perhaps start with, putting a halt to development and application of tracking and surveillance technologies that harm these vulnerable communities.
A call to action
The science is clear: if current emissions levels remain where they are, we will burn through the carbon budget set by the Intergovernmental Panel on Climate Change in 8 and a half years. Once we do, it will be nearly impossible to ensure a liveable planet.
Sadly, we see little action to curb emissions, with the tech industry playing a significant role in the problem.
Making the transition to a zero carbon society will require significant structural change, but there is still a window to act. Policymakers, tech workers, and academics have an opportunity to lead in holding tech accountable, and we can think of few tasks more urgent.
The authors would like to thank their AI Now colleagues for providing input in the research and writing of this post. We also welcome people working in the directions laid out in this post to share their work, reports, papers, ideas on Twitter (please include #AIandClimate, @roeldobbe, @mer__edith, @AINowInstitute).
Please cite as: Dobbe, R. and Whittaker, M. (2019). AI and Climate Change: How they’re connected, and what we can do about it. AI Now Institute. Retrieved from https://medium.com/@ainowinstitute.
- L. Belkhir and A. Elmeligi, “Assessing ICT global emissions footprint: Trends to 2040 & recommendations,” Journal of Cleaner Production, vol. 177, pp. 448–463, Mar. 2018.
- G. Cook et al., “Clicking Clean: Who is winning the race to build a green internet?,” Greenpeace, Washington, DC, Jan. 2017.
- N. Coulombel, V. Boutueil, L. Liu, V. Viguié, and B. Yin, “Substantial rebound effects in urban ridesharing: Simulating travel decisions in Paris, France,” Transportation Research Part D: Transport and Environment, vol. 71, pp. 110–126, Jun. 2019.
- Cox et al., “Impact of Non-energy Policies on Energy Systems: a scoping paper”, UK Energy Research Centre, Nov. 2016.
- K. Crawford and V. Joler, “Anatomy of an AI System: The Amazon Echo As An Anatomical Map of Human Labor, Data and Planetary Resources,” AI Now Institute and Share Lab, Sept. 2018.
- M. Hazas, J. Morley, O. Bates, and A. Friday, “Are There Limits to Growth in Data Traffic?: On Time Use, Data Generation and Speed,” in Proceedings of the Second Workshop on Computing Within Limits, New York, NY, USA, 2016, pp. 14:1–14:5.
- R. Schwartz, J. Dodge, N. A. Smith, and O. Etzioni, “Green AI,” arXiv:1907.10597 [cs, stat], Jul. 2019.
- E. Strubell, A. Ganesh, and A. McCallum, “Energy and Policy Considerations for Deep Learning in NLP,” arXiv:1906.02243 [cs], Jun. 2019.