Google and Co. face a CO2 problem

The cooling systems in data centers consume a lot of electricity. The picture shows the cooling system of the Ashburn data center in Virginia, USA. Image: Washington Post/Getty

Big tech companies like to present themselves as climate-friendly, but their energy consumption keeps rising, not least because of ChatGPT and other AI tools.

Daniel Zurauf / CH Media

For a long time, artificial intelligence was primarily a matter for professionals. But so-called “deep learning” systems, such as language bots that learn on their own to write match reports on football games or to imitate voices, have now reached the masses. ChatGPT and similar conversational systems allow every average mobile phone user to set in motion the learning processes on which these systems are based.

But what is convenient, sometimes fun, and often useful also consumes a lot of power. The exact amount is difficult to quantify, but experts estimate that queries to “intelligent” chatbots cause, on average, about three times more greenhouse gas emissions than a normal Google search, and the latter can hardly be described as climate-friendly in the first place.

CO2 emissions are rising sharply

This is evident, among other things, from tech giant Google's latest sustainability report. The company is very active on environmental issues and has committed to covering all of its electricity needs with renewable energy as early as 2030. Last year it nonetheless emitted 14.3 million tons of greenhouse gases. Admittedly, that figure seems almost reassuring when compared with the roughly ten times higher CO2 emissions of Holcim, the world's largest cement company.

However, there is no denying that Google's carbon footprint is deteriorating at an alarming rate. Since 2019, the group's CO2 emissions have risen by 48%. In 2023 alone, they increased by 13.5%.


Google noted that achieving its “extremely ambitious” net-zero target by 2030 as planned “won't be easy.” Image: Keystone

Google noted that achieving its “extremely ambitious” net-zero target by 2030 as planned “won't be easy” because uncertainty about the future development of artificial intelligence makes forecasts difficult. Two years ago, in an interview with CH Media, Urs Hölzle, a long-serving Google executive in Switzerland, answered the question of how realistic the company's climate goals are with refreshing openness: “Let's be honest: we don't know yet. As long as energy suppliers provide green electricity, we will buy it. If that doesn't work 100%, we will take action ourselves.”

Clean and affordable nuclear power?

In apparent anticipation that the growing popularity of mass-market AI applications will continue to weigh heavily on Google's carbon footprint, the company is now becoming “proactive.” On October 14, it announced a cooperation agreement with the US company Kairos Power to build small nuclear power plants, so-called small modular reactors (SMRs), with a generating capacity of 500 megawatts by 2030. Google said that through this agreement, American consumers will benefit from more clean and affordable nuclear power.

The benefits of such mini-reactors, as presented by the tech group, are an exercise in euphemism. Quite apart from the fact that the spread of nuclear technology associated with Kairos' business model poses significant security risks, the claim that the technology is clean is at best a pious hope.

Just last week, the British parliamentary committee responsible for overseeing the Sellafield nuclear facility found that, despite all efforts to date, 2,100 liters of radioactively contaminated water are leaking into the environment every day at Europe's largest nuclear site, from a storage facility built in the 1960s.

Completely sealing the contaminated storage facility will take decades. Against this background, the argument that SMRs produce less radioactive waste, which only needs to be stored safely for 500 years instead of 10,000, is not particularly convincing.

On top of that, just a year ago, the US company NuScale Power's state-funded mini-reactor project failed spectacularly. The project proved far more technically demanding than expected and could not demonstrate that it would operate economically. CEO John Hopkins said of the capitulation in November 2023: “If you've got yourself into a situation where you're riding a tiger, it's best to get off now.”

Limitations of deep learning

Given the rapid spread of “deep learning” systems onto the phones in everyone's pockets and the computers of every small and medium-sized business, such observations carry even more weight than Google's carbon footprint. Roland Siegwart, professor of robotics at ETH Zurich and a pioneer in the field of machine learning, said in a 2023 interview with “Blick” about what chatbots mean for ordinary users: “I am skeptical about it. A system like ChatGPT needs to analyze an extremely large amount of data in order to produce good results. But if you show a child a drawing of a giraffe, it can immediately recognize a giraffe at the zoo, even if the animal there looks different from the one in the book. Systems that learn from large amounts of data cannot do that yet.”

This statement is interesting because it implies that many current and, above all, potential large-scale applications of AI could quite easily be dispensed with. But the lobby of energy savers is small, and big industry discovered long ago that climate protection is a business and a good argument for its own marketing. Electrical engineering group ABB, for example, sells its electrification expertise as a great benefit to society in the fight against greenhouse gases and global warming. There is no doubt that ABB's services help improve the energy efficiency of the data centers that process AI data. But data centers are also lucrative customers.

The rows of servers in data centers reach high operating temperatures and must be cooled with large amounts of energy. The International Energy Agency (IEA) predicts that global energy consumption by data centers will double to 1,000 terawatt hours within two years. That is roughly the amount of energy a large industrialized country like Japan currently produces in a year. ABB and many other companies naturally see this first and foremost as a major growth market rather than as a looming power shortage.


Large server farms require few people but a lot of energy. Image: EPA

Of course, the same applies to Google, which wants its services to be understood primarily as a valuable contribution to climate protection and their energy needs as a necessary evil. Urs Hölzle, the long-time Google manager, conceded the latter point in the interview with CH Media: “I'm not one of those people who believe that technology by itself is the solution. It obviously helps. But take energy-saving route planning as an example: of course, not making the journey at all would be even better.” Given the current power of opinion-makers and the lack of awareness among consumers, it seems unlikely that the community of energy savers will be able to make itself heard again in the discussion about climate protection measures any time soon.

Recipes to fight waste

How much power does a specific piece of software or a specific AI application require? Stefan Naumann, professor at the Institute of Sustainable Informatics at the University of Applied Sciences Trier, says that determining the resource needs depends on many factors. That makes a self-critical examination of each project all the more important:
  • Is an AI solution necessary for the given requirements?
  • Are there existing models that just need a slight modification?
  • Is there already reliable data that can be used to develop AI models?
  • Are there environmental labels (such as the Blue Angel) for the data centers where the AI models are trained?
  • Given the expected number of requests, is it worth significantly optimizing the AI model so that each request requires as few resources as possible? (A rough sketch of this trade-off follows below.)
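
To illustrate that last question, here is a minimal back-of-the-envelope sketch in Python. All figures in it are hypothetical placeholders chosen purely for illustration, not numbers from the article or from any measurement; the point is only the shape of the trade-off between a one-off optimization effort and the energy saved on each request.

    # Rough sketch: does optimizing a model pay off at a given request volume?
    # All numbers below are hypothetical placeholders, not measured values.

    def total_energy_kwh(wh_per_request: float, requests: int,
                         one_off_kwh: float = 0.0) -> float:
        """One-off energy cost (e.g. extra optimization or retraining runs)
        plus per-request inference energy, returned in kilowatt hours."""
        return one_off_kwh + wh_per_request * requests / 1000.0

    requests_per_year = 50_000_000                        # assumed request volume
    baseline = total_energy_kwh(3.0, requests_per_year)   # assumed 3 Wh per request
    optimized = total_energy_kwh(1.0, requests_per_year,  # assumed 1 Wh after optimization
                                 one_off_kwh=20_000)      # assumed one-off optimization cost

    print(f"baseline:  {baseline:,.0f} kWh per year")
    print(f"optimized: {optimized:,.0f} kWh per year")
    print("worth optimizing" if optimized < baseline else "not worth it at this volume")

At a small request volume the one-off cost dominates and the optimization may not pay off; at mass-market scale the per-request term dominates, which is exactly the situation the checklist has in mind.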
