AI: A sustainability friend or an environmental foe?

Posted: July 02, 2024


AI has been helping industries decarbonize for years by making their operations more energy efficient. But, manufacturing, training and running the computers that power AI also produces a lot of carbon emissions. So, which is it? Is AI increasing or decreasing our industrial carbon footprint?





REBECCA AHRENS

Imagine it’s 1700. You’re a farmer, living in the countryside, and you need to go somewhere—or maybe move a bunch of things from one place to another that’s, say, 15 miles away.

Back then, before the first Industrial Revolution, you would have had to rely either on your own physical power or on what animals could do for you. 

There were things like windmills and tools that used gravity to do things. After all, Don Quixote was battling windmills way back in the 1600s. But in the 1700s and even early 1800s, a lot of labor was still accomplished using plain old muscle power. 

But then, during the first Industrial Revolution, we figured out how to create a new kind of power—a power that totally changed the world and made many of the things that you and I enjoy about life in the 21st century possible.

ELIZABETH

What do you mean by “new kind of power?”

REBECCA

Artificial power.

ELIZABETH

Artificial power?! I’ve never heard of such a thing.

REBECCA

I guarantee you have heard of it. You just might not have heard it called that before. But to explain what artificial power is, I want to move forward in time to the summer of 2016. 

The TED conference was taking place in Banff, Canada that year. And during one of the presentations, this unassuming, grey-haired man with a very, very subtle lisp and an iconic beard got on stage and delivered this short speech on how the proliferation of artificial intelligence was going to bring about a new industrial revolution as dramatic as—if not more dramatic than—the one that gave us artificial power.

KEVIN KELLY [recording from TED talk]

So everything that we had electrified, we’re now going to cognify.

REBECCA

That man was Kevin Kelly, the founding Executive Editor of WIRED magazine and author of a bunch of books, including The Inevitable and What Technology Wants.

When we discovered that we could burn things like wood, or coal, or gas, to create motive force and, eventually, electricity, it totally upended society.

So, after the Industrial Revolution, instead of walking or having some kind of animal pull you around, you could get in a car and burn gasoline to get yourself or stuff from one place to another. Instead of having to drill holes and cut or saw things by hand, you could create machines and tools that would drill holes and cut things using the power of electricity.

For a little more context: the first battery wasn’t invented until 1800. Electric motors and generators weren’t invented until the 1820s and 1830s. But once we had these things, it just completely revolutionized the way we did practically everything.

ELIZABETH

I just can’t imagine life without electricity. I mean, think of all of our household appliances, modern medical devices, transit systems, movies, recorded music.

REBECCA

I know, yeah. So, anything that requires electricity to run—or any kind of, like, burned fuel to run—would not have existed without the discovery/invention of this artificial power. And these things have obviously radically improved the standard of living for most people around the world.

But—and this is a very big “but” here—many of the environmental issues we face today can also be traced back to the evolution of artificial power. So even though artificial power has given us all this great stuff, it’s also gotten us into a lot of trouble.

So, for instance, burning coal, oil and natural gas to create electricity and heat alone accounts for about a quarter of all emissions in the U.S. If you bring transportation into the mix—so, like, burning stuff to move cars and planes and boats around—and you bring other industrial tasks into the mix, like industrial chemical reactions and whatnot, suddenly you’re at three quarters of all greenhouse gas emissions.

ELIZABETH

Wow.

REBECCA

And—I know—and these are roughly the same numbers globally. There’s an interesting chart from Our World in Data that we will link to in the show notes on the episode page that shows this global breakdown of emissions sources.

ELIZABETH

So, what’s really next?

REBECCA

So, this next industrial revolution that Kevin Kelly was talking about is poised to once again radically reshape our lives and our world. But, the impact that AI has on the environment and sustainability goals we’ve set for ourselves will largely depend on the relationship between AI and the industrial sector. 

Today, we’re going to explore both the negative and positive influences AI models could have on our environment and global sustainability goals. And we’ll talk about what role critical industries, like water, energy, manufacturing, materials science, can play in shaping how AI changes life on planet earth moving forward.    

Ok. On with the show.

REBECCA

I’m Rebecca Ahrens

ELIZABETH

And I’m Elizabeth Dean 

REBECCA

And you’re listening to stories from Our Industrial Life. 

ELIZABETH

Each episode, we bring you tales of how industrial data and technology impact our everyday lives, keep our world running, and shape the future of society and the planet. 

REBECCA

This week on the show—AI: A sustainability friend or an environmental foe?

We’re going to explore the two possible paths in this particular “choose your own adventure” we all find ourselves in: how this AI-driven industrial revolution could help us mitigate the downsides of the first Industrial Revolution—in other words, be a sustainability friend—and/or how it might make things worse. 

And to help us think through these two paths and what they might offer us, I want to bring in another guest. 

REBECCA

Can you hear me ok?

SIMON

Oh, super clear.

REBECCA

This is Simon Bennett. He’s the Director of Innovation and previously the Global Head of Research at AVEVA.

SIMON

There's obviously a lot of different angles for AI and sustainability. There are so many different angles towards managing the reduction of carbon in the environment.

REBECCA

I want to start by looking at four areas in particular where we might have concerns: electricity usage, water, raw materials and carbon emissions. Obviously, these are all interconnected, but maybe let’s start with electricity and carbon emissions. 

SIMON

This is a very topical subject. I see a lot of papers on this and a lot of press coverage on this. And of course, my answer is gonna come out of the mouth of somebody who works for a tech company. So my perspective is only my perspective, having spent my career in software.

What I've learned is that, at worst-case estimates, our current incremental usage of AI, and all of the technologies that will support it, is going to be anywhere between one and three percent of the total created carbon for the planet.

REBECCA

Yeah, so, I’ve heard that the global contribution of the tech sector—and AI models in particular—to greenhouse gas emissions is pretty difficult to calculate for several reasons. And we’ll get into that. But, most estimates out there put the current carbon emissions from the information and communication technology sector somewhere in that one to four percent ballpark.

SIMON

And although they are small numbers, they are huge amounts of tonnage of carbon that will be created by compute.

REBECCA

Yeah.

SIMON

And that’s not specifically AI, by the way. When we say compute, I mean the whole world: the photographs that me and you download from our cloud service providers, etcetera.

Let's think about people playing video games: that's compute. We're all consuming compute in some form or another. AI surely does consume more because there's some very compute-intensive activity to create these models. It's compute that's very, very intensive on the graphics processing unit.

ELIZABETH

What does that mean—“intensive on the graphics processing unit?” I’m realizing I don’t really know what people mean when they talk about “compute.”

REBECCA

So graphics processing units, or GPUs, are a type of integrated circuit that’s really good at parallel processing—in other words, processing different tasks or calculations at the same time rather than sequentially. The piece of hardware in a processing unit that does the actual processing and computation is called a core, and GPUs have hundreds to thousands of cores.
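
To make "parallel processing" a little more concrete, here is a rough, illustrative sketch in Python (it is not from the episode, and NumPy on a CPU is only standing in for a GPU). The shape of the comparison is the point: one core grinding through multiply-adds one at a time versus many multiply-adds dispatched at once.

```python
import time
import numpy as np

# Toy workload: multiplying two matrices, the basic operation that
# neural-network training and inference are built from.
n = 200
a = np.random.rand(n, n)
b = np.random.rand(n, n)

def matmul_one_at_a_time(x, y):
    """Sequential version: one multiply-add after another, like a single core."""
    rows, inner = x.shape
    cols = y.shape[1]
    out = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            for k in range(inner):
                out[i, j] += x[i, k] * y[k, j]
    return out

t0 = time.perf_counter()
matmul_one_at_a_time(a, b)
print(f"one at a time: {time.perf_counter() - t0:.2f} s")

t0 = time.perf_counter()
a @ b  # vectorized: many multiply-adds handled at once, which is what a GPU's
       # hundreds to thousands of cores do in hardware
print(f"many at once:  {time.perf_counter() - t0:.4f} s")
```

On a real GPU the gap is far larger, which is why training large models happens on racks of GPUs rather than ordinary CPUs.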

GPUs were actually originally designed for video and image processing. So, if you have any avid gamers in your family, for instance, you might have heard them gush at some point about a new graphics card they just got.

ELIZABETH

That would be my nephew. Not too long ago he ordered and installed a new graphics card. Said it made all the difference.

REBECCA

Yes, exactly. So, those are the GPUs. So eventually, people realized that GPUs’ ability to do fast parallel processing made them great candidates for machine learning: training neural networks like large language models and other forms of generative AI.

ELIZABETH

Like ChatGPT? 

REBECCA

Yeah, let’s take a chatbot like ChatGPT for example. OpenAI built ChatGPT based on a particular kind of neural network called a large language model, or LLM. The LLM that powers ChatGPT was trained on a private Microsoft Azure-based supercomputer. That computer is basically just a bunch of those GPUs we talked about (plus some other hardware) sitting in a massive data center—or maybe several data centers, somewhere like Iowa or Virginia—operating with hundreds of thousands to millions of those cores.

ELIZABETH

So when people talk about the amount of electricity it takes to train and run an AI model, what they really mean is the amount of electricity it takes to power these data centers? 

REBECCA

Yeah, exactly. So, the U.S. has the largest single concentration of data centers: roughly a third or so of the 8,000 that there are in the world. And according to a report from an organization called the International Energy Agency (IEA), power consumption for the data centers in the U.S. is expected to increase from around 200 terawatt-hours in 2022—which is roughly 4% of the U.S. electricity demand—to almost 260 TWh in 2026—which is roughly 6%.
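
For anyone who wants the arithmetic spelled out, here is a quick back-of-envelope sketch using the IEA figures Rebecca quotes. The household comparison is our own rough assumption for scale, not a number from the report.

```python
# Back-of-envelope arithmetic on the IEA figures quoted above (TWh = terawatt-hours).
us_data_centers_2022_twh = 200   # roughly 4% of U.S. electricity demand in 2022
us_data_centers_2026_twh = 260   # roughly 6% projected for 2026

growth_twh = us_data_centers_2026_twh - us_data_centers_2022_twh
growth_pct = 100 * growth_twh / us_data_centers_2022_twh
print(f"projected growth: {growth_twh} TWh, about {growth_pct:.0f}% in four years")

# For a sense of scale (an assumption, not a figure from the episode): an average
# U.S. household uses roughly 10,500 kWh of electricity per year, so the increment
# alone is several million households' worth of annual electricity use.
avg_household_kwh_per_year = 10_500
households = growth_twh * 1e9 / avg_household_kwh_per_year
print(f"roughly {households / 1e6:.1f} million average U.S. households")
```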

SIMON

So it's definitely consuming a lot of electricity, which is created by all kinds of different methodologies.

REBECCA

Meaning, of course, until we can completely decarbonize electricity production, with electricity comes carbon emissions.

Even though we can say for sure that large AI models are extremely electricity hungry, it’s hard to draw any general conclusions about how much carbon dioxide is associated with building, training and running them because, for one, as Simon just alluded to, different data centers have different amounts of renewable energy in the mix—meaning, you know, some data centers might be exclusively run on fossil fuel-based electricity, some might be run on a mix of wind and solar and fossil fuels: it just depends on, kind of, where the data center is.

And secondly, there are also carbon emissions associated with mining the metals needed for those computers—and then you gotta ship those raw materials to, you know, Taiwan, or wherever the computer components will be fabricated, and then you have to ship those fabricated parts to wherever the data center is going to be, and so on. And on top of that, if a new data center is built with the intention of running a large AI model, you could also take into account all the emissions associated with building that data center.

ELIZABETH

Because no model, no need for a new data center.

REBECCA

Exactly. No model, no data center. 

So, obviously, people are really interested in trying to quantify exactly how many carbon emissions are associated with these AI models, which, as I said, is tricky to do. But there are research scientists at the AI company Hugging Face who have been attempting to map the lifetime carbon footprint of an AI model by taking into consideration the sorts of emissions that are produced during the manufacturing of the computer equipment.

They estimated, for example, that training one large AI model, called BLOOM, resulted in 25 metric tons of carbon dioxide emissions. But that figure doubled when they took into account the emissions produced in manufacturing the computer equipment, the emissions from running the broader computing infrastructure, and the electricity required to actually run the model once it was trained—a process called inference, like asking ChatGPT a question, for example.

ELIZABETH

Wow. So if you double that, that’s roughly 50 metric tons.

REBECCA

Exactly. And 50 metric tons—just to give you a sense of what that means—is equivalent to about 60 flights between London and New York.

And, one more caveat here: BLOOM was structured to run in a really efficient way. By contrast, training GPT-3 produced an estimated 552 tonnes of carbon dioxide—which is more than 20 times what training BLOOM produced. And that is just for operating the computers—not for manufacturing them.
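
If you want to check those numbers against each other, here is a small sketch that uses only the figures quoted in the episode; the per-flight estimate is simply what the "60 flights" comparison implies, not an independent measurement.

```python
# Cross-checking the carbon figures quoted in this part of the episode.
bloom_training_tonnes = 25                            # training BLOOM (operations only)
bloom_lifecycle_tonnes = 2 * bloom_training_tonnes    # roughly doubles once hardware
                                                      # manufacturing, wider infrastructure
                                                      # and inference are counted
gpt3_training_tonnes = 552                            # estimated for training GPT-3
                                                      # (operations only)

print(f"BLOOM lifecycle estimate: about {bloom_lifecycle_tonnes} tonnes of CO2")
print(f"GPT-3 vs BLOOM, training only: about "
      f"{gpt3_training_tonnes / bloom_training_tonnes:.0f}x")

# The 'about 60 London to New York flights' comparison implies roughly this much
# CO2 per flight:
print(f"implied per-flight footprint: about "
      f"{bloom_lifecycle_tonnes / 60:.2f} tonnes of CO2")
```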

SIMON

Hyperscalers—the Googles and Amazons and Microsofts of this world—to their great credit, are setting a very impressive goal for themselves to make all of their data centers net zero. Meaning that they'll be powered by renewable energies—not just offset by carbon credits or something, let's say, less worthy.

And they’ve made very public commitments. There's lots of websites where you can see all of that data that they've made public.

REBECCA

It’s true that the hyperscalers Simon mentioned have made some really big public commitments. But, since he and I last spoke, the picture has gotten a little more complicated. 

So, for instance, Microsoft reported a 6.3% decrease in Scope 1 and Scope 2 emissions in 2023.

ELIZABETH

So, the emissions from keeping the lights on, running the compute in the data centers— that went down?

REBECCA

Yes. But, its overall emissions went up almost 30%.

ELIZABETH

How did that happen?

REBECCA

Well, for one, it built new data centers, and the emissions from making those buildings, manufacturing new computers—all those Scope 3 emissions went up over 30%.

ELIZABETH

Oof.

REBECCA

I know. But, all is not lost. Microsoft is going to start requiring some of its highest-volume suppliers to use 100% carbon-free electricity by 2030.

ELIZABETH

Ok, that’s good, right? So, we’re still working on decarbonizing the building of actual, physical data centers. But, once they’re up and running, what are the emissions from actually running the computers?

REBECCA

Well, that’s also complicated to explain. Sorry: nothing has an easy answer here. So, Microsoft AI, for example, is 100% powered by renewables. But, of course, that doesn’t mean that it’s hooked up directly into a windmill or a solar farm, for instance. It needs to take power off the grid just like anyone else. And when power producers—whether they’re renewable or traditional fossil fuel-based power—when they put power onto the grid, it all just gets mixed together. You know, you can’t differentiate the power once it’s in those power lines.

ELIZABETH

So, how does that work?

REBECCA

So, there are a couple things companies can do to offset any non-renewable sources they might end up using from the grid. The first is long-term power purchase agreements, or PPAs. These are essentially long-term commitments to buy electricity from renewable energy projects, which helps new projects get financed and built.

ELIZABETH

So, they’re creating demand for renewable energy and helping new renewable startups get up and running.

REBECCA

Right. Then there are also these things called renewable energy credits, or RECs. But these just don’t seem to spur demand for renewables.

To turn this knot into a true Gordian situation: some people argue that while it’s all fine and good to invest in building more solar panels, and wind turbines, and batteries, and so on, those renewable energy sources might be better used powering things like homes, schools, office buildings and hospitals.  

ELIZABETH

Yep, makes sense. This is getting very complicated.

REBECCA

And we haven’t even talked about water yet. Because another big problem with running all that compute is…

SIMON

It creates heat. You know: temperatures.

REBECCA

A lot of heat. And computer equipment can be damaged if things get too hot, so it’s really important that those data centers have ways to stay cool. And, most often, this is done using fans or pumping cold water, which, again, requires electricity.

Google got into a little spot of trouble recently over plans to build a new data center in Chile. Depending on their size, data centers can use hundreds of thousands or even millions of gallons of water a day, which is very concerning, especially in an area that is already struggling with drought.

ELIZABETH

So, um, how do we mitigate that?

REBECCA

Well, some people suggest that one way to mitigate the environmental impacts of AI is to limit its use: to not integrate AI into things it doesn’t need to be integrated into. But I just really don’t think that that idea is realistic.

ELIZABETH

Why couldn’t we create regulations around this?

REBECCA

I mean, we could. But first of all, how do you define “need?” Like, who gets to decide what needs AI and what doesn’t need AI? And second of all, if we’ve learned anything about technology and innovation from history, it’s that once that genie is out of the bottle, it’s incredibly difficult to put it back in again.

SIMON

I heard a professor at Cambridge University recently say—and I'm, by the way, I'm not, and he was not, dismissing this—but he was saying, “if it's one to three percent, is that what we should be focusing on for the sake of planet Earth?”

REBECCA

Yeah

SIMON

There are many, many industries that have been really dragging their heels on reducing their carbon emissions, and compute consumption—energy consumption through compute—is probably not the thing that, if we tipped it over, would get us to, you know, Paris Agreement levels of temperature reduction.

REBECCA

Yes, AI models can have a significant impact on carbon emissions and natural resources. But if we were able to snap our fingers and make all these models disappear suddenly, we would still be facing the same climate and environmental issues we are now. Plus if we got rid of all these models, we would lose out on the ways that these models can actually help us address many of the climate and environmental concerns.

Which brings us to the second road in our “choose your own adventure.”

SIMON

An Australian energy company called AGL Energy—they’re largely a thermal power and natural gas generator, but they also use wind and hydroelectric—so they’re a very green energy creator in Australia.

And in recent years, they've seen a massive growth of consumption from supplying 300 megawatts to 5,500 megawatts in only seven years, I think it was.

SIMON

This demand growth that they've had to cater for has seen a massive growth in the data that they're producing inside their business. And we've then layered on top of that AI tools for them to optimize the orientation of their wind turbines, which has resulted in over 200 machines being able to create 30% more power.

And those are—well, I can't say how many dollars that is or how much carbon that's saved from the atmosphere—but you can see there are some key indicators from just one or two examples of AI where I think we could dwarf the actual cost of compute for AI itself.

Though the public media will put out a story which indicates that we should be frightened about the use of AI for some of these things, I'm not a fan of that. I think it's clickbait. And I think it also attracts readership to that kind of subject.

REBECCA

OK, so you were talking about AGL, and I'm glad you brought them up, because I had David Bartolo from AGL on the podcast a couple years ago to talk about grid decarbonization and just what is that going to look like, what is it going to take?

And we touched a little bit on AI and predictive analytics in that conversation. It wasn't the focus of the conversation, but he brought it up, you know, in a similar way that you just did: that for green energy producers in particular, AI can be really useful in helping them get the most out of their assets, right? So with the example you brought up—the wind turbine—is there an orientation that would be more effective, or is there something going wrong with the wind turbine where we need to intervene and fix it before it stops working altogether?

Or—you’ve probably heard this as well—there's a lot of talk about the complexity, the increasing complexity, of balancing the power grid because it has to stay at this very precise level. And when you add renewable resources onto the power grid, the issue is that because they fluctuate so much, it makes it a lot harder to keep it at that stable level. And you can see how AI can absolutely play a role in helping to manage all of that information and making sure things stay stable.

SIMON

My sense of what you said earlier—that, you know, when we get more optimization of a wind farm or a solar PV farm—is that that can only help to accelerate the transition. Because the more renewable energy competes with the fossil-burning tradition, the more it's going to win. Because the reality is they’re easier and cheaper to maintain and set up than a traditional coal-burning plant or something. They are easier to maintain and keep modern. They come with ready-made sensors, so they're full of data for you to manage in a more effective way.

And back to your point about other uses of AI: we've been experimenting with using AI for helping to predict the next few days’ worth of power generated from wind or solar PV.

Because the met data—the weather information, meteorological data—is almost public domain. You know, in many cases around the world that data is available to us to draw into an AI training model and for us to say to the energy provider: “next week you'll get this, this, this: our energy predictions.” And you can run this on almost a real-time basis to get projections for next week's worth of megawatt-hours that’s going to be produced.

Any edge I think you can give to renewable energies is a way to accelerate the energy transition. And AI is poised to do one of those things.
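
As an illustration of the kind of forecasting Simon describes, here is a minimal sketch in Python. The file names, column names and choice of model are all hypothetical: this is not how AVEVA or any particular utility actually builds its forecasts, just the general shape of the idea.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical historical data: hourly weather observations alongside the
# megawatt-hours a wind farm actually produced in each hour.
history = pd.read_csv("wind_farm_history.csv")          # hypothetical file
features = ["wind_speed_ms", "wind_direction_deg", "air_temp_c", "hour_of_day"]

# Train a simple regression model on past weather and past generation.
model = GradientBoostingRegressor()
model.fit(history[features], history["generation_mwh"])

# Public meteorological forecasts for the coming week become the model's input...
forecast = pd.read_csv("met_forecast_next_week.csv")    # hypothetical file
predicted_mwh = model.predict(forecast[features])

# ...and the output is the kind of projection Simon describes handing to the
# energy provider: next week's expected megawatt-hours.
print(f"projected generation next week: {predicted_mwh.sum():,.0f} MWh")
```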

SIMON

One of the stories I've been very fascinated with is how to apply AI specifically to the material science of the electrolyte inside a battery.

REBECCA

Ok. Interesting.

SIMON

So the electrolyte is the material inside a battery which conducts the electricity itself, and it's the piece of the material which is put under stress when it's going through its recharge cycles. So we punch a lot of energy into it and that creates a lot of heat and then it expands and shrinks and that's why you get diminished lifetimes for our batteries.

So there's a lot of battery science going on at the moment, and there's a move from lithium-ion to pure lithium, which is a move towards a different state of battery that's more high-performance. But of course everything is then lithium-oriented, and lithium is, you know, a very finite planetary resource.

REBECCA

So, scientists are looking for electrolytes that perform better and use less lithium.

ELIZABETH

And they’re using AI to do this?

REBECCA

Yeah: sounds like it.

SIMON

It’s one of those lifelong research tasks that many teams around the world have been kind of wrestling with: how best to create a chemical make-up for a battery which is affordable, performant, has good lifetime, etc.

So they set AI against that and it broke down—when it was given its targets, which was, you know, performance and duration and whatnot…

REBECCA

 …and this AI that Microsoft and the Pacific Northwest National Laboratory created looks at all the viable chemical elements and suggests ways to combine them.

And it came up with 32 million possible materials. But then it filtered them down based on their stability and their ability to conduct electricity, and very quickly got down to 800, you know, pretty good candidates.

Then the AI simulated the movements of atoms and molecules in each material to cut down the list to 150 candidates. And once the AI had the proposed 150 candidates, Microsoft used high-performance computing to evaluate which of those would be practical and cost effective.

And that left them with roughly 18 plausible candidates.

ELIZABETH

How long did this whole process take?

REBECCA

Just 80 hours. So, 80 hours to come up with 32 million possibilities and then narrow it down to the 18 most promising.
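
To make that funnel concrete, here is an illustrative sketch of staged screening, with the candidate counts from the episode as comments. The filter functions are invented placeholders; the real Microsoft and PNNL pipeline relies on trained models and physics simulations, not these stand-ins.

```python
# Illustrative staged-screening funnel: apply cheap filters to millions of
# candidates first, and save the expensive evaluations for the survivors.
# The predicates below are placeholders, not the actual Microsoft / PNNL models.

def screen_candidates(candidates,
                      is_stable,             # cheap AI filter: stability
                      conducts_electricity,  # cheap AI filter: conductivity
                      passes_simulation,     # costlier: simulate atomic and molecular motion
                      practical_and_cheap):  # costlier still: HPC practicality and cost check
    stage1 = [c for c in candidates if is_stable(c) and conducts_electricity(c)]  # ~32M -> ~800
    stage2 = [c for c in stage1 if passes_simulation(c)]                          # ~800 -> ~150
    stage3 = [c for c in stage2 if practical_and_cheap(c)]                        # ~150 -> ~18
    return stage3

# Each stage only pays the more expensive evaluation for the survivors of the
# cheaper one, which is how tens of millions of possibilities can be cut to a
# lab-sized shortlist in about 80 hours of compute.
```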

SIMON

And then the laboratory that was actually running that AI tool was asked to commission a number of tests. So they did laboratory synthesizing of some of these materials. And I think they did about six or seven of them.

What it indicated to me was that solving a large number-crunching problem is very natural for AI. It’s very much a landing place now. And the world of research, specifically in the sciences, has just exploded with this, because the science community has always struggled with the amount of time it takes to crunch numbers and to find people with the skill to dive into all of these data sources.

But with an AI, it can just simply be left alone for a few days and run its cycles and come up with different projections.

It showed an alternative path for battery development, which was very exciting of course as part of that story.

REBECCA

So, just to be sure I’m understanding the story you laid out: basically you’re saying that their scientists were doing research into alternative electrolytes that could be used in batteries. And the AI was able to come in and sort of comb through all that data and say, no, it's not even worth looking into this batch, because I can already tell—I'm obviously anthropomorphizing the AI right now—but it narrowed the field for them so that they could target the solutions most likely to be successful. And sodium was one of those—it looks like they ran a test, and it showed promise in some ways and maybe was limited in other ways. But the more salient point is that they don't now have to go through all 37,000 options.

SIMON

Yeah. And the thing as well is, when you think of a small-scale laboratory operation, they can be so much more targeted with their funding. So, you know, if they have a small amount of resources to spend on testing and you've got 37,000 options, you're just going to have to do a lot of desktop study in order to come up with primary candidates. Whereas the system can come up with primary candidates.

It's a bit about failing fast. It's that whole idea that, you know, the quicker you can find out that something doesn't work, the more money you've actually saved. It’s kind of an invisible process. But yeah, I thought it was an interesting other application of AI that I've seen.

REBECCA

So it sounds like the answer to the question—if the question were something like, “Is AI ultimately going to help with sustainability or be sort of a neutral contributor or maybe even be counterproductive?”—the answer is really, kind of, it depends. It depends on what type of AI we're talking about. It depends on the context in which it's being used.

So with the AGL example—if AI is being used to increase and optimize the operation of renewable electricity sources, or to help balance grids that are taking on more renewable sources—that is clearly going to have a net positive benefit, because the ultimate goal is decarbonizing our electricity production. This meta-concern of how much energy and electricity AI requires isn't as pressing if we can decarbonize the power to begin with. So in that kind of context, clearly a benefit.

Then there's a lower sort of level—I don't know if I want to qualify it that way—but there's, say, a smaller-scale kind of benefit that is more specific to a company, where they might get a benefit to their bottom line by increasing the efficiency of their process. But it’s not necessarily moving the needle for sustainability globally.

What I'm wondering—and maybe this is sort of a touchy or controversial way to phrase this—is whether, in other applications, say a manufacturing plant, for example, where they're making widgets and they're using AI to try to save some amount of energy in that process, there's a way to look at that as kind of a Band-Aid over the larger problem of decarbonization. Because for as long as the majority of our electricity comes from carbon sources, saving a margin of electricity in some manufacturing plant—while great for that company, if it's a cost savings for them and it's going to reduce their resource use and help them meet sustainability goals—when you talk about it on the global level, that's not really the problem. Right? It's decarbonizing the energy sources.

SIMON 

I feel the same way as you. I see lots of organizations that are looking to run their businesses more efficiently—but that was the same 50 years ago. That was the same 100 years ago. The ethos behind a lot of company success is that they are just more efficient than the next company.

And it's almost refusing the opportunity to think holistically about the challenge.

REBECCA

Right.

SIMON

It's like: I've got my machine, it's my current operating model. I've got my staff, I've got my skills, I've got my, you know, my process of making money as a business. And I'm being challenged now to reduce my carbon footprint. So I'll make that bit a bit more efficient, and that bit a bit more efficient, and I'll be able to reduce the workforce there, which will make me more efficient—and those are all the wrong approaches for tackling the planet's biggest challenge ever.

REBECCA

So this issue of what to do about AI: how should it be used? How should it not be used? Should it be regulated? Should it be banned? What about the fact that critical infrastructure like power grids and energy producers and water utilities and transportation providers and hospitals, and so on, already rely on AI and have been relying on it for decades at this point? All of these questions are complicated. They're multifaceted. And there aren't easy answers. If anything, I think we've demonstrated that in this episode.

And we haven't even touched on the AI alignment problem. But I guess when I think about this, where I find footing, say, is in something that Kevin Kelly said back in his talk, his TED talk, in 2016. And I'll just read you a couple of things he said.

So, one thing he said is:

“It's only by embracing it”—it being AI—”that we can actually steer it.” And I really agree with that, because I don't think AI is going anywhere.

And then I'll read you one other quote that I thought was interesting as well, which is:

“What we're trying to do is make as many different types of thinking as we can. We’re going to populate the space of all the different types or species of thinking. And there are actually maybe some problems that are so difficult in business and science”—and I, as Rebecca, would add, the industrial space—”that our own type of human thinking may not be able to solve them alone. We may need a two-step program, which is to invent new kinds of thinking that we can work alongside of to solve these really large problems, like dark energy or quantum gravity.”

So. We will leave you today on that note.

Make sure you follow us on your preferred podcast platform, or subscribe to us on YouTube. And we'll see you next time with more stories from Our Industrial Life.


 

Resources

Watch Kevin Kelly’s TED talk from 2016 about AI spurring the next industrial revolution.

See the chart from Our World in Data of global greenhouse gas emissions by sector that Rebecca mentions in the episode.

Read the BBC report on the best estimate so far of emissions from just AI alone. Check out the Digiconomist blog by the study’s author, Alex De Vries as well.

Learn how Microsoft is following through on its sustainability commitments with a 10.5GW renewable power purchase agreement with Brookfield Asset Management, reportedly the largest of its kind to be signed between two corporate entities.

Listen to the Our Industrial Life episode on how AGL Energy uses AI to balance sustainable energy sources on its grid.

Learn more about how Microsoft and Pacific Northwest National Laboratory used AI and high-performance computing (HPC) to discover promising new battery electrolytes.

See the transcript above for more links to facts, figures and stories mentioned in the episode.

 

If you enjoyed this episode...

Listen and subscribe to Our Industrial Life at Apple Podcasts, Spotify, YouTube, Google Podcasts, and RSS.


Rate and review the show—If you like what you’re hearing, be sure to head over to Apple Podcasts and click the 5-star button to rate the show. And, if you have a few extra seconds, write a couple of sentences and submit a review to help others find the show.
