An AI Executive on Why He Thinks the Technology Can Save the Environment, Not Destroy It

November 3, 2025

Not AI generated. Photo: NOAA/Unsplash


The Inertia

Artificial intelligence is widely painted as an environmental burden. The AI race has fueled a boom in resource-intensive data centers, raising alarms about rising emissions, water use, and energy demand. Many fear AI will accelerate climate change rather than help solve it.

Alistar Campbell sees things differently. Campbell is vice president at Dotmatics, a scientific software company recently acquired by the tech giant Siemens. Dotmatics built Luma, an AI-powered platform designed to enhance research and innovation. The company claims its technology can help make the world a “healthier, cleaner, and safer place to live.”

“We are working especially with companies in the environmental sector to help them design new products and use AI as a tool in their armory,” said Campbell. “We take generic AI and then supercharge it with scientific smarts.”

Luma helps researchers interpret their data, streamlining workflows and analysis – a one-of-a-kind tool, according to Campbell. Unlike general-purpose AI models that scrape the internet for data, Luma is trained on clients’ own proprietary datasets and is used across industries such as pharmaceuticals, biotech, life sciences, and biorenewables.

Campbell highlights two of Luma’s current applications that focus directly on sustainability: helping scientists develop sodium-ion batteries that reduce reliance on lithium mining, and researching enzymes and catalysts that break down plastics. Without AI, Campbell says, such innovation would take far longer.

Still, he acknowledges the AI criticism. “Having a healthy skepticism of new things is a good thing for all of us,” he said.

The doubts aren’t unfounded. AI is driving demand for data centers that consume massive amounts of power to run servers and water to keep them cool. MIT highlights that AI was a key driver in doubling North American data center power requirements between 2022 and 2023, and that a single ChatGPT query requires five times more energy than a typical web search. By 2028, data centers are projected to account for 12 percent of U.S. electricity consumption – many of them located in regions already strained for water.

Campbell believes the balance will shift as the technology matures. Efficiency gains and smarter energy management, he argues, can tip AI’s net impact toward the positive. He points to Dotmatics’ use of the on-demand computing model, where servers shut down when idle – as opposed to buying allocated server space that runs 24/7 – as one example of how the industry can reduce its footprint.
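For readers who want a feel for that difference, here is a rough back-of-envelope sketch in Python. The workload hours and power-draw figures are illustrative assumptions, not numbers from Dotmatics or Campbell; the point is only that a server that shuts down when idle stops drawing power for most of the day, while an always-on allocation keeps consuming energy around the clock.

```python
# Back-of-envelope comparison: on-demand compute vs. an always-on server.
# All numbers below are illustrative assumptions, not Dotmatics' actual figures.

ACTIVE_HOURS_PER_DAY = 6   # hypothetical: hours the workload actually runs
ACTIVE_POWER_KW = 0.5      # hypothetical: power draw while computing (kW)
IDLE_POWER_KW = 0.2        # hypothetical: power draw while sitting idle (kW)

# On-demand model: the server only runs (and draws power) while working.
on_demand_kwh = ACTIVE_HOURS_PER_DAY * ACTIVE_POWER_KW

# Always-on allocation: full power while working, idle power the rest of the day.
always_on_kwh = (ACTIVE_HOURS_PER_DAY * ACTIVE_POWER_KW
                 + (24 - ACTIVE_HOURS_PER_DAY) * IDLE_POWER_KW)

print(f"On-demand: {on_demand_kwh:.1f} kWh/day")
print(f"Always-on: {always_on_kwh:.1f} kWh/day")
print(f"Savings:   {1 - on_demand_kwh / always_on_kwh:.0%}")
```

Under these made-up numbers, the idle machine roughly doubles the daily energy use, which is exactly the kind of waste the on-demand model is meant to eliminate.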

History offers some precedent. A 2024 New York Times analysis compared today’s AI boom to the cloud-computing surge of the early 2000s. Even though data centers’ computing output increased sixfold between 2010 and 2018, their energy use rose only six percent, thanks to advances in efficiency. Campbell thinks AI will follow a similar trajectory.
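The arithmetic behind that precedent is simple enough to check. This minimal sketch uses only the two figures cited above (a sixfold rise in computing output against a six percent rise in energy use); the variable names are mine, chosen for illustration.

```python
# Implied efficiency gain from the figures cited above:
# computing output grew ~6x from 2010 to 2018, while energy use rose only ~6%.

output_growth = 6.0    # computing output in 2018 relative to 2010
energy_growth = 1.06   # energy use in 2018 relative to 2010

# Energy required per unit of computing, 2018 vs. 2010.
energy_per_unit = energy_growth / output_growth

print(f"Energy per unit of computing in 2018: {energy_per_unit:.0%} of the 2010 level")
print(f"Implied efficiency improvement: {1 - energy_per_unit:.0%}")
```

In other words, by 2018 each unit of computing needed less than a fifth of the energy it did in 2010, which is the kind of efficiency curve Campbell is betting AI infrastructure will follow.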

“Some parts of [AI] are overhyped – there will be areas that will probably not be successful,” said Campbell. “But if we use it right, if we put it in the hands of the scientists, it becomes a scientist co-pilot to augment their own scientific knowledge and research. I think it will be super successful in those spaces.”

For Campbell, the key is helping the public better understand how AI is used and what it can (and can’t) do. He doesn’t believe it will ever make scientists obsolete.

“You need human oversight to make sure the output isn’t a hallucination,” he said. “I don’t think it’s ever going to be fully autonomous AI. We have to understand the areas where it’s actually having a positive impact.”