The environmental impact of artificial intelligence: what we're not seeing
The environmental impact of artificial intelligence is often told simply as a story of efficiency and sustainability. In reality, AI relies on a physical infrastructure that consumes enormous amounts of energy, water, and raw materials, with costs that are increasingly difficult to ignore.
Large generative AI models, data centers, and specialized hardware require industrial-scale resources. Every calculation, every prompt, every new model version has a specific environmental cost, which must be considered throughout the entire lifecycle: hardware production, training, inference, and disposal.
Understanding these costs is also essential for those who work in digital marketing and marketing automation. Companies that adopt AI in their processes – including communication on channels like WhatsApp Business – must start integrating computational responsibility criteria into their strategies, combining innovation, customer experience, and sustainability.
The environmental impact of artificial intelligence: the two faces of AI
To understand the environmental impact of artificial intelligence, let's imagine two opposing but closely related scenarios. In the first, an AI system manages data from thousands of sensors on a power grid in real time, optimizing the distribution of renewable energy and reducing greenhouse gas emissions with greater precision than human operators could achieve.
In the second, the data center powering that same AI consumes millions of liters of water a day to cool thousands of processors, drawing water resources from already stressed areas. These two images demonstrate the dual nature of AI: on the one hand, a technology enabling sustainability, on the other, an energy- and water-guzzling infrastructure with an exponentially growing environmental footprint.
AI is often perceived as "immaterial" because it lives in the cloud. But the cloud is made of wires, servers, silicon, and rare metals, distributed across vast computing campuses. The narrative of digital efficiency risks obscuring a simple fact: the increase in computational demand far outstrips improvements in hardware efficiency.
For this reason, several researchers are proposing a paradigm shift: moving from the myth of efficiency alone to the logic of "computational responsibility." This means evaluating AI systems not only for accuracy and speed, but also for their physical and climate costs throughout their entire life cycle.
AI between sustainability and hidden costs: energy, emissions, hardware
Before analyzing costs, it's essential to recognize that AI can be a powerful enabler of sustainability. Google, for example, has shown that adopting DeepMind algorithms to optimize its data center cooling systems can reduce cooling energy consumption by up to 40%, thanks to dynamic adjustments based on real-time data.
In global supply chains, companies like Unilever are using AI to monitor the origin of critical raw materials, such as palm oil, integrating satellite imagery, IoT data, and blockchain to reduce waste, deforestation, and violations of environmental standards. Similarly, in the energy sector, companies like AES are using predictive models for wind turbine maintenance, achieving 90% accuracy in predicting failures and significant savings: up to $100,000 per incident and 3,000 maintenance trips avoided.
The flip side of this coin is the race to ever-larger models. Training a single large language model can require as much electricity as 100 US homes consume in a year. The environmental impact of artificial intelligence grows with the complexity of the model: for GPT-4, for example, over 20,000 tons of CO₂eq are estimated to have been emitted for a single training run.
According to the International Energy Agency (IEA), data centers currently account for between 1% and 2% of global electricity demand, with a potential increase of around 80% between 2022 and 2026, driven largely by generative AI. GPU power has risen from around 400 watts per chip in 2022 to a projected figure of around 1,200 watts for the 2025 generations, multiplying the power requirements of computing clusters.
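To get a feel for what the jump in per-chip power means at facility scale, here is a rough back-of-the-envelope sketch; the cluster size and the PUE overhead factor are illustrative assumptions, not figures from the article.

```python
# Illustrative estimate of how per-GPU power translates into facility power.
# NUM_GPUS and PUE are assumptions for the example, not reported figures.

GPU_POWER_W_2022 = 400    # approximate power per GPU, 2022 generation
GPU_POWER_W_2025 = 1200   # projected power per GPU, 2025 generation
NUM_GPUS = 16_000         # assumed size of a large training cluster
PUE = 1.3                 # assumed Power Usage Effectiveness (cooling/overhead)

def facility_power_mw(watts_per_gpu: float) -> float:
    """Total facility power in megawatts, including cooling overhead."""
    return watts_per_gpu * NUM_GPUS * PUE / 1e6

print(f"2022-class cluster: {facility_power_mw(GPU_POWER_W_2022):.1f} MW")
print(f"2025-class cluster: {facility_power_mw(GPU_POWER_W_2025):.1f} MW")
```

Under these assumptions, the same number of chips goes from roughly 8 MW to roughly 25 MW of continuous demand, which is why per-chip power matters so much at scale.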
The energy mix is also crucial. A model that consumes 27,500 MWh for training can generate "only" 240 tCO₂eq if powered by renewable sources, while models trained on fossil fuels can have climate footprints orders of magnitude higher. Relative efficiency (FLOPS per watt) is not enough: the key metric becomes absolute energy consumption and total emissions generated.
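The relationship between consumption and climate footprint is a simple multiplication by the carbon intensity of the electricity used, which is why the energy mix dominates the result. A minimal sketch, using the 27,500 MWh figure above and illustrative carbon-intensity values:

```python
# Emissions = energy consumed x carbon intensity of the grid.
# Carbon-intensity values below are rough, illustrative assumptions.

TRAINING_ENERGY_MWH = 27_500   # training energy from the example above

CARBON_INTENSITY_KG_PER_MWH = {
    "mostly renewables/hydro": 9,    # ~9 kgCO2eq/MWh, consistent with ~240 tCO2eq
    "average grid mix": 400,
    "coal-heavy grid": 900,
}

for mix, intensity in CARBON_INTENSITY_KG_PER_MWH.items():
    emissions_t = TRAINING_ENERGY_MWH * intensity / 1000  # tonnes of CO2eq
    print(f"{mix}: {emissions_t:,.0f} tCO2eq")
```

The same training run moves from a few hundred to tens of thousands of tonnes of CO₂eq depending only on where the electricity comes from.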
The hardware life cycle must also be considered. Studies such as the life cycle assessment of BLOOM show that emissions embodied in server and GPU manufacturing can account for more than 20% of a model's overall emissions. Rapid component obsolescence leads to large amounts of electronic waste (e-waste) and increased demand for rare earth metals, with social and environmental impacts in mining regions.
A Digital Desert: Water, Data Centers, and Environmental Justice
The environmental impact of artificial intelligence is not just about energy, but also about water. Data centers use huge quantities of water, often potable, for evaporative cooling: the water absorbs heat from the servers and is then dispersed into the atmosphere as vapor. Consumption is both direct (cooling systems) and indirect (water used to generate electricity, especially from fossil or nuclear sources).
With the rise of AI, Google has reported a 20% increase in water consumption, Microsoft a 34% increase. The problem is highly local: many data centers are being built in regions already exposed to water stress. In Arizona's Sonoran Desert, a Microsoft facility in Goodyear is designed to consume up to 52 million gallons of drinking water per year, enough to meet the needs of 670 families. In Mesa, a Google data center can use up to 17 million cubic feet of water per year, while the state restricts new residential construction due to water scarcity.
In Aragon, Spain, a region where 75% of the surface area is at risk of desertification, new Amazon data centers are authorized to use 755,720 cubic meters of water per year, equivalent to irrigating 233 hectares of corn. Amazon has also requested a 48% increase in water consumption for existing facilities, citing rising global temperatures. Local groups, such as the collective "Tu Nube Seca Mi Río," denounce the direct conflict between the cloud and agriculture.
Projections for 2028 indicate that water consumption by AI data centers could exceed 1 trillion liters per year, an increase of more than 1000% compared to 2024. This scenario raises questions of environmental justice: the global benefits of AI – including business and marketing innovation – often come at a cost, borne by vulnerable communities.
The impact of a single query shouldn't be underestimated either. Data updated to 2025 shows that a request to heavy-duty models like GPT-4 or Claude 3 Opus can generate up to 0.5 grams of CO₂eq, consuming over 100 mL of water per query in some energy-intensive systems, while lighter models like Gemini Pro or GPT-3.5 are an order of magnitude lower.
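Scaled to marketing volumes, these per-query figures add up quickly. A hedged estimate using the per-query values above; the campaign size is an illustrative assumption:

```python
# Rough aggregate footprint of an AI-assisted campaign, using the per-query
# figures above. The number of queries is an assumption for the example.

QUERIES = 500_000               # assumed AI calls (e.g. one per personalized message)
CO2_G_PER_QUERY_HEAVY = 0.5     # heavyweight model, gCO2eq per query
WATER_ML_PER_QUERY_HEAVY = 100  # mL per query in an energy-intensive system
CO2_G_PER_QUERY_LIGHT = 0.05    # lighter model, roughly an order of magnitude lower

print(f"Heavy model: {QUERIES * CO2_G_PER_QUERY_HEAVY / 1000:.0f} kgCO2eq, "
      f"{QUERIES * WATER_ML_PER_QUERY_HEAVY / 1e6:.0f} m3 of water")
print(f"Light model: {QUERIES * CO2_G_PER_QUERY_LIGHT / 1000:.0f} kgCO2eq")
```

Half a million calls to a heavyweight model add up to roughly 250 kg of CO₂eq and 50 cubic meters of water, versus about a tenth of the emissions with a lighter model.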

Greenwashing, Transparency, and Computational Accountability
Growing attention to the environmental impact of artificial intelligence has also led to AI greenwashing practices. Some companies emphasize "green" AI use cases, such as reforestation projects or electricity grid optimization, to divert attention from the emissions, water consumption, and e-waste generated by their computing infrastructure.
The parallel with "Dieselgate" is clear: just as "clean diesel" cars used software to cheat on emissions tests, today aggressive marketing campaigns can mask the true impact of data centers. The root of the problem is a lack of transparency: many big tech companies fail to publish granular and verifiable data on energy, water, and the energy mix of individual plants, citing the proprietary nature of the information.
Ethical responsibility starts with measurement. Shared standards for reporting the life cycle of models and infrastructures, as well as independent audits, are needed. In this context, AI itself can help unmask greenwashing: automated analysis systems can cross-reference energy reports, satellite data, news, and ESG documentation to highlight inconsistencies between claims and actual performance.
At the same time, the Green AI movement is emerging, proposing concrete tools for integrating sustainability into system design: pruning, quantization, and knowledge distillation techniques to reduce model weight and power consumption; data-centric design to use smaller, high-quality datasets; and specialized hardware and data centers with closed-loop liquid cooling and waste heat recovery.
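As a concrete illustration of one of these techniques, post-training dynamic quantization in PyTorch stores the weights of linear layers as 8-bit integers, shrinking memory footprint and typically lowering inference cost. A minimal sketch; the toy network stands in for a much larger model and is illustrative only:

```python
import os
import torch
import torch.nn as nn

# Toy network standing in for a much larger model (illustrative only).
model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.ReLU(),
    nn.Linear(3072, 768),
)

# Post-training dynamic quantization: Linear weights are stored as int8,
# activations are quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_mb(m: nn.Module, path: str = "tmp_model.pt") -> float:
    """Serialized size of a model in megabytes."""
    torch.save(m.state_dict(), path)
    size = os.path.getsize(path) / 1e6
    os.remove(path)
    return size

print(f"fp32: {size_mb(model):.1f} MB  ->  int8: {size_mb(quantized):.1f} MB")
```

The roughly 4x reduction in stored weights is the kind of gain that, multiplied across millions of inferences, translates into measurable energy savings.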
Sustainable Human-Computer Interaction also suggests "energy-aware" design principles: interfaces that make the environmental cost of operations visible, promoting more energy-efficient choices whenever possible. In business, this can translate into dashboards that show the footprint of AI-based campaigns, segmentations, or automations, encouraging marketing teams to use computational resources more consciously.
The Environmental Impact of Artificial Intelligence: Implications for Marketing and Business
The environmental impact of artificial intelligence is not just a technical issue, but also a strategic lever for marketing and business. Brands that adopt generative AI for campaigns, customer service, or marketing automation are increasingly under scrutiny from customers, investors, and regulators sensitive to ESG issues.
Today, many digital marketing activities—from content personalization to multichannel campaigns, to chatbots on WhatsApp Business—rely on cloud-based AI models. Every advanced segmentation, every predictive customer analysis, every algorithm-driven mass mailing involves energy and water consumption. Failing to measure these impacts exposes you to reputational risks and accusations of inconsistency with sustainability commitments.
For companies, integrating AI environmental metrics into their strategies offers several advantages:
- ESG brand positioning: demonstrating attention to the lifecycle of AI systems (models, data centers, workflows) strengthens the credibility of green initiatives.
- Cost efficiency: Lightweight models and optimized automations consume fewer cloud resources, reducing operational expenses.
- Responsible customer experience: designing intelligent yet "sober" interactions, such as chatbots that optimize the number of messages, improves the experience and reduces computational waste.
In customer service chat, for example, a company can choose to use small or mid-range models for the majority of frequent requests, reserving the more demanding models for complex cases. Similarly, in a marketing automation strategy on WhatsApp Business, it's possible to limit redundant processing, unnecessary sending, and hyper-fragmented segmentation that increases computations without any real return.
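One way to implement this kind of tiering is a simple router that sends frequent, low-complexity requests to a lightweight model and escalates only the rest. A hedged sketch; the model names and the complexity heuristic are illustrative assumptions, not references to any specific provider:

```python
# Illustrative model-tiering router: cheap model for frequent/simple requests,
# heavier model only for complex cases. Names and heuristic are placeholders.

FAQ_KEYWORDS = {"opening hours", "price", "shipping", "order status", "refund"}

def pick_model(message: str) -> str:
    text = message.lower()
    is_faq = any(keyword in text for keyword in FAQ_KEYWORDS)
    is_short = len(text.split()) < 30
    if is_faq or is_short:
        return "small-model"   # lightweight, low energy per query
    return "large-model"       # reserved for complex, high-value cases

print(pick_model("What are your opening hours?"))  # -> small-model

long_request = (
    "My integration stopped syncing contacts after the last update, "
    "messages are arriving out of sequence on two devices, and I need "
    "to reconcile the conversation history with our CRM before the "
    "weekend campaign goes out, ideally without losing any labels."
)
print(pick_model(long_request))  # -> large-model
```

In production the heuristic would be replaced by intent classification or confidence thresholds, but the principle is the same: reserve expensive computation for the cases that justify it.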
Finally, adopting transparency standards on AI use in campaigns—including estimated energy impact—can become a differentiating factor, especially in highly environmentally sensitive B2C sectors such as food, fashion, retail, and tourism.
How SendApp Can Help With Artificial Intelligence's Environmental Impact
In this scenario, platforms like SendApp can play a concrete role in managing the environmental impact of artificial intelligence connected to WhatsApp Business communication. The goal isn't to abandon AI, but to use it more efficiently and measurably within messaging, support, and sales flows.
With SendApp Official (WhatsApp Official API), businesses can centralize and optimize all WhatsApp interactions in a single infrastructure, reducing redundancies and wasteful processes. Intelligent conversation routing, controlled use of templates, and centralized campaign management help limit unnecessary queries to AI models, focusing computing power where it generates the most value.
SendApp Agent allows you to organize the work of support and sales teams by combining human agents and automation. By precisely defining which requests are handled by bots and which require human intervention, the company can prioritize lighter models for FAQs and activate advanced models only when necessary, reducing the overall load on AI systems.
With SendApp Cloud, it is possible to design more sustainable marketing automation flows on WhatsApp, thanks to:
- Targeted segmentations that reduce unnecessary message sending and redundant processing.
- Smart Triggers based on truly relevant events, avoiding hyper-complex workflows that consume resources without increasing conversion.
- Advanced reporting to measure campaign performance, so as to optimize and streamline less effective flows.
These choices—apparently purely operational—have a direct impact on the number of computational operations required by the underlying AI systems. Fewer unnecessary calculations mean less energy, less water, and less hardware needed to handle peak loads.
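To make the idea concrete, here is a generic sketch of how a targeted segmentation plus a relevance trigger cut the number of sends, and therefore the number of AI calls, before a campaign runs. The field names, thresholds, and data model are invented for the example and do not represent SendApp's actual API:

```python
# Generic illustration of segmentation + trigger filtering before sending.
# Field names and thresholds are invented; this is not SendApp's API, only
# the logic of "fewer sends -> fewer AI calls -> smaller footprint".

from datetime import datetime, timedelta

contacts = [
    {"id": 1, "opted_in": True,  "last_interaction": datetime.now() - timedelta(days=5),   "cart_value": 120},
    {"id": 2, "opted_in": True,  "last_interaction": datetime.now() - timedelta(days=200), "cart_value": 0},
    {"id": 3, "opted_in": False, "last_interaction": datetime.now() - timedelta(days=2),   "cart_value": 80},
]

def should_send(contact: dict) -> bool:
    """Send only to opted-in, recently active contacts with a relevant trigger."""
    recent = datetime.now() - contact["last_interaction"] < timedelta(days=90)
    relevant_trigger = contact["cart_value"] > 0  # e.g. abandoned cart
    return contact["opted_in"] and recent and relevant_trigger

audience = [c for c in contacts if should_send(c)]
print(f"Sending {len(audience)} of {len(contacts)} messages")  # -> 1 of 3
```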
For companies looking to integrate AI, automation, and sustainability into their communications strategies, SendApp can be the ideal partner: a platform for efficiently orchestrating WhatsApp Business, reducing the waste of digital resources and maximizing the return on every interaction.
If you want to design communication flows on WhatsApp that combine performance, customer experience, and attention to the environmental impact of artificial intelligence, you can request a personalized consultation or start a free trial of SendApp. It's the first step toward a truly intelligent messaging strategy: effective for your business, easier on the planet.