Big tech isn't acknowledging AI's environmental impact. Probably on purpose.
Following on from Earth Week, we’ve been reflecting at JECO on the importance of being transparent about the environmental impact of our industry. And let’s be honest: tech has an emissions problem, which is becoming more salient with the growth of AI. It’s a big topic, so grab a cuppa and settle in.
AI has undoubtedly transformed the global tech industry in recent years. Perhaps the clearest evidence of the AI shift is the rapid global investment in AI infrastructure - the UK is investing heavily in AI growth zones to boost domestic datacentre construction, the US is estimated to have tripled datacentre construction spending in the three years preceding 2025, and the EU is targeting a tripling of datacentre capacity within the next five to seven years. Major global economies are pouring huge amounts of money and resources into AI infrastructure and show no signs of slowing down.
But we think it’s irresponsible to talk about AI without acknowledging its very real environmental impact. Goldman Sachs forecasts global datacentre power demand to rise 165% by 2030 from 2023 levels, significantly increasing overall global electricity demand - with AI one of the major drivers of that growth. Yes, we’re seeing growth of cleaner power sources, but the exponential rise of the AI industry is outpacing this, meaning fossil fuels will end up filling the demand gap. The IEA estimates that fossil fuels will power over half of datacentre energy demands through to 2030, generating an estimated 300 million tonnes CO₂ annually as a result. And that’s before we even factor in water consumption from datacentre cooling, which is another poorly disclosed cost. Recent estimates suggest AI systems’ annual water usage could already rival annual global bottled water consumption (though the lack of operator-level disclosure makes precise water consumption estimations difficult).
It’s important to highlight who’s actually footing the bill for this skyrocketing energy demand too. In many countries, the financial cost of powering datacentres is being partially subsidised through consumer electricity prices. Ordinary households like yours are, in effect, helping to bankroll the infrastructure behind the expansion of the AI industry. The global share can make the issue look deceptively small, but the sharper problem is local: AI datacentres concentrate enormous loads in specific grid regions, creating bottlenecks in transmission, transformers, generation capacity and grid connection queues. In those markets, new AI demand can directly affect energy prices, grid reliability and the marginal generation source used to meet peak load. This hidden cost transfer rarely features in discussions around AI’s economic model, and it makes the incredibly high emissions associated with AI power demands even more concerning.
Unfortunately, this lack of transparency extends to datacentre emissions themselves, particularly as AI-driven demand threatens the emissions commitments previously made by companies and governments alike. It was recently exposed that a lobby group comprising members from big tech companies like Amazon, Google and Meta secured a provision in EU law preventing public access to individual EU datacentre emissions figures under the guise of this being “commercially sensitive” information - a move expected to breach EU transparency rules. Net zero commitments are quietly disappearing from company platforms as AI-driven emissions surge, with actual emissions likely far beyond the figures being officially reported.
Concealment of AI’s emissions is only part of the problem - the way that emissions are reported is also flawed. Many emissions disclosures rely on reporting methods which don’t reflect the hourly, location-based electricity actually consumed by AI workloads, and hardware supply chain emissions and AI-specific workload splits are often missing entirely. Even where cloud providers are improving their sustainability tooling, customers are often still unable to attribute their emissions at the level of granularity that matters - for example, per product, workflow or customer feature - which makes AI emissions extremely hard to govern. Closer to home, core UK government departments were very recently found to hold vastly divergent forecast figures for domestic datacentre power consumption by 2030, and even revised their published figures by huge amounts after being approached for comment by journalists.
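To make the reporting gap concrete, here is a minimal Python sketch (using entirely illustrative numbers, not real grid data) of the difference between matching each hour’s consumption to that hour’s grid carbon intensity and applying a single period-average intensity:

```python
# Sketch: why period-average accounting can misstate emissions.
# All figures below are illustrative, not real grid data.

# Hourly electricity use of a hypothetical AI workload (kWh), and the
# grid's carbon intensity (gCO2/kWh) in each of those same hours.
hourly_kwh = [120, 120, 400, 400, 400, 120]        # heavy midday batch
hourly_intensity = [200, 200, 520, 520, 520, 200]  # fossil-heavy peak hours

# Hourly, location-based: match each hour's use to that hour's intensity.
hourly_based = sum(kwh * g for kwh, g in zip(hourly_kwh, hourly_intensity))

# Average-based: total use multiplied by the period's average intensity.
avg_intensity = sum(hourly_intensity) / len(hourly_intensity)
average_based = sum(hourly_kwh) * avg_intensity

print(f"hour-matched:  {hourly_based / 1000:.1f} kgCO2")   # 696.0 kgCO2
print(f"average-based: {average_based / 1000:.1f} kgCO2")  # 561.6 kgCO2
```

When heavy workloads cluster in fossil-heavy hours, the average-based figure understates the real footprint - which is exactly why hourly, location-based data matters for AI workloads.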
The pattern is consistent and undeniable - the harder it becomes to see the true environmental impact of AI, the easier it is for stakeholders to avoid accountability.
Of course, AI does have its benefits. In our own space, AI-generated code is accelerating software creation: increasing developer productivity, reducing costs and making development more accessible for marginalised groups who haven’t historically been granted the same access to the tech industry.
But for every silver lining, there’s a cloud (can we play this off as a compute pun?!) - and AI's benefits bring their own hidden costs too. Let’s consider the software development space. AI coding agents are now so accessible that thousands of developers and teams worldwide are independently building near-identical pieces of software: slightly different implementations of the same functionality, often of lower quality than established open source alternatives. When AI coding agents make it cheaper and easier to regenerate software than to understand, reuse or improve existing software, we risk eroding the collaborative, cumulative culture of open source sharing that has historically driven so much of our industry’s best work. Plus, this practice creates a new form of technical waste: duplicated codebases, duplicated dependencies, duplicated CI runs, duplicated hosting and duplicated maintenance. Not great from an emissions perspective.
With governments and big tech firms continuing to spearhead AI expansion, opting out of AI use isn’t a realistic option for most organisations that want to remain competitive. And from a wider perspective, AI can help optimise grids, logistics, buildings, code, hardware utilisation and scientific discovery too - all things which can bring real environmental benefits. The point is that those benefits are not automatic. If AI is deployed without measurement, workload discipline and efficiency targets, the infrastructure cost arrives immediately, while the environmental benefits remain hypothetical. So, we need to do everything within our own power to reduce AI’s environmental impact - and there are a few primary approaches to this.
Software efficiency. Software runs on datacentre servers which consume electricity to process instructions. AI is making software faster to generate, but not necessarily faster, leaner or cheaper to run. Inefficient code means more CPU time, more memory pressure, more storage and network I/O, more idle-but-provisioned capacity - all of which translates to higher hardware usage and electricity consumption. Efficiency at the code level is one of the most direct levers available to reduce the energy footprint of the software we're all building and using every day.
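As a tiny illustration of a code-level lever, the two functions below (hypothetical names, synthetic data) produce identical results, but the second does dramatically less CPU work per call:

```python
# Sketch: a small code-level efficiency lever. Both functions do the same
# job; the second avoids quadratic work, and at datacentre scale less CPU
# time means less energy. Names and data are illustrative.

def flag_known_users_slow(events, known_users):
    # list membership is O(n) per check, so this is
    # O(len(events) * len(known_users)) overall
    return [e for e in events if e in known_users]

def flag_known_users_fast(events, known_users):
    # one O(n) set build up front, then O(1) membership checks
    known = set(known_users)
    return [e for e in events if e in known]

events = [f"user{i % 1000}" for i in range(4000)]
known_users = [f"user{i}" for i in range(0, 1000, 2)]

# identical output, a fraction of the instructions executed
assert flag_known_users_slow(events, known_users) == flag_known_users_fast(events, known_users)
```

This is a toy case, but the same principle - profile, find the hot path, remove redundant work - scales to the server-side workloads where the electricity is actually being spent.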
Smarter AI use. Not every problem needs AI to solve it, not every workflow benefits from AI involvement, and not every AI model is appropriate for the task at hand. One of the most impactful things organisations can do is develop better frameworks for identifying where AI actually adds value, and where it doesn't. Technical discipline matters here too - for example, simple tasks can often be routed to smaller models, and repeated outputs can be cached rather than regenerated. We should be using AI more selectively and intentionally, and measuring whether AI actually improves a workflow enough to justify its cost.
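A minimal sketch of those two disciplines - routing and caching - might look like this; `small_model` and `large_model` are hypothetical stand-ins rather than any real API, and the routing heuristic is deliberately crude:

```python
# Sketch: route simple requests to a smaller model, and cache repeated
# outputs instead of regenerating them. The "models" here are hypothetical
# stand-ins, not a real provider API.

from functools import lru_cache

def small_model(prompt: str) -> str:
    return f"[small] {prompt[:20]}"

def large_model(prompt: str) -> str:
    return f"[large] {prompt[:20]}"

@lru_cache(maxsize=4096)            # identical prompts are computed once
def answer(prompt: str) -> str:
    # crude routing heuristic: short, single-line prompts go to the
    # smaller (cheaper, lower-energy) model
    if len(prompt) < 200 and "\n" not in prompt:
        return small_model(prompt)
    return large_model(prompt)

answer("What is our refund policy?")   # computed, by the small model
answer("What is our refund policy?")   # served from cache, no model call
print(answer.cache_info().hits)        # -> 1
```

Real routing decisions deserve better heuristics than prompt length, of course - the point is that the decision should be deliberate rather than defaulting everything to the largest model available.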
Cutting digital waste. Consider how many times a single piece of data (your emails, your documents, etc.) is being ingested, processed and re-processed by different AI services across different platforms. Unnecessary ingests, embedding passes, summarisation runs, duplicate models, you name it - all of these cycles consume energy and have a real compute cost. Organisations that understand how their data flows through AI systems are better placed to reduce unnecessary duplication and the energy waste that comes with it.
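One simple pattern for avoiding repeat work is keying expensive processing by a content hash; in this sketch, `embed` is a hypothetical stand-in for any costly step such as an embedding or summarisation pass:

```python
# Sketch: skip re-processing data a pipeline has already seen, keyed by a
# content hash. `embed` stands in for any expensive AI step (embedding,
# summarisation, ingestion); names and data are illustrative.

import hashlib

_processed = {}   # content hash -> cached result
calls = 0

def embed(text):
    global calls
    calls += 1                      # stand-in for a costly model call
    return [float(len(text))]

def embed_once(text):
    key = hashlib.sha256(text.encode()).hexdigest()
    if key not in _processed:       # pay the compute cost only once
        _processed[key] = embed(text)
    return _processed[key]

docs = ["quarterly report", "meeting notes", "quarterly report"]
vectors = [embed_once(d) for d in docs]
print(calls)                        # -> 2, not 3
```

The same duplicate document being re-embedded across platforms is exactly the kind of invisible compute cycle this section is about; a shared, hash-keyed store makes it visible and avoidable.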
Visibility. Compute demand is increasing sharply as AI-powered workflows become the new normal, but most organisations are being left in the dark regarding how much energy their IT operations actually consume - making it almost impossible to meaningfully reduce their carbon footprint. High-level cloud usage figures are a start, but workload-level accountability is where real change happens: understanding which product, feature, build or workflow is consuming energy, where and when it runs, and whether that work is actually necessary.
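Workload-level accountability can start with something as simple as tagging energy figures (measured or estimated) with the feature responsible and aggregating them; the records below are purely illustrative:

```python
# Sketch: workload-level energy attribution. Each record tags a measured
# or estimated energy figure with the product feature that caused it.
# All tags and figures are illustrative.

from collections import defaultdict

usage_log = [
    {"feature": "search",     "kwh": 4.2},
    {"feature": "ai-summary", "kwh": 11.8},
    {"feature": "search",     "kwh": 3.9},
    {"feature": "ci-builds",  "kwh": 7.5},
    {"feature": "ai-summary", "kwh": 12.4},
]

by_feature = defaultdict(float)
for rec in usage_log:
    by_feature[rec["feature"]] += rec["kwh"]

# rank features by energy use: the first question is always
# "which workload is actually consuming this, and is it necessary?"
for feature, kwh in sorted(by_feature.items(), key=lambda kv: -kv[1]):
    print(f"{feature:<12} {kwh:6.1f} kWh")
```

Even this crude roll-up turns an opaque cloud bill into a ranked list of workloads to question - the precondition for any meaningful reduction.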
We’re developers ourselves, so we get it - AI use is somewhat inevitable in the current tech landscape. We’re not here to tell you not to engage with AI, but our team is committed to using our skills to make sure that using AI comes at a lower cost to the planet.
JECO Optimiser automatically analyses, fixes and validates performance issues across your codebase. By profiling real workloads, identifying bottlenecks, proposing targeted fixes and validating results against repeatable scenarios, we make software optimisation measurable rather than speculative. Better-performing software doesn’t just feel faster: it can reduce server time, hardware demand, bandwidth, battery drain and cloud spend - allowing you to cut your costs, boost your productivity and lower your power consumption.
JECO Scope gives organisations granular visibility into the energy and carbon signals behind their IT operations: device fleets, workloads, usage patterns, locations and operating behaviour. Through this, we enable your organisation to make smarter, cheaper and more sustainable operating decisions - giving you a stronger, data-led basis for emission reduction approaches, cloud and vendor conversations and carbon reporting processes.
AI is here to stay, but so is the responsibility to reduce its impact at the software level. It’s more important than ever before to make the software around AI, and the software produced by it, more efficient, measurable and accountable.
We’re building automated tools to reduce the energy use of software worldwide: if you’re ready to be part of this shift too, sign up to our waitlist.