Dubai, UAE: Arthur D. Little’s Blue Shift institute has published its latest report, AI’s Hidden Dependencies. This in-depth report, involving more than 50 experts, explores AI’s resource dependencies and the resulting systemic vulnerabilities for businesses, and lays out strategic actions in response.

As AI adoption and usage continue on their growth trajectory, so does the strain on resources. The report notably identifies three main areas of dependency:

  • Environmental impacts, including emissions due to AI’s heavy energy usage and the manufacture of related hardware
  • Energy supply, including increased electricity demand and strain on the grid
  • Compute infrastructure, including supply chain choke points and dependencies on dominant providers

As AI becomes critical infrastructure, the report anticipates that these “hidden dependencies” will increasingly expose businesses to three systemic vulnerabilities:

  • Economic instability as the real costs of AI become apparent
  • Sustainability risk as companies lose control of their carbon footprint
  • Strategic lock-in as supplier dependency constrains competitiveness

In response, the report recommends that businesses prioritize a set of “no-regret” actions, namely:

  • Anticipate the real cost of AI by aligning AI costs with real business value
  • Restore environmental credibility by gaining control over the real footprint of AI use
  • Build strategic resilience by maintaining the freedom to move between providers and jurisdictions

Dr. Albert Meige, Global Director of Arthur D. Little’s Blue Shift institute, comments: “AI feels cheap today because its real economic and environmental costs are essentially hidden. Once dependence sets in, those costs will surface. And companies should be strategically prepared.”

Key highlights from the report:

  • AI energy demand could increase fivefold by 2030, pushing global data-center electricity consumption close to 1,000 TWh, or roughly 3% of total global power demand.
  • In major AI hubs, data centers could consume up to 40% of local electricity within the next decade, already triggering grid connection delays and moratoria of up to seven years.
  • AI inference already consumes up to 2,700 GWh per year, overtaking training as the main source of AI-related emissions as usage becomes always-on and continuous.
  • A single hyperscale AI data center can use as much water per day as a medium-sized city, making water availability a binding constraint on where AI infrastructure can expand.
  • Environmental transparency is collapsing, with fewer than 3% of new AI models disclosing energy or emissions data, down from around 10% just one year ago.