The Layer Most Leaders Never See
We talk about AI like it lives in the cloud.
I spent two years living in the Sierra Nevada region. Lake Tahoe was close enough to feel like home — that water, that stillness, that particular quality of light. It's a place I know physically and feel connected to in a way that's harder to name.
So when I read that nearly 50,000 people in that region are being told to find alternative power sources because AI data centers are consuming a growing share of Nevada's electricity, something in me shifted. Not as a policy concern. As something more personal. A place I love being quietly reallocated to serve a demand most people never see.
There is no cloud. There are data centers, power grids, cooling systems, water infrastructure, and regional tradeoffs sitting underneath every interaction we treat as instant and weightless.
What looks like a software shift is, underneath, an infrastructure shift.
I use AI in my work. I recommend it to leaders and teams. This isn't a case against it — it's a case for understanding what we're actually working with.
The numbers are worth sitting with.
U.S. data centers indirectly consumed an estimated 211 billion gallons of water in 2023 alone — and that was before the current AI buildout hit full scale. Training a single large language model can directly evaporate some 700,000 liters of freshwater. By 2028, Morgan Stanley projects, AI data centers could drive an 11-fold increase in annual water consumption for cooling and electricity generation.
Energy demand tells the same story. A January 2026 report projects U.S. data center electricity demand will nearly double between 2025 and 2028 — from 80 to 150 gigawatts. That's the equivalent of adding a country the size of Spain to the grid in three years.
These aren't abstract projections. They're allocation decisions. And they're already landing somewhere.
We've seen this pattern before.
Electricity once felt like a magical abstraction — clean, infinite, always available. Until cities had to redesign around peak load, generation limits, and regional allocation.
The internet felt the same way in its early phase. Then streaming hit scale and bandwidth stopped being an IT detail. It became a national infrastructure constraint that reshaped pricing, regulation, and investment.
AI is following the same trajectory. But with a different input: intelligence itself.
Not storage. Not bandwidth. Cognition.
Cognition at scale doesn't behave like a software feature. It behaves like a utility demand system — continuous, unpredictable, compounding across every industry that adopts it. And like every utility system that scaled into infrastructure, the costs stop being abstract. They show up in electricity curves, water usage, grid stress, and the quiet reshaping of where compute can physically exist.
Here's what I think about as someone who develops leaders inside organizations navigating this moment.
MIT Sloan research on AI adoption across 50 organizations found that leaders who set clear guardrails — cross-functional, deliberate, informed about actual constraints — consistently outperformed those who simply enabled broad access and hoped for the best.
The difference wasn't technical literacy.
It was systems thinking.
The leaders who make the best AI decisions aren't the ones most excited about the technology. They're the ones who can see both layers simultaneously — the frictionless experience and the infrastructure underneath it.
That gap between what AI feels like and what it actually requires? That's a leadership problem. It shows up in adoption strategies that skip tradeoffs, in AI commitments made without understanding the resources they actually obligate, and in teams that optimize for speed without accounting for the system they're accelerating inside.
Research on AI-driven leadership increasingly points to the same finding: the most critical capability isn't knowing how to use the tools. It's maintaining the judgment to know what the tools can't see — the second-order effects, the resource implications, the human costs of moving fast inside a constrained system.
Systems thinking has always been a leadership differentiator. AI just raised the stakes for not having it.
Three things leaders can do right now:
1. Map your AI commitments to actual resource costs. Before expanding AI tooling across your org, ask: what does this require in compute, data infrastructure, and human oversight? Build that into the business case, not as a footnote.
2. Create a cross-functional AI tradeoff conversation. Not just IT and legal. Finance, operations, people. What are we actually choosing when we choose this? That question belongs in the room before the contract is signed.
3. Build infrastructure literacy into leadership development. The next generation of leaders will make decisions at the intersection of AI capability and physical constraint. Organizations that develop that literacy now will move more strategically than those who learn it from the news.
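The first step above doesn't require a consulting engagement to start. Here's a minimal back-of-envelope sketch of what "map commitments to resource costs" can look like in practice. Every per-query figure is an illustrative assumption, not vendor data — the point is that the arithmetic is simple enough to put in the business case:

```python
# Back-of-envelope estimator for the resource footprint of an AI rollout.
# All constants below are illustrative assumptions, not measured values.

ENERGY_PER_QUERY_WH = 0.3   # assumed energy per model query, in watt-hours
PUE = 1.2                   # power usage effectiveness (facility overhead factor)
WUE_L_PER_KWH = 1.8         # assumed litres of cooling water per kWh delivered

def annual_footprint(queries_per_day: float) -> dict:
    """Rough yearly energy (kWh) and water (litres) for a given query volume."""
    kwh_per_year = queries_per_day * 365 * ENERGY_PER_QUERY_WH / 1000 * PUE
    litres_per_year = kwh_per_year * WUE_L_PER_KWH
    return {"kwh": kwh_per_year, "litres": litres_per_year}

# Example: an org running 1 million queries a day
fp = annual_footprint(1_000_000)
print(f"~{fp['kwh']:,.0f} kWh and ~{fp['litres']:,.0f} litres per year")
```

Swap in your own assumptions and the estimate changes, but the exercise is the same: the footprint becomes a line item you can interrogate rather than a footnote you skip.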
The most important shift isn't that AI is powerful.
It's that intelligence is becoming infrastructural — which means every leader, every team, every organization is now operating inside an allocation system whether they understand it or not.
The ones who do will make better decisions. For their organizations, and for the places — and people — that sit underneath the abstraction.
Because every abstraction eventually meets physics.
And physics always has a place.