AI has transformed data centers into front-line strategic assets. GPU-intensive workloads, hyperscale cloud expansion, and edge computing are driving unprecedented power demand, pushing grids and generation capacity to their limits. U.S. data center electricity consumption surged from 76 TWh (2018) to 176 TWh (2023), representing 4.4% of total U.S. electricity use, with global data center consumption projected to approach 945 TWh by 2030.
Developers must deliver capacity faster, cleaner, and at gigawatt scale—despite long equipment lead times and highly variable load profiles.
In this webinar, Worley and invited partners will unpack what it takes to deliver next-generation AI data centers. We’ll examine the challenge from both sides of the fence:
- “Outside-in”: Utilities, power generation, transmission, and microgrids
- “Inside-out”: Racks, cooling systems, and automation at scale
Drawing on real-world examples—including GW-scale developments and off-grid AI training campuses—we’ll share practical strategies to navigate today’s biggest infrastructure challenges.
Key Learnings:
- How AI and hyperscale growth are reshaping power demand, siting, and grid requirements
- Integrating “inside the fence” solutions with “outside the fence” infrastructure
- Navigating equipment lead times, regulatory pressures, and fluctuating demand
- Practical strategies from GW-scale project examples to accelerate your development timeline