Sustainable carbon-aware and water-efficient LLM scheduling in geo-distributed cloud datacenters
1 Pith paper cites this work. Polarity classification is still indexing.
Fields: cs.DC (1)
Years: 2026 (1)
Verdicts: UNVERDICTED (1)

Representative citing paper:
AI Inference as Relocatable Electricity Demand: A Latency-Constrained Energy-Geography Framework
AI inference can be relocated across geographies to access lower-cost or lower-carbon electricity when latency budgets are relaxed, with the energy-latency frontier quantifying marginal benefits and new metrics tracking returns on latency tolerance.
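The relocation idea in the summary above can be sketched in a few lines: given a request's latency budget, choose the feasible region with the cleanest electricity. This is a minimal illustrative sketch only; the region names, latencies, and carbon intensities below are assumed for the example and do not come from the paper.

```python
# Hypothetical region data: (round-trip latency to the user in ms,
# grid carbon intensity in gCO2/kWh). Illustrative values, not paper data.
REGIONS = {
    "local":    (10, 450),
    "hydro-nw": (60, 90),
    "wind-eu":  (120, 120),
}

def place_inference(latency_budget_ms):
    """Return the feasible region with the lowest carbon intensity, or None."""
    feasible = {r: carbon for r, (lat, carbon) in REGIONS.items()
                if lat <= latency_budget_ms}
    if not feasible:
        return None
    return min(feasible, key=feasible.get)

# Relaxing the latency budget unlocks lower-carbon electricity:
print(place_inference(20))   # -> local (only region within 20 ms)
print(place_inference(80))   # -> hydro-nw (cleaner region becomes reachable)
```

Walking the budget upward this way traces out the energy-latency frontier the summary refers to: each additional unit of tolerated latency buys access to cheaper or cleaner grids.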