DCW Conference Programme 2024
Does Generative AI run on thin Air?
The presentation explores the energy costs associated with training and using generative AI (such as ChatGPT), which runs in data centres and on edge nodes. These computational demands are driving more efficient yet more complex GPU/accelerator hardware with ever-increasing power requirements. The resulting thermal management challenge is making air increasingly untenable as the primary coolant and is forcing data centres to bring liquid coolants close to the microelectronics. Future generations of microelectronics are expected to have higher heat fluxes and lower operating temperatures. The question arises as to whether immersion cooling, direct-to-chip cooling, or other liquid-cooling derivatives can meet these heat-removal requirements. Such developments are becoming a strong technical driver for new data centre designs.
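
To give a sense of the scale of the cooling problem the talk addresses, the short sketch below compares the coolant flow needed to remove heat with air versus water, using the standard relation Q = m_dot * c_p * dT. It is an illustrative back-of-envelope estimate, not material from the presentation: the 1 kW per-device heat load and 10 K coolant temperature rise are assumed values, and the fluid properties are standard textbook figures.

# Illustrative sketch: coolant flow needed to remove an assumed heat load,
# using Q = m_dot * c_p * dT. Assumed values, not figures from the talk.

HEAT_LOAD_W = 1000.0   # assumed per-device heat load [W]
DELTA_T_K = 10.0       # assumed allowable coolant temperature rise [K]

coolants = {
    # name: (specific heat [J/(kg*K)], density [kg/m^3]) -- textbook values
    "air":   (1005.0, 1.2),
    "water": (4180.0, 998.0),
}

for name, (cp, rho) in coolants.items():
    mass_flow = HEAT_LOAD_W / (cp * DELTA_T_K)  # kg/s of coolant required
    vol_flow = mass_flow / rho                  # m^3/s of coolant required
    print(f"{name:5s}: {mass_flow:.4f} kg/s  ~ {vol_flow * 1000:.3f} L/s")

Under these assumptions, air needs roughly three orders of magnitude more volumetric flow than water to carry away the same heat, which is the basic physical reason liquid cooling is being pushed ever closer to the chip.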