Perseus → SOSP '24
Perseus, an energy optimization system for large model training, was accepted to appear at SOSP '24!
Paper · Blog

Our goal is to measure, understand, optimize, and expose the energy consumption of modern machine learning.
The ML.ENERGY Initiative is a joint effort of computer scientists across multiple academic institutions, including SymbioticLab, where it originally started.
Observing and understanding the energy consumption of modern ML
Optimizing ML energy consumption based on those understandings
Exposing energy consumption to ML developers to encourage optimization (a short sketch follows below)
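As an illustration of that last goal, the sketch below shows how a developer might measure the time and GPU energy of a training region with our open-source Zeus package (featured in the news items below). The names used here (ZeusMonitor, begin_window/end_window, and the time and total_energy fields) follow our reading of the Zeus documentation and may differ across versions, and the sleep call merely stands in for a real training loop; treat this as an illustrative sketch rather than the definitive API.

```python
# Minimal sketch: measuring the time and GPU energy of a code region with Zeus.
# Assumes the `zeus` package is installed and an NVIDIA GPU is visible.
import time

from zeus.monitor import ZeusMonitor

# Monitor only GPU 0 for this example.
monitor = ZeusMonitor(gpu_indices=[0])

monitor.begin_window("epoch")
time.sleep(5.0)  # placeholder for one real training epoch
measurement = monitor.end_window("epoch")

print(f"Elapsed time : {measurement.time:.2f} s")
print(f"GPU energy   : {measurement.total_energy:.2f} J")
```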
Participating institutions include the University of Michigan, MIT CSAIL, Columbia University, CMU, and the University of Washington.
Zeus receives $50,000 for development support from the 2024 Mozilla Technology Fund.
Announcement

Two LLMs will battle at your command in terms of both response quality and energy. Your judgment tips the scale of victory.
Colosseum

A rich benchmark comparing inference on modern LLMs, with metrics including energy, latency, and model quality!
Leaderboard · Repository

Based on Carbon-Aware Zeus, Chase was accepted to appear at the ICLR workshop on Tackling Climate Change with Machine Learning!
Repository · Paper

Carbon-Aware Zeus wins the second-best overall solution prize in CarbonHack22, organized by the Green Software Foundation!
Project page · YouTube

Our first project, "Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training," was accepted to appear at NSDI '23!
Zeus website · Paper

Any opinions, findings, and conclusions of our works are those of the author(s) and do not necessarily represent the official policy of any of these organizations.