Recently, two computer scientists had an idea: if computers use energy to perform calculations, could stored data be a form of stored energy? Why not use computing as a way to store energy?
What if information could be a battery, man?
As it turns out, the idea isn’t as far-fetched as it may sound. The “information battery” concept, fleshed out in a recent paper, would perform certain computations in advance when power is cheap—like when the sun is shining or the wind is blowing—and cache the results for later. The process could help data centers replace up to 30 percent of their energy use with surplus renewable power.
The beauty of the system is that it requires no specialized hardware and imposes very little overhead.
“Information Batteries are designed to work with existing data centers,” write authors Jennifer Switzer, a doctoral student at UC San Diego, and Barath Raghavan, an assistant professor at the University of Southern California. “Some very limited processing power is reserved for the IB [information battery] manager, which manages the scheduling of both real-time computational tasks and precomputation. A cluster of machines or VMs is designated for precomputation. The IB cache, which stores the results of these precomputations, is kept local for quick retrieval. No additional infrastructure is needed.”
Forecasting is key
Not every task is suited to the information-battery approach, but for many data centers, certain loads can be predicted with reasonable accuracy and cached for later retrieval. Companies like Netflix, for example, may ingest video in one format and then transcode it to optimize it for various devices, a process that isn’t always time-sensitive. The same is true when training machine-learning algorithms—computer scientists can queue up the training data and let the information-battery manager decide when to run the training. Google has been using a system like this for a few years in a quest to trim its carbon emissions, though as you might expect, details are sparse.
The information-battery manager in some ways mimics schedulers found within PC or smartphone operating systems. There, the schedulers optimize the flow of data through the CPU and other chips to keep things moving smoothly. Depending on the task and the demands on the system, the scheduler may keep the interface responsive to user inputs, or it might prioritize a compute-intensive job so it finishes more quickly.
In the case of information batteries, the manager optimizes the workload based on the price of electricity and the availability of tasks that can be performed ahead of time. The manager has three main parts—a price-prediction engine, a pre-computation engine, and a scheduler. To determine which tasks to run, the scheduler weighs information from the price-prediction and pre-computation engines. The price-prediction engine uses a neural network to forecast future electricity prices, while the pre-computation engine uses a different neural network to predict future computational demands.
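To make that division of labor concrete, here's a minimal sketch of how such a manager might fit together. Every name in it (IBManager, PricePredictor, DemandPredictor, and the dummy values) is an illustrative stand-in, not the authors' actual implementation:

```python
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    likelihood: float  # predicted probability the result will actually be needed


class PricePredictor:
    """Stand-in for the paper's neural-network price forecaster."""
    def predict(self, horizon_minutes: int) -> float:
        return 0.02  # forecast price in $/kWh (dummy value)


class DemandPredictor:
    """Stand-in for the paper's neural-network demand forecaster."""
    def predict(self, horizon_minutes: int) -> list[Task]:
        return [Task("transcode-episode", 0.9), Task("training-step", 0.4)]


class IBManager:
    """Illustrative information-battery manager: precompute when forecast
    prices are low, prioritizing the work most likely to be requested."""

    def __init__(self, price_model, demand_model, price_threshold: float):
        self.price_model = price_model
        self.demand_model = demand_model
        self.price_threshold = price_threshold
        self.cache: dict[str, str] = {}  # precomputed results, kept local

    def step(self, horizon_minutes: int = 60) -> list[Task]:
        """Decide what, if anything, to precompute over the next horizon."""
        if self.price_model.predict(horizon_minutes) >= self.price_threshold:
            return []  # power isn't cheap enough; defer everything

        # Run the most-likely-to-be-needed tasks first so less work is wasted.
        tasks = sorted(self.demand_model.predict(horizon_minutes),
                       key=lambda t: t.likelihood, reverse=True)
        for task in tasks:
            self.cache[task.name] = f"result-of-{task.name}"  # placeholder compute
        return tasks


manager = IBManager(PricePredictor(), DemandPredictor(), price_threshold=0.03)
print([t.name for t in manager.step()])
```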
In the model Switzer and Raghavan created to test the concept, the IB manager queried grid operators every five minutes (the smallest interval the operators offered) for the current price of power, which fed into its price predictions. When prices dipped below a set threshold, the manager green-lit a batch of computations and cached the results for later.
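That test setup amounts to a simple polling loop. Again, this is only a sketch: the five-minute cadence and the price threshold come from the description above, while the function names and the cutoff value are placeholders, not anything the paper specifies.

```python
import time

POLL_INTERVAL_S = 5 * 60   # grid operators publish prices every five minutes
PRICE_THRESHOLD = 0.03     # illustrative $/kWh cutoff for "cheap" power


def fetch_grid_price() -> float:
    """Stand-in for querying the grid operator's real-time price feed."""
    raise NotImplementedError


def precompute_batch(cache: dict) -> None:
    """Stand-in for running a batch of deferred tasks and caching the results."""
    raise NotImplementedError


def polling_loop(cache: dict) -> None:
    while True:
        if fetch_grid_price() < PRICE_THRESHOLD:
            # Power is cheap (likely surplus renewables), so work ahead of time.
            precompute_batch(cache)
        time.sleep(POLL_INTERVAL_S)
```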
Supplanting grid-scale batteries
The system was pretty effective at reducing the need for expensive "grid power," as the authors call it, even when the pre-computation engine did a relatively poor job of predicting which tasks would be needed in the near future. Even at just 30 percent prediction accuracy, the manager could still take advantage of the so-called "opportunity power" created when there is excess wind or solar generation.
In a typical large data center, workloads can be predicted around 90 minutes in advance with about 90 percent accuracy, the authors write. With a more conservative prediction window of 60 minutes, “such a data center could store 150 MWh, significantly more than most grid-scale battery-based storage projects,” they say. An equivalent grid-scale battery would cost around $50 million, they note.
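To put that figure in perspective: a large data center can draw on the order of 150 megawatts, so shifting a single hour of its work onto cheap power works out to roughly 150 MW × 1 hour = 150 MWh of effective storage. (That facility size is an assumption for illustration; the paper's exact inputs may differ.)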
While the authors don't estimate how much an information battery would cost to run, it's likely to be significantly cheaper, since it relies on existing hardware and is implemented entirely in software that can be tuned to the data center's electricity prices and computational demands.
Today, excess wind or solar power is relatively rare, cropping up mostly during the summer in California when it's especially sunny and in Texas when it's particularly windy. But in the near future, as more wind and solar come onto the grid, negative power prices may become more common, and information batteries could become both viable and widespread.
"Key to the IB approach is that it is not a general-purpose solution but is likely to be effective for many common workloads," the authors write. Given that data centers consume around two percent of all electricity used in the US, a share that's all but certain to grow, information batteries could become a cost-effective alternative to massive grid-scale battery installations.