Researchers claim that a roughly 30-line change to the Linux kernel could significantly reduce data center energy consumption. How does this optimization actually work, who stands to benefit most, and are there any downsides or trade-offs? It sounds like a game-changer, but I'm skeptical.