As mentioned in previous blogs [Ref. 1], the power consumption involved in manufacturing semiconductors has raised concern. The International Roadmap for Devices and Systems (IRDS) [Ref. 2] has been evaluating that issue. There is no short-term solution, as the complexity of the circuitry continues to increase. Changes that are being evaluated, e.g., chiplets, do not reduce the demand for semiconductor devices, but they are options for producing and yielding devices more efficiently.
Reference 3 provides an interesting view into alternate power sources for high-performance devices. The article indicates that high-end CPUs can require as much as 700W per chip. As shown in the chart in the June 2023 blog, power consumption is trending beyond what the available supply can satisfy. As pointed out, architectures at 7nm and below require lower operating voltages, which creates a need for higher currents. The challenge for power delivery networks (PDNs) is that they must maintain a constant power supply under conditions that can change very rapidly. One solution is to change the IT infrastructure from 12 volts to 48 volts.
What difference does this make? For the same power, the current at 48 volts is ¼ of the current required at 12 volts. The power loss is defined by I²R, so reducing the current by a factor of 4 reduces the power loss by a factor of 16. Reference 3 provides an example of a 10MW data center that would save 1.4 million kWh per year. Beyond the cost savings, the reduction in power consumption can help lower the projected power consumption in the country. Making this happen requires changing the equipment in most large computing centers.
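The 12-volt versus 48-volt arithmetic above can be sketched in a few lines. This is an illustrative calculation only: the 700W figure comes from the article, but the distribution-path resistance is an assumed placeholder value, not a number from any reference.

```python
# Compare resistive delivery loss (I^2 * R) for the same power at 12 V vs 48 V.
# The resistance value is a hypothetical example chosen for illustration.

def distribution_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss in the delivery path: P_loss = I^2 * R, where I = P / V."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

P = 700.0   # watts: one high-end CPU, per Reference 3
R = 0.01    # ohms: assumed distribution-path resistance (placeholder)

loss_12 = distribution_loss(P, 12.0, R)
loss_48 = distribution_loss(P, 48.0, R)

print(f"Current at 12 V: {P / 12.0:.1f} A, loss: {loss_12:.2f} W")
print(f"Current at 48 V: {P / 48.0:.1f} A, loss: {loss_48:.2f} W")
print(f"Loss ratio: {loss_12 / loss_48:.0f}x")
```

Because the current scales as 1/V and the loss as the square of the current, the ratio of losses is (48/12)² = 16 regardless of the resistance chosen.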
This does not address the power required to run Artificial Intelligence processors. Reference 4 states: “It’s unclear exactly how much energy is needed to run the world’s AI models. One data scientist trying to tackle the question ended up with an estimate that ChatGPT’s electricity consumption was between 1.1M and 23M KWh in January 2023 alone. There’s a lot of room between 1.1 million and 23 million, but it’s safe to say AI requires an obtuse and immense amount of energy. Incorporating large language models into search engines could mean up to a fivefold increase in computing power. Some even warn machine learning is on track to consume all energy being supplied.”
There was an interesting comparison at the end of the Reference 4 article: “A computer can play Go and even beat humans,” said McClelland. “But it will take a computer something like 100 kilowatts to do so while our brains do it for just 20 watts.”
“The brain is an obvious place to look for a better way to compute AI.”
So will we continue to make larger and larger models, or will we find better solutions? Is it time to rethink the standard semiconductor architecture? Part of this is being pursued through the development of chiplets, which can permit placing memory closer to the processor that needs it. That would cut down on the wiring a signal needs to traverse, which in turn should reduce power consumption and speed up computing. With new materials being developed, new potential applications should be possible. The future always has a way of surprising us. We shall see what happens next.
- April, May, and June 2023 blogs