Measuring power consumption: establishing the baseline on which to evaluate code optimization

In May, Embedded.com ran a series of articles by Rob Oshana of Freescale Semiconductor and Mark Kraeling of GE on managing power requirements in embedded software design. Taken as a whole, the articles are a mini-course in themselves, but over the next month or so, I’ll be providing a short synopsis of each.

I have to say, in terms of piquing my interest, the authors had me at hello. (Or, at any rate, at the intro.)

One of the most important considerations in the product lifecycle of an embedded project is to understand and optimize the power consumption of the device. Power consumption is highly visible for hand-held devices which require battery power to be able to guarantee certain minimum usage/idle times between recharging. Other embedded applications, such as medical equipment, test, measurement, media, and wireless base stations, are very sensitive to power as well — due to the need to manage the heat dissipation of increasingly powerful processors, power supply cost, and energy consumption cost — so the fact is that power consumption cannot be overlooked. (Source: Embedded.com)

The point of the series is that, although we generally think of power optimization as the responsibility of hardware engineering, there’s plenty that the software side of the house can do, too.

The first article focuses on the importance of measuring power. It starts with the basics of power consumption and the factors involved: the application, the clock frequency, and the process technology. It points out that software can do nothing about static power consumption, but that it can affect dynamic power consumption by controlling clocks, which are responsible for consuming the majority of a device’s dynamic power. Before software can be used to optimize power efficiency, though, you first need a way to measure power.

There are a number of ways to measure power, starting with the old-fashioned way: using an ammeter (kind of the slide rule of power measurement) on the core power supply. There are other methods available as well:

 …some embedded processors provide internal measurement capabilities; processor manufacturers may also provide “power calculators” which give some power information; there are a number of power supply controller ICs which provide different forms of power measurement capabilities; some power supply controllers called VRMs (voltage regulator modules) have these capabilities internal to them to be read over peripheral interfaces.

Once you’ve profiled your app’s power consumption for each section of code, you’ll have the baseline against which to judge how effective your code optimizations are.

I’m not going to get into the details here, but the articles are definitely worth reading if you’re on the embedded software side of the house.


Just in case you’re looking for some light reading for an early summer’s day, here are the links to all four articles in the series:

Optimizing embedded software for power efficiency: Part 1 – Measuring power

Optimizing embedded software for power efficiency: Part 2 – Minimizing hardware power

Optimizing embedded software for power efficiency: Part 3 – Optimizing data flow and memory

Optimizing embedded software for power efficiency: Part 4 – Peripheral and algorithmic optimization

They are all excerpted from Oshana and Kraeling’s book, Software Engineering for Embedded Systems.

Again, I will be doing short posts on the other three articles over the next month or so. (Part 2 will be posted next week.)