Energy efficiency vs. data rate
In an all-digital transmitter, energy consumption (e.g., in pJ/bit) increases with the data rate. Evidently, power consumption does not increase linearly with the data rate (if it did, the energy per bit would remain constant).
Can anyone provide analytical formulae (or related papers) that explain this phenomenon?
Regards,
Hafiz
This paper provides an expression for the dynamic power:
http://www.ewdtest.com/conf/proc08/ewdts08-91.pdf
Note that this power is linearly proportional to the frequency, which explains why the energy consumed per unit time rises linearly with frequency.
However, it is possible to make the dynamic power consumption "independent" of the frequency. This can be done by voltage scaling, or by data encoding that lowers the switching activity.
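To make that concrete, here is a minimal Python sketch of the standard CMOS dynamic-power model, P_dyn = alpha * C * V^2 * f. The switching activity, capacitance, and voltage values are illustrative assumptions, not numbers from the paper. It shows that at a fixed supply voltage the energy per operation is constant regardless of frequency, and that voltage scaling lowers it quadratically:

# Sketch of the CMOS dynamic-power model: P_dyn = alpha * C * V^2 * f
# alpha : switching activity factor (assumed, dimensionless)
# C     : switched load capacitance (assumed, farads)
# V     : supply voltage            (assumed, volts)
# f     : clock frequency           (hertz)

def dynamic_power(alpha, C, V, f):
    """Dynamic power in watts."""
    return alpha * C * V**2 * f

def energy_per_op(alpha, C, V, f):
    """Energy per clock cycle in joules: P_dyn / f = alpha * C * V^2."""
    return dynamic_power(alpha, C, V, f) / f

alpha, C = 0.2, 1e-12  # illustrative: 20% activity, 1 pF switched capacitance

for f in (100e6, 500e6):
    e = energy_per_op(alpha, C, 1.2, f)
    print(f"f = {f/1e6:4.0f} MHz, V = 1.2 V -> {e*1e12:.2f} pJ/op")
# Same energy per operation at both frequencies: P_dyn scales linearly
# with f, so the per-operation energy alpha*C*V^2 stays constant.

# Voltage scaling: at a lower frequency the supply can often be reduced,
# which cuts the energy per operation quadratically.
e_scaled = energy_per_op(alpha, C, 0.9, 100e6)
print(f"f =  100 MHz, V = 0.9 V -> {e_scaled*1e12:.2f} pJ/op")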
@ AdvaRes
Thanks for your response.
But my query is a little bit different. Let me state it with an example:
Say, for a digital transmitter (ideally with no static power loss):
at a data rate of 100 Mb/s, the power consumption is 2 mW, so the energy consumption is 2 mW ÷ 100 Mb/s = 20 pJ/bit;
at a data rate of 500 Mb/s, for the same energy consumption, we should have a power consumption of 500 Mb/s × 20 pJ/bit = 10 mW.
But in practice this is not so; the power consumption becomes, say, 12 or 13 mW.
As a result, the energy consumption also increases, although it is supposed to remain constant for an all-digital transmitter.
I need an explanation of this phenomenon.
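To put numbers on it, here is a small Python sketch that simply evaluates the example above. The 12 mW figure is the hypothetical measured value from the example, and splitting it into a constant-pJ/bit term plus an excess term is my own illustrative bookkeeping, not an established model:

# Energy per bit = power / data rate, using the example numbers above.

def energy_per_bit_pj(power_w, rate_bps):
    """Energy per bit in picojoules."""
    return power_w / rate_bps * 1e12

measurements = [
    (100e6, 2e-3),   # 100 Mb/s, 2 mW  (from the example)
    (500e6, 12e-3),  # 500 Mb/s, 12 mW (hypothetical measured value)
]

base_rate, base_power = measurements[0]
e_base = energy_per_bit_pj(base_power, base_rate)  # 20 pJ/bit

for rate, power in measurements:
    ideal = e_base * rate * 1e-12   # power if pJ/bit stayed constant
    excess = power - ideal          # superlinear part to be explained
    print(f"{rate/1e6:4.0f} Mb/s: {power*1e3:5.1f} mW measured, "
          f"{ideal*1e3:5.1f} mW if energy/bit were constant, "
          f"excess {excess*1e3:4.1f} mW -> "
          f"{energy_per_bit_pj(power, rate):.0f} pJ/bit")

The 2 mW excess at 500 Mb/s is exactly what raises the energy from 20 to 24 pJ/bit in this example.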
