Originally Posted by TaylorFade
That is very interesting indeed.
Let's see... 300A draw.
14.4v - 3,370w
16v - 3,700w
18v - 3,835w
Yeah... that doesn't look right. Lol.
The higher the voltage, the more current it draws!
I always thought the current draw difference would be minimal, as in almost non-measurable, but I was wrong.
Based on this one particular amp, current draw increases from 14.4v to 18v by 22%.
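For what it's worth, here's a quick Python sketch of that math. I'm assuming the 300A is the supply draw at each test voltage and that the listed wattages are output power (that's my reading of the numbers, not something stated in the test):

Code:
# Rough efficiency check. Assumption (mine): the amp pulls ~300 A from the
# supply at each test voltage, and the listed wattages are output power.
SUPPLY_CURRENT_A = 300

tests = {14.4: 3370, 16.0: 3700, 18.0: 3835}  # supply volts -> output watts

for volts, watts_out in tests.items():
    watts_in = volts * SUPPLY_CURRENT_A   # power pulled from the electrical system
    efficiency = watts_out / watts_in
    print(f"{volts:4.1f} V: {watts_in:5.0f} W in, {watts_out} W out, {efficiency:.0%} efficient")

With that reading it comes out to roughly 78% at 14.4v, 77% at 16v, and 71% at 18v, so the efficiency really does fall off as the voltage goes up.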
Originally Posted by brandes.cm
The higher input voltage also means most circuits have to dissipate more power. Hence the 15V zener used to clamp the gate drive voltage: at 14.4v it wouldn't even conduct; anything higher and it's wasted power. The ±12V and 15V regulators are also forced to dissipate more power. So I can see efficiency dropping off.
So then this isn't uncommon.
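To put some numbers on the regulator part of that explanation, here's a rough Python sketch of linear regulator dissipation, P = (Vin - Vout) * Iload. The 0.5A load and the raw rail voltages are made-up values just to show the trend, not measurements from any particular amp:

Code:
# Linear regulator loss: everything above the regulated output voltage gets
# burned off as heat. Load current and raw rail voltages are hypothetical.
def regulator_loss(v_raw, v_out, i_load):
    return (v_raw - v_out) * i_load   # watts dissipated in the regulator

I_LOAD = 0.5   # assumed draw on the small-signal rail, in amps
V_OUT = 12.0   # regulated rail voltage

# Raw rail loosely tracking a 14.4v supply vs an 18v supply
for v_raw in (16.0, 20.0):
    print(f"raw {v_raw:.1f} V -> {regulator_loss(v_raw, V_OUT, I_LOAD):.2f} W wasted")

Double the headroom, double the heat for the same load, and none of that shows up at the speaker terminals.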
I had someone else give me a different point of view on this too:
If efficiency is the primary objective, then from an engineering standpoint I'd be focusing on making the amp the most efficient in the typical voltage range it will be played at.
If you notice, 14.4v and 16v are near the same.
Running a 19v charging system, the typical voltage drop from a large system without enough alternator will pull it from 17v down to ~16v, and a 12v system, if it's strong, can sit between 13.5-14.5v, so it makes sense.