Amplifiers have rated output powers. Consumer products typically show only one set of numbers, for example, 30W @ 8 ohm.
This often leads to confusion when speakers with impedances other than 8 ohm are used, producing statements like "4 (or 8) ohms gives more power" and "2 (or 16) ohms will spoil the amplifier".
The correct answer is: it depends.
Let's look at what an amplifier does. Assume an ideal amplifier with no limits of any sort: output voltage is input voltage multiplied by gain. Connect a speaker, one end to the output and the other end to ground. The voltage between output and ground forces a current through the speaker, delivering power to it.
But real devices have limits. A real amplifier has voltage and current limits.
Voltage: the maximum output voltage of an amplifier is roughly the power supply voltage minus a few volts, depending on the amplifier.
If an amplifier attempts to output a voltage (= input voltage × gain) larger than this limit, it simply cannot; the output clips at the limit.
Current: the semiconductors can only pass so much current; beyond a certain point the output current does not increase any further.
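The two limits can be sketched as a toy model. The gain and limit values below are made-up illustrative numbers, not from any real amplifier:

```python
GAIN = 20.0       # voltage gain (assumed)
V_LIMIT = 28.0    # max output voltage in volts (assumed)
I_LIMIT = 5.0     # max output current in amps (assumed)

def amp_output(v_in, load_ohms):
    """Return (voltage, current) actually delivered into the load."""
    v_out = GAIN * v_in                          # ideal amp: V_out = V_in * gain
    v_out = max(-V_LIMIT, min(V_LIMIT, v_out))   # voltage clipping
    i_out = v_out / load_ohms                    # Ohm's law: the load sets the current
    if abs(i_out) > I_LIMIT:                     # current limiting
        i_out = I_LIMIT if i_out > 0 else -I_LIMIT
        v_out = i_out * load_ohms                # voltage sags when current-limited
    return v_out, i_out
```

With these numbers, `amp_output(2.0, 8)` hits the voltage limit (28 V), while the same input into 4 ohms hits the current limit instead (5 A, only 20 V).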
The amplifier is capable of outputting any combination of voltage and current (as long as they are within limits).
Power equals voltage times current, so to get maximum power from the amplifier, operate it at maximum voltage and maximum current.
However, current depends on voltage and impedance. (In a speaker amplifier, voltage and load impedance are the independent variables; their values are set by the user.)
If the load impedance is low, the current drawn at a given output voltage is higher than with a higher impedance, so the current limit is reached before the voltage limit as the output voltage/volume is increased.
If the load impedance is high, the voltage limit is reached before the load draws a large current.
Either way, a load impedance that is too low or too high limits the maximum output power into that particular load to less than the absolute maximum the amplifier is capable of.
To get the most power out of the amplifier, choose a load such that the current limit and the voltage limit are reached at the same time: (impedance) = (max voltage) / (max current).
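Putting the two limits together: the maximum power into a load Z is set by whichever limit bites first. A minimal sketch, reusing the same assumed (illustrative) limit values:

```python
V_MAX = 28.0   # max output voltage in volts (assumed)
I_MAX = 5.0    # max output current in amps (assumed)

def max_power(z):
    """Voltage-limited power is V^2/Z; current-limited power is I^2*Z.
    The deliverable power is whichever limit is hit first."""
    return min(V_MAX**2 / z, I_MAX**2 * z)

z_opt = V_MAX / I_MAX   # 5.6 ohm: the load where both limits meet
```

Here `max_power(z_opt)` gives the full 140 W, while 4 ohms (current-limited, 100 W) and 8 ohms (voltage-limited, 98 W) both fall short of it.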
So in a sense, an amplifier is optimized for maximum output into a certain load. Note that this is often mistakenly called "impedance matching", which means something else.
Usually the speaker amp manufacturer would optimize it for somewhere between 4 and 8 ohms, because the output power drops proportionally with the change in load impedance away from the optimal value. For example, an amp optimized for 10W @ 8 ohms pushes only 5W @ 4 ohms, fully limited by current, while an amp optimized for 10W @ 4 ohms pushes only 5W into 8 ohms, limited by voltage. If the optimal value is somewhere in between, then both 4 ohm and 8 ohm loads are not too far off from it, and the powers into 8 and 4 ohms will not differ by much. (An amp optimized for 10W @ 5.657 ohm would deliver about 7W into both 4 ohm and 8 ohm.)
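The numbers in that example can be checked with the same min(V²/Z, I²·Z) model; the 10 W figures come from the text, while the model itself is only a sketch:

```python
import math

def max_power(z, v_max, i_max):
    """Power into z ohms, limited by voltage (V^2/Z) or current (I^2*Z)."""
    return min(v_max**2 / z, i_max**2 * z)

def limits_for(p, z):
    """V and I limits of an amp whose limits meet at p watts into z ohms."""
    return math.sqrt(p * z), math.sqrt(p / z)

v8, i8 = limits_for(10, 8)
print(max_power(4, v8, i8))           # ~5 W, current-limited

v4, i4 = limits_for(10, 4)
print(max_power(8, v4, i4))           # ~5 W, voltage-limited

z_opt = math.sqrt(4 * 8)              # ~5.657 ohm, geometric mean of 4 and 8
v, i = limits_for(10, z_opt)
print(max_power(4, v, i), max_power(8, v, i))   # ~7.07 W into each
```

The geometric mean √(4·8) is the load value that leaves 4 ohm and 8 ohm equally far off in ratio terms, which is why the two powers come out identical.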
And when the powers into 4 and 8 ohms are different, manufacturers show the higher one.
W A R N I N G !
W A R N I N G !
This page is full of non-facts and bullsh!t, (just like the internet and especially forums and other blogs), please do not believe entirely without exercising your intellect. Any resemblance to real things in reality is purely coincidental. You are free to interpret/misinterpret the content however you like, most likely for entertainment, but in no case is the text written on this blog the absolute truth. The blog owner and Blogger are not responsible for any misunderstanding of ASCII characters as facts. *cough* As I was saying, you are free to interpret however you like. *cough*