Well, I admit real musical waveforms are quite complex: rich in harmonics, broad in bandwidth, and completely rife with transients. Brute-force amplification of sine waves into a dummy (resistive) load, while quite objective and reproducible, does not necessarily give a good representation of real-world performance into a real-world load, since a speaker presents a frequency-dependent impedance with capacitive and inductive reactance.
I believe plenty of equipment will easily meet its advertised power rating, then trip out on its thermal capability for heat dissipation after a few minutes. Guess there's a reason for buying the big-buck products.
Unfortunately, real-world performance may be very difficult to measure in an objective and "portable" way: can you and I be on different continents, using different equipment, yet measure the same thing? I don't think so.
Given all of this, it seems power ratings are not terribly useful in general, if you ask me; one is better off trusting one's ears.
I have locally made amps that deliver approximately 25 watts per channel at something like 5% THD. They sound very good with my speakers, in my room, playing the kinds of music I like. :lol:
In common use, the terms "RMS power" or "watts RMS" are erroneously used to describe average power. A 100 "watt RMS" amplifier can produce a sine wave at 100 watts average power into its load. With music, the actual average power would be less. With a square wave, it would be more.
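Here is a rough numeric sketch of that point (the 8-ohm load, the 40 V peak swing, and noise standing in for music are all my own assumptions, purely for illustration): at the same peak voltage, a square wave averages twice the power of the sine, and a music-like signal averages far less.

```python
# Sketch: average power of a sine, a square wave, and a music-like signal,
# all clipped to the same assumed 40 V peak into an assumed 8-ohm resistor.
import numpy as np

R = 8.0          # assumed resistive load, ohms
v_peak = 40.0    # assumed peak voltage the amp can swing

fs = 48_000
t = np.arange(fs) / fs

sine = v_peak * np.sin(2 * np.pi * 1000 * t)              # 1 kHz test tone
square = v_peak * np.sign(np.sin(2 * np.pi * 1000 * t))   # square wave, same peak
rng = np.random.default_rng(0)
music = rng.normal(size=fs)                                # noise as a stand-in for music
music *= v_peak / np.abs(music).max()                      # scaled to the same peak

for name, v in [("sine", sine), ("square", square), ("music-like", music)]:
    v_rms = np.sqrt(np.mean(v ** 2))
    p_avg = v_rms ** 2 / R                                 # average power into R
    print(f"{name:10s}  Vrms = {v_rms:5.1f} V   Pavg = {p_avg:6.1f} W")

# Typical result: sine ~100 W, square ~200 W, music-like well under 100 W.
```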
In the US, the Amplifier Rule, CFR 16 Part 432 (39 FR 15387), was instated by the Federal Trade Commission (FTC), requiring audio power and distortion ratings for home entertainment equipment to be measured in a defined manner, with power stated in RMS terms. The erroneous term "watts RMS" is actually used in CE regulations.
In any alternating current waveform (which an audio signal is), the RMS voltage times the RMS current into a resistive load is the AVERAGE power.
The use of root-mean-square to measure voltage and current was developed decades before the phrase "high fidelity" was coined, specifically so that when the two are multiplied together, the result is average power.
Here are the details of how RMS voltage and current are used to compute average electrical power:
Root mean square - Wikipedia, the free encyclopedia
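For anyone who wants to check the arithmetic themselves, here is a minimal sketch (the 8-ohm purely resistive load and the 40 V peak sine are my own example numbers) showing that RMS volts times RMS amps matches the average of the instantaneous power:

```python
# Numeric check (purely resistive load assumed) that Vrms * Irms equals the
# average of the instantaneous power v(t) * i(t).
import numpy as np

R = 8.0                                   # assumed load resistance, ohms
fs = 48_000
t = np.arange(fs) / fs
v = 40.0 * np.sin(2 * np.pi * 1000 * t)   # 40 V peak sine, roughly 100 W into 8 ohms
i = v / R                                 # current through the resistive load

v_rms = np.sqrt(np.mean(v ** 2))          # root of the mean of the squares
i_rms = np.sqrt(np.mean(i ** 2))

p_instant_avg = np.mean(v * i)            # average of instantaneous power
p_from_rms = v_rms * i_rms                # Vrms times Irms

print(p_instant_avg, p_from_rms)          # both about 100 W
```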
This was apparently misunderstood by those drafting the 1960s-to-1970s standard, with the wrong thinking that "RMS times RMS must equal RMS"; and since this is a legal ruling about marketing audio amplifiers in the US, they have been rated in "RMS watts" ever since.
Actually, thinking about it, the RMS rating seems pretty good, as it is basically the AC equivalent of constant DC. Like anything, ratings are easily fudged here, too.
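A quick way to see that "AC equivalent of DC" idea numerically (again with my own assumed 8-ohm resistor): a sine with 20 V RMS heats the resistor at the same average rate as a steady 20 V DC supply.

```python
# Sketch: a sine wave with 20 V RMS dissipates the same average power in an
# 8-ohm resistor (assumed values) as a 20 V DC source.
import numpy as np

R = 8.0
t = np.arange(48_000) / 48_000
v_ac = 20.0 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # sine with 20 V RMS
v_dc = 20.0                                             # DC of the same value

p_ac = np.mean(v_ac ** 2) / R   # average power of the AC waveform
p_dc = v_dc ** 2 / R            # constant power of the DC source

print(p_ac, p_dc)               # both about 50 W
```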
Ratings are in most cases based on a 1 kHz test tone only. In most cases an amp rated at 200 watts can produce a 1 kHz test signal without running out of headroom; change the scenario to the full 20 Hz to 20 kHz range and see what happens. The results can be scary; it may even come down to 120 watts.
Most better amps have lots of short-term power capability, sometimes twice the rated power. Then there's the available heat-sink area for cooling the output stage.
Ultimately it all boils down to what I said before: it is the sound that matters.