Tuesday, March 12, 2013

Watts Revisited


Somebody asked me in another forum why I use the term “watts at clipping” to describe the power an amp makes.  I thought I’d devote a new blog post to this topic.

It may surprise you to know there is no hard and fast formula that amp makers use to calculate exactly how many watts their harp amps make.  In 2009 I researched an article I was writing on this topic by contacting all the big harp amp makers at the time to find out how they arrived at their wattage claims.  Some of them were quite annoyed that I even asked the question.

I believe we need a standardized method for calculating amp power so the numbers are meaningful and consumers can make rational, informed decisions.  Wattage numbers in amps are like horsepower figures in cars.  There is a powerful incentive to fudge the numbers upwards – higher than your competitors – because many buyers are greatly influenced by it.

You've probably seen wattage claims for boom boxes or car audio or computer speakers that seem, um, unlikely.  They sometimes claim hundreds of watts when the truth is a small fraction of that.  Harp amp makers have been more restrained in their numbers, but still there is no real standardization.

Those of you who like vintage hifi gear will recognize this phrase:  110 watts RMS per channel @ 8 ohms from 20 Hz to 20 kHz with 0.1% THD.  I found that harp amp makers borrow parts of that formula to arrive at their wattage spec.
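To unpack what a spec like that implies, the arithmetic is just Ohm's law.  Here is a quick worked example using the 110 watt / 8 ohm figures from the phrase above:

    import math

    # 110 watts RMS into an 8-ohm load implies this output voltage swing:
    v_rms = math.sqrt(110 * 8)       # P = V_rms^2 / R, so V_rms is ~29.7 V
    v_peak = v_rms * math.sqrt(2)    # ~41.9 V at the top of the sine wave

The amp has to sustain that swing across the whole audio band while keeping distortion under 0.1%, which is what makes the hifi version of the spec a demanding one.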

This is what they do:  They drive the amp with a test tone – a sine wave – and measure the AC output at the speaker tap by watching the waveform on a scope.  You may have opinions about what the proper method should be, but that is what they actually do.
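In other words, they read a peak voltage off the scope and convert it to watts.  Here is a minimal sketch of that conversion in Python; the 25.3 volt reading and the 8-ohm load are illustrative numbers, not any maker's actual bench figures:

    def watts_from_scope(v_peak, load_ohms):
        # For a sine wave, V_rms = V_peak / sqrt(2), and P = V_rms^2 / R,
        # so the power delivered to the load is V_peak^2 / (2 * R).
        return v_peak ** 2 / (2 * load_ohms)

    # Example: a 25.3 V peak sine wave across an 8-ohm dummy load
    print(watts_from_scope(25.3, 8))  # ~40 watts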

One problem is that the frequency and amplitude of the test tone are not uniform (and certainly not announced by the amp maker), and they can make a big difference.  Another problem is the amount of deformation they tolerate in the waveform at the point they claim as their wattage number.  In other words (as in the hifi formula), how much distortion is included in the number?

This is where the question arose in the other forum:  What the heck is wrong with distortion?  We love distortion in our tone, right?

Indeed we do, but for the sake of arriving at a meaningful wattage number that allows us to make real comparisons, we have to stipulate the level of distortion in the test.  The easiest and best way to do that is to measure the amp’s power at the point that the sine wave begins to clip.  That is a very good indication of the amp’s true strength.
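If you wanted to automate the "begins to clip" test, one way to define it is sketched below: fit a clean sine to the captured output and watch the leftover distortion products.  The 1% threshold and the least-squares fit are my choices for illustration, not anything the amp makers actually do:

    import numpy as np

    def has_begun_clipping(capture, fs, f0, limit=0.01):
        # Least-squares fit of a clean sine at the test frequency f0,
        # then compare everything left over (distortion products) to it.
        t = np.arange(len(capture)) / fs
        basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                                 np.cos(2 * np.pi * f0 * t)])
        coeffs, *_ = np.linalg.lstsq(basis, capture, rcond=None)
        fundamental = basis @ coeffs
        residual = capture - fundamental
        ratio = np.sqrt(np.mean(residual ** 2) / np.mean(fundamental ** 2))
        return ratio > limit  # flattened peaks push this ratio up fast

Raise the input level until this trips, note the peak voltage at that moment, and you have the amp's watts at clipping.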

Can the amp make more power beyond the point of clipping?  Sure.  But depending on the design of the amp, the distortion (the deformation of the sine wave) can start to sound unappealing pretty quickly.  Amp makers could just crank everything to the max and report that number, but it would be unrealistic and meaningless for musicians, and there are lots of ways to juice the max number.  We want to know:  How much clean power can the amp make?  How does it compare with other amps?

Here is what I am suggesting as a standard for harp amp makers:  Drive the amp with 150 mVAC @ 130 Hz and measure power as peak clean voltage into the appropriate true non-reactive load.
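Applied to an imaginary amp, the procedure would look like this (the 18 volt reading and the 8-ohm load are made up for the example):

    # Proposed standard: a 150 mVAC sine at 130 Hz into the amp's input,
    # volume raised until the output sine just begins to clip, and the
    # peak clean voltage read across a true non-reactive dummy load.
    v_peak_clean = 18.0   # volts, hypothetical reading at clipping onset
    load_ohms = 8.0       # match the amp's rated speaker load
    watts = v_peak_clean ** 2 / (2 * load_ohms)
    print(watts)  # 20.25 -- call this amp "20 watts at clipping"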

Is it a perfect formula?  Probably not, but it is not meant to be.  We need to insist that amp makers use a standard, uniform and verifiable method of calculating their wattage claims, and this method is a good place to start that conversation.
