A recent trip to the attic reminded me of the long-forgotten ritual of an annual proof-of-performance, required of all stations by the FCC. Sitting in a dusty corner was a B&W 410 distortion meter and voltmeter. Time to bring it down for a general cleanup and photoshoot.
The theory of total harmonic distortion measurement is simple. Feed a low-distortion sine wave into the device under test. Connect the output to the distortion analyzer. Filter out the fundamental frequency. What's left is the second- and higher-order harmonics plus noise. Of course, when theory is put into practice, things can get complicated.
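Those steps translate directly into math. As a rough sketch (not the 410's analog circuit, which nulls in hardware), here is the same filter-out-the-fundamental idea done numerically: synthesize a hypothetical device output with 1% second harmonic, zero the fundamental's FFT bin, and take the ratio of residual RMS to total RMS. The sample rate and test frequencies are arbitrary illustrations.

```python
import numpy as np

fs = 192_000                      # sample rate, well above the audio band
n = 19_200                        # 0.1 s -> whole cycles of 1 kHz, no leakage
t = np.arange(n) / fs
# Hypothetical device output: 1 kHz fundamental plus 1% second harmonic
x = np.sin(2*np.pi*1000*t) + 0.01*np.sin(2*np.pi*2000*t)

# "Filter out the fundamental" in the frequency domain: zero its FFT bin
spectrum = np.fft.rfft(x)
fund_bin = round(1000 * n / fs)   # bin 100 for these parameters
spectrum[fund_bin-1:fund_bin+2] = 0
residual = np.fft.irfft(spectrum, n)

# Distortion reading = RMS of what's left over RMS of the whole signal
thd_n = np.sqrt(np.mean(residual**2)) / np.sqrt(np.mean(x**2))
print(f"THD+N = {100*thd_n:.2f}%")
```

Note that what a nulling meter like the 410 actually reads is THD+N, since noise survives the null right along with the harmonics.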
Many types of filters and amplifiers have been used in distortion analyzer design, each with its own strengths and weaknesses. The B&W 410 uses a tunable Wien bridge bandpass filter, one of the simplest and easiest circuits to use. It has been used in numerous analyzers, from HP to Heathkit.
The Wien bridge uses a series R-C and a parallel R-C circuit to create a filter with a very broad peak. Usually, the two resistors have equal values, as do the two capacitors, which simplifies the circuit. The theory is simple: the phase of the filtered fundamental signal is zero when compared to the input signal, but only at the center frequency of the filter. A Wien bridge circuit has an insertion loss of about 10 dB, so the input signal needs to be attenuated by the same amount before it is combined with the filtered signal.
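The zero-phase, roughly 10 dB numbers fall straight out of the network's transfer function. A quick sketch with illustrative component values (the R and C here are arbitrary, not the 410's actual parts) evaluates the equal-R, equal-C Wien network at its center frequency f0 = 1/(2πRC):

```python
import numpy as np

R, C = 10e3, 0.01e-6              # illustrative values; f0 works out near 1.6 kHz

def wien(f):
    """Transfer function of the equal-R, equal-C series/parallel Wien network."""
    w = 2*np.pi*f
    z_series = R + 1/(1j*w*C)       # R in series with C
    z_parallel = R/(1 + 1j*w*R*C)   # R in parallel with C
    return z_parallel/(z_series + z_parallel)

f0 = 1/(2*np.pi*R*C)
h = wien(f0)
print(f"gain at f0: {abs(h):.4f} ({20*np.log10(abs(h)):.1f} dB), "
      f"phase: {np.degrees(np.angle(h)):.1f} degrees")
```

At f0 the gain is exactly 1/3, which is -9.5 dB, matching the "about 10 dB" insertion loss, and the phase shift is zero, which is what makes the subtraction against the attenuated input cancel the fundamental.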
Specifications of the 410 are typical for 1960s gear. Distortion could be measured from 20 to 20,000 Hz with an accuracy of +/- 5% of full scale. The voltmeter response is 20 Hz to 200 kHz, with an accuracy of +/- 5%. Note that the bandwidth of the instrument needs to be significantly higher than the frequencies being measured. Most harmonic distortion falls in the 2nd through 5th harmonics. Above that, it's mostly noise. A rule of thumb is that the circuit bandwidth needs to be about 10 times the fundamental frequency.
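The arithmetic behind the rule of thumb is trivial but worth spelling out:

```python
def min_bandwidth(f0_hz, factor=10):
    """Rule-of-thumb measurement bandwidth: about 10x the fundamental."""
    return factor * f0_hz

for f0 in (20, 1_000, 20_000):
    print(f"fundamental {f0:>6} Hz -> 5th harmonic {5*f0:>7} Hz, "
          f"10x bandwidth {min_bandwidth(f0):>7} Hz")
```

At the top of the audio band, a 20 kHz fundamental puts the 5th harmonic at 100 kHz and the 10x rule at 200 kHz, which is exactly where the 410's voltmeter response tops out.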
Maintenance of the 410 was typical for 1960s test equipment. Occasionally check the six vacuum tubes, log socket voltages, and investigate errant readings. Calibration was also simple; there were only three tweaks. Set voltmeter calibration to full scale with a 100-millivolt signal at 1 kHz. Set voltmeter frequency response by checking the 100 Hz 1-volt reading, feeding in 200 kHz at 1 volt, and adjusting a trimmer cap until they're the same. Set the null circuit trimmer by matching the 100 Hz and 60 kHz full-scale readings. The 410 was a real workhorse, and survived well with this minimal attention.
Until the advent of auto-nulling with the HP 334 series, making distortion measurements was a time-consuming process, and this was the case with the 410. First, the function switch was set for the proper frequency range. Next, the calibrate control was set for a full-scale reading. Then, the range switch was set to the 100% distortion position, and the coarse frequency control was adjusted for a null. Then the coarse amplitude was adjusted. Are you getting tired yet? Keep stepping down the range switch while alternately adjusting coarse amplitude and frequency for a null. When that gets too bumpy, do the same thing with the fine amplitude and frequency controls. When you can't get it any lower, read the percentage distortion off the meter.
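That alternating coarse/fine dance is essentially coordinate descent on two knobs. Here's a toy simulation of the idea (not the 410's bridge circuit, and the signal parameters are invented): cancel the fundamental of a test signal by alternately tuning a frequency knob and an amplitude knob in ever-finer steps, then read the residual as percent distortion.

```python
import numpy as np

fs, dur = 192_000, 0.05
t = np.arange(int(fs*dur)) / fs
# Hypothetical device output: 1 kHz fundamental plus 0.5% third harmonic
x = np.sin(2*np.pi*1000*t) + 0.005*np.sin(2*np.pi*3000*t)

def residual_rms(f, a):
    # The "null": subtract the operator's current fundamental estimate
    return np.sqrt(np.mean((x - a*np.sin(2*np.pi*f*t))**2))

f, a = 1010.0, 0.8                       # deliberately detuned starting point
for step in (10.0, 1.0, 0.1):            # coarse -> fine passes
    # Frequency knob: pick the candidate with the deepest null
    f = min((f + d*step for d in range(-9, 10)),
            key=lambda ff: residual_rms(ff, a))
    # Amplitude knob: same thing at 1/100 the step size
    a = min((a + d*step/100 for d in range(-9, 10)),
            key=lambda aa: residual_rms(f, aa))

thd = residual_rms(f, a) / np.sqrt(np.mean(x**2))
print(f"null at {f:.1f} Hz, a = {a:.3f}, distortion reading = {100*thd:.2f}%")
```

The simulation converges on the 0.5% harmonic that was put in, which is the whole point of the procedure: once the fundamental is nulled as deep as it will go, the meter is reading only what the device under test added.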
The B&W 410 was part of the Gates Radio “proof kit”, which also included the B&W 210 audio oscillator and the Gates M3625 gain set, along with rudimentary instructions on how to do an audio proof. The gain set consisted of a VU meter, a 2 dB/step Daven attenuator, and a set of plug-in attenuators.
The 410 was among the most common instruments used for audio proofs in the 1960s. B&W never developed solid-state test equipment, and the standard-bearers that followed were the HP 333A and 334A series auto-nulling distortion meters. Then came the age of Potomac Instruments and the AA-51 Audio Analyzer. Today, software-based analyzers are popular, although getting sufficient bandwidth through sound cards can be an issue. There are some good software analyzers available as free downloads.
By the 1970s, the proof-of-performance had become little more than a token gesture. The Commission's rules regarding performance standards for the proof had been drawn up in the 1930s and never upgraded. Meeting the FCC's specs was probably demanding for vacuum tube equipment of that vintage, but by the 1970s most stations had at least partially transitioned to solid-state equipment, especially in the audio chain. Even 1960s vacuum tube audio gear was better than what was available in the 1930s. The result was that stations were capable of sounding significantly better than they had in the 1930s. Pleas to the Commission to upgrade proof standards to reflect the improved technology went unanswered.
Finally, in the mid-1980s, Broadcast Engineering magazine took matters into its own hands. It published a series of articles and forms for both AM and FM proofs that outlined both good and excellent performance standards for state-of-the-art broadcast gear. Of course, participation was purely voluntary, but it gave the industry some common standards to raise the bar on technical performance.
Our B&W 410 came from a college surplus grab in the late 1970s. The date code on some of the parts suggests it was manufactured around 1967. It was kept in top condition and used for contract work until the early 1990s, when it was retired.