An Audio T.H.D. Analyser


This is a complete, self-contained T.H.D. analyser for measuring the distortion of audio circuits to levels below 0.001%. The design goal is to have an analyser with a residual noise and distortion "measurement floor" for the most part at or below 0.0001% (-120 dB). Internally, the signal source section consists of an ultra-low-distortion audio oscillator and a variable-gain differential-output line driver with a selectable output impedance of either 50 ohms, 150 ohms or 600 ohms. The measurement section consists of a differential-input signal attenuator, preamplifier and buffer, an auto-tuning notch filter, a distortion residual amplifier with automatic level setting, measurement bandwidth filters, and a true R.M.S. detector and analogue meter movement for measuring/indicating the amplitude of the distortion residual as a percentage of the level of the fundamental signal applied to the input. The measurement bandwidth is ten times the fundamental frequency.

This analyser project is in fact a "baby step" towards a much more complicated analyser project I have been dabbling with for some time, which is essentially a homebrew work-alike to the Audio Precision System One, with my own operating software for the computer control. That project kept stalling, for one thing, simply because I lacked the measurement capability to adequately and straightforwardly evaluate the distortion performance of the various modules and circuits under development, of which the low-distortion audio frequency sections of that project are to be comprised. The development of a suitable alternative to Audio Precision's patented hybrid "MDAC" low-distortion multiplying digital-to-analogue converter, which is used as the resistive element (for high resolution, digitally-controlled frequency tuning) in the System One's state-variable filters, is a case in point.

While the project described here will eventually be made relatively superfluous to my needs if my above-mentioned plans for the much more complicated and capable unit pan out, I still didn't want to invest a great deal of time and energy into cobbling together something of limited capability and usefulness. For the instrument detailed here I've taken design cues from various sources, including the (very comprehensive) A.P. System One technical manual and the T.H.D. analyser design by Cordell as published in Audio magazine, starting July 1981. Like the latter, the design presented here operates at fixed, switch-selectable spot frequencies covering the audio frequency band (and beyond), rather than at a continuously variable operating frequency, though with some necessary rationalizations.

The design by Cordell, while relatively simple and straightforward electrically, was complicated a great deal in its implementation, principally by the rotary-switch switching, with multiple gangs and wafers, of the passive components responsible for setting the loop gain and operating frequency of the state variable filters used in the oscillator and notch filter sections. This isn't a constructional and parasitic-mitigating layout design issue I felt any desire to contend with, especially when a design goal was to make use of the superior distortion and noise performance of current-generation op-amps from National's "LME" range, which have over five times the gain bandwidth product of the old and venerable NE5534.

All of the switching in and out of passive components in this design is achieved via small-signal relays in the distortion-sensitive signal handling paths, and via analogue multiplexers or JFET switches in almost all of the rest. These small-signal relays and analogue switches, along with the decoding circuitry required to drive them, add considerably to the electrical complexity of the design, as well as to the P.C.B. real estate required and the overall cost. It is for this reason that the frequency coverage of the instrument, from 10 Hz to 50 kHz, is provided by only 12 spot frequencies, in four decade ranges, in 1, 2, 5 steps. The Cordell design, in comparison, provided 11 spot frequencies in each decade range, but the signal relay complement required to switch so many steps becomes prohibitive. In light of the improved overall performance of this contemporary design, however, the limitation of only 12 spot frequencies isn't really much of a practical hindrance to the instrument's usefulness.
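The 1, 2, 5 spot-frequency scheme described above can be sketched in a couple of lines (Python here purely for illustration):

```python
# Generate the 12 switch-selectable spot frequencies: 1, 2, 5 steps
# across four decade ranges, covering 10 Hz to 50 kHz.
spot_frequencies = [step * decade
                    for decade in (10, 100, 1000, 10000)
                    for step in (1, 2, 5)]
print(spot_frequencies)
# [10, 20, 50, 100, 200, 500, 1000, 2000, 5000, 10000, 20000, 50000]
```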

For the signal-input handling section of this design, I decided that a balanced input was mandatory, along with a wide, switch-selectable input voltage range. The design here can handle input voltages from 100 mV to 100 V R.M.S., switch-selectable in ten 1, 2, 5 "nominal amplitude" steps. A variation of a good +/-10 dB about each nominal input voltage level is catered for by the automatic level-setting circuitry in the distortion residual amplification path - the only exception being on the 100 V range, due to high voltage limitations. For testing the distortion of audio power amplifiers an external high-power dummy load is required, but for testing line-level circuits the signal amplifier board incorporates low-power dummy loads of both 600 and 150 ohms, which can be switched in when required. These load resistances are switched into circuit via small-signal relays, and a current monitoring circuit will trip and latch them out of circuit should a high voltage input signal be applied with them accidentally switched in.

On the distortion residual metering side, full scale sensitivity of the R.M.S. reading meter is switch selectable in fifteen 1, 2, 5 steps, over five decade ranges, from 0.001% to 50% T.H.D. The amplified distortion residual signal is made available at a front panel BNC connector, for examination with an oscilloscope and/or a spectrum analyser.

Design Files


Signal Source Section

Generator Board - Schematic Diagram -- (Updated 26 July 2012)
Generator Board - PCB Gerber Files -- (Updated 26 July 2012)

Line Driver Module - Schematic Diagram -- (Updated 26 July 2012)
Line Driver Module - PCB Gerber Files -- (Updated 26 July 2012)

Signal Source Section Power Supply Module - Schematic Diagram -- (Updated 26 July 2012)
Signal Source Section Power Supply Module - PCB Gerber Files -- (Updated 26 July 2012)

Generator Section Wiring Diagram
Generator Section Front Panel Label -- (Updated 22 July 2012)


Analyser Section

Differential-Input Signal Amplifier Board - Schematic Diagram -- (Updated 4 November 2012)

Notch Filter and Distortion Residual Amplifier Board - Schematic Diagram -- (Updated 19 March 2012)

Auto-Tune, Auto Level-Set and Status Indicator Board - Schematic Diagram -- (Updated 25 June 2012)

R.M.S. Distortion Metering and Measurement-Bandwidth Filter Board - Schematic Diagram -- (Updated 15 March 2012)

Analyser Section Power Supply Module - Schematic Diagram

Analyser Section Wiring Diagram
Analyser Section Front Panel Label


I have made some reasonable progress over the last few days on the circuit board revisions, after a fair bit more experimentation with the operating prototype. I have also started tidying up this web page. The revised schematics and PCB layouts currently completed/verified can be downloaded above in the "Design Files" section. I will be updating the Design Files section as I get each board built up and tested. After a lot of playing around at the bench, one thing in particular has become clear - namely how relatively trivial it is to build a distortion analyser with a better than 0.001% THD+N measurement floor, and how exponentially harder it is to build one with a better than 0.0001% THD+N measurement floor. Below -120 dB every ground loop matters and mains hum is a perpetual pain. Due to the latter issue I have decided to abandon my aluminium chassis work and begin on a new chassis fabricated from sheet steel. A sheet-metal aluminium chassis simply provides zippo shielding from low frequency electromagnetic radiation. I will have to go out next weekend and pick up a sheet of Galvabond. Suffice to say all this re-design means there is now even more work to do.

The revised PCB layout for the analyser section now has the analyser section (minus the power supply and the differential signal input/amplifier PCB) accommodated on three completely separate PCBs. The first PCB (schematic here) contains the tuned state variable filter and differential amplifier that comprise the fundamental notch filter, as well as the high-gain, range-switched amplification path for the distortion signal. This PCB therefore contains, in isolation, all of the analogue circuitry which has proven particularly sensitive to noise pickup. Those familiar with the Cordell circuit will notice that I have arranged the state variable filter stages a little differently. This permits the loop inverter stage, into which the signal to be measured is injected, to operate with shunt feedback only, eliminating it as a source of common-mode distortion. Also with the aim of eliminating common-mode distortion, the differential "nulling" amplifier is presented with equal source impedances at each input. The component values of the analogue multiplier stages have also been modified to run a significantly lower AC voltage at the drains of the control JFETs, taking advantage of the low voltage noise of the LME op-amps in this low impedance environment. This significantly lowers multiplier distortion caused by the non-linear drain-source resistance of the JFETs.

The second PCB contains the auto level-set circuitry (A.G.C. control and V.C.A.s), the auto-tune circuitry and the status indicator circuitry. In order to mitigate ground loop issues between these two PCBs, all AC sensing and DC control signals are interfaced with individual differential-input instrumentation amplifiers - this permits each signal to be sensed with respect to its reference ground at the source, without requiring a signal ground connection between the separate PCBs.

The third PCB (schematic here) contains the true-R.M.S. detector / distortion metering circuitry and the measurement bandwidth filters. For the bandwidth filters I decided on a design with little compromise, having individual, well-defined responses even on the 20 kHz and 50 kHz ranges. For each of the 12 spot measurement frequencies there is a separate band pass filter, automatically switched into the amplified residual path, that gives a measurement bandwidth equal to ten times the spot measurement (fundamental) frequency. This permits accurate measurement up to the tenth harmonic of any measurement frequency, including 20 kHz and 50 kHz (no, you will not get that with a PC sound card!), with a minimum of noise.

Each band pass filter is made up from a 2-pole high pass section having a -3 dB frequency equal to the fundamental frequency, cascaded with a 2-pole low pass section having a -3 dB frequency equal to ten times the fundamental frequency. The high pass section has a Q of 0.866, which gives a small gain peak of 0.5 dB at the 2nd harmonic frequency. This compensates for the 0.5 dB loss at the second harmonic caused by the slope of the fundamental notch filter.
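As a quick sanity check on the figure quoted above, the gain of a 2nd-order high-pass section with Q = 0.866 can be evaluated at twice its natural frequency. This assumes the textbook transfer function H(s) = s^2 / (s^2 + s*w0/Q + w0^2), with the natural frequency taken as the fundamental:

```python
import math

def highpass2_mag(f, f0, q):
    """Magnitude of a 2nd-order high-pass section with natural
    frequency f0 and quality factor q, evaluated at frequency f."""
    s = 1j * f / f0          # normalized complex frequency jw/w0
    return abs(s * s / (s * s + s / q + 1))

f0 = 1000.0                  # e.g. the 1 kHz spot frequency
peak_db = 20 * math.log10(highpass2_mag(2 * f0, f0, 0.866))
print(f"{peak_db:.2f} dB")   # ~0.48 dB boost at the 2nd harmonic
```

which agrees nicely with the quoted 0.5 dB compensation figure.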

The low pass section has a Q of 0.707, for a Butterworth response with a maximally flat pass band. The design by Cordell used a Bessel low pass (Q of 0.5) for minimal phase shift, as "The excellent phase response characteristic of this design minimizes waveform distortion of the distortion products, preserving the accuracy of the visual display." This sounded reasonable, but after investigating the Bessel vs. the Butterworth response I concluded that the difference in waveform distortion for typical, real-world distortion signals is very slight indeed, and the quote a little exaggerated. There isn't an appreciable phase difference between the two filter alignments until beyond the 5th harmonic, and even at the tenth it is only some 20 degrees.

I conducted the comparison in LTspice, generating typical "residual" waveforms and passing them through each filter simultaneously. I will post the sim files in the near future, after they are tidied up for presentation. Due to its maximally flat pass band response and sharper HF cut-off/attenuation, I decided that the Butterworth alignment was the better choice, although there really isn't that much in it overall, to be honest. Since some will likely disagree anyway, I'll compute the alternative component values for a Bessel low pass alignment when I get the time.

Right now I'm working on getting the notch filter PCB and the filter/RMS detector PCB's up and running on the bench.


Annoyingly, I've had to delete and upload revised design files for the metering board that I posted up two nights ago. Giving the design one last look-over before etching the PCB, a design issue became evident. I have now modified the manner in which the filter outputs are multiplexed, such that the input capacitances of the DG403 analogue switches are no longer all effectively connected to a common output line, putting a pole in the frequency response that was causing an undue HF response error with the uppermost (50 kHz-to-500 kHz) band pass filter switched in.


Some more schematic and PCB design files uploaded.


Just uploaded the design files for the notch filter / distortion amplifier PCB.


Well this evening I finally got the last component soldered on to the Notch Filter & Distortion Amplifier PCB and completed the preliminary testing. It worked beautifully first go, almost beyond my expectations. Here is a picture of it operating on my workbench:

The notch filter & distortion amplifier PCB is the board in the middle. The board to the left is the state variable oscillator board, and the "dead bug" rats nest to the right is the prototype auto-tune board. This prototype board contains the two synchronous detectors which servo-control the state variable filter of the notch filter board to perfectly null out the fundamental. The auto-tune board contains switching circuitry which radiates a lot of noise. Set to the 0.001% full scale distortion range, the broadband distortion amplifier operates at a gain of 5000 (74 dB) and is therefore very susceptible to noise pickup. The photo shows how I had to orientate the auto-tune rats nest to minimize its coupling of noise into the distortion amplifier. In fact I should have cut the interconnecting cables somewhat longer, because a bit more distance between the boards would have helped. I also had to locate the oscillator board at a distance, otherwise the distortion amplifier would begin picking up the fundamental. Suffice to say this PCB will have its own separate compartment in the new (steel) chassis to be fabricated.

Anyway, here is the result, operating at 1 kHz on the most sensitive (0.001% full scale) distortion range:

The vertical sensitivity for the fundamental is 1 V per division and for the residual signal 20 uV per division. The measurement bandwidth for this test was (almost) DC to 10 kHz, the upper cut-off simply provided by a single-pole filter (a 1k6 resistor and a 10 nF capacitor) soldered to the PCB.

The fundamental measured 1.84 V rms and the residual measured 1.6 uV rms. This equates to -121 dB, or 0.000087%!
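For reference, converting a measured residual into dB and percent is simple arithmetic; this little helper (illustrative only, not part of the instrument) reproduces the figures above:

```python
import math

def residual_to_thd(fundamental_vrms, residual_vrms):
    """Express a distortion residual as a ratio in dB and in percent."""
    ratio = residual_vrms / fundamental_vrms
    return 20 * math.log10(ratio), 100 * ratio

db, percent = residual_to_thd(1.84, 1.6e-6)
print(f"{db:.1f} dB, {percent:.6f} %")   # -121.2 dB, 0.000087 %
```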

The amazing thing is that it could actually have been lower, had I mounted the noisy auto-tune rats nest further away. Besides that, the residual signal is in large part noise and hum. The noise component will be reduced further still in the completed analyser, which will roll off the bandwidth at a two-pole rate, and the steel enclosure should eliminate the hum. One funny observation was that the distortion residual would occasionally jump up to around 2 uV rms at seemingly random intervals, but reliably settle back to 1.6 uV rms. I had a suspicion, which quickly proved correct. Turning on a radio of mine and tuning to the local AM broadcast station (ABC Radio National at 891 kHz), I could sit back and watch the needle of my true rms meter, monitoring the distortion residual, move in sympathy with the loudest audio passages of the broadcast radio program.

Not to get too big a head, but much to my satisfaction this project is progressing nicely toward what would surely have to be the most advanced stand-alone analogue THD analyser described for home construction so far by anyone, anywhere.

Here is how far I got with the assembly of the Metering Board. Tomorrow evening I will de-solder all of the LM833's and then solder them back on the correct way round - Doh!

I still have some more experimentation to do before the auto-tune circuitry is finalized. Once the metering board is completed it will be much easier for me to complete the required tests on all operating frequencies and distortion ranges. What I basically need to test/experiment with are the synchronous detector / servo amplifier time constants, frequency compensation and range-switched loop gain, making sure that the fundamental rejection is consistent from 10 Hz to 50 kHz with no stability issues. One thing I am contemplating at the moment is having switched time constants for each decade range, so that the settling/auto-tune time is conveniently quicker when measuring at higher frequencies. However this will add some further complexity to the design.

At the moment the time constants are fixed and, by necessity, slow, and it takes quite a few seconds for the servo loops to tune out the fundamental when switching ranges. As a matter of fact it is quite satisfying to watch this happen on an oscilloscope, reliably and consistently to sub-ppm levels, manually turning up the vertical sensitivity switch for the channel monitoring the distortion residual to track it as the servo loops slowly tune it out into oblivion!


Not a very exciting update, but having now made a start on the preliminary PCB layout for the Differential Line Driver, I have noticed a couple of errors on the posted schematic. I simply don't have the time to amend the schematic drawing right now, but the errors are: REG1 and REG2 should be labelled LM78L15 and LM79L15 respectively (instead of LM78L05 & LM79L05), and R8 should be labelled 2k, not 1k. The PCB layout should be ready for etching later in the week. The metering board component stuffing is almost complete now.


I've been doing some serious head scratching over the last 24 hours on the Line Driver design. Not that there was anything wrong with the previous design, but I have now come up with one that I think is significantly better. Here is the revised schematic diagram. This revised design is the result of a comprehensive noise analysis and features significantly improved noise performance. I achieved this with the use of parallel gain stages, borrowing ideas from Self's Small Signal Audio Design. Another major modification is the addition of a balanced input stage.

The balanced input stage eliminates a major implementation headache - namely, how to couple the ground-referenced signal output from the Generator board to the signal input of the Line Driver board while avoiding ground loop issues. The power supply ground return wire of the Line Driver board carries muck such as the LME49600 half-wave class AB currents (especially when driving low impedance loads at maximum output). We do not want the voltages induced across the power supply ground return wire by these currents to become part of the input signal. My original intent was to solve this problem by constructing completely separate (galvanically isolated) power supply modules for the Generator board and the Line Driver board, making the only ground connection between the two boards the shield of the coax signal cable connecting the Generator board signal output to the Line Driver board signal input, which essentially carries only the signal current.

With this revised design of the Line Driver board, however, I won't have to resort to this measure and expense - both boards can now be safely powered from a common, regulated power supply. Any additional stage, such as a balanced input stage, must however be added with caution, because of the potential to worsen or perhaps even ruin the overall design's distortion and noise performance. To this end a relatively complex circuit is required; the balanced input stage is comprised of U1 through U4 and the associated 0.1% tolerance resistors. I don't have the time to post the noise analysis / calculations here now, but four parallel stages were all that was required to make the noise contribution of the balanced input stage to the complete circuit noise completely negligible. The 2 V rms signal output from the Generator feeds input terminals J1 and J2, such that the "signal" ground is referenced/sensed back at the Generator board. The LME49720 is a 55 MHz GBWP op-amp, operating here (U1B through U4B) at a noise gain of 2.27, thus returning a closed-loop bandwidth of 24 MHz. Such a high bandwidth, combined with the specified 0.1% tolerance feedback resistances, ensures that the balanced input stage retains excellent signal-input CMRR from DC to HF.
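A quick back-of-the-envelope check of the bandwidth and noise arithmetic above (Python used purely as a calculator; the sqrt(N) figure assumes the parallel stages' noise sources are uncorrelated):

```python
import math

gbwp = 55e6        # LME49720 gain-bandwidth product, Hz
noise_gain = 2.27  # noise gain of the U1B-U4B stages, from the text
bw = gbwp / noise_gain
print(f"closed-loop bandwidth = {bw / 1e6:.1f} MHz")   # ~24.2 MHz

# Averaging N identical parallel stages reduces their uncorrelated
# voltage noise contribution by a factor of sqrt(N):
for n in (1, 2, 4):
    print(f"{n} stage(s): {1 / math.sqrt(n):.2f} x single-stage noise")
```

With four stages the input-stage voltage noise is halved, which is consistent with the parallel-stage approach borrowed from Self.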

A further measure I have taken is to actually limit the HF content of the power supply ground wire return currents with LC supply rail filters L1/C5, L2/C6. The DCR of the inductors along with the ESR of the capacitors provide greater than critical damping.
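For a series-inductor / shunt-capacitor supply filter, "greater than critical damping" corresponds to a damping ratio of at least one. The component values below are purely illustrative assumptions (the actual L1/C5, L2/C6 values are not restated here):

```python
import math

# Hypothetical example values, NOT the schematic values:
L = 10e-6      # filter inductance, H (assumed)
C = 470e-6     # filter capacitance, F (assumed)
dcr = 0.5      # inductor DC resistance, ohms (assumed)
esr = 0.3      # capacitor ESR, ohms (assumed)

# Approximate damping ratio of the series-RLC loop, lumping the
# DCR and ESR together; zeta >= 1 means the filter cannot ring.
zeta = (dcr + esr) / 2 * math.sqrt(C / L)
print(f"zeta = {zeta:.2f}")    # > 1, i.e. over-damped
```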

All up, I think this is now a rather refined line driver design that fulfils my original design specification and should perform exceptionally well; one that can't really be improved upon to any significant degree without a drastic increase in complexity. With any luck I'll have the PCB layout knocked off before the Easter long weekend.


It seems that I just can't help myself; I have revised the Line Driver schematic yet again. As an economy measure, in the last revision, I did not buffer the ground sense input (pin J2) of the balanced input stage. Doing so saved on op-amp packages, but very slightly compromised the ability of the balanced input stage to reject ground loop currents. Given the overall cost and complexity of the complete analyser project, I decided in the end to do away with this compromise. The IC package count has been increased by four. The balanced input stage is now comprised of ICs U1 through U8. The total number of parallel stages has been increased from four to five, to compensate for the noise contribution of the additional buffers.


It's been a month since the last update. I simply have not had the time due to too many other things on my plate. I finally got stuck into the project again this evening. Firstly, I completed loading and testing the metering/filtering board. Here it is, wired up and operating on my workbench:

With this board wired into the analyser I proceeded with the necessary bench testing as detailed in the previous updates; namely testing the residual distortion + noise performance at all operating frequencies (not just 1 kHz) and working out the optimal/final range-switched time constants for the auto tune servo loops.

However I ran into a snag along the way: a minor issue with the Generator/Oscillator board became evident. It turns out that the multiplier stage did not have enough control range at the upper end of the frequency range (due to an apparently decreasing amplitude control loop gain with increasing frequency) for the Generator to operate reliably at the highest selectable frequency of 50 kHz. Oscillation at 50 kHz would intermittently die out. Investigation revealed that at 50 kHz the amplitude control loop servo amplifier/integrator was driving the gate of the multiplier control JFET at a DC potential of ~200 mV. The DC potential at the gate should in fact always be a few volts negative (as it indeed was at the lower frequencies).

The fix, however, was a simple one: I just increased the feedback resistor of the multiplier stage from 200 R to 1 k, increasing the loop gain five times. This was more than enough to bring the servo loop back into range at 50 kHz, with a couple of volts negative or so at the JFET gate. However, there is a trade-off here. More loop gain in the multiplier stage means the injection of more control loop ripple, and thus potentially worse distortion performance at all frequencies, and this in fact was the outcome. At 1 kHz the distortion + noise residual of the analyser increased from -122 dB to -118 dB (0.00013%). I'm not entirely sure what is going on here. I may even have soldered in the wrong type of JFET (it's hard to tell with these SMD parts). I don't recall having this issue with the MK1 Generator board, and according to my sums there should have been adequate control range in the multiplier stage. For the time being, until I figure out exactly what is going on, I have removed the schematic diagram of the Generator board, pending a possible revision to the relevant component values.

Anyway, after increasing the multiplier loop gain five times so that the Generator operated properly at all frequencies, and with the bandwidth filters of the metering board wired in, I was able to progress with my residual testing at all frequencies. Here is the preliminary result at an operating frequency of 20 kHz:

The Generator output (measuring 1.7 V rms), at 1 V per vertical division, is shown along with the residual signal (measuring 2.4 uV rms), at 20 uV per vertical division. As can be seen, the residual is almost entirely noise. 1.7 V / 2.4 uV equates to -117 dB, or 0.00014%. For an input signal as low as 1.7 V rms, I think this is an exceptional result. I currently have the PCB layout for the Line Driver board 90% complete, and am eager to get this module of the analyser up and running, inserted between the Generator board and the distortion board. It will be interesting to see how the analyser's residual noise and distortion looks with the Generator output amplified to a higher level.


Generator problem solved!
State variable oscillator builders - beware of ultra high GBWP opamps!

Well, after much banging of my head against the bench top, this evening I finally solved the puzzling issue of the Generator refusing to oscillate at high frequencies (for which there was a delightfully simple cure, about to be explained), as mentioned in yesterday's update. What I have found should be of interest to all experimenters with state variable oscillators. The problem, in a nutshell, begins with the fact that, theoretically, a state variable oscillator cannot in fact oscillate with ideal building blocks. To go back to basic theory, the positive feedback loop of a state variable oscillator consists of two (typically identical, but not necessarily so) integrator stages connected in series with an inverter stage, as depicted in the simplified diagram:

Each integrator provides 90 degrees of phase shift while the inverter stage provides 180 degrees of phase shift, for 360 degrees in total. But here is the problem - the description just given, which is the typical textbook one, is not entirely correct. An ideal integrator stage cannot, in fact, provide 90 degrees of phase shift. It can get extremely close, but it can never actually reach exactly 90 degrees. As a consequence, the phase shift around the theoretically ideal loop will always be slightly short of the 360 degrees required for actual oscillation.

So then, how does a state variable oscillator manage to work at all in real life? The answer is an obvious one. The real-world op-amps typically used to make these oscillators have finite bandwidths. The tiny extra phase shift required to reach the full 360 degrees needed for oscillation is simply provided by the limited bandwidths of our building blocks. This, however, is something that has puzzled me ever since my first study of state variable oscillator theory, simply because I have not seen the issue actually discussed or quantified anywhere. The question then is: when, in state variable oscillator design, does this theoretical consideration become one that needs to be considered on a practical level? Well, I have found out - when using 55 MHz GBWP op-amps instead of the old and venerable 10 MHz GBWP NE5534!
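A crude way to put numbers on this is to model each closed-loop op-amp stage as a single pole at its closed-loop bandwidth, whose excess phase lag at the oscillation frequency is atan(f_osc / f_cl). This is my own first-order estimate under that single-pole assumption, not a rigorous analysis:

```python
import math

def extra_lag_deg(f_osc, f_closed_loop):
    """Excess phase lag (degrees) of a stage modelled as a single
    pole at its closed-loop bandwidth."""
    return math.degrees(math.atan(f_osc / f_closed_loop))

f_osc = 50e3   # highest operating frequency, Hz
# Inverter stage at a noise gain of 2: closed-loop BW = GBWP / 2
print(f"NE5534   (10 MHz GBWP): {extra_lag_deg(f_osc, 10e6 / 2):.3f} deg")
print(f"LME49710 (55 MHz GBWP): {extra_lag_deg(f_osc, 55e6 / 2):.3f} deg")
```

On this simple model the faster op-amp contributes roughly five times less of the "extra" phase lag the loop relies on, consistent with the behaviour described above.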

To reiterate the problem I was having: my state variable oscillator refused to oscillate reliably at the highest frequency setting of 50 kHz unless I significantly increased the voltage gain of the multiplier stage used for amplitude stabilization - to a level that, theoretically, was much higher than what should actually be required. This is something that really bugged me, because increasing the multiplier gain beyond that which should be adequate measurably compromises the oscillator's noise and distortion performance. I spent hours going over my operating circuit board with oscilloscope and multimeter. I checked and double-checked every single component value; everything was perfect, but the circuit still would not oscillate reliably at 50 kHz with the "correct" and suitable value of voltage gain in the multiplier stage. I was completely baffled as to why, until I realized that to oscillate at 50 kHz, my multiplier stage had to be configured with an excessively high voltage gain so that it could feed back, positively (the signal probed at the output of the multiplier stage being in phase with the signal at the input), a high level of signal into the inverter stage.

Then the penny dropped. The state variable loop simply refused to oscillate without an excessive amount of extra positive feedback via the amplitude-controlling multiplier stage because the 55 MHz LME49710 op-amps I had constructed the thing from were too close to ideal, providing very little of the extra phase shift required, by virtue of their high bandwidths. But why was this only an issue at the highest operating frequencies? Simply because each integrator stage, due to capacitor non-idealities and other issues at higher frequencies, behaves even less like an ideal integrator, coming not quite as close to phase shifting exactly 90 degrees as it does at lower frequencies. I had also observed that the amplitude of the positive feedback signal, as probed at the output of the multiplier stage, increased with operating frequency. This could also be observed by probing the gate voltage of the multiplier's control JFET, as set by the amplitude stabilization loop. As the switched operating frequency was successively reduced from 50 kHz, the JFET's DC gate voltage would drop further negative, reducing the amount of positive feedback via the multiplier stage.

So, how did I solve the problem? Firstly, with an estimation. In trouble-free circuits, the inverter stage for the state variable loop is typically made with an NE5534 configured with a gain very close to -1 (the Cordell oscillator, for example). In this configuration the inverter is running at a noise gain of about 2, so the closed-loop bandwidth of this stage is 10 MHz (GBWP) / 2 = 5 MHz. In comparison, the bandwidth of my inverter stage using the LME49710 op-amp was almost 28 MHz. My inverter stage has a 3k9 feedback resistor, which, to reduce the stage bandwidth to 5 MHz, requires a parallel capacitor of 8.16 pF. For good measure I picked a 10 pF unit from my parts stock and soldered it in.
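The capacitor value quoted can be reproduced from the standard single-pole relation C = 1 / (2*pi*R*f):

```python
import math

r_feedback = 3.9e3        # the 3k9 inverter feedback resistor
f_target = 10e6 / 2       # NE5534-like closed-loop bandwidth: 5 MHz
c = 1 / (2 * math.pi * r_feedback * f_target)
print(f"C = {c * 1e12:.2f} pF")   # 8.16 pF (a 10 pF part was fitted)
```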

Much in line with my theoretical expectations, but still amazingly, the oscillator performance was transformed by this simple modification. In other words, the Generator now began to operate exactly as it should at all operating frequencies. Switched to 50 kHz, the positive feedback signal at the output of the multiplier stage dropped right back, permitting me to reduce the multiplier gain back down to the original setting. Viewing the positive feedback signal at the output of the multiplier stage, I can still observe its amplitude increasing weakly with operating frequency over the top decade range, but I guess this is to be expected, and it is now nowhere near great enough to cause concern for the stability of oscillation. The multiplier stage, at all frequencies, is now operating well within its control range, just as it should!

To sum up, through a lot of oscilloscope probing and the application of some theory I have discovered the reason behind my oscillator's initial quirk of requiring an inordinately high and detrimental amount of multiplier feedback at high frequencies: the state-variable topology is temperamental with modern op-amps whose characteristics come too close to the ideal! Fortunately the cure is simple to apply and complete. If anyone out there is already aware of the issue detailed above, has discovered the same thing in their own experiments, or can cite a textbook reference that gives some analysis and practical consideration to the state-variable oscillator's "extra" phase-shift requirement, I'd really appreciate an e-mail!


I have finally uploaded amended schematic and PCB layout files for the Generator board, as linked in the "Design Files" section at the top of this web page. In addition to the bandwidth-limiting capacitor for the inverter stage (C51), I have revised the resistor values around the multiplier stage (U1). Part of the problem I was having with the HF operation of the Generator board turned out to be that the multiplier stage resistor values were incorrect and sub-optimal all along. I made an error when drafting the schematic diagram, taking the resistor values from an obsolete simulation file generated early in the design phase. Doh! I was originally experimenting with a multiplier stage using two JFETs in parallel for the control element, and it is this circuit that I accidentally took the resistor values from. The control range of the multiplier was therefore limited, as the single JFET used in the final design could not attain a sufficiently low on-resistance. The real-life unit has been thoroughly tested on the bench at all operating frequencies with the revised component values and I can confirm that it is now working 100% as it should.

With the revised Generator board component values, here is the analyser's current residual noise and distortion performance, operating at 50 kHz:

The 50 kHz fundamental measures 1.84V RMS and the residual is shown at 10uV per vertical division.

......and here it is at 20 kHz:

The 20 kHz fundamental measures 1.84V RMS and the residual is shown again at 10uV per vertical division. The residual measures 2.2uV RMS, which is -118 dB.

Not a very good photo due to the overly bright trace intensity, but here it is operating at 1 kHz:

At 1.5 uV RMS the residual is -122dB. In all cases the measurement bandwidth, as defined by the Metering Board, is ten times the fundamental frequency.
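For anyone wanting to reproduce the arithmetic behind these figures, the residual level in dB and percent follows directly from the ratio of residual to fundamental; a quick sketch using the measurements quoted above:

```python
import math

def residual_db(residual_vrms: float, fundamental_vrms: float) -> float:
    """Residual level relative to the fundamental, in dB."""
    return 20.0 * math.log10(residual_vrms / fundamental_vrms)

def residual_percent(residual_vrms: float, fundamental_vrms: float) -> float:
    """Residual level as a percentage of the fundamental."""
    return 100.0 * residual_vrms / fundamental_vrms

# Figures from the 20 kHz and 1 kHz measurements above
print(residual_db(2.2e-6, 1.84))   # about -118 dB
print(residual_db(1.5e-6, 1.84))   # about -122 dB
print(residual_percent(2.2e-6, 1.84))
```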


Not much of an update this evening, but I can report a lot of progress in schematic drafting and PCB layout for the remaining PCBs. I have completed the Line Driver PCB, the separate power supply modules for the Generator section and the Analyser section are under way, and I am currently drafting the schematic diagram for the "Auto Tune, Auto Level Set and Status Indicator" board. At the moment, the residual noise and distortion performance (as pictured in the last update) of the prototype analyser, operating bare on my workbench, is somewhat compromised by the sub-optimal performance of the "dead bug" auto-tune module that was built in a rush just to get the basic system up and running. This was a primitive, "rough and ready" circuit soldered together without care for grounding issues and without anything as fancy as an instrumentation amplifier on any of the signal inputs to break PCB-to-PCB ground loops. It is in part the HF switching content of this board's non-isolated (hardwired) ground (as well as direct radiation from the unshielded rat's nest construction) that is measurably compromising the THD+N floor at the higher operating frequencies. I also didn't include an adequate "null" trim facility in either the "amplitude tune" or the "frequency tune" demodulators of this rat's nest construction. In some of the residual signals I can make out a quadrature fundamental component that must be due to a small, un-trimmed imbalance in the frequency demodulator.

It is for this reason I have given priority to completing the circuit design and PCB layout for the "Auto Tune, Auto Level Set and Status Indicator" board. Because of the issues mentioned above, the next PCB that I wish to get built up and wired into the operating prototype, for the next round of bench testing, is this one.
Attached below is a screen capture of the partially drawn auto-tune section of this board. For the synchronous detection of both the amplitude-tune and frequency-tune signals I have used a single ADG413 analogue switch, "clocked" by a pair of zero-crossing detectors operating from the fundamental (amplitude-tune) and quadrature (frequency-tune) signals provided by the state variable notch filter. This type of circuitry has a high input signal (distortion residual) handling capability/dynamic range, which makes for fast-tuning servo loops without the need for additional speed-up circuitry. To the same end I have also implemented range-switched time constants for the servo loops.
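As an illustrative aside (an idealised numeric sketch, not the ADG413 circuit itself, and with hypothetical signal amplitudes of my own choosing), switch-type synchronous detection amounts to multiplying the input by the sign of the reference and averaging. The in-phase channel responds to leaked fundamental while the quadrature channel ignores it, which is what lets the two servo loops tune independently:

```python
import math

def sync_demod(signal, reference):
    """Multiply the input by the sign of the reference (an ideal analogue
    switch clocked by a zero-crossing detector) and average. The DC output is
    proportional to the component of the input in phase with the reference."""
    return sum(x * (1.0 if r >= 0 else -1.0)
               for x, r in zip(signal, reference)) / len(signal)

N = 10000
t = [i / N for i in range(N)]                          # one fundamental period
fund = [math.sin(2 * math.pi * x) for x in t]          # fundamental reference
quad = [math.cos(2 * math.pi * x) for x in t]          # quadrature reference

# Hypothetical residual: a small leaked fundamental (1 mV) plus some
# second-harmonic distortion (5 mV)
resid = [1e-3 * math.sin(2 * math.pi * x) + 5e-3 * math.sin(4 * math.pi * x)
         for x in t]

print(sync_demod(resid, fund))  # ~2/pi x 1 mV: responds to the leaked fundamental
print(sync_demod(resid, quad))  # near zero: no quadrature component present
```

The averaging here stands in for the low-pass filter and servo integrator that follow the switch in the real circuit.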

I have also revamped the Auto-Level Set circuitry, which now incorporates true RMS detection of the input signal in the automatic gain control servo. As with the auto-tune servo loops, the integrator time constants are range-switched as well. However, I have not finished drawing that part of the schematic yet!

I will post the next update when this complete board is operational on my workbench. The schematic and PCB layout should only take three or four more evenings.


Finally, here is the completed / preliminary schematic for the Auto Tune, Auto Level-Set and Status Indicator board. I ended up economizing the design a bit in the end. Instead of switching a separate bank of servo time constants for each of the four decade ranges I have simplified the design to just two switched time constant sets. The time constants are only switched on the lowest decade range (10/20/50 Hz). Inter-PCB ground loops are eliminated by the use of differential (instrumentation amplifier) signal input stages.

The servo integrators for the auto-tune loops are U12 and U15. Each servo integrator is preceded by a single-pole low pass filter, and a zero in the integrator response compensates for the pole of this filter. The low pass filter prevents high frequency switching components of significant amplitude, from the synchronous demodulator output, from reaching the integrator stage, whose capacity to attenuate high frequency signals is limited by the finite operational amplifier gain-bandwidth product. One thing I have taken pains to ensure is that the bandwidth of the complete distortion amplification path, right up to the input of the synchronous demodulator, is very wide, minimising the phase shift of the distortion signal with respect to the fundamental and quadrature reference signals.

Few compromises have been made in the auto level-set section. True RMS amplitude detection of the input signal is employed. Having true RMS detection here (to complement the Metering Board's true RMS detection of the distortion residual), ensures that the measurement accuracy is not diminished when measuring input signals having very high levels of distortion.

I'm still working on the PCB layout but hope to have the board built and tested next weekend. In the meanwhile, one thing I noticed when sorting out my auto-tune servo time constants / frequency compensation is that Cordell, in the design of his state variable oscillator section, set the zero in his amplitude control integrator about 100 times lower in frequency than should be required. This unnecessarily diminishes the ability of the integrator controlling the multiplier stage to attenuate high frequency switching poop from the rectifier. Following his example, I set the zero in my integrator similarly (much too) low. Increasing this zero frequency may conceivably yield a small improvement in distortion performance at the lowest operating frequencies (which produce the greatest ripple amplitude from the 4-phase rectifier). This will be another thing to test on the bench.


Due to circumstances beyond my control, progress on this project has been halted for the last 2-3 weeks or so. To date I've been cutting all my PCB laminate and folding my sheet metal chassis work during lunch breaks or after hours at work, on a combination sheet metal guillotine/roller/press brake machine that I had access to. That machine suffered a major malfunction three or four weeks ago due to misuse, and a replacement isn't due any time soon. So, out of necessity, I ended up forking out for my own combination machine; a 30" unit, as pictured here:

It arrived last Thursday and I finished installing it in my shed and aligning the blades on Friday evening. Besides cutting PCBs and bending chassis for electronics projects, I've probably found about 50 other uses for the thing in the meanwhile! Second to my angle grinder(s), I think this is the most useful thingy I have ever stuck in my shed. I've wanted one of these for years, but was always put off by the price. Comparable machines used to sell for well over a grand, but I guess with mass manufacture in China picking up across more and more machinery product lines, they're now available for half or less of what they used to cost.

But back to the THD analyser. Over the weekend I cut, etched and drilled most of the remaining boards for the project and am making steady progress loading the components in the evenings. Now that I have my own combination machine there is no stopping me now.... I also have sufficient stock of 0.7mm thick galvanised steel sheet with which to fabricate the custom chassis, which is the plan for the coming weekend.

A major update to this project page is thus finally not too far away, but in the meanwhile here is some theory stuff further to my comments in the progress update posted March 13 on the comparison between a Butterworth and a Bessel response for the low pass filtering (bandwidth limiting) of the distortion signal. Here is my LTspice simulation circuit for the comparison:

To start with, the simulation circuit consists of an un-biased complementary emitter follower (Q1 and Q2) driven by a 100-Hz sinewave source. This simulates an amplifier stage with a high level of distortion; in this case mostly gross crossover distortion with a rich high-frequency harmonic content. The signal from Q1 & Q2 is fed through a pair of cascaded high-Q tank circuits which notch out the fundamental. Here is the waveform plot for the distorted sinewave signal both before and after being fed through the notch filter.
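For anyone without LTspice handy, the same idea can be sketched numerically. The 0.6 V dead band and 2 V drive below are my own assumptions (a rough stand-in for the un-biased follower), not the simulation's exact values; the harmonic amplitudes are pulled out with a simple single-bin DFT:

```python
import cmath
import math

def follower(v: float, vbe: float = 0.6) -> float:
    """Idealised un-biased complementary emitter follower: a dead band of
    +/-0.6 V around zero produces gross crossover distortion."""
    if v > vbe:
        return v - vbe
    if v < -vbe:
        return v + vbe
    return 0.0

N = 4096
wave = [follower(2.0 * math.sin(2 * math.pi * i / N)) for i in range(N)]

def harmonic_rms(samples, k):
    """RMS amplitude of the k-th harmonic, from a single DFT bin."""
    n = len(samples)
    bin_k = sum(s * cmath.exp(-2j * math.pi * k * i / n)
                for i, s in enumerate(samples))
    return 2.0 * abs(bin_k) / n / math.sqrt(2.0)

fund = harmonic_rms(wave, 1)
harmonics = [harmonic_rms(wave, k) for k in range(2, 16)]
thd = math.sqrt(sum(h * h for h in harmonics)) / fund
print(f"THD = {100 * thd:.1f}% (odd harmonics only, by symmetry)")
```

The dead band leaves the waveform odd-symmetric, so only odd harmonics appear, and they decay slowly with order, which is exactly why the choice of low-pass alignment downstream matters.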

With the 100-Hz fundamental notched out the signal at the output of the notch filter essentially contains only the harmonic distortion components. This distortion signal is then fed to a pair of 2-pole lowpass filters each having a corner frequency of 1-kHz (ten times the fundamental frequency). Here is the simulated frequency and phase response for the entire notch/lowpass filter path:

As can be seen, the Butterworth response is flatter in the passband and rolls off above the corner frequency at a steeper rate, but incurs a significantly greater phase shift beyond the fourth harmonic or so. But how significant is the smaller phase shift of the Bessel filter, in comparison, in terms of minimising "waveform distortion of the distortion products" and "preserving the accuracy of the visual display"? Here is the waveform comparison:

Shown here is the non-bandwidth-limited distortion signal at the output of the notch filter (red trace) along with the bandwidth-limited Butterworth and Bessel responses. There is such a small difference between the two filter alignments, as far as waveform distortion is concerned, that I don't think the Bessel filter has any real superiority in this application. Although the differences are not enormous, the Butterworth alignment, with its particularly desirable characteristics (for the application) of a "maximally flat" passband amplitude response and sharper high-frequency roll-off, is therefore the better option, in my opinion.
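The trade-off can also be seen analytically from the standard normalised 2-pole responses. This is an approximation of the simulated filters, assuming Q = 1/sqrt(2) for the Butterworth and Q = 1/sqrt(3) with a 1.272 frequency scaling for the Bessel (so that both are -3 dB at the 1 kHz corner):

```python
import cmath
import math

FC = 1e3  # corner frequency: ten times the 100 Hz fundamental

def two_pole_lp(f, f0, q):
    """Standard 2-pole low-pass: H = 1 / (1 + (jf/f0)/Q + (jf/f0)^2)."""
    x = 1j * f / f0
    return 1 / (1 + x / q + x * x)

def butterworth(f):
    return two_pole_lp(f, FC, 1 / math.sqrt(2))

def bessel(f):
    # 2nd-order Bessel, frequency-scaled so the -3 dB point lands at FC
    return two_pole_lp(f, 1.272 * FC, 1 / math.sqrt(3))

# Magnitude and phase at the odd harmonics of a 100 Hz fundamental
for k in (3, 5, 7, 10):
    f = k * 100.0
    bw, be = butterworth(f), bessel(f)
    print(f"h{k:>2}: Butterworth {20*math.log10(abs(bw)):6.2f} dB "
          f"{math.degrees(cmath.phase(bw)):7.1f} deg | "
          f"Bessel {20*math.log10(abs(be)):6.2f} dB "
          f"{math.degrees(cmath.phase(be)):7.1f} deg")
```

Running this shows the pattern in the plots: both filters track closely through the low harmonics, with the Butterworth's extra phase lag only becoming appreciable towards the corner, where its flatter passband and steeper roll-off pay off.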


Progress was delayed again due to waiting for a couple of critical parts which took longer than expected to arrive on back order, but here is an update just to show that this project really is heading down the road to completion. I have finally tested out the auto tune / auto level set and status indication board. Here it is in operation:

It worked perfectly on first turn-on, with one small exception. Given Murphy's intervention, there almost always has to be one silly mistake that creeps in. I specified the wrong instrumentation amps (U1 and U7) for interfacing the fundamental and quadrature reference signal inputs. The AD8226's are way too slooooooow. Doh! I wired the board into the analyser, set the frequency of operation to 1-kHz and everything worked fine, but something was amiss at 50-kHz. Probing the signal at the outputs of U1 and U7, I found that my sinewave at the higher frequency had turned into a partial triangle wave. The AD8226's were slew rate limiting. Their limited bandwidth also incurred a fair bit of phase shift, which degraded the ability of the synchronous detector servos to completely null out the fundamental signal from the residual. This, however, is a minor issue. I just have to swap the AD8226's for an in-amp with adequate bandwidth. With any luck I'll be able to find a relatively cheap one in the same footprint. Once I have this minor issue sorted, the full set of design files for this board will be posted.

And here is the first prototype of the line driver board. It works fine, but I am about to take to it with a solder sucker, as there are some changes (ground plane) that I wish to make to the PCB layout, which will therefore need to be etched anew.

As for sticking all of this stuff into an instrument case, I have decided to take a different direction. The complete analyser is now to be assembled into two separate instrument cases. The signal source section, comprising the Generator board, Line Driver board and power supply, will be housed in its own case as a stand alone instrument. I've decided on this approach for a couple of reasons. Firstly, there are quite a number of boards that are large and complicated, and housing the whole lot in a single box takes a bit of effort. My original intent was to fabricate a deep 3U rack case, compartmentalised with a mezzanine floor. This is OK for me because I have the workshop facilities with which to fabricate and construct such a case, but not necessarily for someone else who would like to build the design. Secondly, the signal source section, independently, makes for a useful instrument in its own right.

Right now I am working on getting the signal source section built up in its own 2U case. I'll post the next update when done.


OK, I have sorted out the instrumentation amp issue with the auto tune board. I've gone back over my notes: I was supposed to specify the AD8429 for U1 and U7, instead of the AD8226 used on other boards of the analyser. This was just a schematic drafting oversight. The AD8226 has adequate bandwidth (1.5 MHz), but a pitiful 0.4V/uS slew rate, making a distortion-free couple of volts rms at 50 kHz an impossibility. Just to be mildly pedantic, the visible "phase shift" I commented on previously wasn't due to any bandwidth limitation of the AD8226, but entirely a function of the slewing-induced distortion. Anyway, the AD8429 is a pin-for-pin compatible in-amp with far more than adequate specs, so that's that little niggle sorted. I've updated the Auto Tune, Auto Level-Set and Status Indicator board schematic diagram accordingly and have uploaded the PCB design files, all of which can now be downloaded via the links in the "Design Files" section towards the top of this webpage. Now I can get back to housing the signal generator section.
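The slew-rate arithmetic behind that conclusion is straightforward (the 0.4 V/us figure is the AD8226 spec quoted above; the 2 V rms level is a representative signal level, not a design value):

```python
import math

def max_sine_freq(slew_v_per_us: float, v_rms: float) -> float:
    """Highest frequency at which a sine of the given RMS level stays below
    the slew-rate limit: f_max = SR / (2*pi*Vpeak)."""
    return slew_v_per_us * 1e6 / (2 * math.pi * v_rms * math.sqrt(2))

def required_slew_v_per_us(f_hz: float, v_rms: float) -> float:
    """Slew rate needed for an undistorted sine: SR = 2*pi*f*Vpeak."""
    return 2 * math.pi * f_hz * v_rms * math.sqrt(2) / 1e6

# The AD8226's 0.4 V/us limit runs out well short of 50 kHz at 2 V rms...
print(max_sine_freq(0.4, 2.0))          # ~22.5 kHz
# ...whereas under 1 V/us would actually suffice at 50 kHz
print(required_slew_v_per_us(50e3, 2.0))  # ~0.89 V/us
```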


This project just keeps getting better and better. I ended up holding off on solder-sucking the components from my line driver board for the 2nd PCB revision, deciding first to run some further bench tests with the ground wiring between all of the PCBs improved. In a previous update I commented that I was eager to get the analyser operational with the Line Driver board in the loop. Up until now, all of my distortion residual measurements have been conducted at the fixed 1.84V rms signal output level provided by my Generator board. I was eager to see how much (if at all) the analyser's residual performance would be improved by amplifying this signal to a higher level to get it further out of the noise. I was also concerned with how much additional noise and distortion would be contributed by the Line Driver board itself. Here is the current test set up, with the Line Driver board wired in and the ground wiring tidied up:

Firstly, even before wiring in the Line Driver board, I was able to significantly improve the residual performance by tidying up the ground wiring. It helped a lot now that I have the Auto Tune board PCB all built up, having dispensed with the original dead bug prototype. Everything at the moment is being powered by my bench supply, which is still sub optimal, but is all I have for now until I have finished assembling the separate power supply modules for the Generator and analyser sections respectively and start installing the circuit boards into the instrument cases.

With everything bare on the bench the local AM broadcast station was giving me a bit of grief again, until I soldered some caps of a few hundred pF directly to both the signal input and the 50-ohm signal output of the Line Driver board. This of course won't be an issue when the whole lot is steel cased. With the Line Driver boards attenuator set to 0 dB, I adjusted the level control for a signal output (unbalanced) of 4V rms. Here is the resultant residual noise and distortion performance at an operating frequency of 1kHz:

The vertical scale for the fundamental sinewave is 2V per division and 20uV per division for the residual (corrected; originally posted as 10uV). The residual signal measured 2.72uV rms (corrected from the originally posted 1.36uV), which equates to -123.34dB, or 0.000068% !!

Over the weekend I will finally have the complete generator section installed and operational in its own case/chassis. The Generator section will then finally be running from its own galvanically isolated power supply, separate from the analyser section.


Some progress pictures of the signal generator chassis. As mentioned previously, the chassis is made of galvanised steel rather than aluminium for low frequency (mains hum) electromagnetic shielding. To this end the power supply board has its own compartment. I'm using PCB-mount toroidal transformers on the power supply board for their low radiated field. Having the transformers PCB-mounted, instead of directly on the chassis, also prevents them from inducing eddy currents into the case. With audio circuits as low in distortion as these, a steel case can measurably degrade linearity, but only if the PCBs are mounted with very little clearance from the metalwork. The space within this 2U case is adequately roomy and the Generator and Line Driver PCBs are to be mounted on 1" standoffs, so I'll be very surprised if that turns out to be a measurable phenomenon at all.


Just posting an update tonight to correct a silly error I made in the last posted THD+N residual measurement. That was in fact the first measurement I conducted with the auto-level-set circuitry completed and operational, and I consequently neglected to factor in the gain of the auto-level-set VCA in the distortion residual amplification path. The THD+N figure is in fact 6dB higher, at -123.4dB, rather than -129.4dB. However, there are still four zeros to the right of the decimal point, so I'm still not at all displeased with the performance :-)

I've decided to cut down the frequency of these updates from what I may have promised previously, to keep things a little less haphazard. When fitting work on a hobby project into a spare hour or two here and there, it is a perpetual pain to keep progress updates such as these (invariably posted late at night in the last free minutes before bed) completely free of hiccups. To the few who have corresponded with me on the progress of this blog for their own assembly or copy of the design, or anyone else out there, just email me if the suspense builds up too high in the meanwhile.


Following are some pictures of the completed signal source, built up and operational with the final iteration of the Line Driver PCB layout.

I have not made much comment thus far on the PCB layout rules adhered to in the design of this project, but this is a good point to make a few brief comments. Many of the PCBs for this project have gone through one or more iterations to get 100% right, and this is, pretty much unarguably, THE kind of project where one quickly learns what works best and what doesn't work at all - especially so when one has a 200 MHz DSO complemented with a wideband RF preamp to sniff out oscillations in the region of 100 MHz and above. The LME op-amps used in this project can, with their >50 MHz GBWP, pretty much be considered RF op-amps. With such parts at such frequencies the power supply pins can no longer be considered separate from the signal ground. The internal frequency compensation of these parts is, by necessity (due to the absence of a ground pin), internally referenced to the supply rail(s). To avoid oscillation, the bypassed power supply pins must be at the same RF potential as the common-mode signal ground. This is best achieved with healthy power supply rail bypassing on each IC combined with a low impedance common ground plane. I have found every other technique to be, to some degree, either sub-optimal in performance or a total oscillating disaster. The quick sketch below makes plain why traditional audio ground separations are simply a bad idea here.

They say the proof of the pudding is in the eating. Here is the system's residual performance at a 4V rms signal level and a fundamental frequency of 20 kHz:

The vertical scale for the fundamental component is 2V per division and the vertical scale for the residual signal is 20uV per division. Due to the 200 kHz measurement bandwidth the residual signal is mostly noise. The residual distortion+noise signal measured 3.4uV rms, which computes to -121.4dB with respect to the 4V rms fundamental. This is a THD+N figure of 0.000085%. Even when terminating the Line Driver signal output into a low impedance load of 50 ohms, there is hardly any discernible change in the residual signal.

Over the next couple of evenings I will be busy evaluating the noise and distortion performance (and possibly revising the design a bit) of the very last PCB of this project - the Signal Amplifier board. When I am happy with the performance of this module, the electrical design of the analyser will finally be complete. The complete Analyser section is to be housed in a custom chassis like the Signal Source section's, but 3U rather than 2U. That chassis fabrication will be a job for next weekend and I will post updates when appropriate. However, right now I have some hungry chooks, doves and canaries to look after and lawns to cut.


I've just uploaded all the (finalised) PCB and schematic files for the Line Driver board and the Generator section power supply, all of which can be downloaded via the links in the "Design Files" section towards the top of this blog page. The line driver schematic has also been revised a bit. To keep RFI under control I ended up having to install a power line filter into the signal source case:

With the signal source section fully cased, I was actually still getting demodulated audio from the local AM radio broadcast transmitter (891 kHz carrier) in my distortion residual signal. When measuring at 20 kHz, for example, this wasn't an issue, because the 20 kHz high pass of the measurement bandwidth filter removed the demodulated audio; the bandwidth of the audio modulation in AM radio transmissions is cut off well below 20 kHz. However, when measuring at lower frequencies, such as 1 kHz, a large part of the demodulated audio would lie within the measurement bandwidth, and therefore affect the distortion readings. The level of the demodulated audio was of the order of a part per million or two. Fitting the RFI mains filter finally eliminated the problem. I also had to take RFI suppression measures at the signal output XLR; these will be revealed in the wiring diagram for the generator section when I finish drafting it.

So then, with the signal source section fully cased, here, finally, is the entirely RFI-free residual performance at a test frequency of 1 kHz:

The signal source control settings were set as follows:

Output: Unbalanced
Frequency: 1kHz
Attenuator: 0 dB
Level: adjusted for a signal output of 12V peak-peak (4.24V rms)
Z-out: 50 ohms

The oscilloscope settings are 2V/div. for the fundamental sinewave and 10uV/div. for the distortion+noise residual.
The residual signal measured 2.32uV rms, which computes to -125.2dB (0.000055%).

And here is an exact repeat of the above, but with the signal output terminated into a 51 ohm load (resistor). Note that the amplitude drops to half due to the 50 ohms output impedance of the Line Driver.

Here the residual signal measured 1.3uV rms, which computes to -124.3dB (0.00006%). For all intents and purposes, the distortion performance of the Line Driver is entirely unaffected by the ~50 ohm loading.

Note that the above tests were performed by connecting the signal source directly to the signal input of the Notch Filter board, without the Signal Amplifier board wired in between. The noise and distortion performance of the Signal Amplifier board is something I am still evaluating.

Gerber Files

When I started this web presence I decided on a policy of supplying my PCB artwork designs in PDF format only. The reason for this is that I have seen a few others have their artwork ripped off for commercial purposes - in one case a design (an audio power amplifier) ended up as a product for sale by a kitset supplier on eBay. The PCB silkscreen in that case had been modified to remove any attribution to the original designer.

However not everyone has the facilities (or even the inclination) to etch printed circuit boards at home, and so long as I do not have the Gerber files for the projects I am presenting on this website uploaded, I will continue to receive polite requests for them. I have of course already been providing Gerber files to people I trust. However, the impossibility of keeping this fact completely secret inevitably and recurringly puts me in an awkward position if I am to decline a request based on “policy” that is, evidently, selectively applied.

It is for this reason I have decided to reverse said policy, delete the PDF files and provide the Gerber PCB files instead. They are provided for non-profit, personal use.

Thus far, I have generated and uploaded the Gerber files for the three circuit boards that comprise the Signal Source section of this design; namely the Generator board, the Line Driver board and the associated power supply unit. The Gerber files for each board are compressed into individual *.zip files and can be downloaded via the links, along with the schematic diagrams, in the "Design Files" section towards the top of this blog page.

The files were generated with Protel 99SE and, where applicable, are identified by the following extensions:

*.DRL – NC Drill file
*.GTL – Gerber Top Layer
*.GBL – Gerber Bottom Layer
*.GTO – Gerber Top Overlay
*.GBO – Gerber Bottom Overlay
*.GTS – Gerber Top Solder Mask
*.GBS – Gerber Bottom Solder Mask

I will generate bills of materials and the Gerber files for the Analyser section in due course. However, the latter takes time because I have to edit the silkscreens, make sure all of the hole sizes are correct, etc.

Signal Amplifier redesign

On the update posted the 22nd of July I commented on the fact that I was busy evaluating the performance of the Signal Amplifier board. This was in fact the very first board for the analyser project that I designed and built, but it was then put aside while I developed the remaining circuit boards. The design was very heavily influenced by the signal input stage of the Audio Precision System One. Here is the original prototype:

That board was due for a PCB revision anyway, as I originally laid it out with mostly through-hole components before deciding to go mostly surface mount for the entire project for compactness. However, I now feel that the sophistication and performance of this Signal Amplifier is not up to the standard of the rest of the design, and I am completely redesigning it.

Of all the circuit boards that this project is comprised of, it is this very last one which is proving to have the most difficult set of constraints and compromises to balance and decide upon.

The original signal amplifier was primarily designed to provide high impedance balanced inputs for nominal DUT-output signals ranging from 100 mV rms to 100 V rms. In a general purpose THD analyser, this level of versatility is definitely desirable, but it comes at a price. In this case, the price is a noise floor higher than I would like.

Firstly, the high voltage input capability must be handled by a switched attenuator. If, without elaborate complication, this input attenuator is to present a fixed, high input impedance, it will be a non-negligible source of noise on some, if not most, settings. Secondly, the three op-amp instrumentation amplifier topology most conveniently used to interface the differential signal input is inherently noisy at low gain settings.
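The attenuator noise penalty is easy to put numbers on via the Johnson-Nyquist formula; the resistance values below are illustrative guesses, not the actual attenuator design, evaluated in the 200 kHz bandwidth used for 20 kHz measurements:

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K
T_KELVIN = 290.0            # roughly room temperature

def thermal_noise_vrms(r_ohms: float, bandwidth_hz: float) -> float:
    """Johnson-Nyquist noise voltage of a resistance: sqrt(4*k*T*R*B)."""
    return math.sqrt(4 * K_BOLTZMANN * T_KELVIN * r_ohms * bandwidth_hz)

# Hypothetical source resistances presented to the preamp by a
# high-impedance attenuator, over a 200 kHz measurement bandwidth
for r in (1e3, 10e3, 100e3):
    print(f"{r/1e3:>5.0f}k: {thermal_noise_vrms(r, 200e3)*1e6:.2f} uV rms")
```

With residuals in the low microvolts being chased here, even a few tens of kilohms of attenuator resistance would contribute noise well above the measurement floor, which is the crux of the problem described above.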

The Signal Amplifier as originally designed for the most part is quite OK for measurements at a test frequency of 1 kHz and less, but begins to significantly limit the analyser’s ultimate measurement floor, due to its noise contribution, for the higher test frequencies which are necessarily measured in a wider bandwidth.

In order to get around these issues I’ve had to compromise somewhat on versatility in the revised design, though I have not dispensed with a high impedance differential input, as I consider this facility mandatory for a high performance analyser. I have, however, dispensed entirely with a switched input attenuator. This means that the analyser will not be able to accept signal input voltages greater than what the +/-15V op-amp supply rails allow.

For measuring 99% of solid state audio equipment, with the exception of power amplifiers, this will not present a limitation. For the signal amplifier I have designed a very low noise, switched-gain instrumentation amplifier comprised of multiple parallel stages. Getting rid of the switched input attenuator dropped the overall complexity to a more palatable level.
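The parallel-stage trick works because uncorrelated amplifier noise averages down as the square root of the number of stages; the 4 nV/rtHz figure and stage count below are hypothetical examples, not the actual design values:

```python
import math

def paralleled_noise(en_single: float, n_stages: int) -> float:
    """Input-referred voltage noise of N averaged parallel amplifier stages.
    Uncorrelated noise sources add in RMS fashion, so averaging N stages
    reduces the noise density by sqrt(N)."""
    return en_single / math.sqrt(n_stages)

# e.g. four hypothetical 4 nV/rtHz stages averaged together -> 2 nV/rtHz
print(paralleled_noise(4.0, 4))
```

The cost, of course, is that input bias currents and board area scale with N, which is part of the complexity trade-off mentioned above.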

For testing audio power amplifiers with higher signal output voltages, an external attenuator will be required. This isn't something I consider a big deal, as an external dummy load of either 4 or 8 ohms is required anyway. It is relatively trivial to solder up a series string of low-value power resistors in parallel with the dummy load to provide the necessary attenuation, with low impedance and therefore a totally negligible noise contribution. As a matter of fact, such an attenuator, with multiple signal tapping points/terminals, is exactly what I intend to install into my current dummy load.
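A quick sketch of such a tapped-string attenuator (the 10 ohm string and 1 ohm tap point are hypothetical values for illustration, not a recommendation):

```python
import math

def tap_attenuation_db(r_total: float, r_tap: float) -> float:
    """Voltage attenuation of a resistive string tapped r_tap ohms from
    ground, with the string connected across the amplifier output."""
    return 20 * math.log10(r_tap / r_total)

def tap_source_impedance(r_total: float, r_tap: float) -> float:
    """Source impedance seen by the analyser at the tap: the lower and
    upper portions of the string in parallel."""
    return r_tap * (r_total - r_tap) / r_total

# Hypothetical example: a 10-ohm string of power resistors across the
# dummy load, tapped 1 ohm up from ground
print(tap_attenuation_db(10.0, 1.0))    # -20 dB
print(tap_source_impedance(10.0, 1.0))  # 0.9 ohms
```

The sub-ohm source impedance is the point: its thermal noise contribution is negligible against the analyser's measurement floor, which a fixed high-impedance switched attenuator could never match.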

All this signal amplifier rethinking and redesign stuff has delayed the completion of this project, which I was hoping to have done by now, but I hope to at least get some pics of the mostly completed analyser section chassis loaded up in a week to two.


Well, I wound up being away from home, busy with something a wee bit different:

.... since the last instalment, and haven't made any further progress in constructing and testing the redesigned signal amplifier board. However, I've gotten back to the analyser section chassis construction now and it is progressing OK, alongside another, totally different project – a junk box 40m AM radio transmitter:

I’ve had all of the major components for this project put aside for a while already, and it only made sense to get the chassis work out of the way while I’m already in the workshop fabricating the analyser chassis. The tube used in the final is an 807 tetrode, plate modulated and operating in class C. For the modulation transformer I’m using an Edcor Electronics “single ended” audio output transformer in reverse; that is, with the low-impedance (4 ohm) secondary winding driven by a class AB solid state audio power amplifier acting as the modulator. The transmitter will deliver a 15W carrier. A 6CK6 video amplifier pentode is utilised in the PA driver stage. All other stages are solid state.
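As a rough sketch of the plate-modulation arithmetic (only the 15 W carrier figure comes from the text; the plate efficiency and supply voltage are illustrative assumptions), the audio power demanded of the modulator and the impedance the transformer must match work out roughly like this:

```python
# Rough plate-modulation arithmetic for a class C final.
# Only the 15 W carrier is from the project text; the efficiency
# and plate supply voltage are illustrative assumptions.

carrier_w = 15.0            # stated carrier power
eta = 0.75                  # assumed class C plate efficiency
p_dc = carrier_w / eta      # DC input power to the final, ~20 W

# 100% plate modulation requires audio power equal to half the
# DC input power of the modulated stage:
p_audio = p_dc / 2          # ~10 W from the solid state modulator

# Modulation load seen at the transformer primary, assuming an
# illustrative 500 V plate supply:
v_plate = 500.0
i_plate = p_dc / v_plate              # ~40 mA average plate current
z_mod = v_plate / i_plate             # ~12.5 kohm

# The reversed output transformer must reflect its 4 ohm winding
# (driven by the modulator) up to roughly this impedance:
turns_ratio = (z_mod / 4.0) ** 0.5    # ~56 : 1
```

Under these assumptions a 10 W solid-state amplifier driving the 4-ohm winding is comfortably within reach, which is presumably why the reversed "single ended" output transformer trick is attractive here.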

I’m not going to leave this project sitting around unfinished now that I have started it and am building it up alongside the THD analyser. It will probably be a month now before I can show off the completely finished analyser project and begin finalising all of the necessary documentation.


It has been almost a month since the last update, during which I’ve been working on the analyser section chassis on and off. It just seems to be an extremely time-consuming task. However, the chassis is now complete and I’m just left with having to install the circuit boards and wire everything up.

The revised circuit for the low noise Signal Amplifier is complete and I am currently working on the circuit board layout. The chassis is divided into four separate, shielded sections. The compartment at the rear is for the power supply circuitry; the PSU PCBs (with board-mounted toroidal transformers) will be screwed to the removable rear panel. The compartment to the left of the chassis is for the signal amplifier (which will be mounted vertically on the left-hand side panel), and the notch filter/distortion amplifier board has its own compartment in the middle.


Geez, doesn't time fly! More than a month has already flown by. As usual, I got sidetracked on half a dozen other things and this project got shifted onto the back burner. However, I have now completed the MKII signal amplifier board. Here is the schematic. This is basically a standard three op-amp instrumentation amplifier, but with four parallel amplifiers per op-amp stage, for the purpose of noise reduction. It took a bit of head scratching to figure out how best to parallel the op-amp stages so that the gain remains programmable with a single switched resistance, but the solution was pretty simple in the end. The Signal Amplifier has five switched voltage gain settings: 1, 2, 5, 10 and 20. The PCB layout is about to be dropped into the etch tank. I have retained the switchable 600/150 ohm input dummy load and have also incorporated a signal overload indicator.
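As a sketch of the arithmetic behind such a design (the feedback resistor value is an illustrative assumption, not the schematic's actual value), the gain-set resistor for each setting of a standard three op-amp instrumentation amplifier, and the noise benefit of paralleling four stages, can be worked out like this:

```python
# Standard three op-amp instrumentation amplifier: first-stage gain
# G = 1 + 2*Rf/Rg, with the difference stage at unity.
# Rf = 1 kohm is an illustrative assumption, not the schematic value.
import math

R_F = 1_000.0

def gain_set_resistor(gain):
    """Rg needed for a given overall gain; G = 1 leaves Rg open-circuit."""
    if gain == 1:
        return math.inf            # omit Rg entirely for unity gain
    return 2 * R_F / (gain - 1)

rg_values = {g: gain_set_resistor(g) for g in (1, 2, 5, 10, 20)}
# e.g. G = 2 -> 2000 ohm, G = 10 -> ~222 ohm, G = 20 -> ~105 ohm

# Paralleling N identical stages averages their uncorrelated voltage
# noise, improving input-referred noise by sqrt(N):
N = 4
noise_improvement_db = 20 * math.log10(math.sqrt(N))   # ~6 dB for N = 4
```

The sqrt(N) figure assumes the stages' noise sources are uncorrelated, which holds for op-amp voltage noise but not for noise already present on the input signal; a single switched Rg serving all four paralleled first-stage pairs is what keeps the gain programmable from one switch.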

To prevent ground loops between this board and the Notch Filter board, the signal amplifier will have its own +/-15V power supply rails, provided by a dedicated transformer and separate from the rest of the analyser circuitry. The power supply circuitry for the analyser in its entirety will be accommodated on a single large PCB, which will carry three PCB-mount toroidal transformers: a small 7.5VA unit for the isolated +/-15V rails of the Signal Amplifier, a 10VA unit for the regulated 5.6V relay supply, and a 25VA unit for the +/-15V rails for the rest of the analyser's circuitry. Two of the toroidal transformers are on back-order from the UK, and I probably still won't see them for another week. Due to the poor data from the manufacturer, I actually need the transformers in hand to verify the PCB footprints before I can finalise the PCB layout. Once that is done I'll post up the power supply details. I'm also beginning to make an effort to get the Gerber files for all of the PCBs finalised and uploaded. So, very soon I'll be able to screw the lid onto this project and call it finished.

The Signal Amplifier PCB layout