Ian Jackson wrote:

In message <[email protected]>, David Farber wrote:

Ian Jackson wrote:

In message <[email protected]>, David Farber wrote:

I have a trusty old Sound Technology ST1000A FM generator and I have a question regarding the variable FM output level. The owner's manual states that the output impedance is 50 ohms, VSWR < 1.3, 200 V dc isolation. My question is: if I am using this device to check the sensitivity of an FM tuner, will the output level be affected if I connect the output terminal to an RG-58 cable and then use a 75 ohm to 300 ohm matching transformer to make it compatible with the old FM tuners? It seems to me the 2:1 step-up ratio of the matching transformer should affect the level, not even considering the fact that the 75 ohm input of the matching transformer does not match the 50 ohm output of the cable. Thanks for your reply.

The fact that the 50 ohm generator output (and coax) is working into a 75 ohm load means you get a voltage divider of 75/(50+75) = 0.6 of the generator open-circuit voltage. If the load had been 50 ohms, you would get 0.5 of the open-circuit voltage. The increase is 0.6/0.5 = x1.2 (+1.58 dB).

The 75-to-300 ohm transformer should give you a 2-to-1 voltage step-up (+6 dB).

The transformer will have some loss. This should not be more than about 0.5 dB.

So, the voltage at the FM tuner 300 ohm input will be the generator output (into 50 ohms) + 1.58 dB + 6 dB - 0.5 dB = Vout + 7.08 dB.

This assumes, of course, that the tuner input impedance really IS 300 ohms (which it probably isn't!).
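The level budget above can be checked with a short Python sketch (the 0.5 dB transformer loss is just the assumed figure from the text, not a measured value):

```python
import math

# 50 ohm source into a 75 ohm load: the voltage divider gives
# 75 / (50 + 75) = 0.6 of the open-circuit voltage, versus 0.5 matched.
mismatch_gain_db = 20 * math.log10(0.6 / 0.5)   # about +1.58 dB

# Ideal 75-to-300 ohm transformer: 2-to-1 voltage step-up.
transformer_gain_db = 20 * math.log10(2.0)      # about +6.02 dB

transformer_loss_db = 0.5                       # assumed transformer loss

total_db = mismatch_gain_db + transformer_gain_db - transformer_loss_db
print(round(total_db, 2))  # 7.1
```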

I believe the output level dial is calibrated for a 50 ohm load. So, using your numbers, if the output dial is set to 10 µV, then applying the 7.08 dB gain I calculated gives 22.6 µV. In effect, the sensitivity of the tuner at this point is 7 dB worse than what the dial indicates. Is that right? Thanks for your reply.
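As a sanity check on the 22.6 µV figure, here is a minimal Python sketch using the +7.08 dB net gain from the earlier reply:

```python
dial_uV = 10.0    # generator dial reading, calibrated for a 50 ohm load
gain_db = 7.08    # net gain: mismatch + transformer step-up - transformer loss
tuner_uV = dial_uV * 10 ** (gain_db / 20)
print(round(tuner_uV, 1))  # 22.6
```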

The calculation looks OK. However, I suppose it depends on what your 'standard' impedance is.

If you have a 'normal' halfwave dipole at (say) 100 MHz, the impedance at the centre will be about 75 ohms. It would be normal to connect it (via a 75 ohm feeder) to a tuner with a 75 ohm input impedance. [Note: It might be more correct to say that the tuner is designed to work best when fed from a 75 ohm source. In practice, it might not have a very good 75 ohm input.] Anyway, let us assume that the level of a received 100 MHz FM radio signal (into 75 ohms) is 1 µV.

Now, if you replace the 'normal' dipole with a folded dipole, you would use 300 ohm feeder and connect it to a tuner with a 300 ohm input impedance. Ignoring distractions like differences in feeder loss, the tuner input voltage will be 2 µV.

However, despite being fed with twice the voltage, the 300 ohm tuner won't work any better with the folded dipole than the 75 ohm tuner works with the 75 ohm 'normal' dipole. In both cases, the input power is the same. Internally, the electronics will be basically the same. The only difference will be in the matching circuit between the input and the RF stage.
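The equal-power point is easy to verify numerically; a quick Python check of the two cases above:

```python
import math

# 1 uV into 75 ohms vs 2 uV into 300 ohms: P = V^2 / R comes out identical.
p_75 = (1e-6) ** 2 / 75.0
p_300 = (2e-6) ** 2 / 300.0
print(math.isclose(p_75, p_300))  # True
```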

So, it is reasonable to conclude that, when you specify the sensitivity of a receiver, you have to specify the impedance. In your test, if your standard is 300 ohms, then you would say that your tuner was receiving 22.6 µV (7.08 dB more than indicated on the generator dial). If it was 75 ohms, it would be 11.3 µV (6 dB less).
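Converting a sensitivity voltage between impedance references at equal power follows V2 = V1 * sqrt(Z2/Z1). A small helper sketch (the function name is my own, not from the thread):

```python
import math

def rereference_uv(v_uv, z_from, z_to):
    # Equal power: V^2 / Z is constant, so V scales with sqrt(Z).
    return v_uv * math.sqrt(z_to / z_from)

print(round(rereference_uv(22.6, 300.0, 75.0), 1))  # 11.3
```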

Well, I think I'm correct! What do you reckon?

I don't think you need to mention the impedance of the receiver when you mention FM sensitivity. If it has the correct antenna and the correct impedance matching circuit, the sensitivity of the circuitry should be enough. No? That way I can compare one receiver/tuner to another and not care what the input impedance is.