RHeTTRoNiCS
- Mar 28, 2013
Hello all… any idea what the impedance of the line should read on a multimeter when using it to detect 1.2 kW of RF?
What should the appropriate impedance be?
So in short... the 300 ohm value can be correct to use at 2.5 GHz, as long as there is no impedance-matching issue? So I am wrong to say that higher RF power requires higher impedance, and that it is really a matter of impedance matching? Thanks.
You are getting seriously sidetracked. Understand that the diode is only sampling the RF. Its impedance is irrelevant as far as the overall RF circuit goes.
The transmitter wants to see the appropriate impedance of the feedline and the antenna or dummy load on the end of the line. Variations in the diode are not going to affect that. The only thing that diode variations are going to affect is the amount of sampled voltage that is generated for your measuring circuit.
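To get a feel for the voltages involved, here is a quick sanity check of the line voltage at that power level. This assumes a 50 ohm line impedance, which the thread never actually states, so treat the numbers as illustrative only:

```python
import math

# Rough sanity check: RF voltage on the line for a given power,
# assuming a 50 ohm line impedance (Z0 is an assumption, not from the thread).
P = 1200.0   # transmitter power in watts
Z0 = 50.0    # assumed line impedance in ohms

v_rms = math.sqrt(P * Z0)          # RMS voltage on a matched line
v_peak = v_rms * math.sqrt(2)      # peak voltage a sampling diode would see

print(f"V_rms  = {v_rms:.1f} V")   # ~244.9 V
print(f"V_peak = {v_peak:.1f} V")  # ~346.4 V
```

Note that the diode only couples off a small fraction of this, but the peak line voltage is what sets the stress on anything connected directly across the line.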
So you MUST make sure that the waveguide is terminated correctly:
1) The termination MUST be able to handle the level of power you are going to put into it
2) The termination MUST be of an impedance value that the transmitter expects to see on the end of the waveguide
Dave