I have a remote control that puts out a 12V PWM signal. I want to use it to control an LED driver expecting a 10V PWM signal for dimming. I've done some reading and think that a 10V Zener diode regulator circuit could clip the 12V to 10V.
I've not selected any particular diode yet, but have been looking at many spec sheets. I am unable to find or work out the minimum current required through the diodes to keep them in their regulating range. Some tutorials suggest ~5 mA is common, but I'd like to get it right rather than assume.
What is the secret to finding this drop-out current on the spec sheets?
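To show where I'm stuck, here's my back-of-the-envelope attempt in Python. The 5 mA minimum Zener current and the 1 mA load current are both placeholders I made up; the minimum current is exactly the number I can't find:

```python
# Back-of-the-envelope sizing for a series-resistor + 10 V Zener clipper.
# PLACEHOLDER ASSUMPTIONS (not from any datasheet):
#   I_Z_MIN -- the ~5 mA figure the tutorials quote (the value I'm asking about)
#   I_LOAD  -- a pure guess at the LED driver's dimming-input current

V_IN = 12.0      # PWM high level from the remote, in volts
V_Z = 10.0       # nominal Zener voltage, in volts
I_Z_MIN = 0.005  # assumed minimum Zener current, in amps
I_LOAD = 0.001   # assumed dimming-input load current, in amps

# The series resistor drops V_IN - V_Z while carrying the load current
# plus at least the Zener's minimum regulating current.
r_series = (V_IN - V_Z) / (I_Z_MIN + I_LOAD)

# Worst case for the Zener is the load drawing nothing: the diode then
# takes the full resistor current.
p_zener_max = V_Z * (I_Z_MIN + I_LOAD)

print(f"Series resistor: {r_series:.0f} ohms")
print(f"Worst-case Zener dissipation: {p_zener_max * 1000:.0f} mW")
```

If the real minimum current is very different from 5 mA, the resistor value changes quite a bit, which is why I'd rather read it off a spec sheet than assume.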
As you might have guessed, I'm a newbie at electronics. Any suggestions would be welcome. TIA
Regards
rdl