JonnyFrond
Nov 10, 2010
Joined: Nov 10, 2010
Messages: 1
Dear Electronics wizards,
I am new to electronics, via the Mechatronics degree I am studying, and I am trying to get my head around some of the basics.
I am currently learning about voltage comparators, and have come across a piece of information I need but cannot find anywhere, though I am loath to harass my lecturers.
Basically, in my assignment there is a basic comparator circuit using an LDR and an LED. Very simple: light is shone on the LDR, and the LED comes on.
I am currently trying to calculate what series resistor I need before the LED to get it to the optimum voltage.
What I need to know is:
With a comparator, I have been assuming that the main signal voltage going into the comparator is the same as the voltage at the output that will light my LED.
I am now becoming aware that this is not a useful assumption.
I have never seen or used a real comparator yet, but I am assuming it is similar to an op amp in that it is powered by a separate feed from the power source. If this is the case, what will the voltage at the output be? Is it the same as the main signal in, or is the signal input just used to switch the comparator on or off, so that the output can be assumed to be the same as the power supply?
Power supply: 15 V
Voltage at the input (calculated): 10.7 V or 2.1 V
Reference: 6 V
Possible output voltage, is it:
2.1 V and 10.7 V
or
0 V and 12 V
(or thereabouts, depending on the quality of the comparator, of course)?
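For reference, here is the rough resistor calculation I have been attempting. The numbers are my own assumptions, not given in the assignment: I assume the comparator output swings close to the 15 V rail when high (it could be a volt or two lower for a push-pull output, or set by a pull-up resistor for an open-collector type like the LM393), a red LED with about 2 V forward drop, and a target current of 10 mA.

```python
# Assumed values -- not from the assignment sheet:
V_out = 13.5   # comparator high-level output voltage (V), assumed a bit below the 15 V supply
V_f = 2.0      # LED forward voltage (V), typical for a red LED
I_f = 0.010    # target LED current (A), 10 mA

# Ohm's law across the series resistor: it drops whatever is left
# after the LED takes its forward voltage.
R = (V_out - V_f) / I_f
print(f"Series resistor: {R:.0f} ohms")
```

That comes out around 1.2 kΩ as the nearest standard value, but obviously the answer shifts depending on which output voltage is correct, which is exactly what I am unsure about.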
It would also be useful to know what standard practice is, so that in future I can add a more accurate assumptions section to my answers and bring some clarity to my work.
Kind regards
Jonny