Maker Pro

LCD VGA input circuit

How do LCD monitors get the correct dot clock frequency to sample the
incoming VGA?
If all you have is an HSYNC frequency, do they do a lookup or
something more fancy?
What chips are typically used to recover the clock, or is it built in
to some huge SOC?
 
Joel Kolstad

How do LCD monitors get the correct dot clock frequency to sample the
incoming VGA?
If all you have is an HSYNC frequency, do they do a lookup or
something more fancy?

Something more fancy: A phase-locked loop, derived from HSync.

Since the LCD "knows" it has, e.g., 1280 pixels to display, a PLL is
configured such that it generates 1280 pulses (pixel clocks) between the active
edges of HSync. Take a look at the data sheet for a digitizer meant for LCDs,
e.g., the Analog Devices 9884:
http://www.analog.com/en/prod/0,2877,AD9884A,00.html
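As a rough sketch of that arrangement (the line rate and divider below are the standard 1280x1024@60 figures, used purely as an illustration, not values from the AD9884 datasheet):

```python
# Sketch of a line-locked PLL: the programmed divider (total clocks
# per line, including blanking) multiplies the HSYNC rate up to the
# dot clock, so exactly that many pulses land between HSYNC edges.

def dot_clock_hz(hsync_hz, clocks_per_line):
    """PLL output frequency: clocks_per_line pulses per HSYNC period."""
    return hsync_hz * clocks_per_line

hsync_hz = 63_981         # ~64 kHz line rate (1280x1024@60)
clocks_per_line = 1688    # total clocks per line, incl. blanking
print(dot_clock_hz(hsync_hz, clocks_per_line))  # ~108 MHz
```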

---Joel
 
Something more fancy: A phase-locked loop, derived from HSync.

I appreciate how the clock is generated; I want to know how the thing
knows _which_ dot clock to generate. I can think of many situations
where the same HSYNC would have a different dot clock.
Look at it this way: you are in a cardboard box. All you have is a
tiny slot through which I slip you a piece of paper with 15743 Hz
written on it. Now guess which video mode I'm in, and what the
correct dot clock is.
Since the LCD "knows" it has, e.g., 1280 pixels to display
How?

configured such that it generates 1280 pulses (pixel clocks) between the active

Yup, but I want to know how the thing knows it needs 1280 dot clocks
per line. It must have a timer that measures the HSYNC, looks up which
video mode is closest, then programs the genlock?
From the 9884 datasheet:
"A Voltage Controlled Oscillator (VCO) generates a much higher pixel
clock frequency. This pixel clock is divided
by the value PLLDIV programmed into the AD9884A, and phase compared
with the HSYNC input."

You see, you still need to program the value yourself via the I2C
interface. You don't just toss in a HSYNC and it magically determines
the dot clock.
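The ambiguity is real: two classic VGA modes share essentially the same HSYNC rate but need different per-line totals, which is exactly why a PLLDIV-style value has to be programmed in. The timing numbers below are the standard VGA figures:

```python
# Two classic modes with the same ~31.47 kHz HSYNC rate but different
# totals per line - HSYNC alone cannot pick the divider for you.
# Values are (hsync_hz, total_clocks_per_line) from standard VGA timing.
modes = {
    "640x480@60": (31_469, 800),   # dot clock ~25.175 MHz
    "720x400@70": (31_469, 900),   # dot clock ~28.322 MHz
}

for name, (fh, plldiv) in modes.items():
    # the divider the controller must be told (e.g. over I2C)
    print(f"{name}: PLLDIV={plldiv} -> dot clock {fh * plldiv} Hz")
```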

I'm asking because of a personal project that's been on hold for a
long time.
 
Rich Grise

I appreciate how the clock is generated; I want to know how the thing
knows _which_ dot clock to generate. I can think of many situations where
the same HSYNC would have a different dot clock. Look at it this way: you
are in a cardboard box. All you have is a tiny slot through which I slip
you a piece of paper with 15743 Hz written on it. Now guess which video
mode I'm in, and what the correct dot clock is.


How?

The guy that built the display told the PLL designer that that's how
many pixels the display has. Actually, if you had 1280 pixels, you
probably wouldn't use 1280 * Fh for a dot clock, because then you'd be
displaying horizontal retrace at the edges, so you'd up it a little and
gate the actual pixel address counter with the horizontal blanking pulse.
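The counter-gating idea above can be sketched like this (the split of 1280 visible clocks out of 1800 total per line is an assumed example, not a figure from this post):

```python
# Run the dot clock a bit faster than visible_pixels * line_rate and
# gate the pixel-address counter with the blanking interval, so the
# retrace period never lands on visible pixels.
VISIBLE, TOTAL = 1280, 1800   # assumed example timing

def line_addresses():
    """Pixel address latched on each clock; None while blanking gates it."""
    addr = 0
    out = []
    for clk in range(TOTAL):
        if clk < VISIBLE:     # gate open: latch address and advance
            out.append(addr)
            addr += 1
        else:                 # blanking/retrace: counter held
            out.append(None)
    return out

line = line_addresses()
print(line[0], line[1279], line[1280])  # 0 1279 None
```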
Yup, but I want to know how the thing knows it needs 1280 dot clocks per
line. It must have a timer that measures the HSYNC, looks up which video
mode is closest, then programs the genlock?

See above. It doesn't have to "know" anything - the circuit is designed
to work for however many pixels across that the designer put into it.

I think you're confusing this with the idea that the signal has to
somehow "know" something about what mode it's being displayed in - but
it couldn't give a shit less. If you get a chance, take a look at a TV
video with the horizontal sweep set to give about one line across: you
get sort of a top view of the edge of the picture, where height =
brightness. Sync it to a whole frame/field and you get something like
an edge-on view.

Cheers!
Rich


 
jasen

Something more fancy: A phase-locked loop, derived from HSync.

Since the LCD "knows" it has, e.g., 1280 pixels to display, a PLL is
configured such that it generates 1280 pulses (pixel clocks) between the active
edges of HSync.

the only problem is that'd give the wrong result.

"1280x960" 108.00 1280 1376 1488 1800 960 961 964 1000 +hsync +vsync
A B C D E F G H
clock# action

0 first pixel
1279 last pixel
1280 right-hand overscan (aka border)
1376 overscan end/horizontal sync start (retrace period)
1488 horizontal sync end/left overscan start
1800/0 first pixel
1279 last pixel

there are more clocks per line than there are pixels.
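Unpacking the modeline's arithmetic makes the point concrete:

```python
# From the 1280x960 modeline above: with a 108 MHz dot clock, 1800
# total clocks per line and 1000 total lines per frame give a 60 kHz
# line rate and a 60 Hz refresh - 520 clocks per line beyond the
# visible 1280.
dot_clock = 108_000_000
h_visible, h_total = 1280, 1800
v_visible, v_total = 960, 1000

line_rate = dot_clock / h_total      # HSYNC frequency
frame_rate = line_rate / v_total     # VSYNC frequency
print(line_rate, frame_rate)                          # 60000.0 60.0
print("extra clocks per line:", h_total - h_visible)  # 520
```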
 
there are more clocks per line than there are pixels.

Those are trivial details next to finding out what mode you're in. I'm
thinking of counting the hsyncs per frame so at least I know the
vertical resolution. From there I can guess what mode I'm in. Problem
is I'm dealing with an entirely programmable video chip, and there's
no fixed H-V relationship. Ugh.
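The counting idea sketched out (the example frequencies are assumed measurements matching the 1280x960 modeline upthread):

```python
# Counting HSYNC pulses between VSYNC pulses gives the total lines
# per frame - visible lines plus vertical blanking - which at least
# narrows down the mode guess.
def lines_per_frame(hsync_hz, vsync_hz):
    return round(hsync_hz / vsync_hz)

print(lines_per_frame(60_000, 60))  # 1000 total lines (960 visible)
```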
 
The guy that built the display told the PLL designer that that's how
many pixels the display has. Actually, if you had 1280 pixels, you
probably wouldn't use 1280 * Fh for a dot clock, because then you'd be
displaying horizontal retrace at the edges, so you'd up it a little and
gate the actual pixel address counter with the horizontal blanking pulse.


See above. It doesn't have to "know" anything - the circuit is designed
to work for however many pixels across that the designer put into it.

I think you're confusing the idea that the signal has to somehow "know"

Not the signal, the monitor.
something about what mode it's being displayed in, but it couldn't give
a shit less - if you get a chance, take a look at a TV video with the
hor. sweep set to give about one line across, and you get sort of a

So the monitor doesn't need to know that the incoming signal is
640x480 VGA, and it magically appears on the RSDS lines to the gate
drivers on the panel as 1280x1024 native resolution?

Holy crap.
 
jasen

Those are trivial details next to finding out what mode you're in. I'm
thinking of counting the hsyncs per frame so at least I know the
vertical resolution. From there I can guess what mode I'm in. Problem
is I'm dealing with an entirely programmable video chip, and there's
no fixed H-V relationship. Ugh.

computer video cards have been like that since forever; CGA didn't
offer the opportunity to change the pixel clock, but all else could be
tweaked (sometimes not real good for the monitor)

I can dial up pretty much any mode I want by editing a text file for
XFree86; Windows users can do the same by editing the registry.

Bye.
Jasen
 
computer video cards have been like that since forever; CGA didn't
offer the opportunity to change the pixel clock, but all else could be
tweaked (sometimes not real good for the monitor)

I can dial up pretty much any mode I want by editing a text file for
XFree86; Windows users can do the same by editing the registry.

Bye.
Jasen

I don't mean in software on the PC, I mean a separate piece of
hardware connected only to the analog VGA port. Just from the signals
present on the D-sub 15 connector, how do you figure out what the
correct video mode is? I've got a microcontroller set up right now to
give me a /1016 ratio on my PLL, but what can I do when the source
video chip is fully programmable wrt syncs?
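One plausible scheme for such a standalone box: measure both sync rates with the micro's timers and match against a table of expected modes. The table entries below are standard VESA figures, the tolerance is a guess, and (as the thread notes) a fully programmable source can still land between entries:

```python
# Guess the video mode from measured HSYNC and VSYNC frequencies by
# nearest-match against a small table, then report the PLL divider
# (total clocks per line) to program.
MODES = [
    # (name, hsync_hz, vsync_hz, total_clocks_per_line)
    ("640x480@60",  31_469, 60, 800),
    ("800x600@60",  37_879, 60, 1056),
    ("1024x768@60", 48_363, 60, 1344),
]

def guess_mode(hsync_hz, vsync_hz, tol=0.02):
    # weight VSYNC error heavily so the frame rate must match closely
    best = min(MODES,
               key=lambda m: abs(m[1] - hsync_hz) + 1000 * abs(m[2] - vsync_hz))
    name, fh, fv, htotal = best
    if abs(fh - hsync_hz) / fh > tol:
        return None            # nothing close enough - give up
    return name, htotal        # htotal is the divider to program

print(guess_mode(31_500, 60))  # ('640x480@60', 800)
```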
 
joseph2k

I appreciate how the clock is generated; I want to know how the thing
knows _which_ dot clock to generate. I can think of many situations
where the same HSYNC would have a different dot clock.
Look at it this way: you are in a cardboard box. All you have is a
tiny slot through which I slip you a piece of paper with 15743 Hz
written on it. Now guess which video mode I'm in, and what the
correct dot clock is.


Yup, but I want to know how the thing knows it needs 1280 dot clocks
per line. It must have a timer that measures the HSYNC, looks up which
video mode is closest, then programs the genlock?

"A Voltage Controlled Oscillator (VCO) generates a much higher pixel
clock frequency. This pixel clock is divided
by the value PLLDIV programmed into the AD9884A, and phase compared
with the HSYNC input."

You see, you still need to program the value yourself via the I2C
interface. You don't just toss in a HSYNC and it magically determines
the dot clock.

I'm asking because of a personal project that's been on hold for a
long time.

LCDs have a physical native resolution. Just the same, most sync
signals are set up for physical CRTs. CRTs require retrace time, both
horizontally and vertically. Read up on NTSC signal composition,
especially vertical blanking; it is the basis of all display timings.
Next, when you have assimilated that, read up on X Windows modelines.
Then you will know for yourself.
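A minimal parser for the X-style modelines mentioned above, as a sketch; the field order is the standard XFree86 one:

```python
# Parse: "name" clock_MHz hdisp hsyncstart hsyncend htotal
#                         vdisp vsyncstart vsyncend vtotal [flags...]
def parse_modeline(s):
    parts = s.split()
    name = parts[0].strip('"')
    clock_hz = float(parts[1]) * 1e6
    nums = [int(x) for x in parts[2:10]]
    return name, clock_hz, tuple(nums[:4]), tuple(nums[4:])

name, clk, h, v = parse_modeline(
    '"1280x960" 108.00 1280 1376 1488 1800 960 961 964 1000 +hsync +vsync')
# refresh rate = dot clock / (total clocks per line * total lines)
print(name, clk / (h[3] * v[3]))  # 1280x960 60.0
```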
 
Gary Tait

[email protected] wrote in
Yup, but I want to know how the thing knows it needs 1280 dot clocks
per line. It must have a timer that measures the HSYNC, looks up which
video mode is closest, then programs the genlock?

1280 dots is hard-coded in the firmware. The firmware programs the PLL
so that 1280 (or so) pixel clock cycles are generated between
horizontal sync pulses.

That is, if the display directly digitises the input signal to the display
elements.
 