I understand how the clock is generated; I want to know how the thing
knows _which_ dot clock to generate. I can think of many situations where
the same HSYNC would call for a different dot clock. Look at it this way:
you are in a cardboard box. All you have is a tiny slot through which I
slip you a piece of paper with 15743 Hz written on it. Now guess which
video mode I'm in, and what the correct dot clock is.
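To make the puzzle concrete, here is a tiny sketch of why the line rate alone is ambiguous. The two modes and their clocks-per-line counts are made up for illustration; the point is only that the same HSYNC frequency multiplied by different line lengths gives different dot clocks.

```python
# Two hypothetical display modes sharing the same horizontal sync rate
# but needing different dot clocks.  The clocks-per-line figures are
# illustrative assumptions, not taken from any real standard.
F_H = 15_743  # Hz -- the only number written on the slip of paper

modes = {
    "mode A": 640,   # total dot clocks per scan line (assumed)
    "mode B": 912,   # total dot clocks per scan line (assumed)
}

for name, clocks_per_line in modes.items():
    dot_clock = F_H * clocks_per_line
    print(f"{name}: {dot_clock / 1e6:.3f} MHz")
```

Same slip of paper, two different "correct" answers; nothing in the sync signal distinguishes them.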
How?
The guy who built the display told the PLL designer how many pixels the
display has. Actually, if you had 1280 pixels, you probably wouldn't use
1280 * Fh for the dot clock, because then you'd be displaying horizontal
retrace at the edges; instead you'd up it a little and gate the actual
pixel address counter with the horizontal blanking pulse.
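The arithmetic behind that point can be sketched as follows. The 25% blanking overhead is an assumption for illustration (real designs pick whatever the retrace timing requires), but the structure is as described: total clocks per line exceed the visible pixel count, and the extra clocks fall inside blanking.

```python
# Dot clock for 1280 visible pixels when some of each line is spent
# in horizontal blanking.  The overhead fraction is an assumed value.
F_H = 15_743               # Hz, horizontal sync rate from the example
active_pixels = 1280       # visible pixels per line
blanking_overhead = 0.25   # assumed fraction added for retrace/blanking

total_clocks = round(active_pixels * (1 + blanking_overhead))
dot_clock = total_clocks * F_H

print(f"clocks per line: {total_clocks}")
print(f"dot clock: {dot_clock / 1e6:.2f} MHz (vs {active_pixels * F_H / 1e6:.2f} MHz naive)")
```

The pixel address counter only advances during the active portion; the remaining clocks tick away unseen while the beam retraces.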
Yup, but I want to know how the thing knows it needs 1280 dot clocks per
line. Does it have a timer that measures the HSYNC, looks up which video
mode is closest, and then programs the genlock?
See above. It doesn't have to "know" anything - the circuit is designed
to work for however many pixels across the designer put into it.
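A minimal behavioural sketch of that answer: the PLL multiplies whatever HSYNC rate arrives by a fixed integer N that was wired in at design time. There is no timer, no lookup table, no mode detection; N here (1600) is an assumed value for illustration.

```python
# Behavioural sketch of a fixed-multiplier PLL: N is hardwired by the
# designer and never changes, regardless of the incoming line rate.
N = 1600  # dot clocks per scan line, fixed at design time (assumed)

def dot_clock_from_hsync(f_hsync_hz: float) -> float:
    """The PLL output is simply N times whatever HSYNC rate it locks to."""
    return N * f_hsync_hz

# The circuit tracks any nearby line rate with the same multiplier:
print(dot_clock_from_hsync(15_743))
print(dot_clock_from_hsync(15_700))
```

The "knowledge" lives entirely in the choice of N, made once by the designer, not in anything the circuit measures or decides at runtime.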
I think you're under the impression that the signal has to somehow "know"
something about what mode it's being displayed in, but it couldn't give
a shit. If you get a chance, look at a TV video signal on a scope with
the horizontal sweep set to show about one line across: you get sort of
a top view of the edge of the picture, where height = brightness. Sync
the scope to a whole frame/field and you get something like an edge-on view.
Cheers!
Rich