Maker Pro

Video question, again

OK, I'm gonna ask the same question again, but hopefully this time
I'll make myself more clear.

Any modern monitor, LCD or CRT, when you navigate the menus eventually
has a feature where it displays the mode of the incoming analog VGA.
For example, it'll tell you the VSYNC and HSYNC frequency, and
sometimes even the resolution depending on the model.

So again, I ask, how does the monitor know this? It seems pretty
simple to count the number of hsyncs per vsync to get the number of
lines per frame. Is it that simple? Like if I count 600 (+/-) hsyncs
per vsync, I'd guess... lemme see.... hmmm... 800x600?

Does anyone have a clue about this? I'm pretty sure the DDC lines are
not used for this, but am I wrong? I'm pretty sure the DDC is used the
other way round.

Basically all I want to know is how LCDs in particular set up their
incoming sampler section. The monitor must sample at the correct dot
clock before resizing the image and tossing it at the panel in native
resolution. Or so I think.

It's for a personal project that went into a coma a while back but is
awake again. It's actually a lot more convoluted than this, but I'd like to
read up on how the pros do it.
 
vertreko
It seems pretty
simple to count the number of hsyncs per vsync to get the number of
lines per frame. Is it that simple? Like if I count 600 (+/-) hsyncs
per vsync, I'd guess... lemme see.... hmmm... 800x600?

I think that is exactly how the monitor does it. Although there are
weirdnesses: for example, some (rare) monitor timing formats are
actually interlaced, and you would count more HSYNCs on one field than
on the other. I'm betting that, in addition to counting H and V,
there is a table of sorts based on H and V polarity, H and V counts,
and maybe even front and back porch width (both H and V), that will
point the monitor to one of its possible standard timing formats.
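Something like this, maybe (just a sketch in C -- the table entries are
standard VESA numbers, but the tolerances, and the idea of keying only on
line count and H frequency, are my own guesses; a real table would no
doubt check the sync polarities too):

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

struct mode {
    const char *name;
    int total_lines;      /* HSYNCs per VSYNC, blanking included */
    double hfreq_khz;     /* nominal horizontal frequency        */
};

static const struct mode modes[] = {
    { "640x480 @ 60 Hz",    525, 31.469 },
    { "800x600 @ 60 Hz",    628, 37.879 },
    { "1024x768 @ 60 Hz",   806, 48.363 },
    { "1280x1024 @ 60 Hz", 1066, 63.981 },
};

const char *guess_mode(int hsyncs_per_vsync, double hfreq_khz)
{
    for (size_t i = 0; i < sizeof modes / sizeof modes[0]; i++) {
        if (abs(hsyncs_per_vsync - modes[i].total_lines) <= 2 &&
            fabs(hfreq_khz - modes[i].hfreq_khz) < 0.5)
            return modes[i].name;
    }
    return "unknown mode";
}

int main(void)
{
    printf("%s\n", guess_mode(628, 37.9));   /* -> 800x600 @ 60 Hz */
    return 0;
}
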
Does anyone have a clue about this? I'm pretty sure the DDC lines are
not used for this, but am I wrong? I'm pretty sure the DDC is used the
other way round.

Yep, DDC is used by the PC to figure out which formats the monitor can
support. I'm 99% sure that the PC can't use the DDC line to tell the
monitor what it is getting.
 
Joel Kolstad
Any modern monitor, LCD or CRT, when you navigate the menus eventually
has a feature where it displays the mode of the incoming analog VGA.
For example, it'll tell you the VSYNC and HSYNC frequency, and
sometimes even the resolution depending on the model.

So again, I ask, how does the monitor know this? It seems pretty
simple to count the number of hsyncs per vsync to get the number of
lines per frame. Is it that simple? Like if I count 600 (+/-) hsyncs
per vsync, I'd guess... lemme see.... hmmm... 800x600?

I don't know for certain, but I would guess that this is correct: From the
VSync and HSync frequencies you can get the number of lines and make a very
good educated guess at the resolution. Modern LCDs seem to do a pretty good
job of synchronizing correctly (whereas I had one maybe 4 or 5 years ago that
seldom would without manual tweaking) when presented with "standard"
resolutions -- I wouldn't be surprised at all if many perform poorly if you
purposely create custom "oddball" resolutions such as, say, 1133 x 737.
Basically all I want to know is how LCDs in particular set up their
incoming sampler section. The monitor must sample at the correct dot
clock before resizing the image and tossing it at the panel in native
resolution. Or so I think.

No, it really doesn't have to do this: For the simplest LCD, regardless of
how many pixels are actually being sent per line, you just have to sample
the same number of times as your LCD has pixels on the screen. If your LCD
is 1280x1024 and someone's shipping you 1600 pixels per line... oh well,
you'll just miss some of the information. If they send you 800 pixels per
line, you'll sometimes sample the same pixel more than once. Depending on
the ratio of pixels sent to pixels displayed, you get various artifacts from
this approach, often noticeable as vertical banding, particularly around
sharp edges. Extra lines are skipped, "absent" lines are repeated. The
result is potentially quite ugly, but given that LCDs are intended to be
used at their native resolutions anyway, this isn't much of a problem.
(I've also seen LCDs that simply "center" images that don't have enough
pixels... this is relatively common with, e.g., widescreen monitors that'll
have the option to put up black vertical borders on the left and right of
the image if you send them a 4:3 screen.)
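
To put that "just sample the panel's worth of pixels per line" idea in
code, a toy sketch (the panel width and the names are made up for
illustration):

#define PANEL_W 1280   /* fixed by the panel, made-up number */

/* Fill one panel line by taking PANEL_W samples across the incoming line,
   whatever its real pixel count: src_w > PANEL_W drops source pixels,
   src_w < PANEL_W samples some of them more than once -- hence the
   banding around sharp edges. */
void sample_line(const unsigned char *src, int src_w,
                 unsigned char *panel /* PANEL_W entries */)
{
    for (int x = 0; x < PANEL_W; x++)
        panel[x] = src[(x * src_w) / PANEL_W];
}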

That being said, I believe that some of the better LCDs (and, e.g., video
projectors where it's very much expected that they won't be run at their
native resolution) *do* try to lock on to the exact incoming pixel stream,
store it in a multi-line buffer, and then regenerate the output to the LCD
using a filter of their choosing (you could use, e.g., a simple average
filter so that for an 800x600 LCD you just average together four pixels in a
cluster if someone's sending you 1600x1200 -- the filter you choose is
determined largely by whether you're attempting to highlight edges, trying
to be color accurate, etc.: This is the standard sampling problem in image
processing, and there are about a dozen filters commonly used to "re-sample"
display data in an effort to convert from one display format to another.)
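And a minimal sketch of the averaging case, assuming an exact 2:1 ratio in
both directions (1600x1200 into 800x600); real scalers use better kernels
and cope with non-integer ratios:

/* Box filter for the exact 2:1 case: each output pixel is the average of
   a 2x2 cluster of input pixels.  One channel only. */
void downscale_2to1(const unsigned char *in, int in_w, int in_h,
                    unsigned char *out /* (in_w/2)*(in_h/2) entries */)
{
    int out_w = in_w / 2, out_h = in_h / 2;
    for (int y = 0; y < out_h; y++)
        for (int x = 0; x < out_w; x++) {
            int sum = in[(2*y)   * in_w + 2*x] + in[(2*y)   * in_w + 2*x + 1]
                    + in[(2*y+1) * in_w + 2*x] + in[(2*y+1) * in_w + 2*x + 1];
            out[y * out_w + x] = (unsigned char)(sum / 4);
        }
}
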
It's for a personal project that went into a coma a while back but is
awake again. It's actually a lot more convoluted than this, but I'd like to
read up on how the pros do it.

What are you trying to do?
 
Basically all I want to know is how LCDs in particular set up their
incoming sampler section. The monitor must sample at the correct dot
clock before resizing the image and tossing it at the panel in native
resolution. Or so I think.

I bet you could run some interesting experiments with a video driver
that lets you fully tune the timings (think xvidtune for XFree86) and
perhaps some sort of magnifier to examine the screen.
 
Eeyore
OK, I'm gonna ask the same question again, but hopefully this time
I'll make myself more clear.

Any modern monitor, LCD or CRT, when you navigate the menus eventually
has a feature where it displays the mode of the incoming analog VGA.
For example, it'll tell you the VSYNC and HSYNC frequency, and
sometimes even the resolution depending on the model.

So again, I ask, how does the monitor know this?

I expect it uses a frequency counter. It's not difficult.

Graham
 
Jan Panteltje
OK, I'm gonna ask the same question again, but hopefully this time
I'll make myself more clear.

Any modern monitor, LCD or CRT, when you navigate the menus eventually
has a feature where it displays the mode of the incoming analog VGA.
For example, it'll tell you the VSYNC and HSYNC frequency,

freq counter
and
sometimes even the resolution depending on the model.
calculation


So again, I ask, how does the monitor know this? It seems pretty
simple to count the number of hsyncs per vsync to get the number of
lines per frame. Is it that simple? Like if I count 600 (+/-) hsyncs
per vsync, I'd guess... lemme see.... hmmm... 800x600?

yes, but note INTERLACE.
In case of interlace, for example in PAL (where I am, or rather was, as all is digital now),
50 fields per second of vertical deflection, so 50 Hz, and a horizontal deflection
frequency of 15625 Hz, gives 15625 / 50 = 312.5 lines per field.
But because of interlace you get 625 lines: the lines of one field are displayed
'between' those of the previous one.
The form of the vertical sync pulse causes a slightly later V trigger.
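If you were counting syncs in a micro, the interlace shows up as
alternating HSYNC-per-VSYNC counts (the half line), so a test might look
like this -- counter names invented, just a sketch:

/* Deciding interlace from two consecutive HSYNC-per-VSYNC counts.
   A progressive source gives the same count every field; an interlaced
   one alternates n and n+1 because of the half line (PAL: 312/313). */
int frame_lines(int hsyncs_field_a, int hsyncs_field_b, int *interlaced)
{
    *interlaced = (hsyncs_field_a != hsyncs_field_b);
    return *interlaced ? hsyncs_field_a + hsyncs_field_b   /* 625 for PAL  */
                       : hsyncs_field_a;                   /* progressive  */
}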


Does anyone have a clue about this?


Absolutely not ;-)

I'm pretty sure the DDC lines are
not used for this, but am I wrong? I'm pretty sure the DDC is used the
other way round.
dunno...

Basically all I want to know is how LCDs in particular set up their
incoming sampler section. The monitor must sample at the correct dot
clock before resizing the image and tossing it at the panel in native
resolution. Or so I think.

A PLL.
On an LCD the dots per line are FIXED.
So if you have 1920 H dots and a 32 µs line, you know the time to
hang around in one dot.
Calculate.
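Spelled out with those example numbers (blanking and porches ignored here;
a real scaler would fold them in):

#include <stdio.h>

int main(void)
{
    double line_time_s   = 32e-6;   /* 32 us per line     */
    int    dots_per_line = 1920;    /* fixed by the panel */

    double dot_time_s = line_time_s / dots_per_line;   /* ~16.7 ns */
    double dot_clock  = 1.0 / dot_time_s;              /* ~60 MHz  */

    printf("%.2f ns per dot, %.1f MHz dot clock\n",
           dot_time_s * 1e9, dot_clock / 1e6);
    return 0;
}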

It's for a personal project that went into a coma

OK
 
No, it really doesn't have to do this: For the simplest LCD, regardless of
how many pixels are actually being sent per line, you just have to sample
the same number of times as your LCD has pixels on the screen. If you LCD
is 1280x1024 and someone's shipping you 1600 pixels per line... oh well,

This would imply that the LCD monitor is able to retime itself to spew
out its native 1280 pixels at the same rate as the incoming video
line.

"Real" 1280x1024 has a hsync of 60KHz let's say. But now I give the
monitor a 640x480 VGA signal at a hsync of 31Khz. That doesn't fit. So
the LCD slows itself down to sample the incoming signal at 31KHz at
1280 samples (or whatever the actual number is) per line and spews out
1280 pixels to the panel at 31KHz HSYNC.

That kind of makes sense, the monitor is slaved to the incoming hsync
and then it just samples whatever happens to be there, aliasing etc..
be damned and tosses it out to the panel. Hm.

But now what about the lines? Does it just do a naive sequence of
trying to fit the incoming 480 lines into the native 1024 lines by
sending out the same line twice, then next line three times, etc... to
sort of get close to the 2.13 ratio required?

So it still has to know something about the incoming signal, otherwise
how could it do that trick?
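(I imagine the naive version of the "twice, then three times" trick only
needs the two line counts -- a sketch, with 480 and 1024 hard-coded for
illustration:)

#define SRC_LINES   480    /* incoming lines (640x480 in the example) */
#define PANEL_LINES 1024   /* native lines of the panel               */

/* Build a map saying which source line to show on each panel line.
   Each source line ends up repeated 2 or 3 times, averaging out to the
   1024/480 ~ 2.13 ratio. */
void build_line_map(int map[PANEL_LINES])
{
    for (int y = 0; y < PANEL_LINES; y++)
        map[y] = (y * SRC_LINES) / PANEL_LINES;
}
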
you'll just miss some of the information. If they send you 800 pixels per
line, you'll sometimes sample the same pixel more than once. Depending on
the ratio of pixels sent to pixels displayed, you get various artifacts from
this approach, often noticeably as vertical banding, particularly around
sharp edges. Extra lines are skipped, "absent" lines are repeated. The
result is potentially quite ugly, but given that LCDs are intended to be
used at their native resolutions anyway, this isn't much of a problem.
(I've also seen LCDs that simply "center" imagines that don't have enough
pixels... this is relatively common with, e.g., widescreen monitors that'll
have the option to put up black vertical borders on the left and right of
the image if you send them a 4:3 screen.)

That being said, I believe that some of the better LCDs (and, e.g., video
projectors where it's very much expected that they won't be run at their
native resolution) *do* try to lock on to the exact incoming pixel stream,
store it in a multi-line buffer, and then regenerate the output to the LCD
using a filter of their choosing (you could use, e.g., a simple average
filter so that for an 800x600 LCD you just average together four pixels in a
cluster if someone's sending you 1600x1200 -- the filter you choose is
determined largely by whether you're attempting to highlight edges, trying
to be color accurate, etc.: This is the standard sampling problem in image
processing, and there are about a dozen filters commonly used to "re-sample"
display data in an effort to convert from one display format to another.)

Ah, my mistake was in thinking the signal needs to be accurately
sampled.
What are you trying to do?

Display 15 kHz RGBI on a VGA-style monitor. My project kind of works; one
of the problems I have is that the source computer, a Commodore 128,
uses a 16 MHz video chip with a programmable sync rate. The dot clock
never changes, so I have to be able to regenerate this clock from
whatever hsyncs are being tossed at me.

I either have to build an edge phase detector that only samples on the
edge of the hsync and tries to correct a ceramic resonator, or use a
genlock PLL and figure out on the fly the correct ratio to get back
16 MHz.
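
On paper the ratio is just the dot clock times the measured line period,
so the divider would work out something like this -- only a sketch, with
an invented 20 MHz reference timer; the catch is exactly that the
measurement has to run from a fixed reference and not from the recovered
clock:

#include <stdint.h>

#define DOT_CLOCK_HZ 16000000UL   /* C128 dot clock, never changes        */
#define REF_TIMER_HZ 20000000UL   /* assumed fixed 20 MHz reference timer */

/* HSYNC period measured in reference-timer ticks -> PLL feedback divider
   N = dot clocks per line, rounded to the nearest integer. */
uint32_t pll_divider(uint32_t hsync_period_ticks)
{
    uint64_t n = (uint64_t)DOT_CLOCK_HZ * hsync_period_ticks
               + REF_TIMER_HZ / 2;
    return (uint32_t)(n / REF_TIMER_HZ);
}

/* Example: a 63.7 us line measured as 1274 ticks gives N = 1019,
   i.e. lock the VCO at 1019 x HSYNC to get the 16 MHz back. */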

The main problem is that my microcontroller is used as the divider,
but it also uses the recovered clock as its own clock. My mistake, I
guess. I can't use the microcontroller to count since its own clock is
always changing.

The chip that does the actual scan conversion has to know the dot
clock so that the output is as sharp, crisp, and jitter- and artifact-free
as displaying the RGBI on a native RGBI monitor. Otherwise why
bother?

Other problems include too much jitter on the recovered clock, and
bizarre behavior of the chip.

Anyways, I get a display, but there is fuzz around some pixels and I
can't get the LUT to work to map the colors to the correct VGA colors.
It's all a purple, fuzzy mess.

There is very little noise on the supply; the genlock's supply is so
quiet it's below what I can measure with my scope. I get the same 2 mV
waveform whether my probe is connected or not....
 
Rich Grise
That's great, now guess the exact dot clock of the video signal. Still
not difficult?

Not for someone who knows how their display works. It's not a guess. You
know the horizontal rate - just sort out the blanking and sync pulses,
count one line time, and multiply that by the number of pixels across
the screen to get the dot clock.
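
A worked version of that, using standard VESA 800x600 @ 60 Hz numbers
rather than any particular monitor (strictly it's the total dots per
line, blanking included, that you multiply by):

#include <stdio.h>

int main(void)
{
    double hfreq_hz      = 37879.0;   /* measured horizontal rate      */
    double dots_per_line = 1056.0;    /* 800 visible + blanking + sync */
    double dot_clock_hz  = hfreq_hz * dots_per_line;

    printf("dot clock = %.1f MHz\n", dot_clock_hz / 1e6);   /* ~40.0 MHz */
    return 0;
}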

Cheers!
Rich
 
John Fields
Not for someone who knows how their display works. It's not a guess. You
know the horizontal rate - just sort out the blanking and sync pulses,
count one line time, and multiply that by the number of pixels across
the screen to get the dot clock.
 
Michael A. Terrell
That's great, now guess the exact dot clock of the video signal. Still
not difficult?


Read the datasheets on the video processors in the monitors.


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
 