Maker Pro

Resolution switching on a monitor

Tim Benner

Jan 1, 1970
I know you can change your monitor resolution in Windows from say 800x600 to
1024x768. The dot pitch of the monitor does not change however, so what is
happening in the electronics to make this magic happen? Does the width of
the electron beam change? Is it the scan rate that changes when you change
resolutions? I'm curious for an explanation of what happens in the hardware
when you make a resolution change. I am talking about CRT monitors.

Thanks!

[Tim]
 
Bob Myers

Jan 1, 1970
Tim Benner said:
I know you can change your monitor resolution in Windows from say 800x600 to
1024x768. The dot pitch of the monitor does not change however, so what is
happening in the electronics to make this magic happen? Does the width of
the electron beam change? Is it the scan rate that changes when you change
resolutions? I'm curious for an explanation of what happens in the hardware
when you make a resolution change. I am talking about CRT monitors.

It's just the scan frequencies that are changing to accommodate the
new timing/pixel format. The focus (which is sort of the electron beam
width, at least as it is seen at the screen) MAY be altered slightly as
well, if the monitor has the capability of storing adjustments for that
and other parameters (geometry, convergence, etc.) for specific timings,
although it is VERY unusual for focus to be included in this.

You're right, the dot pitch can't change - that's a fixed physical parameter
of the CRT itself - but the physical dots on the screen (or the holes in
the shadow mask) really have nothing at all to do with the logical pixels
of the image, other than being one of the things which ultimately limits
the resolution (in the proper sense of the word, the amount of detail which
can be resolved per unit distance or area) of the product.

Bob M.
 
Mjolinor

Jan 1, 1970
Bob Myers said:
800x600

It's just the scan frequencies that are changing to accommodate the
new timing/pixel format. The focus (which is sort of the electron beam
width, at least as it is seen at the screen) MAY be altered slightly as
well, if the monitor has the capability of storing adjustments for that
and other parameters (geometry, convergence, etc.) for specific timings,
although it is VERY unusual for focus to be included in this.

You're right, the dot pitch can't change - that's a fixed physical parameter
of the CRT itself - but the physical dots on the screen (or the holes in
the shadow mask) really have nothing at all to do with the logical pixels
of the image, other than being one of the things which ultimately limits
the resolution (in the proper sense of the word, the amount of detail which
can be resolved per unit distance or area) of the product.

Bob M.

The scan frequencies do not necessarily change at all when you change the
resolution. What happens is that the signal as seen on the VGA plug has
(for example) 1024 discrete values between two consecutive line syncs as
opposed to 800, and 768 line syncs between frame syncs as opposed to 600
(assuming non-interlaced). What you drive it into is irrelevant. The way
your particular monitor handles this change can vary, but generally the
more you pay for the monitor, the more changes will take place inside
when it detects the faster data rates, including things like dynamic
focus and dynamic beam acceleration.

On TFT monitors they specify a "recommended" resolution that the TFT works
best at; when not run at this resolution, the image gets seriously blocky
and in some cases the text is unreadable.
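The blockiness Mjolinor describes can be sketched with a toy nearest-neighbour scaler. This is purely illustrative (real panel scalers usually interpolate, trading blockiness for softness), and the 800/1024 widths are just the example formats from this thread:

```python
from collections import Counter

# Toy nearest-neighbour horizontal scaler: map each of the panel's 1024
# native pixels back to one of the 800 incoming pixels. Some source
# pixels end up 1 panel pixel wide and some 2 - exactly the uneven,
# blocky look you see when a TFT runs off its native format.
SOURCE_WIDTH = 800   # pixels per line in the incoming image
PANEL_WIDTH = 1024   # native pixels per line of the panel

# widths[s] = how many panel pixels source pixel s occupies
widths = Counter(x * SOURCE_WIDTH // PANEL_WIDTH for x in range(PANEL_WIDTH))

print(sorted(set(widths.values())))               # [1, 2] - uneven widths
print(sum(1 for w in widths.values() if w == 2))  # 224 sources get doubled
```

Single-pixel text strokes landing on the "doubled" columns come out twice as wide as their neighbours, which is why small text suffers most.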
 
Bob Myers

Jan 1, 1970
Mjolinor said:
The scan frequencies do not necessarily change at all when you change the
resolution.

Remember, we're talking about CRT monitors here, not LCD; it is
very rare for the scan frequencies NOT to change when changing
the pixel format. (There are a few of the original "VGA" modes, for
instance, that all use the same horizontal rate, but differ in the
vertical.)
Changing the "resolution" (pixel format) and not changing either the
horizontal or vertical rates can only come by packing more (or fewer)
pixels into a given scan line (since you can't possibly have changed the
lines per frame if neither scan rate changes, except trivially by altering
the blanking period). But since a CRT monitor doesn't know anything
about "pixels" in the first place, there's no real change from the monitor's
perspective.
What happens is that the signal as seen on the VGA plug has (for
example) 1024 discrete values between 2 consecutive line syncs as opposed to
800 discrete values and 768 line syncs between frame syncs as opposed to 600
(assuming non interlaced).

First, there aren't "1024" or "800" discrete values on the VGA video in
any case; the analog VGA interface provides absolutely no information
that permits "pixels" to be clearly distinguished. It carries a
continuous analog video signal. (Which is not to say that this video
can't be sampled at what you BELIEVE are the correct "pixel" times -
analog-input LCD monitors do exactly that - but there is nothing on the
interface itself that identifies the individual pixels for you.)
Thought experiment - try showing one line of video from a VGA interface,
running 1024 x 768 @ 60 Hz, on an oscilloscope - and point to pixel #483
on that line. This is especially fun when the image in question is a
full white raster...:)

Second, and more importantly - if you've changed the number of line syncs
between the number of frame syncs (i.e., changing the H sync rate vs. the
V sync rate), then you HAVE changed the line (horizontal scan) rate
by definition (assuming the same frame rate), right?
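Bob's arithmetic here can be checked in a couple of lines. The total-line counts below are the usual VESA figures for these modes (active lines plus vertical blanking), assumed for illustration, and the frame rate is taken as exactly 60 Hz:

```python
# Horizontal (line) rate = total lines per frame x frames per second.
# If the lines-per-frame count changes at the same frame rate, the
# H rate has changed by definition.

FRAME_RATE = 60.0  # Hz, assumed identical before and after the switch

def h_rate_hz(total_lines_per_frame, frame_rate=FRAME_RATE):
    return total_lines_per_frame * frame_rate

h_low = h_rate_hz(628)    # 800x600@60: 600 active + 28 blanked lines
h_high = h_rate_hz(806)   # 1024x768@60: 768 active + 38 blanked lines

print(f"800x600@60 : {h_low / 1e3:.2f} kHz")    # 37.68 kHz
print(f"1024x768@60: {h_high / 1e3:.2f} kHz")   # 48.36 kHz
```

(The published VESA rates differ slightly because the nominal refresh is not exactly 60 Hz, but the point stands: more lines per frame at the same refresh means a higher line rate.)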
What you drive it into is irrelevant.

I have absolutely no idea what you mean by this. But since a CRT
monitor (except for a VERY few highly specialized designs) runs the
horizontal and vertical deflection at the H and V sync rates of the
incoming video, it's most definitely relevant to the original question.
On TFT monitors they specify a "recommended" resolution that the TFT works
best at and when not run at this resolution they get seriously blocky and in
some cases unreadable text.

That would be the native pixel format (and frame rate) of the panel.
This is recommended, since when the incoming video matches the
requirements of the panel, no scaling or frame-rate conversion (both
of which can result in visible artifacts in the image) is needed.

Bob M.
 
Jerry Greenberg

Jan 1, 1970
To be simple, the rate or speed of the scanning changes to match the
resolution. More lines are scanned per frame, each line carries more
information, and so there is more information in total on the screen.

The detailed explanation of what happens is very long.

LCD and plasma displays handle this differently.


Jerry G.
========
 
Mjolinor

Jan 1, 1970
Bob Myers said:
Remember, we're talking about CRT monitors here, not LCD; it is
very rare for the scan frequencies NOT to change when changing
the pixel format. (There are a few of the original "VGA" modes, for
instance, that all use the same horizontal rate, but differ in the
vertical.)
Changing the "resolution" (pixel format) and not changing either the
horizontal or vertical rates can only come by packing more (or fewer)
pixels into a given scan line (since you can't possibly have changed the
lines per frame if neither scan rate changes, except trivially by altering
the blanking period). But since a CRT monitor doesn't know anything
about "pixels" in the first place, there's no real change from the monitor's
perspective.

What I said was that they do not necessarily change; what I meant was
that "there is nothing in the sync rates that denotes the resolution of
the image". Obviously you have to fit them into a frame, so one or the
other or both must vary for that to happen.

First, there aren't "1024" or "800" discrete values on the VGA video in
any case; the analog VGA interface provides absolutely no information
that permits "pixels" to be clearly distinguished. It carries a
continuous analog video signal. (Which is not to say that this video
can't be sampled at what you BELIEVE are the correct "pixel" times -
analog-input LCD monitors do exactly that - but there is nothing on the
interface itself that identifies the individual pixels for you.)
Thought experiment - try showing one line of video from a VGA interface,
running 1024 x 768 @ 60 Hz, on an oscilloscope - and point to pixel #483
on that line. This is especially fun when the image in question is a
full white raster...:)

This is incorrect; there are 1024 discrete analogue values. You can show
this easily on an oscilloscope by looking at one line from a black field
with a one-pixel-wide vertical white line.
I have absolutely no idea what you mean by this. But since a CRT
monitor (except for a VERY few highly specialized designs) runs the
horizontal and vertical deflection at the H and V sync rates of the
incoming video, it's most definitely relevant to the original question.

I don't think I am understanding your point here at all. What I meant by
that was that the video signal source and destination have no closed loop
properties. It is an open loop system. There is no control directed back to
the source from the monitor.
 
Bob Myers

Jan 1, 1970
Mjolinor said:
What I said was they do not necessarily change, "there is nothing in the
sync rates that denote the resolution of the image" is what I meant by that,
obviously you have to fit them into a frame so one or the other or both must
vary for that to happen.

Let's put it this way - if you're changing "resolution" (pixel
format) and/or refresh rate, then the vertical and horizontal
frequencies almost always change. No, there is nothing in the
sync rates that "denotes the resolution of the image", other than
the total number of lines per frame (and from this, and the
specific rates in question, the timing standard in use can usually
be identified). But no, since the analog video standard for PCs
does not include a true "blanking" or "display enable" signal
(nor anything from which these can readily be derived), you cannot
clearly identify the number of active lines per frame, which is
part of the pixel format description. You (or rather, your monitor)
are always basically guessing what standard timing is in use
(under the assumption that it IS in fact a standard timing).
This is incorrect, there are 1024 discrete analogue values.

No, there really aren't. IF you happen to be looking at an image
which consists of, say, alternating vertical lines, then yes, you can
tell where the "pixels" were supposed to be. Anything beyond
that is just a guess; again, there is NO pixel-level timing information
guaranteed in the VGA interface. You cannot unambiguously
determine the pixel locations within the video signal for any and
all video content. The best you can do is to try to generate a
pixel sampling clock from what timing information you DO have
(generally, just by multiplying up the horizontal sync rate) and
taking your best guess at how it should align with the active video
period.
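The guessing game Bob describes can be sketched as follows. The mode table holds the common VESA total-pixel and total-line counts for the two formats discussed in this thread, and the 5% match tolerance is an arbitrary choice for illustration:

```python
# An analog-input display can only measure the sync rates. To sample
# "pixels" it must guess the timing standard in use, then multiply the
# measured H rate by that standard's total pixels per line.

# (h_active, v_active) -> (total pixels per line, total lines per frame)
MODE_TABLE = {
    (800, 600): (1056, 628),
    (1024, 768): (1344, 806),
}

def guess_pixel_clock(h_sync_hz, v_sync_hz, tolerance=0.05):
    """Best-guess dot clock (Hz) from measured sync rates, or None."""
    lines_per_frame = h_sync_hz / v_sync_hz
    for (ha, va), (htotal, vtotal) in MODE_TABLE.items():
        if abs(lines_per_frame - vtotal) / vtotal < tolerance:
            return htotal * h_sync_hz  # pixels/line x lines/second
    return None  # unrecognized timing - nothing to multiply by

# 1024x768@60 nominally runs H ~48.363 kHz, V ~60.004 Hz:
clock = guess_pixel_clock(48_363, 60.004)
print(f"{clock / 1e6:.1f} MHz")  # 65.0 MHz
```

Note that the result is only as good as the guess: if the source uses a non-standard timing, the recovered clock, and hence the sampling, will be wrong.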
You can show
this easily on an oscilloscope by looking at one line from a black field
with a one pixel width vertical white line.

Sure, but that's a specific (and very fortunate) case. Again,
there's no way to distinguish pixels within, say, a flat white
field, or a single HORIZONTAL line, so there's really no
guarantee of "discrete" values. This is one of the problems which
has traditionally plagued analog interfaces for fixed-format
displays (such as LCDs), since those DO require accurate sampling
at the pixel times. There is a new analog video signal standard
in the works which is designed to address this (the VESA NAVI
standard), but since it's not published yet I can't go into the details
of it here.

If you really want to get into the details of all this (and I guarantee
you that they're a LOT less interesting than you might think..:)),
it's covered in chapters 6-9 of my book, "Display Interfaces:
Fundamentals & Standards," published by J. Wiley & Sons.


Bob M.
 
E. Rosten

Jan 1, 1970
Mjolinor said:
quote you "frequencies almost always change"
quote me "frequencies do not necessarily change "

What is the difference between those two statements apart from maybe
pessimist versus optimist.

If there are not 1024 discrete analogue values in the signal what the hell
are there.

Not 1024 values. Take this example: with some older, cheaper video
cards, the image becomes fuzzy with a very high dot clock. This is
sometimes down to little ferrite beads on the output lines, which
slightly low-pass filter the output signal. That means that even if the
DAC is outputting 1024 discrete values, what you see on the line is
certainly not that. You also get slewing in the DAC and any amplifiers,
which may also reduce the bandwidth to below that of the dot clock.

Often the bad picture is caused by low-quality RAM instead.
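Ed's low-pass argument is easy to demonstrate numerically. A minimal sketch, assuming a simple first-order filter whose arbitrarily chosen coefficient stands in for a cutoff somewhat below the dot clock:

```python
# One white pixel on a black line, one sample per pixel period, pushed
# through a first-order low-pass: y[n] = y[n-1] + alpha*(x[n] - y[n-1]).
# The "discrete value" 1.0 never actually appears on the wire.

def low_pass(samples, alpha=0.3):
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

line = [0.0] * 10 + [1.0] + [0.0] * 10   # single-pixel white line
filtered = low_pass(line)

print(f"nominal peak {max(line):.2f} -> filtered peak {max(filtered):.2f}")
```

With this coefficient the single-pixel pulse only ever reaches 0.30 of its nominal level; a wider (multi-pixel) feature would get much closer to 1.0, which is exactly why fine detail suffers first.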
There is a D-to-A producing 1024 discrete voltage values, and no amount
of "you can't see them" makes them something else. I am not saying you
do anything with them; I am just saying that there lies the difference
between the two resolutions: in one there may be 800, and in the other,
probably at a faster rate (but not necessarily so), there are 1024 of them.

Maybe I will look at your book; it would be interesting to learn about
the magic of a device that produces 1024 output voltages but
miraculously, when they get to the end of the wire, they are not there
anymore. :)

It's called a low pass filter.

:)

-Ed




--
(You can't go wrong with psycho-rats.) (er258)(@)(eng.cam)(.ac.uk)

/d{def}def/f{/Times findfont s scalefont setfont}d/s{10}d/r{roll}d f 5/m
{moveto}d -1 r 230 350 m 0 1 179{1 index show 88 rotate 4 mul 0 rmoveto}
for /s 15 d f pop 240 420 m 0 1 3 { 4 2 1 r sub -1 r show } for showpage
 
Mjolinor

Jan 1, 1970
Bob Myers said:
Let's put it this way - if you're changing "resolution" (pixel
format) and/or refresh rate, then the vertical and horizontal
frequencies almost always change. No, there is nothing in the
sync rates that "denotes the resolution of the image", other than
the total number of lines per frame (and from this, and the
specific rates in question, the timing standard in use can usually
be identified). But no, since the analog video standard for PCs
does not include a true "blanking" or "display enable" signal
(nor anything from which these can readily be derived), you cannot
clearly identify the number of active lines per frame, which is
part of the pixel format description. You (or rather, your monitor)
are always basically guessing what standard timing is in use
(under the assumption that it IS in fact a standard timing).


No, there really aren't. IF you happen to be looking at an image
which consists of, say, alternating vertical lines, then yes, you can
tell where the "pixels" were supposed to be. Anything beyond
that is just a guess; again, there is NO pixel-level timing information
guaranteed in the VGA interface. You cannot unambiguously
determine the pixel locations within the video signal for any and
all video content. The best you can do is to try to generate a
pixel sampling clock from what timing information you DO have
(generally, just by multiplying up the horizontal sync rate) and
taking your best guess at how it should align with the active video
period.


Sure, but that's a specific (and very fortunate) case. Again,
there's no way to distinguish pixels within, say, a flat white
field, or a single HORIZONTAL line, so there's really no
guarantee of "discrete" values. This is one of the problems which
has traditionally plagued analog interfaces for fixed-format
displays (such as LCDs), since those DO require accurate sampling
at the pixel times. There is a new analog video signal standard
in the works which is designed to address this (the VESA NAVI
standard), but since it's not published yet I can't go into the details
of it here.

If you really want to get into the details of all this (and I guarantee
you that they're a LOT less interesting than you might think..:)),
it's covered in chapters 6-9 of my book, "Display Interfaces:
Fundamentals & Standards," published by J. Wiley & Sons.


Bob M.

quote you "frequencies almost always change"
quote me "frequencies do not necessarily change "

What is the difference between those two statements apart from maybe
pessimist versus optimist.

If there are not 1024 discrete analogue values in the signal, what the
hell is there? There is a D-to-A producing 1024 discrete voltage values,
and no amount of "you can't see them" makes them something else. I am
not saying you do anything with them; I am just saying that there lies
the difference between the two resolutions: in one there may be 800, and
in the other, probably at a faster rate (but not necessarily so), there
are 1024 of them.

Maybe I will look at your book; it would be interesting to learn about
the magic of a device that produces 1024 output voltages but
miraculously, when they get to the end of the wire, they are not there
anymore. :)
 
Mjolinor

Jan 1, 1970
E. Rosten said:
Not 1024 values. Take this example: with some older, cheaper video
cards, the image becomes fuzzy with a very high dot clock. This is
sometimes down to little ferrite beads on the output lines, which
slightly low-pass filter the output signal. That means that even if the
DAC is outputting 1024 discrete values, what you see on the line is
certainly not that. You also get slewing in the DAC and any amplifiers,
which may also reduce the bandwidth to below that of the dot clock.

Often the bad picture is caused by low-quality RAM instead.


It's called a low pass filter.

:)

-Ed

OK, I'll concede the practical point, but I was not talking about
bandwidth-limited devices; rather, I was talking about the theory of it. :)
 
Tim Benner

Jan 1, 1970
Just want to say thanks for all the info. But my million-dollar question
is: does the size of a pixel change with different screen resolutions? I
was staring at the screen yesterday as I was switching resolutions, and
it appears that way to me. But what causes this? Are more of the color
triads being lit up (or skipped) at the lower resolutions than at the
higher resolutions?
This would make sense to me, since the number of color triads on the
monitor does not change. If you go from 1600 pixels per line down to 640
pixels per line, then either the lower resolution has to take up more
color triads per pixel, or skip some color triads to fit the smaller
resolution across the monitor screen. Can it do both?

[Tim]
 
Bob Myers

Jan 1, 1970
Mjolinor said:
If there are not 1024 discrete analogue values in the signal what the hell
are there.

Well, from the above, you apparently believe that when a
DAC is clocked, the output instantaneously reaches precisely
the nominal voltage level intended, with zero rise time, no
overshoot, undershoot, ringing, etc., and then does this again
exactly one pixel time later. I would submit to you that this
does not, and in fact CANNOT, occur. Further, you cannot
possibly identify the pixel periods with any real certainty, given
just the information carried over the VGA interface. So I
guess I'm still having some problems understanding just what
"1024 discrete analogue values" would actually mean in any
practical sense.

In simpler terms, in analog video the notion of a "pixel" as a
distinguishable thing simply does not make sense. In analog
video, all you can really talk about is the video bandwidth,
which is what really corresponds to "resolution" here.
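A common rule of thumb (my assumption, not something Bob states here) is that the finest detail an analog channel must carry is an alternating on/off pixel pattern, i.e. one full cycle per two pixel periods, so the required video bandwidth is roughly half the dot clock:

```python
# Worst-case detail is on/off/on/off pixels: one cycle per two pixel
# periods, so required bandwidth ~ dot clock / 2. A rough sizing rule,
# not an exact figure.

def min_video_bandwidth_hz(pixel_clock_hz):
    return pixel_clock_hz / 2.0

# 1024x768@60 with the usual 65 MHz dot clock:
print(f"{min_video_bandwidth_hz(65e6) / 1e6:.1f} MHz")  # 32.5 MHz
```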

Bob M.
 
Bob Myers

Jan 1, 1970
Tim Benner said:
Just want to say thanks for all the info. But my million dollar question
is, does the size of a pixel change with different screen resolutions? I
was staring at the screen yesterday as I was switching resolutions, and it
appears like it to me. But what causes this? Are more of the color triads
being lit up (or skipped) at the lower resolutions than at the higher
resolutions?

The bottom line is that on a CRT display, there's really no such thing
as a readily-identifiable "pixel." For faster timings, certainly the
duration of a pixel period (and therefore the width of "single-pixel"
features such as vertical lines) will change, but there is not generally
an intentional change in the CRT spot size made by the monitor.
(In fact, and this is going to seem rather counter-intuitive, the spot
size of a typical CRT monitor is almost always considerably larger
than the "pixel size" you'd get by simply dividing, say, the image
width by the number of logical "pixels" per line.)

With "higher-res" timings, yes, less screen area (number of
phosphor triads, whatever) corresponds to a given logical
pixel. But it's not as neat and tidy a relationship as you might
think at first.
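Bob's point can be made concrete with rough numbers. The 320 mm viewable width and 0.25 mm triad pitch below are assumed, typical-ish values for a 17" CRT, not figures from the thread:

```python
# Logical pixel width (image width / pixels per line) vs. a fixed triad
# pitch. At 1600 pixels per line the logical pixel is narrower than one
# triad - and the actual CRT spot is larger still.

VIEWABLE_WIDTH_MM = 320.0   # assumed horizontal image width
DOT_PITCH_MM = 0.25         # assumed (fixed) triad pitch

for pixels_per_line in (640, 800, 1024, 1600):
    pixel_mm = VIEWABLE_WIDTH_MM / pixels_per_line
    print(f"{pixels_per_line:4d} px/line: pixel ~{pixel_mm:.3f} mm, "
          f"~{pixel_mm / DOT_PITCH_MM:.1f} triads wide")
```

At 640 pixels per line each logical pixel spans about two triads; at 1600 it spans less than one, so the triad structure (and the even larger spot size) becomes the limiting factor, not the pixel count.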


Bob M.
 
Mjolinor

Jan 1, 1970
Bob Myers said:
Well, from the above, you apparently believe that when a
DAC is clocked, the output instantaneously reaches precisely
the nominal voltage level intended, with zero rise time, no
overshoot, undershoot, ringing, etc., and then does this again
exactly one pixel time later. I would submit to you that this
does not, and in fact CANNOT, occur. Further, you cannot
possibly identify the pixel periods with any real certainty, given
just the information carried over the VGA interface. So I
guess I'm still having some problems understanding just what
"1024 discrete analogue values" would actually mean in any
practical sense.

It has no meaning at all in the practical sense, but I wasn't talking in
a practical sense. The original question wasn't aimed at understanding
things that deeply; he was concerned with what the difference was in the
signals that allowed the monitor to "know" and react to what was going
on in the changed signal from the PC. I attempted to explain the changes
in this signal without the complex analysis that a full explanation
would require. I don't think it would serve any purpose for basic
understanding to go into the finite bandwidth of devices and the
instantaneous voltage when examined several orders of magnitude faster
than the pixel rate involved.
 
Bob Myers

Jan 1, 1970
Mjolinor said:
It has no meaning at all in the practical sense, but I wasn't talking in
a practical sense. The original question wasn't aimed at understanding
things that deeply; he was concerned with what the difference was in the
signals that allowed the monitor to "know" and react to what was going
on in the changed signal from the PC.

Exactly. And since the monitor does NOT see any
information that identifies individual pixels within the video
signal (even though that may be a convenient way for the
video to be thought of in some cases), this has absolutely
NOTHING to do with "the signals that allowed the monitor
to 'know' and react to what was going on..." The only
change in the signal set that the monitor sees, recognizes,
and acts upon are the changes to the horizontal and vertical
sync timing, PERIOD. To bring up the notion of "discrete
pixels" in such a discussion is irrelevant, misleading, and
simply incorrect.
I attempted to explain the changes in this
signal without the complex analysis that a full explanation would require. I
don't think it would serve any purpose for basic understanding to go into
the finite bandwidth of devices and the instantaneous voltage when examined
several orders of magnitude faster than the pixel rate involved.

There's not even a need to examine it THAT fast; for
"single-pixel" details, the output of the DAC may NEVER
be at the nominal intended voltage level corresponding
to the DAC input, except for the briefest of instants during
transition.

Bob M.
 