Ubuntu bulletproof X

Adam Jackson ajackson at redhat.com
Tue Sep 4 16:16:46 UTC 2007


On Tue, 2007-09-04 at 10:08 -0600, Richi Plana wrote:
> On Tue, 2007-09-04 at 10:39 -0400, Adam Jackson wrote:
> > Why does anyone write a modeline by hand anymore?
> > 
> > % man cvt
> 
> Thanks for the tip. Is there any way to fine-tune the results? On my
> Sony projector TV (with HDMI input), the automatically generated
> modeline (from EDID information, I might add) results in the image
> extending past the edges of the physical display.

xvidtune should still work.  Sadly it's probably the nicest UI for that
atm.
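
For reference, cvt computes a CVT-standard modeline from the target
geometry; the output should look something like this (the numbers come
straight from the CVT formula, so they should be reproducible):

% cvt 1280 720 60
# 1280x720 59.86 Hz (CVT 0.92M9) hsync: 44.77 kHz; pclk: 74.50 MHz
Modeline "1280x720_60.00"   74.50  1280 1344 1472 1664  720 723 728 748 -hsync +vsync

If the picture lands off the edge of the display, shifting it is mostly
a matter of moving the sync start/end values around within the blanking
interval, which is what xvidtune's left/right/up/down buttons adjust.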

I'm also working on implementing DDC/CI, so you should be able to
control many monitor adjustment parameters from the host side.
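
In the meantime, the ddccontrol tool already speaks DDC/CI for many
monitors, assuming it's installed and your driver exposes an i2c device
node; a rough sketch (0x10 is the standard VCP brightness control, and
the device path will vary by system):

% ddccontrol -p                            # probe for DDC/CI-capable monitors
% ddccontrol -r 0x10 -w 50 dev:/dev/i2c-0  # set VCP control 0x10 (brightness) to 50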

> One of the nice things I've learned about X Windows is that the
> standard resolutions (1280x1024, 800x600, 1680x1050, etc.) aren't
> fixed: depending on the display device, you can come up with arbitrary
> resolutions so long as the display supports them. There was a time
> when I would hunt for the maximum resolution my video card and screen
> could manage while maintaining a 75Hz refresh rate, all done by hand
> with a lot of bumbling about with numbers. Are those days gone? Or
> will that capability be revived through a GUI program that takes care
> of the mathematical part of computing modelines?

That's sometimes true and sometimes not.  For CRTs it's certainly still
true.  LCDs almost always have a native refresh rate, and they either
have a built-in scaler or they don't.  If they do, then you can feed
them different refresh rates, and they'll resample the input to their
native output frequency, but it doesn't win you anything to do so since
the screen is going to pump out 60Hz whether you want it to or not.  If
the panel doesn't have a built-in scaler (which no dual-link monitor
I've seen has, and many laptop panels don't), then you pretty much have
to give it exactly the right refresh rate and an integer factor of the
native pixel size.
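
That's also why high-resolution panels are normally driven with CVT
"reduced blanking" timings: cutting the blanking down keeps the pixel
clock inside the link's limits (1920x1200 fits single-link DVI this
way, where full CVT blanking wouldn't).  cvt can generate those too; a
sketch:

% cvt -r 1920 1200 60
# 1920x1200 59.95 Hz (CVT 2.30MA-R) hsync: 74.04 kHz; pclk: 154.00 MHz
Modeline "1920x1200R"  154.00  1920 1968 2000 2080  1200 1203 1209 1235 +hsync -vsync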

To compensate for the no-scaler case, many graphics chips have a scaler
right at the output.  NVIDIA cards, for example, pretty much always
drive the panel at its native size and use that built-in scaler to
change modes.

This was actually a deficiency in EDID 1.3: the "ranges" section was
more or less required even though the monitor might not meaningfully
_have_ sync ranges.  EDID 1.4 (for which I nearly have parser support
implemented now) makes the ranges section optional, and defines how
host software is supposed to interpret EDID blocks without ranges.
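
If you're curious what your own monitor reports, the read-edid package
(not part of X; you may need to run it as root) can dump the EDID block
as an xorg.conf-style Monitor section, sync ranges included when the
monitor provides them:

% get-edid | parse-edid

The ranges themselves live in an 18-byte display descriptor tagged 0xFD
inside the block; when that descriptor is absent, there simply are no
ranges to report.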

> It really is too bad that the Modes can't be changed on the fly (without
> needing an X restart).

In RANDR 1.2 they can.

Ideally someone will write a usable frontend to it - so you can do mode
injection and fallback recovery with a nice pretty UI - and we'll get
all the drivers ported over to the RANDR 1.2 API.  That will be winning.
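
Until then, the xrandr command line can already do the injection half
by hand.  A minimal sketch, assuming your output is named VGA-0 (check
xrandr's output list for the real name):

% cvt 1280 1024 60
% xrandr --newmode "1280x1024_60.00"  109.00  1280 1368 1496 1712  1024 1027 1034 1063 -hsync +vsync
% xrandr --addmode VGA-0 1280x1024_60.00
% xrandr --output VGA-0 --mode 1280x1024_60.00

No fallback recovery there, of course - if the mode is wrong you get to
fix it blind - which is exactly why a real frontend would be nice.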

- ajax