modeset on/off changes xrandr output

Adam Jackson ajax at redhat.com
Tue May 12 14:55:43 UTC 2009


On Tue, 2009-05-12 at 10:08 +0100, Mary Ellen Foster wrote:
> I've been looking around and I haven't seen anywhere that this is
> documented: if I toggle the modeset flag the output of xrandr changes
> radically (this is on fully updated F11 rawhide with Intel graphics):
> 
> - Outputs have different names:
>   - modeset: VGA1, LVDS1, DVI{1,2}, TV1
>   - nomodeset: VGA, LVDS, HDMI-{1,2}, TV

This is a bug; we should be getting the names the same between KMS and
UMS.  It's pretty much cosmetic though: IIRC, Gnome will remember output
configuration based on the EDID of the attached device rather than the
output name, so it shouldn't matter.

> - Maximum virtual size is different:
>   - modeset: 8192 x 8192
>   - nomodeset: 1920 x 1920 (nb: native resolution is 1920x1200)

This is not a bug, this is a feature.  KMS enables us to dynamically
reconfigure the framebuffer, which means we can advertise the maximum
virtual size the hardware can support.  We'll only ever allocate memory
based on what you currently have configured though, including
dynamically allocating the ancillary buffers for 3D apps.

In UMS we can't do this, so we have to pick a maximum size up front, and
then limit you for the rest of the session to desktop sizes that will
fit within that guess.  We don't want to allocate too much up front
because (on unified memory architectures like Intel) any memory you use
for graphics is memory you can't use for apps.  It adds up quickly:
1920x1200 at 32bpp is 8.8M, and you need three buffers that size for 3D
(front, back, depth), so that's 26M gone right up front, and that doesn't
even count the N full-screen windows you're likely to have (nautilus,
firefox, firefox, firefox...).  So we pick something based on what
you've got attached and how much memory you've got, and it all gets a
bit handwavey, but in your case you end up with 1920 square.

The reason we advertise 1920x1920 is for rotation: if you're rotated,
the screen is 1200x1920.
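
Just to make that arithmetic concrete, here's a quick back-of-the-envelope
version (standalone C, nothing to do with the actual driver; the sizes and
buffer names are simply the ones mentioned above):

#include <stdio.h>

int main(void)
{
    unsigned long width = 1920, height = 1200, bpp = 4;   /* 32bpp = 4 bytes */
    unsigned long one_buffer = width * height * bpp;      /* a single buffer */
    unsigned long for_3d = one_buffer * 3;                /* front + back + depth */

    printf("one 1920x1200 buffer: %.1f MB\n", one_buffer / (1024.0 * 1024.0));
    printf("front + back + depth: %.1f MB\n", for_3d / (1024.0 * 1024.0));
    return 0;
}

which prints about 8.8 MB and 26.4 MB respectively.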

> - Available resolutions and refresh rates are different, with many
> more available with nomodeset
>   - e.g., LVDS has only 1920x1200 with modesetting, but six further
> resolutions with nomodeset

This is a bug, but one that's at least fix-in-progress.

Mode list construction is kind of a weird process.  For our purposes,
EDID gives you a list of modes, optionally a bit that says "I can do
other modes too", and optionally a range descriptor (hsync, vsync,
and max dotclock).  You really don't want to send the monitor a mode it
doesn't support, because on a lot of LCDs that really won't work -
either you'll get a black screen, or it'll do a half-assed job of
showing you some subset of the signal you're sending and your panel
will be off the bottom of the screen.  So we try to be conservative in
the mode list we construct from KMS (and in the first pass of mode list
construction in UMS too).
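
To give a rough idea of what that conservative filtering looks like,
here's an illustrative sketch (not the actual EDID-handling code; the
struct layout, field names, and numbers are all invented):

#include <stdio.h>

/* Illustrative only: a made-up range descriptor and mode, not real EDID types. */
struct range {
    double hsync_min_khz, hsync_max_khz;      /* horizontal sync range */
    double vrefresh_min_hz, vrefresh_max_hz;  /* vertical refresh range */
    double max_dotclock_mhz;                  /* maximum pixel clock */
};

struct mode {
    const char *name;
    double hsync_khz, vrefresh_hz, dotclock_mhz;
};

/* Only keep a candidate mode if it fits entirely within the advertised range. */
static int mode_in_range(const struct mode *m, const struct range *r)
{
    return m->hsync_khz    >= r->hsync_min_khz   &&
           m->hsync_khz    <= r->hsync_max_khz   &&
           m->vrefresh_hz  >= r->vrefresh_min_hz &&
           m->vrefresh_hz  <= r->vrefresh_max_hz &&
           m->dotclock_mhz <= r->max_dotclock_mhz;
}

int main(void)
{
    struct range r = { 30.0, 83.0, 56.0, 76.0, 170.0 };  /* hypothetical monitor */
    struct mode candidates[] = {
        { "1920x1200 at 60Hz", 74.0, 60.0, 154.0 },
        { "2048x1536 at 85Hz", 138.0, 85.0, 388.0 },      /* outside the range */
    };
    for (unsigned i = 0; i < sizeof(candidates) / sizeof(candidates[0]); i++)
        printf("%s: %s\n", candidates[i].name,
               mode_in_range(&candidates[i], &r) ? "keep" : "drop");
    return 0;
}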

However, users occasionally want other modes for various reasons.  Maybe
3D performance isn't acceptable if the resolution is too high; maybe you
want to clone the same mode across multiple outputs; maybe the app
you're running has a fixed UI and you want it to fill the screen anyway.
So if the display has the "I can do other modes too" bit set, we'll try
to add some additional modes based on the range descriptor in the
monitor, or infer one from the list of modes it does support if it
doesn't give one explicitly.
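
The "infer one from the list" part is roughly what you'd expect; a
sketch with invented names (again, not the real code) might look like:

#include <stdio.h>

/* Invented types for illustration: derive a conservative range from the
 * sync rates and dotclocks of the modes the monitor does list. */
struct mode  { double hsync_khz, vrefresh_hz, dotclock_mhz; };
struct range { double hsync_min, hsync_max, vrefresh_min, vrefresh_max, max_dotclock; };

static struct range infer_range(const struct mode *m, unsigned n)
{
    struct range r = { m[0].hsync_khz, m[0].hsync_khz,
                       m[0].vrefresh_hz, m[0].vrefresh_hz, m[0].dotclock_mhz };
    for (unsigned i = 1; i < n; i++) {
        if (m[i].hsync_khz < r.hsync_min)       r.hsync_min = m[i].hsync_khz;
        if (m[i].hsync_khz > r.hsync_max)       r.hsync_max = m[i].hsync_khz;
        if (m[i].vrefresh_hz < r.vrefresh_min)  r.vrefresh_min = m[i].vrefresh_hz;
        if (m[i].vrefresh_hz > r.vrefresh_max)  r.vrefresh_max = m[i].vrefresh_hz;
        if (m[i].dotclock_mhz > r.max_dotclock) r.max_dotclock = m[i].dotclock_mhz;
    }
    return r;
}

int main(void)
{
    struct mode listed[] = {
        { 74.0, 60.0, 154.0 },   /* e.g. 1920x1200 at 60Hz */
        { 49.7, 60.0,  83.5 },   /* e.g. 1280x800 at 60Hz */
    };
    struct range r = infer_range(listed, 2);
    printf("hsync %.1f-%.1f kHz, vrefresh %.1f-%.1f Hz, dotclock <= %.1f MHz\n",
           r.hsync_min, r.hsync_max, r.vrefresh_min, r.vrefresh_max, r.max_dotclock);
    return 0;
}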

That only happens if the output has a sane EDID block though.  Many
laptop panels don't.  Either they have an EDID block that just contains
the one mode, or they don't have an EDID at all, just a magic bit of
data somewhere in the video ROM saying what the native size is.  But
most laptop chips can scale on the other end, on the GPU instead of in
the display.  So it's actually okay to add pretty much arbitrary modes
to the list, since the driver will just drive the panel at the native
timings and upsample in the GPU.  In UMS we'd catch this case, and add
some default modes below that size; in KMS we don't yet.
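
In other words, something morally equivalent to this sketch (the
candidate list and the native size here are made up, and this is just
the idea, not the actual code):

#include <stdio.h>

/* Sketch of the idea: for a panel the GPU can scale, add common default
 * modes that fit within the native size.  All data here is invented. */
struct size { unsigned w, h; };

int main(void)
{
    struct size native = { 1920, 1200 };   /* the one mode the panel reported */
    struct size defaults[] = {
        { 1680, 1050 }, { 1400, 1050 }, { 1280, 1024 }, { 1280, 800 },
        { 1024,  768 }, {  800,  600 }, {  640,  480 },
    };

    for (unsigned i = 0; i < sizeof(defaults) / sizeof(defaults[0]); i++)
        if (defaults[i].w <= native.w && defaults[i].h <= native.h)
            printf("adding %ux%u (driven at native timings, upscaled in the GPU)\n",
                   defaults[i].w, defaults[i].h);
    return 0;
}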

Actually that's a lie: we do fix it up for radeon, but not for intel
yet, as a workaround for a different bug.  There are two kinds of pixel
layouts the screen can have for intel, linear or tiled; sometimes we
have to switch between them, and we don't get that right yet.  The
kicker is that the switch happens basically between >= 1024x768 (tiled)
and <= 800x600 (linear).  So imagine what happens at anaconda time: X
starts at whatever resolution, then tries to randr down to 800x600 for
its fixed UI.  KMS will screw up the tiled->linear transition, and
you'll get a black screen of doom.  Awesome.  If, however, 800x600 isn't
available in the randr mode list, then the randr call will just fail
harmlessly, and anaconda will carry on its merry way centered in the
middle of the screen.  Arguably this is what anaconda should do _all_
the time.

The tiling setup failure shows up in a few other places too, so it's
definitely on the list to get fixed for F11 gold, and once that happens
we'll enable the code to do more resolutions on LVDS for intel too.

- ajax