Setting DPI on gdm and Sessions

Tom Horsley tom.horsley at att.net
Wed May 20 00:36:46 UTC 2009


On Tue, 19 May 2009 17:06:49 -0700
Adam Williamson wrote:

> There's no such thing as a DPI that's 'best for you'. DPI means dots per
> inch. The correct DPI is a pure mathematical calculation based on the
> size of the display and the resolution in use. There is no room for
> subjectivity.
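
For the record, the "pure mathematical calculation" he's talking about
is nothing more than pixels divided by physical inches; a rough Python
sketch, with made-up numbers for a hypothetical panel about 20.4 inches
wide:

    # the entire "mathematically correct" DPI computation (made-up numbers)
    width_px, width_in = 1920, 20.4   # hypothetical 1920-pixel-wide panel
    dpi = width_px / width_in         # ~94 dots per inch
    print(round(dpi, 1))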

It is utter and complete nonsense like this that leads to so many
idiotic decisions in Linux. Caring about DPI having a formally correct
definition is for those afflicted with OCD. Being able
to read the damn characters most apps put on the screen is what
virtually everyone in the real world would prefer.

There are two ways to make the characters readable:

1. Rewrite every app in creation to conform to yet another new complicated
Xorg visibility layer for applying magnification factors computed
from the visual acuity of the user and distance from the display, combined
with several years' worth of human factors AI algorithms.

2. Lie about the DPI and achieve the mathematically identical effect
without modifying a single app.
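
To put numbers on option 2, here is a rough sketch (my own made-up
figures, nothing from any spec): work out the physical size you would
have to claim in an xorg.conf DisplaySize line so that the server
arrives at whatever DPI you actually want. The lazier routes are
"xrandr --dpi 96" or "Xft.dpi: 96" in ~/.Xresources.

    # millimetre values to "lie" into a Monitor section's DisplaySize
    # so the server computes the DPI you want (made-up numbers)
    res_x, res_y = 1920, 1080       # pixels
    target_dpi = 96.0               # what you want apps to see
    mm_per_inch = 25.4
    fake_w_mm = res_x / target_dpi * mm_per_inch   # 508 mm
    fake_h_mm = res_y / target_dpi * mm_per_inch   # ~286 mm
    print('DisplaySize  %d %d' % (round(fake_w_mm), round(fake_h_mm)))

It is exactly the physical-size arithmetic from above, just run in
reverse.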

In fact, quite a lot of monitors don't report the correct physical
dimensions, so unless you can lie about the DPI, you can't even correct
these display devices to show the absolutely anally correct DPI, much
less the DPI that makes characters visible on the 52 DPI HD monitor you
are sitting 3 feet away from.
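
Run the actual numbers on a TV like that (a hypothetical 42-inch 1080p
set, my own figures) and the problem is obvious:

    # "honest" DPI of a hypothetical 42" 1080p TV, and what a 10-point
    # font turns into at that DPI versus a claimed 96
    diag_in, res_x, res_y = 42.0, 1920, 1080
    width_in = diag_in * res_x / (res_x ** 2 + res_y ** 2) ** 0.5  # ~36.6 in
    true_dpi = res_x / width_in                                    # ~52 DPI
    px_honest = 10.0 / 72 * true_dpi                               # ~7 px tall
    px_lied = 10.0 / 72 * 96                                       # ~13 px
    print(round(true_dpi), round(px_honest), round(px_lied))

A 7-pixel-tall font on a screen across the room is the "correct"
answer; good luck reading it.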



