DPI (Dots Per Inch), also known as PPI (Pixels Per Inch), is a property of an X screen that describes the physical size of pixels. Some X applications, such as xterm, can use the DPI of an X screen to determine how large (in pixels) to draw an object in order for that object to be displayed at the desired physical size on the display device.
The DPI of an X screen is computed by dividing the size of the X screen in pixels by the size of the X screen in inches:
DPI = SizeInPixels / SizeInInches
Since the X screen stores its physical size in millimeters rather than inches (1 inch = 25.4 millimeters):
DPI = (SizeInPixels * 25.4) / SizeInMillimeters
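For example, a 1280x1024 X screen reported as 382x302 millimeters (the same configuration shown in the xdpyinfo output below) works out to roughly the 85x86 DPI reported there:

HorizontalDPI = (1280 * 25.4) / 382 ≈ 85
VerticalDPI   = (1024 * 25.4) / 302 ≈ 86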
The NVIDIA X driver reports the size of the X screen in pixels and in millimeters. On X.Org 6.9 or newer, when the XRandR extension resizes the X screen in pixels, the NVIDIA X driver computes a new millimeter size for the X screen in order to maintain a constant DPI (see, for example, the "Physical Size" column of `xrandr -q` output). This is done because a changing DPI can cause interaction problems for some applications. To disable this behavior and instead keep the millimeter size of the X screen fixed (and therefore let the DPI change), set the ConstantDPI option to FALSE (see Appendix B, X Config Options for details).
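For example, the option could be set as follows; placement in the appropriate Device or Screen section of the X configuration file is assumed here, and Appendix B documents the authoritative syntax:

Option "ConstantDPI" "FALSE"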
You can query the DPI of your X screen by running:
% xdpyinfo | grep -B1 dot
which should generate output like this:
  dimensions:    1280x1024 pixels (382x302 millimeters)
  resolution:    85x86 dots per inch
The NVIDIA X driver performs several steps during X screen initialization to determine the DPI of each X screen:
If the display device provides an EDID, and the EDID contains information about the physical size of the display device, that information is used to compute the DPI, along with the size in pixels of the first mode to be used on the display device. If multiple display devices are used by this X screen, the NVIDIA X driver will choose which display device to use. You can override this choice with the "UseEdidDpi" X configuration option by specifying a particular display device to use; e.g.:
Option "UseEdidDpi" "DFP-1"
or disable EDID-computed DPI by setting this option to false:
Option "UseEdidDpi" "FALSE"
EDID-based DPI computation is enabled by default when an EDID is available.
If the "-dpi" command line option to the X server is specified, it is used to set the DPI (see `X -h` for details; an example command line appears after this list). This overrides the "UseEdidDpi" option.
If the "DPI" X configuration option is specified (see Appendix B, X Config Options for details, and the sketch after this list), it is used to set the DPI. This overrides the "UseEdidDpi" option.
If none of the above is available, the "DisplaySize" entry in the Monitor section of the X config file, if present, is used to determine the DPI; see the xorg.conf or XF86Config man pages for details, and the example after this list.
If none of the above are available, the DPI defaults to 75x75.
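The sketches below illustrate the three configuration methods above. The specific values (96 DPI, the "Monitor0" identifier) are illustrative assumptions rather than recommendations, and the exact string format of the "DPI" option is documented in Appendix B.

Passing an explicit DPI on the X server command line (for example, through startx):

% startx -- -dpi 96

Setting the DPI through the X configuration option:

Option "DPI" "96 x 96"

Letting the server derive the DPI from a DisplaySize entry in the Monitor section (here using the 382x302 millimeter size from the example above):

Section "Monitor"
    Identifier  "Monitor0"
    DisplaySize 382 302
EndSection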
You can find how the NVIDIA X driver determined the DPI by looking in your X log file. There will be a line that looks something like the following:
(--) NVIDIA(0): DPI set to (101, 101); computed from "UseEdidDpi" X config option
Note that the physical size of the X screen, as reported through `xdpyinfo`, is computed from the DPI and the size of the X screen in pixels.
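In other words, the millimeter size advertised by the X server follows from rearranging the formula above:

SizeInMillimeters = (SizeInPixels * 25.4) / DPI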
The DPI of an X screen can be confusing when TwinView is enabled: with TwinView, multiple display devices (possibly with different DPIs) display portions of the same X screen, yet DPI can only be advertised from the X server to the X application with X screen granularity. Solutions for this include:
Use separate X screens, rather than TwinView; see Chapter 15, Configuring Multiple X Screens on One Card for details.
Experiment with different DPI settings to find a DPI that is suitable for both display devices.