Solving the Linux DPI Puzzle

by Billy Biggs <vektor@dumbterm.net>, Sat Dec 4 2004

Linux applications use the DPI reported by the X server when converting from font point size to pixels. "Sans 10" renders at a smaller pixel size if your X server is configured at 75 DPI than if it is at 100 DPI.
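
For reference, the conversion is pixels = points x DPI / 72, since a point is 1/72 of an inch; the exact rendered size also depends on rounding and hinting. For "Sans 10":

     10 x  75 / 72 = about 10.4 pixels
     10 x 100 / 72 = about 13.9 pixels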

How the XFree86 and Xorg servers calculate DPI

The DPI of the X server is determined in the following manner:

  1. The -dpi command line option has highest priority.
  2. If this is not used, the DisplaySize setting in the X config file is used to derive the DPI, given the screen resolution (an example Monitor section follows this list).
  3. If no DisplaySize is given, the monitor size values from DDC are used to derive the DPI, given the screen resolution.
  4. If DDC does not specify a size, 75x75 DPI is used by default.
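
As a sketch of option 2, DisplaySize lives in the Monitor section of the X config file (xorg.conf or XF86Config, depending on your server). The identifier and the millimetre values below are placeholders only, chosen so that a 1280x1024 screen works out to roughly 96x96 DPI:

     Section "Monitor"
         Identifier  "Monitor0"
         # Physical size in millimetres; the server derives DPI as
         # pixels / (millimetres / 25.4), so 1280 / (339 / 25.4) is about 96.
         DisplaySize 339 271
     EndSection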

You can check what DPI your X server is set to by running xdpyinfo | grep resolution in a terminal window.
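
On a server running at 96x96, the relevant line of output looks like this (exact spacing varies):

     resolution:    96x96 dots per inch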

Other abstractions

DPI is also defined in two other places:

  1. By Xft/fontconfig. fontconfig has a "dpi" value, independent of the X server, which you can set in your local.conf file. GTK+ 2 and Qt 3 applications will honour this value if it is set.
  2. By GNOME. If "gnome-settings-daemon" is running, it advertises the DPI value set in the gnome-font-properties dialog. It also changes the Xft value to match, and so it affects Qt applications started after the daemon is running.
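
As a sketch of how the GNOME value can be inspected or set from a script: the GNOME 2 settings daemon reads its DPI from GConf. The key path below is an assumption from memory, so check it against your GNOME version:

     # Query the DPI that gnome-settings-daemon advertises, then set it to 96.
     gconftool-2 --get /desktop/gnome/font_rendering/dpi
     gconftool-2 --type float --set /desktop/gnome/font_rendering/dpi 96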

Problem: choosing default font sizes

Having a standardized DPI is important for choosing good default font sizes.

Windows machines use the DPI value as a way of globally changing font size. Windows XP defaults to 96 DPI. Changing to large font mode increases the DPI to 120. Users can also specify a custom DPI value. The default application font on Windows is "Tahoma 8".

MacOS X standardizes on 72 DPI, which means that fonts at the same point size are smaller on the Mac than on Windows. The default font on my MacOS X laptop is "Lucida Grande 13".

GTK+ uses a default application font of "Sans 10". This size seems to have been chosen assuming a screen DPI of 96x96.
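
Putting these defaults on a common footing with the points-to-pixels conversion above (values rounded; actual rendering also depends on hinting):

     Tahoma 8         at 96 DPI:   8 x 96 / 72 = about 10.7 pixels
     Lucida Grande 13 at 72 DPI:  13 x 72 / 72 = 13 pixels
     Sans 10          at 96 DPI:  10 x 96 / 72 = about 13.3 pixels
     Sans 10          at 75 DPI:  10 x 75 / 72 = about 10.4 pixels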

DPI in practice

The DPI used on a Linux desktop is defined by the following:

  1. If gnome-settings-daemon is running, it defaults to 96 DPI, and all GTK+/Qt applications will use this value. Your fonts will appear as intended.
  2. Otherwise, some distributions launch X using "-dpi 100". Fonts will appear as intended.
  3. If your monitor announces a size via DDC, X will derive its DPI from that size. These values are often unreliable, and regardless, this is not a good way to determine font sizes. The result is fonts which are usually either too big or too small.
  4. Otherwise, your X server falls back to 75x75, and your fonts are all too small.
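
A quick way to see which of these cases applies on a given machine (the log path below is the usual Xorg default and the log messages vary between drivers, so treat this as a sketch):

     xdpyinfo | grep resolution           # what the running server ended up with
     pgrep -f gnome-settings-daemon       # is the GNOME daemon overriding it?
     grep -i dpi /var/log/Xorg.0.log      # how the server derived its value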

In one weekend supporting tvtime and Eclipse on IRC, I saw the following DPI values from various users, all of whom were using the default X setup from their distribution: 75x75, 85x80, 100x100, 117x115, and 133x133.

Proposal

I strongly believe that the Linux desktop should pick a standard default DPI value. Applications and desktop systems cannot reliably choose default font sizes without this. My proposal is that we clean up all of the rough edges by deciding on a default DPI, and work towards making all of our abstractions track one global setting.

The proposal is as follows:

  1. Decide on 96x96 DPI as the default, since this is already quite popular and matches all default GNOME desktops.
  2. Distributions should go through all of their scripts: startx, gdm's gdm.conf, and xdm/kdm's Xservers files. These should all start X with the -dpi 96 command line option (sketched just below this list).
  3. Distributions should modify fontconfig's dpi setting in the default /etc/fonts/local.conf file.
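
For item 2, these are hedged sketches of where the option goes; the exact paths, file names, and section names vary between distributions and display-manager versions:

     # startx, as a one-off from a user's shell:
     startx -- -dpi 96

     # gdm.conf, in the X server command definition:
     command=/usr/X11R6/bin/X -dpi 96

     # xdm/kdm Xservers file:
     :0 local /usr/X11R6/bin/X -dpi 96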

Here is the code for inclusion in the fontconfig local.conf file:

     <match target="font">
         <edit name="dpi"><int>96</int></edit>
     </match>

Why is a fixed default DPI better than an autodetected one?

An obvious criticism of this proposal is that it proposes a single, arbitrary DPI, which seems to go against the whole concept of "dots per inch". Calculating the DPI from the monitor's physical size seems like a better idea.

The reasons why I believe a fixed DPI is a good idea are as follows:

  1. The monitor sizes obtained via DDC give wildly incorrect and variable results in practice, and when DDC fails, the 75x75 fallback is too small for the default font sizes of GTK+ applications.
  2. DPI applies well to printing, but not well to the screen. If I project my laptop display on a screen for a presentation, the theoretical DPI has clearly changed, but I do not want all of my fonts to suddenly change with it. DPI values for computer screens are simply convention and not meaningful.
  3. Other operating systems like Windows and MacOS choose arbitrary DPI values rather than auto-calculating them.
  4. Having a standard default fixed DPI makes it easier for application and desktop developers to choose default fonts.
  5. Font hints are specified for certain popular font sizes, so changing the DPI can affect the appearance of text, not just its size.

References

  1. fontconfig bug 2014
  2. DPI on Windows systems