Using Ubuntu 16 on HiDPI and 4K Displays

August 07, 2017

I have since updated to Ubuntu 18 and am still using the settings below.

I am using Ubuntu 16.04 on a 4K display, and here are the X Window System settings I use.

First, a few notes:

  • I am using the i3 tiling window manager. If you are using something else, you may need different or additional settings.
  • There seem to be multiple ways to achieve the same result, so the settings below are not the only ones that work.
  • I am using a Quadro M620 with the latest NVIDIA driver and a Dell P2415Q 4K display.

There is a great answer on the Ask Ubuntu forum on this topic, and what I will say is very similar.

To summarize the problem first:

  • You have a HiDPI display, meaning the physical display size is similar to what you had before (e.g. 19", 22", 24", 27") but the resolution is doubled. The display I am using is 24" and 4K, so it supports 3840 x 2160, which is 184 dpi. I was using 2560 x 1440 on 27" before, which was 109 dpi. (check this site to see DPI values)

  • Now suppose there is an application using a font size of $12 pt$ (points). By typography standards, $1 pt = \dfrac{1}{72} inch$, so $12 pt = \dfrac{12}{72} inch = \dfrac{1}{6} inch$.

  • For years, it was standard to assume $96 px$ (pixels or dots) $= 1 inch$ on computer screens, so $\dfrac{96}{6} = 16 px$ was used to render this font. Now if we show $16 px$ on a 4K display like the P2415Q, it spans $\dfrac{16 px}{184 dpi} \simeq 0.08 inch \simeq 0.2 cm \simeq 2 mm$. Good luck trying to see that. (The snippet after this list shows how to compute the DPI of your own display.)
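
By the way, you do not have to trust the spec sheet for the DPI value: xrandr reports both the pixel resolution and the physical size (in millimeters) of each connected output, so you can compute it yourself. A minimal sketch with the P2415Q numbers (the output name DP-1 and the sizes are just examples; yours will differ):

    # the "connected" line ends with the physical size, e.g.:
    #   DP-1 connected primary 3840x2160+0+0 ... 527mm x 296mm
    xrandr | grep ' connected'

    # horizontal dpi = horizontal pixels / (width in mm / 25.4 mm per inch)
    echo '3840 / (527 / 25.4)' | bc -l   # prints ~185, matching the ~184 above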

The solution is:

  • Instead of assuming $96 px = 1 inch$ (i.e. 96 dpi) on HiDPI displays (like this 184 dpi one), we can change this to a higher value, typically a multiple of 96 (e.g. 120 or 144, i.e. 125% or 150%). Fonts (and, as a result, almost everything) then scale up in pixel size, so their physical size remains the same or similar to what we had before. If a text was $1 cm$ tall on the previous display, it will be similar on the current one, but because the current display has a higher pixel density, it will be much sharper. The worked example below makes this concrete.
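
With the numbers above: at $144 dpi$, the same $12 pt$ font becomes $\dfrac{144}{6} = 24 px$, which on the 184 dpi panel occupies $\dfrac{24 px}{184 dpi} \simeq 0.13 inch \simeq 3.3 mm$, much closer to the $\dfrac{16 px}{96 dpi} \simeq 4.2 mm$ it occupied on an old 96 dpi display.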

Here is a great article on MSDN which explains both the problem and the solution in detail, and much better than I can.

Although the solution is simple, applying it is more complicated because of the many moving parts, especially on Linux.

Here are the steps:

  1. Tell the X server to use whatever DPI you want. I will use 144, so I created /etc/lightdm/lightdm.conf.d/dpi.conf with the following contents:

    [SeatDefaults]
    xserver-command=X -dpi 144
    

    As far as I know, this DPI setting has the highest priority, so anything that tries to override it will not succeed. That is important, since there are many ways of setting the DPI. You may read in other articles that the NVIDIA driver can read the actual DPI from the EDID (Extended Display Identification Data) and that you may need to disable that behavior. I do not need to, because this setting overrides it.
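
    To verify that the X server actually picked up the value after you log back in, you can ask it what resolution it assumes (xdpyinfo is in the x11-utils package on Ubuntu):

    xdpyinfo | grep resolution
    # should print something like: resolution:    144x144 dots per inch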

  2. Tell Xft the dpi setting. I created a ~/.Xresources file with the following contents:

    Xft.dpi: 144
    

    and use it by running the following in ~/.xsessionrc:

    # load the X resources (including Xft.dpi) into the server
    if [ -f "$HOME/.Xresources" ]; then
      xrdb -merge "$HOME/.Xresources"
    fi
    

    Xft is the library that is basically responsible for drawing text in the X Window System.
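
    You can confirm that the resource was merged by querying the server's resource database:

    xrdb -query | grep Xft.dpi
    # should print: Xft.dpi: 144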

  3. Tell the dpi setting through xrandr. I added the following to ~/.xsessionrc as well:

    xrandr --dpi 144
    

    xrandr is, I believe, the last point where the output can be scaled or modified just before it is displayed. There can be a timing problem with this, and you may need to delay the execution. If so, try:

    bash -c "sleep 5; xrandr --dpi 144"
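
    Since I use i3, an alternative is to run this delayed command from the i3 config instead of ~/.xsessionrc. A sketch, assuming the default config location of recent i3 versions:

    # in ~/.config/i3/config
    exec --no-startup-id bash -c "sleep 5; xrandr --dpi 144"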
    

In addition to these, I run Chrome with a forced scale factor:

    /usr/bin/google-chrome-stable --force-device-scale-factor=2
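
To avoid typing the flag every time, one option is to copy Chrome's desktop entry into your home directory and add the flag to its Exec line (a sketch; these are the usual Ubuntu paths, but check your own system):

    # a copy in ~/.local/share/applications overrides the system-wide entry
    cp /usr/share/applications/google-chrome.desktop ~/.local/share/applications/
    # then edit the Exec line in the copy, e.g.:
    #   Exec=/usr/bin/google-chrome-stable --force-device-scale-factor=2 %U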