Messing with sizes

The size of an item on a computer display or printout depends on many settings: font size, DPI/PPI, screen resolution, page zoom and so on all contribute to the final size.

Font sizes are traditionally measured in points, a unit used in typography long before screens were invented. A point is equal to 1/72 inch, i.e. 352.778 μm, so a font size is a physical size: a standard character M in a 12-point font is nominally 1/6 inch tall.
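The point-to-physical-size arithmetic above can be sketched in a couple of lines of Python (the helper names are mine, not from any standard library):

```python
def point_to_inches(points):
    """A typographic point is defined as 1/72 inch."""
    return points / 72

def point_to_micrometres(points):
    """1 inch = 25.4 mm = 25400 um."""
    return points / 72 * 25400

print(round(point_to_inches(12), 4))        # 12 pt is 1/6 inch
print(round(point_to_micrometres(1), 3))    # 1 pt is about 352.778 um
```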

Computer displays are made up of pixels. The relationship between the physical size of a display and its number of pixels is measured in PPI, pixels per inch (although the term DPI, dots per inch, is also commonly used). It is calculated simply by dividing the number of pixels by the physical size of the display. For example, a 4:3 screen with a 15″ diagonal and a 1024×768 resolution is 12″ wide and 9″ tall, giving 85.33 PPI.
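As a sanity check on that calculation, here is a minimal Python sketch (the function name is my own) that derives PPI from the resolution and the diagonal size, using the fact that the pixel counts and the physical dimensions share the same aspect ratio:

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch of a display, from its resolution and diagonal size.
    Since width:height in pixels matches width:height in inches, the
    ratio of the pixel diagonal to the physical diagonal gives the PPI."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)
    return diagonal_pixels / diagonal_inches

# 15" 4:3 display at 1024x768 (12" wide, 9" tall)
print(round(ppi(1024, 768, 15), 2))  # 85.33
```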

In the early days of computer displays, Apple chose 72 PPI as its standard, meaning the display of an Apple computer had 72 pixels per inch. The choice of 72 PPI arose from the fact that an inch equals 72 points in typography, so 1 point was equal to 1 pixel. A 10-point character M was therefore 10 pixels tall on screen, which made letters difficult to read, especially lower-case letters, whose x-height was only about 5 pixels.

To solve this problem, Microsoft lied to the software running on top of its platform by claiming that the PPI was 96, under which a 10-point character M was 13 pixels tall on screen. Since the physical PPI of the display was actually 72, everything became 1/3 larger. This was meant to compensate for the fact that the viewing distance of a screen is generally about 1/3 greater than that of a printout, and to allow bitmaps to be rendered in greater detail. The term PPI was then split into physical PPI, the actual PPI of the display (commonly 72), and logical PPI, the PPI reported to software (commonly 96). However, physical PPI could also mean the native PPI, i.e. the PPI of the display running at its native resolution, as opposed to the PPI of the resolution chosen by the user. Afterwards, much software and even many web pages were written with 96 PPI in mind, which broke under common alternative PPI values such as 72 and 120.
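The point-to-pixel conversion at the heart of this story is just a ratio: pixels = points × PPI / 72. A short sketch (again with a made-up function name) reproduces both the Apple and Microsoft figures:

```python
def points_to_pixels(points, logical_ppi):
    """Convert a typographic size in points to device pixels.
    One point is 1/72 inch, so pixels = points * ppi / 72."""
    return points * logical_ppi / 72

print(points_to_pixels(10, 72))         # 10.0 px: 1 pt = 1 px at 72 PPI
print(round(points_to_pixels(10, 96)))  # 13 px under the 96 PPI lie
```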

Then the mess began. If the physical PPI was not equal to the logical PPI, none of the sizes shown by software were true on screen. For example, if a piece of software running under the standard logical PPI (96) claimed that an item was 1 cm, it was really 1.33 cm on a standard 72-PPI screen if you measured it with a ruler. On a 96-PPI screen it really was 1 cm, because the logical PPI matched the physical PPI, but then everything looked too small again. Therefore, Microsoft added a 120 PPI option, which was meant to be used on 96-PPI displays.
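The distortion is simply the ratio of logical to physical PPI, since software converts centimetres to pixels with one value and the display converts pixels back to centimetres with the other. A hypothetical helper makes this concrete:

```python
def on_screen_cm(claimed_cm, logical_ppi, physical_ppi):
    """Physical size that a 'claimed_cm' item really occupies on screen.
    Software renders cm -> pixels using the logical PPI; the display
    maps pixels -> cm using its physical PPI (2.54 cm per inch)."""
    pixels = claimed_cm / 2.54 * logical_ppi
    return pixels / physical_ppi * 2.54

print(round(on_screen_cm(1, 96, 72), 2))  # 1.33 cm on a 72-PPI display
print(round(on_screen_cm(1, 96, 96), 2))  # 1.0 cm when logical == physical
```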

Windows 95 allowed the user to adjust the (logical) PPI; the configuration dialogue even showed a ruler on screen, which the user could drag until it matched a physical ruler held against the display, thereby setting the logical PPI to the physical value.

The default of 96 PPI worked fine for many years. Until very recently, LCD monitors were designed to have about 96 PPI, e.g. a 17″ monitor with a 1280×1024 resolution. However, as technology advances, ultra-high-resolution monitors have become so commonplace that it is possible to view a 1920×1080 video in full detail on a 19″ widescreen monitor, which means a higher physical PPI. The logical PPI has to be turned up to compensate for the smaller size, preferably to the same value as the physical PPI of the display. Then many applications break horribly, with text overflowing, buttons disappearing, and so on. The problem is even more serious on mobile phones, where it is possible to fit 1920×1080 into 5 inches. Websites designed with 96 PPI in mind are completely unreadable on mobile phones unless they are zoomed in a lot.
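Applying the same PPI arithmetic shows how far modern displays have drifted from the 96 PPI assumption. This sketch (the `ppi` helper is mine, repeated here so the snippet is self-contained) compares a classic monitor with the 5″ 1920×1080 phone mentioned above:

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

# Classic 17" monitor at 1280x1024: close to the assumed 96 PPI
print(round(ppi(1280, 1024, 17)))       # 96

# 5" phone at 1920x1080: far denser than the assumption
phone = ppi(1920, 1080, 5)
print(round(phone))                     # 441
print(round(phone / 96, 1))             # ~4.6x the assumed density
```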

Apart from the standard PPI, there is also a standard font size: traditionally 12 points for printed documents and 8 points for the user interface. However, 8 points is commonly seen as too small on 96-PPI displays, so the default was changed to 9 points starting from Windows Vista. When the screen is viewed from far away, the user probably wants to make the text larger while leaving the size of the UI unchanged, so the user increases the default font size but keeps the PPI the same as that of the physical display. I like to change the default font size to 14, and use even larger sizes, such as 16 or 18, in some contexts. Then, again, many poorly-written websites break horribly because they assume the standard font sizes.

Nowadays, most modern browsers allow the user to zoom the page and choose the font size independently. Web pages do not break when the zoom ratio is changed, because zooming scales everything uniformly, but poorly-written pages break when only the font size is changed. To make a box styled as 1 cm actually display as 1 cm on screen, the zoom level should be set to 100% in the browser, the system PPI should be set to the physical PPI of the display, and, to produce a clear image, the display should be set to its native resolution.

The best thing to do would be to achieve both size-independence and resolution-independence. The former means not using absolute physical sizes such as centimetres or inches in UI designs, and the latter means not using pixels as a unit of measurement. That leaves only relative sizes (such as em and %), which minimises the chance of breaking when the user changes the defaults.

Reference: http://blogs.msdn.com/b/fontblog/archive/2005/11/08/490490.aspx