Monitor Jargon Explained

Monitors are crucial pieces of hardware that can either greatly enhance or inhibit your experience with digital media. Some are for professionals, some for hobbyists, and others for everyone in-between. They’re also costly investments, and as is the case with any big purchase, they require at least a surface-level understanding before an informed decision can be made. If you’re in the market, this guide will clearly define every important term you are going to need to know. Let’s get started.

The Essentials

These are the headliners — the buzzwords you’re going to be bombarded with right out of the gate when viewing the specifications of any modern monitor.

Color: Accuracy, Depth, Contrast Ratio, And More

It’s an easy thing to forget when looking at all the cutting-edge features that manufacturers plaster on their ads and packaging, but color quality is far and away the most important factor when buying a monitor. If you’re a hobbyist — someone who’s buying a display chiefly for video games and movies — you want your monitor to display media as vividly and as true to life as possible. If you’re a creative, you need a display with a high degree of accuracy, so you understand how your work will look on displays that aren’t your own. If you’re buying for productivity, you need the information on the screen to be displayed clearly and consistently. All of that is to say: before you look at anything else, look at what a monitor offers in terms of color.

Color quality can be viewed as the sum of three factors: accuracy, depth, and contrast.

Accuracy is your display’s ability to match the colors present in the source of a video signal. An accurate monitor will display a shade of red as exactly that shade, where an inaccurate one might shift it toward red-orange, for example. Accuracy comes down to reproducing exact tones, which is why display calibration is often measured against shades of gray.

Color depth is interchangeable with bit depth, both being the number of bits of color information packed into a single pixel. Cramming more bits into each pixel produces an image with greater detail, especially in gradients — areas of an image that shift gradually from light to dark; too little depth shows up there as visible banding. Generally speaking, color depth matters most in darker images, where the steps between adjacent shades are easiest to spot. When a display is praised for its “great blacks”, color depth is part of the story, though contrast (covered next) plays the larger role.
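To put rough numbers on it, here is a quick sketch in Python (the function names are my own, for illustration only) showing how bit depth translates into color counts:

```python
def colors_per_channel(bits: int) -> int:
    """Number of distinct shades a single color channel can show."""
    return 2 ** bits

def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    """Total colors for a pixel built from red, green, and blue channels."""
    return colors_per_channel(bits_per_channel) ** channels

# An 8-bit panel: 256 shades per channel, ~16.7 million colors.
print(total_colors(8))   # 16777216
# A 10-bit panel: 1,024 shades per channel, ~1.07 billion colors.
print(total_colors(10))  # 1073741824
```

The jump from 16.7 million to over a billion colors is why 10-bit panels show far less banding in smooth gradients.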

Lastly, contrast, or contrast ratio, is the ratio between the brightest white and the darkest black a monitor can display. A monitor with a better contrast ratio can show images containing both very bright and very dark areas without washing either out. The brightness measurement underlying this ratio is called luminance.
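As a small illustration (the helper name is my own, not a standard formula), the ratio is just the brightest white divided by the darkest black, both measured in nits:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Ratio of the brightest white to the darkest black a panel can produce."""
    return white_nits / black_nits

# A typical LCD panel: 300 nits of white against 0.3 nits of black.
print(f"{contrast_ratio(300, 0.3):.0f}:1")  # 1000:1
```

This is why a lower black level improves contrast just as much as a brighter white does.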

Viewing Angle

Viewing angle is the consistency of a monitor’s image quality when viewed from multiple angles. It’s a concept that matters much more with TVs than with monitors, as most people view their monitor from roughly the same position. That said, there are still cases where it may be of note. If you’re buying a screen to be used as an additional display for work, for example, a poor viewing angle could turn a display that’s fine head-on into something less ideal. Image quality always degrades as you move off to either side of the screen. Viewing angle can also be something to keep in mind for gamers who don’t want to rule out couch co-op.

Input

A monitor’s input is the hardware it uses to receive signals from a source. A source is anything from a desktop computer to a home gaming console. Most monitors accept several kinds of inputs, and a monitor’s inputs are always listed on the packaging or in the product specifications. Input standards are not permanent; they change as new types of hardware are released, and dozens of different connectors have come and gone with advances in display technology.

At the time of writing, there are five input types in common circulation: D-Sub, VGA, DVI, HDMI, and DisplayPort (DP). Each has mild differences from the others; below is a brief description of each type.

The 5 Different Input Types

D-Sub— Short for D-subminiature, D-Sub takes its name from the D-shaped shell of its connector. It is rapidly being dropped from circulation, but it can still be found on hardware intended for office use.

VGA— VGA cables have long been the standard for desktop monitors, and this has only recently begun to change with the widespread adoption of HDMI. VGA carries an analog video signal and has no capacity for transmitting audio. The hardware is dated, and image quality isn’t as sharp as its digital contemporaries. VGA ports are included on the vast majority of older monitors.

DVI— DVI, or Digital Visual Interface, bridged the gap between VGA and HDMI. It carries a digital video signal (some variants can also carry analog), but like VGA it has no capacity for audio. It remains common on older desktop monitors and graphics cards.

HDMI— HDMI cables have been the standard for consumer technology for years now, and have slowly worked their way into professional-grade hardware as well. These cables transmit and receive both audio and video signals. Every monitor constructed within the last four or five years will include an HDMI port.

DisplayPort— DisplayPort (DP) is a more powerful, efficient alternative to HDMI. It works in the same way, transmitting both audio and video signals through a single cable, but it does so with greater bandwidth and cleaner results. More importantly, DisplayPort’s higher transmission speeds make it the best choice for 4K and other demanding video signals. As 4K overtakes 1080p as the standard resolution, so too may DisplayPort overtake HDMI as the standard input.

HDCP— High-bandwidth Digital Content Protection is encryption technology that blocks non-certified displays from receiving a signal. It isn’t an input itself: any of the above inputs can carry HDCP encryption. Companies and content creators use it to prevent unauthorized copying of their content. For general consumers, it’s mostly a non-issue, but if a display isn’t HDCP certified, it won’t be able to receive a signal from an HDCP-protected source.

TN (LCD)

TN is short for Twisted Nematic, and it’s the most common panel type used in LCD monitors. LCDs, or Liquid Crystal Displays, have been the standard in display technology for virtually every piece of hardware created in the past decade. Only recently, with the rapid adoption of Ultra-HD resolutions, have LCDs been surpassed in terms of image quality. Statistically speaking, you are most likely reading this article on an LCD screen.

IPS

IPS, or In-Plane Switching, panels are generally seen as superior to TN panels. Where TN panels twist light as it passes through the screen, IPS panels align their crystals in-plane with the screen so that light isn’t as heavily scattered. The most immediate benefit of this technology shows in viewing angles; a decent IPS panel will produce an image that looks nearly identical regardless of the angle it is viewed from. IPS panels are also capable of more accurate colors and deeper blacks, features that translate to more appealing visuals in nearly every type of digital content. IPS technology is held back by its tendency toward higher latency, which translates to higher response times. Because of this, many gaming-focused displays forgo the technology, in spite of its capacity for greater visual fidelity.

OLED

OLED is shorthand for organic light-emitting diode. In terms of visual fidelity and capacity for displaying Ultra-HD signals, OLED is the current leader in display technology. OLED sets itself apart from other options by illuminating each pixel individually; with LCD-based options, every pixel is lit by a shared backlight. Per-pixel lighting provides colors and depth that, at the time of writing, can’t be matched by other types of displays. The tech is new, however, and isn’t yet paired with other highly sought features. An OLED screen with a high refresh rate, HDR, G-Sync or FreeSync, and a low response time is still years out. If you’re only interested in visual fidelity, however, then OLED is the best option.

Aspect Ratio

Aspect ratio is the ratio of the width of an image to its height. 16:9, also called widescreen, has been the standard for the past decade, and it is the aspect ratio used for both Full HD and Ultra HD content. 21:9, or ultra widescreen, is another popular aspect ratio; it’s largely unique to monitors, however. Ultra widescreen offers greater screen real estate, making it a popular choice for both productivity and gaming. A 21:9 display offers roughly 30 percent more horizontal space than a 16:9 display of the same height. 21:9 isn’t always supported by game developers, however, which can create compatibility headaches.
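The ratio itself is just width and height reduced to lowest terms. This short sketch (illustrative only, not from any monitor spec) shows how common resolutions reduce:

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9 (Full HD)
print(aspect_ratio(3840, 2160))  # 16:9 (Ultra HD)
# "21:9" is marketing shorthand: a 2560 x 1080 ultrawide actually
# reduces to 64:27, which works out to roughly 21.3:9.
print(aspect_ratio(2560, 1080))  # 64:27
```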

4K

A 4K signal is one with a resolution of 3,840 x 2,160. It boasts four times the pixel count of Full HD; the name refers to its roughly 4,000-pixel horizontal resolution. 4K technology is Full HD’s inevitable successor and the default resolution for most new displays. It hasn’t been adopted by the market on a broad enough scale to be considered the new standard, however. That said, virtually all digital content currently in development, be it games or movies, is being produced with 4K in mind. Because of this, it’s the best choice for those looking for a future-proofed display.
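The “four times” claim is simple arithmetic, shown here as a quick check:

```python
full_hd = 1920 * 1080  # 2,073,600 pixels
uhd_4k = 3840 * 2160   # 8,294,400 pixels

# Doubling both width and height quadruples the pixel count.
print(uhd_4k // full_hd)  # 4
```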

HDR

HDR, or high-dynamic-range imaging, has come to be seen as a companion to 4K. HDR offers drastically increased color accuracy and depth, made possible by the broader range of luminance the technology can produce. With TVs, HDR is a win-win, and a borderline-necessary feature when paired with Ultra HD resolutions. With monitors, things are more complicated. Most monitors that advertise HDR are actually offering a sort of ‘pseudo HDR’. Broadly speaking, this is because common benchmarks for ‘full’ HDR call for around 1,000 nits of peak brightness, something that would be more of a hindrance than a benefit given how close people usually sit to their monitor. In other words, proper HDR is still a work-in-progress for desktop displays; early adopters beware.

Full HD

Full HD is a resolution of 1,920 x 1,080. This resolution is the standard for modern displays, with ‘regular’ HD (1,280 x 720) having largely disappeared from circulation. With the arrival of QHD and Ultra HD, however, it now sits as the minimum resolution offered by monitors. The ‘p’ often appended to resolutions (as in 1080p) stands for progressive scan, in which every line of the image is drawn on each refresh; an interlaced signal, denoted by an ‘i’, draws alternating lines on each pass and looks noticeably worse in motion.

QHD

QHD, or Quad HD, denotes a monitor with a resolution of 2,560 x 1,440. The resolution sits between Full HD and Ultra HD in terms of visual fidelity. It is extremely popular in the gaming space, as it offers noticeably improved clarity and detail over Full HD, but without the performance requirements and higher price point associated with 4K displays.

Refresh Rate

Refresh rate is the frequency with which the image being displayed on your monitor ‘refreshes’ itself; liken this to how quickly a projector moves from one slide to the next. Refresh rates are inextricable from the commonly cited term FPS — frames per second. A monitor with a higher refresh rate allows digital media to be displayed at more frames per second. More frames per second translate to a smoother, more immersive experience, especially with video games. With movies, the benefits aren’t nearly as noticeable: most films are shot at 24 FPS, six frames below the gaming industry’s standard of 30 FPS.

Refresh rates listed by monitors are the maximum they can achieve, not necessarily what you will be seeing; to take advantage of your monitor’s refresh rate, you need hardware capable of hitting that same benchmark in terms of FPS. The standard refresh rate is 60 Hz, with enthusiast-grade hardware capable of reaching 240 Hz.
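Refresh rate maps directly to the time the monitor spends on each frame, as this quick sketch shows (the helper name is my own):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """How long each refresh lasts, in milliseconds."""
    return 1000.0 / refresh_hz

print(round(frame_time_ms(60), 1))   # 16.7 (ms, at the 60 Hz standard)
print(round(frame_time_ms(144), 1))  # 6.9
print(round(frame_time_ms(240), 1))  # 4.2 (enthusiast-grade hardware)
```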

Response Time

Response time is how long it takes for your monitor to display changes taking place at the signal’s source. When you press a key on your keyboard, a monitor with a low response time will reflect the change caused by that keypress faster than one with a higher response time would. Once again, this is a concept that matters most with video games. For a monitor to be classified as a ‘gaming monitor’, it needs a response time between 1 and 5 milliseconds. For reference, most televisions have a response time of about 50 milliseconds.
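To see why gaming monitors need such low response times, compare the pixel transition time against how long a single frame lasts (an illustrative helper of my own, not a real benchmark):

```python
def lags_behind(response_ms: float, refresh_hz: float) -> bool:
    """True if pixel transitions outlast a single frame at this refresh rate."""
    return response_ms > 1000.0 / refresh_hz

# A 5 ms panel keeps up at 60 Hz, where each frame lasts ~16.7 ms...
print(lags_behind(5, 60))   # False
# ...but can't finish a transition inside one ~4.2 ms frame at 240 Hz.
print(lags_behind(5, 240))  # True
```

This is why high-refresh-rate gaming monitors chase 1 ms response times rather than settling for 5 ms.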

Adaptive-Sync, Free-Sync, And G-Sync

Adaptive-Sync is a process used to synchronize your monitor’s refresh rate with your hardware’s FPS output. If your hardware outputs significantly fewer frames than your display can refresh, you will experience stuttering, a nuisance characterized by frames harshly and jarringly jumping between one another. Conversely, if your hardware outputs more frames than your display can refresh, you will experience screen tearing, a visual hiccup where, for a brief moment, the displayed image appears ‘torn’ between two frames. Adaptive-Sync fixes both issues by matching the monitor’s refresh rate to the frames actually being delivered.

It is also the generic term for the technology: AMD and Nvidia, the two largest names in the PC gaming space, have proprietary versions of Adaptive-Sync that are only compatible with their respective hardware. AMD has FreeSync, Nvidia has G-Sync. Both do roughly the same thing, but because Nvidia has a larger share of the PC gaming market, G-Sync is generally seen as more desirable, since it’s more broadly applicable.

Backlight

A display’s backlight is the array of LEDs that illuminates the screen’s pixels. There are two different kinds of backlighting: direct backlighting and edge lighting. Each method has a characteristic flaw. Direct backlighting has the potential for backlight bleed, a visual nuisance where the backlight is clearly visible during darker scenes and images. Edge lighting is prone to uneven illumination, where some areas of the screen, usually those farthest from the LEDs, are noticeably dimmer than others. Backlight bleed is generally seen as the lesser of the two evils, and accordingly, directly backlit panels are more popular.