Selecting a high-resolution monitor is like herding cats: As soon as you think you have one specification pinned down, you realize you’ve lost your grip on another requirement. There are many variables in monitor technology already, so the proliferation of formats and standards that have entered the market in recent years — such as 4K, 16:10, and FHD — can be dizzying.
We’ll start clearing things up by defining some terms, which are often the source of confusion. The terms high definition and high resolution, although related, are subtly different. Definition refers to the number of pixels in a screen; high definition is used for anything above what the old standard-definition TVs displayed, which was usually 576 horizontal lines of pixels. Resolution, on the other hand, refers to pixel density: higher resolutions convey more information in the same amount of space.
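To put numbers to that distinction: pixel density is simple arithmetic, derived from the pixel dimensions and the diagonal screen size. Here's a quick sketch in Python (the screen sizes are illustrative):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: pixels along the diagonal divided by the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same definition (1920 x 1080 pixels) produces two different resolutions:
print(f'{pixels_per_inch(1920, 1080, 24):.1f} PPI on a 24" screen')  # ~91.8
print(f'{pixels_per_inch(1920, 1080, 27):.1f} PPI on a 27" screen')  # ~81.6
```

The same pixel count spread across a larger screen yields a lower resolution, which is why definition alone doesn't tell the whole story.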
Higher monitor resolution can yield sharper images and finer detail, which is important for CAD users who need to see model details and linework clearly.
The following acronyms and names are a mishmash of official industry standards, proprietary brand names, and nicknames, but they're all measures of the number of pixels.
HD/720p. The original “high-definition” or HD monitors have 720 horizontal lines of pixels, meaning they are 720 pixels tall (hence the name 720p). They cost less than $100, but CAD professionals will want to bypass them in favor of higher resolution.
FHD/1080p. One of the first video modes native to the 16:9 aspect ratio of high-definition standards, 1080p indicates 1,080 horizontal lines of pixels. (In this case, “native” means the software and drivers displayed 16:9 content without stretching or compressing the picture to fill the space.) Another moniker for 1080p when the standard first became popular was Full HD (FHD). You can pay as much as $250 for a 1080p monitor up to about 25", but there are deals to be had for less than $100.
2K. A 2K display contains 2,048 vertical lines of pixels — that is, its horizontal resolution is 2,048 — and doesn’t cost much more than a 1080p display. Be aware that this term is sometimes applied to 1080p displays as well.
UHD/4K. You may see the term UHD used for monitors in this class, or 4K, or combinations of the two. For example, as soon as the Consumer Electronics Association decreed that the term ultra high definition (UHD) should supersede 4K across the industry, Sony thumbed its nose, releasing a 4K standard and calling it 4K ultra high definition.
And here’s where things get even more confusing: The term 4K usually indicates 4,096 vertical lines of pixels, as is the case with cinema projectors (this is sometimes called “cinema 4K”). But UHD monitors, even though they may be called 4K, typically have 3,840.
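To see how these formats stack up, it helps to multiply the numbers out; the pixel dimensions below come straight from the standards discussed above:

```python
# Total pixel counts for the formats discussed above.
formats = {
    "FHD (1080p)": (1920, 1080),
    "UHD (consumer '4K')": (3840, 2160),
    "Cinema 4K": (4096, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")
# UHD has exactly four times the pixels of FHD;
# cinema 4K is about 6.7% wider than UHD at the same height.
```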
Regardless of the exact number, more pixels in your screen theoretically translate to richer color and a sharper picture than lower resolutions deliver. But a 4K monitor needs substantial processing power behind it, lest your refresh rate suffer, so it might require the extra investment of a graphics processing unit (GPU) upgrade on top of the purchase price. Whether you'll need to upgrade your graphics architecture depends on your current card, your computer's processing power, and what you intend to do with them; if you notice a reduction in performance after you upgrade your monitor, your GPU is one area to investigate. Test-drive a system with higher specs along with your 4K monitor and see if you notice a difference.
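For a rough sense of why the GPU matters: every extra pixel has to be redrawn on every refresh. This back-of-the-envelope sketch counts only raw pixel fill — real CAD workloads add geometry, shading, and driver overhead on top:

```python
def pixels_per_second(width_px: int, height_px: int, refresh_hz: int) -> int:
    """Raw number of pixels the GPU must redraw each second."""
    return width_px * height_px * refresh_hz

fhd = pixels_per_second(1920, 1080, 60)
uhd = pixels_per_second(3840, 2160, 60)
print(f"FHD @ 60 Hz: {fhd / 1e6:.0f} megapixels/s")  # ~124
print(f"UHD @ 60 Hz: {uhd / 1e6:.0f} megapixels/s")  # ~498, four times the fill work
```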
Now that 4K monitors have been on the market for a few years, they are much more affordable than the thousand-dollar highs of 2014–2015. A decent one — meaning a name-brand monitor in the 24–27" range, with reputable after-sales support — will cost you $300–500. You can still pay as much as $1,000 for more extreme variations, such as very large, curved models.
It's tempting to ignore everything else and just pay attention to the number of pixels per inch, simply because more pixels on a screen mean a clearer picture: there are more color and shade elements available to display the image. And higher pixel density is certainly better in digital charge-coupled device (CCD) camera arrays, and when printing on paper. But when it comes to monitors, you will make a better overall decision if you keep a few other factors in mind, including size.
While you might gravitate toward the biggest monitor you can afford, the picture will only be as strong as the weakest link in your system. For CAD use, 19" monitors are considered entry-level; the 22–27" range is more common because it requires less panning and zooming, and the (often busy) palettes and windows of 3D design programs can take up a lot of screen real estate. Using multiple monitors — one for your live workflow and another for your palettes and other work, for example — is also popular. If your GPU isn't up to it, you'll get better performance from a smaller screen that refreshes at a faster rate (read about refresh rates below).
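Size also interacts with density. Here's the same UHD pixel grid spread across progressively larger panels (the diagonal sizes are illustrative):

```python
import math

# The same UHD (3840 x 2160) grid at different diagonal sizes.
for diagonal_in in (24, 27, 32, 40):
    ppi = math.hypot(3840, 2160) / diagonal_in
    print(f'{diagonal_in}" UHD panel: {ppi:.0f} PPI')
# 24": ~184 PPI; 27": ~163 PPI; 32": ~138 PPI; 40": ~110 PPI
```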
Many high-definition monitors have fixed native resolutions, and if your applications use different resolutions, their output might be stretched to fill the screen, resulting in a pixelated or blurred image. Before you invest, test your applications on a screen size you're considering — if windows or text appear too small, you might need to select a different model, with either a higher native resolution or one you can change in your system's monitor driver.
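One quick way to anticipate the "everything looks tiny" problem before you buy is to work out the physical size of a fixed-pixel interface element at each candidate density. The icon height below is a hypothetical example:

```python
# Hypothetical 24-pixel toolbar icon at different pixel densities.
icon_px = 24
for screen, ppi in (('24" FHD', 92), ('27" UHD', 163), ('24" UHD', 184)):
    print(f"{screen}: a {icon_px}px icon is {icon_px / ppi:.2f} inches tall")
# On a 24" UHD panel, the same icon is roughly half the physical size
# it was on a 24" FHD panel -- unless the OS or application scales it up.
```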
Aspect ratio — the proportion between the horizontal and vertical dimensions of the screen — is closely related to resolution. Many current models have a standard ratio of 16:9. Until the early 2000s, the standard monitor ratio was 4:3 (the same as an old analog TV), with a common widescreen option of 16:10, but around 2010 manufacturers started to adopt the 16:9 ratio as standard.
The reason is that as processors and memory became cheaper, the features in software applications increased to take advantage of them. More power could also drive bigger screens that displayed expanded tool palettes and on-screen controls.
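Aspect ratio also changes the physical shape of a screen of a given diagonal. A quick sketch, using a 24" diagonal as an illustrative size:

```python
import math

def screen_dimensions(diagonal_in: float, ratio_w: int, ratio_h: int):
    """Physical width and height from diagonal size and aspect ratio."""
    scale = diagonal_in / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

for rw, rh in ((4, 3), (16, 10), (16, 9)):
    w, h = screen_dimensions(24, rw, rh)
    print(f'{rw}:{rh} at 24": {w:.1f}" wide x {h:.1f}" tall')
# 4:3 -> 19.2" x 14.4"; 16:10 -> 20.4" x 12.7"; 16:9 -> 20.9" x 11.8"
```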
In many applications, such as watching movies and gaming, a fast refresh rate — a measure of how quickly the processor and graphics card can send image data and the LCD display can fire up the right pixels at the right brightness — is an essential element of display technology.
It's not as critical in the world of CAD, but it's still important. (If you've ever used a large monitor with a slow refresh rate, you'll understand why it's still a question you need to ask.) Even slight amounts of latency, noticeable as jerky movements when you drag windows or the mouse pointer across the screen, will have you tearing your hair out.
The standard measure of screen latency is response time: the milliseconds that pass while the system changes the color state of one pixel. But there is also a small amount of time needed for the video signal to leave your CPU or GPU and travel to the monitor: If you've ever used a high-resolution screen with a high-powered computer and wondered why the picture is so laggy (after the salesperson told you what a great refresh rate the screen had), this is probably the reason.
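To relate those milliseconds to refresh rate: each refresh gives the panel a fixed time budget, and the pixel response has to fit inside it. A simple illustration:

```python
# Frame-time budget at common refresh rates.
for refresh_hz in (60, 75, 144):
    frame_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz: {frame_ms:.1f} ms per frame")
# A panel with a 5 ms response time settles comfortably within a 60 Hz
# frame (16.7 ms); a 16 ms panel barely finishes before the next frame arrives.
```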
Don't pay too much attention to advertising or benchmarks when it comes to refresh rates; your eye is the best guide. Test-drive your usual software tools and if you're comfortable with the result, you've found an acceptable monitor.
Most LCD screens use one of two panel technologies: twisted nematic (TN) — the type behind most basic thin film transistor (TFT-LCD) monitors — and in-plane switching (IPS). IPS has better color accuracy and wider viewing angles, whereas the older TN technology has faster response times. Depending on your particular needs, it boils down to whether rich color or smooth movement is more important. IPS displays have also improved in speed in recent years, so for CAD purposes the difference might not be an issue, whereas it still can be in high-motion work such as video editing.
Even though there seem to be a hundred monitor choices on the market, thankfully they don't all use proprietary connectors. Almost all major display manufacturers adhere to the HDMI (high-definition multimedia interface) standard. There are other standards, including DisplayPort, VGA, and DVI, but HDMI — with its ubiquity and high-performance data transfer — has emerged as the de facto industry standard. Older, lower-bandwidth connections, such as USB 2.0 display adapters, aren't suitable for high-end CAD work; they simply can't transfer the huge data loads fast enough.
There have been several HDMI specification versions — the two you'll encounter today are 1.4 and 2.0 — but there's no such thing as an HDMI cable version; cables are simply rated Standard or High Speed. Although you might pay $30–40 for a cable with gold connectors, you can get equally good results from a $3 version.
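The bandwidth arithmetic behind those version numbers is straightforward. The sketch below counts active pixel data only; real links also carry blanking intervals and encoding overhead, so actual requirements run higher:

```python
def data_rate_gbps(width_px: int, height_px: int, refresh_hz: int,
                   bits_per_pixel: int = 24) -> float:
    """Raw video data rate in gigabits per second (active pixels only)."""
    return width_px * height_px * refresh_hz * bits_per_pixel / 1e9

print(f"UHD @ 30 Hz: {data_rate_gbps(3840, 2160, 30):.1f} Gbit/s")  # ~6.0
print(f"UHD @ 60 Hz: {data_rate_gbps(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
# HDMI 1.4 tops out at 10.2 Gbit/s, which is why it handles UHD only at 30 Hz;
# HDMI 2.0's 18 Gbit/s link is what makes UHD at 60 Hz practical.
```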