This guide is intended to be a quick checklist to make sure you don’t make any obvious mistakes when buying a monitor – not a comprehensive guide.
Panel type
If the panel type is not listed anywhere, avoid at all costs. If it says “TN”, avoid at all costs. Not only are TN panels usually cheap and low quality, they also have poor viewing angles. Even when you’re sitting straight in front of your monitor, gamma will be severely distorted the further you get from the center. Don’t believe me? Have a look at http://www.lagom.nl/lcd-test/viewing_angle.php. Good panel types are H-IPS, S-IPS or S-PVA. e-IPS is worse, but still better than TN.
Color depth
Look for information on the number of colors. If it says “16.2 million colors”, the panel is actually 6 bit (about 0.26 million native colors) and dithers to give the illusion of more. A good sign is “16.7 million colors”, which is often still dithered, but usually indicates higher quality dithering or even a native 8 bit panel. (And if something advertises “1.07 billion colors”, you know you’ve struck gold – usually a native 8 bit panel dithering to 10.)
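To see where those advertised numbers come from, here’s a quick sanity check – this assumes the marketing figures map to 6/8/10 bits per color channel as described above:

```python
# Total displayable colors for an RGB panel at a given bit depth per channel.
def color_count(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(color_count(6))   # 262,144 – the "0.26 million" a 6 bit panel really has
print(color_count(8))   # 16,777,216 – the advertised "16.7 million"
print(color_count(10))  # 1,073,741,824 – the advertised "1.07 billion"
```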
Screen size / resolution
Huge screen sizes with comparatively low resolutions (e.g. 27″ at 1920×1080) – avoid these, they’ll end up looking like shit. Good pairings are 22″ for 1920×1080 and 27″ for 2560×1440.
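If you want to check a size/resolution combination yourself, pixel density is easy to compute. As a rough rule of thumb (my own, not any standard), anything much below ~90 PPI starts to look coarse at desktop viewing distance:

```python
import math

# Pixels per inch from native resolution and diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 27), 1))  # ≈ 81.6 PPI – coarse, avoid
print(round(ppi(1920, 1080, 22), 1))  # ≈ 100.1 PPI – fine
print(round(ppi(2560, 1440, 27), 1))  # ≈ 108.8 PPI – fine
```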
Backlight
“LED” screens are just marketing hype for edge-lit white LED backlights, which don’t offer any image quality improvements. (They do, however, offer lower power consumption than CCFLs.) Real full-array or RGB LED backlights are much more expensive, but good. Don’t be scared of buying CCFLs, they can still be very good.
Response time and input latency
Not the same thing! Response time just means you’ll see a faint after-image of moving content because the pixels themselves don’t change fast enough, but unless it’s >10 ms or so you probably won’t notice it. Input latency is the real killer, and you have to look for reviews that measure it to find out how good a screen really is for gaming. Ideally you want a CRT if input latency bothers you.
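To put review measurements in perspective, it helps to convert milliseconds of latency into refresh cycles – the 33 ms figure here is purely illustrative, not a measurement of any particular monitor:

```python
# How many refresh cycles of lag a given input latency amounts to.
def frames_of_lag(latency_ms, refresh_hz):
    return latency_ms / (1000.0 / refresh_hz)

# A hypothetical display measured at 33 ms of input latency, at 60 Hz:
print(frames_of_lag(33, 60))  # ≈ 2 full frames behind your mouse
```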
Color gamut
If a screen says “wide gamut” anywhere or promises “super vivid colors”, avoid it unless you know what you’re doing (for example, the Dell UltraSharp U2410 is a wide gamut display, as are many other mid-budget IPS screens). If it is wide gamut, check whether it has a good sRGB emulation mode. (The U2410 has a very good one, fortunately.) If you don’t know what this means or can’t find the information anywhere, it’s probably sRGB and you should be fine.
Brightness
Most screens are way, way too fucking bright. When lots of screens are displayed next to each other in a shop, the human eye responds to the brightness differences – brighter reads as “better” – so manufacturers make sure their screens are as bright as possible out of the box. 120 cd/m² is the industry standard for serious graphics work, yet most screens ship at >200 cd/m², usually in the range of 300–400 cd/m². For example, I currently have my U2410 set at 9/100 on the brightness setting.
Contrast ratio
Anything super high like 20,000:1 or 50,000,000:1 is all bullshit – real contrast ratios are on the order of 1000:1, usually less. First off, you do NOT want any sort of dynamic contrast (I won’t go into the details why, but it’s very annoying); second, stable black levels are much more important than a high contrast ratio. A static contrast ratio of 2000:1 with a black level of 0.4 cd/m² would be awful – the whites (800 cd/m²) would literally blind you and the blacks would be dark gray at best, so expect lots of bleed. By contrast, with a contrast ratio of 600:1 and a black level of 0.2 cd/m², the whites would be fine (at 120 cd/m²) and the blacks quite a bit better. Higher does not necessarily mean “better”, because higher can just mean the display is fucking bright at maximum luminosity.
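The arithmetic behind those two examples is simply black level times contrast ratio:

```python
# Peak white luminance implied by a static contrast ratio and black level.
def white_level(contrast_ratio, black_cdm2):
    return contrast_ratio * black_cdm2

print(white_level(2000, 0.4))  # ≈ 800 cd/m² – blinding whites
print(white_level(600, 0.2))   # ≈ 120 cd/m² – right where you want them
```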
The “industry” method of determining the contrast ratio usually works like this: 1. Set the screen as bright as fucking possible, measure luminosity; 2. Set the screen as dark as possible, turn off all lights, then use some “dynamic contrast” feature to shut off the backlight entirely. Measure some really low value which is basically just ambient background light at this point. BAM! Contrast ratio of 50 fucking billion to 1! And off into the market it goes.
Refresh rate
If you’re going to watch a lot of Blu-ray or DVD material (shot at approx. 24 Hz), it’s much more important to get a monitor with a refresh rate that’s a multiple of that – e.g. 24 Hz or 48 Hz (120 Hz works too) – 60 Hz will cause noticeable framerate judder due to 3:2 pulldown. If you want to watch stereoscopic material (aka “3D”), you’ll need a refresh rate that is a multiple of 48 Hz (for example 96 Hz, 144 Hz or 192 Hz) – 48 Hz (or 60 Hz) works too, but you’ll get a /lot/ of noticeable flickering due to the lengthy black cycles. Alternating between two eyes is quite different from displaying a single image, because each eye only sees something half the time, whereas a single frame can be displayed for as long as needed.
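The judder comes from the uneven cadence. Here’s a small sketch of how 24 fps film maps onto 60 Hz versus 120 Hz – a simple round-half-up model of the cadence, not a description of how actual pulldown hardware is specified:

```python
# Number of refresh cycles each successive film frame stays on screen.
def pulldown_cadence(film_fps, refresh_hz, frames=6):
    cadence, shown = [], 0
    for i in range(1, frames + 1):
        total = int(i * refresh_hz / film_fps + 0.5)  # round half up
        cadence.append(total - shown)
        shown = total
    return cadence

print(pulldown_cadence(24, 60))   # [3, 2, 3, 2, 3, 2] – uneven, judders
print(pulldown_cadence(24, 120))  # [5, 5, 5, 5, 5, 5] – even, smooth
```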
Color temperature
Most monitors ship at a color temperature of around 9300K, which might be fine for office work but is far too blue for accurate colors on the web and in movies, both of which are standardized to sRGB/Rec.709/D65 (approximately 6504K). If you can adjust your color temperature, do so – and avoid cheap monitors which don’t allow you to.
Aspect ratio
Those who watch a lot of movies (and get bothered by black bars) should probably prefer 16:9, but the industry standard is to extend the height for 16:10 (8:5) rather than shrink the width, so most 16:10 resolutions have /more pixels/ than their 16:9 counterparts (e.g. 1920×1200 vs. 1920×1080). If you’re going to do a lot of work, a 16:10 monitor is desirable. It’s worth noting that the 16:10 ratio is nearly identical to the golden ratio (about a 1% error margin), which shows up all over nature and is widely held to appeal to the human eye.
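Both the golden-ratio claim and the pixel-count claim are easy to verify:

```python
# How close 16:10 is to the golden ratio, and how many extra pixels
# a 1920×1200 panel has over a 1920×1080 one.
golden = (1 + 5 ** 0.5) / 2                        # ≈ 1.618
error = abs(16 / 10 - golden) / golden
print(f"off by {error:.1%}")                       # ≈ 1.1%

extra = 1920 * 1200 - 1920 * 1080
print(f"{extra} extra pixels on the 16:10 panel")  # 230,400 more pixels
```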