A brief history lesson: LCD is the technology that replaced CRT outright. This brings up the first issue: it shouldn't have happened. LCD taking over the lower-end markets is fine, but it isn't really suitable for enthusiast gaming or movie/TV watching. A much better successor to CRT was proposed: FED, or Field Emission Display. SED, or Surface-conduction Electron-emitter Display, was born of this. Both technologies are far superior to LCD in virtually every way, quite close to OLED in fact, with some benefits of their own. They are like CRT taken to another level: rather than having a single cathode ray tube, both FED and SED use a matrix of tiny electron emitters, effectively one miniature CRT per pixel.
Self-illuminating pixels, which SED/FED and OLED technology offer, are the best type of display. They get you true blacks, since a pixel simply emits no light in black scenes, and as a result they also deliver much higher (virtually infinite) contrast ratios. LCD, by contrast, uses a static edge-mounted or back-mounted light to power the display, making it impossible to properly display blacks and greatly limiting contrast. SED/FED would have been a much more real-looking display, more like a window than a screen, and the natural evolution of CRT.
Thus, the benefits over LCD are superior contrast and blacks, perfect luminance uniformity, superior color uniformity, superior color accuracy (better than OLED in this regard as well), faster response times (1 ms with no overdrive artifacts), and apparently greater power efficiency as well. The downsides may have been lower potential peak brightness and temporary image retention on static images, requiring countermeasures to avoid.
A prototype Canon SED TV in 2006.
Many enthusiasts are strong advocates for OLED, but had FED and/or SED taken over from CRT instead of LCD, even if not completely (starting from the high-end market and eventually working its way down, as OLED is doing), we wouldn't be nearly so excited for OLED today.
So why didn't FED or SED take over? The main reason is cost: they were, and would have remained, more expensive to produce than LCD. Likewise, the only reason OLED has a chance of eventually succeeding LCD is not that it's significantly better in essentially every way, but that it may eventually be cheaper and simpler to produce.
But FED and SED weren't the only panel technologies superior to LCD. Remember Plasma? It wasn't suitable for monitors, since the panels had to be large, but it was much better than LCD for TVs. There were of course misconceptions about Plasma, mainly regarding "burn-in," but by the end of its lifespan only temporary image retention from static images remained, and the TVs had features to clear it. The benefits of Plasma were CRT-like motion clarity and response times greatly superior to LCD (some models refreshed their phosphors at 600 Hz, which, unlike modern motion interpolation, added no input lag), and much better image quality, with deeper blacks and contrast ratios potentially exceeding 20,000:1.
But again, LCD was cheaper to make and more economical/efficient than Plasma, so Plasma bit the dust. That's irrelevant for computer monitors, but it kept the TV industry lagging behind where it should be.
OLED isn't a very new technology, and it's taking far too long to reach the computer monitor industry. LG has an OLED TV lineup, and Panasonic released one OLED TV using an LG panel; that's it. The industry should have pushed harder for OLED instead of dragging out its release.
But even the LCD technology in computer monitors is, for the most part, years behind that of the high-end TV industry. The "best gaming monitors" of today, including all the high refresh rate models, use edge-lit backlighting, while very high-end TVs use full array backlighting (the LEDs are mounted behind the panel rather than along the edges). Edge-mounted lighting is greatly inferior: it produces heavily flawed, inconsistent brightness uniformity across the screen and suffers from backlight bleed (light "spilling" onto the screen). Yet even edge-lit TVs have far fewer bleed and uniformity issues, because they have superior QC.
A defective Eizo Foris FG2421 and the horrifying backlight bleed it had.
The same FG2421, showing the effects in game.
Where are the full array gaming monitors? But that's not the only deficiency in gaming monitors, and not even the biggest one. The biggest problem with LCD computer monitors, aside from QC, is that high-end ones only use TN or IPS panels. TN is the ideal LCD for very high refresh rate competitive gaming monitors, like the ASUS 180 Hz and 240 Hz models, since image quality is almost irrelevant there. But IPS is a waste of time, at least without full array local dimming with hundreds of dimming zones, especially given the rather poor IPS panels used in IPS gaming monitors (the 144 Hz models in particular). These monitors set new records for the worst IPS glow ever; compared to TN, it ruins image quality rather than enhancing it.
So IPS without aggressive local dimming has no real place in gaming monitors. It's meant to look better than TN, but it looks worse in dark scenes, and the difference elsewhere over the 1440p 144 Hz TN monitors is negligible. Both have only around 1000:1 static contrast, viewing angles don't matter much (as long as they're good enough for head-on viewing), and color accuracy is very similar, at least after calibration.
For immersive gaming, VA is the temporary solution until OLED finally makes an appearance, unless the upcoming 4K 144 Hz IPS monitors with 384 dimming zones prove better. But VA monitors have been problematic: they often have response-time peaks of 40 ms or more in certain color transitions, leading to nasty trailing/streaking behind moving objects, and if not that, then excessive overdrive leading to inverse ghosting. They also tend to have only 2500:1 to 3000:1 static contrast, a big improvement over the 1000:1 of IPS and TN, but upsetting when the average $700+ TV has 5000:1 contrast today and higher-end ones might reach 7000:1.
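To put those contrast figures in perspective: static contrast is just peak white luminance divided by black luminance, so at a fixed white level, the lower the contrast, the brighter (worse) the black floor. A quick illustrative calculation, assuming a 300-nit white level purely for the sake of the example (not a figure from any specific review):

```python
# Static contrast ratio = white luminance / black luminance.
# At a fixed white level, lower contrast means a brighter (worse) black floor.
WHITE_NITS = 300  # assumed peak white level, for illustration only

panels = {
    "TN/IPS (1000:1)": 1000,
    "typical VA monitor (3000:1)": 3000,
    "average high-end VA TV (5000:1)": 5000,
    "best VA TVs (7000:1)": 7000,
}

for name, contrast in panels.items():
    black_nits = WHITE_NITS / contrast
    print(f"{name}: black level ~ {black_nits:.3f} nits")
```

At 300 nits, a 1000:1 panel's "black" still emits 0.3 nits, five times the 0.06 nits of a 5000:1 VA TV, which is why dark scenes look grey on TN and IPS.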
Another issue: why do TVs get less intrusive glossy coatings, or better, while monitors always get blurry, image-degrading AG coatings? See the picture below for reference.
An anti-glare coating removed. This is why they ruin image quality so much.
Before you mention glare and reflection issues: higher-end TVs (in the $700-and-above price range) get the best of both worlds with their anti-glare glossy coatings. AR-treated glass is used on some very high-end models, like the LG E6 and the upcoming E7.
To illustrate the differences between the best gaming monitors, the aforementioned Samsung high refresh rate VA models, and very high-end VA TVs, see the table below. OLED isn't even included, since that would be unfair to LCD.
There is a consistent pattern here. The TVs are designed around picture quality, where they are light-years ahead of any monitor, while the monitors are designed more around high refresh rate fluidity. Nothing highlights this more than the fact that the TVs, for some reason, still use PWM backlighting even though they have good response times (as do the monitors). The backlight isn't visibly flickery on the TVs, but it does harm motion clarity somewhat.
But as of 2017, there is some hope. The ASUS ROG Swift PG27UQ, the Acer Predator XB272-HDR, and a 32" AOC equivalent have been announced. These are 27" (32" for the AOC) 3840 x 2160 144 Hz monitors driven over DisplayPort 1.4 (so DSC "visually lossless" compression is used), combining an AU Optronics AHVA (IPS) panel with full array local dimming and 384 dimming zones. Furthermore, they are designed around the DCI-P3 color space, which covers about 25% more colors than sRGB. While this will only oversaturate sRGB content like video games, the oversaturation is not grotesque, and for all we know these monitors will have a good sRGB emulation mode. They are also equipped with quantum dot technology, 1000 nits peak brightness (800 nits for the AOC), HDR support, and G-SYNC. This may surpass VA, although the lingering issue is the haloing or blooming caused by 384 dimming zones being insufficient (per-pixel dimming is ideal, which equates to 8,294,400 "dimming zones," like a 4K OLED screen). The halo/bloom effect will be stronger on these IPS monitors than on a VA screen with the same number of dimming zones, because IPS subpixels are worse than VA at blocking light (hence why VA static contrast is anywhere between 2.5x and 7x that of IPS, depending on the exact panel).
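The compression requirement is easy to sanity-check. DisplayPort 1.4's HBR3 link carries 32.4 Gbit/s raw, about 25.92 Gbit/s after 8b/10b encoding overhead, while 4K at 144 Hz with 10-bit RGB needs more than that even before blanking intervals are counted. A rough back-of-the-envelope sketch (active pixels only, so the real requirement is somewhat higher):

```python
# Why 3840 x 2160 @ 144 Hz exceeds DisplayPort 1.4 without compression.
# Rough sketch: active pixels only, blanking intervals ignored.
WIDTH, HEIGHT = 3840, 2160
REFRESH_HZ = 144
BITS_PER_PIXEL = 30  # 10 bits per channel, RGB

pixels_per_frame = WIDTH * HEIGHT  # 8,294,400 -- also the "dimming zone"
                                   # count of per-pixel dimming at 4K
needed_gbps = pixels_per_frame * REFRESH_HZ * BITS_PER_PIXEL / 1e9

DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10  # HBR3 after 8b/10b overhead: 25.92

print(f"Needed:    {needed_gbps:.2f} Gbit/s")   # ~35.83 Gbit/s
print(f"Available: {DP14_EFFECTIVE_GBPS:.2f} Gbit/s")
print("Compression required:", needed_gbps > DP14_EFFECTIVE_GBPS)
```

Even this optimistic estimate overshoots the link by roughly 10 Gbit/s, which is why these monitors cannot run 4K 144 Hz at full 10-bit RGB uncompressed.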
To summarize all of this: computer monitors are very far behind TVs, and TVs are behind where they should be too. LCD should ideally be a dead technology by now. The VA panels used in TVs are overall far nicer than what gaming monitors use, although gaming monitors do have motion clarity as good as one can expect from the fundamentally flawed LCD technology (there is still plenty of room for improvement beyond LCD). Our only hope is an OLED takeover.