A brief history lesson: LCD is the technology that replaced CRT, and that brings up the first issue: it shouldn't have happened. LCD taking over the lower end of the market is fine, but it's not really suitable for enthusiast gaming or movie/TV watching. A much better successor to CRT was proposed: FED, or Field Emission Display, from which SED (Surface-conduction Electron-emitter Display) was derived. Both technologies are far superior to LCD in virtually every way, quite close to OLED in fact, with some benefits of their own. They are like CRT taken to another level: rather than a single cathode ray tube, FED and SED use a matrix of tiny electron emitters, essentially a miniature CRT for each individual pixel.
Self-emissive pixels, which SED/FED and OLED offer, make for the best type of display. They produce true blacks, since a pixel simply emits no light in a black scene, and with that comes a much higher (virtually infinite) contrast ratio. LCD, by contrast, relies on a constant edge-mounted or back-mounted light to drive the display, which makes it impossible to properly display black and greatly limits contrast. SED/FED would have looked far more real, more like a window than a screen, and it was the natural evolution of CRT.
Thus, the benefits over LCD are superior contrast and blacks, perfect luminance uniformity, superior color uniformity, superior color accuracy (better than OLED in this regard as well), faster response times (1 ms with no overdrive artifacts), and apparently greater power efficiency. The downsides would likely have been lower potential peak brightness and temporary image retention on static images.
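To make the contrast claim concrete, here is a minimal sketch of how static contrast ratio is calculated. The luminance figures are illustrative assumptions for a typical LCD, a VA panel, and an emissive display, not measurements of any specific model:

```python
# Static contrast ratio is simply white luminance divided by black
# luminance. The nit values below are illustrative assumptions only.

def contrast_ratio(white_nits, black_nits):
    """Return the static contrast ratio, or infinity for a true black."""
    if black_nits == 0:
        return float("inf")
    return white_nits / black_nits

# A typical LCD backlight always leaks some light through "black" pixels.
lcd = contrast_ratio(white_nits=250, black_nits=0.25)    # 1000:1
# A VA panel leaks less.
va = contrast_ratio(white_nits=250, black_nits=0.05)     # 5000:1
# An emissive pixel (SED/FED/OLED) can switch off entirely.
emissive = contrast_ratio(white_nits=250, black_nits=0)  # infinite

print(f"LCD: {lcd:.0f}:1, VA: {va:.0f}:1, emissive: {emissive}:1")
```

This is why no amount of backlight leakage reduction ever gets LCD to a true black: as long as the black level is nonzero, contrast stays finite.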
A prototype Canon SED TV in 2006.
Many enthusiasts are strong advocates for OLED, but had FED and/or SED taken over from LCD, even partially (starting from the high end market and eventually working its way down, as OLED is doing), we wouldn't be nearly as excited about OLED today.
So why didn't FED or SED take over? The main reason is cost. They were, and would have remained, more expensive to produce than LCD. The only reason OLED has a chance of eventually succeeding LCD is not that it's significantly better in essentially every way, but that it may eventually be cheaper and simpler to produce.
But FED and SED weren't the only panel technologies superior to LCD. Remember Plasma? It wasn't suitable for monitors, since the panels had to be large, but it was much better than LCD for TVs. There were of course misconceptions about Plasma, mainly regarding "burn-in," but by the end of its lifespan only temporary image retention from static images remained, and the TVs had features to clear it. The benefits of Plasma were CRT-like, greatly superior motion clarity and response times (some models refreshed their phosphors at 600 Hz, which didn't create input lag the way modern motion interpolation does), and much better image quality, with deeper blacks and contrast ratios potentially exceeding 20,000:1.
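The "600 Hz" figure deserves a quick unpacking, since it confuses people into thinking Plasma ran at a 600 Hz refresh rate. It's sub-field drive: each 60 Hz frame is built from several short phosphor flashes. The sketch below assumes the commonly cited 10 sub-fields per frame:

```python
# Plasma "600 Hz" marketing explained: each 60 Hz input frame is drawn
# as multiple short sub-field flashes (10 assumed here, a typical figure),
# so the phosphors pulse 60 * 10 = 600 times per second. No new frames
# are interpolated, which is why this adds no input lag.
frame_rate = 60   # input frames per second
subfields = 10    # sub-field flashes per frame (assumed)

flashes_per_second = frame_rate * subfields
print(flashes_per_second)  # 600
```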
But again, LCD was cheaper to make and more efficient than Plasma, so Plasma bit the dust. That's irrelevant for computer monitors, but it kept the TV industry lagging behind where it should be.
OLED isn't a very new technology, and it's taking far too long for it to reach the computer monitor industry. LG has an OLED TV lineup, Panasonic released one OLED TV using an LG panel, and that's it. The industry should have pushed harder for OLED, not dragged out its release.
But even the LCD technology in computer monitors is, for the most part, years behind that of the high end TV industry. The "best gaming monitors" of today, including all high refresh rate models, use edge-lit backlighting, while very high end TVs use full array backlighting (the LEDs are mounted behind the panel rather than along the edges). Edge-mounted lighting is greatly inferior: it produces heavily flawed, inconsistent brightness uniformity across the screen and suffers from backlight bleed (light "spilling" onto the screen). Yet even edge-lit TVs have far less bleed and fewer uniformity issues, because they have superior QC.
A defective Eizo Foris FG2421 and the horrifying backlight bleed it had.
The same FG2421, showing the effects in game.
Where are the full array gaming monitors? But that's not the only deficiency in gaming monitors, nor even the biggest one. The biggest problem with LCD computer monitors, aside from QC, is that high end ones only use TN or IPS panels. TN is the ideal LCD for super high refresh rate competitive gaming monitors, like the ASUS 180 Hz and upcoming 240 Hz models, since image quality is almost irrelevant there. But IPS is a useless waste of time, especially the rather poor IPS panels used in IPS gaming monitors (144 Hz models especially). These monitors set new records for IPS glow; it ruins image quality rather than enhancing it compared to TN.
So IPS has no real use in gaming monitors. It's meant to look better than TN, but it looks worse in dark scenes, and the difference elsewhere over the 1440p 144 Hz TN monitors is negligible. Both have only around 1000:1 static contrast, viewing angles don't matter that much (as long as they're good enough for head-on viewing), and color accuracy is very similar, at least after calibration.
For immersive gaming, VA is the temporary solution until OLED finally makes an appearance. But VA monitors had been quite problematic until the very latest Samsung VA panels, which are 144 Hz 1080p and 100 Hz 3440 x 1440. Before these, VA monitors had response-time peaks of 40 ms or more, leading to nasty trailing/streaking behind moving objects in certain color transitions. They also usually had only around 2500:1 contrast, although that's still a nice improvement over the 1000:1 of IPS/TN. Samsung's latest panels at least fix the response time issues, putting them in the same ballpark as equivalent refresh rate IPS, while bringing 3000:1 contrast and superior QC.
Surprise: even these Samsung VA monitors are nowhere near the quality of high end TVs. No, the problem isn't that they explode. They don't explode. The CFG70 is "only" $400 for the 24" and $500 for the 27", so for those I'll only complain about resolution and anti-glare coatings. 1080p is so 2012 for PC gaming. And why do TVs get less intrusive glossy coatings or better, while monitors always get blurry, image-degrading AG coatings? See the picture below for reference.
An anti-glare coating removed. This is why they ruin image quality so much.
To illustrate the differences between the best gaming monitors, the aforementioned Samsung high refresh rate VA panels, and very high end VA TVs, see the table below. OLED isn't even included, since that would be unfair to LCD.
There is a consistent pattern here. The TVs are designed around picture quality, where they are lightyears ahead of any monitor, while the monitors are designed more around high refresh rate fluidity. The fact that the TVs still use PWM backlighting highlights this the most, since the TVs actually have good response times (as do the monitors). Note that the backlight isn't visibly flickery on the TVs, but it does harm motion clarity somewhat.
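The motion clarity claims throughout this comparison come down to persistence: on a sample-and-hold display, perceived blur is roughly the distance your eye tracks across the screen while a frame stays lit. Here is a rough sketch of that relationship, with illustrative numbers rather than measurements of any specific display:

```python
# Persistence-based motion blur, approximately:
#   blur (px) ~= tracking speed (px/s) * time the frame is visible (s)
# Shortening persistence (e.g. a strobed backlight) cuts blur even at
# the same refresh rate. All numbers below are illustrative assumptions.

def motion_blur_px(speed_px_per_s, persistence_s):
    """Approximate perceived blur width while eye-tracking motion."""
    return speed_px_per_s * persistence_s

speed = 960  # px/s, a common test speed for motion clarity photos

# Full-persistence sample-and-hold at 60 Hz and 144 Hz:
print(motion_blur_px(speed, 1 / 60))   # ~16 px of blur
print(motion_blur_px(speed, 1 / 144))  # ~6.7 px of blur

# A blur reduction mode lighting the backlight for only 1.5 ms per frame:
print(motion_blur_px(speed, 0.0015))   # ~1.4 px of blur
```

This is also why raising the refresh rate alone gives diminishing returns: halving persistence via strobing does more for clarity than the jump from 60 Hz to 144 Hz.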
There really need to be high end monitors, such as the Samsung CF791, that have the best of both worlds, although Samsung is still using edge-lighting rather than full array lighting even on its TVs. If LCD monitors were where they should be, there wouldn't be $600-800 1440p 144 Hz IPS monitors. Instead there would be $600-800 glossy or semi-glossy 1440p 120-144 Hz VA monitors with variable refresh rate, blur reduction, edge-mounted lighting but better QC, 100% sRGB coverage, good response times, around 5000:1 contrast, very low input lag, and a DC-controlled flicker-free backlight.
Then there would be $1000+ monitors, up to 4K 120 Hz, with full array backlighting plus local dimming with up to 384 dimming zones, HDR10 support (with some models also having Dolby Vision), 10-bit color (12-bit for the Dolby Vision models), > 90% DCI-P3 coverage with a 100% sRGB mode, 6500:1 - 7000:1 static contrast, blur reduction, variable refresh rate, good response times, an AR-treated glass coating, a DC-controlled flicker-free backlight, and very low input lag. The 4K models would have good upscaling for 720p, 1080p, and 2x 1080p, displaying them as well as native or better. These monitors should all use DisplayPort 1.4, since the display connector industry moves even slower than the display industry itself: we're still on DisplayPort 1.2 with monitors and HDMI 2.0 on TVs.
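A back-of-envelope calculation shows why DisplayPort 1.4 matters for a spec sheet like this. The sketch below ignores blanking intervals (so real requirements run somewhat higher) and compares raw RGB data rates against the payload capacities of DP 1.2 (HBR2, 17.28 Gbps after 8b/10b encoding) and DP 1.4 (HBR3, 25.92 Gbps):

```python
# Rough uncompressed video data rates vs. DisplayPort payload capacity.
# Blanking intervals are ignored, so real requirements are a bit higher.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel):
    bits_per_pixel = bits_per_channel * 3  # RGB
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP_1_2 = 17.28  # Gbps payload (HBR2, after 8b/10b encoding)
DP_1_4 = 25.92  # Gbps payload (HBR3, after 8b/10b encoding)

for bpc in (8, 10):
    rate = data_rate_gbps(3840, 2160, 120, bpc)
    print(f"4K 120 Hz {bpc}-bit RGB: {rate:.1f} Gbps "
          f"(fits DP 1.2: {rate <= DP_1_2}, fits DP 1.4: {rate <= DP_1_4})")
```

Even ignoring blanking, 4K 120 Hz at 8-bit blows past DP 1.2 and only fits within DP 1.4's link rate; the 10-bit HDR case exceeds even that raw capacity, which is where DP 1.4's Display Stream Compression comes in.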
But we have nothing remotely close to that. The few high refresh rate VA monitors are 1080p, save for the Samsung CF791, which is 21:9, and all use ugly matte AG coatings that harm image quality. They have less than half the static contrast of higher end Samsung TVs and no local dimming, which would probably be quite nice on a computer monitor with full array backlighting and 384 dimming zones (given the smaller size). Only now are VA monitors catching up to the better response times of high end VA TVs.
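To see why 384 zones goes further on a monitor than on a TV, consider the zone size on a 27" 16:9 panel. The 24 x 16 grid below is an assumed layout for illustration; real panels vary:

```python
# Rough size of each local-dimming zone on a 27" 16:9 monitor with 384
# zones, assuming a 24 x 16 grid (real grid layouts vary by panel).
import math

def zone_size_inches(diagonal, aspect_w, aspect_h, zones_w, zones_h):
    """Return (zone width, zone height) in inches for a uniform grid."""
    unit = diagonal / math.hypot(aspect_w, aspect_h)
    width, height = aspect_w * unit, aspect_h * unit
    return width / zones_w, height / zones_h

zw, zh = zone_size_inches(27, 16, 9, 24, 16)
print(f'each zone ~ {zw:.2f}" x {zh:.2f}"')  # roughly inch-sized zones
```

The same zone count spread over a 65" TV gives zones with about six times the area, so blooming around small bright objects would be correspondingly worse there.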
To summarize: computer monitors are very far behind TVs, and TVs are behind where they should be too. LCD should ideally be a dead technology by now. The VA LCD panels used by TVs are overall far nicer than what gaming monitors use, although gaming monitors do have motion clarity as good as one can expect from the fundamentally flawed LCD technology (with lots of room for improvement remaining). Monitors won't catch up to TVs any time soon. Our only hope is an OLED takeover.