• Computer Hardware Industries Desperately Behind the Times

    We talk a lot about monitors and display technology in general here at GND-Tech. Display technology as a whole is not where it should be, but computer monitors have it the worst. To those without much knowledge in this area, it would be beneficial to first read this.

    Brief history lesson: LCD is the technology that totally replaced CRT. This brings up the first issue: it shouldn't have happened. LCD taking over the lower end markets is fine, but it's not really suitable for enthusiast gaming or movie/TV watching. A much better technology was proposed to succeed CRT, and that was FED, or Field Emission Display. SED, or Surface-Conduction Electron-Emitter Display, was born of this. Both technologies are far superior to LCD in virtually every way, quite close to OLED in fact, with some benefits of their own. They are like CRT taken to another level: rather than a single cathode ray tube, both FED and SED use a matrix of tiny electron emitters, effectively one miniature CRT per pixel.

    Self-illuminating pixels, which SED/FED and OLED technology offer, make for the best type of display. They get you true blacks, since the pixels emit no light in black scenes, and thus they also bring much higher (virtually infinite) contrast ratios. LCD, on the other hand, uses a static edge-mounted or back-mounted light to illuminate the panel, making it impossible to properly display black and greatly limiting contrast. SED/FED would have been a much more real looking display, more like a window than a screen, and the natural evolution of CRT.

    Thus, the benefits over LCD are superior contrast and blacks, perfect luminance uniformity, superior color uniformity, superior color accuracy (better than OLED in this regard as well), faster response times (1 ms with no overdrive artifacts), and apparently greater power efficiency. The downsides would likely have been lower peak brightness and temporary image retention on static images.
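
    To make the contrast point concrete, here is a rough back-of-the-envelope sketch in Python. The luminance figures are illustrative ballpark assumptions, not measurements of any particular panel:

        # Static contrast ratio = white luminance / black luminance.
        # The numbers below are illustrative assumptions, not measured values.

        def contrast_ratio(white_nits, black_nits):
            """Return the static contrast ratio; infinite when black is a true zero."""
            if black_nits == 0:
                return float("inf")
            return white_nits / black_nits

        # Typical LCD: the backlight is always on, so some light leaks through "black" pixels.
        lcd = contrast_ratio(white_nits=300.0, black_nits=0.3)       # ~1000:1
        # Self-emissive pixel (SED/FED/OLED): a black pixel simply emits nothing.
        emissive = contrast_ratio(white_nits=300.0, black_nits=0.0)  # infinite

        print(f"LCD: {lcd:.0f}:1, self-emissive: {emissive}:1")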


    A prototype Canon SED TV in 2006.

    Many enthusiasts are strong advocates for OLED, but had FED and/or SED taken over instead of LCD, even if not completely (starting from the high end market and eventually working their way down, as OLED is doing), there would be far less reason to be so excited about OLED today.

    So why didn't FED or SED take over? The main reason is cost. It was and would have remained more expensive to produce than LCD. The only reason OLED has a chance of eventually succeeding LCD is not because it's significantly better in essentially every way, but because it may eventually be cheaper and simpler to produce.

    But FED and SED weren't the only panel technologies superior to LCD. Remember Plasma? It wasn't suitable for monitors since the panels had to be large, but it was much better than LCD for TVs. There were of course misconceptions about Plasma, mainly regarding "burn-in," but by the end of its lifespan only temporary image retention from static images remained, and the TVs had features to clear it. The benefits of Plasma were CRT-like, greatly superior motion clarity and response times (some of them refreshed their phosphors at 600 Hz, which didn't create input lag the way modern motion interpolation does), and much better image quality with deeper blacks and contrast ratios potentially exceeding 20,000:1.

    But again, LCD was cheaper to make and more economical/efficient than Plasma, so Plasma bit the dust. This is irrelevant for computer monitors, but it kept the TV industry lagging behind where it should be.

    OLED isn't a very new technology, and it's taking way too long for it to be introduced to the computer monitor industry. LG has an OLED TV lineup and Panasonic released one OLED TV using an LG panel; that's it. The industry should have pushed harder for OLED rather than dragging out its release.

    But even the LCD technology in computer monitors is, for the most part, years behind that of the high end TV industry. The "best gaming monitors" of today, including all the high refresh rate models, use edge-lit backlighting while very high end TVs use full array backlighting (the LEDs are mounted behind the panel rather than on the edges). Edge-mounted lighting is greatly inferior as it produces heavily flawed and inconsistent brightness uniformity across the screen and has backlight bleeding issues (light "spilling" onto the screen). But even edge-lit TVs have far fewer bleeding and uniformity issues because they have superior QC.


    A defective Eizo Foris FG2421 and the horrifying backlight bleed it had.


    The same FG2421, showing the effects in game.

    Where are the full array gaming monitors? But that's not the only deficiency in gaming monitors, and not even the biggest one. The biggest problem with LCD computer monitors, aside from QC, is that high end ones are only using TN or IPS panels. TN is the ideal LCD for super high refresh rate competitive gaming monitors, like the ASUS 180 Hz and upcoming 240 Hz monitors, since image quality is almost irrelevant there. But IPS is a useless waste of time, especially the rather poor IPS panels used by IPS gaming monitors (144 Hz models in particular). These IPS monitors set new records for the highest amount of IPS glow ever; it ruins image quality rather than enhancing it compared to TN.

    So IPS has no real use in gaming monitors. It's meant to look better than TN, but it looks worse in dark scenes, and the difference elsewhere is negligible compared to the 1440p 144 Hz TN monitors. Both have only around 1000:1 static contrast, viewing angles don't matter that much (as long as they're good enough for head-on viewing), and color accuracy is very similar, at least after calibration.

    For immersive gaming, VA is the temporary solution until OLED finally makes an appearance. But VA monitors were quite problematic until the very latest Samsung VA panels, which are 144 Hz 1080p and 100 Hz 3440 x 1440. Before these, they had response time peaks of 40 ms or more, leading to some nasty trailing/streaking behind moving objects in certain color transitions. They also usually had only around 2500:1 contrast, although that's still a nice improvement over IPS/TN's 1000:1. Samsung's latest panels at least fix the response time issues, putting them in the same ballpark as equivalent refresh rate IPS while bringing 3000:1 contrast and superior QC.

    Surprise: even these Samsung VA monitors are nowhere near the quality of high end TVs. No, the problem isn't that they explode. They don't explode. The CFG70 is "only" $400 for the 24" and $500 for the 27", so I'll only complain about resolution and anti-glare coatings on those. 1080p is so 2012 for PC gaming. Why do TVs get less intrusive glossy coatings or better, while monitors always get blurry, damaging AG coatings? See the picture below for reference.




    An anti-glare coating removed. This is why they ruin image quality so much.

    To illustrate the differences between the best gaming monitors, the aforementioned Samsung high refresh rate VA, and very high end VA TVs, see the table below. OLED isn't even included, since that would be unfair to LCD.


    There is a consistent pattern here. The TVs are designed around picture quality, where they are light years ahead of any monitor, while the monitors are designed more around high refresh rate fluidity. The fact that the TVs for some reason still use PWM backlighting highlights this the most, since the TVs actually have good response times (as do the monitors). Note that the backlight isn't visibly flickery on the TVs, but it does harm motion clarity somewhat.

    There really need to be high end monitors, such as the Samsung CF791, that have the best of both worlds, although Samsung is still using edge-lighting rather than full array lighting on their TVs. If LCD monitors were where they should be, there wouldn't be $600-800 1440p 144 Hz IPS monitors. Instead there would be $600-800 glossy or semi-glossy 1440p 120-144 Hz VA monitors with variable refresh rate, blur reduction, edge-mounted lighting but better QC, 100% sRGB, good response times, around 5000:1 contrast, very low input lag, and a DC controlled flicker free backlight.

    Then there would be $1000+ monitors, up to 4k 120 Hz, with full array backlighting plus local dimming with up to 384 dimming zones, HDR-10 with some models also having Dolby Vision, 10-bit color (12-bit for Dolby Vision models), > 90% DCI-P3 color space and a 100% sRGB mode, 6500:1 - 7000:1 static contrast, blur reduction, variable refresh rate, good response times, an AR treated glass coating, a DC controlled flicker free backlight, and very low input lag. 4k models would have good upscaling for 720p, 1080p, and 2x 1080p, being able to display them as native or better. These monitors should all be DisplayPort 1.4, since the display connector industry moves even slower than the display industries; we're still on DisplayPort 1.2 with monitors and HDMI 2.0 on TVs.
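
    As a rough sanity check on why DisplayPort 1.2 can't carry such a monitor, here is a small Python sketch. It uses the standard effective link rates after 8b/10b encoding and ignores blanking overhead, so treat it as ballpark arithmetic rather than a spec calculation:

        # Does uncompressed 4k 120 Hz fit over DisplayPort 1.2 or 1.4?
        # Blanking/protocol overhead is ignored, so real requirements are slightly higher.

        DP_EFFECTIVE_GBPS = {
            "DisplayPort 1.2 (HBR2)": 17.28,   # effective rate after 8b/10b encoding
            "DisplayPort 1.4 (HBR3)": 25.92,
        }

        def video_gbps(width, height, hz, bits_per_channel, channels=3):
            return width * height * hz * bits_per_channel * channels / 1e9

        for bpc in (8, 10):
            need = video_gbps(3840, 2160, 120, bpc)
            print(f"4k 120 Hz {bpc}-bit RGB needs ~{need:.1f} Gbit/s uncompressed")
            for link, cap in DP_EFFECTIVE_GBPS.items():
                verdict = "fits" if need <= cap else "does not fit"
                print(f"  {link} at {cap} Gbit/s: {verdict}")

        # 10-bit 4k 120 Hz exceeds even HBR3 uncompressed, which is part of why
        # DisplayPort 1.4 also brought Display Stream Compression along.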

    But we have nothing remotely close to that. The few high refresh rate VA monitors are 1080p, save for the Samsung CF791 which is 21:9, and all use ugly matte AG coatings that harm image quality. They have less than half the static contrast of higher end Samsung TVs, and they have no local dimming, which would probably work quite well on a computer monitor with full array backlighting and 384 dimming zones (due to the smaller size). Only now are VA monitors catching up to the better response times of high end VA TVs.
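
    For readers unfamiliar with how full array local dimming actually works, here is a heavily simplified toy sketch: split the frame into zones, drive each zone's backlight only as high as its brightest pixel requires, then open the LCD pixels wider to compensate so the picture stays correct while dark zones get much deeper blacks. This is a conceptual illustration only, not how any particular TV's dimming algorithm is implemented:

        import numpy as np

        def local_dimming(frame, zones=(16, 24)):
            """Toy full array local dimming.

            frame: 2D array of target luminance in [0, 1].
            zones: (rows, cols) of independently controlled backlight zones.
            Returns (per-zone backlight levels, compensated LCD transmittance).
            """
            h, w = frame.shape
            zr, zc = zones
            zh, zw = h // zr, w // zc

            backlight = np.zeros((zr, zc))
            lcd = np.zeros_like(frame)

            for r in range(zr):
                for c in range(zc):
                    block = frame[r*zh:(r+1)*zh, c*zw:(c+1)*zw]
                    level = block.max()   # drive the zone only as bright as its content needs
                    backlight[r, c] = level
                    if level > 0:
                        # Displayed luminance = backlight * transmittance, so the LCD
                        # pixels open wider to compensate for the dimmer backlight.
                        lcd[r*zh:(r+1)*zh, c*zw:(c+1)*zw] = block / level
            return backlight, lcd

        # Example: a mostly black frame with one small bright patch (384 zones total).
        frame = np.zeros((384, 384))
        frame[100:120, 200:220] = 0.9
        bl, lcd = local_dimming(frame, zones=(16, 24))
        print("zones fully off:", int((bl == 0).sum()), "of", bl.size)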

    To summarize all of this, computer monitors are very far behind TVs, and TVs are behind too. LCD should ideally be a dead technology by now. The VA LCD panels used by TVs are overall far nicer than what gaming monitors use, although gaming monitors do have motion clarity as good as one can expect from the fundamentally flawed LCD technology (and there is still lots of room for improvement). Monitors won't catch up to TVs any time soon. Our only hope is an OLED takeover.


    Comments (8)
    1. Enad -
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
    2. Jester -
      Quote Originally Posted by Enad
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
      For what it's worth, that gap would be bridged quite a bit if you removed the AG coating on the BL3200PT (not that I recommend attempting this until you no longer need the monitor). Honestly AG coatings are probably what annoy me the most about monitors. Unless OLED takes over, we're going to be stuck with AG coatings if we also want high refresh rate and reasonable size (under 40").

      Seriously, I'd shut up if we had some kind of masterclass LCD monitor. 27-32" 3840 x 2160 120 Hz SPVA quantum dot monitor (DisplayPort 1.4) with full array WLED backlighting, at least 384 local dimming zones (areas around the screen where backlight is controlled dynamically based on content, leading to at least 25,000:1 zone contrast ratio), 5,000:1 - 7,000:1 static contrast, AR treated glass coating, HDR-10, 10-bit color with both >= 100% sRGB and DCI-P3 modes, good out of the box setup, fast response time, excellent strobing method with good lower refresh rate support (e.g. Eizo Turbo240 + the zone/scanning method seen on the Samsung CFG70), G-SYNC and/or FreeSync. It's not OLED, but it'd be good enough... almost.
    3. Grompz -
      Since you have mentioned keyboards, I think gaming mice also deserve a mention. Most self-proclaimed gaming mice have inaccurate sensors and cheap plastic builds, and they are also designed to be ambidextrous, even some of the most expensive ones, which, in my opinion, is not as comfortable as a mouse fully designed for a right-handed or left-handed user.
    4. Jester -
      I find high end optical sensors quite good, especially 3366. After trying 3366 I can't go back to anything else. But I don't know a whole lot about mouse sensors.

      There are two other things I'm considering adding to this article:

      • Processors/RAM - The PS4 uses shared GDDR5 and graphics memory is already beyond GDDR5, so why are CPUs/RAM still only on DDR4?
      • AIO Water Coolers - I'm cutting them some slack since this market is actually growing quite rapidly, replacing high end air coolers. We finally have good quality ones from Swiftech, EK, and Alphacool, but all are only using DDC pumps. Time for some higher end models using D5 pumps? Quick disconnects need to be a standard for them too, at least for 240mm and above.
    5. Charcharo -
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
    6. Jester -
      Quote Originally Posted by Charcharo
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
      There's still a chance that OLED never takes over. But at least we'd get some really good LCDs, and by 2018 most likely.
    7. strudinox -
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and they're used by pretty much nobody. I wonder though if the lack of diversity and competition is fueled by these large corporations hoarding patents, making it almost impossible for startups to gain any traction?
    8. Jester -
      Quote Originally Posted by strudinox
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and they're used by pretty much nobody. I wonder though if the lack of diversity and competition is fueled by these large corporations hoarding patents, making it almost impossible for startups to gain any traction?
      That is 100% the case with sound cards. OpenAL used to be open source, then Creative bought it and locked it down with patents. There is still an open source implementation of it called OpenAL Soft, which is still way better than everything else out there.

      The ability to process OpenAL instructions via hardware is patented by Creative too, of course, and only their X-Fi processors (no longer used in today's sound cards) can do it.

      Patents = death
      Open source = life