Computer Hardware Industries Desperately Behind the Times

    We talk a lot about monitors and display technology here at GND-Tech. Display technology in general is not where it should be, but computer monitors have it the worst. To those without much knowledge in this area, it would be beneficial to first read this.

    Brief history lesson: LCD is the technology that totally replaced CRT. This brings up the first issue: it shouldn't have happened. LCD taking over the lower end markets is fine, but it's not really suitable for enthusiast gaming or movie/TV watching. A much better technology was proposed to succeed CRT: FED, or Field Emission Display. SED, or Surface-Conduction Electron-Emitter Display, was born of this. Both technologies are far superior to LCD in virtually every way, quite close to OLED in fact, with some benefits of their own. They are like CRT taken to another level: rather than a single cathode ray tube, FED and SED use a matrix of tiny CRTs, one for each individual pixel.

    Self-illuminating pixels, which SED/FED and OLED technology offer, make for the best type of display. They give you true blacks, since the pixels emit no light in black scenes, and thus also much higher (virtually infinite) contrast ratios. LCD, by contrast, uses a static edge-mounted or back-mounted light to illuminate the panel, making it impossible to properly display blacks and greatly limiting contrast. SED/FED would have been a much more real-looking display, more like a window than a screen, and the natural evolution of CRT.
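    To put the "virtually infinite" claim in perspective: static contrast ratio is simply peak white luminance divided by black luminance, so a true zero black level makes the ratio unbounded. Here is a minimal sketch, using assumed luminance values for illustration rather than measurements:

        # Contrast ratio = peak white luminance / black-level luminance.
        # The luminance values below are assumed for illustration only.

        def contrast_ratio(white_nits: float, black_nits: float) -> float:
            """Return static contrast ratio; infinite when the black level is zero."""
            return float("inf") if black_nits == 0 else white_nits / black_nits

        print(contrast_ratio(350.0, 0.35))  # typical IPS/TN LCD -> 1000.0 (~1000:1)
        print(contrast_ratio(350.0, 0.12))  # typical VA LCD     -> ~2917  (~3000:1)
        print(contrast_ratio(350.0, 0.0))   # OLED/SED/FED       -> inf (true black)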

    Thus, the benefits over LCD are superior contrast and blacks, perfect luminance uniformity, superior color uniformity, superior color accuracy (better than OLED in this regard as well), faster response times (1 ms with no overdrive artifacts), and apparently greater power efficiency as well. The downsides may have been lower potential peak brightness and temporary image retention on static images, which requires countermeasures to avoid.


    A prototype Canon SED TV in 2006.

    Many enthusiasts are strong advocates for OLED, but had FED and/or SED taken over instead of LCD, even if not completely (starting from the high end market and eventually working their way down, as OLED is doing), we wouldn't be so excited for OLED.

    So why didn't FED or SED take over? The main reason is cost. It was and would have remained more expensive to produce than LCD. The only reason OLED has a chance of eventually succeeding LCD is not because it's significantly better in essentially every way, but because it may eventually be cheaper and simpler to produce.

    But FED and SED weren't the only panel technologies superior to LCD. Remember Plasma? It wasn't suitable for monitors since the panels had to be large, but it was much better than LCD for TVs. There were of course misconceptions about Plasma, mainly regarding "burn-in," but by the end of its lifespan only temporary image retention from static images remained, and the TVs had features to clear it. The benefits of Plasma were CRT-like, greatly superior motion clarity and response times (some models refreshed their phosphors at 600 Hz, which didn't create input lag like modern motion interpolation does), and much better image quality with deeper blacks and contrast ratios potentially exceeding 20,000:1.

    But again, LCD was cheaper to make and more economical/efficient than Plasma, so Plasma bit the dust. That's irrelevant for computer monitors, but it kept the TV industry lagging behind where it should be.

    OLED isn't a very new technology, and it's taking far too long for it to reach the computer monitor industry. LG has an OLED TV lineup and Panasonic has released one OLED TV using an LG panel; that's it. The industry should have pushed harder for OLED, not dragged out its release.

    But even LCD technology in computer monitors is, for the most part, years behind that of the high end TV industry. The "best gaming monitors" of today, such as all the high refresh rate models, use edge-lit backlighting, while very high end TVs use full array backlighting (the LEDs are mounted behind the panel rather than along the edges). Edge-mounted lighting is greatly inferior, as it produces heavily flawed, inconsistent brightness uniformity across the screen and suffers from backlight bleed (light "spilling" onto the screen). Yet even edge-lit TVs have far fewer bleed and uniformity issues, because they have superior QC.


    A defective Eizo Foris FG2421 and the horrifying backlight bleed it had.


    The same FG2421, showing the effects in game.

    Where are the full array gaming monitors? But that's not the only deficiency in gaming monitors, and not even the biggest one. The biggest problem with LCD computer monitors, aside from QC, is that high end ones only use TN or IPS panels. TN is the ideal LCD type for super high refresh rate competitive gaming monitors, like the ASUS 180 Hz and 240 Hz models, since image quality is almost irrelevant there. But IPS is a waste of time without full array local dimming and hundreds of dimming zones, especially the rather poor IPS panels used by IPS gaming monitors (the 144 Hz models in particular). These monitors set new records for the highest amount of IPS glow ever; it ruins image quality rather than enhancing it compared to TN.

    So IPS without extreme local dimming has no real use in gaming monitors. It's meant to look better than TN, but it looks worse in dark scenes, and the difference elsewhere over the 1440p 144 Hz TN monitors is negligible. Both have only around 1000:1 static contrast, viewing angles don't matter that much (as long as they're good enough for head-on viewing), and color accuracy is very similar, at least after calibration.

    For immersive gaming, VA is the temporary solution until OLED finally makes an appearance, unless the upcoming 4k 144 Hz IPS monitors with 384 dimming zones prove better. But VA monitors have been problematic, often having response time peaks of 40 ms or more, leading to nasty trailing/streaking behind moving objects in certain color transitions. If not that, then they often have excessive overdrive, leading to inverse ghosting. They also tend to have only 2500:1 to 3000:1 static contrast; a big improvement over the 1000:1 of IPS and TN, but upsetting when the average $700+ TV has 5000:1 contrast today and higher end ones reach 7000:1.

    Another issue: why do TVs get less intrusive glossy coatings (or better), while monitors always get blurry, image-damaging AG coatings? See the picture below for reference.




    A removed anti-glare coating. This is why these coatings ruin image quality so much.

    Before you mention glare and reflection issues, higher end TVs (in the $700 and above price range) have the best of both worlds with their anti-glare glossy coatings. AR-treated glass is used on some very high end models, like the LG E6 and upcoming E7.

    To illustrate the differences between the best gaming monitors, the aforementioned Samsung high refresh rate VA monitors, and very high end VA TVs, see the table below. OLED isn't even included, since that would be unfair to LCD.


    There is a consistent pattern here. The TVs are designed around picture quality, where they are light-years ahead of any monitor, while the monitors are designed more around high refresh rate fluidity. The fact that the TVs for some reason still use PWM backlighting highlights this the most, since the TVs actually have good response times (as do the monitors). Note that the backlight isn't visibly flickery on the TVs, but it does harm motion clarity somewhat.

    But as of 2017, there is some hope. The ASUS ROG Swift PG27UQ, Acer Predator XB272-HDR, and an AOC 32" equivalent have been announced. These are 27" (32" for the AOC) 3840 x 2160 144 Hz monitors over DisplayPort 1.4 (so DSC "lossless compression" is used), combining an AU Optronics AHVA (IPS) panel with full array local dimming and 384 dimming zones. Furthermore, they are designed around the DCI-P3 color space, which covers about 25% more colors than sRGB. While this will oversaturate sRGB content like video games, the oversaturation is not grotesque, and for all we know these monitors will have a good sRGB emulation mode. These monitors are also equipped with quantum dot technology, 1000 nits peak brightness (800 nits for the AOC), HDR support, and G-SYNC. This may surpass VA, although the lingering issue is the haloing or blooming effect caused by 384 dimming zones being insufficient (per-pixel dimming is ideal, which equates to 8,294,400 'dimming zones,' like a 4k OLED screen). The halo/bloom effect will be worse (stronger) on these IPS monitors than on a VA screen with the same number of dimming zones, because IPS subpixels are worse than VA at blocking light (hence why VA static contrast is anywhere between 2.5x and 7x that of IPS, depending on the exact panel).
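    To put rough numbers on two of those claims, here is a back-of-the-envelope sketch (pixel data only, no blanking or protocol overhead; these figures are my own estimates, not specifications from the announcements):

        # Why DP 1.4 needs DSC at 3840 x 2160, 144 Hz, 10-bit, and how coarse 384 zones are.
        # Back-of-the-envelope only: pixel data, no blanking/protocol overhead.

        width, height, refresh_hz = 3840, 2160, 144
        bits_per_pixel = 3 * 10                       # RGB, 10 bits per channel

        raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
        dp14_payload_gbps = 25.92                     # HBR3 x 4 lanes, after 8b/10b encoding

        print(f"{raw_gbps:.1f} Gbit/s needed vs {dp14_payload_gbps} Gbit/s available")
        # ~35.8 vs 25.92 -> compression (DSC) or chroma subsampling is unavoidable

        pixels = width * height                       # 8,294,400 pixels on a 4k panel
        print(f"{pixels // 384:,} pixels share each of the 384 dimming zones")
        # -> 21,600 pixels per zone; per-pixel dimming would mean 8,294,400 'zones'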

    To summarize all of this, computer monitors are very far behind TVs, and TVs are behind too. LCD should ideally be a dead technology by now. The VA LCD panels used by TVs are overall far nicer than what gaming monitors use, although gaming monitors do have as good motion clarity as one can expect from the fundamentally flawed LCD technology (although there is still lots of room for improvement when looking past LCD). Our only hope is an OLED takeover.


    Comments (8)
      Enad -
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
      Jester -
      Quote Originally Posted by Enad View Post
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
      For what it's worth, that gap would be bridged quite a bit if you removed the AG coating on the BL3200PT (not that I recommend attempting this until you no longer need the monitor). Honestly AG coatings are probably what annoy me the most about monitors. Unless OLED takes over, we're going to be stuck with AG coatings if we also want high refresh rate and reasonable size (under 40").

      Seriously, I'd shut up if we had some kind of masterclass LCD monitor. 27-32" 3840 x 2160 120 Hz SPVA quantum dot monitor (DisplayPort 1.4) with full array WLED backlighting, at least 384 local dimming zones (areas around the screen where backlight is controlled dynamically based on content, leading to at least 25,000:1 zone contrast ratio), 5,000:1 - 7,000:1 static contrast, AR treated glass coating, HDR-10, 10-bit color with both >= 100% sRGB and DCI-P3 modes, good out of the box setup, fast response time, excellent strobing method with good lower refresh rate support (e.g. Eizo Turbo240 + the zone/scanning method seen on the Samsung CFG70), G-SYNC and/or FreeSync. It's not OLED, but it'd be good enough... almost.
      Grompz -
      Since you have mentioned keyboards, I think gaming mice also deserve a mention. Most self-proclaimed gaming mice have inaccurate sensors and cheap plastic builds. They are also designed for ambidextrous use, even some of the most expensive ones, which, in my opinion, is not as comfortable as a mouse fully designed for a right-handed or left-handed user.
      Jester -
      I find high end optical sensors quite good, especially 3366. After trying 3366 I can't go back to anything else. But I don't know a whole lot about mouse sensors.

      There are two other things I'm considering adding to this article:

      • Processors/RAM - PS4 uses shared GDDR5, graphics memory is already beyond GDDR5, why are CPUs/RAM still only on DDR4?
      • AIO Water Coolers - I'm cutting them some slack since this market is actually growing quite rapidly, replacing high end air coolers. We finally have good quality ones from Swiftech, EK, and Alphacool, but all are only using DDC pumps. Time for some higher end models using D5 pumps? Quick disconnects need to be a standard for them too, at least for 240mm and above.
      Charcharo -
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
      Jester -
      Quote Originally Posted by Charcharo View Post
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
      There's still a chance that OLED never takes over. But at least we'd get some really good LCDs, and by 2018 most likely.
      strudinox -
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and used by pretty much nobody. I wonder, though, if the lack of diversity and competition is fueled by these large corporations hoarding patents, making it almost impossible for start-ups to gain any traction?
      Jester -
      Quote Originally Posted by strudinox View Post
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and used by pretty much nobody. I wonder, though, if the lack of diversity and competition is fueled by these large corporations hoarding patents, making it almost impossible for start-ups to gain any traction?
      That is 100% the case with sound cards. OpenAL used to be open source, then Creative bought it and locked it down with patents. There is still an open source implementation of it called OpenAL Soft, though, which is still way better than everything else out there.

      The ability to process OpenAL instructions via hardware is patented by Creative too, of course, and only their X-Fi processors (no longer used in today's sound cards) can do it.

      Patents = death
      Open source = life