• Computer Hardware Industries Desperately Behind the Times


    Here, we are speaking of both Intel and AMD platforms. Due to Intel's significant lead over AMD over the past seven years (in the desktop environment), Intel has grown extremely complacent. DDR4 RAM is still new to desktop computers, while GPUs have had GDDR5 (effectively quad data rate) for years, and some now use GDDR5X (a faster evolution of GDDR5) or even 3D stacked memory (HBM). This is one of the things that triggers us the most. Had the industry wanted it, desktops could already have DDR5 RAM.
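
    To put the gap in numbers, here is a rough peak-bandwidth comparison. The parts chosen (DDR4-2400 in dual channel, a 256-bit GDDR5/GDDR5X card, first-generation HBM) are illustrative assumptions, not benchmarks:

        /* Rough peak-bandwidth comparison: desktop DDR4 vs. typical GPU memory.
         * Figures are effective transfer rate (MT/s) times bus width in bytes;
         * the example parts are assumptions for illustration, not a survey. */
        #include <stdio.h>

        static double peak_gbps(double mtps, int bus_bits) {
            return mtps * (bus_bits / 8.0) / 1000.0;  /* GB/s, decimal units */
        }

        int main(void) {
            /* Dual-channel DDR4-2400: 2400 MT/s on a 2 x 64-bit bus */
            printf("DDR4-2400 dual channel:  %.1f GB/s\n", peak_gbps(2400, 128));
            /* GDDR5 at 7 GT/s on a 256-bit bus (GTX 1070 class, assumed) */
            printf("GDDR5 7 GT/s, 256-bit:   %.1f GB/s\n", peak_gbps(7000, 256));
            /* GDDR5X at 10 GT/s on a 256-bit bus (GTX 1080 class, assumed) */
            printf("GDDR5X 10 GT/s, 256-bit: %.1f GB/s\n", peak_gbps(10000, 256));
            /* First-gen HBM: ~1 GT/s on a 4096-bit stacked bus (Fury X class) */
            printf("HBM 1 GT/s, 4096-bit:    %.1f GB/s\n", peak_gbps(1000, 4096));
            return 0;
        }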

    Then there are the very slow, incremental improvements Intel has been delivering, accompanied by price increases; AMD shares the blame here, since they haven't been able to keep up. This has led to Intel limiting and gimping their own processors more than ever. They restrict overclocking more than ever, and have stooped to no longer soldering the IHS (integrated heat spreader) onto their mainstream CPUs, using cheap thermal paste instead, which leads to temperature-limited overclocking. Those who want to avoid this have to pay more for the highest end X platform (Broadwell-E), where the IHS is still soldered to the CPU die, giving much better contact and thus temperatures; that process used to be standard across all of their CPUs. But Broadwell-E has inferior clock speeds and IPC compared to Skylake, so Skylake remains superior for the vast majority of video games. We are all caught between a rock and a hard place.
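
    Why the interface material matters comes down to simple thermal arithmetic: die temperature is roughly ambient plus package power times the total thermal resistance between die and air, and the die-to-IHS joint is a large part of that. A minimal sketch with assumed, illustrative resistance values (not measurements of any particular CPU):

        /* Illustrative only: why the die-to-IHS interface matters when overclocking.
         * Die temperature ~= ambient + power * (sum of thermal resistances).
         * The resistance values below are assumptions for this example. */
        #include <stdio.h>

        int main(void) {
            double ambient_c    = 25.0;   /* case/ambient temperature, deg C        */
            double power_w      = 150.0;  /* heavily overclocked CPU package power   */
            double r_cooler     = 0.20;   /* IHS -> heatsink -> air, deg C per W     */
            double r_tim_paste  = 0.15;   /* die -> IHS with polymer TIM (assumed)   */
            double r_tim_solder = 0.03;   /* die -> IHS with solder (assumed)        */

            printf("Paste TIM:  ~%.0f C\n", ambient_c + power_w * (r_cooler + r_tim_paste));
            printf("Solder TIM: ~%.0f C\n", ambient_c + power_w * (r_cooler + r_tim_solder));
            return 0;
        }

    With these assumed numbers the paste-based interface lands roughly 18 C hotter at the same power, which is in the ballpark of what delidding results tend to show.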

    They also limit the i7-5820K and i7-6800K (the previous and current generation's lowest end X99 desktop processors, $389 and $434 MSRP respectively) to 28 PCI-E lanes, which is not enough to run two GPUs at full x16 bandwidth. If you want to run two GPUs at full speed, you had better spend $600 on their higher end processor with more PCI-E lanes! Spending more just to get onto X99 isn't enough; you have to buy at least their $600 processor as well, since a few of today's cards and many future ones will strongly benefit from full PCI-E 3.0 x16 bandwidth.
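
    The lane math is straightforward. The sketch below uses PCI-E 3.0's 8 GT/s per lane with 128b/130b encoding to show what x16 versus x8 means in bandwidth, and why 28 lanes cannot feed two full x16 slots:

        /* Quick math on why 28 CPU lanes falls short of dual x16.
         * PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, ~0.985 GB/s each way. */
        #include <stdio.h>

        int main(void) {
            const double gbps_per_lane = 8.0 * (128.0 / 130.0) / 8.0; /* GB/s per lane */
            int cpu_lanes = 28;        /* i7-5820K / i7-6800K */
            int needed    = 2 * 16;    /* two GPUs at full x16 */

            printf("Per lane:  %.3f GB/s\n", gbps_per_lane);
            printf("x16 slot:  %.2f GB/s\n", 16 * gbps_per_lane);
            printf("x8 slot:   %.2f GB/s\n",  8 * gbps_per_lane);
            printf("Dual x16 needs %d lanes, %d available (short by %d)\n",
                   needed, cpu_lanes, needed - cpu_lanes);
            return 0;
        }

    In practice a 28-lane CPU runs a second card at x8, halving its slot bandwidth.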

    What we would have liked to see is competition from AMD (there is some hope for Zen at least): competition that would push Intel to stop gimping their own CPUs so much, to keep soldering the IHS to the die on all of their processors, to stop being so cheap with PCI-E lanes on Broadwell-E, and to stop holding back clock speeds. Furthermore, we would have liked to see X99's successor use 3D stacked memory, and X99/Z170 use DDR5 RAM. Perhaps Z270 should then ideally use a superior DDR5 solution equivalent to GDDR5X.

    There are no signs of this changing any time soon. Early performance tests suggest that Zen won't be enough to force a drastic change, and Intel won't improve on their own.

    Comments (8)
    1. Enad -
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
    2. Jester -
      Quote Originally Posted by Enad:
      sux m8.

      I can concur on the monitor front, my VA Samsung TV looks far superior to my BenQ BL3200 VA Monitor.
      For what it's worth, that gap would be bridged quite a bit if you removed the AG (anti-glare) coating on the BL3200PT (not that I recommend attempting this until you no longer need the monitor). Honestly, AG coatings are probably what annoys me the most about monitors. Unless OLED takes over, we're going to be stuck with AG coatings if we also want high refresh rate and reasonable size (under 40").

      Seriously, I'd shut up if we had some kind of masterclass LCD monitor: a 27-32" 3840 x 2160 120 Hz SPVA quantum dot monitor (DisplayPort 1.4) with full array WLED backlighting, at least 384 local dimming zones (areas across the screen where the backlight is controlled dynamically based on content, leading to at least a 25,000:1 zone contrast ratio), 5,000:1 - 7,000:1 static contrast, an AR treated glass coating, HDR10, 10-bit color with both >= 100% sRGB and DCI-P3 modes, a good out-of-the-box setup, fast response time, an excellent strobing method with good lower refresh rate support (e.g. Eizo Turbo240 plus the zone/scanning method seen on the Samsung CFG70), and G-SYNC and/or FreeSync. It's not OLED, but it'd be good enough... almost.
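
      For a sense of scale, 384 zones spread across a 3840 x 2160 panel still means each zone covers a sizable patch of pixels; the 24 x 16 grid below is an assumed layout purely for illustration, since actual zone arrangements vary:

        /* Rough idea of what "384 local dimming zones" means on a 3840x2160 panel.
         * The 24x16 grid is an assumption for illustration; real layouts differ. */
        #include <stdio.h>

        int main(void) {
            int width = 3840, height = 2160;
            int zones_x = 24, zones_y = 16;   /* 24 * 16 = 384 zones */

            printf("Zones: %d\n", zones_x * zones_y);
            printf("Pixels per zone: %d x %d (%d px)\n",
                   width / zones_x, height / zones_y,
                   (width / zones_x) * (height / zones_y));
            return 0;
        }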
    3. Grompz -
      Since you have mentioned keyboards, I think gaming mice also deserve a mention. Most self-proclaimed gaming mice have inaccurate sensors and a cheap plastic build, and they are also designed to be ambidextrous, even some of the most expensive ones, which, in my opinion, is not as comfortable as a mouse fully designed for a right-handed or left-handed user.
    4. Jester -
      I find high end optical sensors quite good, especially the 3366. After trying the 3366 I can't go back to anything else. But I don't know a whole lot about mouse sensors.

      There are two other things I'm considering adding to this article:

      • Processors/RAM - The PS4 uses shared GDDR5, and graphics memory has already moved beyond GDDR5, so why are desktop CPUs/RAM still stuck on DDR4?
      • AIO Water Coolers - I'm cutting them some slack since this market is actually growing quite rapidly, replacing high end air coolers. We finally have good quality ones from Swiftech, EK, and Alphacool, but they all use DDC pumps. Time for some higher end models using D5 pumps? Quick disconnects need to become standard for them too, at least on 240mm models and above.
    5. Charcharo -
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
    6. Jester -
      Quote Originally Posted by Charcharo:
      At least there is light at the end of the tunnel for monitors. Sound? Not so.
      There's still a chance that OLED never takes over. But at least we'd get some really good LCDs, most likely by 2018.
    7. strudinox -
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and used by pretty much nobody. I wonder, though, if the lack of diversity and competition is fueled by large corporations hoarding patents, making it almost impossible for start-ups to gain any traction?
    8. Jester -
      Quote Originally Posted by strudinox:
      Good read, especially the part about sound cards. They're pretty exclusive nowadays and used by pretty much nobody. I wonder, though, if the lack of diversity and competition is fueled by large corporations hoarding patents, making it almost impossible for start-ups to gain any traction?
      That is 100% the case with sound cards. OpenAL used to be open source; then Creative bought it and locked it down with patents. There is still an open source implementation called OpenAL Soft, though, and it is still way better than everything else out there.

      The ability to process OpenAL instructions via hardware is patented by Creative too, of course, and only their X-Fi processors (no longer used in today's sound cards) can do it.
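
      For anyone curious what OpenAL Soft looks like in practice, here is a minimal software-path playback sketch; it assumes a typical Linux-style OpenAL Soft install (built with something like gcc beep.c -lopenal -lm) and involves no hardware acceleration:

        /* Minimal OpenAL Soft playback sketch: plays a one-second 440 Hz tone
         * on the default output device. Error handling omitted for brevity. */
        #include <AL/al.h>
        #include <AL/alc.h>
        #include <math.h>

        int main(void) {
            ALCdevice  *dev = alcOpenDevice(NULL);       /* default output device */
            ALCcontext *ctx = alcCreateContext(dev, NULL);
            alcMakeContextCurrent(ctx);

            /* One second of a 440 Hz sine wave, 44.1 kHz, mono, 16-bit. */
            enum { RATE = 44100 };
            static short samples[RATE];
            for (int i = 0; i < RATE; ++i)
                samples[i] = (short)(32000.0 *
                    sin(2.0 * 3.14159265358979 * 440.0 * i / RATE));

            ALuint buf, src;
            alGenBuffers(1, &buf);
            alBufferData(buf, AL_FORMAT_MONO16, samples, sizeof(samples), RATE);
            alGenSources(1, &src);
            alSourcei(src, AL_BUFFER, (ALint)buf);
            alSourcePlay(src);

            ALint state = AL_PLAYING;
            while (state == AL_PLAYING)                  /* busy-wait until done */
                alGetSourcei(src, AL_SOURCE_STATE, &state);

            alDeleteSources(1, &src);
            alDeleteBuffers(1, &buf);
            alcMakeContextCurrent(NULL);
            alcDestroyContext(ctx);
            alcCloseDevice(dev);
            return 0;
        }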

      Patents = death
      Open source = life