Everything posted by silikone

  1. Ideally, you would use both chips. I believe there was a third-party solution advertised on some motherboards that could leverage the power of the iGPU to augment high-end gaming, but I heard it had problems. Now we have modern APIs with more flexible dual-GPU possibilities. Couldn't DX12 be made to take advantage of it natively? (See the sketch after this list.)
  2. I may need to test this one day, but there are many old games that struggle on my HD 530 yet ran perfectly fine on my 8800 a long time ago. Granted, being a modern D3D11-compliant chip, Intel does have an edge in practice, so direct comparisons between them are not entirely fair. It is obviously held back by the slow memory.
  3. Sure, but my point is that it is still a product of the future. In the here and now, Skylake is still the last notable release, and it's, what, four years old now? As it stands, a 13-year-old 8800 GTX can still beat the puny silicon included with almost every mid-tier CPU of today. Yeah, I neglected AMD, mostly because I have zero experience with their iGPUs, which is quite telling of their unfortunate position in the market.
  4. When my GPU died, I went back to using my Intel HD 530, which is surprisingly capable in some games. Checking how it stacks up against others, I noticed that there have not been many improvements on the Intel HD side of things. On paper, the differences are minimal, if not completely absent. Only with the upcoming Ice Lake does it seem like we are finally seeing a leap, and a big one at that. Yeah, this is a rather niche subject to talk about, but I am personally fascinated by low-spec gaming, as it is very relevant to budget laptop gaming.
  5. This fallacious frame rate analogy seems to be repeated again and again. First of all, it's misleading to state that there is a lack of support for ultrawide screens. If there were no support, the game would either not allow you to play at a 21:9 resolution, or it would allow it while exhibiting various artifacts. Frame rate is a single axis that can go either up or down: 120Hz is a direct improvement over 60Hz, and the benefits of going even higher, while negligible, are theoretically indefinite. Screen aspect ratio, however, is not merely something that lies somewhere on a line from high to low. A monitor's width is a relative term, so while it's correct to assert that one monitor is wider than another, it is equally correct to assert that one monitor is taller than another. I'd actually wager that the latter is more true, as many video signalling standards are made to cap at 4K, serving as a reference point for anything lower, but I am getting ahead of myself. Wider is merely "different" rather than "better", and favoring one means that others will be disadvantaged. There are really four routes to take:
     1. Favor width, and put tall monitors at a disadvantage.
     2. Favor height, and put wide monitors at a disadvantage. This is common in games made when CRTs were still dominant.
     3. Favor something in between so that no extreme gets a significant advantage or disadvantage. This is what Overwatch is doing.
     4. Favor nothing by lifting all restrictions and allowing the player to adjust the field of view as they please.
     The last option appeases everyone, but it legitimizes ludicrously distorted projections caused by pathological FoV settings, and many consider it to ruin the competitive spirit. As for the other options, it really comes down to what works best for most. Because the market is dominated by 16:9, that is what Blizzard chose. A decent compromise to solve the motion sickness problem right now would be to raise the FoV limit to 120, which would give 21:9 users everything that can be seen at 16:9 while allowing 16:9 users to see even more than they do now. (See the FoV sketch after this list.)
  6. I am using a GTX 760 and am disappointed that I can't get 60 FPS in the latest games. I am extremely sensitive to drops.
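
To expand on the DX12 question in post 1: D3D12 does expose explicit multi-adapter, meaning an application can create an independent device on every adapter, iGPU included, and divide work between them itself. Below is a minimal C++ sketch of just the enumeration step, assuming the Windows SDK headers are available; the actual load sharing (cross-adapter heaps, fences, copy queues) would sit on top of this and is not shown.

    // Link with dxgi.lib and d3d12.lib.
    #include <dxgi1_6.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory6> factory;
        if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory)))) return 1;

        // Walk every adapter (discrete GPU and iGPU alike), fastest first.
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapterByGpuPreference(
                 i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                 IID_PPV_ARGS(&adapter)) != DXGI_ERROR_NOT_FOUND;
             ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

            // Each adapter gets its own D3D12 device; a renderer could keep
            // the main pass on the discrete GPU and hand, say, post-processing
            // to the iGPU, synchronized through cross-adapter shared resources.
            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device)))) {
                std::printf("Usable adapter %u: %ls\n", i, desc.Description);
            }
        }
        return 0;
    }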
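And to put a number on the FoV claim in post 5: under Hor+ scaling (vertical FoV held constant), the horizontal FoV is 2*atan(tan(vfov/2)*aspect). The 103-degree figure used below for Overwatch's current 16:9 cap is my recollection, not something from the posts above; a quick C++ check of what that cap implies at 21:9:

    #include <cmath>
    #include <cstdio>

    // Horizontal FoV in degrees for a given vertical FoV and aspect ratio,
    // assuming Hor+ scaling (vertical FoV held constant across aspect ratios).
    double horizontalFov(double verticalFovDeg, double aspect) {
        const double kDegToRad = 3.14159265358979323846 / 180.0;
        double halfRad = std::atan(std::tan(verticalFovDeg * 0.5 * kDegToRad) * aspect);
        return 2.0 * halfRad / kDegToRad;
    }

    int main() {
        // A 103-degree horizontal cap at 16:9 corresponds to roughly 70.5
        // degrees of vertical FoV.
        const double vfov = 70.53;
        std::printf("16:9 -> %.1f deg horizontal\n", horizontalFov(vfov, 16.0 / 9.0));
        std::printf("21:9 -> %.1f deg horizontal\n", horizontalFov(vfov, 21.0 / 9.0));
        // Prints about 103.0 and 117.6: a 120-degree cap would indeed let a
        // 21:9 player see everything a 16:9 player sees today, with margin.
        return 0;
    }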