About Humbug

  • Title
    Junior Member


  1. Well apparently AMD is announcing RDNA 2.0 GPUs next month at CES. Hopefully those are high end parts.
  2. I don't think they are convinced that low end is the way to go. I think they have limited 7nm supply from TSMC and have decided that they can make much more money if they use those dies to produce Ryzen, Threadripper and Epyc. Even among CPUs they are prioritizing Epyc first, then Threadripper, so the supply of bigger Ryzen chips is also not enough to meet demand. Hopefully within the next few months they will get ample supply and no longer have to choose between the CPU and GPU markets. They have done most of the hard work creating RDNA, which is so much better than GCN for gaming that it would be a shame not to produce high end chips. LOL, going by AMD's recent pattern they will match the 2080 Ti once Nvidia launches the 3080 Ti.
  3. It seems like AMD is now making money on CPUs and wants to keep most of its limited 7nm supply for them; GPUs aren't as lucrative. I too wanted them to start making high end RDNA GPUs in 2019, but it seems the announcement has been pushed to CES in January 2020. They have been increasing their R&D budget too as they are profitable now. Hopefully as 7nm ramps up to higher volumes next year we can get back to the days when they offered an entire lineup from low end to high end on the same base architecture. Their gaming offerings are too fragmented now between Polaris, Vega and Navi. Same with the naming schemes; they need to unify everything and send a clear message to buyers: this is what we offer, something for every price range. It's a mess right now.
  4. Still use mine. I love the feature where you can easily browse community-made configurations in-game through the Steam overlay. It's a clever way to bridge the gap between it being the most customizable controller and the fact that most people just want to jump in and play without tinkering. Hope they make a V2.
  5. I didn't know that stuff like this was still happening: AMD CPUs being detected via the vendor ID and then steered onto suboptimal code paths.
  6. It's not a demo. It will be a proper feature-length game. Not the Vive in particular; Valve wants a healthy VR headset and games ecosystem around Steam. The better PC VR does in general, the better for Valve. Most of their money comes via Steam, and the VR ecosystem is another innovation to drive that forward.
  7. Vulkan and DX12 have explicit multi-GPU support in the API. It's different from CrossFire and SLI: the game developer has to specifically program for it to spread the load across multiple graphics chips.
  8. Update - 10/11/2019 - turn on async compute for Vulkan. Some users, including Hardware Unboxed, had noticed hitching/stuttering under Vulkan. It seems like Rockstar forgot to turn async compute on; it needs to be manually enabled in the config file and makes the frametimes smoother. This should help for GCN, RDNA and possibly even Turing. Open the file Documents\Rockstar Games\Red Dead Redemption 2\Settings\system.xml, look for the line <asyncComputeEnabled value="false" />, change the value to "true" and save.
  9. The other possibility is that since this game was built from the ground up as a modern-graphics-API game, with a programming paradigm closer to Vulkan/DX12 and no traditional DX11 backend, RDNA, Turing and GCN are just better equipped for it at an architectural level than Pascal is.
  10. I guess AMD is now in a position where they can charge more. They have delivered multiple consecutive generations of good CPUs now, so their brand value and mindshare have increased. AMD now knows that they have to make as much money as possible in the next two years while Intel is still getting their shit together.
  11. The question is: for those of you who have played it on PC, is it the best looking game to date, technically? Hard to judge from compressed online videos and static screenshots; games always look better in person. If the graphics horsepower has been put to good, efficient use then it's very justified IMO.
  12. Ya, even in the guru3d comments under this article there are lots of people accusing Nvidia of gimping. I think many people don't understand that what is going on is not active gimping; it's just that Nvidia's software engineers, who have limited time, are spending their man-hours trying to make Turing as fast as possible in the latest AAA games. Older generations don't get the same attention because Nvidia is no longer trying to convince people to buy them. That doesn't mean they are gimped or intentionally slowed down, just less optimized.
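The vendor-ID detection complained about in point 5 can be sketched as a toy in Python. This is purely an illustrative model, not any real library's dispatcher; the function names and feature flags are invented for the example:

```python
# Toy model of a CPU dispatcher that keys off the vendor string instead of
# the actual feature flags. Everything here is invented for illustration;
# real dispatchers live inside compiled runtimes and check CPUID directly.

def select_codepath(vendor: str, has_avx2: bool) -> str:
    """The 'bad' pattern: gate the fast path on the vendor ID."""
    if vendor == "GenuineIntel" and has_avx2:
        return "avx2"
    return "sse2"  # fallback, even if the CPU actually supports AVX2

def select_codepath_fixed(vendor: str, has_avx2: bool) -> str:
    """The fix: pick the path based only on what the CPU reports it can do."""
    return "avx2" if has_avx2 else "sse2"

# An AMD CPU with AVX2 gets the slow path under the vendor check...
print(select_codepath("AuthenticAMD", True))        # sse2
# ...but the fast path when only the feature flag is consulted.
print(select_codepath_fixed("AuthenticAMD", True))  # avx2
```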
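The "developer has to specifically program for it" idea from point 7 can be illustrated with a toy sketch. This is not the Vulkan or DX12 API itself, just a Python model of the key difference: under explicit multi-GPU the engine, not the driver, decides which device renders which slice of the frame. All names here are invented:

```python
# Toy illustration of explicit multi-GPU work distribution: the engine
# assigns each GPU a horizontal band of scanlines. With implicit
# CrossFire/SLI the driver guessed how to split work; here the developer
# spells it out per device.

def split_frame(frame_height: int, gpus: list[str]) -> dict[str, range]:
    """Assign each GPU a contiguous band of scanlines to render."""
    band = frame_height // len(gpus)
    work = {}
    for i, gpu in enumerate(gpus):
        top = i * band
        # Last GPU absorbs any remainder so every scanline is covered.
        bottom = frame_height if i == len(gpus) - 1 else top + band
        work[gpu] = range(top, bottom)
    return work

print(split_frame(2160, ["gpu0", "gpu1"]))
```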