
About Consul

  • CPU
    Intel Core i7 8700k
  • Motherboard
    MSI Z370-A PRO
  • RAM
    CORSAIR Vengeance LPX 16GB (2x8GB) DDR4 3000MHz
  • GPU
    MSI GeForce GTX 1060 GAMING X 6GB GDDR5 192-bit
  • Case
    CORSAIR Graphite 270R
  • Storage
    Samsung 850 EVO 500 GB SATA 3 SSD
  • PSU
    CORSAIR TX750M 750W 80+ Gold
  • Display(s)
    ASUS 24" VG248QE 1ms 144Hz
  • Keyboard
    Razer Ornata Chroma
  • Mouse
    Razer Deathadder V2
  • Operating System
    Windows 10
  • Phone
    Sony Xperia Z5

  1. Well, I said it about the "majority of people", so are the majority of people NVIDIA fanboys? If it's a business, then why would NVIDIA keep releasing GPUs if they couldn't profit from people buying them?
  2. Why are they doing a refresh of a series so soon when the majority of people can't even get their hands on the normal versions to begin with?
  3. I have checked the PSU; no coil whine there. I ran some tests in 3DMark and didn't hear any coil whine during them, so I guess I should use the card a bit more, because I feel like it's going to go away at some point. I also don't hear the coil whine when I put on my headset.
  4. Yesterday I got my hands on the card. I noticed that it coil whines when I get high frame rates, though with a weird pattern. When I play Yakuza Kiwami 2 with SSAA x1.25 or above, it doesn't coil whine (I get around 140 fps), but as soon as I turn it down to SMAA or lower, it starts to coil whine. The card also coil whines no matter what in Red Dead Redemption 2 (I get above 100 fps with Hardware Unboxed's optimized settings). Is this normal? Could it also be the PSU? (I have a Corsair TX750M, bought in 2017.)
  5. Seems like I'll be going with the 3070 whenever it's available then, thanks for the answers!
  6. Is it worth waiting for the 3070 Ti (looking at its rumored specs), or should I just buy an ASUS TUF 3070 OC whenever it's available? My specs: i7 8700k, 16GB DDR4, 1060 6GB, 1080p monitor.
  7. I've heard that it's basically an Avengers version of Destiny 2.
  8. There are rumors here and there about a 3080 20GB being announced/released after Big Navi, but of course... they are rumors.
  9. So the 3080 20GB will be aimed at buyers who can easily go for a 3090, or am I wrong? If what you say is accurate, will the 3080 20GB be a cheaper 3090?
  10. That clears everything up, if that's the case with next-gen consoles. That is true; however, there is speculation about a big price difference between the 10GB model and the 20GB model, and I don't think I will really need a 20GB GPU.
  11. As a person who is thinking about buying a 3080 at some point, I'd like to ask a question. I've been seeing most people say that 10GB of VRAM is not enough for the near future (2021-2022). I currently have a 1080p monitor and I'll be using the 3080 with that first; when I get the money, I might upgrade to a 1440p 144Hz monitor. So the questions are: Is the 3080's 10GB of VRAM enough? Does GDDR6X make up for the shortage of VRAM compared to the 2080 Ti's 11GB?
  12. After reading this, I'm very happy that I went for an 8700k. Even though it's just 2 FPS, it beats the i9 10900k in RDR2 when overclocked (and it is on the exact same level in average fps when not overclocked).
  13. Also, I'm hyped to personally see the difference between a 3080 and my current 1060 6GB. It's going to be amazing.
  14. I watched that video a while ago, and now that I see what you've written too, here's my conclusion: there is no way an 8700k will bottleneck a 3080, especially when overclocked. Even if there is a bottleneck, it will be so small that it's unnoticeable. (Mine is not overclocked, but I use my PC on the High Performance power plan and it boosts up to 4.5 GHz, which I feel is enough for a 3080 at this point.)