About Tanaz

  • CPU
    AMD Ryzen 7 3700x
  • Motherboard
    ASUS X570 TUF
  • RAM
    2x16GB HyperX Predator 3200MHz
  • GPU
    MSI GTX 1080 Ti Gaming X
  • Case
    Thermaltake Urban R31
  • Storage
    1x Kingston A2000 256GB NVMe, 1x A-Data SX8200 Pro 1TB, 3x Kingston UV500 240GB
  • PSU
    Corsair RM650x
  • Display(s)
    Main: Alienware AW2521HF 1080p 240Hz
    Secondary: Acer Predator XB271HU 1440p 165Hz
    Third: Samsung C34F791 UltraWide 1440p 100Hz
  • Cooling
    Noctua NH-D15
  • Keyboard
    Ducky Shine 7
  • Mouse
    Razer Deathadder V2
  • Sound
    Logitech Z906 / HyperX Cloud 2 / Rode NT-USB
  • Operating System
    Windows 10 Pro / Any Linux Distro when bored

  1. Sounds to me like a bad motherboard. The chance of multiple components failing on you at the same time is slim to none. Your board is probably spazzing out and throwing random codes. It's difficult to troubleshoot without a spare board. What you can try is updating the BIOS, but other than that you should probably send it back to the manufacturer / seller for repair or replacement.
  2. Upgradeability and repairability are two areas that manufacturers are actively trying to kill. You can't even get GDDR memory from Micron / Samsung for repairs. The goal is to keep you buying their newest products year in and year out. Previously that was achieved because generational technological leaps were so huge that upgrading was a no-brainer and two-gen-old hardware was obsolete. Now all of that has slowed to a crawl, so the only way to keep the customer invested is to make products that are harder to repair and have shorter effective lifespans.
  3. Ultrawide high refresh rate monitors are not 4K. They're 3440x1440, which is considered 2K ultrawide. If you go with that you'll push much higher FPS than with a 16:9 4K monitor, as it contains far fewer pixels than an actual 4K panel. It all depends. People say the 3080 is a 4K 144Hz GPU, but it really isn't when it comes to AAA games. I'd either go with 2K high refresh rate, or if you want the detail you can go with a 4K panel, but keep in mind you won't be able to hold 140+ FPS in AAA games.
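  To put numbers on the pixel-count difference, here's a quick back-of-the-envelope sketch (standard panel resolutions, simple arithmetic):

  ```python
  # Rough pixel-count comparison between 1440p ultrawide and true 4K.
  # Fewer pixels per frame roughly translates to higher FPS on the same GPU.

  def pixels(width: int, height: int) -> int:
      """Total pixels the GPU must render each frame."""
      return width * height

  uw_1440p = pixels(3440, 1440)   # 21:9 "2K ultrawide"
  uhd_4k = pixels(3840, 2160)     # 16:9 4K UHD

  print(f"3440x1440: {uw_1440p:,} pixels")            # 4,953,600
  print(f"3840x2160: {uhd_4k:,} pixels")              # 8,294,400
  print(f"Ultrawide is {uw_1440p / uhd_4k:.0%} of the 4K pixel load")  # 60%
  ```

  So an ultrawide 1440p panel asks the GPU to render only about 60% as many pixels as true 4K, which is why the FPS gap is substantial.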
  4. If you want the same resolution but a higher refresh rate, a great option is the BenQ MOBIUZ EX2710. If you want to make the jump to 1440p, the BenQ EX2780Q is very good but quite a bit more expensive.
  5. You'll most likely need a different controller board for a different type of panel. Also, it really REALLY isn't worth the money. Replacing a broken panel alone is way too expensive in most cases once you account for the panel cost plus shipping from China. Upgrading is also not worth it, because you'll need not only the new panel but a compatible controller board and (maybe) a new power supply, depending on the power requirements of the new panel. So in short - just buy a new monitor. The juice ain't worth the squeeze.
  6. Many people, and tech reviewers in particular, have praised the hell out of Ampere, dare I say embarrassingly so. Realistically the new Nvidia architecture is one of its least efficient and least improved ones yet. The power consumption of the new cards scales linearly with their performance improvement, which points to no generational architectural improvement whatsoever. People keep drawing false comparisons between the 3080 and the insanely overpriced 2080Ti, and leave the 3090 completely out of the question because it is "a Titan-level card". No, it is not a Titan card - hence it doesn't have a
  7. It is tough now, and it's unrealistic to assume there will be many (or any) 2080Tis going for less than a 3070. Once February-March hits and the 3070s and 6800s are widely available, we can expect a price drop below the 3070 MSRP, but even then I don't see a reason for people to part ways with their 2080Tis, given that the difference to a 3080 is not worth it at all, especially when you take into account the additional 150W of power consumption.
  8. Oh, he has a 1080Ti, I didn't even see. Yeah, I wouldn't (and didn't) upgrade.
  9. The whole power consumption argument is kinda sorta not relevant. The 3080 was supposed to pull 320W, but the high-end versions pull 400+. I'm not buying the 220W TDP claim, and even if it holds, a 2080Ti Strix pulls 275W, which is a negligible difference - and that's with a decked-out power delivery. I 100% would buy a second-hand (under warranty) 2080Ti if it's even $50-100 cheaper than a new 3070. The VRAM alone is enough to justify it IMO.
  10. There is a noticeable difference, especially at 1440p. My best friend has a 1070 and I'm running a 1080Ti, and I can safely say that for 1080p I wouldn't deem the upgrade worth it, but for 1440p it 100% is. $170 is not terrible considering the performance bump you'll get - around a 40-50% FPS increase.
  11. Are you talking about GeForce Experience or OBS? Also, what bitrate and resolution are you running? If it is indeed the driver, you can uninstall the newest driver with DDU and install the version before it.
  12. I'd strongly recommend against enabling GPU Scheduling. Are both versions of Windows using the same driver version? It might be a power management issue or something else entirely unknown. Windows 10 updates are like a box of chocolates - you never know what you're gonna get (but instead of chocolates it's trash).
  13. You can set a +100 overclock on the VRAM, which will effectively make it run at 7000MHz (the reported MHz is doubled, since GDDR memory transfers data twice per clock). It is strange, but it might be a weird BIOS tune from the manufacturer. What brand is your 1660?
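  The effective-frequency arithmetic can be sketched like this. Note the 3400MHz base clock is an assumed example value (it makes the +100 offset land on the 7000MHz figure from the post); check your own card's reported memory clock in a tool like GPU-Z or MSI Afterburner.

  ```python
  # Illustrative sketch of how the GDDR "effective" frequency is derived.
  # The base clock value below is an ASSUMED example, not a measured spec.

  def effective_vram_mhz(reported_mhz: int, offset_mhz: int = 0) -> int:
      # GDDR transfers data on both clock edges, so the effective
      # data rate is double the reported memory clock.
      return (reported_mhz + offset_mhz) * 2

  base = 3400                                  # assumed reported clock (MHz)
  print(effective_vram_mhz(base))              # 6800 MHz effective, stock
  print(effective_vram_mhz(base, 100))         # 7000 MHz effective with +100
  ```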
  14. You should check the VRAM usage during gaming. If you're exceeding the VRAM capacity, that might be a possible reason for the memory to downclock itself. Other than that I'm hard-pressed to see a problem (maybe it's PUBG - test other games to see if the problem is repeatable in them).
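  A minimal sketch of that check. On NVIDIA cards you can sample usage with `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits`; here the sample line is hard-coded (so the sketch runs without a GPU), and the specific numbers are made up for illustration.

  ```python
  # Sketch: flag when VRAM usage approaches capacity, which can coincide
  # with downclocking or stutter. The sample string mimics one line of
  #   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits

  def vram_pressure(csv_line: str, warn_at: float = 0.9) -> bool:
      """Return True when used/total VRAM crosses the warning threshold."""
      used_mib, total_mib = (int(x) for x in csv_line.split(","))
      usage = used_mib / total_mib
      print(f"VRAM: {used_mib}/{total_mib} MiB ({usage:.0%})")
      return usage >= warn_at

  sample = "5600, 6144"   # hypothetical: a 6GB GTX 1660 nearly full in-game
  if vram_pressure(sample):
      print("Close to VRAM capacity - a possible cause of downclock/stutter")
  ```

  In practice you'd run the nvidia-smi query in a loop (`-l 1` for one-second polling) while the game is running and watch whether the clock drop lines up with high usage.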
  15. What GPU? What system? What operating system? How long has this problem existed? What measuring tool did you use to determine the VRAM is running at that frequency? What are the GPU temperatures? What games/applications are you running?