
xAcid9

Member
  • Content Count

    10,085
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About xAcid9

  • Title
    えなこ ❤
  • Birthday August 1

Profile Information

  • Location
    Malaysia
  • Gender
    Not Telling

System

  • CPU
    Intel Xeon E3-1230V2
  • Motherboard
    MSI B75A-G43
  • RAM
    G.SKILL RipJawsX 4x 4GB
  • GPU
    Zotac GTX 980 Ti AMP! Omega 6GB
  • Case
    Cooler Master HAF EVO XB
  • Storage
    Plextor PX-512M5Pro 512GB
  • PSU
Enermax NAXN+ 650W 80+ Bronze
  • Display(s)
    Samsung 48" J5000 TV
  • Cooling
    CM Hyper 212x
  • Keyboard
    Cooler Master Quick Fire TK (Blue)
  • Mouse
    Logitech G600 White
  • Sound
    Fiio E07K
  • Operating System
    Windows 10 Pro x64

Recent Profile Visitors

9,847 profile views
  1. 3080 for 4K. Or wait for the rumored 3080 Ti. 6800/6800XT is best for 1440p imo.
  2. Doesn't matter if it's not optimized for it. Just like AMD GCN's "raw" compute power.
  3. Both have different ways of doing it; if a game favors one way, then the other will suffer.
  4. 3080 at 4K or higher. 6800 XT below 4K.
  5. Simply because the performance drop is horrible and what it gives in return is barely noticeable in some cases.
  6. Depends on what tools and scenarios they used to measure power consumption. Yes, they've been doing this for years for consistency. You can check their Furmark result for maxed-out power consumption. Not really; you often hit a CPU/memory bottleneck in high-fps gaming. CPU bottleneck = less work for the GPU.
  7. Remove the shroud and strap on a couple of good 120mm fans.
  8. Lower temp if you open the side panel? If yes, this indicates an airflow restriction. If not, then your cooler/heatpipe is probably broken or didn't make proper contact. I hope you covered the whole die with thermal paste and didn't use the pea/rice/line/x method.
  9. Yeah, I just watched that. RIP. Nice overclocker though.
  10. Has any review tested the new encoder? According to Linus, it's still garbage.
  11. In before "power consumption doesn't matter", because Nvidia lost in power consumption this time.
  12. Really? Works for me. Or you can set it in the Nvidia Control Panel.