
Katness Everdeen

Member
  • Posts: 118
  • Joined
  • Last visited

Awards

This user doesn't have any awards

Katness Everdeen's Achievements

  1. No they're not, the 290 consumes a tiny bit more power than the 780 and is a tiny bit faster. The power consumption claims are vastly exaggerated.
  2. The H240-X is out and it's awesome! You're looking at the best-performing 280mm closed-loop water cooler of all. It's the coolest, quietest and, most importantly, the most moddable! You can expand the loop as much as you want by adding more rads, reservoirs and tubing to bring one or more GPUs into your loop. The pump is powerful enough to ensure adequate flow across a total of 840mm of radiator, which is the equivalent of six 140mm radiators. source
  3. If it's true that it's going to be 20nm, then AMD wouldn't even have to bother with efficiency improvements. A 290/290X ported to 20nm would automatically become more efficient than the 900 series thanks to the smaller process.
  4. Total system power consumption doesn't go UP with a 980 compared to a 680 if the 980 consumes less power. Logically, total power consumption should go DOWN by 30W if you replace a 680 with a 980, going by Nvidia's official power figures.
  5. http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/23 Again, to everyone who bought the "TDP is not power consumption" argument: the press and Nvidia obviously disagree with you, and even Anandtech's editor fell for it even though his own test results clearly dispute it. According to Ryan Smith the GTX 980 consumes less power than the 680... that's what he says in his conclusion, completely ignoring the testing he did himself and relying on Nvidia's manipulated TDP figures. I don't see how else to say this, but it seems Nvidia fooled everyone.
  6. It's very funny when you think about it, but Nvidia actually sent both of the following images to the press to cover the Maxwell launch, and they contain conflicting information. These figures are for total system power consumption, so they're naturally going to be higher than the GPU's power consumption alone; that's not the issue. However, compare the GTX 980 and the GTX 680. According to Nvidia the GTX 980 consumes 30W less, but the graph above clearly shows the GTX 980 system drawing 12 watts more than the GTX 680 system. Add those 12W to the 680's 195W and the 980 is actually a 207W card, not a 165W card. Then Nvidia does some questionable math to arrive at the 2X performance/W improvement headline: it divides the TFLOPS by the power to get performance per watt, so 5 TFLOPS / 165W ≈ 30 GFLOPS/W, which is twice what the 680 is capable of. That is simply wrong. Now, if you argue that the 165W TDP is actually the heat output of the GTX 980 and not its power consumption, and that TDP (Thermal Design Power) doesn't have to represent power consumption, then something is very clearly wrong here, because you can't use a measure of heat to get performance per watt. The TDP obviously represents power consumption to Nvidia, because they used it to compute performance/W: performance / TDP = 5 TFLOPS / 165W ≈ 30 GFLOPS/W. (A quick sanity check of this math is sketched after this list.) http://videocardz.com/52552/nvidia-geforce-gtx-980-and-gtx-970-press-slides-pictures-charts
  7. 1- Outdated display controllers, not drivers, hence why their GPUs can't support Adaptive Sync and need a G-Sync module acting as a display controller. 2- Completely unrelated & frankly off topic. 3- No they were NOT right, a compatible scaler did exist, and just because a piece of hardware needs a firmware update doesn't mean the hardware doesn't exist. 4- They DID lie, because #1 the standard HAS been updated to support Adaptive Sync, #2 a compatible ASIC already existed, and #3 apparently "nearly impossible" in Nvidia's dictionary means it will happen in a few months... http://www.pcper.com/news/Graphics-Cards/AMD-Demonstrates-Prototype-FreeSync-Monitor-DisplayPort-Adaptive-Sync-Feature
  8. Maxwell 2.0 is 28nm and AMD is still making 28nm chips, so it looks like 28nm is here to stay, and 20nm doesn't sound very exciting either: lower power but also lower clocks. 16nm should bring a true next-generation performance jump and perhaps even make 4K gaming playable on single-GPU cards.
  9. If you can afford two 780 Tis, then get an R9 295X2 instead; it beats everything under the sun, even two 780 Tis in SLI. All DisplayPort outputs support audio pass-through.
  10. Get an FX 8320. Its performance per dollar is simply unmatched. Also get an R9 280 instead of the GTX 760: you get better performance, 3GB of memory, Mantle support and three games of your choice in the Never Settle bundle. R9 280 = HD 7950
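
For reference, here is a minimal sanity check of the performance-per-watt arithmetic in point 6 above. The 5 TFLOPS, 165W and 207W figures come straight from that post; the GTX 680's ~3.1 TFLOPS throughput is an assumption added purely so the "2x perf/W" claim has something to be compared against.

```python
# Sanity check of the performance-per-watt math from point 6 above.
# 5 TFLOPS and 165 W are Nvidia's own GTX 980 figures; 207 W is the post's
# estimate of real board power (the 680's 195 W TDP plus the 12 W gap seen in
# the system-power graph). The GTX 680's ~3.1 TFLOPS is an assumed figure.

def gflops_per_watt(tflops: float, watts: float) -> float:
    """Convert TFLOPS of throughput and watts of board power into GFLOPS/W."""
    return tflops * 1000.0 / watts

gtx980 = {"tflops": 5.0, "tdp": 165.0, "measured": 207.0}
gtx680 = {"tflops": 3.1, "tdp": 195.0}   # assumed throughput, official TDP

print(f"980 at claimed TDP:   {gflops_per_watt(gtx980['tflops'], gtx980['tdp']):.1f} GFLOPS/W")
print(f"980 at measured draw: {gflops_per_watt(gtx980['tflops'], gtx980['measured']):.1f} GFLOPS/W")
print(f"680 at official TDP:  {gflops_per_watt(gtx680['tflops'], gtx680['tdp']):.1f} GFLOPS/W")

# Output:
#   980 at claimed TDP:   30.3 GFLOPS/W  -> roughly 2x the 680, as the slides claim
#   980 at measured draw: 24.2 GFLOPS/W  -> closer to 1.5x if 207 W is the real draw
#   680 at official TDP:  15.9 GFLOPS/W
```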