
don_svetlio

Banned
  • Content Count

    32,709
  • Joined

  • Last visited

Reputation Activity

  1. Like
    don_svetlio got a reaction from Piorun in How Good Are Channel Well (CWT) Made PSU's?   
    Ideally you want your PSU made by SeaSonic, Delta or Super Flower. I don't settle for anything less these days
  2. Like
    don_svetlio got a reaction from ghorbani in 860k, FX 6300, i3 4130   
    The FX 6300's performance is split across 6 cores, whereas the i3's slightly lower overall performance is split between just 2 cores, making each i3 core roughly 1.5-3x faster in most games atm, which can only use up to 4 cores. Newer games do sometimes take advantage of all 6 cores (The Witcher 3) and will have both CPUs performing close to each other.

    The 860k is a very budget-oriented CPU and shouldn't be paired with the 380.
  3. Agree
    don_svetlio got a reaction from J.b091 in Do you really need a fancy I5 or even I7 to play games at 1080p?   
    Even the i5 bottlenecks the 1070/1080 in modern games which use 8-16 threads
  4. Like
    don_svetlio got a reaction from Ictinike in Graphics card TDP and Power Consumption Explained   
    So, seeing as there seems to be a bit of misconception about what TDP and power draw are exactly and how they correlate, I decided to try and explain it as best I can.
    For testing purposes and in order to avoid brand bias - I will be using the GTX 760 and GTX 960 as my main sources for the numbers and measurements. Here goes nothing.
     

    TDP:
    The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a Central/Graphics Processing Unit that the cooler attached to it has to dissipate under normal operation. TDP does not account for power viruses (FurMark, Kombustor, Prime95) or any other sort of torture test - heat output can easily exceed a product's rated TDP if one of those is applied.
    Since safety margins and the definition of what constitutes a real application vary among brands, TDP values between different manufacturers cannot be accurately compared. For example, while a graphics card with a TDP of 200 W will almost certainly use more power at full load than a GPU with a 190 W TDP from the same manufacturer, it may or may not use more power than a GPU from a different manufacturer that has a 190 W TDP.
    The dynamic power consumed by a switching circuit is approximately proportional to the square of the voltage:
    P = C · V² · f
    where C is capacitance, f is frequency, and V is voltage.
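    As a sanity check, the formula can be plugged into a few lines of Python (the capacitance, voltage and clock values below are made-up, purely illustrative numbers, not measurements of any real GPU):

```python
def dynamic_power(c_farads, volts, hz):
    # Approximate dynamic switching power: P = C * V^2 * f
    return c_farads * volts ** 2 * hz

# Hypothetical chip: 1 nF switched capacitance, 1.1 V, 1 GHz
p = dynamic_power(1e-9, 1.1, 1e9)      # -> 1.21 W

# Because power scales with V^2, a 20% voltage drop cuts power by ~36%
p_low = dynamic_power(1e-9, 0.88, 1e9)
print(p, p_low, p_low / p)
```

    The key takeaway is the V² term: voltage reductions pay off quadratically, which is why process shrinks that allow lower voltages matter so much for heat.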

    Now, boring math aside, let me get into how exactly heat is generated and what affects the TDP of each graphics card, to some extent.

    As we all know - each GPU core has an absurd number of transistors, which we hear AMD and Nvidia brag about.
    Nvidia:"Titan X has 8 billion transistors!"
    AMD:"Oh yeah? Fury X has 8.9 billion transistors!"
                       ...and this goes on...
    So what's the significance of those transistors for TDP? Why am I bringing it up? Well - as we all know - shrinking the manufacturing process usually comes with an increase in transistor count. Moving to a smaller process usually means reduced power, as it allows the voltage across the transistors to be lowered. It also increases speed, since the channel is now shorter, so you can have faster-switching transistors or more complex architectures running at the same clocks. And the lower the voltage applied to the transistors, the less heat they generate. So, say GPU A (on 40nm) has 1B transistors @ 1.5V - it would output more heat than GPU B (on 28nm) with 1B transistors running @ 1.25V.

    Although process shrinkage and transistor count cancel each other out to some extent, GPUs have been stuck on the same 28nm process for the last 3-4 years, so we have only seen an increase in transistor count and thus an increase in temps: more transistors = more heat generation = higher TDPs.

    So why is the 960 suddenly so much cooler than the 760? It has fewer transistors. The 760 (3.5B transistors) has a TDP of 170W; the 960 (2.94B transistors) has a TDP of 120W. Yet the 960 outperforms the 760 consistently and by a decent bit. That's architecture for you - optimization goes a long way. Realistically, Maxwell's TDP wasn't achieved magically but by reducing the number of heat-generating components in the GPU core. It's nowhere near as simple as transistor count being the only important thing - there are other factors as well.
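    Using the same P = C·V²·f relationship, the GPU A vs GPU B comparison above can be checked numerically (assuming, purely for illustration, that capacitance and clocks are identical for both):

```python
# With transistor count and clocks held equal, heat scales with voltage squared:
# GPU A at 1.5 V vs GPU B at 1.25 V
ratio = (1.5 / 1.25) ** 2
print(f"GPU A generates ~{ratio:.2f}x the heat of GPU B")  # ~1.44x
```

    So even a modest-sounding voltage drop from 1.5 V to 1.25 V shaves off roughly a third of the dynamic power.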

    Now that we've established what exactly produces the heat and how process shrinkage affects it - let's move on to the most frustrating part - power draw.
     

    Power Consumption
    Now - remember the TDPs of the cards we're looking at? 120W for the 960 and 170W for the 760. Well - according to Bit-tech and AnandTech - those TDPs have little to do with how much power each card actually pulls.

    Bit-tech used Unigine Valley to determine GPU power draw, as they deem it to be mostly GPU-bound. Using a watt meter, they measured the 960 system using about 240W of power and the 760 system about 300W.

    AnandTech used Battlefield 3, as it's an actual game, and got a reading of 335W from the wall for the 760 system. I couldn't find a similar test for the 960, but applying the ~30W offset between the two sites' 760 readings, it'd be safe to assume a 960 test bench would use about 275W for the whole system.

    Factoring out about 100W in both cases for the CPU, HDD and RAM, we get these estimates: the 960 uses about 175W of power and the 760 about 235W.
    Going back to the TDPs - 120W (draws ~175W) for the 960 and 170W (draws ~235W) for the 760. Suddenly we see how TDP does not accurately reflect how much power a given GPU will draw. In most cases, power draw is "guessed" based on TDP - sometimes that's somewhat accurate, but often it's totally wrong.
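    The subtraction done above can be written out explicitly (note the ~100W rest-of-system figure is this post's own rough estimate, not a measured value):

```python
def estimate_gpu_draw(wall_watts, rest_of_system_watts=100):
    # Rough GPU power estimate: wall reading minus everything else in the rig
    return wall_watts - rest_of_system_watts

for name, wall, tdp in [("GTX 960", 275, 120), ("GTX 760", 335, 170)]:
    print(f"{name}: ~{estimate_gpu_draw(wall)} W estimated draw vs {tdp} W TDP")
```

    In both cases the estimated draw lands ~55-65 W above the rated TDP, which is the whole point: TDP is a cooling spec, not a power-draw spec.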

    Now - I'm sure everyone and their grandma can throw an infinite number of benchmarks showing GPU A, B or C drawing less power than my speakers, but you need to remember that for these mid- or high-end products to actually deliver their rated performance, they also need the power. The thing is, Nvidia has managed to reduce the heat given off by individual transistors better than AMD has, resulting in lower TDPs - and that causes people to immediately assume you need a 1000W PSU to run a 380 and that you can run a 960 off a few potatoes and some wires. In almost all cases, that's totally wrong.

    In order to accurately assess how much power a component is drawing, you need specialized testing equipment - such equipment is used by sites like Bit-tech and AnandTech, and those tests are almost always more accurate than a wild guess based on how much heat a certain component puts out. Example: the 7950 Boost has a TDP of 200W, yet draws about the same amount as a 760 - a card with a 170W TDP. Another example is the 380 - it has a TDP of 190W, 70W more than the 960. In Legit Reviews' test, they used a P3 Kill A Watt power meter to determine that the 380 uses about 180W of power - almost identical power draw to the 960, despite the vastly different TDP. The same argument could be made about the 970 and 390, and so on.

    All in all - I simply want to advise everyone to look for reputable sources with the appropriate equipment when researching power and thermals - do not simply assume that the card with the lower TDP will automatically draw less power. Always review the specific model.


    Sources:
    http://www.legitreviews.com/sapphire-nitro-r9-390-8gb-nitro-r9-380-4gb-video-card-review_166123/11
    http://www.bit-tech.net/hardware/graphics/2015/01/22/nvidia-geforce-gtx-960-review-feat-asus/10
    https://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/25.html
    http://www.anandtech.com/show/7103/nvidia-geforce-gtx-760-review/16

    Tags to @Aniallation @STRMfrmXMN @EllieThePurpleFuzzy @quan289 @themaniac

    Feedback, suggestions and corrections are appreciated. Thanks for your time
  5. Like
    don_svetlio got a reaction from Sabir in 3440x1440 vs 2560x1440   
    Yep - that's ~34% more pixels, so roughly 34% more GPU power needed for the same fps.
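    For reference, the pixel-count difference is just arithmetic:

```python
ultrawide = 3440 * 1440   # 4,953,600 pixels
wide = 2560 * 1440        # 3,686,400 pixels
print(f"{ultrawide / wide - 1:.1%} more pixels")  # -> 34.4% more pixels
```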
  6. Informative
    don_svetlio got a reaction from SubLimation7 in Graphics card TDP and Power Consumption Explained   
  7. Agree
    don_svetlio got a reaction from JJVGaming in V-Sync or FPS Limiter?   
    FPS limiter, since V-sync enforces a slight FPS penalty and can only display at the refresh rate or integer fractions of it - say you have a 60Hz screen and dip to 49fps: V-sync will allow only 30fps to be displayed.
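    That behaviour (for classic double-buffered V-sync, without triple buffering or adaptive sync) can be sketched as:

```python
import math

def vsync_fps(refresh_hz, render_fps):
    # Double-buffered V-sync displays at refresh / n for some integer n,
    # because every frame must wait a whole number of refresh intervals
    if render_fps >= refresh_hz:
        return refresh_hz
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(vsync_fps(60, 49))  # -> 30.0
print(vsync_fps(60, 25))  # -> 20.0
```

    So on a 60Hz panel the displayed rate steps down through 60, 30, 20, 15... rather than tracking the render rate smoothly, which is why a small dip below 60fps feels so jarring with V-sync on.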
  8. Informative
    don_svetlio got a reaction from Ovcharski in GTX 1060 3GB - Video Editing   
    Again, it doesn't work like that. You really want a decent SSD and an i7 quad-core with HT for smooth editing. A 750 TI provides perfectly smooth 4K 60fps playback.
  9. Like
    don_svetlio got a reaction from 8BitBuilder in Budget PC deal -- Is it good?   
    It's a scam - 980s NEVER come with 2GB and the Pentium is garbage
  10. Informative
    don_svetlio got a reaction from divided_throwaway in Pentium G4400 vs AMD x4 860k   
    860K if you don't plan on upgrading

    Wait and get an i3 if you want to upgrade later.

    Pentiums aren't worth it, as some modern games refuse to even boot on them
  11. Agree
    don_svetlio got a reaction from KevinLiau in Do you really need a fancy I5 or even I7 to play games at 1080p?   
    I had quite the laugh with those graphs, nice joke man.
  12. Like
    don_svetlio got a reaction from Mr Robot in Experiences with Arrogant "Techies"   
    Hey, I'm great with budget builds
  13. Funny
    don_svetlio reacted to lee32uk in Experiences with Arrogant "Techies"   
    You wanna hear something funny ? Apparently Mr Buzzsaw has worked in IT for over 20 years. 
     
     
    https://linustechtips.com/main/topic/664895-new-pc-build/?do=findComment&comment=8595158
     
  14. Funny
    don_svetlio reacted to david31160 in Experiences with Arrogant "Techies"   
    I know a guy who says a 750ti is an insane graphics card and can run any game on ultra 
  15. Funny
    don_svetlio reacted to lee32uk in Experiences with Arrogant "Techies"   
    The guy is impossible to reason with. If you pull his builds to pieces (which I have done many times) he goes off on a rant. I am on his ignore list now 
  16. Informative
    don_svetlio got a reaction from Aytex in Laptops for college engineering   
    He'd end up with a machine running at 100*C....
  17. Agree
    don_svetlio reacted to D2ultima in razer blade or razer badel sleath   
    https://linustechtips.com/main/topic/658939-why-does-linus-rave-about-the-razer-blade-14-and-do-you-agree-with-him/#comment-8524518
     
  18. Agree
    don_svetlio reacted to Dackzy in razer blade or razer badel sleath   
    Why are people still looking at those shitty laptops?
    1. They all thermal throttle, which is not good. You lose a lot of performance, and it gets so hot that it shortens the laptop's lifespan by a lot.
    2. They are loud - really loud. 60dB.
    3. Built like cheap kids' toys - it doesn't matter if you build the case out of titanium if you then use cheap screws and cheap glue to keep it together. Build quality is about much more than just the feel of a laptop. Many idiots on YouTube and in real life only consider the feel when they talk build quality, but that just tells me they know jack shit about build quality.
    4. No quality control, so you might get one with a dead USB port or a dead keyboard, or one that doesn't want to turn on.
    5. "Support" - one word: useless.
    6. The keyboard feels like something you'd get on a $600 laptop.
     
    All of the Razer Blades are jokes, and if we want to change this, then don't buy them - that is the only way. We as consumers have a lot more power than you might think.
    Let's just get some more people in here who know what they're talking about.
    @Pendragon
     
  19. Agree
    don_svetlio got a reaction from PRTI in 16Gb: 2x8 or 4x4 ? Is there a difference ?   
    If the 4x4 sticks are running in quad-channel, you'd get better bandwidth, which, depending on the application, could show either a decent improvement or no improvement at all
  20. Agree
    don_svetlio got a reaction from Pendragon in razer blade or razer badel sleath   
    Neither? I mean, the Blade 14 runs in the mid-90s under load and sounds like a hair dryer (I am serious - the Blade 14 runs at 55-60dB; most hair dryers are 55-65dB), the keyboard is quite crappy and the battery life is piss poor. Razer support is also terrible

    Blade Stealth? That one has been somewhat improved, though it still runs at 84-85*C under load and is quite loud due to the absurdly small fans and no heatsink. The keyboard is VERY bad - mushy, has no travel, has a LOT of flex in the centre. Razer support is once again terrible
  21. Agree
    don_svetlio reacted to Pendragon in razer blade or razer badel sleath   
    hz LOLOLOLOLOLOLOLOLOL. Yes. 75ms from Razer is something I wouldn't be surprised to see tbh lOLOLOLOLOL
  22. Funny
    don_svetlio reacted to Pendragon in razer blade or razer badel sleath   
    MSI IS COMING BACK TO TN 25MS ON THEIR FUCKING 2016 120HZ SCREENS. SO YOU KNOW. GG MSI. 
  23. Like
    don_svetlio got a reaction from Constantin Mihaila in after a long timp of hard work i need some advice #new gaming build   
    Oookay, no. Not great. Give me 3 mins
  24. Agree
    don_svetlio got a reaction from Benjals in after a long timp of hard work i need some advice #new gaming build   
    I'd say this should be better value
    PCPartPicker part list / Price breakdown by merchant
    CPU: Intel Core i7-6700 3.4GHz Quad-Core Processor  (£262.00 @ Amazon UK)
    Motherboard: MSI Z170A SLI PLUS ATX LGA1151 Motherboard  (£104.72 @ Amazon UK)
    Memory: Kingston FURY 16GB (2 x 8GB) DDR4-2133 Memory  (£66.99 @ Ebuyer)
    Storage: Samsung 850 EVO-Series 250GB 2.5" Solid State Drive  (£79.99 @ Amazon UK)
    Storage: Western Digital Red 2TB 3.5" 5400RPM Internal Hard Drive  (£72.34 @ Aria PC)
    Video Card: Palit GeForce GTX 1070 8GB Dual Video Card  (£368.28 @ Aria PC)
    Case: Fractal Design Define R4 (Black Pearl) ATX Mid Tower Case  (£77.26 @ Amazon UK)
    Power Supply: SeaSonic G-750 750W 80+ Gold Certified Semi-Modular ATX Power Supply  (£103.96 @ CCL Computers)
    Total: £1135.54
    Prices include shipping, taxes, and discounts when available
    Generated by PCPartPicker 2016-09-25 11:10 BST+0100
  25. Agree
    don_svetlio got a reaction from Kimpton in Help Me Choose a Notebook - [University][Programming]   
    Blade and Blade Stealth both have serious problems which no reviewer aside from Gaming Laptop Junky and NBC is covering. Most owners of the Blade series on the NBR forums are really pissed because of that