
AMD is really falling behind NVIDIA

ContemplatingBeluga

Heat output is irrelevant as well. :)

They make it seem like GPUs put out 500 °C worth of heat, LOL. For a lot of us who live in Canada, heat is a good thing, because it's cold here for most of the year.


Some people don't care about heat output. I like it as an indicator of high performance. More heat, more power consumption, more performance... "it's simple dynamics."

That's not how electronics works... more power consumption does NOT in any way equal more performance. Please don't try to school me on "simple dynamics"; I have an engineering degree.

 

Have you been paying attention to Maxwell at all? I honestly don't even know how to respond to that...

 

Try this: my good old GTX 570 uses 219 W of power, according to Nvidia.

 

Now, interestingly enough, a GTX 770 uses 230 W of power, also according to Nvidia.

 

By your argument, the 770 is only about 5% better than my 570. That is an absolutely insane argument, and it's disproven by every single benchmark and game on the market.
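The arithmetic behind that ~5% figure, for anyone who wants to check it. This is just a sketch using the TDP numbers quoted above; the variable names are mine:

```python
# TDP figures quoted above, from Nvidia's specs.
gtx_570_tdp = 219  # watts
gtx_770_tdp = 230  # watts

# If power draw alone predicted performance, the 770's advantage
# over the 570 would be just the ratio of their TDPs.
predicted_gain = (gtx_770_tdp / gtx_570_tdp - 1) * 100
print(f"Predicted gain from TDP alone: {predicted_gain:.0f}%")  # prints ~5%

# Real benchmarks put the 770 far ahead of the 570, which is why
# TDP is a poor proxy for performance across different architectures.
```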

 

Please, just stop.

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 


Yet you are still arguing about relatively minor concerns such as case temperature, which is much more easily solved than a GPU temperature problem.

A 25% reduction in heat output does reduce case temps. I'd rather not have the problem in the first place than have to solve it once I do.


 


That's not how electronics works... more power consumption does NOT in any way equal more performance. Please don't try to school me on "simple dynamics"; I have an engineering degree.

 

Have you been paying attention to Maxwell at all? I honestly don't even know how to respond to that...

We have to keep it apples to apples, which you are not. High-end Maxwell will consume considerably more power than low-end Maxwell, and will in turn create more heat and deliver more performance.

You are moving the goalposts now, and lying to try to make Nvidia look good and sugar-coat your lies.


We have to keep it apples to apples, which you are not. High-end Maxwell will consume considerably more power than low-end Maxwell, and will in turn create more heat and deliver more performance.

You are moving the goalposts now, and lying to try to make Nvidia look good and sugar-coat your lies.

What exactly was my lie? And how did I move the goalposts?

 

You said that power consumption was a good way to judge performance, and I refuted that with facts. Did I miss something?


 


What exactly was my lie? And how did I move the goalposts?

 

You said that power consumption was a good way to judge performance, and I refuted that with facts. Did I miss something?

Apples to apples. High-end cards always consume more power than low-end cards. Why would you compare generation-old tech to brand-new tech? That's like saying I am wrong because a 6800 Ultra consumes more power than a 750 Ti but performs worse. That is called moving the goalposts. Maxwell to Maxwell, Fermi to Fermi, Tahiti to Tahiti: high-end cards will always consume more power, create more heat, and perform better than their low-end counterparts.


Apples to apples. High-end cards always consume more power than low-end cards. Why would you compare generation-old tech to brand-new tech? That's like saying I am wrong because a 6800 Ultra consumes more power than a 750 Ti but performs worse. That is called moving the goalposts. Maxwell to Maxwell, Fermi to Fermi, Tahiti to Tahiti: high-end cards will always consume more power, create more heat, and perform better than their low-end counterparts.

Then you cannot make that same claim between AMD and Nvidia at all, which you did...

 

Neither of the cards I listed was low end at introduction; both were mid-level cards. The only reason I used that particular comparison is that one is sitting next to me, and that gave me the idea.

 

I have not lied, for clarification. 


 


Then you cannot make that same claim between AMD and Nvidia at all, which you did...

 

Neither of the cards I listed was low end at introduction; both were mid-level cards. The only reason I used that particular comparison is that one is sitting next to me, and that gave me the idea.

 

I have not lied, for clarification. 

So then why compare the only Maxwell card out when there is no other new tech available? Also, the 750 Ti is a low-end card; the 760 is mid-range.


Try this: my good old GTX 570 uses 219 W of power, according to Nvidia.

 

Now, interestingly enough, a GTX 770 uses 230 W of power, also according to Nvidia.

 

By your argument, the 770 is only about 5% better than my 570. That is an absolutely insane argument, and it's disproven by every single benchmark and game on the market.

I never said, implied, or tried to argue any of that bollocks in bold above. YOU DID, and herein is where you began to lie, by putting words into other people's mouths. I simply stated: more power consumption, more performance. Then you started to move the goalposts and flash numbers around, specifically that 5% figure you pulled from your ass.

All that aside, I am glad you brought up the GTX 770, as it is an excellent part for discussing this issue: it is an APPLES TO APPLES direct comparison.

GTX 680 - 195 W Kepler part

GTX 770 - 230 W Kepler part

 

Which one performs better and consumes more power? Excuse me, that was indeed a rhetorical question, LOL.

Take the GTX 680: Nvidia went the... "efficiency" route with that card and lost out in the performance department.

 

This is a real shame, and it makes Nvidia fanboys look like asshats, which I know is not entirely the case.


Whatever you say, the bottom line is Nvidia is always going to be better and AMD can't beat that!!!

That has nothing to do with what was being discussed in the post you replied to. Please stop being ignorant.


So then why compare the only Maxwell card out when there is no other new tech available? Also, the 750 Ti is a low-end card; the 760 is mid-range.

I never compared the 750 Ti to anything; I haven't even brought it up. I did mention Maxwell in passing, as a rebuttal to your argument that power consumption = performance. We have since moved off of that, as you have clarified your position.


 


That's not an apples-to-apples comparison. You need to compare Maxwell to Maxwell, not GK106 to Maxwell. I don't care about power consumption, just performance. What are you on, battery power?

Where did you ever mention that it would only be true in an apples-to-apples comparison? You simply said that more power consumption = more performance, which clearly isn't true. Of course, when comparing two GPUs of the same architecture, the stronger one will always consume more power.

And I would also like to point out that I'm no Nvidia fanboy. I like AMD just as much as Nvidia, and I did indeed use an R9 280X in the PC I recently built for my dad.

GPU: Gigabyte GTX 970 G1 Gaming CPU: i5-4570 RAM: 2x4GB Crucial Ballistix Sport 1600MHz Motherboard: ASRock Z87 Extreme3 PSU: EVGA GS 650 CPU cooler: Be quiet! Shadow Rock 2 Case: Define R5 Storage: Crucial MX100 512GB

Whatever you say, the bottom line is Nvidia is always going to be better and AMD can't beat that!!!

Although I do believe that right now Nvidia makes a better product than the AMD offerings, it's a bit crazy to say that Nvidia will always be better than AMD. In a year or two the AMD cards may kick the shit out of the Nvidia offerings; who knows?


 


Where did you ever mention that it would only be true in an apples-to-apples comparison? You simply said that more power consumption = more performance, which clearly isn't true. Of course, when comparing two GPUs of the same architecture, the stronger one will always consume more power.

 

Unfortunately, the only way for you to make your claims work is to move the goalposts by comparing two completely different architectures from different times. That's like comparing a brand-new 2013 Toyota Prius to a 1980 Honda Civic with 220k on the odometer.


PhysX = TressFX

ShadowPlay = DVR

GameStream - Steam In-Home Streaming will counter this... and trust me, I had a Shield and sold it, because after a week of using it, it got boring real quick.

Power consumption - AMD is working on more power-efficient architectures as well. Look at Tonga and the R9 285: it's nearly as strong as the 770 and consumes only slightly more power than the 760.

G-Sync = FreeSync, but free, not costing extra, and available to anyone who uses VESA standards (which almost everyone does).

Mantle has no competition, so is Nvidia falling behind? And AMD has said that when it comes out of beta, they're going to offer it to their competition (Intel and Nvidia).
 

5820k4Ghz/16GB(4x4)DDR4/MSI X99 SLI+/Corsair H105/R9 Fury X/Corsair RM1000i/128GB SM951/512GB 850Evo/1+2TB Seagate Barracudas


Mediocre chip design can be mitigated by throwing more power at it, bringing it up to par. How much power is based on the manufacturer's (AMD's) comfort level and willingness to deal with RMAs, DOAs, thermal degradation, etc.

Given realistic thermal headroom, a well-designed chip can use that extra power for greater gains. However, the manufacturer (Nvidia) gave users the option of whether or not to increase the power, because those untapped watts weren't necessary to remain competitive.

The medium (silicon) and process node (28 nm) are the ONLY legitimate apples-to-apples starting comparison. From that point forward, the design can vary wildly.

The current-gen GPUs are proof enough that wattage does not indicate performance; chip design making good use of wattage does.
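The metric being argued about here is really performance per watt. A minimal sketch of that idea; the scores and wattages below are made-up illustrative numbers, not benchmark data:

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by board power: the efficiency metric."""
    return score / watts

# Two hypothetical cards on the same process node, different designs,
# tuned to land at the same performance level.
cards = {
    "well-designed chip": perf_per_watt(score=100.0, watts=165.0),
    "brute-force chip": perf_per_watt(score=100.0, watts=275.0),
}

# Equal performance, very different efficiency: wattage alone tells
# you nothing about which design is better.
for name, eff in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {eff:.3f} points/W")
```

The point of the sketch: a chip that needs 275 W to match a 165 W chip has the same "performance" but much worse design efficiency, which is exactly why raw power draw is a bad yardstick.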


Yeah, I agree, AMD really fell behind in negativity.


Fortunately:

Looks at milestone and price tables.

No, you are wrong.

Looks at milestone and price tables.

How they achieved victory doesn't matter.

Idiots. Stupidity is a product of thought treatment. You make decisions about who wins, and then you justify your inferior purchase with, for example, the public value of the company.

Time for another two-month break from stupid land.

CPU: Ryzen 2600 GPU: RX 6800 RAM: DDR4 3000MHz 4x8GB MOBO: MSI B450-A PRO Display: 4K 120Hz with FreeSync Premium.


The current-gen GPUs are proof enough that wattage does not indicate performance

Yeah, right; that's why, as I pointed out earlier, the 770 consumes more power and has increased performance over the 680. More power, more performance. Period.


Because you happen to have a Shield tablet and a specific need for every difference Nvidia has.

Right, BickWhale?

 

Next time, simply add an appropriate title stating:

''AMD is really falling behind NVIDIA, in my life context''

 

or say it's an idiot's opinion. I was really optimistic that this would be a relevant grouping of new information instead of a stupid post.



Yeah, I agree, AMD really fell behind in negativity.

Ya, AMD is really behind; after all, they only brought to market, in beta form, a whole new graphics API with 57 game devs already on board, fixed their shitty drivers, and brought to market the DMA engine for bridgeless, stutter-free, superior multi-card scaling, not to mention Never Settle. Ya, looks like AMD is really far behind, LOL.


Ya, AMD is really behind; after all, they only brought to market, in beta form, a whole new graphics API with 57 game devs already on board, fixed their shitty drivers, and brought to market the DMA engine for bridgeless, stutter-free, superior multi-card scaling, not to mention Never Settle. Ya, looks like AMD is really far behind, LOL.

Still ahead in market offerings in terms of hardware.

Sorry, I didn't know you were a stockholder. :rolleyes:



Still ahead in market offerings in terms of hardware.

Sorry, I didn't know you were a stockholder. :rolleyes:

What does that have to do with anything?


What does that have to do with anything?

It has to do with the fact that your argument is now void and you made yourself look like an idiot.

 

I held off from saying it, as things were already getting too complicated for you, and it is legal to be stupid.

You exceeded my pessimism.

 

Unless you want to prove what this has to do with:

''Ya, AMD is really behind; after all, they only brought to market, in beta form, a whole new graphics API with 57 game devs already on board, fixed their shitty drivers, and brought to market the DMA engine for bridgeless, stutter-free, superior multi-card scaling, not to mention Never Settle. Ya, looks like AMD is really far behind, LOL.''

 

The market offerings have to do with it because, when someone rational goes shopping, AMD wins.

 

 

Here's a guess, too: you're from the USA.



It has to do with the fact that your argument is now void and you made yourself look like an idiot.

 

I held off from saying it, as things were already getting too complicated for you, and it is legal to be stupid.

You exceeded my pessimism.

Perhaps some of us are into supporting the gaming industry, not corporate cocksuckers.

