
What do you think Nvidia will do to strike back at AMD?

ManOfDisguise

Nvidia really doesn't need to hope or wait, because Mantle is basically just letting games talk to the GPU with less CPU-side driver overhead instead of going through a heavy API layer first. Any company can do that.
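To make the "less CPU overhead" point concrete, here's a toy back-of-the-envelope model (all numbers invented, nothing here is the real Mantle or Direct3D API): the CPU pays a fixed driver cost per draw call, so a thinner API raises the frame rate whenever the CPU, not the GPU, is the bottleneck.

```python
# Toy model: frame time is set by whichever side (CPU or GPU) finishes last.
# A "thin" API like Mantle shrinks the per-draw-call CPU cost, which only
# helps when the CPU side is the bottleneck. Numbers are illustrative only.

def frame_time_ms(draw_calls, cpu_cost_per_call_ms, gpu_time_ms):
    """Frame time = max of CPU submission time and GPU render time."""
    cpu_time = draw_calls * cpu_cost_per_call_ms
    return max(cpu_time, gpu_time_ms)

draws, gpu_ms = 10_000, 12.0                        # hypothetical scene
thick_ms = frame_time_ms(draws, 0.002, gpu_ms)      # 2 us of driver work per call
thin_ms  = frame_time_ms(draws, 0.0005, gpu_ms)     # 0.5 us per call

print(f"thick API: {thick_ms:.1f} ms/frame ({1000/thick_ms:.0f} fps)")
print(f"thin  API: {thin_ms:.1f} ms/frame ({1000/thin_ms:.0f} fps)")
```

In this made-up scenario the thick API is CPU-bound at 20 ms/frame, while the thin one becomes GPU-bound at 12 ms/frame. It also shows the flip side the thread argues about: if the GPU is already the bottleneck, a thinner API gains you nothing.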

You are correct there, but I don't think Nvidia will make a Mantle substitute at all, though I truly hope they do. I think Nvidia is, let's say, too proud; if another company makes a Mantle substitute, they probably won't support it. That's why I think they will have to pressure Microsoft to fix DirectX instead.

Please correct me if I'm wrong or just way out there. I want to be wrong, lol.

CPU: i7 4770k@4.3Ghz GPU: GTX 780ti Motherboard: ROG maximus vi formula PSU: Evga supernova 1000w Platinum

Case: NZXT Switch 810 Memory: g.skill ripjaws X Cooler: Corsair h100i(getting custom loop when i get money next) Storage: Samsung 840 evo 250gb Keyboard: Corsair K95 Mouse: SteelSeries Rival


Nothing, because Mantle means nothing at the moment and the 780 Ti is the king.

 

btw nvidia has gsync


They will do what they always do: release some fancy new technology and lock it down to nVidia users only... only to have AMD later release a similar thing, free and open to anyone.


They will do what they always do: release some fancy new technology and lock it down to nVidia users only... only to have AMD later release a similar thing, free and open to anyone.

Tell me how freesync works with nvidia

Tell me how freesync works with nvidia

It's an open VESA technology... AMD didn't invent it; they took an existing tech they had experience with from their work with laptop and tablet displays and modified it.

AMD just used it to take the piss out of nVidia for being money-whores.


This is rather sad.

 

Freesync (...) is nothing as of yet - check the news section; there's enough about it there. AMD has nothing on NVIDIA here. The tech is quite possibly not available for anything but laptops and tablets.

 

Mantle - the numbers you have quoted are meant for PR and we still haven't seen any proof of the promised performance boosts.

 

GPUs - NVIDIA has the fastest GPU on the market right now; that's one. For a cheaper option you either go with a quiet and cool NVIDIA card with OC headroom, or a hot and loud AMD card for some 'up to' performance.

 

I'm sorry to disappoint you, but NVIDIA is still the premium brand when it comes to high-end tech and AMD is the more budget option. Is it bad for either of them? No. But all things considered, I'd say it's still AMD that needs to catch up. It's sad for me that NVIDIA could just throw out a new premium card within weeks of AMD's move, while we had to wait over half a year for AMD's response in the first place. Hopefully next gen AMD will be even stronger, and that will put some fresh air into the competition instead of the 'hold the tech back until they make their move' strategy we have seen lately.


This is rather sad.

 

Freesync (...) is nothing as of yet - check the news section; there's enough about it there. AMD has nothing on NVIDIA here. The tech is quite possibly not available for anything but laptops and tablets.

 

Mantle - the numbers you have quoted are meant for PR and we still haven't seen any proof of the promised performance boosts.

 

GPUs - NVIDIA has the fastest GPU on the market right now; that's one. For a cheaper option you either go with a quiet and cool NVIDIA card with OC headroom, or a hot and loud AMD card for some 'up to' performance.

 

I'm sorry to disappoint you, but NVIDIA is still the premium brand when it comes to high-end tech and AMD is the more budget option. Is it bad for either of them? No. But all things considered, I'd say it's still AMD that needs to catch up. It's sad for me that NVIDIA could just throw out a new premium card within weeks of AMD's move, while we had to wait over half a year for AMD's response in the first place. Hopefully next gen AMD will be even stronger, and that will put some fresh air into the competition instead of the 'hold the tech back until they make their move' strategy we have seen lately.

Freesync - It's a demoed technology, we have videos, and it's certainly more than we got at nVidia's G-Sync announcement!

Mantle - We have builds of it already; they had an FPS counter on that video demo that was pushing 100 FPS on ultra on mid-to-high-range hardware.

GPUs - You're arguing for a "halo effect", as Linus puts it, which he has said is a terrible way to choose GPUs. Also, quiet and cool nVidia? Unless you are using the nightmarish reference 290X for that comparison, you have absolutely no basis for saying that nVidia is cool and quiet while AMD is hot and loud; the non-reference cards that we all buy are practically IDENTICAL, GPU to GPU. That's like me saying that nVidia is hot and loud because the GTX 480 was. It's asinine!

If anybody has won the last year in GPUs, it's AMD. They got both next-gen consoles, they are making a very strong push into the mobile sector, and they have a solid lineup of GPUs and APUs...

 


Nvidia is in a tight spot, as AMD now has a load of money thanks to making the hardware for BOTH new consoles. AMD used to be the cheaper and worse-performing company, i.e. the 7970 was not on par with the 680. But now the R9 290 is close to the 780, and partner coolers + OC will close that gap; then the same will happen with the R9 290X and the 780 Ti (the 780 Ti may stay ahead, but not by as much). If this is repeated next generation, then AMD will come out ahead.

The amount of R&D that Nvidia is spending on mobile devices etc. is very clear after CES, and that can't be great for their PC GPUs.

Gaming Rig:CPU: Xeon E3-1230 v2¦RAM: 16GB DDR3 Balistix 1600Mhz¦MB: MSI Z77A-G43¦HDD: 480GB SSD, 3.5TB HDDs¦GPU: AMD Radeon VII¦PSU: FSP 700W¦Case: Carbide 300R

 


Freesync - It's a demoed technology, we have videos, and it's certainly more than we got at nVidia's G-Sync announcement!

Mantle - We have builds of it already; they had an FPS counter on that video demo that was pushing 100 FPS on ultra on mid-to-high-range hardware.

GPUs - You're arguing for a "halo effect", as Linus puts it, which he has said is a terrible way to choose GPUs. Also, quiet and cool nVidia? Unless you are using the nightmarish reference 290X for that comparison, you have absolutely no basis for saying that nVidia is cool and quiet while AMD is hot and loud; the non-reference cards that we all buy are practically IDENTICAL, GPU to GPU. That's like me saying that nVidia is hot and loud because the GTX 480 was. It's asinine!

If anybody has won the last year in GPUs, it's AMD. They got both next-gen consoles, they are making a very strong push into the mobile sector, and they have a solid lineup of GPUs and APUs...

 

NVIDIA brought PCs with a working demo and an announcement of release tech. AMD brought 2 laptops with an old tech which, according to posts like this http://linustechtips.com/main/topic/99660-the-real-difference-between-free-sync-vs-g-sync/ , is not capable of working on a desktop-grade PC. They even said themselves that they have no plans, nor any idea if and when they'll get it to work.

 

AMD has NEVER posted official specs of the PC it was running Mantle on, nor the in-game settings. You just know that on some PC, at some settings, BF4 reached 100 fps on a counter. You also know that, according to AMD, Mantle can offer up to a 45% increase in performance. Now I'd like some REAL facts, not PR garbage. If you want to believe in it, you are free to, but don't present it as fact. The only fact we have about Mantle's performance increase is that we have no facts.

 

I do believe that nvidia screwed up with the 480; I'm saying that as a person who has been using that GPU for the past 3 years and been happy with it (btw, it's quieter and cooler than a 290/X). While 3rd-party solutions are amazing, I will not judge reference tech by 3rd-party achievements. What was stopping AMD from releasing a cooler on par with nvidia's Titan blower? And it was their own decision to delay 3rd-party solutions on top of that. Am I happy with that? Hell NO. I may not be interested in AMD products right now, as I don't feel like their target audience, but I do want some real pressure from them so that nvidia has to do something apart from sunbathing for prolonged periods of time.

 

And yes, I'm comparing high-end products. I'm doing that from an EU perspective, where you are less likely to buy something at a huge discount. In my mind, a GPU that can run quietly and silently, has huge headroom for OC and uses high-quality materials is superior to one that is constantly hot and loud, uses the magical 'up to's when describing its performance and is cheaply built. The first case shows some technical maturity; the other rather not. I sincerely hope that AMD is at the point where NVIDIA was with their 400 series.

 

AMD have indeed won the console wars (as they had to in order to survive), but that has nothing to do with this discussion. Nvidia has the meat now; AMD has the promise, which I hope they bring to life, but I've worked with a corp for too long to believe in PR BS. I want proof.


Everyone seems to forget that the R9 series of cards was released pretty much at the end of the fiscal year, whereas the Titan/7xx series was released in the beginning/middle. That is a lot of time for nVidia to sit around and twiddle their thumbs. Furthermore, if it was the case that the 680 was nVidia's reaction to a lackluster 7970, then that means they've had a long, long time to sit around and mull over life and its meaning.

 

Upon release of the R9 cards, their reply was "hey, fully unlocked GK110". I think the only person who was sweating was Jen-Hsun during the 780 Ti announcement. (Brother's head is always so shiny with oil.)

 

nVidia doesn't have their panties in a bunch. Jen-Hsun is probably going to step onto a stage in 3 months' time and be like: behold, the world's fastest graphics card, the GTX 880, 4GB of RAM, blah blah blah, amazing, blah blah blah, faster than the GTX 780 Ti, the world's previously fastest graphics card, by blah. The price tag will be only an arm, a leg, and your firstborn child.


Everyone seems to forget that the R9 series of cards was released pretty much at the end of the fiscal year, whereas the Titan/7xx series was released in the beginning/middle. That is a lot of time for nVidia to sit around and twiddle their thumbs. Furthermore, if it was the case that the 680 was nVidia's reaction to a lackluster 7970, then that means they've had a long, long time to sit around and mull over life and its meaning.

 

Upon release of the R9 cards, their reply was "hey, fully unlocked GK110". I think the only person who was sweating was Jen-Hsun during the 780 Ti announcement. (Brother's head is always so shiny with oil.)

 

nVidia doesn't have their panties in a bunch. Jen-Hsun is probably going to step onto a stage in 3 months' time and be like: behold, the world's fastest graphics card, the GTX 880, 4GB of RAM, blah blah blah, amazing, blah blah blah, faster than the GTX 780 Ti, the world's previously fastest graphics card, by blah. The price tag will be only an arm, a leg, and your firstborn child.

3 months' time? God no...

As we know, we won't be seeing 20 nm Maxwell until Q3 at the EARLIEST.


Quiet, efficient, yet powerful video cards. G-Sync (haha, freesync). Good driver support. Shadowplay and similar features. More powerful flagships.

 

Yo, I don't know about you guys, but I cheer for both companies. Trust me, you wouldn't want either to form a monopoly. How about a $1,500 R9 390X if Nvidia goes under? You can't do squat, because it's not like AMD has to price it competitively.

if you have to insist you think for yourself, i'm not going to believe you.


Well, uh... as long as coin miners keep their shenanigans up, all Nvidia has to do is sit back and laugh. They don't have to drop prices. They just sit back and make money.

 

It's not like I could find an R9 280X for 300 bucks. I sure did find an open-box GTX 770 for 279 with rebate at Microcenter, though.

 

Hell, they have a superclocked ACX-cooler EVGA 770 there right now for 295 when you add the rebate.

 

http://www.microcenter.com/product/415537/02G-P4-2774-KR_NVIDIA_GeForce_GTX_770_Superclocked_w-ACX_Cooler_2048MB_GDDR5_PCIe_30_x16_Video_Card

 

Which one are you buying? A $400-plus 280X, or a GTX 770 with a bad@#% cooler on it for over 100 dollars less?

 

I hope Mantle does well, but past an R9 270 (and even half of those are sold out), Nvidia is the better budget option atm, and they didn't even have to touch their prices. I think Nvidia will be "ok".

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.


NVIDIA brought PCs with a working demo and an announcement of release tech. AMD brought 2 laptops with an old tech which, according to posts like this http://linustechtips.com/main/topic/99660-the-real-difference-between-free-sync-vs-g-sync/ , is not capable of working on a desktop-grade PC. They even said themselves that they have no plans, nor any idea if and when they'll get it to work.

 

AMD has NEVER posted official specs of the PC it was running Mantle on, nor the in-game settings. You just know that on some PC, at some settings, BF4 reached 100 fps on a counter. You also know that, according to AMD, Mantle can offer up to a 45% increase in performance. Now I'd like some REAL facts, not PR garbage. If you want to believe in it, you are free to, but don't present it as fact. The only fact we have about Mantle's performance increase is that we have no facts.

 

I do believe that nvidia screwed up with the 480; I'm saying that as a person who has been using that GPU for the past 3 years and been happy with it (btw, it's quieter and cooler than a 290/X). While 3rd-party solutions are amazing, I will not judge reference tech by 3rd-party achievements. What was stopping AMD from releasing a cooler on par with nvidia's Titan blower? And it was their own decision to delay 3rd-party solutions on top of that. Am I happy with that? Hell NO. I may not be interested in AMD products right now, as I don't feel like their target audience, but I do want some real pressure from them so that nvidia has to do something apart from sunbathing for prolonged periods of time.

 

And yes, I'm comparing high-end products. I'm doing that from an EU perspective, where you are less likely to buy something at a huge discount. In my mind, a GPU that can run quietly and silently, has huge headroom for OC and uses high-quality materials is superior to one that is constantly hot and loud, uses the magical 'up to's when describing its performance and is cheaply built. The first case shows some technical maturity; the other rather not. I sincerely hope that AMD is at the point where NVIDIA was with their 400 series.

 

AMD have indeed won the console wars (as they had to in order to survive), but that has nothing to do with this discussion. Nvidia has the meat now; AMD has the promise, which I hope they bring to life, but I've worked with a corp for too long to believe in PR BS. I want proof.

It is capable of working on a desktop PC; it's just that some monitors have skimped on VESA standards...

The reference cooler from nVidia for the 480 was shite: it ran at a 65-degree delta in Celsius, commonly sitting at about 90 degrees, which is what the 290/X reference runs at. And I assume you have a non-reference 480...

AMD and nVidia both make SHITHOUSE reference cards, with the exception of the 7xx series, but seeing as nobody buys reference cards if there are 3rd-party ones available, which there almost always are, it doesn't really matter. AMD and nVidia are chip design companies, not board and cooler companies.

And oh god...every sentence in the paragraph starting with "And yes, I am..." REEKS of fanboy...

Let's take each fanboy-crap statement one by one.

"GPU that runs Quietly and Silently" - Every non-reference card, Twin Frozr, DC 2, has practically the EXACT same temperature and acoustic properties no matter if it's an nVidia or AMD GPU.

"Has a huge headroom for OC" - Have you seen the 290x Lightning at CES? Because if you had, you would comprehend that OCing capability has practically F*CK ALL to do with the GPU, and far more to do with the card itself! The reference 290x was a SHIT OCer, but thats because it's a SHIT CARD. The 290x GPU itself can do FAR better than that.

"GPU uses high quality materials" - Ok, now I'm just laughing at this one. You do realise that both AMD and nVidia both have TSMC fab their GPUs, both using the same technology, same standards, same everything. And both of their VRAM is made by Hynix. Saying a GPU uses "high quality materials" is fanboy bullshit.


Well, uh... as long as coin miners keep their shenanigans up, all Nvidia has to do is sit back and laugh. They don't have to drop prices. They just sit back and make money.

 

It's not like I could find an R9 280X for 300 bucks. I sure did find an open-box GTX 770 for 279 with rebate at Microcenter, though.

 

Hell, they have a superclocked ACX-cooler EVGA 770 there right now for 295 when you add the rebate.

 

http://www.microcenter.com/product/415537/02G-P4-2774-KR_NVIDIA_GeForce_GTX_770_Superclocked_w-ACX_Cooler_2048MB_GDDR5_PCIe_30_x16_Video_Card

 

Which one are you buying? A $400-plus 280X, or a GTX 770 with a bad@#% cooler on it for over 100 dollars less?

 

I hope Mantle does well, but past an R9 270 (and even half of those are sold out), Nvidia is the better budget option atm, and they didn't even have to touch their prices. I think Nvidia will be "ok".

The price thing is only an issue in the US and Canada; anywhere else it's perfectly fine. A 280X is almost 150 dollars cheaper than a 770 here in Aus.


3 months' time? God no...

As we know, we won't be seeing 20 nm Maxwell until Q3 at the EARLIEST.

 

Sources? Because the last source I read said it would be delayed until about Q2, so I expected an announcement in Q2, with Maxwell in someone's hand by early Q3.


The price thing is only an issue in the US and Canada; anywhere else it's perfectly fine. A 280X is almost 150 dollars cheaper than a 770 here in Aus.

Australian pricing is weird when it comes to tech; I bought my 770 for about the same price as an R9 280X, and that was before the mining craze.

if you have to insist you think for yourself, i'm not going to believe you.


Sources? Because the last source I read said it would be delayed until about Q2, so I expected an announcement in Q2, with Maxwell in someone's hand by early Q3.

I can't find the link, dammit... it was a report from TSMC.

Starting manufacture in Q1/Q2 and ramping up production in Q3/Q4. Seeing as nVidia is a low priority for 20 nm, it's safe to say they will most likely be getting it in Q3, when TSMC is at full production.


Tell me how freesync works with nvidia

 

"Freesync" is a VESA standard that is built into DisplayPort so anyone can use it.


It is capable of working on a desktop PC; it's just that some monitors have skimped on VESA standards...

The reference cooler from nVidia for the 480 was shite: it ran at a 65-degree delta in Celsius, commonly sitting at about 90 degrees, which is what the 290/X reference runs at. And I assume you have a non-reference 480...

AMD and nVidia both make SHITHOUSE reference cards, with the exception of the 7xx series, but seeing as nobody buys reference cards if there are 3rd-party ones available, which there almost always are, it doesn't really matter. AMD and nVidia are chip design companies, not board and cooler companies.

And oh god...every sentence in the paragraph starting with "And yes, I am..." REEKS of fanboy...

Let's take each fanboy-crap statement one by one.

"GPU that runs Quietly and Silently" - Every non-reference card, Twin Frozr, DC 2, has practically the EXACT same temperature and acoustic properties no matter if it's an nVidia or AMD GPU.

"Has a huge headroom for OC" - Have you seen the 290x Lightning at CES? Because if you had, you would comprehend that OCing capability has practically F*CK ALL to do with the GPU, and far more to do with the card itself! The reference 290x was a SHIT OCer, but thats because it's a SHIT CARD. The 290x GPU itself can do FAR better than that.

"GPU uses high quality materials" - Ok, now I'm just laughing at this one. You do realise that both AMD and nVidia both have TSMC fab their GPUs, both using the same technology, same standards, same everything. And both of their VRAM is made by Hynix. Saying a GPU uses "high quality materials" is fanboy bullshit.

I'm afraid I'm an owner of the reference Gigabyte model (http://www.ixbt.com/video3/images/gf100-4/gigabyte-gtx480-scan-front.jpg), and it does idle at around 50 degrees, jumping to about 70 under full load, with one exception: World of Tanks' Christmas menu, which just fries my system, both GPU- and CPU-wise. I blame terrible coding, as it doesn't happen anywhere else (unlimited frames?). I can bring the noise to similar levels as the R9 290/X when I ramp the fan up to 100%, but even then it doesn't have the piercing whiny noise that the latter produces.

 

I might have worded it poorly. If you take a look at a card with a Titan-style blower and compare it to an R9 290/X, on one side you have solid aluminum that gives you that premium feel; on the other you have a bunch of cheap plastic. The Titan cooler can keep up with its clocks and do it well; the R9's cannot. I dislike the plastic whatever-that-is on nvidia's 680, but their 770, 780, 780 Ti and Titan are amazing and nvidia should be praised for them.

 

Of course the MSI Lightning edition is a great overclocker in comparison to the reference version. That is true for both the 780 and the 290X, and you pay a premium for it. But the 780 is a great overclocker in the reference version, while the R9 290/X are not. Reference design is a thing; a lot of people purchase reference cards, and currently nvidia is on top in that respect. Neither AMD nor Nvidia should take credit for MSI's, ASUS', GIGABYTE's, EVGA's and the others' work. What you said is like concluding that it's ok for a game to be broken on release because mods will fix it later.

 

Please make yourself acquainted with the articles on the tech, one of which I have linked. They make it quite clear that laptop/tablet monitors are quite different to the standalone ones used for desktops, and this is the reason why AMD used them instead of some random monitor using that particular VESA standard. I'd prefer not to be called a fanboy. There is no, and I repeat, NO proof of it being usable in the way AMD suggested, and until they provide such proof I shall remain sceptical.

 

I dislike that nvidia can hold back because AMD underdelivers. I'm not happy that they can get away with those prices. Yes, I prefer nvidia's design in the current gen. Yes, I want proof of concept and a working prototype before I even consider a tech to be viable. In no way does that make me a 'fanboy', unless that's some cheap technique to undermine my opinions. If you want to stoop to this level then please, go on... but what will you call all those people jumping on the bandwagon of 'AMD said that X might be possible, OMG WTF BBQ, NVIDIA SO NOOB NOW!'? Drones?


I'm afraid I'm an owner of the reference Gigabyte model (http://www.ixbt.com/video3/images/gf100-4/gigabyte-gtx480-scan-front.jpg), and it does idle at around 50 degrees, jumping to about 70 under full load, with one exception: World of Tanks' Christmas menu, which just fries my system, both GPU- and CPU-wise. I blame terrible coding, as it doesn't happen anywhere else (unlimited frames?). I can bring the noise to similar levels as the R9 290/X when I ramp the fan up to 100%, but even then it doesn't have the piercing whiny noise that the latter produces.

 

I might have worded it poorly. If you take a look at a card with a Titan-style blower and compare it to an R9 290/X, on one side you have solid aluminum that gives you that premium feel; on the other you have a bunch of cheap plastic. The Titan cooler can keep up with its clocks and do it well; the R9's cannot. I dislike the plastic whatever-that-is on nvidia's 680, but their 770, 780, 780 Ti and Titan are amazing and nvidia should be praised for them.

 

Of course the MSI Lightning edition is a great overclocker in comparison to the reference version. That is true for both the 780 and the 290X, and you pay a premium for it. But the 780 is a great overclocker in the reference version, while the R9 290/X are not. Reference design is a thing; a lot of people purchase reference cards, and currently nvidia is on top in that respect. Neither AMD nor Nvidia should take credit for MSI's, ASUS', GIGABYTE's, EVGA's and the others' work. What you said is like concluding that it's ok for a game to be broken on release because mods will fix it later.

 

Please make yourself acquainted with the articles on the tech, one of which I have linked. They make it quite clear that laptop/tablet monitors are quite different to the standalone ones used for desktops, and this is the reason why AMD used them instead of some random monitor using that particular VESA standard. I'd prefer not to be called a fanboy. There is no, and I repeat, NO proof of it being usable in the way AMD suggested, and until they provide such proof I shall remain sceptical.

 

I dislike that nvidia can hold back because AMD underdelivers. I'm not happy that they can get away with those prices. Yes, I prefer nvidia's design in the current gen. Yes, I want proof of concept and a working prototype before I even consider a tech to be viable. In no way does that make me a 'fanboy', unless that's some cheap technique to undermine my opinions. If you want to stoop to this level then please, go on... but what will you call all those people jumping on the bandwagon of 'AMD said that X might be possible, OMG WTF BBQ, NVIDIA SO NOOB NOW!'? Drones?

I never said that asking for prototypes and proofs of concept makes you an ignorant fanboy; I said that claiming nvidia GPUs are made from "higher quality materials" does. And who seriously buys reference cards? No shop I know of sells reference cards on AMD's or nVidia's side unless it's what the board partner is using, such as the Titan cooler. Why argue over the specifics of the reference cards when they're not what we buy?


I never said that asking for prototypes and proofs of concept makes you an ignorant fanboy; I said that claiming nvidia GPUs are made from "higher quality materials" does. And who seriously buys reference cards? No shop I know of sells reference cards on AMD's or nVidia's side unless it's what the board partner is using, such as the Titan cooler. Why argue over the specifics of the reference cards when they're not what we buy?

Most stores around here order 10+ reference cards per company and maybe 1-2 aftermarket ones. So, at least from an EU perspective, reference versions do matter.


Now that's just completely untrue. Running quieter has nothing to do with luck, and while there is a lottery factor in getting good chips, the high-end NVIDIA cards are consistently clocking better than AMD's counterparts.

Freesync - It's a demoed technology, we have videos, and it's certainly more than we got at nVidia's G-Sync announcement!

Mantle - We have builds of it already; they had an FPS counter on that video demo that was pushing 100 FPS on ultra on mid-to-high-range hardware.

GPUs - You're arguing for a "halo effect", as Linus puts it, which he has said is a terrible way to choose GPUs. Also, quiet and cool nVidia? Unless you are using the nightmarish reference 290X for that comparison, you have absolutely no basis for saying that nVidia is cool and quiet while AMD is hot and loud; the non-reference cards that we all buy are practically IDENTICAL, GPU to GPU. That's like me saying that nVidia is hot and loud because the GTX 480 was. It's asinine!

If anybody has won the last year in GPUs, it's AMD. They got both next-gen consoles, they are making a very strong push into the mobile sector, and they have a solid lineup of GPUs and APUs...

Quoting for emphasis, because this kind of trashes the argument the other guy had... I was going to post the same thing. :)

The R9 290s have their flaws here and there, but they're the newest GPUs from AMD. The next line that uses the same chip will be even better, and that likely includes their reference cards. If not, maybe Sapphire or someone else can put out some really nice cards.

As for NVidia OCing better (significantly better, in terms of % gain), this is only true of the highest-end NVidia + EVGA cards with their unlocked voltages and high overall build quality. They also cost a fortune: $800-1000+.

To people who say NVidia still has the market share: that may be true for a while, but AMD has locked down the consoles and is beginning to look strong in the mobile market. On top of that, they're not exactly far behind elsewhere, and they have certain markets or "revolutionary" tech advances almost entirely to themselves, or at least enough to put their name out there in a big way. Intel and NVidia are competition, but so is AMD towards them, which is extremely apparent now IMO.

The boost in sales (if you could say they will definitely be selling more products now than previously), as well as one-upping their competition in various ways at the tail end of 2013 and in early 2014, should help them bridge the gap with their competition very soon. I don't expect AMD to be the top dog by the end of 2015 or anything, but unless more happens, AMD is looking much better than they have in the past, and, as I said before, they were already competition for Intel and NVidia.


If you think about it logically, NVIDIA is actually making more money in the process, because their graphics cards are sitting at attractive prices while AMD's are WAYYYY high due to mining.


Most stores around here order 10+ reference cards per company and maybe 1-2 aftermarket ones. So, at least from an EU perspective, reference versions do matter.

I have not seen a reference card (once again, barring the 290/X and Titan blower) on sale in Australia since the 4xx series from nVidia...

 

