
Nvidia vs AMD Cards

InfinityHardware
I think AMD is best because of price, but I've only had one nVidia card and it never really worked properly.
My 9600GT choked to death on dust... my fault really, but ah well xD

If you buy a card based on nVidia vs. AMD alone, then you're an idiot. You should look at what you need and what you want to use the card for. Look at what both cards have to offer, not just at the brand name. Some people say AMD overclocks better, some say nVidia; this is untrue, and the OEM is a lot more important in this regard.
nVidia's voltage caps on the 600 series are a fact, and the 7000 series gains a lot more performance per MHz increase compared to the 600 series, due to the 600 series' memory bottlenecks.

That's also a fact.


I've got a 200+ MHz offset without adjusting voltage, and I can ramp voltage up to 1175 mV. I'm not here to argue which company is better, because it's clear you've already made up your mind.
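For context, an "offset" overclock just shifts the card's stock boost clock up by a fixed amount. A minimal sketch of that arithmetic, using the GTX 680's 1058 MHz reference boost clock and the +200 MHz offset quoted above (the function name is mine, not from any tool):

```python
def offset_clock(stock_boost_mhz: int, offset_mhz: int) -> int:
    """An offset overclock adds a fixed amount on top of the stock boost clock."""
    return stock_boost_mhz + offset_mhz

# GTX 680 reference boost clock is 1058 MHz; +200 MHz is the offset
# mentioned in the post above.
print(offset_clock(1058, 200))  # 1258
```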

Also, when you claim something is a fact, you'd better post a source for your claim.

How about Linus as a source?

What? Did you suddenly decide to forget about the whole MSI vs. nVidia green-light issue, where they forced MSI to drop the triple over-voltage on the 660 Ti PE, the 670 PE, and the 680 Lightning?

What about when they forced EVGA to stop the over-volting feature on the 680 Classified cards...?

It is a fact. I don't argue about which company is better, because that is pointless, but I do argue about which product is better.

Just to put it into perspective: HD 7950s, when they were first launched, were clocked at 800 MHz, and people took them all the way up to 1200 MHz. That's a 50% increase in core clock rate, and you can't do that with any 600 series nVidia card. The overclocking argument is valid, and nVidia customers like you should be outraged that they are taking it away from you.
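The arithmetic behind that 50% figure, using only the numbers from the post (the function name is mine, for illustration):

```python
def oc_gain_percent(base_mhz: float, oc_mhz: float) -> float:
    """Percentage increase in core clock from an overclock."""
    return (oc_mhz - base_mhz) / base_mhz * 100

# The HD 7950 example: 800 MHz stock taken up to 1200 MHz.
print(oc_gain_percent(800, 1200))  # 50.0
```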

Even Slick admits that nVidia is slowly distancing itself from overclocking, among other important enthusiast DIY features that it used to offer and we used to take for granted.

I've owned both nVidia and AMD/ATI cards; in fact, I've owned twice as many nVidia cards. I know what I'm talking about. I used to overclock these things.

Sources:

http://www.tweaktown.com/news/26082/...pus/index.html

http://www.overclockers.com/nvidia-s...oltage-control

Weekly live stream: jump to 41:48, watch to 43:00.


I chose the EVGA GTX 680 4GB for a few reasons. Some of them everyone will care about; others are specific to me.

Adaptive V-sync

Some people don't care about this. I get that. However, this does something very important: it prevents your GPU from working itself to death. Less time at 100% load usually means less heat; less heat means better overclocking potential and, far more importantly, a longer lifespan. Those are pros everyone wants from their graphics card.
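The capping behavior described above can be sketched as a simple rule. To be clear, this is a hypothetical illustration of the idea, not NVIDIA's actual driver logic, and the function name is mine:

```python
def adaptive_vsync_on(fps: float, refresh_hz: float) -> bool:
    """Classic V-sync caps the GPU at the monitor's refresh rate but
    stutters when the GPU falls behind; adaptive V-sync enables the cap
    only while the GPU can keep up, so the card idles instead of
    rendering frames the monitor will never display."""
    return fps >= refresh_hz

print(adaptive_vsync_on(90.0, 60.0))  # True: GPU ahead, cap it, less heat
print(adaptive_vsync_on(45.0, 60.0))  # False: GPU behind, uncap to cut stutter
```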

PhysX

Yes, it only makes a difference in games that utilize it. Borderlands 2 is a great example. The difference is large and noticeable to me, so it's worth it for the games that have it.

Better multi-GPU support

Yes, both companies' multi-GPU support sucks, but nVidia sucks slightly less than AMD. I definitely intend to buy a second GTX 680 down the road for SLI.

Free Borderlands 2

I won the Never Settle promo game pack in a website competition, so that holds no real value for me with regard to buying an AMD product.

Better drivers in general.

I understand that it basically goes back and forth between the two companies; however, right now, nVidia's drivers are better, the lag problem with AMD's cards being an example of this.

AMD is about to release the 8XXX series (about meaning generally soon), so the 7XXX series will be outdated sooner.

Not that that one actually matters as much.

My system's theme is green. nVidia cards are green... yeah.

You get the idea.



It is my understanding that it was a warranty issue, or it may be that nVidia, much like Intel, doesn't want you to be able to overclock too much. Also, you have not posted a source. I have never seen a 400+ MHz offset on a GPU; I would like to see this.

I wouldn't give you 100% accurate information with several sources on several facts and then make myself liable by lying about one of them, yet you keep going back to your default "what are your sources?".

Which is fine, I guess, but it goes to show how much research anyone should do before making a snap judgement on the products of a company like nVidia, AMD, or any other.

I think that you've been misled by a lot of the false advertising going on.

Anyway, here is just one example of a 50% increase on the core clock of a 7950.

Tiny Tom Logan is a down-to-earth guy and an extremely credible journalist. Even Linus watches and recommends his videos.


@Tech, you do wonder where his source is to back up his statement.

