
I will never doubt AMD again after I watched this video

c00face

I can't believe how much of a !@#$ nVidia is, and they have been for a very, very long time. I won't ever doubt the boys at AMD again. Those guys are great, and they've always looked out for the interests of gamers as a whole rather than just their own company. They've shared their technologies and open-sourced them so that all gamers can benefit, but nVidia will never do the same.

 

 


This again...

 

EDIT: Just buy what's best for your money and your needs. Performance is what matters. Do not base the purchase on the GPU chip manufacturer.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


Ah, 3/2kliksphillip... didn't think I'd see you being brought up on an LTT forum thread...

 

What he says is correct, but for crying out loud, you don't need to make a thread about this... everyone knows this, so just move on :) 

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


3 minutes ago, Mr.Meerkat said:

Ah, 3/2kliksphillip... didn't think I'd see you being brought up on an LTT forum thread...

 

What he says is correct, but for crying out loud, you don't need to make a thread about this... everyone knows this, so just move on :) 

I didn't know this, and it'll be great to now disable the tessellation feature on AMD to see how it performs.


I liked ATI before it was AMD. Now I can't say the same, because I've had 2 video cards from AMD fail within 2 years. Old ATI? I've had cards last me 10 years. Old Nvidia? I've had cards last me 10 years. Switching over to Nvidia in hopes that they make more reliable GPUs.

 

Anyways, Nvidia isn't obliged to share anything with AMD, and vice versa. It's not their responsibility to hand all of their research and development over to a rival company. When did Apple ever share anything with Microsoft? There are examples of Microsoft porting its stuff over to Apple's platforms, but plenty of times Apple has refused to do the same. And we don't see people throw a fit about that, now do we?

 

 


1 minute ago, Supermangik said:

I didn't know this, and it'll be great to now disable the tessellation feature on AMD to see how it performs.

Okay, not fully true, but just get whatever gives you the best performance for your budget :P (and when I say most people know about this, I mean regulars on techy PC forums would know this)

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


5 minutes ago, KingKeith55 said:

I liked ATI before it was AMD. Now I can't say the same, because I've had 2 video cards from AMD fail within 2 years. Old ATI? I've had cards last me 10 years. Old Nvidia? I've had cards last me 10 years. Switching over to Nvidia in hopes that they make more reliable GPUs.

 

Anyways, Nvidia isn't obliged to share anything with AMD, and vice versa. It's not their responsibility to hand all of their research and development over to a rival company. When did Apple ever share anything with Microsoft? There are examples of Microsoft porting its stuff over to Apple's platforms, but plenty of times Apple has refused to do the same. And we don't see people throw a fit about that, now do we?

 

 

Their products don't affect people or a certain community. If you want to dual boot between Mac and Windows, you can. This isn't true for nVidia and AMD. AMD is willing to share its technologies so that gamers around the world can experience better gameplay, but nVidia will not. This in turn affects the community, because there's no fix unless AMD somehow figures out how the technology works. So I really don't see how Microsoft and Apple compare.


The other issue, as much of a douchebag company as Nvidia is, is that AMD is actually going the open-source and compatibility route because their solutions are, simply, even by small amounts, inferior to Nvidia's. AMD does it so that their solutions get adopted more easily and used more often. People on Nvidia's solutions are fine the way they are, because the AMD variants are usually slightly worse than what Nvidia has to offer. Why go with something that's not as good if you don't have to? I'm sure if AMD were in Nvidia's situation, pushing out superior solutions for everything, they would keep them closed so that they'd have the "best" stuff.

So this works out great for AMD: their clever method of getting more adoption for their solutions (because they're open-sourced) now makes AMD look like a victim of Nvidia's "evil corporate wrath", and they gain a better reputation and more people using their products.

TL;DR Nvidia is a douchebag company; AMD is only open-sourcing things for faster and more widespread adoption of its technology, which in the long run will generate the "big moneys" because everyone uses their tech.


3 minutes ago, huilun02 said:

Nope. They opened up Mantle for development of DX12 and other APIs long before it was scrapped.

We would not have DX12 if not for Mantle. Here's proof for all you wankers who think otherwise.

They didn't even stop Nvidia from using Mantle. And you know what their response would be.

 

AMD pushed for the inclusion of adaptive sync into the DisplayPort standard (ultimately reaching HDMI today)

It's there for anyone to use. Even Intel has shown interest in using adaptive sync with their iGPUs.

Updated my post to explain both sides. You'll see that it's really clever on AMD's part, and it's going to benefit them a lot in the future.


This is how the world works. 

 

Company A is catching up to Company B, so Company B releases some bullshit features that won't work with Company A's product, or that make it perform far worse than their own. 

 

Now, AMD sees Nvidia and what they do. To counter, they create an open-source platform against Nvidia's proprietary stuff, in hopes that manufacturers will adopt it much more quickly than the competitor's. 

 

It's nothing new. Companies do it all the time, especially in technology: tessellation, Nvidia GameWorks, G-Sync, PhysX. And what is it? Bullshit to make people buy their product for features that are stupid. Consumers like us gobble that shit up

Do you even fanboy bro?


3 minutes ago, KingKeith55 said:

Updated my post to explain both sides. You'll see that it's really clever on AMD's part, and it's going to benefit them a lot in the future.

Yes and no, actually. With Nvidia's technologies floating around, AMD has to fight for companies to use its tech. Nvidia doesn't, as they have all the money to pay and bribe companies to use theirs, which in turn can mean AMD's technologies never gain any traction.

 

A reason people hate how everything from Nvidia is closed off isn't the closedness itself, it's that it hurts AMD cards' performance. Look at GameWorks (especially HairWorks): if it's turned on with an AMD card, it hurts performance badly, which amounts to sabotaging performance. Think about what would happen if Nvidia paid devs to have HairWorks turned on permanently while AMD can't optimise for it: their cards would look like they perform like shit compared to Nvidia cards in those titles.

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


35 minutes ago, Morgan MLGman said:

This again...

 

EDIT: Just buy what's best for your money and your needs. Performance is what matters. Do not base the purchase on the GPU chip manufacturer.

Both are equally bad. AMD falsely markets its post-K10 architectures and renames old architectures to sell them as new (a recent trend of theirs); Nvidia had TurboCache (another name for using system memory instead of vRAM: for example, a 512MB GeForce 6200 with only 16-64MB of DDR2 and the rest shared) and older 64-bit-bus GPUs (which run like shit compared to the 128-bit-bus versions despite having the same model name).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I'd like to think that AMD is ethically the better company, and I hope it remains that way.

 

Open-source software, I think, tends to be less of a douchey/greedy approach to things. I mean, in some instances I can understand closed-source licensing, but in most cases I think the majority of stuff should be under open-source licensing.

2 minutes ago, Liltrekkie said:

It's nothing new. Companies do it all the time, especially in technology: tessellation, Nvidia GameWorks, G-Sync, PhysX. And what is it? Bullshit to make people buy their product for features that are stupid. Consumers like us gobble that shit up

True, Nvidia uses those technologies as gimmicks or selling points. I mean, G-Sync and FreeSync do the same thing, but it already looks like more monitors are going for FreeSync than G-Sync, and G-Sync has been around quite a bit longer.

a Moo Floof connoisseur and curator.

:x@handymanshandle x @pinksnowbirdie || Jake x Brendan :x
Youtube Audio Normalization
 

 

 


53 minutes ago, Supermangik said:

Those guys are great, and they've always looked out for the interests of gamers as a whole rather than just their own company.

Yeah, that's why they threw Radeon under the bus and told development to basically do nothing. Because they caaaaaaaare.

 

lol

.


2 minutes ago, Dabombinable said:

AMD falsely markets its post-K10 architectures and renames old architectures to sell them as new (a recent trend of theirs)

Do you mean the R9 200 series to R9 300 series?

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 minute ago, wcreek said:

I'd like to think that AMD is ethically the better company, and I hope it remains that way.

 

Open-source software, I think, tends to be less of a douchey/greedy approach to things. I mean, in some instances I can understand closed-source licensing, but in most cases I think the majority of stuff should be under open-source licensing.

True, Nvidia uses those technologies as gimmicks or selling points. I mean, G-Sync and FreeSync do the same thing, but it already looks like more monitors are going for FreeSync than G-Sync, and G-Sync has been around quite a bit longer.

That's because NVIDIA decided to make a module that costs monitor companies way more money to R&D and put in. You think NVIDIA didn't know they could do this FOR FREE over DisplayPort? I bet they knew, but wanted to make money off it. You know why? Because it adds a +1 to their "LOL I have a feature you don't have" list

Do you even fanboy bro?


1 minute ago, Liltrekkie said:

That's because NVIDIA decided to make a module that costs monitor companies way more money to R&D and put in. You think NVIDIA didn't know they could do this FOR FREE over DisplayPort? I bet they knew, but wanted to make money off it. You know why? Because it adds a +1 to their "LOL I have a feature you don't have" list

I suppose, and now that FreeSync is a thing, it's less of a selling point?

a Moo Floof connoisseur and curator.

:x@handymanshandle x @pinksnowbirdie || Jake x Brendan :x
Youtube Audio Normalization
 

 

 


1 minute ago, wcreek said:

I suppose, and now that FreeSync is a thing, it's less of a selling point?

Yes and no. There are advantages and disadvantages to each. The biggest is that FreeSync relies on monitor companies to set the adaptive refresh range: FreeSync has the ability to go all the way down to 7 FPS, but have you ever seen a monitor that could do that? No. Why? Because no monitor company wants to spend the money expanding the adaptive refresh range we get. It's the catch-22 of which one is the better product.
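Part of why the bottom of the range matters is low framerate compensation (LFC): when the game drops below the monitor's minimum refresh, the GPU can show each frame multiple times to land back inside the range. A rough sketch of the idea, using a hypothetical 40-144 Hz range (not any specific monitor):

```python
def lfc_refresh(fps: float, vrr_min: float = 40.0, vrr_max: float = 144.0) -> float:
    """Pick a refresh rate inside the VRR window for a given frame rate.

    If fps falls below the monitor's minimum, repeat each frame enough
    times that the effective refresh lands back inside the range.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)  # in range: refresh simply tracks fps
    multiplier = 1
    while fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
        if fps * multiplier >= vrr_min:
            break
    return fps * multiplier       # each frame is shown `multiplier` times

lfc_refresh(20)   # 20 fps -> each frame shown twice -> 40 Hz
lfc_refresh(7)    # 7 fps -> each frame shown 6 times -> 42 Hz
```

Note this trick only works cleanly when the range's max is at least about twice its min, which is exactly why those narrow vendor-chosen ranges are a problem.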

Do you even fanboy bro?


7 minutes ago, Morgan MLGman said:

Do you mean the R9 200 series to R9 300 series?

And to add on top of that, the whole R7/R9 200 series, where the names were 280X and lower :/ 

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


AMD is the smaller company; they have to make their features open source to have a chance of competing. If game developers only used Nvidia GameWorks, AMD would be pushed out of the market, so AMD has to make open-source equivalents to compete.


you do realize the video is from like 2015 or something...

and it's been posted on this forum like 50 times minimum...

CPU: Intel i7 5820K @ 4.20 GHz | MotherboardMSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


1 minute ago, Comic_Sans_MS said:

AMD is the smaller company; they have to make their features open source to have a chance of competing. If game developers only used Nvidia GameWorks, AMD would be pushed out of the market, so AMD has to make open-source equivalents to compete.

The issue here is that with Nvidia tech being closed behind doors, it gimps (or gimped) the performance of AMD cards in games, which is why there's (or was) such an outrage at Nvidia :P 

 

2 minutes ago, DXMember said:

you do realize the video is from like 2015 or something...

and it's been posted on this forum like 50 times minimum...

Never seen it before... ermmm... also, it was posted 9 days before 2016 ;) 

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


4 minutes ago, Mr.Meerkat said:

The issue here is that with Nvidia tech being closed behind doors, it gimps (or gimped) the performance of AMD cards in games, which is why there's (or was) such an outrage at Nvidia :P 

 

Never seen it before... ermmm... also, it was posted 9 days before 2016 ;) 

Well - FurWorks was a shitfest - 64x tessellation was beyond overkill
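For anyone wondering why 64x is overkill: tessellation work grows roughly with the square of the factor, so a rough back-of-envelope model (illustrative only, not exact D3D11 triangle counts) looks like this:

```python
def tessellated_triangles(patches: int, factor: int) -> int:
    # Rough model: uniformly tessellating a patch at factor N yields
    # about N^2 triangles per patch (real pipeline counts differ a bit).
    return patches * factor * factor

tessellated_triangles(1000, 8)   # a sane factor: ~64,000 triangles
tessellated_triangles(1000, 64)  # 64x: ~4,096,000 triangles, 64x the work
```

So going from 8x to 64x multiplies the triangle load by 64 for sub-pixel geometry nobody can see, which is why forcing the factor down in the driver recovers so much performance.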

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


19 minutes ago, Morgan MLGman said:

Do you mean the R9 200 series to R9 300 series?

Yep. In the past they at least kept the architecture's name the same and didn't try to hide that it was the same.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

