
AMD GPUs seem to have an advantage over the equivalent GeForce models in the BF5 alpha

D13H4RD

LOL, Tom's Hardware was sent the closed alpha invite for this game by an Nvidia PR firm, so they decided to run benchmarks excluding AMD because they did not want to show Nvidia in a bad light

https://www.tomshardware.com/reviews/battlefield-v-gameplay-benchmarks,5677.html

 

And then they chose to highlight the fact that there is an issue currently being tracked on the forums regarding performance problems on AMD cards. Because yes, pointing out that somebody made a forum thread was better than running your own tests, LOL.

 

Pretty ridiculous how they toed the line for the PR firm when they had the opportunity to run the tests themselves.

 

I don't see the problem with testing on AMD; even if Nvidia provided the game access, shouldn't they expect that when you give access to tech journalists, they will test performance? I guess they want to tightly control the messaging until they have the relative AMD:Nvidia performance where they want it to be...


2 hours ago, Humbug said:

LOL, Tom's Hardware was sent the closed alpha invite for this game by an Nvidia PR firm, so they decided to run benchmarks excluding AMD because they did not want to show Nvidia in a bad light

https://www.tomshardware.com/reviews/battlefield-v-gameplay-benchmarks,5677.html

 

And then they chose to highlight the fact that there is an issue currently being tracked on the forums regarding performance problems on AMD cards. Because yes, pointing out that somebody made a forum thread was better than running your own tests, LOL.

 

Pretty ridiculous how they toed the line for the PR firm when they had the opportunity to run the tests themselves.

 

I don't see the problem with testing on AMD; even if Nvidia provided the game access, shouldn't they expect that when you give access to tech journalists, they will test performance? I guess they want to tightly control the messaging until they have the relative AMD:Nvidia performance where they want it to be...

They explain openly why they  have no AMD cards:

 

Quote

Why no AMD cards? To begin, our closed alpha invite came by way of Nvidia’s PR firm, so we were already wary of running benchmarks that’d pit the two companies against each other. Then we spotted the following known issue on EA’s closed alpha forum: “[TRACKED] BFV - PC - Massive performance issue is present with the AMD RX series cards.” We’ll revisit a head-to-head between AMD and Nvidia once the final game becomes available in a few months.

 

And that's not a generic forum thread; that is an official list of known issues posted by EA.

 

I don't see how it is bad journalism or "ridiculous" when they are being transparent and telling you outright they haven't included AMD because the invite came through Nvidia's PR firm and they don't want to unfairly present AMD benchmarks that are not representative of AMD performance due to an issue currently being fixed.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


54 minutes ago, mr moose said:

They explain openly why they  have no AMD cards:

 

 

And that's not a generic forum thread; that is an official list of known issues posted by EA.

 

I don't see how it is bad journalism or "ridiculous" when they are being transparent and telling you outright they haven't included AMD because the invite came through Nvidia's PR firm and they don't want to unfairly present AMD benchmarks that are not representative of AMD performance due to an issue currently being fixed.

It's a closed alpha. Performance issues are expected. No big deal... I also fully expect Nvidia to get better with game ready drivers and game updates etc.

 

What readers expect of a tech website is to test the performance themselves and then point to the dev feedback, whether in the form of a bug report or an email exchange, etc.

 

When IHVs and PR firms provide items for testing, they generally know that they are going to be compared; it's the norm in the tech journalism and benchmarking field.

 


1 hour ago, Humbug said:

It's a closed alpha. Performance issues are expected. No big deal... I also fully expect Nvidia to get better with game ready drivers and game updates etc.

 

What readers expect of a tech website is to test the performance themselves and then point to the dev feedback, whether in the form of a bug report or an email exchange, etc.

 

When IHVs and PR firms provide items for testing, they generally know that they are going to be compared; it's the norm in the tech journalism and benchmarking field.

 

The only thing I expect as a reader is that the media are honest.  Tick here for that.

 

You can't be misled if the journalist is transparent about why they did what they did.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, mr moose said:

The only thing I expect as a reader is that the media are honest.  Tick here for that.

 

You can't be misled if the journalist is transparent about why they did what they did.

Ok cool,  I am not questioning their integrity or calling them liars.

 

Just that they had the opportunity to publish a BF5 alpha performance preview but chose to do an Nvidia-exclusive preview instead. I do understand that this is now an Nvidia-sponsored game, so obviously Nvidia and the PR firm that gave them access wanted to use it as an opportunity to push Nvidia GPUs only. I suppose they would not have been granted access unless they agreed to these terms...

 

It would have been useful and interesting to their readers to have a bit more variety in the GPU testing as is the norm when benchmarking.


6 minutes ago, Humbug said:

Ok cool,  I am not questioning their integrity or calling them liars.

 

Just that they had the opportunity to publish a BF5 alpha performance preview but chose to do an Nvidia-exclusive preview instead. I do understand that this is now an Nvidia-sponsored game, so obviously Nvidia and the PR firm that gave them access wanted to use it as an opportunity to push Nvidia GPUs only. I suppose they would not have been granted access unless they agreed to these terms...

 

It would have been useful and interesting to their readers to have a bit more variety in the GPU testing as is the norm when benchmarking.

I'm glad they didn't run AMD if there was a chance the results would be unfairly distorted. It's not like they aren't going to do them, or that there aren't others out there doing them.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


15 minutes ago, mr moose said:

I'm glad they didn't run AMD if there was a chance the results would be unfairly distorted. 

There are already norms in the tech press and benchmarking world for this; it's no big deal. Pop in an AMD GPU (of which Tom's has loads), check whether the game runs, capture the performance data, and verify that it is correctly rendering the same image as the Nvidia cards. Mention in the article any abnormalities you discover and point to the dev acknowledgement on the forums, or state that everything ran properly for you and that you were unable to recreate said issue...

 

Also, the game is a work in progress; numbers will change for both sides.
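
As an illustration of what "capture the performance data" usually amounts to in practice, here is a minimal sketch, assuming a hypothetical frametimes.csv with one frame time in milliseconds per line (capture tools such as PresentMon or OCAT can export per-frame timings, though their exact column layouts differ):

```python
# Minimal sketch: average FPS and "1% low" FPS from a list of frame times (ms).
# Assumes a hypothetical "frametimes.csv" with one frame time per line.
import statistics

with open("frametimes.csv") as f:
    frame_times_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000.0 / statistics.mean(frame_times_ms)

# "1% low" here = average FPS over the slowest 1% of frames
# (one common definition; other outlets use the 99th-percentile frame time).
slowest = sorted(frame_times_ms, reverse=True)
slowest_1pct = slowest[:max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 / statistics.mean(slowest_1pct)

print(f"Frames captured: {len(frame_times_ms)}")
print(f"Average FPS:     {avg_fps:.1f}")
print(f"1% low FPS:      {low_1pct_fps:.1f}")
```

Running the same capture on both vendors' cards and reporting any rendering anomalies alongside the numbers is exactly the norm being described above.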


2 minutes ago, Humbug said:

There are already norms in the tech press and benchmarking world for this; it's no big deal. Pop in an AMD GPU (of which Tom's has loads), check whether the game runs, capture the performance data, and verify that it is correctly rendering the same image as the Nvidia cards. Mention in the article any abnormalities you discover and point to the dev acknowledgement on the forums, or state that everything ran properly for you and that you were unable to recreate said issue...

 

Also, the game is a work in progress; numbers will change for both sides.

You keep saying that.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Please close these threads; this isn't NEWS. The particular optimization, or lack thereof, of a particular game in alpha means nothing. Even if it were the final game version, unless you had official news from the developer that they used certain AMD GPU features and it's running faster on AMD, IT'S NOT TECH NEWS.


On 7/4/2018 at 12:36 AM, D13H4RD2L1V3 said:

So it really seems that AMD's optimization efforts, beginning with the Mantle API on Battlefield 4 and GCN, are still paying off. Particularly worrying is how the GeForce GPUs seem to struggle a bit on DX12 while the Radeon ones don't seem to break much of a sweat 

 

It shows that AMD GPUs aren't that bad when they aren't obstructed by shitty programming or other bullshit like excessive tessellation.

 

And no, it's more that their console engagement is paying off, as well as the push for low-level APIs.

 

Have you heard what all the developers said about low-level APIs and Nvidia?

They tell the story that they went to Nvidia first, but Nvidia didn't seem interested, so they went to AMD, which really liked the idea...

 

Quote

It's too early to tell whether AMD cards will still have an advantage to this degree at launch, but there is plenty of time for the delta to close. Let's just hope that the delta is closed by improving performance on GeForce models rather than by having AMD GPUs gimped. 

Sounds like you're mad that the AMD Cards perform better...

Well, the thing is that they are better; most games just aren't optimized for AMD at all, or they integrate some other bullshit to obstruct the performance of AMD cards. 

 

Just look at async compute. Or look at benchmarks for console-focused games that aren't that large (mostly from Japanese developers) and don't come with any of that Nvidia garbage to obstruct the competition. 

 

For example, Ni no Kuni 2:
https://gamegpu.com/rpg/ролевые/ni-no-kuni-ii-revenant-kingdom-test-gpu-cpu

 

And game developers have to optimize to death on AMD hardware because of the PS4 and Xbox One!

 

There are also other examples where AMD cards perform very well and a Vega 56 beats a GTX 1080, for example. And those are games that don't come with an Nvidia logo or Gameworks...

 

On 7/6/2018 at 4:28 AM, yian88 said:

the particular optimization, or lack thereof, of a particular game in alpha means nothing, even if it were the final game version

Agreed, because the final version will probably get some Gameworks features, or Nvidia will "ask" DICE to implement high tessellation to make AMD cards look worse than they are. 

We've seen that a couple of times.

 

On 7/4/2018 at 12:45 AM, Jurrunio said:

It's probably due to heavy use of async compute. Ya know, Pascal cards don't support this on a hardware level, instead using CUDA to emulate async compute.

They don't use CUDA for that; they just use a mostly software-based solution versus the hardware solution AMD uses. AMD's hardware has features that allow it to run multiple workloads on the GPU at once, using the otherwise unused resources. That's what the so-called ACEs (Asynchronous Compute Engines) are for.

 

As for performance on GCN: yes, it improved with later iterations of GCN due to more of these units. So GCN 1.0 (Tahiti, Pitcairn) might not benefit as much as later chips such as Hawaii or Polaris.

 

On 7/4/2018 at 12:51 AM, Jurrunio said:

Old news: https://www.techpowerup.com/228447/amd-cripples-older-gcn-gpus-of-async-compute-support seems to suggest that 200-series cards and later that aren't rebrands of the HD series all benefit from async compute one way or another. Hardware support wins.

If you know the architectural differences between Tahiti and more modern iterations of GCN and look at an official AMD block diagram of the chips, you'd understand why that might or might not have been the case. I'm not in the mood to switch on my PC with an older Tahiti or Pitcairn card, but, as I said, there are some major differences between GCN 1.0 and later iterations like Hawaii, Tonga and so on.

 

For example:

https://www.anandtech.com/show/7457/the-radeon-r9-290x-review/3

Quote

AMD has scaled up the number of ACEs from 2 in Tahiti to 8 in Hawaii. With each ACE now containing 8 work queues this brings the total number of work queues to 64.

That might be the reason for that...
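
To make the "run multiple things on the GPU using the unused resources" point concrete, here is a tiny back-of-envelope sketch (not real GPU code; all timings are made-up, purely illustrative numbers) of why letting a compute pass overlap a graphics pass can shorten the frame when the scheduler can actually run them concurrently:

```python
# Toy model of async compute. All numbers are hypothetical, for illustration only.
graphics_ms = 12.0   # hypothetical graphics-pass time per frame
compute_ms  = 4.0    # hypothetical compute-pass time per frame (e.g. post-processing)
overlap     = 0.75   # hypothetical fraction of the compute pass that can hide
                     # behind idle shader units while the graphics pass runs

serial_frame = graphics_ms + compute_ms                    # passes run back to back
async_frame  = graphics_ms + compute_ms * (1.0 - overlap)  # most of the compute pass overlaps

for label, frame_ms in (("serial", serial_frame), ("async ", async_frame)):
    print(f"{label}: {frame_ms:5.2f} ms/frame -> {1000.0 / frame_ms:5.1f} FPS")
```

How much of that overlap is achievable in practice depends on how well the hardware scheduler (the ACEs and their work queues mentioned above) can fill idle execution units, which is exactly where the GCN revisions differ.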

"Hell is full of good meanings, but Heaven is full of good works"


16 minutes ago, Stefan Payne said:

If you know the architectural differences between Tahiti and more modern iterations of GCN and look at an official AMD block diagram of the chips, you'd understand why that might or might not have been the case. I'm not in the mood to switch on my PC with an older Tahiti or Pitcairn card, but, as I said, there are some major differences between GCN 1.0 and later iterations like Hawaii, Tonga and so on.

 

 

For example:

https://www.anandtech.com/show/7457/the-radeon-r9-290x-review/3

All I know is the 200 series includes new silicon and old rebrands, and only the new chips support async compute.

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


42 minutes ago, Stefan Payne said:

Sounds like you're mad that the AMD Cards perform better...

I don't think "mad" means in English what you think it means. He expressed his hope that the later version will perform better on NVIDIA not by gimping AMD but by increasing NVIDIA performance.

 

Seriously, you need to simmer down this anti-NVIDIA and anti-Intel crusade of yours, because while I do appreciate your input in the PSU forum section, it's getting harder and harder to read you everywhere else. 

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


1 hour ago, Stefan Payne said:

It sounds like you're mad that the AMD Cards perform better...

Uhhhhh, what? 

 

Simmer down a little. I was never mad that AMD cards performed better. I was stating that I hope the delta is closed through better optimization on the NVIDIA side rather than purposely making the AMD cards perform worse in order to make the NVIDIA cards look better. 

 

I recommended the RX 580 and its contemporaries because they are very good upper-midrange options, especially for DX12 titles. I may use a GTX 1060, but I've been critical of NVIDIA's business practices for a long time. My hope has always been AMD making a new GPU that's every bit as competitive for a great price. In essence, the Ryzen of GPUs. 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


I watched the alpha game play on Twitch. Despite people freaking out about the gender thing, it didn't impact multiplayer at all. Weird how that happened.

 

But the graphics looked pretty good. There were plenty of small graphical details, such as snow accumulating on the player's hand and melting over time, or, when the player is shot, blood spattering onto their weapon.

 

And it's crazy that it works so well on AMD compared to Nvidia. I guess it's kind of expected when they just switched gears...

 

...but that gets me wondering... Does the "NVIDIA, the way it's meant to be played" branding actually mean it's been optimized? Or is it just sponsorship/advertising/partnering stuff?


2 hours ago, Ryujin2003 said:

I watched the alpha game play on Twitch. Despite people freaking out about the gender thing, it didn't impact multiplayer at all. Weird how that happened.

The culture wars going on in the West (left vs. right, etc.) have made people on both sides hyper-sensitive about the motives of the other side. So now people freak out even over small things, rather than reserving that reaction for when it is truly justified.


On 7/6/2018 at 11:12 AM, Ryujin2003 said:

And it's crazy that it works so well on AMD compared to Nvidia. I guess it's kind of expected when they just switched gears...

 

...but that gets me wondering... Does the "NVIDIA, the way it's meant to be played" branding actually mean it's been optimized? Or is it just sponsorship/advertising/partnering stuff?

Frostbite has been pretty well optimized for a while now, and AMD GPUs certainly benefit from that. 

 

The NVIDIA thing might just be sponsorship 

 

On 7/6/2018 at 11:12 AM, Ryujin2003 said:

I watched the alpha game play on Twitch. Despite people freaking out about the gender thing, it didn't impact multiplayer at all. Weird how that happened.

Likely due to the reveal trailer, which gave a pretty forced impression 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


Can I ask the people here this:

Why is it that when a game favors AMD by quite a bit, it's "good optimization" and Nvidia cards are "shit", but when a game favors Nvidia by quite a bit, it's "Gimpworks" and "Nvidia gimped AMD cards reeeeeeee"?

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


7 hours ago, Dan Castellaneta said:

Can I ask the people here this:

Why is it that when a game favors AMD by quite a bit, it's "good optimization" and Nvidia cards are "****", but when a game favors Nvidia by quite a bit, it's "Gimpworks" and "Nvidia gimped AMD cards reeeeeeee"?

Some of it can be fanboyism, but there is some legitimacy to the claims. For example, the Nvidia hair effects are specifically designed to only function well on an Nvidia card, and are closed source so that AMD (or others) can't make their cards compatible. The Witcher 3 is one example, where disabling that functionality made the game run significantly better. Whereas the AMD version of the hair effects (can't recall the name offhand) is open source, and works pretty much equally well on AMD or Nvidia cards. There is obviously some benefit on AMD cards, due to its being programmed for the AMD architecture, but it won't drag an Nvidia card down to a crawl when enabled.


1 hour ago, Dan Castellaneta said:

Can I ask the people here this:

Why is it that when a game favors AMD by quite a bit, it's "good optimization" and Nvidia cards are "shit", but when a game favors Nvidia by quite a bit, it's "Gimpworks" and "Nvidia gimped AMD cards reeeeeeee"?

Based on previous behavior from Nvidia, there is some legitimacy to why people scream "gimpworks". Last time I checked, a lot of the Gameworks features are almost sledgehammers against performance. Mostly against AMD, but also against NVIDIA themselves.

 

As for AMD, it's clearly good optimization carried over from previous BF games. Can't speak for other games though.


22 minutes ago, Jito463 said:

Some of it can be fanboyism, but there is some legitimacy to the claims. For example, the Nvidia hair effects are specifically designed to only function well on an Nvidia card, and are closed source so that AMD (or others) can't make their cards compatible. The Witcher 3 is one example, where disabling that functionality made the game run significantly better. Whereas the AMD version of the hair effects (can't recall the name offhand) is open source, and works pretty much equally well on AMD or Nvidia cards. There is obviously some benefit on AMD cards, due to its being programmed for the AMD architecture, but it won't drag an Nvidia card down to a crawl when enabled.

There have been many discussions on this forum regarding Hairworks. It can really only be one of two things: either Nvidia purposely used specific hardware to limit it to their cards, or they simply leveraged the best process they could (it's not up to a company to consider their opposition when developing a product). 

 

Clearly, those from column A are the ones using derisory/petulant nomenclature, and those from column B just get on with life.

 

As far as I am concerned, as soon as people start using those terms (gimpworks and ngreedia or M$ etc) they have shown they are not interested in understanding the issues and just want to bandwagon.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


5 hours ago, mr moose said:

There have been many discussions on this forum regarding Hairworks. It can really only be one of two things: either Nvidia purposely used specific hardware to limit it to their cards, or they simply leveraged the best process they could (it's not up to a company to consider their opposition when developing a product).

Given that there is a clear and obvious trend by Nvidia to want to funnel and force people into their architecture (proprietary G-Sync vs adaptive sync standard, closed source software such as Hairworks/CUDA/etc), and given that while Nvidia's Hairworks only functions properly on Nvidia cards, AMD's TressFX - which behaves very similarly in appearance and functionality -  manages to work on both cards equally well, I'd argue that there's no debate.  Nvidia just likes to focus on making things that only play nice with their stuff.  Then they'll make arrangements with developers specifically to have them implement their closed-source libraries, so the game only runs well on Nvidia cards.

 

This is the main reason that it's such a surprise to see AMD cards besting a comparable Nvidia card on an Nvidia sponsored title.  Of course, BF5 is still in the alpha stages, so only time will tell if it continues to stay this way.


14 minutes ago, Jito463 said:

Given that there is a clear and obvious trend by Nvidia to want to funnel and force people into their architecture (proprietary G-Sync vs adaptive sync standard, closed source software such as Hairworks/CUDA/etc),

 

14 minutes ago, Jito463 said:

and given that while Nvidia's Hairworks only functions properly on Nvidia cards, AMD's TressFX - which behaves very similarly in appearance and functionality -  manages to work on both cards equally well, I'd argue that there's no debate. 

 

I agree, there is no debate; it's exactly what they are doing. The issue tends to be around why some people paint it as unethical or an unfair trade practice. I can see why some don't like it, but having an open-source alternative doesn't make it unethical, and it certainly doesn't mean Nvidia are being greedy. 

14 minutes ago, Jito463 said:

Nvidia just likes to focus on making things that only play nice with their stuff.  Then they'll make arrangements with developers specifically to have them implement their closed-source libraries, so the game only runs well on Nvidia cards.

I would argue that Nvidia focus on making things that work well on their own stuff; whether they actively go out of their way to hinder how it works on AMD remains to be seen. I say this because Hairworks and G-Sync not working on AMD hardware is also a natural result of simply not considering AMD hardware during development, as opposed to actively ensuring it doesn't work.

14 minutes ago, Jito463 said:

This is the main reason that it's such a surprise to see AMD cards besting a comparable Nvidia card on an Nvidia sponsored title.  Of course, BF5 is still in the alpha stages, so only time will tell if it continues to stay this way.

 

Actually, I am not surprised; with so much still being changed and worked on, I expect this to be an anomaly, and I expect to see both brands even out performance-wise as development winds on.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


45 minutes ago, mr moose said:

The issue tends to be around why some people paint it as unethical or an unfair trade practice. 

I classify it as bad engineering, mathematics or science. We give out prizes and accolades for great feats in all three of those and strive for the most elegant and least resource-consuming ways to do them, and my feelings on that apply to game development and the hardware involved too.

 

If there is indeed an alternative to Nvidia Hairworks that produces the same result more efficiently or elegantly, then I deem the former way, which is now (or at least should be) obsolete, inferior.

 

The issue comes in proving it: that both produce the same results and are equally versatile, but one is superior to the other. Even removing the closed-source nature of the Nvidia technology wouldn't make that easy, as that doesn't solve the inherent problem of different GPU architectures; the best method for one may not be the best for the other, etc.

 

45 minutes ago, mr moose said:

I would argue that Nvidia focus on making things that work well on their own stuff; whether they actively go out of their way to hinder how it works on AMD remains to be seen.

The accusation very often is that Nvidia do look at AMD and at ways to make things perform sub-optimally on AMD hardware, not simply make them run well on their own hardware. Rightly or wrongly, people have come to these conclusions based on behavior of the technologies that appears to be more than coincidence, like abnormally high levels of tessellation compared to what general literature on the technique would suggest is necessary for the desired effect.

 

I'm not sure if there are other examples like that, though, where people can point to a hardware weakness in AMD, an Nvidia technology/technique, and literature showing that the behavior is unnecessary.
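
To put rough numbers on "abnormally high levels of tessellation": uniformly subdividing a triangle's edges into n segments produces n² sub-triangles, so geometry load grows roughly quadratically with the tessellation factor (hardware tessellators don't subdivide exactly this way, but they scale similarly). A small illustrative sketch, with factors picked only as examples:

```python
# Illustration only: how fast geometry grows with the tessellation factor.
# Uniformly subdividing a triangle's edges into n segments yields n*n sub-triangles.
base_factor = 8
base_triangles = base_factor ** 2  # 64 triangles per patch at 8x

for factor in (8, 16, 32, 64):
    triangles = factor ** 2
    print(f"factor {factor:2d}x -> {triangles:4d} triangles per patch "
          f"({triangles // base_triangles}x the geometry of {base_factor}x)")
```

This quadratic growth is why a driver-side cap on the maximum tessellation level (such as the override AMD exposes in its driver settings) can claw back a lot of performance with, reportedly, little visible difference, and why reviewers treat very high default factors with suspicion.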

 

45 minutes ago, mr moose said:

Actually, I am not surprised; with so much still being changed and worked on, I expect this to be an anomaly, and I expect to see both brands even out performance-wise as development winds on.

My guess is that on AMD hardware the game is currently not rendering at an equivalent level due to development problems, as suggested; you don't get these kinds of performance differences from mere optimizations when literally no other game exhibits these performance disparities.


5 minutes ago, leadeater said:

I classify it as bad engineering, mathematics or science. We give out prizes and accolades for great feats in all three of those and strive for the most elegant and least resource-consuming ways to do them, and my feelings on that apply to game development and the hardware involved too.

 

If there is indeed an alternative to Nvidia Hairworks that produces the same result more efficiently or elegantly, then I deem the former way, which is now (or at least should be) obsolete, inferior.

If I were a company developing similar tech, I would not be wasting time looking at the opposition other than to ensure they aren't better. Artificially gimping the competition is a secondary priority to making sure your own product works.

 

5 minutes ago, leadeater said:

The issue comes in proving it: that both produce the same results and are equally versatile, but one is superior to the other. Even removing the closed-source nature of the Nvidia technology wouldn't make that easy, as that doesn't solve the inherent problem of different GPU architectures; the best method for one may not be the best for the other, etc.

 

The accusation very often is that Nvidia do look at AMD and at ways to make things perform sub-optimally on AMD hardware, not simply make them run well on their own hardware. Rightly or wrongly, people have come to these conclusions based on behavior of the technologies that appears to be more than coincidence, like abnormally high levels of tessellation compared to what general literature on the technique would suggest is necessary for the desired effect.

That's it for me in a nutshell: no proof, just the same result that can be put down to them working only on their own product. When was the last time we called Ford unethical because their alternator mounting design was not compatible with a Toyota? I am sure they don't go out of their way to make it incompatible; they just don't put effort into making it compatible.

5 minutes ago, leadeater said:

I'm not sure if there are other examples like that, though, where people can point to a hardware weakness in AMD, an Nvidia technology/technique, and literature showing that the behavior is unnecessary.

 

My guess is that on AMD hardware the game is currently not rendering at an equivalent level due to development problems, as suggested; you don't get these kinds of performance differences from mere optimizations when literally no other game exhibits these performance disparities.

I wasn't thinking of optimizations; I was thinking more of bugs that need to be addressed.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


5 minutes ago, mr moose said:

If I were a company developing similar tech, I would not be wasting time looking at the opposition other than to ensure they aren't better. Artificially gimping the competition is a secondary priority to making sure your own product works.

Not such a bad idea in a two-horse race though; you have to be better than the rest, right? When there is only one other competitor, it's not that great of a burden.

 

I know I could win an Olympic gold medal if there was only one other person competing and all he had to do was "have an accident" ;) ;).

