
Vega finally beats RTX? Vulkan strikes again!

BluJay614
4 minutes ago, RejZoR said:

Anyone ever thought it's not "this game is specifically optimized for AMD" but more like everyone else doesn't care to optimize games for AMD, which is why it's so bad everywhere else? I mean, for the RX 590 to be hitting such high performance levels, or even the Vega 56, you ask yourself what all other games are doing wrong that AMD is so much worse elsewhere but absolutely annihilates NVIDIA here... One thing is a few frames' difference, another is an almost low/mid-end card going against a top-of-the-line one...

Seen this many times throughout the years:

lack of drivers

games optimized for a certain arch

6 minutes ago, BluJay614 said:

Which reviewers that are reliable and trustworthy use it? I don't mean bringing up the claims, I mean ACTUALLY using it.

Not many now, but if you just google TFLOPS you will run into old and new articles always bringing it up and comparing.


36 minutes ago, BluJay614 said:

When have you ever seen anyone use TFLOP as an actual metric beyond AMD or Nvidia themselves?

It's the industry-standard metric for floating-point performance, and it's more commonly used for LINPACK than for evaluating a single GPU. The main reason it doesn't hold much relevance for GPU graphics performance is that it's a measure of compute performance, but it is still informative of general hardware capability, especially since many techniques in modern game engines are heavily compute-based. It's not a full picture of GPU performance, but it's not irrelevant either; very much not so if you are using the GPU for compute, though that's not what we are talking about here.
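For what it's worth, the headline TFLOPS figure is just arithmetic: shader units × clock × 2 FLOPs per fused multiply-add. A quick Python sketch using published base-clock specs (boost clocks give higher figures, so treat the exact values as illustrative):

```python
def tflops(shader_units, clock_ghz, flops_per_clock=2):
    """Theoretical FP32 throughput: units x clock x 2 (fused multiply-add)."""
    return shader_units * clock_ghz * flops_per_clock / 1000.0

# Base-clock figures; boost clocks push both numbers up.
vega56 = tflops(3584, 1.156)   # RX Vega 56: ~8.3 TFLOPS
tu102  = tflops(4352, 1.350)   # RTX 2080 Ti: ~11.8 TFLOPS
print(f"Vega 56: {vega56:.1f} TFLOPS, RTX 2080 Ti: {tu102:.1f} TFLOPS")
```

Which is roughly where the 8.2 vs. 11.7 TFLOPS numbers in this thread come from, give or take the clock assumed.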

 

Edit:

Also, I think you missed the point: something with that clear a difference in hardware capability shouldn't perform the same. Compute still uses the GPU's hardware shader pipeline; the core parts of the GPU being used are common across compute and graphics. There are differences, and they matter, but not enough for an over-20% hardware gap to produce identical performance.


Not sure if mentioned but also relevant:

 

 

The ability to google properly is a skill of its own. 


Just now, Bouzoo said:

Not sure if mentioned but also relevant:

 

 

It is, and it isn't. In this case, it's more Vega vs. Turing. This comparison, and all the others Hardware Unboxed have done, demonstrates how different games will perform differently based on how the game is optimized, coded, and who's sponsoring it.


2 hours ago, BluJay614 said:

It is, and it isn't. In this case, it's more Vega vs. Turing. This comparison, and all the others Hardware Unboxed have done, demonstrates how different games will perform differently based on how the game is optimized, coded, and who's sponsoring it.

It is not only Vulkan games, sure, but people have, as usual, started to discuss things in general. It also shows much more: if nothing else, how underused AMD cards are, as is usually the case, and how late AMD is with proper performance support. One of these cards costs more than 50% less. And it is a well-known fact that AMD titles run great on everything while NV titles, with all the bells and whistles, run better on NV cards.



36 minutes ago, leadeater said:

GNM is low-level hardware access and GNMX is higher-level access aimed at smaller devs and quicker projects etc.; GNMX is the first port of call unless you really need that extra performance/optimization.

Which makes me wonder where the adoption of Vulkan stands and how easy it is to work with. Reading online about DX11 vs. DX12 programming tells me that unless you really know what you're doing, or you really need the extra performance, you should start with DX11.

 

If Vulkan is like DX12, then it should follow that it has a similar level of "difficulty."


AMD needs to make their cards better than Nvidia's, not just in some cherry-picked games but across all games. Their reputation and brand as a video card maker is going down fast. As for powering game consoles, that won't help rebuild their image because most consumers who buy them don't even know what's inside. I tried to give them a chance for the past 8+ years or so, hoping they would release a card that truly shows they're back in the game by outperforming Nvidia, but that day never came and never will. So I got tired of waiting and went with Nvidia.

AMD Navi beating Turing? With them right now like this, I seriously doubt that's going to happen.

 

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


1 hour ago, BluJay614 said:

Which reviewers that are reliable and trustworthy use it? I don't mean bringing up the claims, I mean ACTUALLY using it.

It's a metric of performance they use all the time in the HPC world, except it's usually quoted as FP64 numbers and in petaflops.



1 hour ago, leadeater said:

but here we have an AMD 8.2 TFLOP GPU matching an Nvidia 11.7 TFLOP GPU and there is no magic that will ever let that happen unless there is a lack of optimization for Nvidia.

Well... yes and no... but basically you're right.

 

Processing throughput (shader ops) is just one thing; other properties of both drivers and hardware affect perceived (or, as you will, "real world") performance as well. Many of those other properties (like whether the hardware is faster at switching shaders or at switching textures on the processing units) vary not only between manufacturers like Nvidia and AMD but also between generations or models of GPUs.

 

That is closely related to optimization in game engines using comparatively low-level drivers (like sorting display lists by either shader or texture), just to give a simplistic example (which is easily mitigated, but others aren't).

 

So ordering draw calls on one criterion or the other alone can make as much as a 25% difference between GPUs (this is even more pronounced on mobile platforms). If, exaggerating here, a program switch takes 0.1 seconds, the shader-op throughput can be 100 teraflops but it will still be a slideshow when running a game (many shader/texture mixes per frame), although it could "fly" in productivity workloads, where batching hundreds of frames is easy and switches can be avoided by serializing operations.
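To make the sort-order point concrete, here is a toy Python sketch (the draw-call tuples and state IDs are made up for illustration): sorting the same draw list by shader first versus texture first changes how many switches each state slot incurs.

```python
# Hypothetical draw-call list: (shader_id, texture_id, mesh_name).
draws = [(2, 7, "rock"), (1, 3, "tree"), (2, 3, "bush"), (1, 7, "fern")]

def switches(order, slot):
    """Count state changes in one state slot across an ordered draw list."""
    return sum(1 for a, b in zip(order, order[1:]) if a[slot] != b[slot])

# Sorting by shader first minimizes shader rebinds; an engine tuned for
# hardware where texture switches cost more would sort by texture first.
by_shader = sorted(draws, key=lambda d: (d[0], d[1]))
by_texture = sorted(draws, key=lambda d: (d[1], d[0]))

print("shader switches:", switches(by_shader, 0), "vs", switches(by_texture, 0))
```

Same draws, same work per pixel, yet the two orderings pay a different state-switch bill, and which one is cheaper depends on the GPU.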

 

The nasty thing for devs is: 1. not all optimizations are as easy as sort-order changes; 2. some are not at the draw-call level but at the shader level; 3. some involve a strategy spanning both 1 and 2; 4. an optimization for one architecture can be counterproductive on another.

 

This makes it very hard (read: expensive) to "fully optimize" for two or more "tastes", and due to the flexible and low-level nature of the drivers, some of the most effective "strategic" optimizations to match hardware properties cannot live at the driver level (since they require domain knowledge at a higher level).

 

This is also why such differences in performance between seemingly similarly powerful hardware can be seen on any API (open or closed source), such as OpenGL.

 

There is no easy way to mitigate this, unfortunately (since raising the driver API's level of abstraction would reduce flexibility).

 

In the end, especially for consumer-level products like games, this is bad for the consumer, as devs will generally follow the money (especially if sponsorship adds extra incentive...) and optimize for the largest market share (in the long run becoming a reason for even more market share for the targeted hardware). This means less chance for the competition, even if their drivers and hardware are better on some levels (which might very well be the case for AMD).


3 hours ago, 7Hertz said:

No, since I have a Vega 56.

Why so angry?
That I mentioned Tom Clancy's Wildlands, which still crashes on all Turing chips?

"Hell is full of good meanings, but Heaven is full of good works"


2 hours ago, leadeater said:

I know Sony goes a bit insane with consoles but not that much.

Like using an ARM SoC for I/O that maps its memory into the AMD APU's memory.

For accessing the I/O stuff...

Easy mode would have been to just use the AMD side for that.

 

But that might also be the reason why the I/O performance of the PS4 is so plain awful and SSDs don't offer a huge improvement...

2 hours ago, leadeater said:

GNM is low-level hardware access and GNMX is higher-level access aimed at smaller devs and quicker projects etc.; GNMX is the first port of call unless you really need that extra performance/optimization.

 

As it is, it's very similar to Vulkan and DX12, which both have high- and low-level access.

Hm, possibly a variant of Mantle?

That might fit the timeframe...



1 hour ago, NumLock21 said:

AMD needs to make their cards better than Nvidia's, not just in some cherry-picked games but across all games.

Yeah, right.

Ignoring the Drivers.

And what you can do on the Software Side...

 

What if the Tahiti chip was vastly superior to GK104?
What if the Hawaii chip was vastly superior to GK110?

And what if the real difference is due to driver "optimizations" on nVidia's side, which use proprietary culling algorithms that throw away anything not visible to the user?
And replace the shaders the developers intended with nVidia's own shaders...

1 hour ago, NumLock21 said:

Their reputation and brand as a video card maker is going down fast.

Yeah, because their products are good enough, but people bash them at every possible moment.

That includes the tech media, which is very critical towards AMD while going easy on nVidia. Harsh criticism for the shit they do? Nah.

Especially because that might anger them, and then you don't get test samples any more...

 

For example AMD uses 10W more? BIG DEAL! AMD bad, BUY NVIDIA!!!11

 

nVidia does some weird shit with their memory controller: "Ah, dude, you're exaggerating, they know what they're doing. It will be fine..."

 

1 hour ago, NumLock21 said:

As for them powering game consoles, that won't help rebuild their image because most consumers who buy them don't even know what's inside.

Yes, it will, because it gives game developers an incentive to optimize their software for AMD hardware, as consoles are a big part of gaming.

In PC gaming the "normal people" don't buy AMD hardware because they're told from all sides not to buy AMD hardware, for whatever reasons.

With CPUs there seems to be way less brand loyalty than with GPUs, though.

1 hour ago, NumLock21 said:

I tried to give them a chance for the past 8+ years or so, hoping they will release a card that truly shows they're back in the game, by outperforming Nvidia, but that day never came and never will. So I got tired of waiting and went with Nvidia.

Yeah, people with nVidia Turing chips who like to play Wildlands might see it differently.

It's just one of those claims that couldn't be further from the truth: the "amazing" driver quality of nVidia products.

This game crashed reproducibly when opening the inventory. It wasn't fixed until a couple of weeks ago; now it doesn't crash in the inventory any more, but it crashes randomly in the game.

And it's a multiplayer co-op (PvE) game.

 

1 hour ago, NumLock21 said:

AMD Navi beating Turing? With them right now like this, I seriously doubt that's going to happen.

Why doubt, and not wait and see?

With the Radeon VII they kept up and closed the gap a bit, though the card is very expensive...



17 minutes ago, Stefan Payne said:

Yeah, right.

Ignoring the Drivers.

And what you can do on the Software Side...

 

What if the Tahiti chip was vastly superior to GK104?
What if the Hawaii chip was vastly superior to GK110?

And what if the real difference is due to driver "optimizations" on nVidia's side, which use proprietary culling algorithms that throw away anything not visible to the user?
And replace the shaders the developers intended with nVidia's own shaders...

Yeah, because their products are good enough, but people bash them at every possible moment.

That includes the tech media, which is very critical towards AMD while going easy on nVidia. Harsh criticism for the shit they do? Nah.

Especially because that might anger them, and then you don't get test samples any more...

 

For example AMD uses 10W more? BIG DEAL! AMD bad, BUY NVIDIA!!!11

 

nVidia does some weird shit with their memory controller: "Ah, dude, you're exaggerating, they know what they're doing. It will be fine..."

 

Yes, it will, because it gives game developers an incentive to optimize their software for AMD hardware, as consoles are a big part of gaming.

In PC gaming the "normal people" don't buy AMD hardware because they're told from all sides not to buy AMD hardware, for whatever reasons.

With CPUs there seems to be way less brand loyalty than with GPUs, though.

Yeah, people with nVidia Turing chips who like to play Wildlands might see it differently.

It's just one of those claims that couldn't be further from the truth: the "amazing" driver quality of nVidia products.

This game crashed reproducibly when opening the inventory. It wasn't fixed until a couple of weeks ago; now it doesn't crash in the inventory any more, but it crashes randomly in the game.

And it's a multiplayer co-op (PvE) game.

 

Why doubt, and not wait and see?

With the Radeon VII they kept up and closed the gap a bit, though the card is very expensive...

You know Wildlands is just horrible all around?

You can see many, many issues on any hardware and OS.

 


Just now, pas008 said:

You know Wildlands is just horrible all around?

You can see many, many issues on any hardware and OS.

 

Oh yeah, because it causes problems on nVidia and nVidia doesn't bother with a fix, it's "horrible all around".

Why can't you admit that this issue should have been addressed by nVidia and fixed MONTHS ago, since with Pascal it seems to work fine...

 

There is even a German article about the issue:

http://www.pcgameshardware.de/Tom-Clancyxs-Ghost-Recon-Wildlands-Spiel-55716/News/Geforce-RTX-2080-Ti-Absturz-Inventar-1276490/

 



1 minute ago, Stefan Payne said:

Oh yeah, because it causes problems on nVidia and nVidia doesn't bother with a fix, it's "horrible all around".

Why can't you admit that this issue should have been addressed by nVidia and fixed MONTHS ago, since with Pascal it seems to work fine...

 

There is even a German article about the issue:

http://www.pcgameshardware.de/Tom-Clancyxs-Ghost-Recon-Wildlands-Spiel-55716/News/Geforce-RTX-2080-Ti-Absturz-Inventar-1276490/

 

The game is just a mess, lol. It's not just the inventory crash on RTX; many people need to work through a whole checklist just to get it to play.

Just like many other Ubisoft games.

 


15 minutes ago, Stefan Payne said:
Spoiler

Yeah, right.

Ignoring the Drivers.

And what you can do on the Software Side...

 

What if the Tahiti chip was vastly superior to GK104?
What if the Hawaii chip was vastly superior to GK110?

And what if the real difference is due to driver "optimizations" on nVidia's side, which use proprietary culling algorithms that throw away anything not visible to the user?
And replace the shaders the developers intended with nVidia's own shaders...

Yeah, because their products are good enough, but people bash them at every possible moment.

That includes the tech media, which is very critical towards AMD while going easy on nVidia. Harsh criticism for the shit they do? Nah.

Especially because that might anger them, and then you don't get test samples any more...

 

For example AMD uses 10W more? BIG DEAL! AMD bad, BUY NVIDIA!!!11

 

nVidia does some weird shit with their memory controller: "Ah, dude, you're exaggerating, they know what they're doing. It will be fine..."

 

Yes, it will, because it gives game developers an incentive to optimize their software for AMD hardware, as consoles are a big part of gaming.

In PC gaming the "normal people" don't buy AMD hardware because they're told from all sides not to buy AMD hardware, for whatever reasons.

With CPUs there seems to be way less brand loyalty than with GPUs, though.

Yeah, people with nVidia Turing chips who like to play Wildlands might see it differently.

It's just one of those claims that couldn't be further from the truth: the "amazing" driver quality of nVidia products.

This game crashed reproducibly when opening the inventory. It wasn't fixed until a couple of weeks ago; now it doesn't crash in the inventory any more, but it crashes randomly in the game.

And it's a multiplayer co-op (PvE) game.

 

Why doubt, and not wait and see?

With the Radeon VII they kept up and closed the gap a bit, though the card is very expensive...

 

Brand reputation is everything; when you don't have it, nothing else matters. For a person who goes to a retail store to buy a computer, the chances of it being sold with Nvidia graphics are higher than with AMD because of Nvidia's brand reputation. It's the same reason why many prefer Apple over Windows: it lets them charge high prices, and people will still line up and buy.

 



38 minutes ago, Stefan Payne said:

And what if the real difference is due to driver "optimizations" on nVidia's side, which use proprietary culling algorithms that throw away anything not visible to the user?
And replace the shaders the developers intended with nVidia's own shaders...

This sounds interesting :)

 

Could you elaborate, or provide a source? Especially on that second thing, replacing the shaders (I have a special interest, since GPU manufacturers "tweaking" their driver shader compilers for "optimizations" has been a major PITA for me professionally ever since 2011; they tend to optimize one thing but break two others).

 

And any algorithm made by any dev at any company is proprietary; that doesn't necessarily make it bad. Culling away things you can't see is done at several levels in the stack, not just the driver, and not rendering what you cannot see is a pretty legitimate and standard m.o. (Ironically, raytracing makes this harder, which is one of the reasons it eats memory for breakfast and hits performance so much.)

 

That said, the relation between nVidia's "game ready" drivers and specific games makes my skin crawl as well; a driver should be generic and pure, both from a technical standpoint and for a (more) even playing field for devs and consumers alike.

 

I do think that in the long run (to be honest, I hope it's short), nVidia will shoot itself in the foot with this (indie devs not playing along with their game, and indies getting increasingly better chances now that the big studio names are putting out worse and worse titles).

 

I really hope Navi brings AMD what Ryzen did: a happy surprise for consumers, a nasty surprise for the competition :)

 

 

 


11 minutes ago, Bartholomew said:

That said, the relation between nVidia's "game ready" drivers and specific games makes my skin crawl as well; a driver should be generic and pure, both from a technical standpoint and for a (more) even playing field for devs and consumers alike.

Why? If the driver can fix shortcomings in a game, then it should fix them.

 

Also, all a driver does is translate the API calls from the application into something the hardware can understand. If there's any "unfairness", it's because optimizations that help both sides are few and far between. It's like making a game for ARM and x86: while there are generic design patterns that help the application overall, nothing is going to paper over the architectural differences you have to account for. If you only have the time and resources to optimize for one, you're going to have to make a decision.

 

EDIT: The general impression I get from AMD on the software side is that they want to help application developers achieve overall optimizations. Which is noble and all, but AMD is missing the point of what they need to do: focus on making their hardware easier and better for application developers to use. I don't know how hard it is to optimize for AMD or NVIDIA, but given how many resources NVIDIA puts into making it easy to optimize for their platforms, it's no wonder a lot of application developers go with them over AMD.

 

I'd rather have the hardware vendor help me use their hardware than have them try to get my software to work better. Getting my software to work better is my job, not theirs.

Edited by Mira Yurizaki

7 hours ago, SolarNova said:

At this point we should all be "AMD desperate" with the Sh*t NVidia is pulling with £1000+ 80ti cards.

If things don't change you can fully expect to see future generations have 80ti cards at the £1000+ price point for no other reason than "because they can".

I don't think that's fair. NVIDIA priced them at what the market could handle, just like AMD priced the Radeon VII at what the market would tolerate. If AMD made a 2080 Ti equivalent card, it too would be priced well over €1000. AMD needs to make a lot of money if it is going to keep up production of its new GPUs. Whenever they've come out with a good video card at a good price, they couldn't keep up with demand.


19 minutes ago, Mira Yurizaki said:

 

I'd rather have the hardware vendor help me use their hardware than have them try to get my software to work better. Getting my software to work better is my job, not theirs.

Isn't that what makes Vulkan great? You don't have to ask Nvidia or AMD for help.

Or am I missing something here?


GG LTT, any chance of a good discussion about AMD performing well and having a better future turns into a fanboy shitfest about how bad NVIDIA is.

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


11 minutes ago, GoldenLag said:

Isn't that what makes Vulkan great? You don't have to ask Nvidia or AMD for help

 

Or am I missing something here?

Assuming all of the analogies I make about OpenGL/DX11 vs. Vulkan/DX12 are true, then no, Vulkan makes the problem worse. It's like writing a C program vs. Java or Python: C assumes you know quite a bit about the system to get the most out of it, while Java and Python hold your hand over some lower-level details so you don't have to worry about them.


39 minutes ago, Mira Yurizaki said:

Why? If the driver can fix shortcomings done by a game, then it should fix it.

No, the driver should only optimize what's coming in, so it automatically works for all games, not detect where it's coming from and do something different based on that. A shortcoming in a game is called a bug or badly performing code, and requires a game patch. Doing it the other way around (or refraining from doing so depending on who makes the game) treads into bad territory for a plethora of obvious reasons.

 

39 minutes ago, Mira Yurizaki said:

Also all a driver does is translate the API calls from the application to something the hardware can understand

That, and compilation of all the types of shaders, optimizing the compiled results, and controlling (within limits) caching, memory use, core shifts and core locks, various timing, threading, and sync controls, etc. Programming these APIs goes a little further than pure API function calls, as does (especially) implementing the drivers behind them.

 

39 minutes ago, Mira Yurizaki said:

If there's any "unfairness", it's because optimizations that help both sides are few and far between. It's like making a game on ARM and x86. While there are generic design patterns that can help the application overall, nothing's going to help any architectural differences that you have to account for. If you only have the time and resources to optimize for one, you're going to have to make a decision.

A design pattern is a structure/form for implementing a strategy; it's unrelated to "optimizing". Other than that, if I worded myself clearly the first time (apologies, English is not my first language), what you state here is what I meant as well.

 

39 minutes ago, Mira Yurizaki said:

I'd rather have the hardware vendor help me use their hardware than a hardware vendor trying to get my software to work better. Getting my software to work better is my job, not theirs

Contradicting your first statement here?

39 minutes ago, Mira Yurizaki said:

Why? If the driver can fix shortcomings done by a game, then it should fix it.

The driver should be seen as part of the hardware, mind you.

 

39 minutes ago, Mira Yurizaki said:

The general impression I get from AMD, as far as software is concerned, is that they want to help the application developer achieve overall optimizations. Which is noble and all, but AMD is missing the point of what they need to do: they need to focus on making their hardware easier and better for the application developer to use. I don't know how hard it is to optimize for AMD or NVIDIA, but given how many resources NVIDIA puts into making it easier to optimize for their platforms, it's no wonder a lot of application developers go with them rather than AMD.

 

Overall optimizations should be done by the driver. Strategic optimizations (like the simple program vs. texture-swap example I gave) are up to the developer, and there you may have a point (may, as I'm not up to speed on how well or how frequently both parties document their papers on optimization). That notwithstanding, if part of NVIDIA's spent resources is basically paying/sponsoring devs to focus on them, that's simply a bad thing (especially if it implies that devs who don't participate might not see driver issues resolved that affect just them, even if that dev's code and usage is within spec). If that were the case, that's abuse of power, IMHO.

 

 

 


30 minutes ago, Bartholomew said:

Could you elaborate, or provide source?

I'd like to, but sadly the user deleted his account in the German forum 3DCenter.

I know that he was a developer and, at the time, pissed at nVidia, because he sent them the code of the thing he was developing and after that the objects it generated were vanishing.

 

The user's name at the time was Alpha-Tier, and it was a couple of years ago.

 

What I heard is that nVidia has some big computer farms that, for example, work on these specific optimisations, which is something AMD can't do due to lack of resources.

30 minutes ago, Bartholomew said:

Especially on that second thing, replacing the shaders

I thought it's pretty well known in developer circles that GPU (driver) developers use shader replacement in their drivers for some AAA games, to replace the shaders and get a bit more performance out of them.

 

The accusations are also rather old and came up when nVidia released their GameWorks library. There is the accusation that nVidia does one thing in the library and then does something else in their drivers.


Again, the source is the German forum 3DCenter, where a ton of developers are, one of whom worked on the DOOM engine, for example.
 

Another worked for EA Phenomic...

30 minutes ago, Bartholomew said:

That said, the relation between nVidia's "game ready" drivers and specific games makes my skin crawl as well; a driver should be generic and pure, both from a technical standpoint and for a (more) even playing field for both devs and consumers.

Yeah, that is why AMD is pushing for low-level APIs: they limit the scope of such optimizations, so the GPU manufacturers lose control over the application and can't influence it.

In games using the older DirectX API up to version 9, you can for example force FSAA and AF in the driver even though the application doesn't know anything about it.

With DX10/11 that was changed, and those things no longer work.

 

With DX12 and Vulkan it's restricted even further.

 

The problem with DX12 and Vulkan is that the game engine needs to be developed completely anew, specifically for the new APIs. And the developers need to learn the new APIs as well, not because they're more complicated, but because they're different.

30 minutes ago, Bartholomew said:

I do think that in the long (to be honest, I hope it's short) run, nvidia will shoot itself in the foot with this (indie devs not playing along with their game, and indies get increasingly better chances now that the big studio names are putting out worse and worse titles).

It's a rather dangerous game that only works as long as they can throw money around as they like...

And people want their stuff. But AMD isn't sleeping and is gaining money by the day, as Intel can't deliver processors, so some (like Linus ;)) have to switch to AMD, especially at the lower end; Intel seems to concentrate on the higher end.

 

And to make matters worse, nVidia is bleeding good people; one of them is Tom Petersen...

Because Intel is working on GPUs as well, which look like they could be for consumers too.

 

And there is the connection to the CPU as well: PCIe is not good enough in the HPC area, which is why Gen-Z, CCIX, and other similar interconnects exist...

 

 

30 minutes ago, Bartholomew said:

I really hope that Navi brings AMD what Ryzen did: a happy surprise for consumers, a nasty surprise for the competition :)

I hope that the rumors about it are true. The performance isn't as good as the Radeon VII, but at least Vega could be replaced, at a slightly lower price with a dramatically lower TBP...

 

"Hell is full of good meanings, but Heaven is full of good works"


7 minutes ago, valdyrgramr said:

Isn't that every thread involving Intel, AMD, and/or Nvidia?

 

Yes, but to me it is especially annoying because we don't often get to talk about AMD performing better without it. If the thread is about something Intel or NVIDIA is doing well, it's not exactly news that means anything for competition.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.

