
Vega finally beats RTX? Vulkan strikes again!

BluJay614
1 hour ago, pas008 said:

the game is just a mess lol, not just the rtx inventory issue; many are needing to go through a checklist just to play

just like many other ubisoft games

If what you say is true, then answer this question:
 

WHY is it only happening on Turing cards??

Why does it work on Maxwell, Pascal???
Why does it seem like there aren't many Problems with AMD either??

 

That doesn't add up.

 

And it feels like you are just defending nVidia at all costs, which comes at the cost of the consumer: because you are defending them, they have no incentive to fix the issue.

 

Remember: It works on Pascal, doesn't on Turing.

"Hell is full of good meanings, but Heaven is full of good works"


3 minutes ago, mr moose said:

 

Yes, but to me it is especially annoying because we don't often get to talk about AMD performing better without it. If the thread is about something Intel or NVIDIA is doing well, it's not exactly news that means anything for competition.

That's why I've pretty much given up on conversation about these types of things on the internet. Everyone is too busy rooting for a team like it's a fucking football game.

i7 2600k @ 5GHz 1.49v - EVGA GTX 1070 ACX 3.0 - 16GB DDR3 2000MHz Corsair Vengeance

Asus P8Z77-V LK - 480GB Samsung 870 EVO w/ W10 LTSC - 2x1TB HDD storage - 240GB SATA SSD w/ W7 - EVGA 650W 80+ Gold G2

3x 1080p 60Hz Viewsonic LCDs, 1 glorious Dell CRT running at anywhere from 60Hz to 120Hz

Model M w/ Soarer's adapter - Logitech G502 - Audio-Technica M20x - Cambridge SoundWorks speakers w/ woofer

 


8 hours ago, Princess Cadence said:

Competition intensifies. Vulkan is probably going to dethrone DX12 sooner rather than later, especially since it can be used natively on other OSes without needing to port everything.

 

nVidia better get their stuff together before they lose the crown to AMD, while Intel is also getting started... things are not looking so great for team Green right now.

Yeah. It'd be interesting to see if Nvidia will adopt Vulkan on their cards. I vaguely remember something about AMD opening Vulkan up for the competition to use.


10 minutes ago, Bartholomew said:

No, the driver should only optimize what's coming in, so it automatically works for all games, not detect where it's coming from and do something else based on that. A shortcoming in a game is called a bug or bad performance coding and requires a game patch. Doing it the other way around (or refraining from doing so depending on who makes the game) is treading into bad territory for a plethora of obvious reasons.

A shortcoming in my mind is literally anything that prevents a game from running faster. An often-used example I like to pick out is DX11 deferred contexts. It was possible to speed up the CPU side of rendering in DX11 by using deferred contexts. NVIDIA saw value in it and implemented support for it if the game used it, then implemented it at a driver-wide level, allowing deferred contexts to be "automatically done." AMD, as far as I know, never implemented support for this, period.

 

From the looks of it, deferred contexts may be an advanced feature that game studios who want to do their own thing may not have the experience or resources to take advantage of.
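For anyone unfamiliar with the feature being discussed, here is a minimal sketch of how a game would use D3D11 deferred contexts. The API calls (CreateDeferredContext, FinishCommandList, ExecuteCommandList) are the real D3D11 ones; the function names and the placeholder draw are mine, purely for illustration:

```cpp
// Minimal sketch of DX11 deferred contexts: a worker thread records draw calls
// into a command list while the immediate context stays free; the main thread
// replays the list later. Error handling omitted for brevity.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D11CommandList> RecordOnWorkerThread(ID3D11Device* device)
{
    ComPtr<ID3D11DeviceContext> deferred;
    device->CreateDeferredContext(0, &deferred);      // one deferred context per thread

    // Bind state and issue draws on 'deferred' exactly as you would on the
    // immediate context; nothing reaches the GPU yet.
    deferred->Draw(3, 0);                              // placeholder draw call

    ComPtr<ID3D11CommandList> commandList;
    deferred->FinishCommandList(FALSE, &commandList);  // close the recording
    return commandList;
}

void SubmitOnMainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
{
    immediate->ExecuteCommandList(list, FALSE);        // replay the recorded work
}
```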

 

Quote

A design pattern is structure/form for implementing a strategy; it's unrelated to "optimizing". Other than that, if I worded myself clearly the first time (apologies, English is not my first language), what you state here is what I meant as well.

Design patterns or strategies can affect the overall performance of the application depending on the use case.

 

Quote

Contradicting your first statement here?

The driver should be seen as part of the hardware mind you.

No, because they're not touching my application by fixing something at the driver level.

 

Quote

That notwithstanding, if part of NVIDIA's spent resources is basically paying/sponsoring devs to focus on them, that's simply a bad thing (especially if it implies that devs who don't participate might not see driver issues resolved that affect just them, even if that dev's code and usage is within spec). If that were the case, that's abuse of power imho.

You can argue all day about their contractual clauses for using the GameWorks program. But at the end of the day:

  • A game developer needs to release a game yesterday
  • The developer's stakeholders want all these whizzbang features
  • NVIDIA can help them get all of these things in a shorter amount of time

These people are commercial software developers. This type of environment demands the best from you in a short amount of time. So anything that helps make the process go faster is better.


3 minutes ago, valdyrgramr said:

You'll find this on most tech forums.  

-Circle jerking of insert company here

 

-Insert whataboutism of insert company here

-fanboys snickering

-derailing intensifies

That's why I don't bother forum hopping.

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


2 minutes ago, corsairian said:

Yeah. It'd be interesting to see if Nvidia will adopt Vulkan on their cards. I vaguely remember something about AMD opening Vulkan up for the competition to use.

Under what rock have you been living for the past couple of years? NVIDIA has supported Vulkan for ages. Vulkan started as AMD Mantle, and AMD already gave it away to Khronos, the maintainer of OpenGL and now Vulkan. Both AMD and NVIDIA support Vulkan, and it has generally delivered excellent performance on both, sometimes even higher than DX12.


At the end of the day though, Vulkan is biased towards AMD from the get-go by the very nature of it being based on Mantle. AMD didn't have to do much work to support it; they basically already had. NVIDIA, however, had to figure out how to implement Vulkan support on their end. This is unlike OpenGL, which was developed by SGI; there NVIDIA and ATI both had to figure out how to implement it.


1 minute ago, Mira Yurizaki said:

At the end of the day though, Vulkan is biased towards AMD from the get-go by the very nature of it being based on Mantle. AMD didn't have to do much work to support it; they basically already had. NVIDIA, however, had to figure out how to implement Vulkan support on their end. This is unlike OpenGL, which was developed by SGI; there NVIDIA and ATI both had to figure out how to implement it.

Very little is left from Mantle though; Vulkan is very much its own thing now, and one of the lead devs is an NVIDIA employee.

Btw, Vulkan supports middleware layers so that devs can code at a higher level.

5 minutes ago, Mira Yurizaki said:

A shortcoming in my mind is literally anything that prevents a game from running faster. An often-used example I like to pick out is DX11 deferred contexts. It was possible to speed up the CPU side of rendering in DX11 by using deferred contexts. NVIDIA saw value in it and implemented support for it if the game used it, then implemented it at a driver-wide level, allowing deferred contexts to be "automatically done." AMD, as far as I know, never implemented support for this, period.

 

From the looks of it, deferred contexts may be an advanced feature that game studios who want to do their own thing may not have the experience or resources to take advantage of.

 

Design patterns or strategies can affect the overall performance of the application depending on the use case.

 

No, because they're not touching my application by fixing something at the driver level.

 

You can argue all day about their contractual clauses for using the GameWorks program. But at the end of the day:

  • A game developer needs to release a game yesterday
  • The developer's stakeholders want all these whizzbang features
  • NVIDIA can help them get all of these things in a shorter amount of time

These people are commercial software developers. This type of environment demands the best from you in a short amount of time. So anything that helps make the process go faster is better.

From what I remember, AMD can't implement it driver-wide; NVIDIA doing the scheduling on the CPU gave them more freedom to change the game's code.


3 minutes ago, Mira Yurizaki said:

Design patterns or strategies can affect the overall performance of the application depending on the use case.

Yes, if you pick the wrong strategy/algorithm for the use case, performance could be worse, and vice versa. That holds with or without design patterns (which are just form and convention) being used to implement the strategy. That's why it's called a design pattern and not an algorithm or architecture...

 

7 minutes ago, Mira Yurizaki said:

No, because they're not touching my application by fixing something at the driver level.

I'll violate my DRY principle here and repeat myself: the driver is not just an API but also a compiler (think shader code); as such, when it bluntly replaces shaders with its own, it's basically replacing the game's code.
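Purely to illustrate the mechanism being described here (a hypothetical sketch, not any vendor's actual driver code; every name in it is invented), a driver's shader-compile path could recognise a known game's shader by a fingerprint of its bytecode and compile a hand-tuned replacement instead:

```cpp
// Hypothetical sketch of driver-side shader replacement: fingerprint the
// incoming bytecode and, if it matches a known game shader, compile a bundled
// replacement instead of what the game shipped. All names are invented.
#include <cstdint>
#include <unordered_map>
#include <vector>

using Blob = std::vector<uint8_t>;

// Cheap FNV-1a fingerprint of the shader bytecode.
static uint64_t HashBytecode(const Blob& bytecode) {
    uint64_t h = 1469598103934665603ull;
    for (uint8_t b : bytecode) { h ^= b; h *= 1099511628211ull; }
    return h;
}

// Stand-in for the real backend compiler (bytecode -> GPU machine code).
static Blob CompileToGpuIsa(const Blob& bytecode) { return bytecode; }

// Table shipped inside the driver: fingerprint of a known game's shader
// -> hand-tuned replacement bytecode (left empty in this sketch).
static const std::unordered_map<uint64_t, Blob> kReplacements = {};

Blob DriverCompileShader(const Blob& gameBytecode) {
    auto it = kReplacements.find(HashBytecode(gameBytecode));
    if (it != kReplacements.end())
        return CompileToGpuIsa(it->second);   // the game's own code is silently swapped out
    return CompileToGpuIsa(gameBytecode);     // every other shader takes the normal path
}
```

Whether any given driver actually does this for a particular title is, of course, exactly the part outsiders can't see.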

 

11 minutes ago, Mira Yurizaki said:
  • game developer needs to release a game yesterday
  • The developer's stakeholders want all these whizzbang features
  • NVIDIA can help them get all of these things in a shorter amount of time

These people are commercial software developers. This type of environment demands the best from you in a short amount of time. So anything that helps make the process go faster is better.

So it would be in our best interest, then, if nvidia bought all the studios and their devs so they can work in the same building and be more productive? I have a feeling that wouldn't work out all that well for consumers.


1 minute ago, Bartholomew said:

I'll violate my DRY principle here and repeat myself: the driver is not just an API but also a compiler (think shader code); as such, when it bluntly replaces shaders with its own, it's basically replacing the game's code.

Unless you know what driver fixes are actually doing, at best we're speculating what's going on here. For all I know, I'm writing code that 99% of the industry agrees should generate some outcome, but the driver isn't doing it.

 

1 minute ago, Bartholomew said:

So it would be in our best interest, then, if nvidia bought all the studios and their devs so they can work in the same building and be more productive? I have a feeling that wouldn't work out all that well for consumers.

If that's what you want to believe based on my words, sure.


22 minutes ago, Mira Yurizaki said:

You can argue all day about their contractual clauses for using the GameWorks program. But at the end of the day:

  • A game developer needs to release a game yesterday
  • The developer's stakeholders want all these whizzbang features
  • NVIDIA can help them get all of these things in a shorter amount of time

These people are commercial software developers. This type of environment demands the best from you in a short amount of time. So anything that helps make the process go faster is better.

The problem with the GameWorks stuff is that it's often forced onto the developers by management.

Developers don't want to use that shit but have to.


IIRC this interview is pretty damning for nVidia:
http://www.pcgameshardware.de/The-Witcher-3-Spiel-38488/Specials/The-Witcher-3-welche-Grafikkarte-welche-CPU-1107469/2/

 

At least the German part was.

"Hell is full of good meanings, but Heaven is full of good works"


48 minutes ago, Stefan Payne said:

I thought that it's pretty well known in developer circles that GPU (driver) developers use shader replacement in their drivers for some AAA games, to replace the shader and get a bit more performance out of it.

That's something different than "replacing it with something completely different"; but that's a moot point if it's done by the driver, since then it's impossible to know anyway. Very bad practice if you ask me; it would be better to just "send it back" to the game dev, explaining the optimisation, and have it in an update of the game. That would support knowledge transfer instead of locking knowledge in.

 

48 minutes ago, Stefan Payne said:

hope that the rumors about it are true. The Performance is not as good as Radeon 7 but at least VEGA could be replaced, for a slightly lower price with a dramatically lower TBP...

I hope (based on nothing more than, well, hope :)) for better-than-Radeon-VII performance and more than 8GB of memory, at the current price point of the R7 (a man is allowed to dream, right?)


18 minutes ago, Mira Yurizaki said:

At the end of the day though, Vulkan is biased towards AMD from the get go by the very nature of it being based on Mantle.

The thing is that developers had been asking for a low-level API for years. The official statement from DICE about Mantle is that they asked (or begged) nVidia to do it, which nVidia ignored.
Then they asked AMD, which thought it was a good idea and started the project, which led us to having DX12 and Vulkan.

 

 

 

18 minutes ago, Mira Yurizaki said:

This unlike OpenGL which was developed by SGI.

...in the late 80s/early 90s...

In a time when even texturing wasn't a thing, let alone shaders.

Every attempt to modernize it was blocked by various people in the Khronos consortium.

 

So AMD giving them the documentation for Mantle might be something they were all mostly happy with (well, except for nVidia, obviously)...

18 minutes ago, Mira Yurizaki said:

NVIDIA and ATI both had to figure out how to implement it.

That wasn't that hard with the Hardware from the mid 90s that didn't even have shaders and other interesting stuff.


Fun fact:
We have DirectX because the Windows NT crew didn't want to give their Code to the Windows 9x Crew.

So Windows 9x Crew invented their own 3D API.

"Hell is full of good meanings, but Heaven is full of good works"


Calm down, people; let the actual benchmarks come out from actual reviewers across a range of game genres, then we can celebrate. We've been pointlessly celebrating AMD pre-release hype for over a decade now, with disappointing follow-ups.

Details separate people.


57 minutes ago, Stefan Payne said:

If what you say is true, then answer this question:
 

WHY is it only happening on Turing cards??

Why does it work on Maxwell, Pascal???
Why does it seem like there aren't many Problems with AMD either??

 

That doesn't add up.

 

And it feels like you are just defending nVidia at all costs, which comes at the cost of the consumer: because you are defending them, they have no incentive to fix the issue.

 

Remember: It works on Pascal, doesn't on Turing.

Not defending nvidia; you are using a shit example when the game has been riddled with problems consistently since its release.

Typical Ubisoft shit.

A quick google tells you this if you omit rtx lol

 


1 hour ago, mr moose said:

GG LTT,  any chance to have a good discussion about AMD performing well and having a better future turns into a fanboy shitfest about how bad NVIDIA is. 

 

Just out of curiosity, a question (noticing that in this thread "fanboys", and in some other thread the other day "fangirls", was your valuable, informative, topic-related, non-self-regulating-as-per-the-CoC-in-your-signature ;) addition to the thread):

 

I am currently on an nvidia gpu, but I like and have owned amd stuff also, and my current cpu is intel.

I'm planning a ryzen 3xxx build, and hopefully a navi build in the future.

Sometimes I recommend intel, amd, nvidia... (whatever fits best).

 

Now my question: am I a "fantransgender"? ?


40 minutes ago, pas008 said:

Not defending nvidia; you are using a shit example when the game has been riddled with problems consistently since its release.

Typical Ubisoft shit.

A quick google tells you this if you omit rtx lol

You didn't answer any of my questions.

 


Answer the following:

a) why hasn't that been fixed yet?

b) who should fix it?

c) why does it work with other cards?

d) why does only Turing seem to have the Problems?


That people have problems with a game is normal. Google something and you'll find dozens of reports of problems.

There are other issues as well, for example NIER:Automata, which also had crash issues with GCN chips that were fixed sooner rather than later.

 

"Hell is full of good meanings, but Heaven is full of good works"


21 minutes ago, Stefan Payne said:

You didn't answer any of my questions.

 


Answer the following:

a) why hasn't that been fixed yet?

b) who should fix it?

c) why does it work with other cards?

d) why does only Turing seem to have the Problems?


That people have problems with a game is normal. Google something and you'll find dozens of reports of problems.

There are other issues as well, for example NIER:Automata, which also had crash issues with GCN chips that were fixed sooner rather than later.

 

Lol what's your point

You act like it's nvidia's fault that ubisoft continues to release problematic games

Lol like patches to remove avx 

And many more

You know you have seen many of the topics here

Do i need to continue?

Ubisoft period

Enough said, or do I need to link everything for you with their releases and disasters of games from the last few years?

 

 

 


5 hours ago, Bartholomew said:

-snip-

Yes, I understand this, but you've also missed the point: what's the percentage difference between the two GPUs in raw math capability? Something that vastly different (seriously, it's a 43% difference) is just never going to perform close to the other unless one of them has not been optimized for. It's just never going to happen without that situation.

 

And it's not because of Vulkan either; Turing has all the hardware improvements needed to gain all the advantages from such an API that Vega can, where Pascal does not.


6 hours ago, Mira Yurizaki said:

Which makes me wonder where the adoption of Vulkan is and how easy it is to work with. Looking online at DX11 vs DX12 programming tells me that unless you really know what you're doing or you really need the extra performance, start with DX11.

 

If Vulkan is like DX12, then to me it should follow that it should have a similar level of "difficulty."

I think most are in a holding pattern, waiting for game engine developers to get the majority of the work done so they can have a more DX11/DX12-like experience. Middleware is still king and low-level APIs aren't going to change that. It's also vastly better that you can optimize rather than have to; you know what they say about reinventing the wheel.


1 hour ago, Bartholomew said:

Just out of curiosity, a question (noticing that in this thread "fanboys", and in some other thread the other day "fangirls", was your valuable, informative, topic-related, non-self-regulating-as-per-the-CoC-in-your-signature ;) addition to the thread):

 

I am currently on an nvidia gpu, but I like and have owned amd stuff also, and my current cpu is intel.

I'm planning a ryzen 3xxx build, and hopefully a navi build in the future.

Sometimes I recommend intel, amd, nvidia... (whatever fits best).

 

Now my question: am I a "fantransgender"? ?

Well, my valuable and informative engagements in such topics are usually misrepresented by the fan talk and devolve into accusations of defending a company simply because I want to be realistic and fair, so sometimes I just sidestep it and point out the uselessness of such posts in the hope that the topic actually becomes good again. Judging by the reactions and engagement with my posts, I am not alone in my feelings on the topic.

 

 

Do you see/feel a need to shit talk other brands when the topic can be adequately discussed without their mention?  That would make you a fan *insert gender here*. 

 

Reading through this thread there are some exceptional posts that highlight why AMD is not shit and why they are performing better, but at the same time point out some realities about the industry and the context behind it all. They are great, they can be expanded on, and we can all learn something and have hope for the future. However, on the other hand you have certain people who can't let a thread like this go by without using it as an excuse to crap all over other brands. People who cherry-pick one or two articles that barely prove anything, then use them to loudly and proudly pontificate accusations of anti-consumer practice. Unfortunately it is the same people making the same claims in every thread.

 

 

It's not rocket science: if the idea that AMD has better-performing hardware under certain conditions upsets you, then you are a fanboy; if it upsets you that there is a reason why such performance is limited and may be nothing to write home about, then you are a fanboy; if you need to bring up the old "gameworks is shit" argument in a thread that has next to nothing to do with it, then you are a fanboy.

 

TL;DR

I like hearing about AMD's improvements, but it is very frustrating to have to wade through, and avoid debates with, massive posts of useless anti-NVIDIA shit in order to have a decent and honest discussion about AMD.

 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


33 minutes ago, Bartholomew said:

Now my question: am I a "fantransgender"? ?

Am I? :D

As I'm writing these lines from an Intel/nVidia (i7-3930K, GT710) System (2nd Hand) and have another Intel (LGA1366) Board in a Box and a ton of Intel CPUs in the cupboard (2 LGA2011 CPUs, 2 1366, a shit ton of Socket P and some Socket M ones)...

 

But yeah, I already have two Ryzen Systems running (2400G in my Office PC and Ryzen 1700x).

And the Ryzen 2400G replaced an Intel setup (i3-4150, MSI H81I)...

"Hell is full of good meanings, but Heaven is full of good works"


41 minutes ago, pas008 said:

Lol what's your point

That it's a driver issue on nVidia's side that doesn't affect Pascal and earlier, only Turing.

It's not something that the French could fix from their side; it has to be fixed in the driver. Which they tried and failed to do.

 

Quote

You act like it's nvidia's fault that ubisoft continues to release problematic games

No, you dismiss the problems that only happen with one nVidia generation, while all other cards are not affected by that issue.


Now you get mad and call other people names...

 

Quote

Lol like patches to remove avx 

...so that it runs on old CPUs...

Remember all those threads about "argh, Ubi Game XXX doesn't run with my Core 2 Duo E8400"...

 

Quote

You know you have seen many of the topics here

I already said that. If you go looking for problems, you'll find them.

But not with THAT regularity. THAT was my point.

 

Like "got VEGA, Game Y crashes after 5min Gameplay".

I had a similar issue with NIER:Automata, which is also known to cause problems on graphics cards that are not 1st-generation AMD GCN (which is what I used to play it).

But those problems were fixed, IIRC on AMD's side. The game wasn't patched.

 

My point wasn't that it doesn't happen.

My point was that it gets fixed more or less quickly on one side, while on the other side we have people like you who claim it's the software manufacturer's fault and shield the GPU manufacturer from criticism...

 

Which then prevents the problem from being fixed!
THAT is the real issue!
That the shit isn't really fixed.

 

So if you want to play Wildlands, you'd best remove the Turing card and put something else in. That can't be the solution, especially after 8 months!

And it was partially fixed with a driver update, which now allows the inventory to be opened, but it crashes randomly instead.

 

Quote

Ubisoft period

Do you have any proof of your claim, or is it just you shielding nVidia from criticism?
If it were on Ubi's side, it would have been fixed, PERIOD.

 

Quote

Enough said, or do I need to link everything for you with their releases and disasters of games from the last few years?

LOOK OVER THERE; A 3 HEADED MONKEY!!!11

"Hell is full of good meanings, but Heaven is full of good works"


Hmm.. I thought I already posted this. 

 

The reason why this game runs so well on Radeon right now:

 

- It is optimized for Radeon and has Shader Intrinsics, Rapid Packed Math and Async Compute support.

- Nvidia hasn't released a game-ready driver for it.

| Intel i7-3770 @ 4.2GHz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800MHz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

