
GameWorks VR To Be Included In Unreal Engine 4, Helping Deliver Up To 50% More Performance

Mr_Troll

This isn't an occasional jest; you do it nearly every time you get into a "discussion" about something like this. Don't kid yourself.

Don't care what you have. When you make comments like you do, it takes away any credibility you may have had.

holy cow, get off it. you act like he just insulted your dead grandmother or something. it's a damn joke get over it.


So AMD is doing what, VR-wise? Are they going to bind some of their libraries somewhere?

 

LiquidVR. They announced theirs way before Nvidia did I believe.


holy cow, get off it. you act like he just insulted your dead grandmother or something. it's a damn joke get over it.

 

I'm tired of the fanboy brigade going around acting like they're all high and mighty because they hold what they think is the superior opinion.


AMD LiquidVR, SteamVR. There, that wasn't so hard.

 

 

NVidia didn't do anything to develop HMC. They even abandoned it for HBM, meaning a massive zero graphics cards in the world use HMC, or will in the near future. So HMC is pushing absolutely nothing when it comes to graphics.

By physics systems, do you mean the obsolete PhysX, which has failed completely in its APEX form? Yeah, that pushed nothing either.

 

NVidia innovated Gsync, which is very nice, but chose to make it an overpriced, proprietary crap solution instead of making it an industry standard. So that anti-competitive, vendor-locked crap is harming the industry more in the long run.

Not to mention PhysX is pointless with Havok being integrated into DX.


I'm tired of the fanboy brigade going around acting like they're all high and mighty because they hold what they think is the superior opinion.

how does throwing around a circlejerk joke about gameworks usually causing things to run like shit due to over-tessellation (or the stupidity of devs in relation to GW) amount to fanboyism or acting high and mighty? even Jayztwocents, the biggest Nvidia fan, will tell you GW makes things run like shit, and he doesn't even have it turned on in reviews he does. again, it's a joke, get over it.


how does throwing around a circlejerk joke about gameworks usually causing things to run like shit due to over tessellation (or the stupidity of devs in relation to GW) amount to fanboyism or acting high and mighty? again it's a joke, get over it.

 

It's the combination of them holding their opinions and then making the jokes that they do out of spite.

 

If I did the same shit in the opposite manner, they would hold it against me and they'd be right to. Because doing that stuff is dumb and it makes you look like an asshole.

 

That's why I'm not getting over it. Plus, no one invited you to the party. I know it's a public forum, but you don't have to ride their cocks and white knight for them, Mr. Mustang.


AMD LiquidVR, SteamVR. There, that wasn't so hard.

 

 

NVidia didn't do anything to develop HMC. They even abandoned it for HBM, meaning a massive zero graphics cards in the world use HMC, or will in the near future. So HMC is pushing absolutely nothing when it comes to graphics.

By physics systems, do you mean the obsolete PhysX, which has failed completely in its APEX form? Yeah, that pushed nothing either.

NVidia innovated Gsync, which is very nice, but chose to make it an overpriced, proprietary crap solution instead of making it an industry standard. So that anti-competitive, vendor-locked crap is harming the industry more in the long run.

Uh, Nvidia was one of the founding members of the HMC Consortium, and no, it didn't abandon it at all. It convinced Oracle to migrate to it for the Fujitsu systems and is pressuring IBM to do the same.

 

PhysX is the most successful gaming physics library on the planet, in more games than any other competitor.

 

It's not a crap solution. It's changeable. That's the beauty of using an FPGA. FreeSync 1.0 has already proven GSync the better solution. If FreeSync actually becomes competitive and eliminates its ghosting issues, Nvidia can make GSync better with firmware updates for the monitors themselves. Put that in your pipe and smoke it.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


"Up to 50%"

Yeah, anything from 0% to 50%. It's never always 50%; you could get 0%.

The stars died for you to be here today.

A locked bathroom in the right place can make all the difference in the world.


Uh, Nvidia was one of the founding members of the HMC Consortium, and no, it didn't abandon it at all. It convinced Oracle to migrate to it for the Fujitsu systems and is pressuring IBM to do the same.

 

PhysX is the most successful gaming physics library on the planet, in more games than any other competitor.

 

It's not a crap solution. It's changeable. That's the beauty of using an FPGA. FreeSync 1.0 has already proven GSync the better solution. If FreeSync actually becomes competitive and eliminates its ghosting issues, Nvidia can make GSync better with firmware updates for the monitors themselves. Put that in your pipe and smoke it.

 

Yeah, a consortium member that has chosen the competing technology for their graphics cards next year. Good job for a founding member. NVidia has not published any plans to implement HMC in any of their products. Not putting their money where their mouth is is hardly impressive, nor does it inspire much confidence in HMC.

 

Basic PhysX running on the CPU is no better than Havok. The graphical effects in APEX are vendor-proprietary, which is not a good thing for the market or the consumer. The results speak for themselves: APEX is only implemented in NVidia-sponsored games. Paying off devs to implement their closed, vendor-specific tech is hardly impressive. Internet Explorer is also the most installed browser in the world; that hardly speaks to quality or merits.

 

It's overpriced and vendor-exclusive, and removes all competition on the monitor controller scene.

There is no such thing as Freesync 1.0. Freesync is a driver. Adaptive Sync exists as one standard. How to implement that standard is up to the monitor controller vendors, just like any graphics API or any other industry standard. You get cheap, underperforming solutions and high-end solutions that are just as good as Gsync (for instance, the Acer ultrawides).

Firmware updates are up to the vendors to implement if they want to. There are no technical limitations per se.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Yeah, a consortium member that has chosen the competing technology for their graphics cards next year. Good job for a founding member. NVidia has not published any plans to implement HMC in any of their products. Not putting their money where their mouth is is hardly impressive, nor does it inspire much confidence in HMC.

Basic PhysX running on the CPU is no better than Havok. The graphical effects in APEX are vendor-proprietary, which is not a good thing for the market or the consumer. The results speak for themselves: APEX is only implemented in NVidia-sponsored games. Paying off devs to implement their closed, vendor-specific tech is hardly impressive. Internet Explorer is also the most installed browser in the world; that hardly speaks to quality or merits.

It's overpriced and vendor-exclusive, and removes all competition on the monitor controller scene.

There is no such thing as Freesync 1.0. Freesync is a driver. Adaptive Sync exists as one standard. How to implement that standard is up to the monitor controller vendors, just like any graphics API or any other industry standard. You get cheap, underperforming solutions and high-end solutions that are just as good as Gsync (for instance, the Acer ultrawides).

Firmware updates are up to the vendors to implement if they want to. There are no technical limitations per se.

Only because Intel ordered such a large volume for Knights Landing are the supply lines choked for a while.

 

Actually Volta is still believed to be an HMC architecture. Nvidia hasn't published anything to the contrary.

 

There's no proof Nvidia's paying anyone. 

 

Acer's ultrawides have ghosting issues out the yin-yang.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Not everything is about hardware. You can have the best there is, but if your software is bad, that's as good as your hardware will be able to perform.

That's taking the discussion out of context. In this context, we're talking about them locking both developers and consumers into their ecosystem in an insidious way that would make Microsoft or Intel look mild.


Did you even read this, or are you just going "it's gameworks, so it's shit!"? Because tessellation has nothing to do with this.

NO.

 

LOGIC IS NOT ALLOWED HERE, THIS IS A NO LOGIC ZONE.


Only because Intel ordered such a large volume for Knights Landing are the supply lines choked for a while.

 

Actually Volta is still believed to be an HMC architecture. Nvidia hasn't published anything to the contrary.

 

There's no proof Nvidia's paying anyone. 

 

Acer's ultrawides have ghosting issues out the yin-yang.

 

Pretty weak counterargument, Patrick.

 

So a founding consortium member is not allowed access to the supply? Can't be much of an influential member then.

 

Volta was indeed designed to use HMC, but development was much slower than HBM's. What Volta will have is irrelevant, as the chip won't be out until 2018! Who knows what tech is best/most mature by then. It does not change the fact that NVidia opted out of HMC in favour of HBM for at least 2-3 more years.

 

No, of course not. Devs just love to market a third-party company's logos and slogans in their games and game intros. Maybe they just love the colours and shapes? Don't be naive; you should be too smart for that. Value/sponsorships can be delivered in other ways than monetary: hardware, access to proprietary software, direct collaboration with driver programmers, etc. A shitload of Quadro cards is probably the industry standard in NVidia-sponsored games.

 

Not really. The ghosting that is there is normal for the LG IPS panels. Actually, the NVidia model produces overdrive overshoot and also adds more latency in the FPGA. The Freesync version does not have more ghosting than normal LG IPS monitors, but either way the differences are minuscule in practice.

 

Freesync model http://www.tftcentral.co.uk/reviews/content/acer_xr341ck.htm#response_times

Gsync model http://www.tftcentral.co.uk/reviews/content/acer_predator_x34.htm#response_times

 

Also check out the input lag part. Oh and the colour reproduction of the Freesync version is much better too (0.4 to 1 deltaE on Freesync versus 0.6 to 2.7 on Gsync calibrated).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Pretty weak counterargument, Patrick.

So a founding consortium member is not allowed access to the supply? Can't be much of an influential member then.

Volta was indeed designed to use HMC, but development was much slower than HBM's. What Volta will have is irrelevant, as the chip won't be out until 2018! Who knows what tech is best/most mature by then. It does not change the fact that NVidia opted out of HMC in favour of HBM for at least 2-3 more years.

No, of course not. Devs just love to market a third-party company's logos and slogans in their games and game intros. Maybe they just love the colours and shapes? Don't be naive; you should be too smart for that. Value/sponsorships can be delivered in other ways than monetary: hardware, access to proprietary software, direct collaboration with driver programmers, etc. A shitload of Quadro cards is probably the industry standard in NVidia-sponsored games.

Not really. The ghosting that is there is normal for the LG IPS panels. Actually, the NVidia model produces overdrive overshoot and also adds more latency in the FPGA. The Freesync version does not have more ghosting than normal LG IPS monitors, but either way the differences are minuscule in practice.

Freesync model http://www.tftcentral.co.uk/reviews/content/acer_xr341ck.htm#response_times

Gsync model http://www.tftcentral.co.uk/reviews/content/acer_predator_x34.htm#response_times

Also check out the input lag part. Oh and the colour reproduction of the Freesync version is much better too (0.4 to 1 deltaE on Freesync versus 0.6 to 2.7 on Gsync calibrated).

Micron and Intel have their own fabs; Nvidia doesn't. And Micron has been supplying Oracle with it for 3 years now. Development beat HBM to market. You have a weak ability to choose your best options for criticism.

Volta is Q1 2017. It's going into the DOE supercomputers based on Power 9 due out at the same time. Please do keep up.

If Nvidia wants advertisement credit for providing the bedrock foundation of a game, I say let them have their dues. If the games don't turn out well, it's egg on Nvidia's face.

Game studios don't tend to use high-end hardware except for testing at the end of development, and that hardware tends to get flipped for cash pretty quickly (I'm currently consulting at Epic Games to optimize Unreal's threading model).

That measurement is quite old and far out of date.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Nvidia may want to consider changing their branding for things like this.

 

Things like 'Gameworks' and 'the way it's meant to be played' have become bad words to many PC gamers due to negative experiences in the last few years.

 

The new video showing off gameworks in AC is getting a like:dislike ratio of 10:7.

link

Which is not what you expect to happen on a non-offensive video about some graphical effects.


Nvidia may want to consider changing their branding for things like this.

Things like 'Gameworks' and 'the way it's meant to be played' have become bad words to many PC gamers due to negative experiences in the last few years.

The new video showing off gameworks in AC is getting a like:dislike ratio of 10:7.

link

Which is not what you expect to happen on a non-offensive video about some graphical effects.

How much you want to bet it's AMD paying that fraudulent reviewer firm that was in the news recently?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


How much you want to bet it's AMD paying that fraudulent reviewer firm that was in the news recently?

What are you talking about? Which fraudulent reviewer?

 

What I was saying is maybe Nvidia shouldn't use the 'gameworks' brand name for this VR initiative.


What are you talking about? Which fraudulent reviewer?

 

What I was saying is maybe Nvidia shouldn't use the 'gameworks' brand name for this VR initiative.

http://arstechnica.com/tech-policy/2015/10/amazon-sues-1114-reviewers-some-selling-reviews-for-5/

 

People selling their opinions for money.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


How much you want to bet it's AMD paying that fraudulent reviewer firm that was in the news recently?

The funniest thing is, I would call bullshit if not for the fact that many people do viewbotting on twitch, and other companies also do stuff along that line.


The funniest thing is, I would call bullshit if not for the fact that many people do viewbotting on twitch, and other companies also do stuff along that line.

Funny thing is - AMD doesn't have the spare cash to bribe people :D

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


Gameworks is just a library of software, and now it's going to include this VR performance enhancement. AMD is even working on their own to help out with VR performance.

 

 

 

Like Nvidia or not, they're pushing graphical advancements forward. AMD is as well, but not nearly as much.

....

3-4 year old AMD GPUs have enough horsepower to compete with and beat the latest and greatest nVidia architecture (not the 980 Ti)... and still... nVidia's pushing graphical advancements... HBM, XDMA, LiquidVR are just some words...

MARS_PROJECT V2 --- RYZEN RIG


 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 


Funny thing is - AMD doesn't have the spare cash to bribe people :D

AMD doesn't have 5 bucks per review? Damn, they're doing worse than I thought.

 

I don't think they're actually doing it, because I don't KNOW if they're actually doing it. I'm just saying it's always a possibility.


AMD doesn't have 5 bucks per review? Damn, they're doing worse than I thought.

I don't think they're actually doing it, because I don't KNOW if they're actually doing it. I'm just saying it's always a possibility.

True - though I doubt they managed to pay 90% of reviewers, since 90% of reviewers are consistent with each other.

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


....

3-4 year old AMD GPUs have enough horsepower to compete with and beat the latest and greatest nVidia architecture... and still... nVidia's pushing graphical advancements...

 

Yep, they are. It should be clear as day, but apparently some people are seething with so much rage toward a company that they can't see that.

 

I've acknowledged that AMD has been on essentially the same architecture for a long time now and still keeping up with Nvidia. They're starting to hit a roadblock, though. That can be seen in the Fury cards.


Yep, they are. It should be clear as day, but apparently some people are seething with so much rage toward a company that they can't see that.

 

I've acknowledged that AMD has been on essentially the same architecture for a long time now and still keeping up with Nvidia. They're starting to hit a roadblock, though. That can be seen in the Fury cards.

I agree with that, but that's what I call technical advancement: being able to provide and sustain performance for a very long period of time. That's why AMD cards are the GO TO in the second-hand market. I have always seen people recommending 5770s, 7870 XTs, 7970s, 6850s for the correct price.

I haven't actually seen anybody recommending GTX 680s, 670s or, god, the infamous 660 Ti anymore.

MARS_PROJECT V2 --- RYZEN RIG


 CPU: R5 1600 @3.7GHz 1.27V | Cooler: Corsair H80i Stock Fans@900RPM | Motherboard: Gigabyte AB350 Gaming 3 | RAM: 8GB DDR4 2933MHz(Vengeance LPX) | GPU: MSI Radeon R9 380 Gaming 4G | Sound Card: Creative SB Z | HDD: 500GB WD Green + 1TB WD Blue | SSD: Samsung 860EVO 250GB  + AMD R3 120GB | PSU: Super Flower Leadex Gold 750W 80+Gold(fully modular) | Case: NZXT  H440 2015   | Display: Dell P2314H | Keyboard: Redragon Yama | Mouse: Logitech G Pro | Headphones: Sennheiser HD-569

 

