AMD's 3xx series tessellation "not" improved - performance boost comes from the drivers

zMeul

I will buy everyone here the first Intel desktop card if that happens. According to him, it should happen by 2020 or 2021.

 

lol, they won't exist. The dGPU will be all but phased out... at least low/mid-range cards will be. So be prepared to buy everyone TITAN-esque desktop cards =D.


I will buy everyone here the first Intel desktop card if that happens. According to him, it should happen by 2020 or 2021.

Why would there be a desktop card if they are dropping SoCs?

System CPU : Ryzen 9 5950 doing whatever PBO lets it. Motherboard : Asus B550 Wifi II RAM 80GB 3600 CL 18 2x 32GB 2x 8GB GPUs Vega 56 & Tesla M40 Corsair 4000D Storage: many and varied small (512GB-1TB) SSD + 5TB WD Green PSU 1000W EVGA GOLD

 

You can trust me, I'm from the Internet.

 


Why would there be a desktop card if they are dropping SoCs?

That's the point. Nvidia will have a monopoly on PC gaming if AMD dies. It's that simple.


I will buy everyone here the first Intel desktop card if that happens.

Except Intel discrete graphics cards are nothing new:

[Image: Intel740-based FastWare AG240G graphics card, rev. 2.2, top view]


That's the point. Nvidia will have a monopoly on PC gaming if AMD dies. It's that simple.

 

It wouldn't. If ATi goes to Intel, they get access to *some protocol shit and further integrate their architecture into GPU production to match Nvidia. Nvidia in turn gets access to AMD's *protocol shit and starts giving Intel a run for their money in the same market.

 

Whoever has the most money when it starts has the best chance of winning.... which is Intel.


That's the point. Nvidia will have a monopoly on PC gaming if AMD dies. It's that simple.

It's not the point. You're assuming that future manufacturing processes won't make it more viable for high-speed graphics to reside on the CPU.

System CPU : Ryzen 9 5950 doing whatever PBO lets it. Motherboard : Asus B550 Wifi II RAM 80GB 3600 CL 18 2x 32GB 2x 8GB GPUs Vega 56 & Tesla M40 Corsair 4000D Storage: many and varied small (512GB-1TB) SSD + 5TB WD Green PSU 1000W EVGA GOLD

 

You can trust me, I'm from the Internet.

 


23" Viewsonic 1080p. You people are crazy or not looking if you can't see the differences.

Well, I am on a phone so... I was just being Captain Obvious.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Maybe that's the problem.

 

Look at the comparison below. The bottom shot yields way higher framerates; the top one only serves to make the game run slower for everyone. Fortunately for AMD users, we have a way to force the bottom preset. If CD Projekt Red had done their job properly, it would not be necessary and everybody would be enjoying HairWorks with high framerates. If you lower it further to x8, I guess the difference is visible if you are really looking for it.

 

You cannot blame CDPR on this one. They have publicly stated they cannot optimize it for AMD, which means they don't have source code access. Considering they have done a lot with HairWorks, the fact that they haven't put in a multiplier slider must mean they are not able to. In that case, this is all on NVidia, wasting everyone's performance.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I'd really like to see a 390X 8GB vs Vapor 290X 8GB with same clocks and the same drivers to see what's what. That's the only thing that interests me.

Lol what do you expect to see? If they're using the same drivers then the performance will be the exact same (within the margin of error ofc).

MacBook Pro 15' 2018 (Pretty much the only system I use)


Lol what do you expect to see? If they're using the same drivers then the performance will be the exact same (within the margin of error ofc).

Exactly that. If it's the same, then we have proof on paper (screen) that it's the same, and these stock 290X and 290X 4GB vs 390X 8GB debates can end. And even if we know it's the same, I want to see the results for myself.

The ability to google properly is a skill of its own. 


Ty guys. You guys (especially @patrickjp93) are giving me a conceptual understanding of modern microprocessors... even though I have to read it at 0.05 speed to try to understand most of it (failing partly, of course).

 

;)

We all have to start from somewhere. No worries :)

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


What @patrickjp93  is giving you is far from an education. He is throwing all of his bias for certain companies into little balls of fact and rumor.

Accuse all you like. If I thought AMD could be competitive long-term, I wouldn't be saying this. Intel and Nvidia both need competition. AMD simply can't deliver. AMD's cadence is off-time and leaves it unable to capture the marketshare it needs to make money and keep Nvidia on its toes, and it's far behind Intel and not theoretically able to surpass it without the hat trick of the decade in CPUs. Nvidia's gunning for AMD to die sooner rather than later, while Intel's holding off on features in desktop Skylake. If you can't read tea leaves that obvious, there's little helping you.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Except Intel discrete graphics cards are nothing new:

 

Except that's an Imagination Technologies chip manufactured verbatim by Intel.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


...

 

Okay, out of curiosity, is this the only name you use on forums?

Pretty much.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Stop talking? Lol.

 

All I said was speculation on my part. I can't speculate about something AMD is doing?

Ignore him. Tech hog is as diehard an AMD fanchild as anyone else on the web. Despite the fact he has no basis to discredit anything I've said, he will attack anything I say that is even slightly anti-AMD with an unyielding sense of self-importance.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That's the point. Nvidia will have a monopoly on PC gaming if AMD dies. It's that simple.

No it isn't. The FTC would act to ensure competition is upheld. And Intel has a vendetta against Nvidia. It will do everything it can to drive Nvidia into the floor if it gets the ATI IP.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It wouldn't. If ATi goes to Intel, they get access to *some protocol shit and further integrate their architecture into GPU production to match Nvidia. Nvidia in turn gets access to AMD's *protocol shit and starts giving Intel a run for their money in the same market.

 

Whoever has the most money when it starts has the best chance of winning.... which is Intel.

He's an Intel fanboy. He's always going to say things which point to Intel's success.

 

It's not the point. You're assuming that future manufacturing processes won't make it more viable for high-speed graphics to reside on the CPU.

No, there will never be a point when contemporary integrated graphics will match contemporary discrete graphics unless advancement stops for dGPUs. Even if it does, an Intel monopoly would be even worse than an Nvidia one.


You cannot blame CDPR on this one. They have publicly stated they cannot optimize it for AMD, which means they don't have source code access. Considering they have done a lot with HairWorks, the fact that they haven't put in a multiplier slider must mean they are not able to. In that case, this is all on NVidia, wasting everyone's performance.

They don't need source code access. Nvidia and AMD both optimize using just the binary. This goes to my bashing of game programmers. They're not remotely good. Near universally they're the bottom 50% of all computer programmers at any point in time. There are rare exceptions. John Carmack (washed up as he is) and our own LukaP are both quite capable at using modern design and techniques. The lead game developers are from the stone age when it comes to CPU support and use of parallelism at all levels of programming too. If CDPR can't figure out how to optimize the game, they shouldn't have built it.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Except that's an Imagination Technologies chip manufactured verbatim by Intel.

"if" and when they gets their hands on ATi, will be ATi tech  ^_^


He's an Intel fanboy. He's always going to say things which point to Intel's success.

 

No, there will never be a point when contemporary integrated graphics will match contemporary discrete graphics unless advancement stops for dGPUs. Even if it does, an Intel monopoly would be even worse than an Nvidia one.

You're disagreeing with IBM, Intel, AMD, and ARM. Disagreeing with 1 of them is stupid. Disagreeing with all 4 at once is a level of stupid for which no word strong enough exists.

 

AMD and Intel both say they'll have SoCs which are 98% GPU by the end of the decade. Think about that for a second. Your CPU and dGPU dies will be about the same size, with a difference in SP count of about 4%. In other words, you are wrong, flat out and unequivocally.

 

And no company has managed to stay competitive with Intel long-term. Not one. The king IBM lost. AMD had about a 3-year stint before falling off a cliff. If you're willing to bet AMD and Nvidia can survive long-term against the premier tech company of the world, you're the kind of risk-taker that one should listen to. Pessimism is a good thing. It's a defensive instinct. I suggest you refamiliarize yourself with it.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


They don't need source code access. Nvidia and AMD both optimize using just the binary. This goes to my bashing of game programmers. They're not remotely good. Near universally they're the bottom 50% of all computer programmers at any point in time. There are rare exceptions. John Carmack (washed up as he is) and our own LukaP are both quite capable at using modern design and techniques. The lead game developers are from the stone age when it comes to CPU support and use of parallelism at all levels of programming too. If CDPR can't figure out how to optimize the game, they shouldn't have built it.

 

I can't really comment on the competencies of game dev programmers, but there are many different programming areas in game development, so you have to be more specific. A 3D model programmer probably doesn't need to be as competent as a game engine programmer. Either way, games are just entertainment and are not as critical as, let's say, medical hardware firmware. So obviously the best programmers go to the best-paid jobs, which are specialized.

 

As for how easily optimizable it is, this is what an ex-Valve programmer has to say about it:

 

“[T]here are fundamental limits to how much perf you can squeeze out of the PC graphics stack when limited to only driver-level optimizations,” Geldreich told ExtremeTech. “The PC driver devs are stuck near the very end of the graphics pipeline, and by the time the GL or D3D call stream gets to them there’s not a whole lot they can safely, sanely, and sustainably do to manipulate the callstream for better perf. Comparatively, the gains you can get by optimizing at the top or middle of the graphics pipeline (vs. the very end, inside the driver) are much larger.”

http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy/3

 

Either way, that still doesn't let the developer change multiplier settings on HairWorks. AMD did manage to dictate tessellation levels on GameWorks effects, which is more than NVidia users themselves can do (which really sucks).

 

Also, CDPR can easily optimize their game, and have done so plenty of times. But optimizing a black-boxed DLL is a different story. Even if they could, putting in the resources just isn't worth it.
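
For what it's worth, here is a minimal sketch (Python, purely illustrative; not HairWorks or CDPR code, and every name is made up) of what a "tessellation multiplier" option boils down to conceptually: the game clamps whatever factor an effect requests to a ceiling picked in the settings menu.

def apply_tessellation_cap(requested_factor: float, user_max_factor: float) -> float:
    """Clamp the effect's requested tessellation factor to the user's setting."""
    return min(requested_factor, user_max_factor)

# HairWorks-style effects are commonly cited as requesting x64; a slider might cap it at x16:
print(apply_tessellation_cap(64.0, 16.0))  # -> 16.0
print(apply_tessellation_cap(8.0, 16.0))   # -> 8.0 (lower requests pass through untouched)

The clamp itself is trivial; the contested part is plumbing that one value into a shader pipeline the developer doesn't own.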

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I can't really comment on the competencies of game dev programmers, but there are many different programming areas in game development, so you have to be more specific. A 3D model programmer probably doesn't need to be as competent as a game engine programmer. Either way, games are just entertainment and are not as critical as, let's say, medical hardware firmware. So obviously the best programmers go to the best-paid jobs, which are specialized.

 

As for how easily optimizable it is, this is what an ex-Valve programmer has to say about it:

 

http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy/3

 

Either way, that still doesn't let the developer change multiplier settings on HairWorks. AMD did manage to dictate tessellation levels on GameWorks effects, which is more than NVidia users themselves can do (which really sucks).

 

Also, CDPR can easily optimize their game, and have done so plenty of times. But optimizing a black-boxed DLL is a different story. Even if they could, putting in the resources just isn't worth it.

There's no such thing as a black-boxed DLL if you're resourceful. We have disassemblers, and decompilers are rapidly approaching the same advancement as compilers. I tore open LLVM a long time ago despite the virtual language not being open-source, and that's a much more intricate piece of software than a graphics pipeline.
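
To make the "you only need the binary" point concrete, here's a minimal sketch of peeking at a "black-boxed" DLL's code section with off-the-shelf tooling. It assumes the pefile and capstone Python packages, and the DLL filename is a hypothetical placeholder, not the actual HairWorks binary.

import pefile
from capstone import Cs, CS_ARCH_X86, CS_MODE_64

DLL_PATH = "SomeGameWorksEffect.win64.dll"  # hypothetical example path

pe = pefile.PE(DLL_PATH)
md = Cs(CS_ARCH_X86, CS_MODE_64)  # assuming a 64-bit x86 DLL

for section in pe.sections:
    if section.Name.rstrip(b"\x00") == b".text":
        code = section.get_data()
        base = pe.OPTIONAL_HEADER.ImageBase + section.VirtualAddress
        # Print the first few instructions as a taste of what a disassembler sees
        for insn in md.disasm(code[:64], base):
            print(f"0x{insn.address:x}\t{insn.mnemonic}\t{insn.op_str}")
        break

Whether wading through raw disassembly like that is practical for a game studio on a deadline is, of course, exactly what's being argued here.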

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


There's no such thing as a black-boxed DLL if you're resourceful. We have disassemblers, and decompilers are rapidly approaching the same advancement as compilers. I tore open LLVM a long time ago despite the virtual language not being open-source, and that's a much more intricate piece of software than a graphics pipeline.

 

You still have 7,000-line shaders in single GameWorks effects. Going through all of that and programming an option for a tessellation multiplier might just be more work than it's worth. Remember that Witcher 3 was delayed half a year and has had several patches since, so the manpower has been stretched thin already (even to the point of pushing back development on Cyberpunk 2077).

Furthermore, what is technically possible and what is legally possible under their license agreement with NVidia are easily two different things.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You still have 7,000-line shaders in single GameWorks effects. Going through all of that and programming an option for a tessellation multiplier might just be more work than it's worth. Remember that Witcher 3 was delayed half a year and has had several patches since, so the manpower has been stretched thin already (even to the point of pushing back development on Cyberpunk 2077).

Furthermore, what is technically possible and what is legally possible under their license agreement with NVidia are easily two different things.

It wouldn't be in violation of any "no code sharing" agreement, especially since the original code will never be visible to AMD except through, wait for it... THE BINARY!

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No it isn't. The FTC would act to ensure competition is upheld. And Intel has a vendetta against Nvidia. It will do everything it can to drive Nvidia into the floor if it gets the ATI IP.

Nvidia will point to Iris Pro as a competitor and then pay off the FTC.

