
AMD's 3xx series tessellation "not" improved - performance boost comes from the drivers

zMeul

It wouldn't be in violation of any "no code sharing" agreement, especially since the original code will never be visible to AMD except through, wait for it... THE BINARY!

 

Based on the interview Tom Petersen did after Huddy's attack on GameWorks, we know that companies can buy access at three levels: prepackaged DLLs, partial source code access, and full source access. If you only buy the first tier, I believe you'd be in violation if you went into the source code. But of course it's all speculation when we can't read the license agreement/contract. What we can do is look at what the devs using GameWorks are actually doing, and that doesn't show any particular source code editing or options/settings changes in anything. Was it possible to change the tessellation multiplier in Far Cry 4, or any other HairWorks title? I believe not, and that in itself should say something.

 

Addendum: I believe it was in this podcast, they talked about their license changes, though I'm not 100% certain: https://www.youtube.com/watch?v=aG2kIUerD4c

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


You're disagreeing with IBM, Intel, AMD, and ARM. Disagreeing with 1 of them is stupid. Disagreeing with all 4 at once is a level of stupid for which exists no word strong enough.

 

AMD and Intel both say they'll have SOCs which are 98% GPU by the end of the decade. Think about that for a second. Your CPU and dGPU dies will be about the same size, with a difference in SP count of about 4%. In other words, you are wrong, flat out and unequivocally. 

 

And no company has managed to stay competitive with Intel long-term. Not one. Even the king, IBM, lost. AMD had about a 3-year stint before falling off a cliff. If you're willing to bet AMD and Nvidia can survive long-term against the premier tech company in the world, you're the kind of risk-taker that no one should listen to. Pessimism is a good thing. It's a defensive instinct. I suggest you refamiliarize yourself with it.

So, you think that there will be 250-300W SoCs for consumers in the future? I highly doubt it. Maybe, but we'll see. It's much more likely that PC gaming will just die.

 

Either way, if no company really can compete with Intel, they need to be broken up after AMD's death. Also, LOL about me being an AMD fanboy. I've been called a fanboy for everything under the sun at some point in my life because I'm balanced. For now, I seem like an AMD fanboy to you for two reasons:

 

1.) You're an Intel fanboy

2.) I favor the underdog in most cases because I'm terrified of monopolies

 

If another company were to step to Intel and Nvidia or if Intel showed interest in competing with Nvidia instead of throwing a temper tantrum because they didn't read the fine print (not that Nvidia or AMD are free of guilt from temper tantrums), I'd be fine with AMD going (assuming there's actual competition with price). As it stands though, I can't see any good things coming of them dying, and the only "good thing" you can come up with is Intel having total control of the desktop space. But yeah, I'm the fanboy here.

 

How about if you explain your point in-depth to me in a PM, explaining exactly how the industry is better off with only Intel and we won't have to worry about paying $2000 for a 1080p gaming PC? I'm willing to listen.


Ignore him. Tech hog is as diehard an AMD fanchild as anyone else on the web. Despite the fact he has no basis to discredit anything I've said, he will attack anything I say even slightly anti-AMD with an unyielding sense of self-importance.

 

The funny thing is that I find myself pretty in favor of AMD simply because I like supporting the underdog (I will admit that Nvidia and Intel have better silicon; I just like AMD), and I blocked techhog because I felt like he was an Nvidia shill :P

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


Yeah, except that 390X is being compared to a bottom of the barrel 980 that isn't overclocked. Overclock both, and Hell throw a real 980 at it, and let's see what we get.

 

 

No it isn't. The FTC would act to ensure competition is upheld. And Intel has a vendetta against Nvidia. It will do everything it can to drive Nvidia into the floor if it gets the ATI IP.

 

fuck me, you love to argue even when most of what you say is crap. The FTC can't do shit if no one comes in for AMD if they go under; the only thing they could do is stop Nvidia from buying them, and even then, if Nvidia is the only one interested, they won't. As for the bottom-of-the-barrel 980: the most you get from a vendor card is a 10% improvement... it's still $100 cheaper... the 390X is much better value than a 980 right now... deal with it

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


Wow, from an AMD fanboy to a Nvidia shill in the same damn thread... NICE!  ;)

Which proves the point I made in my previous post. I'm very balanced in my opinions based on my interpretation of the facts, yet I'm also very adamant when it comes to my opinions and drive them hard. Thus, anyone with a bias who argues with me might assume that I'm a fanboy.


where can I get this hacked driver for my 290x?

 

Lol, today it's now cracked drivers because the manufacturers are selling them, ha ha ha


Which proves the point I made in my previous post. I'm very balanced in my opinions based on my interpretation of the facts, yet I'm also very adamant when it comes to my opinions and drive them hard. Thus, anyone with a bias who argues with me might assume that I'm a fanboy.

 

 

Ehh, the real reason I blocked you is that you're an ass about pushing your opinions. I don't care what you think; it's just annoying how you preach.

Daily Driver:

Case: Red Prodigy CPU: i5 3570K @ 4.3 GHZ GPU: Powercolor PCS+ 290x @1100 mhz MOBO: Asus P8Z77-I CPU Cooler: NZXT x40 RAM: 8GB 2133mhz AMD Gamer series Storage: A 1TB WD Blue, a 500GB WD Blue, a Samsung 840 EVO 250GB


Nvidia did this with Fermi and Fermi 2.0: the core clock and core count alone didn't account for the gains seen by the 580, but later on, around the 290.x.x drivers and higher, the 480 was catching up.
 
 
Here is some proof...
 
480 flashed with 580 bios.
 
 
Test results at same exact clock speeds.
 

AMD Phenom II B55 Quad / unlocked dual core 4.3ghz CB R15 = CB 422
XFX R9 390 8GB MY RIG: http://uk.pcpartpicker.com/p/MVwQsY
Fastest 7770 on LTT . 3rd Fastest Phenom II Quad on LTT

PCSX2 on AMD CPU? http://linustechtips.com/main/topic/412377-pcsx2-emulator-4096x2160-amd-phenom-ii/#entry5550588


PCPer has come to a different conclusion: http://www.pcper.com/news/Graphics-Cards/AMD-Catalyst-155-1515-Performance-Check-Validating-AMD-R9-390-Testing

 

But, there does not appear to be any kind of smoking gun to point to that would indicate AMD was purposefully attempting to improve its stance through driver manipulation. And that's all I wanted to make sure of with this testing today and this story. I obviously didn't test every game that users are playing today, but all indications are that AMD is in the clear.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


No AMD drivers are not doing that.

 

Witcher 3 is a separate discussion; AMD users were able to optimize HairWorks performance by capping the tessellation level manually in Catalyst Control Center.

 

This has nothing to do with what's going on in the 3xx series driver which is probably some optimization regarding how tessellation workloads are processed in general.

 

Witcher 3 is exactly the thing that spawned this discussion. Witcher 3 performs much better on the 390X than the 290X. The question posed was whether this is due to better tessellation performance in the new 300 series or something on the software side. It seems likely that AMD is tailoring their drivers to reduce the tessellation in much the same way users were doing through CCC. You can't magically increase tessellation performance that much in software, all else being equal.

 

Therefore, I said that the comparison would no longer be apples to apples if the 390X was only doing 1/4 of the tessellation of the 980. Or whatever it has been reduced to. 

 

Which is also why I said that CDPR and Nvidia should allow users to set their own desired tessellation. So that if you are willing to live with 1/4 or 1/2 the current tessellation of HW, you can enjoy some additional performance.
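For a rough sense of why capping the factor recovers so much performance: triangle output from uniform tessellation grows roughly with the square of the tessellation factor, so dropping HairWorks from its default 64x to a 16x cap cuts the geometry workload to a small fraction. A minimal sketch under that quadratic-scaling assumption (the function name and the quadratic model are illustrative, not from AMD or Nvidia documentation):

```python
# Rough illustration. Assumption: triangle count grows ~quadratically
# with the tessellation factor, as with uniform patch tessellation.
def relative_geometry_load(factor, baseline=64):
    """Geometry workload relative to the default 64x HairWorks factor."""
    return (factor / baseline) ** 2

for f in (64, 16, 8):
    print(f"{f}x -> {relative_geometry_load(f):.4f} of the 64x workload")
```

Under this model a 16x cap leaves about 1/16th of the 64x triangle load, which is consistent with the large gains people saw from the CCC override.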

Turnip OC'd to 3Hz on air


So if you use CCC to reduce the tessellation factor, then the 390x will outperform the 980 with hairworks?

4K // R5 3600 // RTX2080Ti


Nvidia will claim that Iris Pro as a competitor and then pay off the FTC.

It can't. Under DX12, iGPUs and dGPUs can work in tandem, and Intel has no dGPUs. Furthermore, Intel can argue its offerings don't touch Nvidia's high end. Nvidia would have no case, and Intel can pay far more.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So, you think that there will be 250-300W SoCs for consumers in the future? I highly doubt it. Maybe, but we'll see. It's much more likely that PC gaming will just die.

 

Either way, if no company really can compete with Intel, they need to be broken up after AMD's death. Also, LOL about me being an AMD fanboy. I've been called a fanboy for everything under the sun at some point in my life because I'm balanced. For now, I seem like an AMD fanboy to you for two reasons:

 

1.) You're an Intel fanboy

2.) I favor the underdog in most cases because I'm terrified of monopolies

 

If another company were to step to Intel and Nvidia or if Intel showed interest in competing with Nvidia instead of throwing a temper tantrum because they didn't read the fine print (not that Nvidia or AMD are free of guilt from temper tantrums), I'd be fine with AMD going (assuming there's actual competition with price). As it stands though, I can't see any good things coming of them dying, and the only "good thing" you can come up with is Intel having total control of the desktop space. But yeah, I'm the fanboy here.

 

How about if you explain your point in-depth to me in a PM, explaining exactly how the industry is better off with only Intel and we won't have to worry about paying $2000 for a 1080p gaming PC? I'm willing to listen.

The POWER8s are 250W CPUs. I see no reason to think an SoC at that TDP is impossible.

 

1) I'm not a fanboy. If AMD can truly shape up, kudos to them. All signs point to the conclusion that they can't.

2) Wanting to cheer for the underdog is fine, but not when it comes at the expense of needlessly bashing the current industry leaders.

 

Intel wouldn't be in total control. Nvidia would have CPUs and SOCs out the door within 2 years that would be competitive, but of course that may result in Intel opening the floodgates and trying to bury Nvidia under all of their built up features that haven't been released. As to whether or not that would succeed requires total speculation.

 

I will tomorrow. For now, I'm tired from a long day.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


fuck me, you love to argue even when most of what you say is crap. The FTC can't do shit if no one comes in for AMD if they go under; the only thing they could do is stop Nvidia from buying them, and even then, if Nvidia is the only one interested, they won't. As for the bottom-of-the-barrel 980: the most you get from a vendor card is a 10% improvement... it's still $100 cheaper... the 390X is much better value than a 980 right now... deal with it

Nvidia is very much after x86_64 after the Denver Tegra fallout. When Intel found out it was emulating x86 instructions, the hammer came down on Nvidia pretty hard. Nvidia needs to diversify to survive long-term, and for that it needs SoCs, and ubiquity would help.

 

Nothing I ever type or speak is utter crap. Intel would dive for the ATI IP straight up. IP is the only thing standing in its way to having competitive graphics.

 

Most 980s from other vendors get 30+% overclocks.

 

That remains to be proven.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No surprise; we knew it was the same chip with a new name, just more RAM for cheaper.

Error: 451                             

I'm not copying helping, really :P


PCPer conducted a test between drivers. They didn't see improvement on the 290X across different drivers, but on later-released games like GTA V (and I'm guessing Witcher 3, etc.) the 290X did get performance improvements, because those games are still being patched and updated and the latest driver of course brings the latest optimizations for those patches.

So basically the 300 series is genuinely faster at tessellation than the 200 series. There was some confusion and speculation, but this will clear up when people bench a dozen games with and without tessellation.
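Benching a dozen games with and without the new driver boils down to per-game percentage deltas plus an average. A minimal sketch of that comparison with made-up fps numbers (purely illustrative, not real benchmark data):

```python
# Hypothetical fps numbers for illustration only -- not real benchmark data.
baseline = {"GTA V": 58.0, "Witcher 3": 44.0, "Far Cry 4": 61.0}   # old driver
updated  = {"GTA V": 63.0, "Witcher 3": 51.0, "Far Cry 4": 62.0}   # new driver

def pct_gain(old, new):
    """Percentage fps improvement of the new driver over the old one."""
    return (new - old) / old * 100

gains = {game: pct_gain(baseline[game], updated[game]) for game in baseline}
for game, g in gains.items():
    print(f"{game}: {g:+.1f}%")
print(f"average: {sum(gains.values()) / len(gains):+.1f}%")
```

A uniform gain across all games would suggest a general driver optimization; gains concentrated in tessellation-heavy titles would point at something tessellation-specific.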


So all the threads saying the 390X is actually improved... just got served. But guys... it's 5 fps faster than the 290X... Oh, the drivers were changed though. Never mind.

 

But guys the Fury....   Right.... The only card worth getting.  Possibly.  We shall see.


Is there any info on whether it improves R9 270/X performance?

Laptop: Acer V3-772G  CPU: i5 4200M GPU: GT 750M SSD: Crucial MX100 256GB
DesktopCPU: R7 1700x GPU: RTX 2080 SSDSamsung 860 Evo 1TB 


Is there any info on whether it improves R9 270/X performance?

 

I don't think so.  The 270/280 are older than the 285/290... different GPUs.  But I could be wrong.


I'm still trying to understand why people are pissed off over a card they won't buy anyway...

 

EDIT: and for the love of everything will you stop fighting in this forum over this, sheessh

I'm surely pissed, even though I was never a potential customer for ATI products (purely because I prefer premium products over reasonable ones) - a bad lineup after over 18 months of waiting means no real competition for Nvidia, aka no price cuts and no push for new releases, and we're straight on the downward slope of the current CPU market :(


That is an unconfirmed rumor. Don't go spreading it as fact. And by the way, I believe this article and thread prove you're full of it. AMD took the same GPUs, tossed them on new boards with a better BIOS and firmware/drivers, and the AIBs put on better coolers. Given that the newest driver hacked onto a 290X brings the same performance, AMD did nothing in regard to a new process revision.

So you are claiming that the same 28nm process has been unchanged since 2013? There were no improvements?

I didn't expect this coming from you.

 

uuuuh...no...

they are both 28nm and pretty much identical cards

 

the only new cards are the fury series

 

I'm not saying they are not identical - what I'm saying is that the 28nm process improved.

 


So you are claiming that the same 28nm process has been unchanged since 2013? There were no improvements?

I didn't expect this coming from you.

 

I'm not saying they are not identical - what I'm saying is that the 28nm process improved.

 

It's the same Hawaii die on the same TSMC 28nm process. Perhaps improved yields mean better voltages/overclocking, but other than that there are no improvements past PCB revisions and software improvements.

 

I suspect AMD just don't want to put money into making many new cards on 28nm with better processes right around the corner (which usually means a lot of cost, lower yields (so more wasted/failed chips) and much bigger performance improvements) when most of their cards are decently performing at their current prices.


people have already flashed their 290X into a 390X for shits and giggles

 

Hacked driver in question found in this post

 

http://forums.guru3d.com/showpost.php?p=5098084&postcount=26

 

The R9 290X and R9 390X are not the same.

 

The R9 390X has a higher memory clock and tighter timings (1500 vs 1250 MHz).

 

The R9 390X also runs at higher voltage, about +50 mV.
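If those clocks are right, the memory difference alone is easy to quantify: GDDR5 transfers four times per base clock, so on Hawaii's 512-bit bus, 1250 MHz works out to 320 GB/s of peak bandwidth and 1500 MHz to 384 GB/s. A quick sketch (assuming the quoted 1250/1500 MHz figures are base GDDR5 clocks, i.e. quad-pumped):

```python
# Sketch: peak GDDR5 bandwidth from memory clock and bus width.
# Assumption: the quoted clocks are base GDDR5 clocks (4 transfers/cycle).
def gddr5_bandwidth_gbps(clock_mhz, bus_width_bits=512):
    effective_rate = clock_mhz * 4                      # MT/s per pin
    return effective_rate * bus_width_bits / 8 / 1000   # GB/s

print(gddr5_bandwidth_gbps(1250))  # 290X: 320.0 GB/s
print(gddr5_bandwidth_gbps(1500))  # 390X: 384.0 GB/s
```

That 20% bandwidth bump would help in memory-bound scenarios even with an otherwise identical die, which is part of why a BIOS-flashed 290X at 390X clocks closes the gap.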

Yeah, we're all just a bunch of idiots experiencing nothing more than the placebo effect.

So wait... you're saying that improved tessellation performance on a graphics card through a driver is a bad thing? Shouldn't that be a good thing?

AMD Rig - (Upgraded): FX 8320 @ 4.8 Ghz, Corsair H100i GTX, ROG Crosshair V Formula, 16 GB 1866 Mhz Ram, Msi R9 280x Gaming 3G @ 1150 Mhz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience - recently bought): MSI GT72S Dominator Pro G (i7 6820HK, 16 GB RAM, 980M SLI, GSync, 1080p, 2x128 GB SSD + 1TB HDD)... FeelsGoodMan

