WOW. AMD GPUs To Gain 400% Performance Increase With DirectX 12

AMD will include a fire extinguisher with their new 300 series cards, guys!

 

Yay for awesome cooling.


NVIDIA FOR THE WIN!!!!!!!!!!!

My Cheap But Good Rig: i7-3770S, Intel motherboard (actually made by Intel), 16GB DDR3, Nvidia GTX 1070, 250GB Samsung 850 EVO SSD, 750GB HDD, EVGA 500 BR power supply


Fanboy? That's a fact. The GTX 980 is and always will be more powerful than the 290X; every benchmark has shown it.

The 980 also costs ~$250 more.

i5-4690k @ 4.2GHz | Asus Radeon R9 290 DirectCU II | Hyper 212 EVO | ASRock Z97 Extreme3 | Corsair Vengeance 8GB 1866MHz, Corsair XMS 4GB | 850 EVO 250GB, random 1TB drive | Corsair 200R | EVGA SuperNOVA 750W | Rosewill RK-9000BR | Logitech G700s | Logitech G930 | ViewSonic VG2427wm, Dell S2209W, Dell S2009W

Dell Inspiron 3147

Latitude E5420, Samsung 840 EVO 250GB, 12GB RAM, 1600x900 display

Pentium G3258 @ 3.2GHz | WD Red 2TB x3 in RAID-Z, Crucial MX100 128GB(cache drive) | Fractal Design Node 304 | Gigabyte GA-H97N-WIFI | Crucial Ballistix Sport 8GB | EVGA 500B

HTC One M8 64GB, Droid Razr M


NVIDIA FOR THE WIN!!!!!!!!!!!

The fanboyism is real



DX12 hype! If this applies to games (double the fps), then I'll buy a 144Hz monitor even before I upgrade my GPU (I want a 1070 unless the R9 390 is superior).

Connection: 200Mbps / 12Mbps, 5GHz WiFi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


I bet it's actually "Up to 400%" lol.

The stars died for you to be here today.

A locked bathroom in the right place can make all the difference in the world.


Article: http://au.ibtimes.com/directx-12-improves-amd-gpu-performance-400-windows-10-1420167

 

___________________________________________________________________________________

 

(Pic courtesy of @catbutts).


Holy crap I don't trust that source one bit. I will wait for a tech website to post something.

Love cats and Linus. Check out linuscattips-fan-club. http://pcpartpicker.com/p/Z9QDVn and Asus ROG Swift. I love anime as well. Check out Heaven Society heaven-society. My own personal giveaway thread http://linustechtips.com/main/topic/387856-evga-geforce-gtx-970-giveaway-presented-by-grimneo/.


Holy crap I don't trust that source one bit. I will wait for a tech website to post something.

The AnandTech test is reliable. You just have to understand what it is: it's designed to measure one aspect of GPU driver performance, namely the CPU bottleneck when the driver is overloaded with draw calls. It doesn't mean that the 290X is going to become four times faster, or that the 980 is going to become 2.5 times faster, or that the 980 is going to beat the 290X by 50%. It's like a synthetic benchmark measuring one aspect of performance. The reason for some of the pointless arguing is that much of this community doesn't seem to understand that.
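To make that concrete, here's a toy frame-time model. All numbers are made up purely for illustration, not measurements of any real card or driver: a frame takes as long as the slower of the CPU and the GPU, so cheaper draw-call submission only helps when the CPU side is the bottleneck.

```python
# Toy model: a frame takes as long as the slower of the CPU (driver work,
# proportional to draw calls) and the GPU (fixed render time). All numbers
# are hypothetical, purely for illustration.

def frame_time_ms(draw_calls, cpu_cost_per_call_ms, gpu_time_ms):
    cpu_time = draw_calls * cpu_cost_per_call_ms
    return max(cpu_time, gpu_time_ms)

# Draw-call-heavy scene (Star Swarm-like): 10,000 calls, GPU needs 20 ms.
dx11_fps = 1000 / frame_time_ms(10_000, 0.01, 20)   # CPU-bound: 10 fps
dx12_fps = 1000 / frame_time_ms(10_000, 0.002, 20)  # 5x cheaper calls: 50 fps

# Typical GPU-bound game: 2,000 calls, GPU needs 25 ms.
game_dx11 = 1000 / frame_time_ms(2_000, 0.01, 25)   # 40 fps
game_dx12 = 1000 / frame_time_ms(2_000, 0.002, 25)  # still 40 fps
```

In the synthetic, draw-call-heavy case, the cheaper API looks 5x faster; in the GPU-bound case it changes nothing, which is exactly why the headline number doesn't carry over to ordinary games.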

The AnandTech test is reliable. You just have to understand what it is: it's designed to measure one aspect of GPU driver performance, namely the CPU bottleneck when the driver is overloaded with draw calls. It doesn't mean that the 290X is going to become four times faster, or that the 980 is going to become 2.5 times faster, or that the 980 is going to beat the 290X by 50%. It's like a synthetic benchmark measuring one aspect of performance. The reason for some of the pointless arguing is that much of this community doesn't seem to understand that.

I mean the website link http://au.ibtimes.co...dows-10-1420167. I will go to anandtech for their article. The link provided is full of ads and spam.



I mean the website link http://au.ibtimes.co...dows-10-1420167. I will go to anandtech for their article. The link provided is full of ads and spam.

Oh yeah, the IBTimes article is clickbait. It will mislead less tech-savvy people, who will come back in a year and say that DX12 was overhyped, through no fault of Microsoft, Nvidia, or AMD.

Oh yeah, the IBTimes article is clickbait. It will mislead less tech-savvy people, who will come back in a year and say that DX12 was overhyped, through no fault of Microsoft, Nvidia, or AMD.

Here is the AnandTech article for those who want to read it: http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm



AMD will include a fire extinguisher with their new 300 series cards, guys!

 

Yay for awesome cooling.

If you buy any reference card, you're crazy anyway, whether AMD or Nvidia.

Current Build: (protege) Core i7-4790k @ 4.7GHz (1.264v) | Corsair H75 | Gigabyte GA-G1.SNIPER Z97 | MSI GeForce GTX 970 4(3.5)gb Twin Frozr (Soon to be 380x) | Corsair RM750 Laptop: AMD A6-4400m | Toshiba 500gb HD | Radeon HD 7520g


Heyyo,

 

400% in ONE instance (Star Swarm on DX11 cripples ALL GPUs) does not equal 400% across the board for us all.

Keep that in mind.

Not saying I don't welcome more performance, but it's a little misleading to think it's standard fare for all things.

I agree with you... you're one of the only sensible people in this thread. To me it seems more like a driver issue for D3D11 vs D3D12 in Star Swarm... 8fps? That's DEFINITELY not a proper representation of the AMD R9 290X. It is funny how people are running away with it, though, like QueenDementria lol... do you HONESTLY think the AMD R9 290X will see a 400% framerate boost? That DirectX 12 will somehow boost all games into the stratosphere? That a single AMD R9 290X will do 4K @ 60fps in all games? Step back and think about this situation a little more, please. :P

 

I for one have high hopes for DirectX 12... that it will do what it promises: better hardware control for game developers, and maybe it will even fix the frame-buffer issues that plague multi-GPU setups and allow VRAM doubling, instead of each VRAM bank having to duplicate data. Only time will tell... Star Swarm sounds like it's doing something close to what I'm hoping... but this benchmark is completely out of whack haha...
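On the VRAM point: under today's alternate-frame rendering, each card mirrors the full resource set, which is why two 4GB cards still behave like one 4GB pool. An API that let developers split resources across adapters could approach the pooled total. A back-of-the-envelope sketch (purely illustrative; the "split" mode is hypothetical, not any shipping API):

```python
# Rough sketch of usable VRAM under two multi-GPU strategies. This is
# back-of-the-envelope arithmetic, not a real API: "mirrored" is classic
# SLI/CrossFire AFR (every card stores a full copy of all resources),
# "split" is a hypothetical mode where resources are divided across cards.

def usable_vram_gb(cards, vram_per_card_gb, mirrored=True):
    return vram_per_card_gb if mirrored else cards * vram_per_card_gb

afr_gb = usable_vram_gb(2, 4, mirrored=True)     # two 4GB cards -> 4GB usable
split_gb = usable_vram_gb(2, 4, mirrored=False)  # hypothetical pooling -> 8GB
```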

 

 

400% increase... on an FX-4300 CPU...

But it's got quad core!!!! Maybe it's OC'ed to, like, 10GHz!? We'll never know. ;)

 

Nobody has really taken advantage of Mantle. With time it should get better, just like PhysX. I remember the joke about PhysX: "It looks great, for all two games that support it."

Mantle is/was a great idea... fix what DirectX and OpenGL haven't... but DirectX 12 seems to be rectifying that. Still, I'll wait for a reputable benchmark... why is the GTX 750 Ti getting a better framerate than the AMD R9 290X in D3D11??? That makes absolutely no sense. Something tells me their setup was bugged right out... that, or AMD haven't optimized their drivers for the D3D11 Star Swarm test...

 

Ever heard of 3DFX Glide, by chance? It was a spin-off of OpenGL by 3DFX Labs. They focused on building the best graphics API, and at the time? Yes, 3DFX Glide was the shit! It blew away OpenGL and Direct3D... BUT, just like you said QueenDemetria, support was rather lacklustre... despite Unreal Engine supporting it, no other notable game engines did, so it faded into obscurity while DirectX gained in popularity. OpenGL didn't know its head from its ass and was focusing more on computer-aided design, neglecting the improvements in pixel shaders and shader models that were gaining in processing ability. And 3DFX's DirectX side of their drivers sucked terribly (Quake 3 Arena was unplayable in D3D and Counter-Strike had a two-second lag in D3D, as prime examples from my 3DFX Voodoo 4 with 32MB on PCI... buck yeah!!! lol), so eventually 3DFX Labs went under and NVIDIA bought out their licenses.

 

An interesting fact about 3DFX? While lots of their engineers went over to NVIDIA to develop NVIDIA SLI (Scalable Link Interface) and port it from 3DFX SLI (Scan-Line Interleave), a bunch of engineers went over to ATI (before AMD bought them out) to assist in making the HD X1000-series GPUs and CrossFire... that's why AMD CrossFire and NVIDIA SLI work so similarly.

 

AMD haven't said much about the technical side of AMD Mantle, but I'd bet money that the 3DFX engineers who made 3DFX Glide did the same thing, porting and improving OpenGL to create AMD Mantle... but just like history? Now DirectX is catching up, so AMD Mantle will probably fade away and DirectX 12 will become the mainstay, which is fine...

 

But I do have some hopes for the Khronos Group actually fixing OpenGL by doing what should have been done long ago: splitting OpenGL into OpenGL (legacy) and glNext (core)... maybe then gaming on Mac and/or Linux won't be so damn painful. OpenGL is the worst thing ever for AMD CrossFire and NVIDIA SLI support... neither works at all in Linux, save for, like, id Tech 4 engine games... but now that GPUs are so strong? They don't need it... but playing Dying Light, War Thunder or Serious Sam 3 on Linux with multi-GPU? All the other cards do is have fans that spin... no performance boost... sigh...

 

It doesn't surprise me in the slightest that id Software, who used to be pro-OpenGL, have completely dropped OpenGL from the id Tech 5 engine, starting with the DirectX 11-only game "The Evil Within"... not a trace of OpenGL.

 

/rant

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case


Holy crap I don't trust that source one bit. I will wait for a tech website to post something.

 

It's an insanely CPU-bottlenecked test they ran. DirectX 12 allows the GPU to deliver more performance without being held back by the CPU as much. Under DX11, AMD's video cards suffer more from this than Nvidia's; that's why the gains are so large: AMD's cards performed worse to begin with in this test, because they don't perform as well when the CPU is the limiting factor.
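That baseline effect is the whole story behind the headline percentage: if two cards end up at similar DX12 frame rates, the one with the worse DX11 showing posts the bigger percentage gain. Hypothetical before/after numbers (not benchmark results) make it obvious:

```python
# Percentage gain depends entirely on the baseline. These before/after fps
# figures are hypothetical, not taken from any benchmark.

def gain_pct(before_fps, after_fps):
    return (after_fps - before_fps) / before_fps * 100

card_a = gain_pct(8, 40)   # awful DX11 baseline -> +400%
card_b = gain_pct(26, 65)  # better baseline -> +150%
```

Same ballpark final frame rate, wildly different "% improvement".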

 

Also, AnandTech is kind of a tech website...

Intel i5-4690K @ 3.8GHz || Gigabyte Z97X-SLI || 8GB G.Skill Ripjaws X 1600MHz || Asus GTX 760 2GB @ 1150 / 6400 || 128GB A-Data SX900 + 1TB Toshiba 7200RPM || Corsair RM650 || Fractal 3500W


Heyyo,

She's got huge.....       tracks of land.

 

 

(Attached screenshot: Screen Shot 2015-02-12 at 7.05.33 PM.png)

LMFAO! Ok you win this thread. :P

 

that's cute

It's not AMD's message to NVIDIA... that was Linus Torvalds showing his dislike for how hard it is to work with NVIDIA to get NVIDIA Optimus working in Linux... but as annoying as that is? I'm pretty sure Qualcomm and Creative Labs are worse... my Sound Blaster Recon3D crackles and fucks up so badly in Linux... my Killer NIC 2100 doesn't support Linux at all, so I have to enable onboard LAN when I want to boot into Linux... at least NVIDIA supports their GPUs on Linux and has pretty darn good drivers for gaming, just not for 4K gaming... yet...


