
First DirectX 12 game benchmarked *Update 2: More benchmarks

Well, first, I really doubt Samsung would do this for NVIDIA only. It would be a shot in the head to start production (with the investment it needs) for one client. Don't get me wrong: if your customer were Apple, they would do it. But only NVIDIA? I just doubt it.

 

Second: we don't know the extent of AMD's exclusivity deals with Hynix. I'm pretty sure Hynix alone won't be able to fulfill AMD's and NVIDIA's orders, especially when you have the possibility of several semi-custom designs coming out in 2016.

 

You have two GPU manufacturers with new designs, you have two Nintendo consoles coming out (even though Nintendo is known for not using top-notch tech in its consoles, you can't dismiss what HBM brings to power consumption and form factor), you have AMD licensing GPU IP to MediaTek, and you have Apple.

 

What I mean is, NVIDIA may be one of Samsung's customers, but I'm pretty sure it's not even close to being the only one.

Not if the initial volume request is 140 million units, which is exactly what it is, because Nvidia is planning to one-up all of AMD's HPC offerings and win on price, and the entire Pascal lineup is using HBM2.

 

You would if you were an AMD investor and did your due diligence, like I did before I jumped off at $2. AMD is royally screwed.

 

And Nvidia is a big enough client to jumpstart volume production and get pricing data aired; which, if it beats Hynix, brings the other customers rolling in. It's Samsung's smartest play: stick it to Hynix and Micron in a single move and help saturate its 14nm foundry, which is currently taking a loss due to low production volume.

 

HBM is too expensive for the consoles and you know it. Currently Hynix's price per stack is $30 for orders of 1 to 9 million, $25 if you order in packs of 10 million. If you intend to have HBM as the only system RAM, the memory for an 8GB SoC already costs between $200 and $240.
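For what it's worth, here's a minimal sketch of the math behind that $200-$240 figure, assuming first-generation HBM's 1 GB capacity per stack (the post doesn't state the stack size, so that's an assumption):

```python
# Rough HBM cost math, per the prices quoted above.
# Assumption: first-gen HBM at 1 GB per stack.
PRICE_SMALL_ORDER = 30   # $/stack for orders of 1-9 million
PRICE_BULK_ORDER = 25    # $/stack for orders in packs of 10 million
GB_PER_STACK = 1

stacks_needed = 8 // GB_PER_STACK  # 8 GB of system RAM -> 8 stacks

low = stacks_needed * PRICE_BULK_ORDER    # bulk pricing
high = stacks_needed * PRICE_SMALL_ORDER  # small-order pricing
print(f"8 GB of HBM adds ${low}-${high} to the package")
```

Either way, that's a big chunk of a console's entire bill of materials just on memory.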

 

HBM is too expensive for cell phones too. There's no way Qualcomm and MediaTek are buying, unless Qualcomm has a new ARM SoC for HPC planned that they haven't announced yet.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


See, when your mom calls you a twat, you might want to re-evaluate how you behave and present yourself ;)

Eh, I give respect where it's earned and take no bullshit. I've gotten teachers fired and lawyers ruined because of that. As crass and off-putting as it is, it's not arrogance if you actually are as good as you think and sell yourself as, and IBM, with its $260,000 starting salary and $50,000 signing bonus, agrees, if I take the job with them at the end of this fall semester.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Eh, I give respect where it's earned and take no bullshit. I've gotten teachers fired and lawyers ruined because of that. As crass and off-putting as it is, it's not arrogance if you actually are as good as you think and sell yourself as, and IBM, with its $260,000 starting salary and $50,000 signing bonus, agrees, if I take the job with them at the end of this fall semester.

Are you going to take the job?

For some reason I got the feeling you wouldn't.

  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


Are you going to take the job?

For some reason I got the feeling you wouldn't.

I'm debating between it and getting my master's and trying to get them to hold out for that additional year and a half. And I'm waiting to hear back from SAP, Oracle, and Nvidia before I say yes to anyone.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I'm debating between it and getting my master's and trying to get them to hold out for that additional year and a half. And I'm waiting to hear back from SAP, Oracle, and Nvidia before I say yes to anyone.

That is quite the decision to debate.  Though going with IBM and starting your career off would be rather interesting.  Money, for one, which is good for investments and your future, obviously, but also the ability to tell a prospective employer years down the line: "I have years of experience working at IBM."


That is quite the decision to debate.  Though going with IBM and starting your career off would be rather interesting.  Money, for one, which is good for investments and your future, obviously, but also the ability to tell a prospective employer years down the line: "I have years of experience working at IBM."

Eh, IBM's goose is cooked in 10 years. Rumor has it Intel's moving to 4 threads per core on the Skylake E5 and E7 Xeons, meaning it's moving toward matching IBM in scale-up architectures to further its efforts to bury IBM and Nvidia in HPC. Once it reaches parity, its price competitiveness against POWER8 and Fujitsu's SPARC will leave mainframe and database builders considering leaving IBM. Once that happens, all IBM has left is its CMOS production and foundry IP and its graphene mass-production IP. It would be great pay for 3 years, but I'd want to jump immediately after that. Big Blue's era is ending, and the only company delusional enough not to believe it is Nvidia.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Goodbye, Nvidia. 

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


Eh, IBM's goose is cooked in 10 years. Rumor has it Intel's moving to 4 threads per core on the Skylake E5 and E7 Xeons, meaning it's moving toward matching IBM in scale-up architectures to further its efforts to bury IBM and Nvidia in HPC. Once it reaches parity, its price competitiveness against POWER8 and Fujitsu's SPARC will leave mainframe and database builders considering leaving IBM. Once that happens, all IBM has left is its CMOS production and foundry IP and its graphene mass-production IP. It would be great pay for 3 years, but I'd want to jump immediately after that. Big Blue's era is ending, and the only company delusional enough not to believe it is Nvidia.

Well, they could probably stick with mainstream quad-cores for Cannon Lake if they add four-way hyper-threading per core, huh.  Looking at the grand scheme: if AMD kicked the bucket and Intel got the GPU portion... man, oh man.  That is quite the dominating future for them if it all turns out to be true, huh.


Well, they could probably stick with mainstream quad-cores for Cannon Lake if they add four-way hyper-threading per core, huh.  Looking at the grand scheme: if AMD kicked the bucket and Intel got the GPU portion... man, oh man.  That is quite the dominating future for them if it all turns out to be true, huh.

It's worse if IBM dies before AMD does, Intel pushes Nvidia out of HPC, and Intel buys out Nvidia, since there'd be no markets left in which they compete. At that point grinding AMD into the floor would be inconsequential, because no one would want to pick up the smoldering pieces. Some holding company picks up x86_64, and Intel tells them to screw off, makes a new 64-bit instruction set with more general-purpose registers like ARM, and adapts the industry to it with its army of programmers. Then Intel has a monopoly in 2 different integrated-circuit categories, and that's the end of competition everywhere but mobile. That is a nightmare. AMD dying ends up with Nvidia having x86_64 most likely, or another holding/licensing company that'll sell it to anyone willing to pay. And Intel would buy up ATI super fast. We'd have renewed competition on two fronts. That would be a much better outcome.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


It's worse if IBM dies before AMD does, Intel pushes Nvidia out of HPC, and Intel buys out Nvidia, since there'd be no markets left in which they compete. At that point grinding AMD into the floor would be inconsequential, because no one would want to pick up the smoldering pieces. Some holding company picks up x86_64, and Intel tells them to screw off, makes a new 64-bit instruction set with more general-purpose registers like ARM, and adapts the industry to it with its army of programmers. Then Intel has a monopoly in 2 different integrated-circuit categories, and that's the end of competition everywhere but mobile. That is a nightmare. AMD dying ends up with Nvidia having x86_64 most likely, or another holding/licensing company that'll sell it to anyone willing to pay. And Intel would buy up ATI super fast. We'd have renewed competition on two fronts. That would be a much better outcome.

I too prefer that outcome, but I still really hope AMD has its comeback.  Hopefully Zen is a miracle success that throws everyone for a loop, and they can have better performance and success with their Greenland graphics.  Though if not, hopefully Intel + ATI and Nvidia + x86_64 is the outcome.


I too prefer that outcome, but I still really hope AMD has its comeback.  Hopefully Zen is a miracle success that throws everyone for a loop, and they can have better performance and success with their Greenland graphics.  Though if not, hopefully Intel + ATI and Nvidia + x86_64 is the outcome.

AMD is financially weak and structurally unsound, and GlobalFoundries is controlled by the same investor as AMD: Mubadala, Abu Dhabi's sovereign wealth fund. It can only shuffle money between GloFo and AMD to keep affording node shrinks for so long if GloFo can't capture new business and AMD can't win more OEMs and design wins. If 14nm HP flames out for GloFo, and/or 14nm LPP isn't good enough to keep AMD's performance close to or beating Nvidia's, or to beat them on power usage and heat by enough to win over the HPC architects (not to mention the poor OpenCL scaling under Linpack and its successor benchmarks), all that R&D money will show up as yet more debt on top of what's due in 2019. If Zen is anything less than a smash hit, AMD is going to rot away, a result Intel won't like, but one that will benefit us all if IBM falters as I expect it to.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


So if I'm right, what we're seeing here is that AMD's hardware is actually on a very similar level to Nvidia's; it's just DX11 optimisation that has put Nvidia ahead.

The situation is exaggerated in this benchmark because NVIDIA has tweaked its DX11 driver specifically for Ashes of the Singularity. Normally both AMD and Nvidia do game-specific optimization for every AAA game. In this case AMD has not, either because they assume almost everybody will use DX12 or because the game isn't out yet. That's why you see such a big gap between the two DX11 scores...

It's true that in a CPU bottlenecked scenario on DX11 Nvidia is more efficient. Just not to this extent...


Things don't look so good on the Nvidia side of things 

 

 

This guy has no idea what he's looking at. You should really learn how to read the graphs before you post about them. 

 

*hint* Nvidia still has higher fps; AMD drivers were just absolute crap before, so you're seeing a huge increase, but still not beating Nvidia.


This guy has no idea what he's looking at. You should really learn how to read the graphs before you post about them. 

 

*hint* Nvidia still has higher fps; AMD drivers were just absolute crap before, so you're seeing a huge increase, but still not beating Nvidia.

 

A 290x matching the 980 Ti is a pretty big win I would say...

 

[attached benchmark image: heavy.001.png]

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


A 290x matching the 980 Ti is a pretty big win I would say...

 

[attached benchmark image: heavy.001.png]

 

That's not what I'm talking about. It's great for AMD users, but the reason it's great is that AMD's software was garbage. Also, why are they using a 290x instead of a Fury or 390x?


That's not what I'm talking about. It's great for AMD users, but the reason it's great is that AMD's software was garbage. Also, why are they using a 290x instead of a Fury or 390x?

 

And isn't another reason it's great that the 290x is half the price of a 980 Ti?

 

Maybe because the 290x actually has a lot of compute power?

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


And isn't another reason it's great that the 290x is half the price of a 980 Ti?

 

Maybe because the 290x actually has a lot of compute power?

And can be cheaply flashed to an R9 390X if you've got the 8GB version. (AMD got rekt big time when the first Hawaii 290X was flashed to a Grenada 390X).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


And can be cheaply flashed to an R9 390X if you've got the 8GB version. (AMD got rekt big time when the first Hawaii 290X was flashed to a Grenada 390X).

 

Don't need to, all the performance boost is already in the latest drivers so all the cards are in on the party ;) 

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Don't need to, all the performance boost is already in the latest drivers so all the cards are in on the party ;)

$ $$$

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


And isn't another reason it's great that the 290x is half the price of a 980 Ti?

 

Maybe because the 290x actually has a lot of compute power?

I never said it wasn't a good option, I just said it's the way it should have been. 


I never said it wasn't a good option, I just said it's the way it should have been. 

Sometimes you just gotta wait for software to follow.

"Should have" simply wasn't possible.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


I never said it wasn't a good option, I just said it's the way it should have been. 

 

Hmm? What wasn't a good option? The 290x and the 980 Ti have the same number of shader units...

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Hmm? What wasn't a good option? The 290x and the 980 Ti have the same number of shader units...

I don't think you're following along here. 

 

Sometimes you just gotta wait for software to follow.

"Should have" simply wasn't possible.

AMD's software has been shit. They really need to fix that. If a, what, 3-year-old GPU can compete with an almost top-of-the-line card with simple driver fixes, then AMD really deserves to go out of business. Someone like Samsung or Valve should buy them and make Nvidia sweat.


AMD's software has been shit. They really need to fix that. If a, what, 3-year-old GPU can compete with an almost top-of-the-line card with simple driver fixes, then AMD really deserves to go out of business. Someone like Samsung or Valve should buy them and make Nvidia sweat.

Not really. Nvidia surely has some magic sauce in their DX11 driver, but nothing that makes up this difference.

We are talking about 3-year-old architectures seeing their first shred of daylight, as software is slowly transformed to utilize more features of the architecture.

 

Nothing could have been done 3 years ago to get the same results. AMD has also been working on its own low-level API, which hopefully had its influence on DX12 and other graphics APIs.

Don't think this was a simple "update" to their driver.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


I don't think you're following along here. 

 

AMD's software has been shit. They really need to fix that. If a, what, 3-year-old GPU can compete with an almost top-of-the-line card with simple driver fixes, then AMD really deserves to go out of business. Someone like Samsung or Valve should buy them and make Nvidia sweat.

 

Yeah... I ain't getting you here. I don't remember who or what said anything about good or bad options...

Anyway it makes sense to compare the 290x and 980 Ti since they have similar specs on paper. 

 

Here's a nice pretty explanation http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/400#post_24321843

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 

