
AMD's Zen architecture will be late

zMeul

Again you deliver nothing constructive or useful to the debate. I don't feel schadenfreude toward NVidia users. I think it's a ripoff and feel sorry for them. Why defend NVidia's planned obsolescence? Then again, most in here seem to support vendor lock-in and black-boxed middleware.

 

I'm not defending anyone, lmfao.

 

Here again, you're being a hypocrite. Everything you speak against, you do to others. You're so ignorant it's not even funny anymore. I used to get a laugh out of your posts because I legit thought you were trolling, but now I can see you're being completely serious.

 

And there it is again, that magical phrase "black-boxed middleware".

 

Amazing. *slow clap*

 

Everything I argue with you about is about the arguments you make, how you make them and how you present yourself. It's not about your fookin stance. You can be on Nvidia's team and argue for them and I'd still argue with you over the way you present your points. You literally have nothing to bring to the table. Everything you say is pure speculation and nothing more than hearsay.

 

Can't wait to see how you, yet again, bring up some facts that somehow further your narrative.


History has shown that new versions of DX aren't really supported right out of the gate like that. I mean, we still have DX9 support, for god's sake. I just don't see it being relevant. Nvidia can Frankenstein async compute to work more than well enough through drivers on Maxwell cards, so they should do fine for another year. As I said earlier, most people in general upgrade every generation; if they didn't, then Nvidia would give bigger incentives to upgrade to newer cards. Add to that DX12 not being all that popular because Windows 10 still has lackluster adoption, and it all adds up to:

 

It doesn't matter what Nvidia or AMD does right now.
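
 

For the curious: "async compute" at the API level just means a game submitting work to a separate compute queue alongside its graphics queue. Whether the two queues actually overlap on the GPU is up to the hardware and driver, which is exactly the Maxwell debate. A minimal D3D12 sketch (assuming the standard Windows SDK headers; error handling omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// One DIRECT (graphics) queue plus one COMPUTE queue is all "async compute"
// means on the CPU side; whether they run concurrently is the driver's problem.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```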

 

You are right about the history of DX, but there are a few things that are different now:

  • Most hardware on the market and in people's computers already supports DX12 (see the sketch below).
  • The OS that DX12 is exclusive to is free for most, and the adoption rate is through the roof.
  • The current-gen consoles not only use DX12 tech (async shaders/compute), but are also x86-based, which means that porting is a lot easier to do now. This also means that PC land can enjoy the efficiency that consoles have always had (hardware efficiency has always sucked on PC).
  • Several AAA games are launching in the next 6 months, and EA is even discussing making it mandatory in about 14 months. No API has ever done that to my knowledge.

So while I get your point, the fact is that the situation is very different this time around. Windows 10 reaching over 16% adoption among Steam users after 5 weeks is by no definition lackluster.
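
 

On that first bullet, "supports DX12" is something you can probe directly against the runtime. A hedged sketch (Windows SDK assumed; feature level 11_0 is the minimum hardware level the D3D12 runtime accepts, and passing nullptr for the device output makes this a pure capability check rather than actual device creation):

```cpp
#include <d3d12.h>

// Returns true if the default adapter could expose a D3D12 device.
bool SupportsDX12()
{
    return SUCCEEDED(D3D12CreateDevice(nullptr,                 // default adapter
                                       D3D_FEATURE_LEVEL_11_0,  // minimum for DX12
                                       __uuidof(ID3D12Device),
                                       nullptr));               // probe only
}
```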

 

It matters to people who are getting these DX12 games a month or two from now and onwards.

 

Can't wait to see how you, yet again, bring up some facts that somehow further your narrative.

 

I base it on direct statements made by a developer actually making DX12 games, as well as statements from both AMD and NVidia. Sure, you can choose not to believe any of them, or any of the other people who have weighed in on these issues lately. It does not make my statements wrong, nor make me ignorant. All you need to do is prove me wrong with good sources/knowledge. That way I and others reading this thread learn something. But you never do. I wonder why. And you call me a troll.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Well, in light of this news I'm probably going to be buying a Skylake Xeon. It depends on how frugal I feel like being.


Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi – RAM: 4 x 16 GB G.Skill Trident Z @ 3200 MHz – GPU: ASUS Strix GeForce GTX 1080 Ti – Case: Phanteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB – PSU: EVGA 1000P2 – Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor – Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB – Mouse: G502 RGB & G Pro Wireless – Sound: Logitech z623 & AKG K240


My Q6600 just died; it does not want to wait any longer. So start preparing your 2500K, 'cause 1st gen is next, then on to Sandy D:

 

Replacing the CPU is easy; there are plenty of second-hand Sandys around. It's the motherboard that I'm worried about. Sandy/Ivy mobos are hard to find these days.


My Q6600 just died; it does not want to wait any longer. So start preparing your 2500K, 'cause 1st gen is next, then on to Sandy D:

My Dad's old beast still works great. It was a solid overclocker, and we still have the ASUS WS board with it. You want it? We might need you to pay shipping, but it comes with water blocks pre-installed for you too.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


  • Most hardware on the market and in people's computers already supports DX12.
  • The OS that DX12 is exclusive to is free for most, and the adoption rate is through the roof.
  • The current-gen consoles not only use DX12 tech (async shaders/compute), but are also x86-based, which means that porting is a lot easier to do now. This also means that PC land can enjoy the efficiency that consoles have always had (hardware efficiency has always sucked on PC).
  • Several AAA games are launching in the next 6 months, and EA is even discussing making it mandatory in about 14 months. No API has ever done that to my knowledge.

 

On the question of porting:  Does the x86 instruction set on consoles also include x86_64?


On the question of porting:  Does the x86 instruction set on consoles also include x86_64?

Yes. Does it include all the extra extensions that have come since then? Hell no.
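
 

As a rough illustration of what that means for ports: a PC build can't assume the newer extensions and has to probe for them at runtime. A small sketch using the GCC/Clang builtins (the feature list here is just an example; the console APUs' Jaguar cores have AVX, for instance, but not AVX2):

```cpp
#include <cstdio>

int main()
{
    __builtin_cpu_init();  // populate the compiler's CPU feature cache
    std::printf("sse4.2: %d\n", __builtin_cpu_supports("sse4.2"));
    std::printf("avx:    %d\n", __builtin_cpu_supports("avx"));
    std::printf("avx2:   %d\n", __builtin_cpu_supports("avx2"));
    return 0;
}
```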

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AMD's computer thing, forgot what it is called, has an i7 in it.  xD

Project Quantum*

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


My Dad's old beast still works great. It was a solid overclocker, and we still have the ASUS WS board with it. You want it? We might need you to pay shipping, but it comes with water blocks pre-installed for you too.

I might need to take you up on that offer. I just got my 780i board working again, but that thing needs retirement, so I can at least say "yeah, it works". PM me if you can, 'cause I am very interested :D

 

Replacing the CPU is easy; there are plenty of second-hand Sandys around. It's the motherboard that I'm worried about. Sandy/Ivy mobos are hard to find these days.

Yeah, 775 boards are just now starting to become a pain to find, so I am in the process of stocking up, you could say.

A shadowy flight into the dangerous world of a man who does not exist.

 

Core 4 Quad Not Extreme, only available on LGA 557 at your local Circuit City


Figured this would happen. They are too broke to get something out quickly.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


Also notice how whenever anything good happens with AMD, it's praised to the highest pedestal.

 

Nvidia releases a better performance-per-watt architecture while still beating Radeon GPUs? Fucking fanboy, why would you ever buy that shit from Nvidia kekekeke XDDDXDDDD

 

Better performance per watt? While this is true when looking at DX11 performance, it won't be true when looking at DX12 performance. The one handicap that nVIDIA needs to rectify with, I'm hoping, Pascal is the inclusion of hardware-based scheduling. Do you remember Fermi? Care to guess what Pascal would need to look like in order to compete with Greenland? Think Fermi... because that's the last time nVIDIA had hardware-side scheduling.

 

 

I think it's the idea that they're the underdogs, and people are naturally going to root for the underdog. I'm sure if the roles were switched, nVidia would inevitably be praised for their little achievements as well.

 

I wouldn't completely agree with that statement anymore, though. AMD has been criticized a lot lately and praised little, mostly because they have made some dumb decisions over the last 6 months.

 

Right, that's why nVIDIA sold 81% of dGPU graphics cards last quarter. Because people "root" for the underdog. Oh wait... /sarcasm.

 

 

People should think rationally no matter what.

 

People are thinking rationally. How are those Kepler cards doing? Do you remember the GTX 680? The GTX 780? Care to guess how well they will perform in DX12 titles compared to a Radeon HD 7970 and R9 290X?

 

Now what do you mean by thinking rationally? Do you mean paying $400 for a GPU back in Q4 2013 which will last through the DX12 titles up until Greenland/Pascal, thus granting those folks a larger return on their investment? That's my definition of thinking rationally, but I guess we've got varying opinions on that.

Or do you mean thinking rationally in terms of purchasing $650 GPUs every 6 months to a year? You know, "epeen"? Having the fastest card available at the moment? Is bragging rational behavior? If so, please argue your case. I'm all ears.

_________________________________________________________________________________________________________________________________

Now aside from the folks making absurd statements with regards to GPUs, and getting back on the CPU topic, this news does not bode well for AMD. I think that the issues with GF need to be rectified pronto. AMD Zen cannot afford to be any later than it already is.

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)


Better performance per watt? While this is true when looking at DX11 performance, it won't be true when looking at DX12 performance. The one handicap that nVIDIA needs to rectify with, I'm hoping, Pascal is the inclusion of hardware-based scheduling. Do you remember Fermi? Care to guess what Pascal would need to look like in order to compete with Greenland? Think Fermi... because that's the last time nVIDIA had hardware-side scheduling.

 

 

 

Right, that's why nVIDIA sold 81% of dGPU graphics cards last quarter. Because people "root" for the underdog. Oh wait... /sarcasm.

 

 

 

People are thinking rationally. How are those Kepler cards doing? Do you remember the GTX 680? The GTX 780? Care to guess how well they will perform in DX12 titles compared to a Radeon HD 7970 and R9 290X?

 

Now what do you mean by thinking rationally? Do you mean paying $400 for a GPU back in Q4 2013 which will last through the DX12 titles up until Greenland/Pascal, thus granting those folks a larger return on their investment? Or do you mean thinking rationally in terms of purchasing $650 GPUs every 6 months to a year?

 

Or do you mean "epeen"? Having the faster card available at the moment? Is bragging rational behavior? If so, please argue your case. I'm all ears.

_________________________________________________________________________________________________________________________________

Now aside from the folks making absurd statements with regards to GPUs, and getting back on the CPU topic, this news does not bode well for AMD. I think that the issues with GF need to be rectified pronto. AMD Zen cannot afford to be any later than it already is.

Fermi was completely overbuilt for its time. And that was on a much larger, less power- and heat-efficient node. It won't be an issue now. Pascal will just be Maxwell with the multiprecision SPs back in play, hardware scheduling, async compute, and the remainder of the DX12 bells and whistles. Nvidia stripped a lot from Maxwell to make it all fit. Putting it all back in and working on the density of their libraries is child's play. Volta is going to be a wild card, though.

 

Kepler is still the best GPGPU compute architecture in the world, at least according to actual enterprise experts.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


AMD's Project Quantum has an i7 in it.  xD

That is honestly

the funniest thing I've ever heard of AMD doing when I found it out.

It's pathetic TBH


No shit, like how long am I gonna be stuck with my FX 8320?

Until you upgrade to a proper PC gaming platform.

Show me a quote from an enterprise expert who states that Kepler is the best GPGPU architecture.

As for Pascal, if it includes all of the hardware scheduling, then it will be competing on an orange-to-orange level with Greenland: same manufacturing process, both utilizing hardware-based scheduling. Point being... the TDP between the two will likely be similar. And that was my point.

Fermi is likely the best GPGPU architecture, but only because AMD's professional software suites and optimization guides are horrible.

CUDA is far better maintained than OpenCL... For now.

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)


Forget about upgrading your internals for the next year, people. Just ignore them for a year and you'll be much happier.

 

This is the time when you should upgrade your monitor, however.

Upgrade your crappy 1080p TN shit to a 3440x1440 IPS 100hz gsync monitor.

THAT will blow your mind right out of your skull and be a far more enjoyable gaming upgrade than ANY new internal components could be.


Upgrade your crappy 1080p TN shit to a 3440x1440 IPS 100hz gsync adaptive sync monitor.

 

There are two adaptive sync technologies.

 

And one is clearly more bang for the buck than the other. And it's not G-sync.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


I have yet to find a reason to push my 3930K past stock speeds; nothing has challenged it yet. Did not even have any trouble with the 8350.

 

Maybe you guys should think about upgrading from 1080p to 4k before you start yelling for a more powerful CPU.


 4k before you start yelling for a more powerful CPU.

You can play 4K on an i3; you just need a beefy GPU.

 

If you want more framerate, that's another story.


You can play 4K on an i3; you just need a beefy GPU.

 

If you want more framerate, that's another story.

I pushed my CPU to 4.8 and got a 1 fps boost, which I'm pretty sure was just margin of error. Unless you are severely bottlenecked, you don't need to upgrade the CPU.

 

Idk what it is, but going from 1080p/1440p to 4K LOWERS CPU utilization and increases GPU utilization. One would expect both to go up, since the CPU would need to feed the GPU more, but that is not the case. I really wish someone could explain why this happens.


There are two adaptive sync technologies.

 

And one is clearly more bang for the buck than the other. And it's not G-sync.

 

I agree. And Intel jumped on adaptive sync too, so it's the future for sure.

 

Except NOW, the only "no compromise" monitor has G-sync. The best 3440x1440 adaptive sync IPS monitor only goes up to 75 Hz, which isn't good enough to be considered a HIGH REFRESH monitor. 100 Hz is good enough, tho.

 

I'm sure in 5-7 years we shall see OLEDs with an adaptive sync range of 0-200 Hz for a TRUE fps=Hz display with a mathematically beautiful variable range.

Also, OLED doesn't need overdrive on top of being able to go down to 0 Hz, so G-sync's voltage control and low-Hz frame-doubling tricks will become irrelevant.

 

 

But for the next 5 years, if you want to be SET, you want that 3440x1440 IPS 100 Hz G-SYNC monitor.

If you get the 75 Hz FreeSync version, you will always keep thinking "maybe one day I'll get to upgrade to a high-refresh version of this".


I agree. And Intel jumped on adaptive sync too, so it's the future for sure.

 

Except NOW, the only "no compromise" monitor has G-sync. The best 3440x1440 adaptive sync IPS monitor only goes up to 75 Hz, which isn't good enough to be considered a HIGH REFRESH monitor. 100 Hz is good enough, tho.

 

I'm sure in 5-7 years we shall see OLEDs with an adaptive sync range of 0-200 Hz for a TRUE fps=Hz display with a mathematically beautiful variable range.

Also, OLED doesn't need overdrive on top of being able to go down to 0 Hz, so G-sync's voltage control and low-Hz frame-doubling tricks will become irrelevant.

 

But for the next 5 years, if you want to be SET, you want that 3440x1440 IPS 100 Hz G-SYNC monitor.

If you get the 75 Hz FreeSync version, you will always keep thinking "maybe one day I'll get to upgrade to a high-refresh version of this".

 

I generally agree, but there are a few things to keep in mind:

  • The AS version can be OC'd to 85 Hz (according to Linus). That leaves a 15 Hz difference.
  • If you get the G-sync version, you are stuck with NVidia for the next 4-5 years, as you say.
  • In that time period, with Intel going AS, what do you think is most realistic: NVidia supporting AS, or AMD and/or Intel supporting G-sync?

The last point is important. Sure, it's speculation, but I cannot imagine NVidia not supporting AS 1-2 years from now. The only reason not to do so now is that their DisplayPort controller is outdated, even in the 980 Ti.

But yeah, the AS version only being 75 Hz is dumb. Those monitor controller vendors need to get their shit together.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


I pushed my CPU to 4.8 and got a 1 fps boost, which I'm pretty sure was just margin of error. Unless you are severely bottlenecked, you don't need to upgrade the CPU.

 

Idk what it is, but going from 1080p/1440p to 4K LOWERS CPU utilization and increases GPU utilization. One would expect both to go up, since the CPU would need to feed the GPU more, but that is not the case. I really wish someone could explain why this happens.

Well, it's kinda obvious CPU utilization will go down if framerate goes down.

 

If you have 20 fps, the CPU only needs to calculate the stuff on screen 20 times per second.

If you have 200 fps, it needs to do it 200 times; hence, you need a better CPU.

If you have 20 fps at 720p or 20 fps at 4K, the CPU will do the same work.
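
 

A toy model of that point, with made-up numbers: per-frame CPU cost stays roughly fixed while per-frame GPU cost grows with resolution, and the frame time is set by whichever side is slower. So at 4K the GPU gates the frame rate and the CPU sits idle more of each frame:

```cpp
#include <algorithm>
#include <cstdio>

int main()
{
    const double cpu_ms = 5.0;            // assumed per-frame CPU work
    const double gpu_ms[] = {4.0, 16.0};  // assumed render cost: ~1080p vs ~4K

    for (double g : gpu_ms) {
        double frame_ms = std::max(cpu_ms, g);  // slower side sets frame time
        double fps      = 1000.0 / frame_ms;
        double cpu_busy = 100.0 * cpu_ms / frame_ms;
        std::printf("gpu %4.1f ms -> %5.1f fps, CPU busy ~%3.0f%% of each frame\n",
                    g, fps, cpu_busy);
    }
    return 0;
}
```

With these numbers, the 1080p-ish case runs at 200 fps with the CPU pegged, while the 4K-ish case drops to 62.5 fps with the CPU busy only about a third of the time, which is exactly the utilization shift described above.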

 

 

https://youtu.be/Qimf3-UpHxQ?t=204

 

 

Consoles could do 1080p with a GPU four times as powerful, but they still would not get 60 fps. Still a 30 fps lock :D

