
-ENDED- AMD Next Horizon Gaming E3 2019 Live Thread

BigDamn
Go to solution Solved by BigDamn,

[Image: 16c.PNG]

 

$750

28 minutes ago, _Hustler_One_ said:

I guess it's because the midrange market is significantly more profitable for them than competing with Nvidia's flagship, since midrange has the largest customer base. Even Nvidia knows this, which is why they cashed in hard on this segment early with their 2060 & 2070 before AMD joined it.

That way AMD can recover budget for the next 7nm+ architecture after focusing on Zen 2.

The problem is that AMD aren't aiming for mid range, they already have products to fill that gap. 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 minute ago, mr moose said:

The problem is that AMD aren't aiming for mid range, they already have products to fill that gap. 

Yes, but they're a few generations behind midrange RTX.

My system specs:

Spoiler

CPU: Intel Core i7-8700K, 5GHz Delidded LM || CPU Cooler: Noctua NH-C14S w/ NF-A15 & NF-A14 Chromax fans in push-pull configuration || Motherboard: MSI Z370i Gaming Pro Carbon AC || RAM: Corsair Vengeance LPX DDR4 2x8GB 2666 || GPU: EVGA GTX 1060 6GB FTW2+ DT || Storage: Samsung 860 Evo M.2 SATA SSD 250GB, 2x 2.5" HDDs 1TB & 500GB || ODD: 9mm Slim DVD RW || PSU: Corsair SF600 80+ Platinum || Case: Cougar QBX + 1x Noctua NF-R8 front intake + 2x Noctua NF-F12 iPPC top exhaust + Cougar stock 92mm DC fan rear exhaust || Monitor: ASUS VG248QE || Keyboard: Ducky One 2 Mini Cherry MX Red || Mouse: Logitech G703 || Audio: Corsair HS70 Wireless || Other: Xbox One S Controller

My build logs:

 


2 minutes ago, _Hustler_One_ said:

Yes, but they're a few generations behind midrange RTX.

Not just from RTX; it's literally the best they can do at the moment across the board. This is not AMD aiming for mediocre; they've got that down pat. 



2 hours ago, VegetableStu said:

so no one should innovate when it "hurts fps"?

 

Consider that it used to take anywhere from minutes to days to render a single frame of a raytraced scene.

Yeah, people shouldn't coerce reviewers into saying things like "just buy it, it works" and completely leave out the part where it hurts FPS. 

 

Innovation is fine. Misinformation and poor, rushed implementations for the sake of hurting your competition are not. 


Navi is interesting.  Seeing what wattage the cards actually pull while gaming on reasonable cooling (they say this is optimal; I doubt it, but we'll see once reviews hit) could also help their case vs NV if they can do partial load more efficiently.  Comments from Gamers Nexus about not being able to power mod and the like make me think AMD is purposely limiting these cards, and this architecture, for launch…going for just about NV performance at just under NV pricing.  Next year's consoles using AMD hardware have announced they will include real-time ray tracing, so I would expect that in the next gen, which is where AMD goes for a knockout punch after using this round of cards to recover some R&D money (which, if they can sell several of these, should be significant with the smaller die size).

 

All in all, I'm happy to see AMD get a handle on GPU power consumption, even if it doesn't beat NV, and to see it hopefully put at least a slight bit of pressure on NV to lower prices, even if just by 50 bucks.  Also quite happy with what came out for Ryzen, but was expecting lower pricing to make the value better.  That being said, these are all announced MSRP prices, and actual prices usually are a fair bit under that shortly after launch for just about every product that isn't in way too short supply.


Just now, ThotChopsticks said:

Yeah, people shouldn't coerce reviewers into saying things like "just buy it, it works" and completely leave out the part where it hurts FPS. 

 

Innovation is fine. Misinformation and poor, rushed implementations for the sake of hurting your competition are not. 

You're assuming the motive is only to hurt the competition, and at the expense of the consumer at that.  Companies might be ruthless when it comes to their priorities on money, but they aren't stupid.  Marketing is one thing; improving technology and developing new, better ways to do things is another.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


21 minutes ago, mr moose said:

Not just from RTX; it's literally the best they can do at the moment across the board. This is not AMD aiming for mediocre; they've got that down pat. 

At least they can offer better value in that segment for many people like me, an average budget consumer. The RX 5700 looks like a good GPU for me. Good value for its overall specs & performance.

 

The RX 5700 has a better power-efficiency-to-performance ratio than AMD's previous midrange GPUs. I'm not concerned about the electricity bill so much as the heat output of the previous gen.



6 minutes ago, _Hustler_One_ said:

At least they can offer better value in that segment for many people like me, an average budget consumer. The RX 5700 looks like a good GPU for me. Good value for its overall specs & performance.

 

The RX 5700 has a better power-efficiency-to-performance ratio than the previous midrange GPUs. I'm not concerned about the electricity bill so much as the heat output of the previous gen.

 

AMD have been very good at offering reasonable prices.  I don't care about power either; I used to play devil's advocate when power consumption was used out of context to defend something, but the reality is that these days you don't need to buy a new PSU just to upgrade the GPU. 



Let's hope the nvidia discounts are true then. 

Specs: Motherboard: Asus X470-PLUS TUF gaming (Yes I know it's poor but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200Mhz CL16-18-18-36 2x8GB

            CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM

            Drives: Samsung 970 EVO plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 ti Black edition


41 minutes ago, williamcll said:

Let's hope the nvidia discounts are true then. 

The cheapest 2070 at the time this post was created is the Asus Dual, at $450 from Newegg; that's after the $50 instant savings and $30 mail-in rebate. The original price is $530.
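A quick sketch of the rebate math quoted above, using the figures from the post (the `street_price` helper name is just for illustration):

```python
def street_price(msrp, instant_savings=0, mail_in_rebate=0):
    """Effective price after an instant discount and a mail-in rebate."""
    return msrp - instant_savings - mail_in_rebate

# Asus Dual RTX 2070 figures quoted in the post
print(street_price(530, instant_savings=50, mail_in_rebate=30))  # → 450
```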

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


2 hours ago, mr moose said:

You're assuming the motive is only to hurt the competition, and at the expense of the consumer at that.  Companies might be ruthless when it comes to their priorities on money, but they aren't stupid.  Marketing is one thing; improving technology and developing new, better ways to do things is another.

You're assuming that Nvidia maintains any real motive to "improve" GPU technology beyond keeping ahead of the competition. That's not where their business is at right now. 


5 minutes ago, ThotChopsticks said:

You're assuming that Nvidia maintains any real motive to "improve" GPU technology beyond keeping ahead of the competition. That's not where their business is at right now. 

They have enough motive from the enterprise sector; that's where the primary architecture development is focused, and the gaming parts are just cut down from those. If you stop innovating and seeking extra performance, you will fall behind, and that's not something you can just spin back up when you're caught off guard. You'd be in real trouble if you weren't developing new architectures, because there would be at least a two-year cycle to get anything new out while the competitor you discounted is out in front taking your business.

 

You also wouldn't sit on anything new for any length of time, because the cost is so high that you need a return on the investment.


1 hour ago, NumLock21 said:

The cheapest 2070 at the time this post was created is the Asus Dual, at $450 from Newegg; that's after the $50 instant savings and $30 mail-in rebate. The original price is $530.

I meant the 2070 discounts in response to the 2070 Super, but that's a decent price too.



5 hours ago, cj09beira said:

lol, we have known that navi is mid range for such a long time 

I'm just saying, they could do both.

 

4 hours ago, _Hustler_One_ said:

I guess it's because the midrange market is significantly more profitable for them than competing with Nvidia's flagship, since midrange has the largest customer base. Even Nvidia knows this, which is why they cashed in hard on this area early with their 2060 (with only 6GB VRAM) & 2070 (with awful value) before AMD joined the segment with their own new generation.

That way AMD can recover budget for the next 7nm+ architecture after focusing on Zen 2.

They've been going with that mindset for a while now.

 

It doesn't seem to be working, as they still aren't dominating. I'm sure people will just blame Nvidia for this. Nvidia isn't without fault, but AMD's mindset seems to be the bigger problem to me.

 

They give Nvidia too much control with this mentality. All Nvidia has to do is drop prices a bit and AMD gets shut out.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


50 minutes ago, leadeater said:

They have enough motive from the enterprise sector; that's where the primary architecture development is focused, and the gaming parts are just cut down from those. If you stop innovating and seeking extra performance, you will fall behind, and that's not something you can just spin back up when you're caught off guard. You'd be in real trouble if you weren't developing new architectures, because there would be at least a two-year cycle to get anything new out while the competitor you discounted is out in front taking your business.

 

You also wouldn't sit on anything new for any length of time, because the cost is so high that you need a return on the investment.

I'd like for that to be the case, but your words tell a different story from the moderate performance bumps and colossal price hikes we've been seeing. Enterprise and gaming have just as many differences as they have similarities, if not more. What works for enterprise does not always work for gaming.

 

Looking purely at economics and finance, it doesn't seem like gaming makes up an overly significant portion of Nvidia's profits. They hardly have any need to push themselves except to keep the competition grounded.


1 hour ago, ThotChopsticks said:

You're assuming that Nvidia maintains any real motive to "improve" GPU technology beyond keeping ahead of the competition. That's not where their business is at right now. 

 

Business 101, innovate or stagnate.

 

 

All you see is the tip of the iceberg, the minor domestic market and forum machinations. 

 

 



46 minutes ago, ThotChopsticks said:

I'd like for that to be the case, but your words tell a different story from the moderate performance bumps and colossal price hikes we've been seeing.

The key words being 'cut down', alongside price adjustments.

 

Just because they have innovated and developed a good architecture (which Turing is) does not mean they're going to give it to you cheap if they don't have to while the competition is lagging. They can play with pricing at their leisure (see the reduction in MSRP for Turing cards), moving the price tiers up (the release of RTX) and down (the price cut to compete with Navi) as needed.

 

Quote

Enterprise and gaming have just as many differences as they have similarities, if not more. Working for enterprise does not always work for gaming.

That applies just as much to AMD: Radeon cards (such as their flagship MI50/MI60 compute cards) are great at compute/datacenter workloads, but not quite so great at gaming (in terms of cost).

 

46 minutes ago, ThotChopsticks said:

Looking at purely economics and finance, it doesn't seem like gaming makes up an overly significant portion of profits for Nvidia. They hardly have any need to push themselves except for to keep the competition grounded.

Gaming makes up the largest portion (although not the fastest growing) of Nvidia's revenue portfolio. They will milk it for all it's worth so long as they are able to.


So, let's talk about the actually important thing: PBO on Zen 2. 

I saw a slide where it shows it can give up to a 200 MHz boost; my questions are:
1. On how many cores?
2. Will it work on older motherboards?
3. Why is there such a difference in PBO gains between the 3600 and 3600X? Could it mean the stock cooler isn't enough to keep higher-clocked Zen 2 chips boosting past their regular boost clocks?
4. How would it affect overclocking headroom? I fully understand it is too early to say, but would it imply that you can overclock all cores above the boost clocks?

Ex-EX build: Liquidfy C+... R.I.P.

Ex-build:

Meshify C – sold

Ryzen 5 1600x @4.0 GHz/1.4V – sold

Gigabyte X370 Aorus Gaming K7 – sold

Corsair Vengeance LPX 2x8 GB @3200 Mhz – sold

Alpenfoehn Brocken 3 Black Edition – it's somewhere

Sapphire Vega 56 Pulse – ded

Intel SSD 660p 1TB – sold

be Quiet! Straight Power 11 750w – sold


34 minutes ago, Trik'Stari said:

I'm just saying, they could do both.

 

All the rumours, leaks, and god knows what else have indicated they can't. Big Navi just isn't ready. My bet is they decided they didn't want to release a card with a 350W+ TDP and are busy working on getting the power efficiency up some more in next-gen RDNA, and we'll see it and some lower-end stuff launch as the 6xxx series.

 

The thing to remember is AMD's graphics division went through a real clusterfuck of a bad time. They put out Polaris and then started working on Vega; when that came out, a combination of factors meant it didn't really do what they wanted in the gaming space (datacenter loves Vega, but that's a different story). They then apparently had a bit of internal scuffling where they started working on a next-gen Vega before switching to what is now RDNA. They've fundamentally gone through two major architecture redesign schedules back to back. That basically tore the hell out of their timing, and they're now a full generation behind development-wise.

 

29 minutes ago, ThotChopsticks said:

I'd like for that to be the case, but your words tell a different story from the moderate performance bumps and colossal price hikes we've been seeing. Enterprise and gaming have just as many differences as they have similarities, if not more. What works for enterprise does not always work for gaming.

 

Looking purely at economics and finance, it doesn't seem like gaming makes up an overly significant portion of Nvidia's profits. They hardly have any need to push themselves except to keep the competition grounded.

 

Currently, the only hardware differences between an RTX Titan (the best "gaming" card) and a top-of-the-line Quadro (their professional card lineup) are that the Quadro doesn't have its FP64 compute performance limited, the Quadro uses larger memory chips to pack a bigger VRAM buffer, and the memory is ECC type (I believe). 


I'll say this... I'm not extremely disappointed. At best I'd wish for better pricing.  But the 5700 and 5700XT compete with the GPUs they're trying to compete against.

The 5700XT is even $50 cheaper.


I've seen some people (not necessarily on here) go, "It performs the same, just buy a 2070 for $50 more!"  But... why?  If they perform the same, why wouldn't I save the $50?  That'd basically be almost the price of a new game.  I could even get more than one game, assuming they're on sale.  Or that's $50 I could put towards another part.

 

As for blower cards... I've never had an issue with blower cards.  There are still many cases with shit airflow that people buy for looks, even among mid-tier to higher-end cases.  A blower-cooled card would handle those cases better.  

 

CPUs, I'm not disappointed at all, except maybe at the 3950X's pricing... but the rest of the Zen 2 stack is solid AF, and I'm excited.  Though I do wish they'd announce new APUs and new Ryzen 3 CPUs on the Zen 2 architecture.  All current Ryzen 3 processors are still first-gen Zen.

Currently focusing on my video game collection.

It doesn't matter what you play games on, just play good games you enjoy.

 


I dislike the notion that the highest-end CPU has to have the highest boost clock too. For me the 3950X is a chip that just proves they can; an 8/16 should probably do best in gaming at the moment, and the workloads that justify going over 16 threads probably justify the price too. And you know, if the chips don't sell, the prices will drop; if the GPUs don't sell, the prices will drop too. GDDR6 and the small chip should mean nice margins.

 

If the CU count can really go up to 64, we could see double the performance next gen if Navi's really something between Polaris and RDNA (60% more CUs plus some big improvements, like Zen -> Zen 2, if they want to go for it; probably a 400W GPU though). Still, I'd like a better choice under $200.
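The CU math behind that guess can be sanity-checked with a small sketch. The numbers come from the post (40 CUs now, a rumoured 64-CU ceiling); real-world scaling would be sub-linear, and the variable names are my own:

```python
# Numbers from the post: 40 CUs today, rumoured 64-CU ceiling.
current_cus = 40
max_cus = 64

cu_increase = max_cus / current_cus - 1  # fractional CU growth
print(f"CU increase: {cu_increase:.0%}")  # → CU increase: 60%

# Doubling performance from a 60% CU bump would still need ~25% more
# per-CU throughput (clocks and/or IPC) on top of the extra CUs.
needed_per_cu_gain = 2.0 / (max_cus / current_cus) - 1
print(f"Per-CU gain still needed for 2x: {needed_per_cu_gain:.0%}")  # → 25%
```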


2 hours ago, ThotChopsticks said:

I'd like for that to be the case, but your words tell a different story from the moderate performance bumps and colossal price hikes we've been seeing. Enterprise and gaming have just as many differences as they have similarities, if not more. What works for enterprise does not always work for gaming.

 

Looking purely at economics and finance, it doesn't seem like gaming makes up an overly significant portion of Nvidia's profits. They hardly have any need to push themselves except to keep the competition grounded.

Nvidia is only releasing what it needs to in the gaming market; architecture development is happening, and performance is increasing as much and as fast as it can. Not every generation is going to be a large leap over the last; that's unrealistic and has never actually been a thing.

 

But the insinuation I was replying to was that Nvidia has no motive to improve, which is quite incorrect. Nobody ever replaces like for like in the enterprise; the perpetual march toward more, better, faster, and more efficient always has been and always will be a thing.

 

If you go way back to the very roots of Nvidia's dedicated GPUs, you can see that each generation isn't always a great leap from the last. There have been leaps and there will be leaps to come, but expecting GeForce 900 to GeForce 10 series every time is extremely unrealistic.

 

As for gaming, that is over 50% of Nvidia's revenue and market; all their other business sectors are much smaller. However, the extreme demands for performance and efficiency from the enterprise sector are also mutually beneficial to the gaming sector. If you want to claim Nvidia is holding back, I would invite you to look at the best Tesla cards Nvidia has and show any performance increase there that is not equally found in gaming. If they can't improve performance by 40% on Tesla, there is no way it's happening on gaming cards.

 

Pricing sucks, and it has for a long time. We can debate that all day, but the issue ends when equivalent-performance cards from AMD are not any cheaper, at least not by the amounts people wish were a thing. GPU development costs are extremely high.


45 minutes ago, leadeater said:

Pricing sucks, and it has for a long time. We can debate that all day, but the issue ends when equivalent-performance cards from AMD are not any cheaper, at least not by the amounts people wish were a thing. GPU development costs are extremely high.

By the look of it, Nvidia dropped an extra $300M+ into Turing/Volta.  I just can't see them doing that unnecessarily if they were in a position where they could sit on their hands and still roll out something better. 



4 hours ago, Trik'Stari said:

I'm just saying, they could do both.

 

They've been going with that mindset for a while now.

 

It doesn't seem to be working, as they still aren't dominating. I'm sure people will just blame Nvidia for this. Nvidia isn't without fault, but AMD's mindset seems to be the bigger problem to me.

 

They give Nvidia too much control with this mentality. All Nvidia has to do is drop prices a bit and AMD gets shut out.

AMD probably has very good margins on these cards. The 40-CU die is the low-end one, probably made for the $250-300 range, but it gained a good amount of IPC so they bumped it up. Lowering prices won't be a problem for them, and Nvidia loses a lot more when lowering prices than AMD does (due to volume).

4 hours ago, Quadriplegic said:

So, let's talk about the actually important thing: PBO on Zen 2. 

I saw a slide where it shows it can give up to a 200 MHz boost; my questions are:
1. On how many cores?
2. Will it work on older motherboards?
3. Why is there such a difference in PBO gains between the 3600 and 3600X? Could it mean the stock cooler isn't enough to keep higher-clocked Zen 2 chips boosting past their regular boost clocks?
4. How would it affect overclocking headroom? I fully understand it is too early to say, but would it imply that you can overclock all cores above the boost clocks?

Binning; also, non-X chips used to not have XFR, so there might be some limiting happening?


1 hour ago, mr moose said:

By the look of it, Nvidia dropped an extra $300M+ into Turing/Volta.  I just can't see them doing that unnecessarily if they were in a position where they could sit on their hands and still roll out something better. 

It was needed for the AI side of things, not the consumer side, especially since, all else being equal, AMD was just better at compute than them. But because they had a bunch of chips not suitable for the business side of things, they foisted them off as the latest and greatest gaming tech, even though, in order to work properly and have everything actually ray traced in games, it needs to be at least 10 times faster than it currently is. As is, it makes little sense for developers to sink significant time and money into these systems, and even with the new consoles, any ray-tracing APIs will be optimized for the MS spec, not Nvidia's RTX.

