
-ENDED- AMD Next Horizon Gaming E3 2019 Live Thread

Solved by BigDamn:

[screenshot: the 16-core slide]

$750

15 minutes ago, BigDamn said:

While I don't disagree, I'll say I'm disappointed that AMD isn't giving me a reason to switch back. I too prefer AMD cards, mostly due to software, and would love to sell my 1080 and grab one of these. Unfortunately there's nothing here to reel me back in. I remember switching from my FX 8320 to an i7 4790k because AMD had nothing that met my needs. After Ryzen released I sold my i7 for $300 and grabbed the 1700 for $300 and it DWARFED my i7, while using less power and producing far far less heat. Would love to see AMD do this in the graphics market.

Yup, Ryzen is great. I started with a 2600X build in November, then sold most of my parts and got the 9700K and a 2080 FTW3 (tried a used 1080 Ti first). I ended up back at Ryzen because I was counting on Zen 2 to pull through at a better price. Plus I wanted to water cool everything, and the Ryzen was cheaper for me at the time. Now I'm thinking the 3800X or 3900X when they're released.

 

I also wish they were better in the GPU market. I think this new release will help push them in the right direction. It takes a ton of money to play with the big boys (Nvidia/Intel), and most of their budget, I'm sure, went toward Ryzen. But I don't have much to complain about, honestly. I overpaid for the VII (talking about performance, not going rate), but I have one of the higher-binned ones and it's on water, so not really a bad deal compared to my previous 2080. I get great performance in the games I play, so I'm happy lol.

Ryzen 3800X + MEG ACE w/ Radeon VII + 3733 c14 Trident Z RGB in a Custom Loop powered by Seasonic Prime Ultra Titanium
PSU Tier List | Motherboard Tier List | My Build


16C seems cool. Not something I would ever need for a gaming PC, but cool.

AMD Ryzen 7 3700X | Thermalright Le Grand Macho RT | ASUS ROG Strix X470-F | 16GB G.Skill Trident Z RGB @3400MHz | EVGA RTX 2080S XC Ultra | EVGA GQ 650 | HP EX920 1TB / Crucial MX500 500GB / Samsung Spinpoint 1TB | Cooler Master H500M


So, they once again match performance, a year later, but lack RT.

Given they usually don't show real numbers, we can knock off about 10%, at which point they're no longer matching performance either.

 

How could they screw up 7nm so badly?

I mean... barely matching 12nm GPUs? While STILL using more power? WTF!


Wow, that 5700/XT pricing is ballsy! Guess they really don't want to sell that many...


30 minutes ago, Taf the Ghost said:

Unreal's limited AMD optimizations in the PC branch cause more benchmarking issues than anything else for AMD. Unity is a close second, so it's a really good idea to make Unity work really well on AMD. Money gets that done.

Thing is, I play zero games on UE4, while I have 1k+ hours on Rust.

21 minutes ago, VegetableStu said:

So no one should innovate when it "hurts FPS"?

 

Consider that it used to take anywhere from minutes to days to render a single raytraced frame of a scene.

No, but pushing a tech before it's ready and raising the price quite a bit because of it is not how you do it. They should have waited at least until 7nm to bring it to consumers; that would have made much more sense. The current cards will never be good at it, so why pay for it? And it's not like they needed the time for devs to adopt it, as raytracing is much easier to do than normal rasterization in terms of what devs need to do to make it work.
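To be fair on the "easier for devs" claim: the per-hit math really is tiny. Here's a toy ray-sphere intersection in plain C, my own sketch rather than DXR or any engine's actual API, just to show how little code the core test takes:

```c
/* Toy ray-sphere intersection: roughly all the per-primitive math a
   raytracer needs, versus a whole rasterization pipeline of transforms,
   clipping, depth buffers, and shading state. Build: cc ray.c -lm */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Distance along the (normalized) ray to the sphere, or -1 on a miss. */
static double ray_sphere(vec3 origin, vec3 dir, vec3 center, double radius) {
    vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;   /* quadratic discriminant; a == 1 */
    if (disc < 0.0) return -1.0;     /* no real root: the ray misses   */
    return (-b - sqrt(disc)) / 2.0;  /* nearest of the two hit points  */
}

int main(void) {
    vec3 eye = {0, 0, 0}, dir = {0, 0, 1}, center = {0, 0, 5};
    printf("hit at t = %f\n", ray_sphere(eye, dir, center, 1.0)); /* 4.0 */
    return 0;
}
```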



Has AMD mentioned card TDP/power draw?

Given that these only have 40 CUs and RDNA supports 60 CUs (64?)... I'm looking forward to seeing what a full-fat 300W monster can do.
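Back-of-the-envelope on that, assuming the usual 64 shaders per CU and 2 FLOPs per shader per clock (FMA); the clock speeds in this little C sketch are my own placeholders, not announced specs:

```c
/* Napkin math: theoretical FP32 throughput = CUs * 64 shaders * 2 FLOPs
   (FMA) * clock. Clock figures below are placeholders, not official. */
#include <stdio.h>

static double tflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz / 1000.0;  /* GFLOPS -> TFLOPS */
}

int main(void) {
    printf("40 CU @ 1.9 GHz: %.1f TFLOPS\n", tflops(40, 1.9)); /* ~9.7  */
    printf("64 CU @ 1.8 GHz: %.1f TFLOPS\n", tflops(64, 1.8)); /* ~14.7 */
    return 0;
}
```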


Great CPU lineup; I'm definitely upgrading my CPU next month. But the GPUs were extremely underwhelming; I was hoping for at least feature parity.
I wouldn't touch those GPUs with a ten-foot pole without RT, as we already know the new consoles will have it.
The GPUs are guaranteed to be behind the consoles by next year, which could have really bad implications for future performance.

The pricing is also not very competitive, as $50 less at those price points is nothing; cooler design alone accounts for $50-120 price differences.
So an Asus RX 5700 will likely cost the same as a Gigabyte or Zotac RTX 2070.

RTX2070OC 


Goddamnit AMD.

 

PUNCH UP. NOT DOWN.

 

I don't have a need for something that only beats a 2060 in price to performance. Goddamnit don't you bastards make me buy another fucking Nvidia card because you're too stupid to get out of the cheap seats and offer something actually cutting edge.

 

Give us a 5900 XT that dominates over the 2080, or piss the fuck off.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


3 minutes ago, Trik'Stari said:

Goddamnit AMD.

 

PUNCH UP. NOT DOWN.

 

I don't have a need for something that only beats a 2060 in price to performance. Goddamnit don't you bastards make me buy another fucking Nvidia card because you're too stupid to get out of the cheap seats and offer something actually cutting edge.

 

Give us a 5900 XT that dominates over the 2080, or piss the fuck off.

lol, we've known that Navi is midrange for such a long time


Interesting bits. AMD will bring raytracing with the next iteration. Makes sense; it just doesn't make sense for gaming right now. People will of course scream and shout for it anyway. Not all that different from PCIe 4.0 in that regard: it's just not useful for gaming presently. Just to throw some extra salt on: by the time raytracing makes sense, a Turing card will be irrelevant, so you're not 'future-proofing' by buying one.

 

No variable rate shading either, for now. But get this: primitive shaders are back with a vengeance, and this time they actually work (note: apparently they also worked on Vega, but didn't provide any tangible performance benefit, so they were disabled). Rapid packed math is also here.
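For anyone unfamiliar, rapid packed math means packing two FP16 values into each 32-bit register so the ALUs can retire two half-precision ops per clock instead of one. A rough feel for the packing idea, sketched with integer lanes in plain C (my own toy illustration, nothing GPU-specific):

```c
/* SWAR sketch: two 16-bit adds done in one 32-bit operation, the same
   general idea behind rapid packed math (2x FP16 per FP32 ALU), shown
   with integers so it stays simple and portable. */
#include <stdint.h>
#include <stdio.h>

static uint32_t packed_add16(uint32_t a, uint32_t b) {
    /* add the low and high 16-bit lanes with no carry between lanes */
    uint32_t low  = (a & 0xFFFFu) + (b & 0xFFFFu);
    uint32_t high = (a >> 16) + (b >> 16);
    return (high << 16) | (low & 0xFFFFu);
}

int main(void) {
    uint32_t a = (7u << 16) | 3u;     /* lanes: {7, 3}   */
    uint32_t b = (5u << 16) | 9u;     /* lanes: {5, 9}   */
    uint32_t r = packed_add16(a, b);  /* lanes: {12, 12} */
    printf("high=%u low=%u\n", r >> 16, r & 0xFFFFu);
    return 0;
}
```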


RIP Intel, lol

 

Can't wait to see the third-party benchmarks for the 3700X; that's the part I'm most interested in.

Dell S2721DGF - RTX 3070 XC3 - i5 12600K


Just now, System32.exe said:

RIP Intel, lol

 

Can't wait to see the third-party benchmarks for the 3700X; that's the part I'm most interested in.

Yes, I am very interested in how it overclocks relative to the 3800X.

AMD Ryzen 7 3700X | Thermalright Le Grand Macho RT | ASUS ROG Strix X470-F | 16GB G.Skill Trident Z RGB @3400MHz | EVGA RTX 2080S XC Ultra | EVGA GQ 650 | HP EX920 1TB / Crucial MX500 500GB / Samsung Spinpoint 1TB | Cooler Master H500M


$750 is too much for the 16-core.

When I made this prediction almost a full year ago, "economical" meant 1800X pricing, i.e. $500. After everyone overhyped the pricing leaks, I figured AMD might take liberties and go to $600.

I am not paying $750 for 16 cores on AM4; I'll wait for Threadripper and take the extra PCIe lanes and lower heat density of whatever 16-core Zen 2 TR they release. Lisa, you really let me down by trying to play Intel's game so early. You could have at least waited until you had proven yourself.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


The new branch predictor was supposed to come with Zen 3, but the engineers worked hard and got it into Zen 2 early.

They claim around 30% fewer misses.
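For anyone wondering why fewer misses matter so much, here's the textbook demo in C (my own toy benchmark, nothing AMD-specific): the same loop over the same data runs far faster once the branch becomes predictable, because the core stops throwing away mispredicted work. Build with light optimization (e.g. -O1), since at higher levels the compiler may replace the branch with a conditional move and hide the effect.

```c
/* Toy benchmark: the same branchy loop over random vs. sorted data.
   Random data makes the branch ~50% unpredictable; sorting makes it
   almost perfectly predictable, and the loop speeds up accordingly. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 10000000

static long sum_big(const int *a, int n) {
    long sum = 0;
    for (int i = 0; i < n; i++)
        if (a[i] >= 128)   /* the branch the predictor has to guess */
            sum += a[i];
    return sum;
}

static int cmp(const void *x, const void *y) {
    return *(const int *)x - *(const int *)y;
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    for (int i = 0; i < N; i++) a[i] = rand() % 256;

    clock_t t0 = clock();
    long s1 = sum_big(a, N);   /* unpredictable branch */
    clock_t t1 = clock();

    qsort(a, N, sizeof *a, cmp);

    clock_t t2 = clock();
    long s2 = sum_big(a, N);   /* predictable branch */
    clock_t t3 = clock();

    printf("unsorted: sum=%ld, %.0f ms\n", s1, 1000.0 * (t1 - t0) / CLOCKS_PER_SEC);
    printf("sorted:   sum=%ld, %.0f ms\n", s2, 1000.0 * (t3 - t2) / CLOCKS_PER_SEC);
    free(a);
    return 0;
}
```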


1 hour ago, TetraSky said:

Come on now, that's basically standard for reference cards. It's only this generation that Nvidia decided to add a second fan.

 

Not to mention, they have their places in plenty of systems.

Not exactly a good reason to keep doing something that is known to not be that good and has a legitimate list of negative aspects. It doesn't matter that an open-air cooler costs slightly more; the continued reputation damage those coolers cause (warranted or not) is a much higher cost.

 

Good AIB cards have never been hot or loud, but that's the overarching reputation AMD cards have, because of the reference blower cards. It won't be an issue if AIB open-air coolers come out at the same time; then reviews will use those.


1 minute ago, leadeater said:

Not exactly a good reason to keep doing something that is known to not be that good and has a legitimate list of negative aspects. It doesn't matter that an open-air cooler costs slightly more; the continued reputation damage those coolers cause (warranted or not) is a much higher cost.

 

Good AIB cards have never been hot or loud, but that's the overarching reputation AMD cards have, because of the reference blower cards. It won't be an issue if AIB open-air coolers come out at the same time; then reviews will use those.

You should know something is wrong with your product/PR when the internet is making turbo engine jokes at your expense.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Just now, mr moose said:

You should know something is wrong with your product/PR when the internet is making turbo engine jokes at your expense.

 

 

Poor 290X, it deserved better (mining didn't help either).


1 minute ago, ONOTech said:

Navi has a small die compared to previous generations (251 mm², right?). Will there be a "fatter" iteration?

Rumors talked about a 52-56 CU die, which is what I was expecting to go against the 2070, so I guess it must be coming later on.


1 hour ago, COTG said:

 

[screenshot]

Better performance and more VRAM than the 2060 at around the same price. I would definitely get this over a 2060, coming from my current 1060 6GB. Hopefully my patience waiting for this will pay off.

My system specs:


CPU: Intel Core i7-8700K, 5GHz Delidded LM || CPU Cooler: Noctua NH-C14S w/ NF-A15 & NF-A14 Chromax fans in push-pull configuration || Motherboard: MSI Z370i Gaming Pro Carbon AC || RAM: Corsair Vengeance LPX DDR4 2x8Gb 2666 || GPU: EVGA GTX 1060 6Gb FTW2+ DT || Storage: Samsung 860 Evo M.2 SATA SSD 250Gb, 2x 2.5" HDDs 1Tb & 500Gb || ODD: 9mm Slim DVD RW || PSU: Corsair SF600 80+ Platinum || Case: Cougar QBX + 1x Noctua NF-R8 front intake + 2x Noctua NF-F12 iPPC top exhaust + Cougar stock 92mm DC fan rear exhaust || Monitor: ASUS VG248QE || Keyboard: Ducky One 2 Mini Cherry MX Red || Mouse: Logitech G703 || Audio: Corsair HS70 Wireless || Other: Xbox One S Controller

My build logs:

 


49 minutes ago, cj09beira said:

Thing is, I play zero games on UE4, while I have 1k+ hours on Rust.

No, but pushing a tech before it's ready and raising the price quite a bit because of it is not how you do it. They should have waited at least until 7nm to bring it to consumers; that would have made much more sense. The current cards will never be good at it, so why pay for it? And it's not like they needed the time for devs to adopt it, as raytracing is much easier to do than normal rasterization in terms of what devs need to do to make it work.

 

You do realise that if they'd waited for 7nm, we'd still have all the issues we have now. Games get more intensive and push the hardware harder with every generation, hence why there are better-performing GPUs every year. Real-time RT was always going to come in and really hammer frame rates, both because it's a new effect and because, until everyone actually puts it to real-world use and figures out how to optimize it at the software level (and until the software side shakes out far enough that the hardware people can start tweaking their designs to better support how it actually plays out), there are going to be significant implementation-level issues.

 

And none of that's going to get worked out and worked through until it's actually put out there and starts being used.

 

Also, don't buy into NVIDIA's marketing BS. There's nothing about the actual hardware that made this the optimal time to do ray tracing. It's all about the convergence of the hybrid rendering approach with the ever-increasing processing cost of the existing methods we have for implementing lighting, shadows, and reflection effects. We're not at the point where the average of all of those is as resource-intensive as hybrid ray tracing, but we're rapidly approaching it. If they keep trying to push up image quality via other methods, they'll very quickly hit the point where it actually runs slower than it would with (properly optimized) hybrid ray tracing. In effect, they've gone as far as they can with existing development paths without hurting performance even more than hybrid ray tracing does.

 

16 minutes ago, S w a t s o n said:

$750 is too much for the 16-core.

When I made this prediction almost a full year ago, "economical" meant 1800X pricing, i.e. $500. After everyone overhyped the pricing leaks, I figured AMD might take liberties and go to $600.

I am not paying $750 for 16 cores on AM4; I'll wait for Threadripper and take the extra PCIe lanes and lower heat density of whatever 16-core Zen 2 TR they release. Lisa, you really let me down by trying to play Intel's game so early. You could have at least waited until you had proven yourself.

 

I doubt there will be a 16-core TR3. They'd be competing against themselves, so I don't see it happening. That said, I don't disagree that the price isn't compelling; that's why they didn't announce it and won't launch it alongside the 12-core. If this were the halo product at launch, it would neg-bomb the whole line's rep, because no matter how good it is in productivity, outside of that it's going to be barely any better than the 12-core, which is apparently itself just barely better than the 9900K, a good $150+ cheaper. You'd be inviting a 1080 Ti vs RTX 2080 situation, and everyone knows what a marketing disaster that's been for NVIDIA.


Bitwit just did a livestream for the E3 coverage.
 


 

He ended the stream without talking about the 3950X.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


2 hours ago, CarlBar said:

I doubt there will be a 16-core TR3. They'd be competing against themselves, so I don't see it happening. That said, I don't disagree that the price isn't compelling; that's why they didn't announce it and won't launch it alongside the 12-core. If this were the halo product at launch, it would neg-bomb the whole line's rep, because no matter how good it is in productivity, outside of that it's going to be barely any better than the 12-core, which is apparently itself just barely better than the 9900K, a good $150+ cheaper. You'd be inviting a 1080 Ti vs RTX 2080 situation, and everyone knows what a marketing disaster that's been for NVIDIA.

An 8-core TR existed and competed with the 1800X; I expect the same this time around. Even if it's $799 instead of $750, I'll pony up the extra $50 for the PCIe lanes and other shit, and buy the more expensive mobo (though maybe not, considering X570 pricing). Even at $899 I would still consider it over the 16-core Ryzen 3000. Also, the 16-core is coming at launch, I'm pretty sure? September, I guess?

Edit: I don't want to encourage them, actually. I won't be spending more than about $799 MSRP on a new TR, so let's see what happens.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


If the 40 CU 5700 XT is on average 10% faster than the 2070 (grain of salt)... I wonder what AMD could have done had they offered a 60-64 CU variant. I was hoping AMD would take a shot at high-end Navi, but it looks like the Radeon VII is still the king for the time being.


1 hour ago, Trik'Stari said:

Goddamnit AMD.

 

PUNCH UP. NOT DOWN.

 

I don't have a need for something that only beats a 2060 in price to performance. Goddamnit don't you bastards make me buy another fucking Nvidia card because you're too stupid to get out of the cheap seats and offer something actually cutting edge.

 

Give us a 5900 XT that dominates over the 2080, or piss the fuck off.

I guess because the midrange market is significantly more profitable for them than competing with Nvidia's flagship, given that midrange has by far the largest user base. Even Nvidia knows this segment matters, which is why they milked it hard earlier with the 2060 (with only 6GB of VRAM) and the 2070 (with awful value) before AMD joined the segment with their own new generation.

So AMD can recover budget for the next 7nm+ architecture after focusing on Zen 2.

My system specs:


CPU: Intel Core i7-8700K, 5GHz Delidded LM || CPU Cooler: Noctua NH-C14S w/ NF-A15 & NF-A14 Chromax fans in push-pull configuration || Motherboard: MSI Z370i Gaming Pro Carbon AC || RAM: Corsair Vengeance LPX DDR4 2x8Gb 2666 || GPU: EVGA GTX 1060 6Gb FTW2+ DT || Storage: Samsung 860 Evo M.2 SATA SSD 250Gb, 2x 2.5" HDDs 1Tb & 500Gb || ODD: 9mm Slim DVD RW || PSU: Corsair SF600 80+ Platinum || Case: Cougar QBX + 1x Noctua NF-R8 front intake + 2x Noctua NF-F12 iPPC top exhaust + Cougar stock 92mm DC fan rear exhaust || Monitor: ASUS VG248QE || Keyboard: Ducky One 2 Mini Cherry MX Red || Mouse: Logitech G703 || Audio: Corsair HS70 Wireless || Other: Xbox One S Controller

My build logs:

 

