Navi/Ryzen 3000 launch Megathread

LukeSavenije
14 minutes ago, D13H4RD said:

To be honest, even this showing is quite strong for me.

 

Did AMD kick the pants off Intel? Well, not entirely, but it made further gains in its strongest area and also narrowed the gap to the equivalent Intel competition.

 

All in all, I think a price drop from Intel might be on the cards, given that the only pedestals they can really stand on right now are clock speeds (plus overclocking) and raw gaming performance (and even that is becoming less significant).

And maybe a few very specific scenarios, like some Adobe workloads.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Intel have the brand and ability to charge more than AMD, in a similar way that nvidia does. They can't rest on that forever but I don't see them needing to do a short term adjustment as a direct response. That of course doesn't exclude they may choose to adjust pricing at any time, which is business as normal.

 

What AMD has to do is keep the hype up and get into the minds of ever more people. Not the enthusiast space we inhabit, but the masses are what they have to reach and there is still more work to be done there.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


7 minutes ago, porina said:

Intel have the brand and ability to charge more than AMD, in a similar way that nvidia does. They can't rest on that forever but I don't see them needing to do a short term adjustment as a direct response. That of course doesn't exclude they may choose to adjust pricing at any time, which is business as normal.

 

What AMD has to do is keep the hype up and get into the minds of ever more people. Not the enthusiast space we inhabit, but the masses are what they have to reach and there is still more work to be done there.

I can see Intel doing it simply to keep stock moving.  A minor adjustment to maintain everything else. 



7 minutes ago, mr moose said:

I can see Intel doing it simply to keep stock moving.  A minor adjustment to maintain everything else. 

It will be interesting to see what does happen. Does the 10nm production of mobile CPUs take some pressure off their fabs for 14nm parts? I don't know; are they still supply-constrained? Pricing has certainly relaxed somewhat. On a parallel note, I wonder what AMD's production capacity is for their CPUs. It's still early days, but I've yet to see stock of anything other than the 3600.

 

As a more wild thought, could we even get to a situation similar to DRAM/flash, with over- and under-supply cycles pushing pricing up and down?



21 minutes ago, dgsddfgdfhgs said:

i was really hoping they will have a 6c12t APU with gtx1060 perf lol

That would be really sweet as a jumping-off point before you can upgrade the GPU.

Desktop: Slick

CPU: Ryzen 5 2600    Motherboard: Asus Prime X470-Pro    RAM: 16GB Corsair Vengeance LPX 3200mhz CL16   GPU: GTX 1080 8GB 

Storage: Samsung PM981 256GB (970 EVO)    PSU: Corsair TX650M   Case: Lian-Li PCO11 Dynamic - White  Fans: Deepcool RF120mm RGB

Peripherals: HP Omen X 35 Ultrawide (3440x1440, 100hz, G-Sync), Logitech g903, mdr-1000x, umc22 + AKG D5, Drevo Blademaster 87K RGB (Gateron Browns)

Laptop: Surface Laptop 2 Platinum - i5 8250u, 8gb ram, 256gb nvme, 13 inch pixelsense touch display

Phone: Huawei Mate 20 pro 128gb Black + iphone 6s 32gb gold

 

 


3 hours ago, CarlBar said:

And the bit you quoted from 12:59 is relevant to the reddit article how? The reddit article is explicitly about how it boosts to 4.6Ghz in workloads where it can do that, i.e lightly threaded. So how and why it boosts in a heavy all core load isn't especially relevant information to that discussion.

Because we don't actually know how light it needs to be. The frequency-over-time graphs from GN show CPU cores hitting 4.6GHz and then dropping down shortly after. So yes, in games the cores can hit 4.6GHz right now, briefly. Is that how it's supposed to be? Dunno.

 

What this information is saying is that 4.6GHz is not and has never been an all core turbo figure.

[Image: GN frequency-over-time chart showing cores boosting to 4.6GHz]

 

There you can see the 4.6GHz boost working, maybe or maybe not the way it's supposed to, but it is not failing to reach 4.6GHz in games.

 

3 hours ago, CarlBar said:

What your saying about binning contradicts literally everything i've seen everyone else ever say on the subject...

No, it doesn't. Everyone describes binning correctly right up until they talk about server CPUs, then assumes server CPUs need the highest-binned dies without actually asking whether that is the case. There are specific high-clock variants of Xeons, for example, which cost more than SKUs with significantly more cores. Why? Because those SKUs really do require higher-binned dies: they are clocked significantly higher, and there are fewer such dies that are also free of memory-controller faults or other issues specific to the requirements of a Xeon, i.e. ECC. No matter how good or bad a die is, if it fails ECC validation but passes everything else, it's fine for any non-ECC SKU (an appropriate die, of course, since HEDT doesn't use the HCC Xeon dies).

 

Zen 2 doesn't have to worry about the memory controller or other such areas for the core complex die.
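The binning logic above can be sketched as a simple eligibility filter. The clock thresholds and SKU names here are purely illustrative, not any vendor's actual bins:

```python
# Toy model of the argument above: failing ECC validation only removes
# a die from ECC SKUs; it stays eligible for every non-ECC bin it can
# otherwise reach. Thresholds and names are made up for illustration.
def eligible_skus(die):
    skus = []
    if die["max_ghz"] >= 4.5:
        skus.append("high-clock consumer")
    if die["max_ghz"] >= 3.5:
        skus.append("standard consumer")
    if die["ecc_ok"] and die["max_ghz"] >= 3.5:
        skus.append("server (ECC)")
    return skus

# A fast die that fails ECC validation still fills the top consumer bin.
print(eligible_skus({"max_ghz": 4.6, "ecc_ok": False}))
```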

3 hours ago, CarlBar said:

This isn't just about how they OC but about stock power draw. If you go back to the chart from GN's 3600 video, we're seeing stock draws of:

 

3600: 79.2 W

3700X: 87 W

3900X: 147.6 W

 

From the 3600 to the 3700X, the jump is way lower than it should be (same base clock, 33% more cores/threads and a 4.7% higher boost clock, but only a 9.8% higher power draw).

 

And the jump from the 3700X to the 3900X shows weird behaviour too (a 69% power spike for 50% more cores/threads, a 5.5% higher base clock and a 4.5% higher boost clock).

This is correct and matches the TDP difference between those products. The 3600 and 3700X are both 65W TDP parts, while the 3900X is 105W. Two products with the same TDP and PPT limit will show the same power draw if both are capable of hitting their maximum defined limits; the 3900X's limits are set higher than both of these, so its draw will naturally be significantly higher, because it's allowed to be.

 

I think what you want to see is a 3800X review. That's the 8-core 105W TDP part, and it may be more capable of sustaining 4.6GHz in games thanks to more power headroom per core, compared to the 3900X and, when it comes out, the 3950X.
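For reference, stock AM4 Ryzen parts are allowed a package power (PPT) of 1.35x the rated TDP, which roughly lines up with the draws in that GN chart; a quick sketch:

```python
# AM4 Ryzen stock power limits: PPT (Package Power Tracking) = 1.35 x TDP.
# The two 65 W parts share the same ceiling, while the 105 W part sits far higher.
def ppt_from_tdp(tdp_watts):
    return tdp_watts * 1.35

for name, tdp in [("3600", 65), ("3700X", 65), ("3900X", 105)]:
    print(f"{name}: TDP {tdp} W -> PPT limit {ppt_from_tdp(tdp):.0f} W")
```

That gives an ~88 W ceiling for the 65 W parts and ~142 W for the 3900X, which is why the 3600 and 3700X land close together in measured draw while the 3900X is much higher.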


The 65-watt 3700X makes great SFF builds possible: it looks sufficiently powerful, should perform well without overclocking, and can probably be cooled by a good 47mm-tall air cooler. Looking forward to getting that thing, along with the Dan A4 case.


5 hours ago, RejZoR said:

but how is my rather ancient 5820K at 4.5GHz on all cores stacking up against 3900X in games mostly?

No idea about games, but in synthetic benchmarks the Haswell-E is getting slaughtered. 

 

Kyle ran Cinebench R20 on his 3900X and got a score of 7168

I then ran R20 on my 5930K (@4.2GHz) and my score was 2750.  Granted, I can crank mine up to 4.6GHz if needed, but that's only going to give me 100 to 150 points extra. 
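A quick back-of-envelope on those two scores (thread counts are the real SKU values; the 5930K's 4.2 GHz clock is from the post, while the 3900X's ~4.0 GHz all-core clock is an assumption):

```python
# Normalize the quoted Cinebench R20 scores per thread and per GHz to
# separate core-count scaling from per-core improvement.
chips = {
    "3900X": {"score": 7168, "threads": 24, "ghz": 4.0},  # all-core clock assumed
    "5930K": {"score": 2750, "threads": 12, "ghz": 4.2},  # clock from the post
}

def per_thread(c):
    return c["score"] / c["threads"]

for name, c in chips.items():
    print(f"{name}: {per_thread(c):.0f} pts/thread, "
          f"{per_thread(c) / c['ghz']:.1f} pts/thread/GHz")
```

Roughly 299 vs 229 points per thread: Zen 2 is well ahead even per clock, and the rest of the gap is simply having twice the threads.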


17 minutes ago, Captain Chaos said:

No idea about games, but in synthetic benchmarks the Haswell-E is getting slaughtered. 

 

Kyle ran Cinebench R20 on his 3900X and got a score of 7168

I then ran R20 on my 5930K (@4.2GHz) and my score was 2750.  Granted, I can crank mine up to 4.6GHz if needed, but that's only going to give me 100 to 150 points extra. 

And I'm still on Ivy Bridge-E, lol. The 3800X is the one I really want to see some reviews of.


37 minutes ago, Captain Chaos said:

No idea about games, but in synthetic benchmarks the Haswell-E is getting slaughtered. 

 

Kyle ran Cinebench R20 on his 3900X and got a score of 7168

I then ran R20 on my 5930K (@4.2GHz) and my score was 2750.  Granted, I can crank mine up to 4.6GHz if needed, but that's only going to give me 100 to 150 points extra. 

Then there's me on a 5920K that won't go higher than 3.7 on all cores without blue-screening.


3 hours ago, porina said:

Intel have the brand and ability to charge more than AMD, in a similar way that nvidia does. They can't rest on that forever but I don't see them needing to do a short term adjustment as a direct response. That of course doesn't exclude they may choose to adjust pricing at any time, which is business as normal.

 

What AMD has to do is keep the hype up and get into the minds of ever more people. Not the enthusiast space we inhabit, but the masses are what they have to reach and there is still more work to be done there.

Wouldn't be surprised if the rumored pricing adjustment for the desktop parts ends up being the case

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


12 minutes ago, ThePD said:

Then there's me on a 5920K that won't go higher than 3.7 on all cores without blue-screening

darn.  Oh well, at least you don't have to go through the same mess I'm dealing with right now. 

 

I did my run yesterday with Firefox, Skype etc open.  So when @leadeater quoted me half an hour ago I decided to kill all unnecessary tasks and gave it another run, resulting in 2842. 

I figured I'd see if I could hit 3K, so I rebooted into my BIOS and changed voltage and multiplier to what I know works just fine at 4.5GHz.  Unfortunately Win7 still had a pending update and that fails to install.  I ran a very thorough Disk Cleanup yesterday, perhaps that has something to do with it. 

Can't even boot into safe mode as that also hangs.  So much for this install.  Currently in Mint (yay for dualboot!) backing up my game files, not going to download 100+GB again. 


20 minutes ago, D13H4RD said:

Wouldn't be surprised if the rumored pricing adjustment for the desktop parts ends up being the case

I wasn't aware there was any rumour going around there.

 

Look at my sig. That's a lot of Intel CPUs I have, and not many AMD ones. Zen 2, I think, is going to be the switching point for me, subject to testing for my specific use case once I get my hands on one... so I haven't been looking at Intel much recently. Since we're not expecting anything new on desktop from team blue any time soon, the only way they could keep my interest in the short term is if they drop Cascade Lake into HEDT at lower-than-historic HEDT pricing, without crippling the RAM channels.



Well, time to upgrade from my X99.

Silverstone FT-05: 8 Broadwell Xeon (6900k soon), Asus X99 A, Asus GTX 1070, 1tb Samsung 850 pro, NH-D15

 

Resist!


This could be a major spoiler when combined with the half-bandwidth writes. The testing I need to do just got a lot more complicated.



2 minutes ago, porina said:

This could be a major spoiler when combined with the half-bandwidth writes. The testing I need to do just got a lot more complicated.

The memory write bandwidth is going to be an interesting thing to explore. It was probably done for heat/power reasons (to keep interconnect power down), but whether it was done at the design level or is a binning choice (in which case Epyc would have symmetric bandwidth) remains to be seen.


1 hour ago, Captain Chaos said:

No idea about games, but in synthetic benchmarks the Haswell-E is getting slaughtered. 

 

Kyle ran Cinebench R20 on his 3900X and got a score of 7168

I then ran R20 on my 5930K (@4.2GHz) and my score was 2750.  Granted, I can crank mine up to 4.6GHz if needed, but that's only going to give me 100 to 150 points extra. 

Yeah, but that's just compute because of the multithreading. We all know games don't scale that way.


Does the 3800X and below have only one chiplet, or do they use 2 with not all cores enabled?

GAMING PC CPU: AMD 3800X Motherboard: Asus STRIX X570-E GPU: GIGABYTE RTX 3080 GAMING OC RAM: 16GB G.Skill 3600MHz/CL14  PSU: Corsair RM850x Case: NZXT MESHIFY 2 XL DARK TG Cooling: EK Velocity + D5 pump + 360mm rad + 280mm rad Monitor: AOC 27" QHD 144Hz Keyboard: Corsair K70 Mouse: Razer DeathAdder Elite Audio: Bose QC35 II
WHAT MY GF INHERITED CPU: Intel i7-6700K (4.7GHz @ 1.39v) Motherboard: Asus Z170 Pro GPU: Asus GTX 1070 8GB RAM: 32GB Kingston HyperX Fury Hard Drive: WD Black NVMe SSD 512GB Power Supply: XFX PRO 550W  Cooling: Corsair H115i Case: NZXT H700 White

17 minutes ago, CiBi said:

Does the 3800X and below have only one chiplet, or do they use 2 with not all cores enabled?

They use 2 chiplets: one 8-core CCD plus the separate I/O die.


50 minutes ago, Taf the Ghost said:

The memory write bandwidth is going to be an interesting thing to explore. It was probably done for heat/power reasons (to keep the interconnect power down), but whether it was done at the design level or it's a binning choice (and thus Epyc will have symmetric bandwidth) remains to be seen.

Statement from an AMD rep, taken from the overclockers.com review:

Quote

“This is an expected result. Client workloads do very little pure writing, so the CCD/IOD link is 32B/cycle while reading and 16B/cycle for writing. This allowed us to save power and area inside the package to spend on other, more beneficial areas for tangible performance benefits.”
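Those per-cycle widths translate directly into link bandwidth at the Infinity Fabric clock. A rough sketch, assuming the typical 1800 MHz FCLK you get with DDR4-3600 (the FCLK value is an assumption, not from the quote):

```python
# CCD <-> IOD link bandwidth from the quoted per-cycle widths.
FCLK_HZ = 1_800_000_000  # assumed fabric clock (DDR4-3600 -> 1800 MHz FCLK)

read_gbs = 32 * FCLK_HZ / 1e9   # 32 B/cycle reads
write_gbs = 16 * FCLK_HZ / 1e9  # 16 B/cycle writes

print(f"read: {read_gbs:.1f} GB/s, write: {write_gbs:.1f} GB/s")
```

So roughly 57.6 GB/s of read bandwidth against 28.8 GB/s of write bandwidth per CCD, at that assumed fabric clock.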

 



Neat. I've yet to see everything, but what I've seen so far is looking amazing. New build planned soon. 

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


@LukeSavenije

 

Some more videos

Spoiler


Paul's Hardware:

 

GN separately reviewed the 3900X (and will also do so for the 3700X; I'll let you decide whether you want to include that or not)


 

After watching many of these videos, I've come to this conclusion:

 

Intel? Still best for gaming. But, now more than ever, if you do anything else on top of gaming, my recommendation is Ryzen 3000: if you use Photoshop or Premiere (surprisingly, Premiere now runs well on AMD), if you stream, if you do CAD work (hi), if you work in a data center, if you like having money left over to spend on other things or to save for retirement, student loans, bills or whatever else life makes you pay for, or if you so much as sneeze in between gaming sessions. Especially the 3600; I agree with GN, it's the best value in a CPU by far.

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


 

 

