AMD ramping up Ryzen flagship clockspeeds as we move closer to launch

Humbug
2 minutes ago, lots of unexplainable lag said:

Remember that I didn't just say "ever". You're not getting it on air or water. On LN2, entirely possible.

But of course, I consider LN2 results basically invalid on practicality grounds. E-peen only!

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


7 hours ago, Humbug said:

I think at this point people are just happy to know that AMD is going to have a 4GHz CPU with 40% better IPC than Excavator.

Because that means that in a real-world gaming load you will no longer be able to tell the difference between Intel and AMD...

Sure, you can do benchmarks, find small differences, and split hairs, but nothing detectable to the naked eye.

OK, you I like! That is really well said. I know the 6700K is a brilliant gaming CPU, but from what I have seen, the builds with minimal bottlenecking use 2011-3 chips, don't they, like the 6950X or something? So if Zen is running up there, that means it has a lower chance of bottlenecking as well, correct?

But at the end of the day, if it gives good clean performance (even if it's 5% lower than a 6900K in a benchmark) at the speculated $500 for SR7, it's better performance per dollar than a 6900K, correct? That's what I'm keen to see: performance per dollar, and that's what AMD is about. If it gives a good experience, smooth gameplay, solid workload application performance, and a good TDP, why would anyone split hairs? The market is also in dire need of competition, and it adds variety; who doesn't love variety?
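To put rough numbers on that performance-per-dollar point, here's a minimal Python sketch; the 5% deficit and the $500 SR7 price are the speculation above, while the ~$1,100 6900K price is my own assumption:

```python
# Illustrative perf-per-dollar math only; scores are normalised, not measured.
def perf_per_dollar(score: float, price: float) -> float:
    """Benchmark score points per dollar spent."""
    return score / price

score_6900k = 100.0              # normalised benchmark score
score_sr7 = score_6900k * 0.95   # assume 5% slower, per the hypothetical above
price_6900k = 1100.0             # rough 6900K street price (USD), my assumption
price_sr7 = 500.0                # speculated SR7 price (USD)

for name, s, p in [("6900K", score_6900k, price_6900k),
                   ("SR7", score_sr7, price_sr7)]:
    print(f"{name}: {perf_per_dollar(s, p):.3f} points/$")
# SR7 comes out at roughly twice the points per dollar despite the 5% deficit.
```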


7 hours ago, Ashaira said:

That makes no sense. If single-core is worse, how could an 8-core beat a 10-core in multithreading?

Unless AMD did some magic for scheduling and such. But that has to be some big stuff if it can compensate for almost two entire cores in multithreaded scenarios.

Newer tech in the CPU. I can't remember what it's called, but SMT and their new AI thingy, kinda like predictive text for programs and tasks, I guess, just more accurate, and it isn't going to tell your GF you want your duck suck when typing on FB like autocorrect would... I digress. Higher clocks and more cores aren't always what make a CPU better; I mean, the FX-9590 has 8 cores at 5GHz and is still outclassed by a lot of Intel's CPUs. It's down to the extra tech in the CPU these days, not just cores and clocks.
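To show the "not just cores and clocks" point as arithmetic, here's a toy linear model where total throughput is cores × clock × IPC; the IPC values are made up purely for illustration, not measurements:

```python
# Toy throughput model assuming perfect multithreaded scaling, which real
# workloads never reach. Illustrative IPC guesses only.
def throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    return cores * clock_ghz * ipc

fx_9590  = throughput(cores=8, clock_ghz=5.0, ipc=0.6)   # many slow cores
i7_6700k = throughput(cores=4, clock_ghz=4.2, ipc=1.7)   # half the cores

print(f"FX-9590:  {fx_9590:.1f}")   # 24.0
print(f"i7-6700K: {i7_6700k:.1f}")  # 28.6 - fewer cores, still ahead
```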


5 hours ago, ApolloFury said:

Even Dragon Age: Inquisition, a 2014 game, stresses my i7 6700K a lot: 40-70% CPU usage (on all threads). Games on the Frostbite engine are well optimised.

EEERMEEEGEEEERD MASS EFFECT ANDROMEDA RUN AT MEEEEEEEEEE! Zen and ME:A will turn me into a vorcha.
Prothean NO LIKE YOU!



3 hours ago, ArcThanatos said:

Newer tech in the CPU. I can't remember what it's called, but SMT and their new AI thingy, kinda like predictive text for programs and tasks, I guess, just more accurate, and it isn't going to tell your GF you want your duck suck when typing on FB like autocorrect would... I digress. Higher clocks and more cores aren't always what make a CPU better; I mean, the FX-9590 has 8 cores at 5GHz and is still outclassed by a lot of Intel's CPUs. It's down to the extra tech in the CPU these days, not just cores and clocks.

I don't think that's what he meant. For Ryzen to be as good as a 6900K, for example, it has to be as good in single-threaded scenarios, and given that it matches a 6900K in Blender at 3.4GHz, at 3.6-3.9GHz it will enter 6950X territory.
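A back-of-envelope version of that scaling argument, assuming (optimistically) that the Blender score scales linearly with clock speed:

```python
# Ryzen matched a 6900K in Blender at 3.4GHz; estimate relative scores at
# higher clocks under a linear-with-frequency simplification.
base_clock = 3.4   # GHz, the clock used in the Blender demo
base_score = 1.00  # normalised: 1.0 == 6900K territory

for clock in (3.6, 3.8, 3.9):
    print(f"{clock} GHz -> ~{base_score * clock / base_clock:.2f}x")
# 3.6 GHz -> ~1.06x, 3.8 GHz -> ~1.12x, 3.9 GHz -> ~1.15x
```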


10 hours ago, Ultrametric said:

What? Why is it more efficient?

 

Intel measures it differently!

With AVX2 instructions the i7-7700K draws >120 watts without a single Hz of overclock (source: http://www.golem.de/news/intel-core-i7-7700k-im-test-kaby-lake-skylake-hevc-overclocking-1701-125322-4.html).

 

So the TDP can be off by a big margin for both manufacturers...
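For anyone wondering how a 91W-TDP chip pulls >120W: dynamic power scales roughly with C·V²·f, and AVX2 code toggles far more of the chip per cycle, raising the effective switched capacitance. A minimal sketch with illustrative guesses for the numbers:

```python
# Dynamic power ~ C_eff * V^2 * f. All values below are illustrative guesses,
# not measurements of the actual 7700K.
def dynamic_power(c_eff: float, volts: float, freq_ghz: float) -> float:
    return c_eff * volts**2 * freq_ghz

scalar = dynamic_power(c_eff=10.0, volts=1.25, freq_ghz=4.2)  # ~66 W
avx2   = dynamic_power(c_eff=19.0, volts=1.25, freq_ghz=4.2)  # ~125 W

print(f"scalar load: ~{scalar:.0f} W, AVX2 load: ~{avx2:.0f} W")
# Same clock and voltage, wildly different draw - which is why a single TDP
# figure is a poor basis for cross-vendor comparison.
```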

Mineral oil and 40 kg aluminium heat sinks are a perfect combination: 73 cores and a Titan X, Twenty Thousand Leagues Under the Oil


4 hours ago, Valentyn said:

I wouldn't care one bit. My 5820K is rated at 140W.

I hope AMD manages to compete well on performance and prices it even a tad below Intel.

Would love an AMD system again. My last one was a Phenom II 940 BE.

Agreed. My Thuban is rated at 125W stock, and so far in my overclocking misadventures it sucks down 135-140ish watts during Cinebench runs at 3.9GHz. I'm easily keeping it very cool using a pretty ghetto custom water loop.

 

I'm all for that low-load efficiency, though.

 

Edit:  And the potential for an APU with high gaming performance gives me all sorts of ideas for SFF builds.

SFF-ish:  Ryzen 5 1600X, Asrock AB350M Pro4, 16GB Corsair LPX 3200, Sapphire R9 Fury Nitro -75mV, 512gb Plextor Nvme m.2, 512gb Sandisk SATA m.2, Cryorig H7, stuffed into an Inwin 301 with rgb front panel mod.  LG27UD58.

 

Aging Workhorse:  Phenom II X6 1090T Black (4GHz #Yolo), 16GB Corsair XMS 1333, RX 470 Red Devil 4gb (Sold for $330 to Cryptominers), HD6850 1gb, Hilariously overkill Asus Crosshair V, 240gb Sandisk SSD Plus, 4TB's worth of mechanical drives, and a bunch of water/glycol.  Coming soon:  Bykski CPU block, whatever cheap Polaris 10 GPU I can get once miners start unloading them.

 

MintyFreshMedia:  Thinkserver TS130 with i3-3220, 4gb ecc ram, 120GB Toshiba/OCZ SSD booting Linux Mint XFCE, 2TB Hitachi Ultrastar.  In Progress:  3D printed drive mounts, 4 2TB ultrastars in RAID 5.


4 hours ago, DildorTheDecent said:

Hoping the motherboards are good though. Most of the AMD offerings look like trash. 

 

Return of the Crosshair would be good. ASUS pls. Need some build quality for once. 

They look like trash because most are old designs.


11 hours ago, 3DOSH said:

Yep, all we need is a bit of competition so prices go down and 6+ cores become the mainstream. We have been stuck with 4 cores for 10 years, for fuck's sake.

Let's hope CPUs become the next TVs: wait 10 years, THEN BOOOOOM, OLED, HDR10, DV HDR, QD, QLED, and 4K all within a gap of just a few years.


12 hours ago, lots of unexplainable lag said:

Because they don't. Remember, Broadwell IPC, not Skylake IPC. Then again, a 4c/8t Ryzen CPU won't cost nearly as much as a 7700K. 

Then again, Skylake is less than 5% faster than Broadwell...

Though 5GHz might be pushing it.

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


It's similar to what I have heard from a friend (he had access to certain Ryzen chips).

This is what he said:

The 8c/16T boosts up to 4.1GHz with an H100i (manual OC to 4.4).

And the 6c/12T boosts up to 4.2 under an H100i.

(PS: he didn't mention the mobo.)

So with a better cooling solution (an X62 or a custom loop), I guess it won't be impossible to hit 5GHz.

 


Y'all are fighting over single-core performance, and I'm just sitting here wondering when the hell AMD got multithreaded computing.

i7 4930k \ Asus P9X79 LE \ Corsair H100i \ 16 GB DDR3 G.SKILL Ripjaw \ Asus Strix R9 380x 4GB \ Crucial 500 GB Sata III SSD \ Thermaltake TR2 RX 850W \ Corsair Crystal 460 Black \ Razer Naga Molten edition \ Razer Black Widow Ultimate \ Klipsch Promedia 2.1 speakers \ Hyper X Cloud Alpha \ 

 

i5 6600k\ Asus Z170-A \ Corsair H100i v2 \ 16 GB DDR4 G.SKILL Ripjaw \ Asus GTX 1060 6GB 4GB \ SanDisk 480 GB Sata III SSD \ Seasonic G Series550W \ DIYPC Skyline 06 black/green \ Razer Naga Epic \ Razer Black Widow Chroma \ Logitech 2.1 Speakers \ Logitech G430 \ 

 

 


9 hours ago, Stefan1024 said:

With AVX2 instructions the i7-7700K draws >120 watts without a single Hz of overclock (source: http://www.golem.de/news/intel-core-i7-7700k-im-test-kaby-lake-skylake-hevc-overclocking-1701-125322-4.html).

So the TDP can be off by a big margin for both manufacturers...

That's why it shouldn't be brought up for comparison; it's meaningless.


Let's all take this with a big grain of salt. I don't know how AMD's boost tech works, but if it's anything like Nvidia's GPU Boost, maybe it can go higher than spec.


On 07/01/2017 at 7:23 AM, Kilobytez95 said:

-snip-

It takes Nvidia's boost to a whole new level: each individual portion of the chip can be dynamically overclocked in steps as small as 25MHz, and the chip will automatically try to squeeze as much performance out of itself as possible depending on the cooling provided. So I'd imagine the boost figures are the average boost attained by a number of chips with the stock cooler, which in this case is actually rather good.
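Roughly how I'd picture that opportunistic 25MHz stepping; a guess at the general behaviour described above, not AMD's actual algorithm:

```python
# Sketch: nudge the clock up in whole 25MHz steps while there is thermal
# headroom, back off when there isn't. The MHz-per-degree rate is arbitrary.
STEP_MHZ = 25

def boost(base_mhz: int, temp_c: float, temp_limit_c: float,
          mhz_per_degree: float = 12.5) -> int:
    """Return a boosted clock given the current thermal headroom."""
    headroom_c = max(0.0, temp_limit_c - temp_c)
    extra_mhz = headroom_c * mhz_per_degree
    # clamp the bonus to whole 25MHz steps
    return base_mhz + int(extra_mhz // STEP_MHZ) * STEP_MHZ

print(boost(3600, temp_c=60, temp_limit_c=75))  # 3775 with a good cooler
print(boost(3600, temp_c=72, temp_limit_c=75))  # 3625 when running hot
```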

Pixelbook Go i5 Pixel 4 XL

19 hours ago, ArcThanatos said:

 - snip -

Ah yes, the Prothean master race...

 

With AMD giving its GPUs star names, I'd say it's time for the Reapers! Warn them about the Reapers!

You can bark like a dog, but that won't make you a dog.

You can act like someone you're not, but that won't change who you are.

 

Finished Crysis without a discrete GPU, 15 FPS average, and a lot of heart

 

How I plan my builds -

Spoiler

For me I start with the "There's no way I'm not gonna spend $1,000 on a system."

Followed by the "Wow, I need to buy the OS for $100!?"

Then "Let's start with the 'best budget GPU' and 'best budget CPU' that actually fits what I think is my budget."

Realizing my budget is a lot less, I work my way to "I think these new games will run on a cheap ass CPU."

Then end with "The new parts launching next year are probably gonna be better and faster for the same price, so I'll just buy next year."

 


1 minute ago, YoloSwag said:

Ah yes, the Prothean master race...

With AMD giving its GPUs star names, I'd say it's time for the Reapers! Warn them about the Reapers!

If only people would believe it, but they won't until it's too late... (I think Nvidia is the Reaper in this case...)



4 minutes ago, ArcThanatos said:

  - snip -

 

I believe it's gonna be that hybrid ending.

 

You know, the one where GPUs would work with CPUs in something like a hybrid PC with an advanced API, so it wouldn't matter what you put in it as long as it's a good mix.


 


....

3 minutes ago, YoloSwag said:

I believe it's gonna be that hybrid ending.

You know, the one where GPUs would work with CPUs in something like a hybrid PC with an advanced API, so it wouldn't matter what you put in it as long as it's a good mix.

You mean like an APU...


1 minute ago, ArcThanatos said:

....

You mean like an APU...

No, like GPUs and CPUs would eventually split task loads, and although the CPU would still be the "brain", it could delegate some of the workload to the GPUs.

Something like those server cards, but for mainstream and in a different way. GPUs would then be like "add-on" powerhouses that could do CPU work.
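Something like this, conceptually; a minimal Python sketch where both workers are stand-in functions and the size threshold is arbitrary, not a real offload API:

```python
# A scheduler that sends big data-parallel jobs to a "GPU" worker and
# everything else to the CPU. Purely illustrative.
from concurrent.futures import ThreadPoolExecutor

def run_on_cpu(job):
    return f"CPU handled {job['name']}"

def run_on_gpu(job):  # stand-in for an offload API (OpenCL, HSA, etc.)
    return f"GPU handled {job['name']}"

def dispatch(job):
    # crude heuristic: large, data-parallel work goes to the GPU
    big_and_parallel = job["parallel"] and job["size"] > 1_000_000
    return (run_on_gpu if big_and_parallel else run_on_cpu)(job)

jobs = [
    {"name": "physics batch", "parallel": True,  "size": 5_000_000},
    {"name": "game logic",    "parallel": False, "size": 10_000},
]
with ThreadPoolExecutor() as pool:
    for result in pool.map(dispatch, jobs):
        print(result)
```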


 


2 minutes ago, YoloSwag said:

No, like GPUs and CPUs would eventually split task loads, and although the CPU would still be the "brain", it could delegate some of the workload to the GPUs.

Something like those server cards, but for mainstream and in a different way. GPUs would then be like "add-on" powerhouses that could do CPU work.

So, a geth... Reapers would be more than that; what you described is a geth... Maybe that's what AMD is trying to make... OH, does that mean I could have my very own Legion and Tali?



2 minutes ago, ArcThanatos said:

 - snip -

TBH, I don't think Nvidia would be the Reaper, since Reapers in the ME universe are extremely OP (absorbing tech and stuff). Also, we should stop this for now as it's getting off topic lol.


 


1 minute ago, YoloSwag said:

TBH, I don't think Nvidia would be the Reaper, since Reapers in the ME universe are extremely OP (absorbing tech and stuff). Also, we should stop this for now as it's getting off topic lol.

True... I mean, some of the fanboys are no more than mindless husks, but yeah, off topic... where is the ME:A feed? lol
This year is starting pretty juicy though: Ryzen, ME:A, Vega. That's like all I need from this year; once those three are out I'll never leave my room. I'll die playing ME:A with a smile on my face and a stench that will give you special eyes... xD I had to, last shot.



On 6. 1. 2017 at 11:54 AM, DELTAprime said:

This still doesn't give me confidence that they will have a SKU that can outperform a 7700K @ 5GHz for gaming.

I am sure it won't be as good as the 7700K for games, but if the price of the 6-core Zen is similar to a 7700K with only about 5-10% slower single-core performance, then I wouldn't even consider the 7700K if I were shopping for a CPU.


7 minutes ago, WereCat said:

I am sure it won't be as good as the 7700K for games, but if the price of the 6-core Zen is similar to a 7700K with only about 5-10% slower single-core performance, then I wouldn't even consider the 7700K if I were shopping for a CPU.

*8-core. The "leaked" pricing so far puts one of the two 8-core SKUs at $350, the same as a 7700K; the 6-core is placed at the price of a 7600K, and a true quad-core with SMT at something like $160-$180, which gets you a measly i3 on the Intel side.
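Putting those rumoured prices side by side as dollars per thread (a quick sketch; the 7600K-class and i3 prices are my own rough assumptions, not part of the leak):

```python
# Price-per-thread comparison. Only the $350 8-core and the $160-$180
# quad-core figures come from the rumours quoted above.
lineup = {
    "Ryzen 8c/16t":   (350, 16),
    "i7-7700K 4c/8t": (350, 8),
    "Ryzen 6c/12t":   (240, 12),  # "priced like a 7600K", ~$240 assumed
    "Ryzen 4c/8t":    (170, 8),   # midpoint of the $160-$180 rumour
    "i3-7350K 2c/4t": (170, 4),   # rough Intel option at that money
}
for name, (price, threads) in lineup.items():
    print(f"{name}: ${price / threads:.2f} per thread")
```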

Ye ole' train

