
Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming

Summary

Slides shared by “RUBY’s|RYZEN|RAYDEON RAGE 3D” show the Intel Core i9-13900K beating the Ryzen 9 7950X in gaming. Another slide shows productivity benchmarks from PugetBench, Procyon, and AutoCAD/Autodesk. It looks like the slides were shown during the recent Raptor Lake pre-launch events, which are now taking place around the world for the tech press and analysts.

 

[Slide 1: Intel Core i9-13900K vs. AMD Ryzen 9 7950X — Gaming Performance]

[Slide 2: Intel Core i9-13900K vs. AMD Ryzen 9 7950X — Content Creation]

 

 

Quotes

Quote

According to the Gaming Performance slide, the Intel Core i9-13900K offers anywhere from 1% worse to 22% better performance than the AMD Ryzen 9 7950X in gaming. On average, the Intel CPU will be 11% faster than the AMD Zen4 CPU.

 

Intel claims it will reach parity in half of the benchmarks, with a 3% loss in the Photoshop test and a 4% to 16% gain in Autodesk/CAD, according to the Content Creation slide.

 

It is not clear if these slides in particular are under embargo, but it is clear that they have not been shared before. 

 

My thoughts

These results from Intel seem relatively reliable, as they are not totally outpacing AMD in everything. It is surprising that Intel would even show where their part loses to or merely ties the competition. Because of that, I would say these results could be pretty close to accurate: they don't appear overly cherry-picked, with Intel totally dominating everything. The results aren't too surprising either; as HotHardware points out, the 7950X already struggles to consistently beat the 12900K in gaming workloads. Intel is launching Raptor Lake tomorrow, October 20th, so you won't have to wait long for third-party benchmarks comparing these two CPUs.

 

Sources

https://www.techpowerup.com/300029/intel-claims-11-percent-gaming-performance-advantage-of-the-core-i9-13900k-over-amds-ryzen-9-7950x

https://videocardz.com/newz/intel-claims-core-i9-13900k-will-be-11-faster-on-average-than-amd-ryzen-9-7950x-in-gaming

https://hothardware.com/news/13900k-clobbers-7950x-alleged-benchmark-slides


I'd say that Intel definitely gave AMD worse RAM and/or a lower power limit. They always do this.

 

So if the margin actually turns out to be closer to 5%, I would be disappointed.

 

And so will Intel, once Zen 4 3D launches.


3 minutes ago, ouroesa said:

Don't make jokes like this when I'm having coffee. Nearly covered my desk in it from the outburst of laughter.

 

Taken at face value, I don't really see the problem with them. They don't glorify the performance of the 13900K and seem somewhat fair. I agree that you should be skeptical of a company's own data, but these don't seem overly suspicious.


I can't recall a single time when there weren't glaring issues with their graphs (this includes AMD and Nvidia). Sometimes they gimp the competitor with drivers/power limits/memory etc., and they tend to cherry-pick results, with just a couple of losses in there to seem genuine. Always remember that their marketing budget is an unfathomable amount and these people are generally in charge of what you and I see; if you can trust a marketing bloke, you can trust a lawyer or a second-hand car salesman.


2 minutes ago, ouroesa said:

I can't recall a single time when there weren't glaring issues with their graphs (this includes AMD and Nvidia). Sometimes they gimp the competitor with drivers/power limits/memory etc., and they tend to cherry-pick results, with just a couple of losses in there to seem genuine. Always remember that their marketing budget is an unfathomable amount and these people are generally in charge of what you and I see; if you can trust a marketing bloke, you can trust a lawyer or a second-hand car salesman.

 

Lately AMD/NVIDIA have been pretty on point, with their own graphs being close to results from third-party reviewers. I'm not saying that they aren't going to try to paint their product in the best light, as after all this is marketing, but they also know how the internet reacts these days, so they have to be careful about what they publish. Nonetheless, 11% better on average isn't anything too crazy. Then with the productivity charts they are showing basically parity between the 13900k and 7950X. That doesn't seem too disingenuous. You're right that we don't know the settings they are using here, nor the test bench, which makes it much harder to judge accuracy. But I wouldn't be surprised if, when the Raptor Lake reviews come out tomorrow, these numbers turn out to be not too far off.


Frame Chasers got his hands on a 13900K early and benchmarked it on his YouTube channel. Max overclock vs. max overclock with a 4090, the 13900K is ~2% faster in gaming than the 12900K. He tested games where high FPS actually matters, like Warzone and Rust, iirc. The 13900K is basically a 12900KSS with eight more dumpster cores bolted on to stop AMD from winning Blender and Cinebench.

Zen4 and Raptor Lake are complete duds for gaming. I feel a little ripped off because I waited a couple of months for this instead of just buying a 12700K like I was originally planning to, lol


Honestly, these look to be about where I personally expected them to be.
Other than the fact that no one buys a 7950X for gaming, but it makes sense to compare flagship to flagship.

I notice a distinct lack of the 5800X3D on those charts.


I always take Intel slides with a huge salt brick, ever since the time they ran Ryzen at stock speed with neutered RAM against Intel on hardcore cooling with monster RAM.


2 hours ago, SeriousDad69 said:

Frame Chasers got his hands on a 13900K early and benchmarked it on his YouTube channel. Max overclock vs. max overclock with a 4090, the 13900K is ~2% faster in gaming than the 12900K. He tested games where high FPS actually matters, like Warzone and Rust, iirc. The 13900K is basically a 12900KSS with eight more dumpster cores bolted on to stop AMD from winning Blender and Cinebench.

Zen4 and Raptor Lake are complete duds for gaming. I feel a little ripped off because I waited a couple of months for this instead of just buying a 12700K like I was originally planning to, lol

Framechaser is a hack. In his own video, at around the 10:30 mark, you can clearly see the game settings for the 13900K are much higher than for the 12900K. Also, do you really think the 13900K is only 1% faster in Fire Strike with 8 more cores and higher clocks?
 

 

A lot of his videos have been debunked. 


5 hours ago, Shzzit said:

Framechaser is a hack. In his own video, at around the 10:30 mark, you can clearly see the game settings for the 13900K are much higher than for the 12900K. Also, do you really think the 13900K is only 1% faster in Fire Strike with 8 more cores and higher clocks?
 

 

A lot of his videos have been debunked. 

He disables E-cores because they're slower for actual gaming. His content is targeted at esports pros and semi-pros who need the highest FPS with the best 1% and 0.1% lows. Personally, I trust him a lot more than someone like Steve at Hardware Unboxed, who'll do tests with DDR4 in Gear 2 so he can say budget DDR5 is faster lol


With no actual numbers and no information on overall system specs, these are as meaningless as ever. Plus, it's increasingly irrelevant what CPU you have when it comes to videogames; to get these differences you probably need to run at 720p to completely uncap the GPU and force high CPU usage to reach framerates in the multiple hundreds, which isn't a relevant use case.

5 hours ago, Shzzit said:

Also, do you really think the 13900K is only 1% faster in Fire Strike with 8 more cores and higher clocks?

I don't know how reliable the testing was there, but in theory it's perfectly possible; games have gotten better at using more cores, but they still don't benefit that much beyond a certain point. Even higher clocks only matter to a point if you're GPU bound (as you generally are).


17 minutes ago, SeriousDad69 said:

He disables E-cores because they're slower for actual gaming. His content is targeted at esports pros and semi-pros who need the highest FPS with the best 1% and 0.1% lows. Personally, I trust him a lot more than someone like Steve at Hardware Unboxed, who'll do tests with DDR4 in Gear 2 so he can say budget DDR5 is faster lol

Assuming this is all true, that this (to me completely unknown) YouTuber measured everything accurately, and that the criticism that he used different settings for the 12th gen and 13th gen parts is incorrect, it is still possible that he is measuring different things from what Intel is measuring.

 

If he does things like disable E-cores and primarily focus on getting very high FPS in some lighter e-sports titles, then it is entirely possible that the results from Intel are true and valid, and the results from Frame Chaser are also true and valid. It depends on what you measure.

 

 

I don't agree that the 13900K is just a 12900KSS with more E-cores though.

Not only does that downplay the importance of the E-cores (which matter a lot for multithreaded workloads), the 13900K also has significantly more cache, which, as we saw with the 5800X3D for example, can make a big difference for gaming.

I would be very surprised if the general performance uplift in games will only be 1-2% when comparing 13th gen vs 12th gen.

The uplifts Intel are showing in their benchmarks, which I recommend be taken with a grain of salt, seem far more plausible to me.

 

 

Also, I looked around for threads about Frame Chasers and the general consensus seems to be that he doesn't know what he is doing and tries to make controversial videos to get a lot of clicks. 


I'll await more detailed reviews, but I'll address the RAM question. Generally speaking, official testing will be performed with standards-based RAM. From memory, Intel will have a small advantage here, as 13th gen officially supports slightly higher speed RAM than Zen 4, which is higher again than 12th gen. The timings will generally also be standard. If you run anything XMP/EXPO, that's overclocking, and if you overclock one part, it's a slippery slope where anything goes.

 

AMD have traditionally enforced a power limit at stock, although from what we saw of the early Zen 4 reviews, the thermal limit seems to be the primary limit now anyway. Intel doesn't enforce a power limit on desktop CPUs.

 

Basically expect the testing to be "as manufacturer intended" and not "what a forum visiting enthusiast does". Wait for clickbait youtubers for those.


1 minute ago, LAwLz said:

Assuming this is all true, that this (to me completely unknown) YouTuber measured everything accurately, and that the criticism that he used different settings for the 12th gen and 13th gen parts is incorrect, it is still possible that he is measuring different things from what Intel is measuring.

 

If he does things like disable E-cores and primarily focus on getting very high FPS in some lighter e-sports titles, then it is entirely possible that the results from Intel are true and valid, and the results from Frame Chaser are also true and valid. It depends on what you measure.

 

 

I don't agree that the 13900K is just a 12900KSS with more E-cores though.

Not only does that downplay the importance of the E-cores (which matter a lot for multithreaded workloads), the 13900K also has significantly more cache, which, as we saw with the 5800X3D for example, can make a big difference for gaming.

I would be very surprised if the general performance uplift in games will only be 1-2% when comparing 13th gen vs 12th gen.

The uplifts Intel are showing in their benchmarks, which I recommend be taken with a grain of salt, seem far more plausible to me.

 

 

Also, I looked around for threads about Frame Chasers and the general consensus seems to be that he doesn't know what he is doing and tries to make controversial videos to get a lot of clicks. 

One thing that seems to be getting missed from my original reply is that this is with max overclocks. The dude takes everything he can to the maximum without degrading the CPU or sacrificing stability. Esports pros literally pay him $500 to overclock and tune their systems. The 13900K is faster stock vs. stock.


So in the cherry-picked benchmarks from Intel, they are just about beating AMD? I think 13th gen might be a bit of a stinker then, because I have a hard time believing Intel marketing got more honest.


13 hours ago, SeriousDad69 said:

Frame Chasers got his hands on a 13900K early and benchmarked it on his YouTube channel. Max overclock vs. max overclock with with a 4090, the 13900K is ~2% faster in gaming than the 12900K. Tested games where high FPS actually matters, like Warzone and Rust iirc. 13900K is Basically a 12900KSS with eight more dumpster cores bolted on to stop AMD from winning blender and cinebench.

Zen4 and Raptor Lake are complete duds for gaming, I feel a little ripped off because I waited a couple months for this instead of just buying a 12700K like I was originally planning to lol

Wow, who would have guessed that a guy who "claimed" to get a 13900K early made fake benchmarks, very weird. It is almost as if being able to hit 6 GHz now and having 55% more cache makes a difference; who would have guessed a clickbait YouTuber made a fake video. If fixed at the same frequency with E-cores disabled, the 13900K is faster by 4-5% purely from the IPC improvements.

 


Reviews are out.

 

Here are the results from TechPowerUp's review (12 games):

 

1080p: i9-13900K is 8.3% faster than 7950X.

1440p: i9-13900K is 10% faster than the 7950X.

4K: i9-13900K is 1% faster than the 7950X.

 

Other than the 4K results which were heavily GPU limited (even the i3-12100 is only 0.3% behind the 7950X), the i9-13900K seems to be about 10% faster for gaming than the 7950X.
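
For what it's worth, "X% faster on average" figures like these are typically the geometric mean of per-game performance ratios rather than a simple arithmetic mean, so a single outlier title can't dominate the result. A minimal sketch with made-up per-game numbers (not TechPowerUp's actual data):

```python
from math import prod

# Hypothetical per-game FPS results: (game, fps_13900k, fps_7950x).
# These numbers are placeholders for illustration only.
results = [
    ("Game A", 220.0, 200.0),
    ("Game B", 150.0, 155.0),
    ("Game C", 310.0, 265.0),
]

# Relative performance of the 13900K in each game
ratios = [fps_a / fps_b for _, fps_a, fps_b in results]

# Geometric mean: n-th root of the product of the ratios
geomean = prod(ratios) ** (1 / len(ratios))

print(f"Average advantage: {(geomean - 1) * 100:+.1f}%")
```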


Looks like the graphs might be close to what reviewers are seeing. It looks like Intel might be the better bang for the buck, but the heat and the unlocked wattage, sigh.

btw, how come there's no LTT video on this release yet? weird haha


14 hours ago, BiG StroOnZ said:

11% better on average isn't anything too crazy. Then with the productivity charts they are showing basically parity between the 13900k and 7950X.

pretty spot on based on GN's review


Is it possible to sue for false advertising over the 125W TDP on the K parts, since they turbo to 300W and hold it? That IS stock standard performance, not just a blip.
Now that is a hot chip. Just a CPU using more power than entire GPUs, which include RAM, VRMs and other control logic on their boards.

Honestly, the 13600K looks like the easiest chip to recommend for 80% of new builds for non-enthusiasts. Who cares about it being a dead platform; they are not swapping CPUs anyway.
The 13900K costs more to manufacture than the 7950X; that its MSRP is lower than the 7950X's is a choice if I have ever seen one, when Pat is trying to get margins back.


2 minutes ago, starsmine said:

Is it possible to sue for false advertising over the 125W TDP on the K parts, since they turbo to 300W and hold it? That IS stock standard performance, not just a blip.
Now that is a hot chip. Just a CPU using more power than entire GPUs, which include RAM, VRMs and other control logic on their boards.

I'm still skimming the reviews now. I don't know how many times it needs to be said: TDP is not supposed to indicate power consumption! Also, Intel allows system builders to set whatever power limit they like, unlike AMD's enforced limits.

 

Is there a review that covers power consumption at different workloads? So far I've only seen Anandtech stick Prime95 on it to hit >300W. Does it get anywhere near that power with softer loads like Cinebench?


7 minutes ago, porina said:

I'm still skimming the reviews now. I don't know how many times it needs to be said: TDP is not supposed to indicate power consumption! Also, Intel allows system builders to set whatever power limit they like, unlike AMD's enforced limits.

Is there a review that covers power consumption at different workloads? So far I've only seen Anandtech stick Prime95 on it to hit >300W. Does it get anywhere near that power with softer loads like Cinebench?

No, it's supposed to give a ballpark THERMAL DISPLACEMENT.
I know it doesn't indicate power consumption. I know Intel and AMD calculate it differently. And I know the 125W is them talking about BASE speeds, but running a K at base speeds is not the intended use of a K.

I'm only half kidding about suing. It's just that I can't recall the disparity between specified TDP and real-world thermal output ever being this wide.
[Attached image: CPU power consumption chart across workloads]

Link to comment
Share on other sites

Link to post
Share on other sites

22 minutes ago, LAwLz said:

Reviews are out.

 

Here are the results from TechPowerUp's review (12 games):

 

1080p: i9-13900K is 8.3% faster than 7950X.

1440p: i9-13900K is 10% faster than the 7950X.

4K: i9-13900K is 1% faster than the 7950X.

 

Other than the 4K results which were heavily GPU limited (even the i3-12100 is only 0.3% behind the 7950X), the i9-13900K seems to be about 10% faster for gaming than the 7950X.

TPU used a vanilla 3080; many games are GPU-limited even at 1080p with that. I wish more reviewers used a 4090 to test.


1 minute ago, starsmine said:

No, it's supposed to give a ballpark THERMAL DISPLACEMENT.

It isn't that either. It is the minimum cooler capacity needed to sustain base clocks.

 

Thanks for the link to the power chart. As suspected, most other workloads don't get anywhere near Prime95. It's too easy for haters to latch onto one worst-case load and take that as typical across the range. I don't know if anyone does it, but a CPU-centric perf/W measure across different workloads, between red and blue, would be interesting. AMD enforcing a limit means they will never have high peaks at stock, but it can impair performance.

 

In a quick search myself, tweaktown claims the 13900K hits 285W with Cinebench R23 MT, with the 7950X hitting 260W. I'm not familiar with their testing process, but that 7950X value is above PPT, so I'm not sure it is "stock".

https://www.tweaktown.com/reviews/10222/intel-core-i9-13900k-raptor-lake-cpu/index.html#Gaming-and-Power-Consumption
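
A perf/W comparison like the one suggested above is simple arithmetic once you have matching score and power figures. In this sketch the wattages are the Cinebench R23 MT numbers tweaktown reported, but the scores are hypothetical placeholders, so treat the output as illustrative only:

```python
# Perf/W sketch. The package power figures are tweaktown's reported
# Cinebench R23 MT numbers; the scores are hypothetical placeholders.
cpus = {
    "13900K": {"score": 40_000, "watts": 285},
    "7950X": {"score": 38_000, "watts": 260},
}

# Points per watt for each CPU under this one workload
efficiency = {name: d["score"] / d["watts"] for name, d in cpus.items()}

for name, ppw in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ppw:.1f} points/W")
```

A single workload's perf/W says little on its own; the interesting comparison would be this same ratio tracked across light and heavy loads.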

