
7900X reviewed!

Just now, PCGuy_5960 said:

At the same frequency, yes, it is 15-20% better, but don't forget that it can clock much higher than Ryzen..... :)

I was going to make a Perf/Watt point, then I had sudden flashbacks to the recent Mining Craze discussions.

 

Probably the more fun monkey wrench to throw out: the 7700k is still the better buy. xD


1 minute ago, The Benjamins said:

What's funny is that X99 had 40 PCIe lanes at $500, which helped with 2-4 way SLI/Crossfire, but now that's at the $1000 price point. Sure, 28 is better than 16, but it's nice to see that Threadripper will have 64 on all SKUs.

 

I build some pretty competitive rigs for my own personal use and for friends.  I've yet to use the 40 PCIe lanes on my 5960x and I'm kinda nutty about high end shit.  I think most will find that 16, 28 or even 44 CPU PCIe lanes is plenty when you have a chipset that supports many more lanes, even if they're routed through the DMI.

 

You are right though, it's always nice to have more of something.  This argument applies to more than just CPU PCIe lanes.


20 minutes ago, done12many2 said:

Well, thanks for showing us which is faster, but from a different angle.  That always helps.

Here's one from another angle.

[attached: benchmark chart]

Am I helping?

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


Just now, done12many2 said:

 

I'm still looking at a 2D image that's out of scale.  xD

Put some perspective on it.



56 minutes ago, tom_w141 said:

I know you said "or" :) I just can't fathom why a 10-core should even weigh in here. 8 vs 8 I get it; 10 vs 8 I don't.

The same reason you people constantly factor in Ryzen's 8-core against Intel's 4-core 7700K. We compare all hardware, regardless of its intended purpose, so that we are informed enough to pick the best tool for the job. If someone feels the more expensive Intel CPU is the end that justifies their means, then who are you to pass judgment on their decision?

 

It's time for this fanboy justification to stop on both sides. You can argue until you are blue in the face, but at the end of the day, neither side should tell others what they can or cannot buy, nor can they argue their subjective opinions as if they were fact. I dislike the 7740X. I dislike the 7640X. Neither should exist, in my mind. Am I going to judge those that purchase them? Maybe secretly behind their backs, but I am certainly not going to tell them they made a mistake if the performance is exactly what they expected and they were fine paying what they did for it. The same can be said of the higher end Skylake-X CPUs. Even if they cannot justify having that many threads at that high a clock speed, if they are satisfied, it's good enough for me.

 

 

Also, to the rest of you that keep bringing AVX up, let me enlighten you on a very simple fact. Yes, AVX-512 in and of itself may not be widely adopted, but simply having registers that large doubles the previous amount of bit ops/clock you can do with AVX1 and AVX2. It can do literally 2 AVX2 operations per clock, simply because the pipeline is 512 bits wide. EVEX allows this natively, so applications don't even need to be re-coded to benefit. This means that anyone using a current AVX application can see up to double the performance on an AVX-512 capable CPU alone. AVX-512 also allows for 8x 64-bit FMA operations, per core, per clock. I wager that the 7980XE is capable of over 1 TFLOP of raw lifting power. This is not something we simply ignore because "the average user doesn't need AVX". If the average user understood the amount of free performance going to waste by not using it, they'd probably push for AVX's adoption. Then again, the "average consumer" is far too afraid of heat, and their "pseudostable" overclocks would no longer hold.
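Back-of-the-envelope math on that TFLOP claim, as a minimal sketch. The figures are my own assumptions, not from the thread: 18 cores, 2 FMA units per core, and a ~2.6 GHz all-core AVX-512 clock for a hypothetical 7980XE.

```python
# Rough theoretical peak FP64 throughput for an AVX-512 CPU.
# Assumed figures (not from the post): 18 cores, 2 FMA units per core,
# ~2.6 GHz all-core AVX-512 clock.
def peak_fp64_gflops(cores, ghz, fma_units=2, simd_width_bits=512):
    doubles_per_reg = simd_width_bits // 64            # 8 FP64 lanes per 512-bit register
    flops_per_cycle = fma_units * doubles_per_reg * 2  # FMA counts as mul + add
    return cores * ghz * flops_per_cycle

print(peak_fp64_gflops(18, 2.6))  # ~1497.6 GFLOPS, i.e. roughly 1.5 TFLOPS FP64
```

Even with a conservative AVX clock, the double-precision peak lands comfortably above 1 TFLOP, which is consistent with the wager above.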

 

I try not to rant like this, but this thread is something else.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!



9 minutes ago, done12many2 said:

 

I build some pretty competitive rigs for my own personal use and for friends.  I've yet to use the 40 PCIe lanes on my 5960x and I'm kinda nutty about high end shit.  I think most will find that 16, 28 or even 44 CPU PCIe lanes is plenty when you have a chipset that supports many more lanes, even if they're routed through the DMI.

 

You are right though, it's always nice to have more of something.  This argument applies to more than just CPU PCIe lanes.

It's more that 28 may cut it close in some scenarios.

Two GPUs at x8, two x4 NVMe drives, and x4 10GbE would be 28 lanes (assuming everything is running off the CPU).

A three-GPU setup at x8 each would eat 24 lanes just for that, leaving only x4 for everything else plus an extra x4 on the chipset (bandwidth-wise), so in this case adding NVMe drives and/or 10GbE could run you out of I/O lanes.
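The lane arithmetic above is easy to sketch. The device widths here are the common configurations the post assumes (GPUs at x8, NVMe and 10GbE at x4); the build list itself is just an example, not anyone's actual rig:

```python
# PCIe lane budget check against a 28-lane CPU.
CPU_LANES = 28

def lanes_used(devices):
    """Sum the lane widths of every device hung off the CPU."""
    return sum(width for _, width in devices)

# Example build: two x8 GPUs, two x4 NVMe drives, one x4 10GbE NIC.
build = [("GPU 1", 8), ("GPU 2", 8), ("NVMe 1", 4), ("NVMe 2", 4), ("10GbE NIC", 4)]
used = lanes_used(build)
print(f"{used}/{CPU_LANES} lanes used, {CPU_LANES - used} spare")  # 28/28 used, 0 spare
```

Exactly at the limit: add a third x8 GPU and something has to drop to a narrower link or move to chipset lanes.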

if you want to annoy me, then join my teamspeak server ts.benja.cc


1 hour ago, done12many2 said:

JayzTwoCents really liked his R7 content creation setup.  Out of curiosity, he compared video editing with a 5960x (2+ year old Intel 8 core) and a Ryzen 1800X both clocked at 4 GHz.  Content creation is one of Ryzen's strong points and for the money, it did pretty well.

 

End result, at the same clock speed, the 5960x was roughly 9% faster.  When he pushed the Intel chip to 4.5 GHz, that rose to roughly 15% faster in the video workload he was using.  Obviously, he would have pushed the Ryzen 1800X to 4.5 GHz as well, but he didn't have that option.

 

Oh, I need to mention that Jay used two Maxwell Titans in the Ryzen build, and GTX 1080s in the 5960X build.
Puget Systems found the GTX 1080 sped up Premiere Pro and other Adobe suite workloads by roughly 10% at 4K compared to the Maxwell Titans.

 

Quote

4K Single GPU
The GTX 1080, on the other hand, was about 2% faster than the GTX 980Ti and about 10.5% faster than the Titan X.

 

4K Dual GPUs
 For whatever reason, the GTX 1080 was actually a bit slower ( than 2x1070) clocking in at about 2.5% faster than a GTX 980Ti and about 12% faster than a GTX Titan X.

 

Interestingly the 980Ti does better in Adobe Premiere Pro than the Titan X Maxwell did.

 

https://www.pugetsystems.com/labs/articles/GTX-1070-and-GTX-1080-Premiere-Pro-Performance-810/

Quote

One last thing we want to point out is that, while this article is primarily about looking at the performance of the GTX 1070 and 1080, one thing we did find was that dual GPU configurations can often work really well for Premiere Pro. We didn't see much of a gain when exporting to 4K (only 2.5-10% better performance, the 2.5% being the 1070), but exporting to 1080p and rendering previews was anywhere from 20% to 50% faster with two video cards versus just one.

 


So one could argue that if they both used 1080s, they'd perform the same, but as you mentioned, the overclock on the 5960X is what steals the show in the end.
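A quick sanity check on the scaling numbers quoted above (the ~9% same-clock lead and the ~15% lead at 4.5 GHz). This is purely illustrative arithmetic on my part, not anything from the video:

```python
# If the 5960X led by ~9% with both chips at 4.0 GHz, and its clock then
# rose to 4.5 GHz while the 1800X stayed put, perfect clock scaling would
# predict a larger lead than the ~15% Jay observed.
same_clock_lead = 1.09      # 5960X vs 1800X, both at 4.0 GHz
clock_ratio = 4.5 / 4.0     # 12.5% clock bump on the 5960X only
predicted = same_clock_lead * clock_ratio
print(round((predicted - 1) * 100, 1))  # ~22.6% lead if the workload scaled linearly
```

The observed ~15% falling short of ~22.6% suggests the workload doesn't scale purely with clock speed, so the GPU and RAM differences between the two rigs plausibly matter too.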

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


1 minute ago, Valentyn said:

Puget Systems found the GTX 1080 sped up Premiere Pro and other Adobe suite workloads by roughly 5-10% at 4K.

The Titan Xs were heavily overclocked and watercooled tho... I think they were at 1400-1500MHz and the 1080s were at stock and air cooled, so the Titan Xs couldn't have caused this performance difference ;)

CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro X Superlight (main), Logitech G Pro Wireless, Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402


8 minutes ago, PCGuy_5960 said:

The Titan Xs were heavily overclocked and watercooled tho... I think they were at 1400-1500MHz and the 1080s were at stock and air cooled, so the Titan Xs couldn't have caused this performance difference ;)

He never mentioned the Titans being overclocked in his build log of the system.

And yes, considering what Puget state about rendering and encoding, it can make up to a 12% difference in Premiere Pro performance going from dual Titan Xs to 1080s. That's a real performance difference.



I do hope Puget Systems redoes that testing after the 12-core X299 and Threadripper chips drop.  Adobe actually pushed a fairly significant update after the Ryzen launch that supposedly helped multi-core performance across the board, but hardly anyone retested it.

 

http://www.tweaktown.com/reviews/8225/intel-core-i9-7900x-series-skylake-cpu-review/index7.html

 

Someone dropped some 720p Gaming benchmarks! Haha.  Though the graph does show the importance of the Single-Core turbo being as high as possible for DX11 gaming.  And Rise of the Tomb Raider still hates every other CPU that isn't a 7700k, haha.


1 minute ago, Valentyn said:

He never mentioned the Titans being overclocked in his build log of the system.

I am pretty sure that they were.... He's Jayztwocents after all xD

1 minute ago, Taf the Ghost said:

http://www.tweaktown.com/reviews/8225/intel-core-i9-7900x-series-skylake-cpu-review/index7.html

 

Someone dropped some 720p Gaming benchmarks! Haha.  Though the graph does show the importance of the Single-Core turbo being as high as possible for DX11 gaming.  And Rise of the Tomb Raider still hates every other CPU that isn't a 7700k, haha.

Link to the benchmark charts? I refuse to disable Adblock xD



1 minute ago, PCGuy_5960 said:

I am pretty sure that they were.... He's Jayztwocents after all xD

Link to the benchmark charts? I refuse to disable Adblock xD

[attached: gaming benchmark charts from the TweakTown i9-7900X review (images 39-41)]


Just now, PCGuy_5960 said:

I am pretty sure that they were.... He's Jayztwocents after all xD

Link to the benchmark charts? I refuse to disable Adblock xD

Doesn't look like it, just scrubbed through all his videos on it.

Also, I noticed his Ryzen system only had 16GB of system RAM vs 32GB in the 5960X build; that could also affect performance for 4K video.

I know if I work on a small 4K project, Premiere Pro will use a minimum of 20GB.

Although Puget are a little overboard recommending at least 64GB for 4K workloads. :P



The "Intel Optimized" bit is due to a late BIOS update that enabled Turbo Boost 3.0 and automatically put the RAM to 2666.  Intel doesn't consider that to be Overclocking but "Optimized".  Make of the marketing speak what you will.


21 minutes ago, Valentyn said:

Oh, I need to mention that Jay used two Maxwell Titans in the Ryzen build, and GTX 1080s in the 5960X build.
Puget Systems found the GTX 1080 sped up Premiere Pro and other Adobe suite workloads by roughly 10% at 4K compared to the Maxwell Titans.

 

 

Interestingly the 980Ti does better in Adobe Premiere Pro than the Titan X Maxwell did.

 

https://www.pugetsystems.com/labs/articles/GTX-1070-and-GTX-1080-Premiere-Pro-Performance-810/


So one could argue that if they both used 1080s, they'd perform the same, but as you mentioned, the overclock on the 5960X is what steals the show in the end.

 

That's a nice observation, but I think most can see that these were not heavily GPU-taxing workloads.  But your point stands.

 

He also pointed out that the 5960x rig was an old Windows install filled with stuff and that the Ryzen was a fresh install.

 

The CPU overclock further demonstrated that the workload was CPU intensive as doing so on the 5960x resulted in an additional increase in performance.  

 

One thing I do know with confidence is that I'm willing to demonstrate the difference in performance with anyone who has a R7 and is willing to do so.

 

Lastly, I think anyone who watches the video will understand that Jay's intent was to showcase the CPUs, and he felt the setups were comparable.


4 minutes ago, PCGuy_5960 said:

I am pretty sure that they were.... He's Jayztwocents after all xD

Link to the benchmark charts? I refuse to disable Adblock xD

Here are all the benchmarks from it.

[attached: the full set of benchmark charts from the TweakTown i9-7900X review (images 25-45)]


For as pretty as Rise of the Tomb Raider is, on PC that game is an atrocity for benchmarking: it's unstable run to run on the same hardware.  The 6950X's minimum actually improved by 32% going from 1080p to 1440p.  It does things like that.


1 minute ago, OriAr said:

[attached: chart 45 from the TweakTown i9-7900X review]

Looks like the tables have turned. I remember people always pointing out how little juice Intel CPUs used. :P


7 minutes ago, done12many2 said:

 

That's a nice observation, but I think most can see that these were not heavily GPU-taxing workloads.  But your point stands.

 

He also pointed out that the 5960x rig was an old Windows install filled with stuff and that the Ryzen was a fresh install.

 

The overclock further demonstrated that the workload was CPU intensive as doing so on the 5960x resulted in an additional increase in performance.  

 

One thing I do know with confidence is that I'm willing to demonstrate the difference in performance with anyone who has a R7 and is willing to do so.

 

 

Yeah, I noticed Jay also points out his 1800X system only had 16GB of RAM vs 32GB on the 5960X. I know when I do small 4K projects it uses at least 20GB on my system.
Although since we can't download and test his workflow for benchmarking, it's not possible to know just how much it used.

 

Wish Jay had a review site; it's nice when companies and people in the video industry actually upload their "tests" so you can download them, run them on your own system, and compare results.



Just now, TOMPPIX said:

Looks like the tables have turned. I remember people always pointing out how little juice Intel CPUs used. :P

The i9-7980XE jokes involving Nuclear Reactors are going to be a regular thing.


54 minutes ago, PCGuy_5960 said:

But, the spreadsheet doesn't show overclocked performance, a 4.7GHz 7820X would be at least 30% better than a 4GHz Ryzen :D

You can't really blame me for the contents of my spreadsheet when it was created using data you provided. :P


Just now, Valentyn said:

Yeah, I noticed Jay also points out his 1800X system only had 16GB of RAM vs 32GB on the 5960X. I know when I do small 4K projects it uses at least 20GB on my system.
Although since we can't download and test his workflow for benchmarking, it's not possible to know just how much it used.

 

Wish Jay had a review site; it's nice when companies and people in the video industry actually upload their "tests" so you can download them, run them on your own system, and compare results.

 

Agreed, it would be nice to see the "what does this mean to me?" for yourself.  

 

Anyways, it's pretty cool that a 2.5 year old CPU more than keeps up with latest gen chips in serious workloads.  


56 minutes ago, done12many2 said:

Now you and I both know that @tom_w141 prefers shaping the outcome to make a point. :)

Ironic, when I didn't provide the dataset used for that spreadsheet? I literally just fed in the data from @PCGuy_5960's graphs, so you can't accuse me of shaping anything :P


5 minutes ago, tom_w141 said:

You can't really blame me for the contents of the my spreadsheet when it is created using data you provided. :P 

Yeah, I am simply saying that if you overclock both, the 7820X will destroy the 1800X :D

(And this is why I have decided to get a 7820X or a 7800X :P)


