Intel claims Core i9-13900K will be 11% faster on average than AMD Ryzen 9 7950X in gaming

[GIF: Tom Hardy "That's bait" reaction]

 

The gaming power scaling looks good, but my god, what is this?
 

[Image: Power_Scaling_AMD.png]


22 minutes ago, KaitouX said:

ComputerBase has a decent amount of tests on the 13900K and 13700K at lower power limits, derBauer has a video on it where he shows CB, 3DMark CPU test and games at lower power limits and some with undervolting on the 13900K. HUB did a Cinebench power curve comparison between the 7950X and 13900K in their review.

OK, as much as I can't watch HUB, at least they put the data on a web page. The data below is what I was after, though presented pretty badly. Still, it shows quite a big efficiency advantage for AMD. It would be interesting to see how this goes in other workloads, but I wouldn't expect anything much different.

 

Intel 7 never seemed as good as TSMC N7, and even with the improvements they claim for Raptor Lake, it is no match for AMD's optimised N5 at TSMC. 

 

[Image: HUB power-efficiency chart]

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, porina said:

Did they? If you have good evidence, please share it. I would point out again that "stock" testing will be performed with officially supported hardware, which may seem lacking compared to enthusiast choices that go beyond the standards.

 

I would treat that with more caution until we have a better picture. The questionable narrative some sites push is that it's a >300W CPU in general, but I'd like more direct comparisons across a wider range of workloads. The ideal comparison would be someone taking the 7950X and 13900K and doing a perf/W curve for both on varied workloads. That would give a true picture of their competitiveness. Again, the power limit on Intel is a system builder's choice. Let's see what it does at lower power limits too. You don't have to drive your car on the red line all the time.

If you look at the launch benchmarks, they used DDR5 for their own CPUs and slower, cheaper RAM for the AMD system. Same with the motherboards. Linus and other YouTubers have talked about this issue. Another thing is they're comparing their CPUs against a CPU that wasn't even designed to compete with 13th gen.


2 minutes ago, Noble3212 said:

If you look at the launch benchmarks, they used DDR5 for their own CPUs and slower, cheaper RAM for the AMD system. Same with the motherboards. Linus and other YouTubers have talked about this issue. Another thing is they're comparing their CPUs against a CPU that wasn't even designed to compete with 13th gen.

Raptor Lake officially supports 5600, while Zen 4 officially supports 5200. You can't test with faster RAM and call it stock. By all means test at faster speeds for additional information, but keep in mind it is non-standard. We had the same problem around Zen 3, if memory serves me correctly. AMD only officially supported 3200, but at the same time suggested 3600 as the sweet spot to aim for. So AMD fanboys got upset that some sites used 3200 for testing and saw it as intentionally crippling performance. No, they were testing it exactly as AMD intended; 3600 operation is not guaranteed at all.

 

I don't know about the mobo situation, but mobo performance reviews have died off since there is practically zero difference between boards unless you have a really bad one.



Life in Zen 3 is good.

Case: Lian Li O11-dynamic mini | CPU: AMD Ryzen 5 5600X | GPU: AMD Radeon RX6800 | Motherboard: ASUS ROG Strix B550 E-Gaming | Memory: 32Gb 3600Mhz G. SKILL Trident Z | PSU: Corsair SF750 Platinum | Cooling: Lian Li Galahad AIO 240 | Case fans: Lian Li Unifans


1 hour ago, Shzzit said:

What exactly did I make up here ?  
 

And lol at ur passive aggressive crying.

Here?

2 hours ago, Shzzit said:

That’s exactly what they did, pure click bait bs lol.   

You're misrepresenting GN pretty badly, even suggesting they're basically just a clickbait channel.

 

If you have a gripe, great: explain it and give examples to support it, but don't act like you're being impartial or objective in any way.

 

P.S. your remarks are more argumentative and insulting than any of those responding to you, so quit projecting.

Parasoshill

adjective

  • A person whose parasocial relationship with a social media influencer or content creator has driven them to promote or blindly defend them, acting as a shill for their benefit.

Are there any Z790 motherboards that can run the GPU at x16 PCIe 4.0 along with x4 4.0 for two M.2 drives?

 

Reading the M.2 drive options in motherboard specs is a nightmare.

Most I see will drop the GPU to x8.

 

4 hours ago, WildDagwood said:

Here?

You're misrepresenting GN pretty badly, even suggesting they're basically just a clickbait channel.

 

If you have a gripe, great: explain it and give examples to support it, but don't act like you're being impartial or objective in any way.

 

P.S. your remarks are more argumentative and insulting than any of those responding to you, so quit projecting.

Aww, I get it, only groupthink in here? No one can have an opinion that they don't like Gamers Nexus? Did I shade ur favorite YouTuber?

Grow up lol.

For one, if the CPU was at Intel stock settings then it wouldn't go over 252 watts. If it's hitting 300-plus watts then its motherboard has changed its settings to unlimited power.

And a thumbnail with a dumb face and giant words that make you think it uses massive power to do anything.

Then you watch the video and it's the opposite.

You don't like my opinion, that's fine; telling me I can't have one and should stop talking is not OK.

My gripe was clear: I think Gamers Nexus uses clickbait titles and thumbnails.

So again, not sure what the hell ur crying about.

 

 

 

Edited by SansVarnic
Merged - Please learn to use the Multi-quote function.

CPU: i9-13900KS | Motherboard: Asus Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: G.Skill 7600 DDR5 | Screen: ASUS 48" OLED 138Hz


15 minutes ago, Shzzit said:

For one, if the CPU was at Intel stock settings then it wouldn't go over 252 watts.

Not exactly true. The default PL2 calculation is PL1 × 1.25 = PL2, and this was the standard on previous generations. That means if PL1 = 125W, PL2 would be 156W. Intel themselves do not follow this formula, as it is merely a recommendation and can vary based on the thermal solution used. Stock settings since Alder Lake have been PL1 = PL2, with the stipulation that board vendors are now allowed to decide the PL2 power limit as a default. The default Tau duration is 10ms, which allows the CPU to boost beyond the PL2 limits assuming thermal and power overhead exist. This is all defined in the whitesheets under section 4.2. If you look at section 4.2.2, you can even see Intel ignoring the formula defined in section 4.2 by running PL2 at 241W instead of 156W. Attached the whitesheet for context.
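As a quick sanity check of the arithmetic above, here's a tiny sketch of the recommended PL2 formula (illustrative only, not Intel's firmware logic; board vendors can and do override these values):

```python
# Sketch of the default turbo power-limit recommendation described above:
# PL2 = PL1 x 1.25. The multiplier is a recommendation, not a hard rule.
def default_pl2(pl1_watts: float) -> float:
    """Return the recommended PL2 for a given PL1."""
    return pl1_watts * 1.25

print(default_pl2(125))  # 156.25 -> the ~156 W figure mentioned above
# Note: the 13900K instead ships with PL1 = PL2 = 241 W or higher,
# which is exactly the deviation from the formula discussed above.
```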

 

23 minutes ago, Shzzit said:

If it's hitting 300-plus watts then its motherboard has changed its settings to unlimited power.

True, however this is now allowed by Intel per their whitesheet and is considered intended design. Personally I am against it, but they claim their boost techniques were designed to take advantage of the overhead, so as long as it doesn't outright kill processors... Since Intel is defining what is acceptable for their processors, we can't fault the board partners for operating within that defined spec, even if it exceeds Intel's advertised paper specs. If they didn't want to allow it, they'd define as much in the whitesheets and impose microcode limitations like they did back in the day when we were overclocking locked Skylake chips.

 

Also, step back and chill for a moment. The person you were talking to wasn't out to insult or offend you; they merely took issue with the way you went about your critique. There is a difference between being objective and being emotional with one's opinion.

220311547_Intel12thGenWhitesheet.pdf

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


17 minutes ago, Shzzit said:

Aww, I get it, only groupthink in here? No one can have an opinion that they don't like Gamers Nexus? Did I shade ur favorite YouTuber?

Grow up lol.

For one, if the CPU was at Intel stock settings then it wouldn't go over 252 watts. If it's hitting 300-plus watts then its motherboard has changed its settings to unlimited power.

And a thumbnail with a dumb face and giant words that make you think it uses massive power to do anything.

Then you watch the video and it's the opposite.

You don't like my opinion, that's fine; telling me I can't have one and should stop talking is not OK.

My gripe was clear: I think Gamers Nexus uses clickbait titles and thumbnails.

So again, not sure what the hell ur crying about.

 

 

 

Do you always get this defensive over constructive criticism, viewing it as some kind of attack? Or are you just incapable of having a conversation without dishing out insults to someone who doesn't necessarily agree with you?

 

Regardless of the thumbnail, anyone going into the video thinking 300W is the normal power draw is a little naive, in my opinion. The first sentence of the video even clarifies that it's under an all-core workload. So even if you want to say the title is a bit clickbaity (which I can agree with), it's nothing compared to how you're presenting it.

 

Don't think anyone ever said you can't have an opinion. How you go about giving it is definitely deserving of criticism though, so I'm "not sure what the hell ur crying about".

 

Think you need to take a peek in the mirror and look at how you approach conversations before you tell anyone to grow up.

 

I'll probably tap out here though, because you honestly seem a little toxic. Not really intended as an insult, but take it for what it is: an opinion based on how you present yourself.


3 hours ago, GoodBytes said:

It all comes down to your actual needs.

For gaming unless you're running a 4090 the top end framerate difference is marginal at best, and AMD comes out on top more often than not regarding 0.01% and 0.1% lows. 


Just now, ravenshrike said:

For gaming unless you're running a 4090 the top end framerate difference is marginal at best, and AMD comes out on top more often than not regarding 0.01% and 0.1% lows. 

I think we've reached a point, in general, where the CPU isn't going to matter unless you are trying to push 1080p at 500Hz on that new ASUS panel and don't want to be bottlenecked by the CPU feeding that frame data, lol.

 

For most people, the compromise is going to be a nice resolution and a nice refresh rate. 1440p 165 or 4K 120 is going to be fine even on mid-range processors. I'd even go as far as to say that most previous-generation processors from the past 2-3 years will be fine at those resolutions. Even now, we are reaching a point where display technology is starting to bottleneck the power offered by GPUs, and these halo-tier cards are going to be pointless unless new technologies (akin to raytracing) come out and magically start to tax our hardware again.

 

Odd to see people splitting hairs over which CPU is faster when both are beyond what 99% of people are going to need.



2 minutes ago, MageTank said:

I think we've reached a point, in general, where the CPU isn't going to matter unless you are trying to push 1080p at 500Hz on that new ASUS panel and don't want to be bottlenecked by the CPU feeding that frame data, lol.

 

For most people, the compromise is going to be a nice resolution and a nice refresh rate. 1440p 165 or 4K 120 is going to be fine even on mid-range processors. I'd even go as far as to say that most previous-generation processors from the past 2-3 years will be fine at those resolutions. Even now, we are reaching a point where display technology is starting to bottleneck the power offered by GPUs, and these halo-tier cards are going to be pointless unless new technologies (akin to raytracing) come out and magically start to tax our hardware again.

 

Odd to see people splitting hairs over which CPU is faster when both are beyond what 99% of people are going to need.

I think we just suddenly hit big diminishing returns.

 

You need an RTX 4090 at 1080p/1440p to expose CPU differences this marginal... and at the same time there are very few people with a 4090 who will play at 1080p, while at 4K the CPU (any newish one) basically doesn't matter.

 

So there's just kind of a hole at 1440p where all of this can make sense, depending on what you play.


2 minutes ago, WereCat said:

I think we just suddenly hit big diminishing returns.

 

You need an RTX 4090 at 1080p/1440p to expose CPU differences this marginal... and at the same time there are very few people with a 4090 who will play at 1080p, while at 4K the CPU (any newish one) basically doesn't matter.

 

So there's just kind of a hole at 1440p where all of this can make sense, depending on what you play.

This is flawed logic though. A Ryzen 3600 won't bottleneck a 4090 at 4K... BUT your 1% lows are still going to be trash...

 

Let alone just the general computing performance. My Ryzen 7700X just loads INSTANTLY, even compared to my M1 Mac, my previous 12600K system, etc. Idk what they did, or if it's just the clock speed, boost behavior, etc., but it makes a huge difference. It's just WICKED fast.

 

The 13900K looks amazing too. The 1% lows aren't much worse than the 5800X3D's. There is so much more to the equation than whether it'll bottleneck a GPU at a given resolution.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


2 minutes ago, WereCat said:

I think we just suddenly hit big diminishing returns.

 

You need an RTX 4090 at 1080p/1440p to expose CPU differences this marginal... and at the same time there are very few people with a 4090 who will play at 1080p, while at 4K the CPU (any newish one) basically doesn't matter.

 

So there's just kind of a hole at 1440p where all of this can make sense, depending on what you play.

Then you have people like me. Buys a 12GB 3080, 4K 120Hz OLED TV, 5950X, and pushes an all-core OC of 4.7GHz only to play 15-year-old MMOs and 20-year-old shooters, lol.



2 hours ago, porina said:

OK, as much as I can't watch HUB, at least they put the data on a web page. The data below is what I was after, though presented pretty badly. Still, it shows quite a big efficiency advantage for AMD. It would be interesting to see how this goes in other workloads, but I wouldn't expect anything much different.

 

Intel 7 never seemed as good as TSMC N7, and even with the improvements they claim for Raptor Lake, it is no match for AMD's optimised N5 at TSMC. 

 

[Image: HUB power-efficiency chart]

Important to note that this Techspot/HUB chart goes against what Intel themselves claimed and ComputerBase confirmed: that the 13900K at 65W is almost on par with a stock 12900K on average. Intel used SPEC; ComputerBase used their own test suite. der8auer's CB R20 results are close to ComputerBase's too.

 

Spoiler

CB R20

[Chart: CB R20 results at reduced power limits]

[Chart: CB R20 power-scaling comparison]

 

CB R23

[Chart: CB R23 results at reduced power limits]

You can find the complete ComputerBase charts here: https://www.computerbase.de/2022-10/intel-core-i9-13900k-i7-13700-i5-13600k-test/2/#abschnitt_leistung_in_apps_bei_reduzierter_tdp


1 minute ago, CHICKSLAYA said:

This is flawed logic though. A Ryzen 3600 won't bottleneck a 4090 at 4K... BUT your 1% lows are still going to be trash...

I am going to need you to elaborate on this. I've tested many processors in my time (R5 3600 included), and I have never noticed terrible 1% lows in either our synthetic or gaming suites when operating at 4K. What exactly constitutes the 1% lows being "trash"? 1% lows at 1080p and 4K should be nearly identical on the processor side; you aren't magically handling more draw calls just because you increased the resolution.
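For reference on what's being argued about here, this is one common way a "1% low" figure is computed from a frame-time log: average frame rate over the slowest 1% of frames. This is a hedged sketch; reviewers differ in the exact method (some use the 99th-percentile frame time instead), and the sample data is made up:

```python
# Sketch: "1% low" FPS as the average frame rate over the slowest 1% of frames.
# Illustrative only; not any specific outlet's exact formula.
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                 # worst 1% of frames
    avg_ms = sum(worst[:n]) / n                   # mean frame time of that 1%
    return 1000.0 / avg_ms                        # convert ms -> FPS

# Mostly 120 FPS (8.3 ms) with a handful of 60 FPS (16.7 ms) dips:
frames = [8.3] * 990 + [16.7] * 10
print(round(one_percent_low_fps(frames), 1))  # 59.9
```

The point in the post above follows from this: the CPU-side frame-time spikes that drive 1% lows don't grow with resolution, which is why lows at 1080p and 4K tend to match on the same processor.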

 

5 minutes ago, CHICKSLAYA said:

Let alone just the general computing performance. My Ryzen 7700X just loads INSTANTLY, even compared to my M1 Mac, my previous 12600K system, etc. Idk what they did, or if it's just the clock speed, boost behavior, etc., but it makes a huge difference. It's just WICKED fast.

It doesn't feel any different to me, and I've tested every AM5 processor currently available. It just feels like every other CPU of the past few years. Is there a specific task you are performing where a change in responsiveness was noticed? I'd like to test that and get a better understanding of why that is. I'd have to assume whatever it is is heavily cache-based, as memory-wise I still consider AM5 a step backwards from AM4 when it comes to overall memory latency, mostly due to the new unstrapped FCLK/UCLK/MEMCLK configuration and the minimum timing limitations imposed by DDR5 in general. Admittedly I still don't have all of DDR5 figured out for AMD, but I am not yet impressed. The larger cache on AM5 is certainly a welcome change and I'd like to see this trend continue in future designs.

 

8 minutes ago, CHICKSLAYA said:

There is so much more to the equation than whether it'll bottleneck a GPU at a given resolution.

I still beg to differ, mostly because I am not seeing what you are seeing when it comes to poor 1% low performance at higher resolutions. Understand, if you are buying a high-end GPU and monitor to pair with it, you'll have VRR technologies to help mask the smaller deviations in framerate. Granted, if the dip is jarring enough to break the VRR window, you'll certainly notice it, but if you are dipping from a 115 frame cap on a 120hz display down to 80-90, you won't really notice that, at least most people wouldn't.



16 minutes ago, MageTank said:

Then you have people like me. Buys a 12GB 3080, 4K 120Hz OLED TV, 5950X, and pushes an all-core OC of 4.7GHz only to play 15-year-old MMOs and 20-year-old shooters, lol.

We're probably in the same boat then. I upgraded from a 1080 Ti to a 3060 Ti to play Planescape: Torment and FFXIV.


28 minutes ago, MageTank said:

Odd to see people splitting hairs over which CPU is faster when both are beyond what 99% of people are going to need.

Thank you!

 

The only thing I am seeing here is one CPU for which compatible parts are too expensive to buy, and one CPU that fries itself up being so desperate to stay ahead.



1 hour ago, ravenshrike said:

For gaming unless you're running a 4090 the top end framerate difference is marginal at best, and AMD comes out on top more often than not regarding 0.01% and 0.1% lows. 

Oh nice, which benchmarks did you see? All the ones I've seen show the 13900K's lows in the lead.

 

Not sure if I want a 7700 or an i7, maybe wait for the 3D.



Just now, Shzzit said:

Oh nice, which benchmarks did you see? All the ones I've seen show the 13900K's lows in the lead.

 

Not sure if I want a 7700 or an i7, maybe wait for the 3D.

I am essentially renting my 7700x for $100 until the 7800x3d comes out and I sell it for $300



3 minutes ago, CHICKSLAYA said:

I am essentially renting my 7700x for $100 until the 7800x3d comes out and I sell it for $300

Hahah nice mate, you think they will do 3D on a 16-core? I heard they can only do it on a single-chiplet 8-core. Not sure if true. Or maybe it costs too much?



3 minutes ago, Shzzit said:

Hahah nice mate, you think they will do 3D on a 16-core? I heard they can only do it on a single-chiplet 8-core. Not sure if true. Or maybe it costs too much?

There's gonna be a 7950X3D and 7800X3D for sure. Also rumored is perhaps a 7900X3D. All three are gonna be juicy. Apparently they fixed the voltage issues with second-gen 3D cache, so you'll be able to run PBO etc., unlike on the 5800X3D. If they can keep the clock speed the same it'll be absolutely naughty.



5 hours ago, Shimmy Gummi said:

The problem I'm facing is that while this is all good on paper, I've yet to find a compelling reason to upgrade even though all the fibers of my being scream at me to spend money.

To be fair, the only reason I'm on my 10900K (from my 8700K) is because I got the whole PC cheap off my cousin... and he let me pay for it monthly. And no, I don't feel the need to upgrade from the 10900K either... no matter how hard GN's Steve twists the knife in his CPU reviews XD

 With all the Trolls, Try Hards, Noobs and Weirdos around here you'd think i'd find SOMEWHERE to fit in!


6 minutes ago, SimplyChunk said:

To be fair, the only reason I'm on my 10900K (from my 8700K) is because I got the whole PC cheap off my cousin... and he let me pay for it monthly. And no, I don't feel the need to upgrade from the 10900K either... no matter how hard GN's Steve twists the knife in his CPU reviews XD

We say this now

 

but wait until there's a sale or other promo

 

it will be just like the 10900k all over again LOL

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


2 minutes ago, Shimmy Gummi said:

We say this now

 

but wait until there's a sale or other promo

 

it will be just like the 10900k all over again LOL

The way I see it, if my CPU is still in GN's benchmarks as a comparison point, then it'll be alright to skip the chip that's being reviewed.


