
Thinking of switching from a 13900K to a 7800X3D for gaming, to reduce case heat and power and get better gaming performance

I am thinking of getting a Ryzen 7800X3D, as I mostly only game and want lower power consumption and less heat dumped into the case. The 13900K draws so much power and puts out so much heat even in gaming, with e-cores off and only the 8 P-cores at 5.6GHz.

Is the AMD frame-dip thing real, or just FUD?

And are the frame times on a 7800X3D as good as on a 13900K with tuned DDR4 and e-cores off?

I have read claims about the 7800X3D having better highs but worse frame times and not being as smooth. Is that true or not?

 

https://www.youtube.com/watch?v=B31PwSpClk8&t=729s

And if the AMD dip thing is true, can it be mitigated by a manual static overclock on a motherboard with an external clock generator, since maybe the dynamic boost behavior of AMD CPUs causes it?

And if I do a manual overclock, could I realistically hit a mostly stable 5GHz all-core on a 7800X3D with an NH-D15S?

Is the extra cache that good, and do almost all games benefit greatly from it?

I imagine even a 7800X3D manually overclocked to 5GHz or close would still draw far less power in gaming than a 13900K at 5.6 to 5.7GHz with a 5GHz ring, even with e-cores off?

I just wonder if there would be any frame-time inconsistency or performance regression in any gaming scenario, except of course CS:GO or the heavily threaded Battlefield games. Of course I know productivity performance will be far lower, but that is secondary to getting lower temps and as-good-or-better gaming performance at 4K with an RTX 4090.


19 minutes ago, Wolverine2349 said:

I am thinking of getting a Ryzen 7800X3D, as I mostly only game and want lower power consumption and less heat dumped into the case. …

The 7800X3D is a dream to use. It's not a huge upgrade vs a 13900K though, so it might be more trouble than it's worth.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


4 minutes ago, Deadpool2onBlu-Ray said:

The 7800X3D is a dream to use. It's not a huge upgrade vs a 13900K though, so it might be more trouble than it's worth.

 

 

You mean not a huge gaming upgrade? Is that true even using Samsung B-die DDR4 with the 13900K? I do not use DDR5 on 13th gen, as it gave me nothing but trouble.

 

And do you mean just in performance? Would it be a big upgrade in terms of better thermals and less power/heat generated inside the case?


27 minutes ago, Wolverine2349 said:

I am thinking of getting a Ryzen 7800X3D, as I mostly only game and want lower power consumption and less heat dumped into the case. …

Why don't you simply drop in a 13600K instead?

System : AMD R9 5900X / Gigabyte X570 AORUS PRO/ 2x16GB Corsair Vengeance 3600CL18 ASUS TUF Gaming AMD Radeon RX 7900 XTX OC Edition GPU/ Phanteks P600S case /  Eisbaer 280mm AIO (with 2xArctic P14 fans) / 2TB Crucial T500  NVme + 2TB WD SN850 NVme + 4TB Toshiba X300 HDD drives/ Corsair RM850x PSU/  Alienware AW3420DW 34" 120Hz 3440x1440p monitor / Logitech G915TKL keyboard (wireless) / Logitech G PRO X Superlight mouse / Audeze Maxwell headphones


5 minutes ago, Wolverine2349 said:

Would it be a big upgrade in terms of better thermals and less power/heat generated inside the case?

In gaming scenarios, you'll lower your power use by around 70W. Considering that your 4090 can output 450W, I don't think it matters that much. Depending on how the airflow in your PC is set up, they are somewhat separate anyway. If you want your PC to use less power (i.e. output less heat), you can just power-limit the components: tell your GPU to use only 350W and your CPU to stick to 100 or 80, and if you are happy with the performance, stick with that.
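For what it's worth, here is a minimal sketch of what that looks like on a Linux box, assuming an NVIDIA card with nvidia-smi on the PATH and the intel-rapl powercap interface (the RAPL zone name varies by system, and both operations need root); on Windows you'd set the same limits through Afterburner and the BIOS instead:

```python
# Hedged sketch: cap component power draw on Linux. The wattages and the
# RAPL zone name are illustrative assumptions, not tuned recommendations.
import subprocess
from pathlib import Path

def limit_gpu_watts(watts: int) -> None:
    # nvidia-smi -pl sets the board power limit in watts
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def limit_cpu_watts(watts: int, zone: str = "intel-rapl:0") -> None:
    # constraint_0 is the long-term (PL1) package limit, in microwatts
    path = Path(f"/sys/class/powercap/{zone}/constraint_0_power_limit_uw")
    path.write_text(str(watts * 1_000_000))

if __name__ == "__main__":
    limit_gpu_watts(350)  # e.g. cap a 450 W-capable 4090 at 350 W
    limit_cpu_watts(100)  # e.g. cap the CPU package at 100 W
```

The same idea applies whatever tool you use: set a limit, play for a while, and only lower it further if you can't see the difference.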


2 minutes ago, PDifolco said:

Why don't you simply drop in a 13600K instead?

 

 

I do not want only 6 P-cores, and I hate e-cores and the hybrid arch crap.

 

And the 7800X3D should murder that chip.

 

I have a 13900K with e-cores disabled. If Intel had an 8-P-core-only chip on the current Raptor Cove architecture, I would have gone with that, but all their chips have e-cores. And I want lower power; plus it feels like kind of a waste to disable parts of the CPU, especially when a great lower-power, lower-heat 8-core choice exists that provides as-good-or-better gaming performance.

 

I just need the right mobo with no coil whine; one AMD board, an Asus Strix X670E-E, had it at idle, whereas no Intel boards did. I have never tried another AM5 board, though.


53 minutes ago, Wolverine2349 said:

I am thinking of getting a Ryzen 7800X3D, as I mostly only game and want lower power consumption and less heat dumped into the case. …

The 13900K actually has an impressive amount of L2+L3 cache, but I can understand the desire for better thermals.

 

For comparison, I've tested running 8c/16t on my 7950X3D in both 3D and non-3D configurations, and I also have prior experience running a 5800X3D with 3D V-Cache, which replaced a 3950X.

 

3D V-Cache's biggest benefit, in my opinion, is the higher minimum framerates. I imagine some scenarios are DRAM-latency-limited, where having the extra 64MB of L3 keeps the CPU from having to go out to system RAM to keep executing. I'd sometimes get higher peak framerates on the pseudo-7700X setup, but it would be noticeably choppy compared to the pseudo-7800X3D setup.
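To make the DRAM-latency point concrete, below is a rough, hedged illustration (not a benchmark of any specific CPU): a random pointer chase where every load depends on the previous one. Interpreter overhead dominates the absolute numbers, and the byte sizes are only approximate given CPython object overhead, but the jump in time per hop once the chain no longer fits in cache is usually still visible:

```python
# Hedged sketch: latency-bound pointer chasing across growing working sets.
# Slot counts are illustrative; treat the sizes as rough orders of magnitude.
import random
import time

def ns_per_hop(n_slots: int, steps: int = 1_000_000) -> float:
    # Build one big random cycle so each load depends on the previous one
    order = list(range(n_slots))
    random.shuffle(order)
    nxt = [0] * n_slots
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = nxt[i]
    return (time.perf_counter() - t0) / steps * 1e9  # nanoseconds per hop

for slots in (1 << 14, 1 << 18, 1 << 22, 1 << 23):  # roughly KBs up to hundreds of MB
    print(f"{slots:>9} slots: {ns_per_hop(slots):6.1f} ns/hop")
```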

 

A secondary benefit, really, is the drastically lower wattage, since it'll draw at most 150W (7950X3D in 8+0 configuration); the 7800X3D is TDP-limited to around 120W by comparison. I almost bought a 13900K instead but didn't, simply because it's boring. This way I also have a CPU capable of a near 1:1 test of 3D V-Cache's efficacy.

 

If you can break even by swapping, then sure. Otherwise I'd consider just waiting for the Ryzen 8000 series instead, since the 13900K is on par with the 7800X3D.

Ryzen 7950x3D Direct Die NH-D15

RTX 4090 @133%/+230/+500

Builder/Enthusiast/Overclocker since 2012  //  Professional since 2017


11 minutes ago, Agall said:

The 13900K actually has an impressive amount of L2+L3 cache, but I can understand the desire for better thermals. …

 

 

Are the higher minimum framerates the same thing as 1% and 0.1% lows and frame times?

 

Like, how would a 7800X3D compare in frame times and 1% and 0.1% lows in most modern games against a 13900K at 5.7GHz all-core, e-cores disabled, with 4200MHz CL16 DDR4 in Gear 1?

 

I can probably break even by swapping, as I have a KS model with a P-core SP of 124 and an e-core SP of 97; those are not available anywhere, so it is probably worth a lot of money.


14 minutes ago, Wolverine2349 said:

 

 

Are the higher minimum framerates the same thing as 1% and 0.1% lows and frame times? …

Most likely yes, with my experience being my evidence rather than empirical metrics like 0.1% and 1% lows. Drastic shifts in latency, even down to 1ms, are something humans are capable of noticing, and something I've had experience with for a long time, going all the way back to 2014 when I had SLI GTX 780 Tis and a DIY G-sync-modded 1080p 144Hz display. That gave me all the evidence I needed to never do SLI again, since the slight difference in input latency between cards made it nearly unplayable.
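For anyone wondering how those metrics relate: 1% and 0.1% lows are derived directly from the frame-time log. A minimal sketch of the usual calculation is below; note that capture tools differ slightly (some report the percentile frame time itself rather than averaging the tail), and the sample data here is made up:

```python
# Hedged sketch: derive average FPS and 1% / 0.1% lows from frame times (ms),
# e.g. a log exported from CapFrameX or PresentMon. Sample data is synthetic.
import statistics

def lows_from_frametimes(frametimes_ms: list[float]) -> dict[str, float]:
    # Convert each frame time to an instantaneous FPS, sort ascending,
    # then average the worst 1% and 0.1% of frames.
    fps = sorted(1000.0 / ft for ft in frametimes_ms)

    def avg_worst(fraction: float) -> float:
        k = max(1, int(len(fps) * fraction))
        return statistics.mean(fps[:k])

    return {
        "avg_fps": statistics.mean(fps),
        "1%_low": avg_worst(0.01),
        "0.1%_low": avg_worst(0.001),
    }

# A mostly 7 ms (~143 FPS) run with a few 20 ms hitches: the average barely
# moves, but the 0.1% low collapses -- that is the "smoothness" number.
sample = [7.0] * 2000 + [20.0] * 5
print(lows_from_frametimes(sample))
```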

 

I've run at least 144Hz and variable refresh rate since 2014, and 4K 240Hz for the last year, and it's impressed me how much smoother an experience it is. Most of that testing was in Warframe, since it's a very well-optimized game with a variety of testing environments and extremely fast/intense gameplay. However, I have to run the game with my constant overclock and an uncapped framerate, otherwise I'll experience those same slight hitches.

 

I'm really spoiled at this point, and I run a fat overclock on my RTX 4090 with an unlocked framerate for the same reasons. Unfortunately, if the game isn't literally buttery smooth, it's a subpar experience to me because of this. It's possible you wouldn't even notice otherwise, and it's the epitome of a first-world problem, but it's a problem. I noticed it as well when testing the R5 7600 and 6750 XT rig I built for my dad, which I tested by just swapping towers on my main peripheral setup.

 

If your experience is already smooth by your expectations, then I wouldn't risk Murphy's Law and swap your whole platform for some lower wattage draw and potentially imperceptibly better performance.


This "upgrade" makes zero sense, not financially or otherwise, and *especially* not for maybe like 5c lower temps.

 

These threads always come off as brag threads, like yeah, we get it, "4K", "unlocked framerates"... yay... Personally there's nothing I hate more, because it is *never* smooth, but then again I also don't chase numbers...

 

 

Imo, go ahead and change the whole platform for *potentially* very little gain; money seems to be of no concern. ¯\_(ツ)_/¯

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


18 minutes ago, Mark Kaine said:

This "upgrade" makes zero sense, not financially or otherwise, and *especially* not for maybe like 5c lower temps.

 

These threads always come off as brag threads, like yeah we 

get it "4k" "unlocked framerates".... yay... personally there's nothing i hate more because it is *never* smooth,  but then again I also don't chase numbers...

 

 

Imo go ahead change the whole platform for *potentially* very little gains, money seems to be of no concern. ¯\_(ツ)_/¯ 

Temperature is relative and, in my opinion, not relevant to this discussion. Halving the power draw from ~200W to 100W is, however, especially in some regions where electricity is getting very expensive.

 

I'd believe you on the *never* smooth part if I hadn't experienced genuinely smooth performance. It is possible, but it costs nearly $5k total, including peripherals.

 

I'm not sure if telling everyone you're a computer hardware whale willing to spend absurd money on slight gains is 'bragging'; it's more indicative of financial irresponsibility in most scenarios.


6 minutes ago, Agall said:

Temperature is relative and, in my opinion, not relevant to this discussion. Halving the power draw from ~200W to 100W is, however. …

 

 

It's also the excess heat dumped into the case. Intel seems to be a lot worse in that regard; any workload takes CPU temps into the 80s, and some into the 90s.

 

It's not about electricity costs. It's about sitting next to the PC in a hot summer, trying to keep the room cooler even in an air-conditioned house. If the performance is truly better, or at least no downgrade in gaming (yes, that includes frame times and smoothness), it may be worth the swap if I can break even, since the 13900KS with P-core SP 124 and e-core SP 97 is a top-tier bin I could sell for a lot.

 

I have been trying to lower temps and it's so hard, especially on air, but I want to stay on air and keep the 8 P-cores at 5.6 to 5.7GHz.


I only quickly skimmed the thread, but I didn't see anyone suggest just power-limiting the Intel chip. See what the gaming performance impact is.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, Agall said:

I'd believe you on the *never* smooth part if I hadn't experienced genuinely smooth performance. …

No, because unlocked framerates, in 99% of all cases, *are* horribly stuttery, and G-sync doesn't help (it makes it worse imo).


53 minutes ago, Mark Kaine said:

No, because unlocked framerates, in 99% of all cases, *are* horribly stuttery, and G-sync doesn't help (it makes it worse imo).

G-sync is there for when the framerate dips below the maximum, but if you're shoving framerate down its throat, it'll always have an available frame. I personally prefer to overdrive 1.5-2x simply for this, but in games with extreme framerate variability, leaving it unlocked is a sure way of having the freshest frame available for the display when it refreshes, whether that's at the maximum or a variable rate.

 

The issue you describe with G-sync I've never seen in the 5 different G-sync/FreeSync monitors I've had since 2014, the first of which was a pre-production DIY kit from Nvidia for an Asus VG248QE. One thing I learned back then and still do today is to globally disable V-sync through the 3D settings in Nvidia Control Panel, something I recommend to anyone. I'm not sure if it's still applicable in 2023, but at some point shortly into a new install/system, I find myself disabling it for various reasons.

 

Since it came out, G-sync has worked as advertised for me, only having some major issues in those pre-production days, which they quickly ironed out before its proper release.

 

Ideally we don't derail the discussion about the two CPUs by discussing VRR; that can be done in another thread if needed.

 

1 hour ago, porina said:

I only quickly skimmed the thread, but I didn't see anyone suggest just power-limiting the Intel chip. See what the gaming performance impact is.

I generally view the power-consumption-vs-performance curve as strongly nonlinear (dynamic power scales roughly with voltage squared times frequency, and voltage has to rise with frequency), so there's a point of optimal power draw for the silicon at normal operating conditions, beyond which each extra watt buys very little performance.
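As a hedged back-of-the-envelope version of that argument: dynamic power goes roughly as P = C·V²·f, and voltage has to rise with frequency near the top of the V/f curve, so power climbs close to cubically while performance scales at best linearly with clocks. The baseline and V/f slope below are made-up illustrative values, not measured 13900K figures:

```python
# Hedged sketch of why the last few hundred MHz cost so much power.
# P ~ V^2 * f, with V assumed to rise linearly with f (illustrative slope).
def rel_power(f_ghz: float, v_base: float = 1.0, f_base: float = 4.0,
              dv_per_ghz: float = 0.1) -> float:
    v = v_base + dv_per_ghz * (f_ghz - f_base)  # assumed V/f curve
    return (v ** 2) * f_ghz

base = rel_power(4.0)
for f in (4.0, 4.5, 5.0, 5.5, 5.8):
    print(f"{f:.1f} GHz: ~{rel_power(f) / base:4.2f}x power for {f / 4.0:4.2f}x clocks")
```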

 

As I expected, it's not a good idea, since there's otherwise a reason why the 13900K wants to draw 250W, and it would likely draw more if it thermally could. The same goes for the RTX 4090: I don't have to run it at 133% TDP, but I've learned that doing so results in overall smoother gameplay, though I could get 'close enough' performance at even 80% TDP. The answer to your question, per the chart below, being 105W.

[Chart] Core i9-13900K & Ryzen 9 7950X Scaling Performance: CPU Short Form - A Lighter Touch: Exploring CPU Power Scaling On Core i9-13900K and Ryzen 9 7950X (anandtech.com)


9 minutes ago, Agall said:

I generally view the power-consumption-vs-performance curve as strongly nonlinear … so there's a point of optimal power draw for the silicon.

For sure it isn't linear. Voltage is not constant either. I don't know if anyone has tried fitting a mathematical curve to it. Still, the CPUs can push out great performance if given great power, but it doesn't scale the same on all workloads.

 

Cinebench is unlikely to be representative of most games. Given OP's use case of gaming, that is what we should be looking at.

 

[Chart: CPU power consumption per game, Core i9-13900K]

https://www.techpowerup.com/review/intel-core-i9-13900k/22.html

 

Of this selection of games, it doesn't go anywhere near 250W. For most of these games, setting a 120W limit wouldn't change performance.

 

Turning it around a bit: even with the 7800X3D's typically lower power consumption in gaming, how much could be saved by making this switch? Might not be so much.

 

Reading the OP again, beyond turning off the e-cores, it looks like there was a manual clock adjustment applied. Is it running at higher or lower power than stock? I don't know. I'd suggest OP look at the actual CPU-only power draw when gaming before looking at the 7800X3D.


Are temps an issue, or do you just want lower temps for the sake of having lower temps? Short answer: no, it isn't worth it.

Case: Phanteks Enthoo Evolv X Motherboard: ASUS TUF B450-PLUS-GAMING CPU: AMD Ryzen 7 5800x3d GPU: ASUS GeForce RTX 3080 ROG Strix EVA Edition 12GB OC Cooling: 3x Noctua NF-A14, 1x Corsair ML140 RGB Elite, 360mm EKWB EK-AIO RGB all-in-one cooler Monitors: ASUS TUF Gaming VG27AQ (portrait 2560x1440) & ASUS ROG Strix XG349C (ultrawide 3440x1440) Peripherals: Logitech G915, G502, G733 OS: Windows 11 64-bit


It's most definitely not worth it for ~5% higher averages and the same 1% lows. Just throw a -100mV offset on your 13900K and it'll bring down power consumption and temps considerably; you likely only need ≥1.2V for non-AVX operation, and you could always add an AVX offset of 1 or 2 if it becomes unstable.

8086k Winner BABY!!

 

Main rig

CPU: R7 5800x3d (-25 all core CO 102 bclk)

Board: Gigabyte B550 AD UC

Cooler: Corsair H150i AIO

Ram: 32gb HP V10 RGB 3200 C14 (3733 C14) tuned subs

GPU: EVGA XC3 RTX 3080 (+120 core +950 mem 90% PL)

Case: Thermaltake H570 TG Snow Edition

PSU: Fractal ION Plus 760w Platinum  

SSD: 1tb Teamgroup MP34  2tb Mushkin Pilot-E

Monitors: 32" Samsung Odyssey G7 (1440p 240hz), Some FHD Acer 24" VA

 

GFs System

CPU: E5 1660v3 (4.3ghz 1.2v)

Mobo: Gigabyte x99 UD3P

Cooler: Corsair H100i AIO

Ram: 32gb Crucial Ballistix 3600 C16 (3000 C14)

GPU: EVGA RTX 2060 Super 

Case: Phanteks P400A Mesh

PSU: Seasonic Focus Plus Gold 650w

SSD: Kingston NV1 2tb

Monitors: 27" Viotek GFT27DB (1440p 144hz), Some 24" BENQ 1080p IPS

 

 

 


Personally, I don't see going from an i9 with 24 cores and 32 threads to a Ryzen 7 with 8 cores and 16 threads as an upgrade. Yes, I know they are Intel and AMD; the only thing I see you gaining is less heat and power draw.


10 hours ago, Agall said:

G-sync is there for when the framerate dips below the maximum …

It's fine if you don't feel the lag; I do. That's why I only play with a locked framerate / fast sync. And no, you most certainly don't need a "$5000 system" for that ¯\_(ツ)_/¯


10 hours ago, tdkid said:

Personally, I don't see going from an i9 with 24 cores and 32 threads to a Ryzen 7 with 8 cores and 16 threads as an upgrade. Yes, I know they are Intel and AMD; the only thing I see you gaining is less heat and power draw.

 

It would not be an upgrade, and in fact a downgrade in many respects. I hate e-cores, and yes, I have them disabled, but if/when Windows 11 is required for newer games, I am stuck with gimped performance with e-cores off on Win11 due to Thread Director behavior, which you cannot shut off. There's no issue on Win10, as it does not use Thread Director. But Win10 only has 2 years and 4.75 months before the plug is pulled.

 

https://fox-laptop.com/pc-components/cpu/core-i7-12700k-review-revealing-that-e-cores-hurt-p-cores-but-without-them-everything-is-only-worse/

 

That article explains why having e-cores off hurts P-core performance in Win11 but not in Win10.

 

Since the 7800X3D and 7700X have no e-cores and no disabled parts, Win11 will not have such issues with them.

 

That is another factor as well.

 

Plus I think I could get a lot of money for my 13900KS, which at SP 115 (P-core 124, e-core 97) is a top-tier bin even for a KS.

 

 


Are you the guy that bought the 13900K and disabled all the e-cores because you wanted better-binned P-cores than a 13700K's?

-13600kf 

- 4000 32gb ram 

-4070ti super duper 


On 6/7/2023 at 6:37 PM, Wolverine2349 said:

 

 

Are the higher minimum framerates the same thing as 1% and 0.1% lows and frame times? …

Raptor Lake with tuned DDR4 B-die is faster and has better 1% lows than a 7800X3D with DDR5 6000.

 

Go watch the Framechaser 7800X3D review on YouTube, where he max-overclocks both platforms.

 

The AMD dip is real.

 

https://youtu.be/ytpVOFvbv8w


11 hours ago, Wolverine2349 said:

 

It would not be an upgrade, and in fact a downgrade in many respects. …

 

 

Sorry, I meant that kind of in general, but also towards Deadpool2onBlu-Ray, who said "The 7800X3D is a dream to use. It's not a huge upgrade vs a 13900K though, so it might be more trouble than it's worth." I don't think going from an Intel i9 with 24 cores and 32 threads to an AMD chip with 8 cores and 16 threads is an upgrade; that would be a massive downgrade, since going to AMD you would lose 16 cores and 16 threads alone. As for Windows 10, you might want to look into that a little more. Microsoft released Windows 7 to the public in Oct. 2009 with the final release in Feb. 2011, and stopped supporting it in Jan. 2020. Windows 8 was released in Oct. 2012 with its last release in Dec. 2016, but they released 8.1 in Oct. 2013 and only stopped supporting that this past January. So I would say you have at least until 2026 before they actually stop supporting 10.

 

I will have to look into this myself, as I have been thinking of upgrading from my Z370 board and 8700K, but I'm not sure if I want to stick with Intel or switch to AMD.


17 hours ago, Wolverine2349 said:

 

It would not be an upgrade, and in fact a downgrade in many respects. …

 

 

As already said: then do it, there's really not much to discuss? Most people wouldn't want to do all this for very little *possible* gain, and in the end it'll probably cost you money too, regardless...

 

 

You know, you said temps/power... but I guarantee you everyone is just wondering one thing: WHY?? That's the only thing worth discussing here, since the differences are likely very small. In the end it's your decision if you want to do this; it's way too much hassle for me, and it simply makes no sense (because unless proven otherwise, any improvements are just imaginary at this point lol).

