Is a 5900X enough? If not, is a 5950X enough?

I want to game and record (not stream) on one single PC using x264 medium. Please, I don't want to use NVENC or the AMD equivalent. 4K, x264 medium. I would be playing all the latest big games, so consider decent CPU utilization. Please answer only if you really know this stuff; don't answer based on the hype around these CPUs. My GPU would be either a 3080 or a 6800 XT. I don't want to waste money, but I also want to fulfill the purpose, so I'll spend more only if required.

Thanks 

 


13 minutes ago, GodDoesGood said:

I want to game and record (not stream) on one single PC using x264 medium. Please, I don't want to use NVENC or the AMD equivalent. 4K, x264 medium. I would be playing all the latest big games, so consider decent CPU utilization. Please answer only if you really know this stuff; don't answer based on the hype around these CPUs. My GPU would be either a 3080 or a 6800 XT. I don't want to waste money, but I also want to fulfill the purpose, so I'll spend more only if required.

Thanks 

 

Firstly, there is no reason NOT to use NVENC if you are getting a 3080. It's been comparable to x264 medium in quality since the 2000 series.

Even so, a 5900X is probably more than enough; overkill, even, for any current game.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVio WiFi 5 cards to PCIe WiFi6e/7


Use NVENC, trust me, you're welcome. (Seriously, 4K requires a pretty decent bitrate, and doing that on the CPU is insane.)

 

But if you're absolutely set on using your CPU for encoding, get as many cores as you can; it should scale fairly well.

.


1 minute ago, Alex Atkin UK said:

Firstly, there is no reason NOT to use NVENC if you are getting a 3080. It's been comparable to x264 medium in quality since the 2000 series.

Even so, a 5900X is probably more than enough; overkill, even, for any current game.

NVENC is comparable to x264 fast, not medium. I compared and tested.


1 minute ago, GodDoesGood said:

NVENC is comparable to x264 fast, not medium. I compared and tested.

I think you might need to run some of those tests again, because it absolutely is as good as x264 medium.



2 minutes ago, AlwaysFSX said:

Use NVENC, trust me, you're welcome. (Seriously, 4K requires a pretty decent bitrate, and doing that on the CPU is insane.)

 

But if you're absolutely set on using your CPU for encoding, get as many cores as you can; it should scale fairly well.

x264 medium is better than NVENC. I want to get the highest quality possible for my recordings. I also read that x264 medium needs less bitrate than NVENC because of better compression, so that would help.


Just now, GodDoesGood said:

x264 medium is better than NVENC. I want to get the highest quality possible for my recordings. I also read that x264 medium needs less bitrate than NVENC because of better compression, so that would help.

My eyes aren't bad at all, and I can't see a discernible difference between the two. Beyond that, any recording you do is going to be compressed into oblivion when uploaded to the internet (if your goal is anything along the lines of YouTube), so trying to eke out minor differences in recording quality will be lost regardless. NVENC doesn't need a higher bitrate compared to x264 anymore either. Maybe on the older 700 series cards? But now you don't need to run something absurdly high. The encoding hardware is that good.



18 minutes ago, Alex Atkin UK said:

Firstly, there is no reason NOT to use NVENC if you are getting a 3080.

 

16 minutes ago, AlwaysFSX said:

Use NVENC, trust me, you're welcome.

If OP doesn't wanna use the superior encoder, sure, let him be.

 

15 minutes ago, GodDoesGood said:

NVENC is comparable to x264 fast, not medium. I compared and tested.

Which NVENC? How did you test it?

NVENC Turing (20 series and up, excluding the 1650, which uses Volta NVENC) is better than x264 medium,

while older NVENC is about x264 medium(?)

 

What's the intended resolution and framerate? Bitrate? Game genre?

 

Edit: I own a 3900X and a 2070S.

While playing MHW, I tried encoding my stream with x264 medium and it immediately started dropping frames, so I had to set it to the fast preset.

1440p60, btw.

NVENC handles it just fine.

 

So you may need to consider a 5950X if you want to do it all on one machine with x264, maybe even a Threadripper.
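To put rough numbers on why 4K is so much heavier than the 1440p60 case above (illustrative arithmetic only, not a benchmark):

```python
# Raw pixel throughput the encoder has to chew through per second.
# Illustrative only: x264 cost doesn't scale perfectly linearly with
# pixel count, but it's a reasonable first-order estimate.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels the encoder must process each second."""
    return width * height * fps

uhd = pixels_per_second(3840, 2160, 60)   # 4K60
qhd = pixels_per_second(2560, 1440, 60)   # 1440p60

print(f"4K60 is {uhd / qhd:.2f}x the pixel throughput of 1440p60")  # 2.25x
```

So if x264 medium already drops frames at 1440p60 on a 12-core 3900X, a 4K60 encode is more than twice the work before the game's own CPU load is even counted.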

-sigh- feeling like I'm being too negative lately


4 minutes ago, AlwaysFSX said:

My eyes aren't bad at all, and I can't see a discernible difference between the two. Beyond that, any recording you do is going to be compressed into oblivion when uploaded to the internet (if your goal is anything along the lines of YouTube), so trying to eke out minor differences in recording quality will be lost regardless. NVENC doesn't need a higher bitrate compared to x264 anymore either. Maybe on the older 700 series cards? But now you don't need to run something absurdly high. The encoding hardware is that good.

 

Watch the 12000kbps settings from 3:21 on. I want to upload at as high a quality as RGR29 using one PC. If I bought a 6800 XT for the VRAM, then I would be limited to VCE, and AMD has a lower bitrate limit, so I would be trapped.
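For context on why encoder efficiency matters so much at that setting, here is the bit budget per pixel at 12,000 kbps (my own arithmetic, assuming 60 fps; not from the video):

```python
# Bits available per pixel at a fixed bitrate. The thinner the budget,
# the more detail the encoder must throw away, so differences between
# encoders (x264 medium vs NVENC vs VCE) show up more clearly.

def bits_per_pixel(bitrate_kbps: int, width: int, height: int, fps: int) -> float:
    return bitrate_kbps * 1000 / (width * height * fps)

for name, w, h in [("1080p60", 1920, 1080), ("1440p60", 2560, 1440), ("4K60", 3840, 2160)]:
    print(f"{name}: {bits_per_pixel(12000, w, h, 60):.4f} bits/pixel")
```

At 4K60, 12,000 kbps leaves exactly a quarter of the per-pixel budget that 1080p60 gets, which is why a less efficient encoder is punished hardest there.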


5 minutes ago, GodDoesGood said:

Can you tell me what you mean by MHW?

Monster Hunter World, an AAA game from... 2018?

 

5 minutes ago, GodDoesGood said:

Will a 5950X be enough?

Depends on what you're doing, I suppose.

I never fiddled around with x264 much because of the dropped frames.

 

7 minutes ago, GodDoesGood said:

Watch from 3:21; the difference shows even in a compressed YouTube video. I want to play the big AAA games like Watch Dogs Legion, Assassin's Creed Valhalla, etc.

Yeah, I can see the difference (only if I pause, tbh).

The video I linked uses the same raw footage to compare, which is a fairer comparison.

 

And 12Mbit is kind of a weird bitrate to encode at, since the Twitch max is 6000 or 8000kbps; maybe YouTube?

For local recording, just crank it to 50Mbit and the difference is really minimal.



3 minutes ago, Moonzy said:

Monster Hunter World, an AAA game from... 2018?

 

Depends on what you're doing, I suppose.

I never fiddled around with x264 much because of the dropped frames.

 

Yeah, I can see the difference (only if I pause, tbh).

The video I linked uses the same raw footage to compare, which is a fairer comparison.

 

And 12Mbit is kind of a weird bitrate to encode at, since the Twitch max is 6000 or 8000kbps; maybe YouTube?

For local recording, just crank it to 50Mbit and the difference is really minimal.

Can you tell me: if there are two cases, like x264 medium at 20000 bitrate and x264 medium at 30000 bitrate, would there be a difference in CPU utilization? I don't know the mechanics well.


2 minutes ago, GodDoesGood said:

Can you tell me: if there are two cases, like x264 medium at 20000 bitrate and x264 medium at 30000 bitrate, would there be a difference in CPU utilization? I don't know the mechanics well.

I'm not that familiar with encoders, maybe wait for others' insights.

 

I think higher bitrate = lower CPU usage, but I could be wrong.



2 minutes ago, Moonzy said:

I'm not that familiar with encoders, maybe wait for others' insights.

 

I think higher bitrate = lower CPU usage, but I could be wrong.

See the video you sent me at 16:00: the quality difference between x264 slow and NVENC. Now, since slow is not that much better than medium, you can compare how there are differences at the same bitrate. That guy's use of the Netflix tester is wrong and misleading. I know he is pretty famous, but see for yourself; in so many of the cases I showed you there were decent differences.


5 minutes ago, Aereldor said:

Yes

^ That's basically the short version.

 

 

Afaik a 5800X, or maybe even a 5600X, would probably be equally capable in these tasks. I don't see 16 cores helping you in any of this. Got some money that needs to be burned? As you're playing at 4K, the CPU plays only a very small role anyway.

 

You're basically asking if the best desktop CPU available is enough...

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 minute ago, Aereldor said:

Yes

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs Legion in online benchmarks of the 5800X. So, the 5950X being the same Zen 3 chip, is there something I need to worry about?


1 minute ago, ServantOfGod said:

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs Legion in online benchmarks of the 5800X. So, the 5950X being the same Zen 3 chip, is there something I need to worry about?

No.



2 minutes ago, Noah0302 said:

Is *Insert best possible desktop part here* enough for *insert use case here*?

No, there exists Threadripper, if you don't know.


I mean, you can try getting a better one, but I doubt you'll find any.

Gaming HTPC:

R5 5600X - Cryorig C7 - Asus ROG B350-i - EVGA RTX2060KO - 16gb G.Skill Ripjaws V 3333MHz - Corsair SF450 - 500gb 960 EVO - LianLi TU100B


Desktop PC:
R9 3900X - Peerless Assassin 120 SE - Asus Prime X570 Pro - Powercolor 7900XT - 32gb LPX 3200MHz - Corsair SF750 Platinum - 1TB WD SN850X - CoolerMaster NR200 White - Gigabyte M27Q-SA - Corsair K70 Rapidfire - Logitech MX518 Legendary - HyperXCloud Alpha wireless


Boss-NAS [Build Log]:
R5 2400G - Noctua NH-D14 - Asus Prime X370-Pro - 16gb G.Skill Aegis 3000MHz - Seasonic Focus Platinum 550W - Fractal Design R5 -
250gb 970 Evo (OS) - 2x500gb 860 Evo (Raid0) - 6x4TB WD Red (RaidZ2)

Synology-NAS:
DS920+
2x4TB Ironwolf - 1x18TB Seagate Exos X20

 

Audio Gear:

Hifiman HE-400i - Kennerton Magister - Beyerdynamic DT880 250Ohm - AKG K7XX - Fostex TH-X00 - O2 Amp/DAC Combo - 
Klipsch RP280F - Klipsch RP160M - Klipsch RP440C - Yamaha RX-V479

 

Reviews and Stuff:

GTX 780 DCU2 // 8600GTS // Hifiman HE-400i // Kennerton Magister
Folding all the Proteins! // Boincerino

Useful Links:
Do you need an AMP/DAC? // Recommended Audio Gear // PSU Tier List 


5 minutes ago, Stahlmann said:

Afaik a 5800X, or maybe even a 5600X, would probably be equally capable in these tasks. I don't see 16 cores helping you in any of this. Got some money that needs to be burned? As you're playing at 4K, the CPU plays only a very small role anyway.

A 5900X is probably just enough for 1440p, let alone 4K. I know the thing about higher resolutions and CPU utilization, but the CPU has to process more while recording.


2 minutes ago, ServantOfGod said:

No, there exists Threadripper, if you don't know.

Threadripper is for insanely high-end desktops; for mainstream, the 5950X is the top.

5 minutes ago, ServantOfGod said:

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs Legion in online benchmarks of the 5800X. So, the 5950X being the same Zen 3 chip, is there something I need to worry about?

That is because of the 8 cores, not because of Zen 3. More cores, more power; the 5900X is fine.


6 minutes ago, ServantOfGod said:

No, there exists Threadripper, if you don't know.

That would be an HEDT part, not a desktop one.

My Gaming PC:
Inno3D iChill Black - RTX 4080 - +500 Memory, undervolted Core, 2xCorsair QX120 (push) + 2xInno3D 120mm (pull)
AMD Ryzen 7 7800X3D - NZXT x72
G.SKILL Trident Z @6000MHz CL30 - 2x16GB
Asus Strix X670E-E Gaming

1x500GB Samsung 960 Pro (Windows 11)

1x2TB Kingston KC3000 (Games)

1x1TB WD Blue SN550 (Programs)

1x1TB Samsung 870 EVO (Programs)
Corsair RM-850X

Lian Li O11 Vision
ASUS ROG Swift OLED PG27AQDM (240hz OLED), MSI Optix MAG274QRFDE-QD, BenQ ZOWIE XL2720

Logitech G Pro Wireless Superlight
Wooting 60HE

Audeze LCD2-C + FiiO K3

Klipsch RP600-M + Klipsch R-120 SW

 

My Notebook:

MacBook Pro 16 M1 - 16GB

 

Proxmox-Cluster:

  • Ryzen 9 3950X, Asus Strix X570E F-Gaming, 2x32GB3200MHz ECC, 2x 512GB NVMe ZFS-Mirror (Boot + Testing-VMs), 2x14TB ZFS-Mirror + 1x3TB (TrueNAS-VM), 1x 1TB Samsung 980 Pro NVMe (Ceph-OSD), 10G NIC
  • i7 8700k delidded undervolted, Gigabyte Z390 UD, 4x16GB 3200MHz, 1x 512GB SSD (Boot), 1x 1TB Samsung 980 Pro NVMe (Ceph-OSD), 2,5G NIC
  • i5 4670, 3x4GB + 1x8GB 1600MHz, 1x 512GB SSD (Boot), 1x 1TB Samsung 980 Pro NVMe (Ceph-OSD), 2,5G NIC

Proxmox-Backup-Server:

  • i5 4670, 4x4GB 1600MHz, 2x2TB ZFS-Mirror, 2,5G NIC

Just now, Noah0302 said:

That would be an HEDT part, not a desktop one.

It is still a CPU, that's all that matters; you can use it for a PC if you want. Unnecessary argument.


1 hour ago, GodDoesGood said:

 

Watch the 12000kbps settings from 3:21 on. I want to upload at as high a quality as RGR29 using one PC. If I bought a 6800 XT for the VRAM, then I would be limited to VCE, and AMD has a lower bitrate limit, so I would be trapped.

Enabling Psycho Visual Tuning will do that and fuzz out some details; I leave it off for a reason, and my recordings turn out better. Three other things about that video:

-The rest of the examples seem to show NVENC performing the same as x264.

-Why are they using such a low bitrate for 1440p, in the cases of 3k and 6k? You wouldn't do that in the first place; you'd either downscale or raise the bitrate. Low bitrate + high resolution =/= good quality.

-At 4:10 you can notice hitching in the x264 encode, whereas you don't have that with NVENC, and this is purely CPU load; there was no game running in the background because they were encoding a raw video. Add in a game load and your performance will go down.

 

--Again, if you're ADAMANT about using x264, get whatever fits your budget with the most cores.--

 

But realistically you'd be using NVENC, because the quality difference in recordings is negligible, and I say this after doing many days of testing bitrates, encoders, and rescaling filters.

1 hour ago, Moonzy said:

I think higher bitrate = lower CPU usage, but I could be wrong.

That's not the way it works. Bitrate is directly linked to video quality, so the encoder spends as much time as it's allowed trying to generate the least noisy image. A higher bitrate means it's going to try to reach that higher target; at a lower bitrate, once it hits the size budget for the image, it stops. (Rough explanation.)
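The average-budget idea can be put in numbers for the 20,000 vs 30,000 kbps question asked earlier (a toy model; real rate control, e.g. x264 ABR/CRF with VBV, redistributes bits between easy and hard frames):

```python
# Long-run average bit budget per frame under a simple CBR model.
# Only shows the average; real encoders shift bits between frames.

def avg_bits_per_frame(bitrate_kbps: int, fps: int) -> float:
    return bitrate_kbps * 1000 / fps

for kbps in (20000, 30000):
    print(f"{kbps} kbps @ 60 fps -> ~{avg_bits_per_frame(kbps, 60) / 1000:.0f} kbit/frame")
```

The bitrate target mainly changes how hard each frame gets quantized; CPU cost is dominated by the preset (medium vs fast), so 20,000 vs 30,000 kbps at the same preset should use a broadly similar amount of CPU.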



5 minutes ago, AlwaysFSX said:

Enabling Psycho Visual Tuning will do that and fuzz out some details; I leave it off for a reason, and my recordings turn out better. Three other things about that video:

-The rest of the examples seem to show NVENC performing the same as x264.

-Why are they using such a low bitrate for 1440p, in the cases of 3k and 6k? You wouldn't do that in the first place; you'd either downscale or raise the bitrate. Low bitrate + high resolution =/= good quality.

-At 4:10 you can notice hitching in the x264 encode, whereas you don't have that with NVENC, and this is purely CPU load; there was no game running in the background because they were encoding a raw video. Add in a game load and your performance will go down.

 

--Again, if you're ADAMANT about using x264, get whatever fits your budget with the most cores.--

 

But realistically you'd be using NVENC, because the quality difference in recordings is negligible, and I say this after doing many days of testing bitrates, encoders, and rescaling filters.

That's not the way it works. Bitrate is directly linked to video quality, so the encoder spends as much time as it's allowed trying to generate the least noisy image. A higher bitrate means it's going to try to reach that higher target; at a lower bitrate, once it hits the size budget for the image, it stops. (Rough explanation.)

If I choose the RX 6800 XT, then what?

