Is a 5900X enough? If not, is a 5950X enough?

I want to game and record (not stream) on a single PC using x264 medium. Please note I don't want to use NVENC or the AMD equivalent; 4K, x264 medium. I'd be playing all the latest big games, so assume decent CPU utilization. Please answer only if you really know this stuff; don't answer based on the hype around these CPUs. My GPU would be either a 3080 or a 6800 XT. I don't want to waste money, but I also want to fulfill the purpose, so I'll spend if it's really required.

Thanks 

 

13 minutes ago, GodDoesGood said:

I want to game and record (not stream) on a single PC using x264 medium. Please note I don't want to use NVENC or the AMD equivalent; 4K, x264 medium. I'd be playing all the latest big games, so assume decent CPU utilization. Please answer only if you really know this stuff; don't answer based on the hype around these CPUs. My GPU would be either a 3080 or a 6800 XT. I don't want to waste money, but I also want to fulfill the purpose, so I'll spend if it's really required.

Thanks 

 

Firstly, there is no reason NOT to use NVENC if you are getting a 3080. It's been comparable to x264 medium in quality since the 2000 series.

Even so, a 5900X is probably more than enough, overkill even, for any current game.


Use NVENC, trust me, you're welcome. (Seriously, 4K requires a pretty decent bitrate, and doing that on the CPU is insane.)

 

But if you're absolutely set on using your CPU for encoding, get as many cores as you can; it should scale fairly well.

1 minute ago, Alex Atkin UK said:

Firstly, there is no reason NOT to use NVENC if you are getting a 3080. It's been comparable to x264 medium in quality since the 2000 series.

Even so, a 5900X is probably more than enough, overkill even, for any current game.

NVENC is comparable to x264 fast, not medium; I compared and tested.

1 minute ago, GodDoesGood said:

NVENC is comparable to x264 fast, not medium; I compared and tested.

I think you might need to run some of those tests again, because it absolutely is as good as x264 medium.
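If you want to settle it with numbers instead of eyeballing, here's a minimal sketch of the kind of A/B test I mean, assuming an ffmpeg build that includes libx264, h264_nvenc, and libvmaf (the clip name and the 12M bitrate are placeholders, not recommendations):

```python
# Encode the same lossless capture with both encoders at the same bitrate,
# then score each result against the original with VMAF.
import subprocess

SOURCE = "reference.mkv"  # a short lossless/raw gameplay capture

def encode(codec_args, out_file):
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, out_file],
                   check=True)

def vmaf(distorted):
    # libvmaf takes the distorted stream first and the reference second;
    # the score is printed at the end of ffmpeg's log
    subprocess.run(["ffmpeg", "-i", distorted, "-i", SOURCE,
                    "-lavfi", "libvmaf", "-f", "null", "-"], check=True)

encode(["-c:v", "libx264", "-preset", "medium", "-b:v", "12M"], "x264_medium.mp4")
encode(["-c:v", "h264_nvenc", "-b:v", "12M"], "nvenc.mp4")

vmaf("x264_medium.mp4")
vmaf("nvenc.mp4")
```

Same source, same bitrate, only the encoder changes; if the two scores land within a couple of VMAF points, nobody is telling them apart in motion.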

2 minutes ago, AlwaysFSX said:

Use NVENC, trust me, you're welcome. (Seriously, 4K requires a pretty decent bitrate, and doing that on the CPU is insane.)

 

But if you're absolutely set on using your CPU for encoding, get as many cores as you can; it should scale fairly well.

x264 medium is better than NVENC. I want to get the highest quality possible for recordings. I also read that x264 medium needs less bitrate than NVENC thanks to better compression, so that would help.

Just now, GodDoesGood said:

x264 medium is better than NVENC. I want to get the highest quality possible for recordings. I also read that x264 medium needs less bitrate than NVENC thanks to better compression, so that would help.

My eyes aren't bad at all, and I can't see a discernible difference between the two. Beyond that, any recording you do is going to be compressed into oblivion once it's uploaded to the internet (if your goal is anything along the lines of YouTube), so trying to eke out minor differences in recording quality will be lost regardless. NVENC doesn't need a higher bitrate compared to x264 anymore, either. Maybe on the older 700-series cards? But now you don't need to run something absurdly high. The encoding hardware is that good.

18 minutes ago, Alex Atkin UK said:

Firstly, there is no reason NOT to use NVENC if you are getting a 3080.

 

16 minutes ago, AlwaysFSX said:

Use NVENC, trust me, you're welcome.

If OP doesn't wanna use the superior encoder, sure, let him be.

 

15 minutes ago, GodDoesGood said:

NVENC is comparable to x264 fast, not medium; I compared and tested.

Which NVENC? How did you test it?

Turing NVENC (20-series and up, excluding the 1650, which has Volta's NVENC) is better than x264 medium,

while older NVENC is about on par with x264 medium(?)

 

What's the intended resolution and framerate? Bitrate? Game genre?

 

Edit: I own a 3900X and a 2070S.

While playing MHW, I tried to encode my stream with x264 medium and immediately started dropping frames, so I had to drop to the fast preset.

1440p60, btw.

NVENC handles it just fine.

 

So you may need to consider a 5950X if you wanna do it all on one machine with x264, maybe even Threadripper.
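If you wanna gauge it before buying, here's a rough sketch, assuming an ffmpeg build with libx264 (the clip name and settings are placeholders): encode a pre-recorded clip at your target settings and read the speed= figure ffmpeg reports. Anything under about 1.0x means the encoder can't keep up live, and that's before a game is fighting for the same cores.

```python
# Offline dry run of a live encode: if ffmpeg's reported "speed=" stays
# above 1.0x, the CPU can sustain this preset in real time (you still
# want headroom for the game itself).
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "capture_1440p60.mkv",   # placeholder clip
    "-c:v", "libx264", "-preset", "medium", "-b:v", "12M",
    "-f", "null", "-",   # discard the output; we only care about speed
], check=True)
```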

4 minutes ago, AlwaysFSX said:

My eyes aren't bad at all, and I can't see a discernible difference between the two. Beyond that, any recording you do is going to be compressed into oblivion once it's uploaded to the internet (if your goal is anything along the lines of YouTube), so trying to eke out minor differences in recording quality will be lost regardless. NVENC doesn't need a higher bitrate compared to x264 anymore, either. Maybe on the older 700-series cards? But now you don't need to run something absurdly high. The encoding hardware is that good.

 

Watch the 12,000 kbps settings from 3:21. I want to upload at as high a quality as RGR29 using one PC. If I bought a 6800 XT for the VRAM, then I would be limited to VCE; AMD has a lower bitrate limit, so I would be trapped.

5 minutes ago, GodDoesGood said:

Can you tell me what you mean by MHW?

Monster Hunter World, a AAA game from... 2018?

 

5 minutes ago, GodDoesGood said:

Will a 5950X be enough?

Depends on what you're doing, I suppose.

I never fiddled around with x264 much because of the dropped frames.

 

7 minutes ago, GodDoesGood said:

Watch from 3:21; the difference shows even in the compressed YouTube video. I want to play the big AAA games like Watch Dogs: Legion, Assassin's Creed Valhalla, etc.

Yeah, I can see the difference (only if I pause, tbh).

The video I linked uses the same raw footage for both, which is a fairer comparison.

 

And 12 Mbit is kind of a weird bitrate to encode at, since Twitch's max is 6,000 or 8,000 kbps; maybe YouTube?

For local recording, just crank it to 50 Mbit and the difference is really minimal.
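For reference, a sketch of what that looks like outside OBS, assuming an ffmpeg build with h264_nvenc (file names are placeholders; I leave the NVENC preset at its default since the preset names changed between ffmpeg versions):

```python
# High-bitrate local recording re-encode: 50 Mbit/s NVENC, audio untouched.
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "capture.mkv",            # placeholder input
    "-c:v", "h264_nvenc",
    "-b:v", "50M", "-maxrate", "50M", "-bufsize", "100M",
    "-c:a", "copy",                                 # keep the audio as-is
    "recording_50mbit.mp4",
], check=True)
```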

3 minutes ago, Moonzy said:

Monster Hunter World, a AAA game from... 2018?

Depends on what you're doing, I suppose.

I never fiddled around with x264 much because of the dropped frames.

Yeah, I can see the difference (only if I pause, tbh).

The video I linked uses the same raw footage for both, which is a fairer comparison.

And 12 Mbit is kind of a weird bitrate to encode at, since Twitch's max is 6,000 or 8,000 kbps; maybe YouTube?

For local recording, just crank it to 50 Mbit and the difference is really minimal.

Can you tell me: if there are two cases, x264 medium at 20,000 kbps and x264 medium at 30,000 kbps, would there be a difference in CPU utilization? I don't know the mechanics well.

2 minutes ago, GodDoesGood said:

Can you tell me: if there are two cases, x264 medium at 20,000 kbps and x264 medium at 30,000 kbps, would there be a difference in CPU utilization? I don't know the mechanics well.

I'm not that familiar with encoders; maybe wait for others' insights.

 

I think higher bitrate = lower CPU usage, but I could be wrong.

2 minutes ago, Moonzy said:

I'm not that familiar with encoders; maybe wait for others' insights.

I think higher bitrate = lower CPU usage, but I could be wrong.

See the video you sent me, at 16:00: the quality difference between x264 slow and NVENC. Since slow isn't that much better than medium, you can see how there are differences even at the same bitrate. That guy's use of Netflix's tester is wrong and misleading. I know he's pretty famous, but see for yourself: in the many cases I showed you, there were decent differences.


Yes

5 minutes ago, Aereldor said:

Yes

^ That's basically the short version.

 

 

Afaik a 5800X, or maybe even a 5600X, would probably be equally capable in these tasks. I don't see 16 cores helping you in any of this. Got some money that needs to be burned? As you're playing at 4K, the CPU plays only a very small role anyway.

 

You're basically asking if the best desktop CPU available is enough...

1 minute ago, Aereldor said:

Yes

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs: Legion in online benchmarks of the 5800X. So, with the 5950X being the same Zen 3 chip, is there something I need to worry about?

1 minute ago, ServantOfGod said:

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs: Legion in online benchmarks of the 5800X. So, with the 5950X being the same Zen 3 chip, is there something I need to worry about?

No.


I mean, you can try getting a better one, but I doubt you'll find any.

5 minutes ago, Stahlmann said:

Afaik a 5800X, or maybe even a 5600X, would probably be equally capable in these tasks. I don't see 16 cores helping you in any of this. Got some money that needs to be burned? As you're playing at 4K, the CPU plays only a very small role anyway.

A 5900X is probably just barely enough for 1440p, let alone 4K. I know about the higher-resolution, lower-CPU-utilization thing, but the CPU has to process more while recording.

2 minutes ago, ServantOfGod said:

No, there's Threadripper, if you didn't know.

Threadripper is for insanely high-end desktops; for mainstream, the 5950X is the top.

5 minutes ago, ServantOfGod said:

I sometimes see single-core CPU usage spike to 95 percent for a second in Watch Dogs: Legion in online benchmarks of the 5800X. So, with the 5950X being the same Zen 3 chip, is there something I need to worry about?

That is because of the 8 cores, not because of Zen 3. More cores, more power; the 5900X is fine.

"I know a lot about a little, but a little about a lot"

Note that I am a student so what I say is based on what I read, and may not be the case for everyone.

I am not the best technician, so don't make fun

Link to post
Share on other sites
6 minutes ago, ServantOfGod said:

No, there's Threadripper, if you didn't know.

That would be an HEDT part, not a desktop one.

Just now, Noah0302 said:

That would be an HEDT part, not a desktop one.

It is still a CPU, that's all that matters; you can use it in a PC if you want. Unnecessary argument.

1 hour ago, GodDoesGood said:

 

Watch the 12,000 kbps settings from 3:21. I want to upload at as high a quality as RGR29 using one PC. If I bought a 6800 XT for the VRAM, then I would be limited to VCE; AMD has a lower bitrate limit, so I would be trapped.

Enabling Psycho Visual Tuning will do that and fuzz out some details; I leave it off for a reason, and my recordings turn out better. Three other things about that video:

-The rest of the examples seem to show NVENC performing the same as x264

 

-Why are they using such a low bitrate for 1440p, in the case of 3k and 6k? You wouldn't do that in the first place; you'd downscale to raise the effective bitrate (see the sketch below this list). Low bitrate + high resolution =/= good quality.

 

-At 4:10 you can notice hitching in the x264 encode, whereas you don't get that with NVENC, and this is purely a CPU load: there was no game running in the background, because they were encoding a raw video. Add a game load and your performance will go down.
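On the downscaling point, a minimal sketch, assuming ffmpeg with libx264 (names and numbers are placeholders): 1440p has about 1.8x the pixels of 1080p, so at a fixed bitrate the downscaled encode gets roughly 1.8x the bits per pixel.

```python
# Downscale 1440p to 1080p before encoding so the fixed bitrate is spread
# over fewer pixels; that's what raises the effective quality.
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "capture_1440p.mkv",      # placeholder input
    "-vf", "scale=1920:-2",    # 1080p width, height kept divisible by 2
    "-c:v", "libx264", "-preset", "medium", "-b:v", "8M",
    "downscaled_1080p.mp4",
], check=True)
```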

 

--Again, if you're ADAMANT about using x264, get whatever fits your budget with the most cores.--

 

But realistically you'd be using NVENC, because the quality difference in recordings is negligible, and I say this after many days of testing bitrates, encoders, and rescaling filters.

1 hour ago, Moonzy said:

i think higher bitrate = lower cpu usage, but i could be wrong

That's not the way it works. Bitrate is directly linked to video quality, so the encoder spends as much time as it's allowed trying to generate the least noisy image. More bitrate means it will try to reach that higher target; with a lower bitrate, once it hits the size budget for the frame, it stops. (Rough explanation.)
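If you want to see that for yourself, here's a quick sketch, assuming ffmpeg with libx264 (the clip name is a placeholder): encode the same clip at 20 and 30 Mbit/s on the same preset and compare wall time. The preset dominates the CPU cost; bitrate mostly changes how hard the rate control has to squeeze, so expect the two runs to be close.

```python
# Same clip, same preset, two bitrates: compare how long each encode takes.
import subprocess, time

for bitrate in ("20M", "30M"):
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-y", "-i", "clip.mkv",
                    "-c:v", "libx264", "-preset", "medium",
                    "-b:v", bitrate, "-f", "null", "-"], check=True)
    print(bitrate, f"{time.perf_counter() - start:.1f}s")
```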

5 minutes ago, AlwaysFSX said:

Enabling Psycho Visual Tuning will do that and fuzz out some details; I leave it off for a reason, and my recordings turn out better. Three other things about that video:

-The rest of the examples seem to show NVENC performing the same as x264

-Why are they using such a low bitrate for 1440p, in the case of 3k and 6k? You wouldn't do that in the first place; you'd downscale to raise the effective bitrate. Low bitrate + high resolution =/= good quality.

-At 4:10 you can notice hitching in the x264 encode, whereas you don't get that with NVENC, and this is purely a CPU load: there was no game running in the background, because they were encoding a raw video. Add a game load and your performance will go down.

--Again, if you're ADAMANT about using x264, get whatever fits your budget with the most cores.--

But realistically you'd be using NVENC, because the quality difference in recordings is negligible, and I say this after many days of testing bitrates, encoders, and rescaling filters.

That's not the way it works. Bitrate is directly linked to video quality, so the encoder spends as much time as it's allowed trying to generate the least noisy image. More bitrate means it will try to reach that higher target; with a lower bitrate, once it hits the size budget for the frame, it stops. (Rough explanation.)

If I choose the RX 6800 XT, then what?
