Is a 5900X enough? If not, is a 5950X enough?

5 minutes ago, ServantOfGod said:

You still don't get it. My question is: is a 5950X enough, or do I need a Threadripper?

 

10 minutes ago, AlwaysFSX said:

At 4:10 you can see hitching in the x264 encoder, whereas you don't get that with NVENC, and this is purely a CPU load; there was no game running in the background because they were recording raw video. Add a game load on top and your performance will go down.

It depends on what bitrate you're recording at. Low bitrate: probably okay more often than not, but you're going to drop or have slow frames regardless. High bitrate: glhf.

 

Edit:

4 minutes ago, ServantOfGod said:

If I choose an RX 6800 XT, then what?

I guess at this point I should add:

If anyone is recording for YouTube, you may as well make a high-quality 1080p recording and export it at 1440p or 4K before publishing. With YouTube's compression, recording in 4K isn't worth it to begin with unless it's content that artifacts a lot. Especially if you're using x264, do this for performance reasons.
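The 1080p-record, 1440p-upload trick works because YouTube assigns noticeably higher bitrates (and historically better codecs) to uploads at 1440p and above. As a rough sketch, using YouTube's published recommended upload bitrates (treat the exact figures as approximate):

```python
# Approximate YouTube recommended upload bitrates (Mbps, SDR), from
# YouTube's help pages; the exact numbers are ballpark, not gospel.
RECOMMENDED_MBPS = {
    # resolution: (30 fps, 60 fps)
    "1080p": (8, 12),
    "1440p": (16, 24),
    "2160p": (40, 61),  # YouTube lists ranges (35-45 / 53-68); rough midpoints here
}

def upload_size_gb(resolution: str, fps_60: bool, minutes: float) -> float:
    """Rough file size of an upload encoded at the recommended bitrate."""
    mbps = RECOMMENDED_MBPS[resolution][1 if fps_60 else 0]
    return mbps * 60 * minutes / 8 / 1000  # Mbit/s -> GB

# A 10-minute 1440p60 upload lands around 1.8 GB:
print(round(upload_size_gb("1440p", True, 10), 1))  # 1.8
```

Note the jump from 12 to 24 Mbps going from 1080p60 to 1440p60: uploading the same footage scaled to 1440p gets roughly double the bitrate budget after YouTube re-encodes it.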

15 minutes ago, ServantOfGod said:

Thanks is a crime to commit these days, apparently. But even if it is, I'm not trying to harm anyone; I just want an answer.

 

OK, you want a short answer? Yes.

 

Detailed answer:

It is more than enough; the 5900X is also plenty. The 5800X can struggle a bit, so get a 5900X and save the money.

5 minutes ago, Ankh Tech said:

It is more than enough; the 5900X is also plenty. The 5800X can struggle a bit, so get a 5900X and save the money.

I'm not OP, but are you sure it can run AAA games and also encode at 4K?


Just now, Moonzy said:

I'm not OP, but are you sure it can run AAA games and also encode at 4K?

Yeah, it can: 6 cores for gaming and the rest for encoding.

2 minutes ago, Moonzy said:

I'm not OP, but are you sure it can run AAA games and also encode at 4K?

Clearly he doesn't know; the 5900X would get beaten up pretty badly.

Just now, ServantOfGod said:

Clearly he doesn't know; the 5900X would get beaten up pretty badly.

It will not. As I said: 6 cores max, maybe 8, for gaming, and the rest for encoding. Otherwise, get a secondary system to encode while you game.
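The split being suggested can be sketched numerically. A minimal Python sketch, assuming a 5900X's 24 logical CPUs and an 8-core reservation for the game; the 8/4 split and the SMT-sibling layout are assumptions (the mapping varies by OS), and tools like `taskset` on Linux or Process Lasso on Windows would be what actually applies the affinity:

```python
# Sketch: splitting a 5900X's 24 logical CPUs (12 cores with SMT)
# between a game and an x264 encode. The 8-core game reservation is
# an assumption, not a recommendation from the thread.
LOGICAL_CPUS = 24          # 12 cores x 2 threads on a 5900X
GAME_CORES = 8             # physical cores reserved for the game

# Assumed layout: logical CPUs 0..11 are cores 0..11 and 12..23 are
# their SMT siblings; real systems may enumerate siblings differently.
game_set = set(range(GAME_CORES)) | set(
    range(LOGICAL_CPUS // 2, LOGICAL_CPUS // 2 + GAME_CORES)
)
encode_set = set(range(LOGICAL_CPUS)) - game_set

print(sorted(game_set))    # logical CPUs pinned to the game
print(sorted(encode_set))  # logical CPUs left for x264
```

That leaves 4 physical cores (8 threads) for the encoder, which is in the spirit of the "6 cores max, maybe 8, for gaming" advice above.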

8 minutes ago, Ankh Tech said:

It will not. As I said: 6 cores max, maybe 8, for gaming, and the rest for encoding. Otherwise, get a secondary system to encode while you game.

The hit is bigger than you think.

Then get a 5950X, problem solved. BTW, it is pretty obvious you made two accounts, but that is not our issue.

Just now, Ankh Tech said:

Then get a 5950X, problem solved. BTW, it is pretty obvious you made two accounts, but that is not our issue.

I asked: is a 5950X enough? Please, if you don't know, admit it. Things are not so simple.

*** Threads merged and cleaned ***

 

Moderation of members is private. Please report any issues and we will handle it from there. Speculation only causes disruption and doesn't lead anywhere.


1 hour ago, AlwaysFSX said:

 

It depends on what bitrate you're recording at. Low bitrate: probably okay more often than not, but you're going to drop or have slow frames regardless. High bitrate: glhf.

 

Edit:

I guess at this point I should add:

If anyone is recording for YouTube, you may as well make a high-quality 1080p recording and export it at 1440p or 4K before publishing. With YouTube's compression, recording in 4K isn't worth it to begin with unless it's content that artifacts a lot. Especially if you're using x264, do this for performance reasons.

Is AMD's equivalent of NVENC, VCE on the 6800 XT, as good as NVENC?

2 hours ago, ServantOfGod said:

If I choose an RX 6800 XT, then what?

Then you're stuck with the inferior encoder.

So choose *wisely* :)

 


For what OP wants to do, it's still probably best to just get a 5800X, pair it with an RTX GPU, and use NVENC. At 4K the Nvidia offerings perform better either way. I don't know why you'd lock yourself into a ridiculously expensive CPU just to use H.264. Is there any real reason why you want a 6800 XT?


4 minutes ago, Stahlmann said:

For what OP wants to do, it's still probably best to just get a 5800X, pair it with an RTX GPU, and use NVENC. At 4K the Nvidia offerings perform better either way. I don't know why you'd lock yourself into a ridiculously expensive CPU just to use H.264. Is there any real reason why you want a 6800 XT?

VRAM.

Title says it all.

I play AAA games. I don't want to argue about NVENC or anything; only post if you really know your stuff. GPU: 6800 XT. CPU advice required.

5 minutes ago, handymanshandle said:

Except the title doesn't say it all? What specific games are you playing? Are you dead set on the 6800XT? Seems like you'd benefit from a 12+ core Ryzen CPU if you need any CPU suggestions.

Yes, dead set on the GPU. CPU advice required: one that can game and record at 4K, x264 medium. AAA games like Watch Dogs, Assassin's Creed, Hitman, RDR2.
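For context on how heavy 4K x264 medium is, a quick back-of-the-envelope pixel-throughput comparison (encoder cost doesn't scale perfectly linearly with pixels, so treat this as a rough lower bound on the extra work):

```python
# Rough measure of x264 workload: pixels encoded per second.
# 4K60 pushes 4x the pixels of 1080p60, which is why a preset that is
# comfortable at 1080p can overwhelm the same CPU at 2160p.
def mpixels_per_sec(width: int, height: int, fps: int) -> float:
    return width * height * fps / 1e6

p1080 = mpixels_per_sec(1920, 1080, 60)   # ~124.4 Mpixels/s
p2160 = mpixels_per_sec(3840, 2160, 60)   # ~497.7 Mpixels/s
print(round(p2160 / p1080, 1))  # 4.0
```

So "game and record at 4K x264 medium" is asking for roughly four 1080p60 encodes' worth of CPU work on top of the game, which is why the thread keeps circling back to core counts.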

15 hours ago, Noah0302 said:

That would be an HEDT part, not a desktop one

Hmmm, I wonder what the DT stands for in HEDT 🤔😉

On 12/2/2020 at 2:03 PM, GodDoesGood said:

VRAM.

What gives you the impression you need so much VRAM?  As many tech YouTubers have said and has happened time and time again in the past, by the time you need that VRAM the rest of the GPU will be too weak to use it.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160Mhz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80Mhz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7

3 hours ago, Alex Atkin UK said:

What gives you the impression you need so much VRAM?  As many tech YouTubers have said and has happened time and time again in the past, by the time you need that VRAM the rest of the GPU will be too weak to use it.

I see some games already using too close to the limit.

15 hours ago, GodDoesGood said:

I see some games already using too close to the limit.

That doesn't mean VRAM usage will suddenly double.  Most games reserve more VRAM than they actually use and having more direct access to VRAM (so things can be moved in and out quicker) potentially means LESS VRAM is required for the same task in the future.

Also, the 3080 has dramatically more memory bandwidth, which could work out more useful in the long run.

At the end of the day, building to be future-proof is folly; we do not know how game engines are going to scale going forward. However, if Xbox/PlayStation is anything to go by, games will focus on using RAM more efficiently, moving things in and out as required from a fast SSD rather than ramming everything into RAM just in case you need it, as games do now.

Remember, both consoles share 16GB between the CPU and GPU. This suggests to me that on PC, 10GB of VRAM is unlikely to be an issue; you'll need a faster GPU long before then.

Think about what hardware you need to do what you want NOW, not in 3-4 years time.  If using NVENC means you can get away with a cheaper CPU and more reliable recording performance, surely it makes sense to buy based on that?


6 hours ago, Alex Atkin UK said:

That doesn't mean VRAM usage will suddenly double.  Most games reserve more VRAM than they actually use and having more direct access to VRAM (so things can be moved in and out quicker) potentially means LESS VRAM is required for the same task in the future.

Also, the 3080 has dramatically more memory bandwidth, which could work out more useful in the long run.

At the end of the day, building to be future-proof is folly; we do not know how game engines are going to scale going forward. However, if Xbox/PlayStation is anything to go by, games will focus on using RAM more efficiently, moving things in and out as required from a fast SSD rather than ramming everything into RAM just in case you need it, as games do now.

Remember, both consoles share 16GB between the CPU and GPU. This suggests to me that on PC, 10GB of VRAM is unlikely to be an issue; you'll need a faster GPU long before then.

Think about what hardware you need to do what you want NOW, not in 3-4 years time.  If using NVENC means you can get away with a cheaper CPU and more reliable recording performance, surely it makes sense to buy based on that?

Godfall uses 13GB; that is just one example.

3 hours ago, GodDoesGood said:

Godfall uses 13GB; that is just one example.

Sounds rather suspect to me, like AMD deliberately made it use more VRAM to make their card more appealing. https://www.reddit.com/r/nvidia/comments/jspy61/godfall_requires_only_6gb8gb_vram_at_4k_maxed_out/

 


1 hour ago, Alex Atkin UK said:

Sounds rather suspect to me, like AMD deliberately made it use more VRAM to make their card more appealing. https://www.reddit.com/r/nvidia/comments/jspy61/godfall_requires_only_6gb8gb_vram_at_4k_maxed_out/

This guy turned on AMD FidelityFX LPM with his 3090 and it was allocating 11-12GB of VRAM: https://youtu.be/lY9OSdebKcc
