
Intel vs AMD

I disagree. Switching to a Celeron or Pentium chip is a downgrade from even an AMD A4, let alone the FX series of chips. Otherwise you're pretty much spot on; Intel generally has better performance in the same class as AMD for everyday computing, whereas AMD generally has better gaming graphics with their APU chips if you can't afford a dedicated graphics card.

Intel might have superior chips in each class, but it can lag behind in different price brackets, especially when the cost of motherboards is factored in.

System: CPU: Ryzen 9 5950 doing whatever PBO lets it | Motherboard: Asus B550 Wifi II | RAM: 80GB 3600 CL18 (2x 32GB + 2x 8GB) | GPUs: Vega 56 & Tesla M40 | Case: Corsair 4000D | Storage: many and varied small (512GB-1TB) SSDs + 5TB WD Green | PSU: 1000W EVGA Gold

 

You can trust me, I'm from the Internet.

 


> My settings at 0:41
>
> Without having SLI enabled, and with the same cards, I'm already outperforming your system with twice the fps. What would happen with two cards enabled? Your system wouldn't even get any advantage from it.

I know what you did: you edited the config file, man. Just looking at both videos side by side will confirm that my card is running much more detailed scenery... you don't have any occlusion going on in your video, nor any anti-aliasing, and the terrain details, map details and shadow details are ALL SET TO LOW in your video.

You edited the config file manually, that's what you did... BUSTED!

 

HERE: [screenshot]

| CPU: Core i7-8700K @ 4.89GHz, 1.21V | Motherboard: Asus ROG STRIX Z370-E GAMING | CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC | RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR |


> I know what you did: you edited the config file, man. Just looking at both videos side by side will confirm that my card is running much more detailed scenery... you don't have any occlusion going on in your video, nor any anti-aliasing, and the terrain details, map details and shadow details are ALL SET TO LOW in your video.
>
> You edited the config file manually, that's what you did... BUSTED!

 

> HERE: [screenshot]

 

Any explanation for why the memory usage is exactly the same? The map graphically changes when you destroy the dam; on yours it didn't, but on mine it did. Nothing was set too low, everything was at Ultra, as proved; even if I had changed it, the video would have shown the settings set too low, which wasn't the case.

Now that you've had the evidence slapped in your face, you're trying to claim it was sabotaged, which is silly. Your CPU is a bottleneck for a 780, as the videos and the reviews pointed out. Good try; in denial/wrong as usual, again.


> Any explanation for why the memory usage is exactly the same? The map graphically changes when you destroy the dam; on yours it didn't, but on mine it did. Nothing was set too low, everything was at Ultra, as proved; even if I had changed it, the video would have shown the settings set too low, which wasn't the case.
>
> Now that you've had the evidence slapped in your face, you're trying to claim it was sabotaged, which is silly. Your CPU is a bottleneck for a 780, as the videos and the reviews pointed out. Good try; in denial/wrong as usual, again.

It's not a CPU bottleneck.

IMO, I have more fps with a weaker CPU/GPU combo...

I think it's something software-side that's causing his lower fps.


> I disagree. Switching to a Celeron or Pentium chip is a downgrade from even an AMD A4, let alone the FX series of chips. Otherwise you're pretty much spot on; Intel generally has better performance in the same class as AMD for everyday computing, whereas AMD generally has better gaming graphics with their APU chips if you can't afford a dedicated graphics card.

That's debatable. From an APU perspective (running on the iGPU), AMD APUs are indeed much better at gaming. Though once you add in a discrete card, the G3240 will more than likely outperform even the A4-6320; AMD's per-core performance is bad enough that the extra 700 MHz on each core isn't going to make up for it. The G3240 is also a true dual-core, unlike the A4, which uses AMD's module architecture: the G3240 can execute two threads simultaneously, whereas the A4-6320 can only execute one thread at a time.


> It's not a CPU bottleneck.
>
> IMO, I have more fps with a weaker CPU/GPU combo...
>
> I think it's something software-side that's causing his lower fps.

 

I have a lot of texture filtering and post-image-enhancement settings set in my BF4 profile in NVIDIA Control Panel... I'm a competitive BF4 player and I want all the best settings. I also play at 120% scaling (supersampling), but not in this video; I set it back to 100% to shoot this, though the BF4 profile was still active in NVIDIA Control Panel. I know what I'm doing; my GPU usage is always 97-99%.

I have no bottleneck from this CPU whatsoever. How could there be... it's a multi-threaded game and I run an 8-core AMD CPU at 4.6GHz... LOL

 

Even a 3.5GHz FX-4300 quad-core would do the same, and I already tested it: 4 cores disabled, turbo boost off, stock clock of 3.5GHz, and I was getting the same result: 97-99% GPU usage.

The FX 8-core is really OVERKILL for multi-threaded games like Battlefield 4; it will run BF5 at the highest possible settings when it comes out, no doubt about that!
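If anyone wants to check that kind of claim themselves instead of taking screenshots on faith, here's a minimal sketch (assuming Python with the third-party psutil package installed) that polls per-core CPU load while the game runs; one core pinned near 100% while the GPU sits below 99% is the classic single-thread-bottleneck signature:

import psutil  # third-party: pip install psutil

# Poll per-core CPU load once a second for 30 seconds while the game runs.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    pegged = [i for i, p in enumerate(per_core) if p >= 95]
    print(per_core, "-> pegged cores:", pegged or "none")

Run it alongside the game and watch whether the load spreads across the cores or piles onto one.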

 

This video of mine was shot a little while ago to show some noob (techYESshitty) that my BF4 was not getting drops anywhere near 39 FPS like he claimed in the Excel graphs he was showing in one of his most famous BS videos. That was the only purpose of this video, not to show high average framerates or anything; otherwise I would have done like Faa and edited the config.ini file to force low settings while displaying the ULTRA preset in-game.

I made an online video on a full 64-player conquest server with lots of action and stuff, and I was getting above 70 FPS at all times. He argued that it was no good and made a video to show me how to ''benchmark'' the thing on an empty server (LOL, right there, that's dumb). So I made this video, which Faa showed to the world thinking I was CPU bottlenecked, because he saw a % (which is the FAN SPEED %, BTW, shown alongside the GPU core temps; if you check carefully you will see that those two move along quite nicely, following the fan profile I've set in Afterburner). So yeah, Faa saw that % and jumped on it and said LOOK, you only get 70% or so GPU load, but it is only the fan speed. I don't show the GPU load in games because at 99% all the time I find it irrelevant.

| CPU: Core i7-8700K @ 4.89GHz, 1.21V | Motherboard: Asus ROG STRIX Z370-E GAMING | CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC | RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR |


> Any explanation for why the memory usage is exactly the same? The map graphically changes when you destroy the dam; on yours it didn't, but on mine it did. Nothing was set too low, everything was at Ultra, as proved; even if I had changed it, the video would have shown the settings set too low, which wasn't the case.
>
> Now that you've had the evidence slapped in your face, you're trying to claim it was sabotaged, which is silly. Your CPU is a bottleneck for a 780, as the videos and the reviews pointed out. Good try; in denial/wrong as usual, again.

Yes, the memory usage is the same because you left the texture quality and texture filtering (anisotropic filtering) settings on ULTRA, and those are the only two settings that determine the memory usage amount; they cost no FPS (they don't require additional processing power), and you know it all too well. Now, I'm not dumb: I've played Battlefield since BF1942 and competitively since BF2, and I won the TWL league tournament twice with the =KOH= Killer of Hell clan in 2006 and 2007, so I can recognise low settings from ultra settings in Battlefield rather easily, trust me. You are missing trees, plants... just look at your image, it's so dull... the ambient occlusion alone... look at that, it's easy to tell I'm running HDAO occlusion, and you... occlusion OFF...

 

Now, that shows just how bad an Intel fanboy you are, and how dark you are as an individual; you should be very ashamed for doing that kind of stuff. I know you'll never admit it because that would discredit you completely on this forum and render your opinions useless... but you know what, it's already done... you can move on now.

 

Now, to conclude this sh*t, here's my Battlefield 4 account; make sure you come play me someday. I will annihilate you so badly with my ''CPU bottlenecked GTX 780'' that you will not even believe it.

I'll make you quit the server within 5 minutes... noob!

http://battlelog.battlefield.com/bf4/soldier/Yansag1982/stats/362847852/pc/

| CPU: Core i7-8700K @ 4.89GHz, 1.21V | Motherboard: Asus ROG STRIX Z370-E GAMING | CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC | RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR |


> FX processors use an old architecture that was originally designed for servers and then turned into desktop chips; this was back in 2009. The AM3/AM3+ platform is a dying breed.

No, just AM3; AM3+ CPUs are continuing.

> lots of bullshit

You're still missing the point: you didn't include GPU load to prove your CPU isn't the bottleneck. There's no way a GTX 780 hits 99% load and 80° at 100% TDP at the settings you claim I used. When I have my GPU at around 40-50% load, the temps don't even fluctuate a degree, which was the case for you; another proof that your CPU is a bottleneck.

 

 

> Now, to conclude this sh*t, here's my Battlefield 4 account; make sure you come play me someday. I will annihilate you so badly with my ''CPU bottlenecked GTX 780'' that you will not even believe it.
>
> I'll make you quit the server within 5 minutes... noob!
>
> http://battlelog.battlefield.com/bf4/soldier/Yansag1982/stats/362847852/pc/

So cute :P


 

> You're still missing the point: you didn't include GPU load to prove your CPU isn't the bottleneck. There's no way a GTX 780 hits 99% load and 80° at 100% TDP at the settings you claim I used. When I have my GPU at around 40-50% load, the temps don't even fluctuate a degree, which was the case for you; another proof that your CPU is a bottleneck.

 

 

I will make a video with GPU load and FPS ONLY tonight, and I will even include PerfOverlay.DrawGraph 1, to shut you up once and for all, okay goofy ;)
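For anyone who wants to reproduce that overlay: in BF4 the commands can be typed into the in-game console or put in a user.cfg in the game's install folder (command names as I recall them from Frostbite titles, so double-check against your version):

PerfOverlay.DrawFps 1
PerfOverlay.DrawGraph 1

DrawFps puts up the frame-rate counter; DrawGraph draws the CPU and GPU frame-time lines, and whichever line sits consistently on top is the limiter, which is exactly what settles a bottleneck argument.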

| CPU: Core i7-8700K @ 4.89GHz, 1.21V | Motherboard: Asus ROG STRIX Z370-E GAMING | CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC | RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR |


I was really bored today... so I figured I could test the most unoptimized scenario, to see whether an FX-6300 could bottleneck an R9 280X.

 

Settings:

- 1680x1050 (120% supersampling)
- Ultra preset
- DirectX (no Mantle)
- Full 64-player server

 

FPS:

- Min: 38
- Max: 75
- Avg: 62

Graphs:

[screenshot: CPU/GPU usage graphs]

 

Conclusion:

- Even with this intense setup, the processor managed to keep the GPU at 99% load while itself averaging only ~70% usage, barely ever passing the 85% mark. This, in theory, suggests that the GPU is the culprit for the lower in-game fps, and that the CPU manages to "keep the GPU busy" all the time. In fact, the GPU could in some situations hold back the potential of this mid-range CPU.

Feel free to comment if you think I'm wrong...
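If you'd rather put numbers on it than eyeball the overlay: most monitoring tools (Afterburner included) can log usage to a file. Here's a minimal sketch in Python for a log exported as CSV; the hwlog.csv path and the gpu_usage/cpu_usage column names are placeholders for whatever your logger actually writes:

import csv

LOG = "hwlog.csv"  # placeholder path -- point at your exported log

gpu, cpu = [], []
with open(LOG, newline="") as f:
    for row in csv.DictReader(f):
        gpu.append(float(row["gpu_usage"]))  # placeholder column name
        cpu.append(float(row["cpu_usage"]))  # placeholder column name

# GPU pegged near 99% most of the time means the CPU is keeping it fed;
# a GPU well below that while the CPU sits high points at a CPU bottleneck.
pegged = sum(1 for g in gpu if g >= 95) / len(gpu)
print(f"avg GPU {sum(gpu)/len(gpu):.0f}%, avg CPU {sum(cpu)/len(cpu):.0f}%")
print(f"GPU at 95%+ for {pegged:.0%} of samples")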


> I was really bored today... so I figured I could test the most unoptimized scenario, to see whether an FX-6300 could bottleneck an R9 280X.
>
> Settings:
> - 1680x1050 (120% supersampling)
> - Ultra preset
> - DirectX (no Mantle)
> - Full 64-player server
>
> FPS:
> - Min: 38
> - Max: 75
> - Avg: 62
>
> Graphs: [screenshot: CPU/GPU usage graphs]
>
> Conclusion:
> - Even with this intense setup, the processor managed to keep the GPU at 99% load while itself averaging only ~70% usage, barely ever passing the 85% mark. This, in theory, suggests that the GPU is the culprit for the lower in-game fps, and that the CPU manages to "keep the GPU busy" all the time. In fact, the GPU could in some situations hold back the potential of this mid-range CPU.
>
> Feel free to comment if you think I'm wrong...

That CPU spike could be from launching Task Manager?

[screenshot: CPU usage graph]

If the CPU bottlenecked, you wouldn't see 99% on your GPU. If you aren't happy with your frames and you know your GPU sits at 99%, you should upgrade your GPU; if you're happy, then don't bother.


Wow, not only is this a never-ending thread given its subject, it also has the added salt n' peppa of the AMD vs Nvidia battle. LMFAO. Today, we read of the infinite battles of the tomorrows yet to be..... Winter's Coming.... lol, I had to.


> Wow, not only is this a never-ending thread given its subject, it also has the added salt n' peppa of the AMD vs Nvidia battle. LMFAO. Today, we read of the infinite battles of the tomorrows yet to be..... Winter's Coming.... lol, I had to.

Winter had better be coming; it's the only way I can keep these 290s cool!

 

:P I had to as well.

System: CPU: Ryzen 9 5950 doing whatever PBO lets it | Motherboard: Asus B550 Wifi II | RAM: 80GB 3600 CL18 (2x 32GB + 2x 8GB) | GPUs: Vega 56 & Tesla M40 | Case: Corsair 4000D | Storage: many and varied small (512GB-1TB) SSDs + 5TB WD Green | PSU: 1000W EVGA Gold

 

You can trust me, I'm from the Internet.

 

