
How is the FX 8350 for games?

Guest
Solved by Dabombinable:

The FX is a 32nm beast of a chip that can pull just shy of 280W from the wall if you overclock it properly... and you need a good motherboard for overclocking, and you NEED to overclock to get acceptable framerates in modern games.

Look at this... the cooler doesn't seem too hot, but look at the poor motherboard: the VRMs and the CPU socket are piping hot:

 

http://www.tomshardware.com/reviews/amd-fx-8370e-cpu,3929-4.html

Looking at the chart, I'd have to agree that the sweet spot is about 3.8GHz. Even compared to my Core 2 it's inefficient: mine hits 3.8GHz (from 3.16GHz) with only a small increase in power consumption, while the FX operates at a higher voltage than it needs.
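
To put rough numbers on why wall draw climbs so fast past that sweet spot: dynamic CPU power scales roughly with V^2 x f. A minimal Python sketch of that approximation; the voltages and clocks below are illustrative guesses, not measured FX figures:

    # Rough dynamic-power model: P ~ C * V^2 * f (C cancels out in a ratio).
    def relative_power(v_stock, f_stock, v_oc, f_oc):
        """Overclocked power draw as a multiple of stock power."""
        return (v_oc ** 2 * f_oc) / (v_stock ** 2 * f_stock)

    # Hypothetical numbers: 4.0GHz @ 1.30V stock vs 4.8GHz @ 1.50V overclocked.
    print(f"{relative_power(1.30, 4.0, 1.50, 4.8):.2f}x stock power")  # ~1.60x
    # Voltage enters squared, so the last few hundred MHz cost the most watts.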

 

You did. 

Camera wasn't even zoomed out fully, not many people were there, and it's sitting at 18 fps.

Sits at twice the frames.

 

Nope, linking some Canadian prices, which usually exceed the MSRP, isn't a valid point at all. In the USA it was $200, and so was the 8350. Saying it again: don't hide the 8350's terrible price/performance. Thank you.

 

 

Exactly what I've been saying; an 8350 is a 4300 in terms of gaming performance, with 4 additional cores that might be useful once a year or so, or to be a wannabe 5960X owner. Why pay twice as much when a 4300 does the same thing? Why pay 5 times more for a 5960X when a 4670K does better?

Notice how crappy that AMD dual core performs against an i3? Same story in any 4-threaded game between the 4670K and the 8350.

 

 

I could spray personal experience on the net as well, saying that my 8350 at 5GHz wasn't a playable experience in 500 games. Personal experience just means nothing; I still see people lying about things like "I have 70 fps with my 7870 in BF4/Crysis 3 at max settings", or you making up that WoW plays without issues. When am I getting some valid arguments that justify the 8350's cost over a 4300?

 

 

 

People have been playing WoW for 10 years on far weaker GPU and CPU configs. My experience has been pleasant, but I'm also new to the game, so YMMV in high-level raids. In the questing and dungeons that I've done, I rarely drop below 60. (Also, you're comparing a fucking 270 to a 290X. GPU still matters.)

 

I live in Canada, so I linked the price applicable to me. I paid $231.67 for my FX 8350 and some thermal paste when I bought it over a year ago. That doesn't include the new motherboard I would have had to buy; when I did the math, it was around $400-500 total for me at the time. I don't give a fuck what it costs in the U.S. because I don't buy my components in the U.S.

 

8 cores offer potential future scaling once games start actually using more than four cores, and buying a four-core processor would be short-sighted. Just look at the difference in new games like DA:I http://www.techspot.com/articles-info/921/bench/CPU_2.png. I currently run the game anywhere from 60-90. (I haven't benched it yet.) Losing 20% FPS would be more of a deal breaker than gaining 20% FPS.

 

 

You must not raid in WoW or FF.

Some of your statements are just ridiculous, too many to count. You need to measure your GPU load in the games you play; you're in for a surprise.

The i5-4690k can be had for as little as $180, but more commonly for $200.

Why would you want "good enough for some games" when you can have "excellent for all games" for the same price? A locked i5 is the same price as an FX8, and it performs excellently in all games, gives you an upgrade path unlike AMD, and won't bottleneck high-end GPUs.

Look at those Sandy Bridge and Clarkdale i3s (which are not good) outperforming the FX in both those games! That should tell you all you need to know about the FX.

Bottom line, the FX should not be recommended at all for gaming outside of obscure places in the world where Intel is prohibitively more expensive than AMD.

The quality of the experience doesn't really change. At least for me. Look at my benchmarks on the other page. Damn near all of the games I play run at 1080p60. There's a handful that don't, but upgrading won't really resolve that right now. (Maybe in another year or two it might be worthwhile.) When brand new games like Sniper Elite III or Alien Isolation run at 120FPS perfectly, why do I need more? I get the full benefit of my GPU solution in damn near every game I play. (My benchmarks tend to match any benchmarks I find for dual 270Xs.)

I cap my frames at 60 in most games, because why waste power utilizing extra GPU? I also don't need vsync because it's a 144Hz monitor. (I only play games at unlocked framerate if they run past 120 FPS.) I'm at the perfect point where I can maintain a perfect 60 without any drops into the 40-50 range that I so despise in every AAA title, without really going crazy reducing settings. 2GB of VRAM and games that don't support CrossFire hold me back more than my CPU.
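
(The frame cap saves power because a capped render loop just sleeps off the leftover frame budget instead of drawing as fast as the GPU allows. A minimal sketch of the idea; render_frame is a hypothetical stand-in for a game's per-frame work:)

    import time

    def render_frame():
        pass  # hypothetical stand-in for a game's per-frame work

    def frame_limited_loop(target_fps=60.0, frames=3):
        budget = 1.0 / target_fps
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()
            leftover = budget - (time.perf_counter() - start)
            if leftover > 0:
                time.sleep(leftover)  # idle time is power the GPU never burns

    frame_limited_loop()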

4K // R5 3600 // RTX2080Ti


(Also, you're comparing a fucking 270 to a 290X. GPU still matters.)

Both were bottlenecked, so GPU performance doesn't matter at all.

 

 

I live in Canada, so I linked the price applicable to me. I paid $231.67 for my FX 8350 and some thermal paste when I bought it over a year ago. That doesn't include the new motherboard I would have had to buy; when I did the math, it was around $400-500 total for me at the time. I don't give a fuck what it costs in the U.S. because I don't buy my components in the U.S.

I'm asking why you paid $130 more for 4 extra cores with zero fps gain.

 

 

8 cores offer potential future scaling once games start actually using more than four cores, and buying a four-core processor would be short-sighted. Just look at the difference in new games like DA:I http://www.techspot.com/articles-info/921/bench/CPU_2.png. I currently run the game anywhere from 60-90. (I haven't benched it yet.) Losing 20% FPS would be more of a deal breaker than gaining 20% FPS.

Do you call 50% between the 8350 & 4670K future scaling? Why didn't you buy one of those 16-core AMD Opterons? Maybe it will be useful in the future.


 

Both were bottlenecked, so GPU performance doesn't matter at all.

 

 

I'm asking why you paid $130 more for 4 extra cores with zero fps gain.

 

 

Do you call 50% between the 8350 & 4670K future scaling? Why didn't you buy one of those 16-core AMD Opterons? Maybe it will be useful in the future.

 

Double the framerate and... double the GPU.... hmmmm

 

A bottleneck doesn't mean that completely different video cards will get the same performance.

 

I paid what I felt would be acceptable for ~2-3 years. Games utilizing more cores is an obvious trend, especially when the PS4 and Xbone were both announced as using 8-core APUs. Admittedly, when I bought it I had no intention of getting a 120 FPS monitor, but one day I wanted to try 3D... (only to find out that AMD's 3D support is terrible, but oh well, it's nice for Counter-Strike.)

 

Are you one of those people who buys a G3258? It's short-sighted to buy hardware without considering how performance might change over time. The FX 8-core had nowhere to go but up.

 

I get better performance than all the GPUs in this video for a system that costs a mere fraction as much. Not surprising that two cards beat the highest-end single cards, of course; just making the point that my system is more than capable of matching very powerful systems in the latest games.

https://www.youtube.com/watch?v=lsrBhnBOlQo

 

Oh, and I also don't have a bottleneck.

http://i.imgur.com/Uq3hBXJ.jpg

 

 

Running on Ultra w/ 4x MSAA. (I usually turn the MSAA down so I don't drop into the 40-50s at all, but I wanted to match the settings in the video.)

4K // R5 3600 // RTX2080Ti


A bottleneck doesn't mean that completely different video cards will get the same performance.

If your CPU bottlenecks a 270X, you won't gain a thing from a better GPU unless you get a better CPU. I don't need to prove this; just use your brain a bit.

 

 

Are you one of those people who buys a G3258? It's short-sighted to buy hardware without considering how performance might change over time. The FX 8-core had nowhere to go but up.

The 8150 was released in 2011 and it's considered a shit CPU, while the 8350 only has 5% more IPC. Those CPUs were dead before they even hit the market; you're not going to keep up without improving your single-threaded performance, and throwing in more cores isn't doing anything.
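
The arithmetic behind that is simple. A sketch with normalized, illustrative numbers (not measured IPC values):

    # Crude model: single-thread performance ~ IPC * clock.
    def st_perf(ipc, ghz):
        return ipc * ghz

    fx_8150 = st_perf(1.00, 3.6)  # normalized baseline
    fx_8350 = st_perf(1.05, 4.0)  # "only 5% more IPC", plus a clock bump
    print(f"8350 vs 8150: {fx_8350 / fx_8150:.2f}x")  # ~1.17x
    # A ~17% per-core gain can't close a gap measured in tens of percent,
    # and extra cores don't help a game that only loads 2-3 threads.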

 

 

I get better performance than all the GPUs in this video for a system that costs a mere fraction as much.

GPU usage?


 

If your CPU bottlenecks a 270X, you won't gain a thing from a better GPU unless you get a better CPU. I don't need to prove this; just use your brain a bit.

 

 

 

The 8150 was released in 2011 and it's considered a shit CPU, while the 8350 only has 5% more IPC. Those CPUs were dead before they even hit the market; you're not going to keep up without improving your single-threaded performance, and throwing in more cores isn't doing anything.

 

 

GPU usage?

 

GPU still matters even when bottlenecked. You'll still see scaling between GPUs even with weaker CPUs limiting performance. The only time you'll see identical performance is if both cards are equally bottlenecked, which could be the case for WoW, but usually it's extremely rare for higher-end cards to be capped at the same spot as lower-end ones. (But it's possible with WoW for sure, given the age of the game and how well GPUs run it when not bottlenecked.)

 

http://us.hardware.info/reviews/5298/2/thief-tested-with-amd-mantle-patch-significant-performance-boost-for-entry-level-cpus-results-with-r9-290x-r9-280x-r9-270x-and-r7-260x

 

As you can see, even the bottlenecked 290X still crushes the 270X (which isn't bottlenecked at all).
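
That's exactly what a toy model of the bottleneck argument predicts: delivered FPS is roughly the minimum of the CPU-side and GPU-side ceilings. All numbers here are illustrative:

    def delivered_fps(cpu_ceiling, gpu_ceiling):
        # A frame needs the CPU (game logic, draw calls) and the GPU (rendering);
        # whichever side is slower sets the pace.
        return min(cpu_ceiling, gpu_ceiling)

    cpu_cap = 70.0  # hypothetical CPU-limited ceiling
    for gpu, gpu_cap in [("270X", 55.0), ("290X", 110.0)]:
        bound = "CPU" if gpu_cap > cpu_cap else "GPU"
        print(f"{gpu}: {delivered_fps(cpu_cap, gpu_cap):.0f} fps ({bound}-bound)")
    # 270X: 55 fps (GPU-bound) -- under the CPU ceiling, so no bottleneck.
    # 290X: 70 fps (CPU-bound) -- capped, yet still well ahead of the 270X.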

 

More usage of other cores will raise performance, and single-thread speed will become less important. DA:I is a brand-new example. It's only going to get more prevalent as more games spread out the CPU load and individual core speed matters less.

 

GPU Usage is in the screenshot.  99%/98% for each card respectively.  

4K // R5 3600 // RTX2080Ti


@Mush Brain

Why would you compare CPUs by using different GPUs? Especially when you're dealing with quad CrossFire vs SLI in a game that may very well have negative scaling past two GPUs? Or just different levels of driver optimization in general.

 

I tried to run my own benchmark for it, but RE5 doesn't want to launch because GFWL is in its death throes.

 

 

I think there should only be 4 categorizations for gaming performance; see the sketch after the list.

 

-Below 30 FPS

-30 FPS stable

-60 FPS stable

-120 FPS stable.
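
A minimal sketch of those buckets, classifying on sustained (not peak) framerate:

    def fps_tier(sustained_fps):
        if sustained_fps >= 120:
            return "120 FPS stable"
        if sustained_fps >= 60:
            return "60 FPS stable"
        if sustained_fps >= 30:
            return "30 FPS stable"
        return "Below 30 FPS"

    for fps in (18, 45, 75, 144):
        print(fps, "->", fps_tier(fps))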

 

That's all any gamer should care about. For people who want to play at 30 or 60 FPS, an FX will not hold them back. If you're buying dual high-end GPUs, you should be playing at a resolution that actually requires them, and in those cases an FX will not hold you back. If you want to play games at 120 FPS, an FX could potentially hold you back. (But the type of games you'd want to run at 120 FPS, e.g. Dota 2 and Counter-Strike, will easily do so with an FX.) Arma 3 is the exception (because it heavily relies on hyperthreading, which doesn't exist on AMD and has nothing to do with the performance of the chip), not the norm, and most games are highly playable.

Right now 4th-gen Intel is price-competitive with AMD and outperforms it in CPU-restricted games. 3rd-gen Intel, which was out when Vishera came out, was not so competitive. The most competitive processors were the 3570K and the 3770K, both of which were significantly more expensive. Also, plenty of AM3 boards were updated through BIOS to support AM3+ processors, so for those of us with Phenom IIs it was a no-brainer to go with FX. Especially when performance in games was near identical http://www.overclock.net/t/1333027/amd-fx-8350-vs-i5-3570k-delidded-single-gpu-and-crossfire-gpu and still is in most titles.

 

 

 

Who cares if Intel nets you 300 FPS and AMD nets you 200?  How does that matter?  If your cards are matched to your resolution, you won't even hit those kinds of framerates.  It's just a pointless pissing contest.  A bottleneck is when the GPU isn't utilized when it needs to be, not when you're already well past even the highest refresh rate monitors and are trying to measure your penis size.

 

 

OP already decided.  No point in continuing to have the "AMD vs Intel" debate.  

125 FPS on a 7990 with significant drops, as shown in RE5's graph, vs a pair of GTX 480s getting 200+...

 

That's not a small difference; it is diabolically massive.

 

One 7970 GPU is almost 2x the performance of one stock GTX 480 in most games, so a 7990 should be over 2.5x faster.
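
Back-of-the-envelope version of that, with assumed (not measured) multi-GPU scaling factors:

    gtx_480 = 1.0                     # one stock GTX 480 as the baseline
    hd_7970 = 1.9                     # "almost 2x" a GTX 480
    hd_7990 = 2 * hd_7970 * 0.75      # dual-GPU card, assumed ~75% CrossFire scaling
    gtx_480_sli = 2 * gtx_480 * 0.85  # assumed ~85% SLI scaling

    print(f"7990 ~ {hd_7990:.2f}x a 480; 480 SLI ~ {gtx_480_sli:.2f}x")
    # ~2.85x vs ~1.70x on paper. If the 7990 still lands at 125 FPS while
    # 480 SLI hits 200+, the cards aren't the limit; the CPU is.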

 

And to top it off, my 2500k was stock clocked.

i7 5930k . 16GB Corsair Vengeance LPX 2666 DDR4 . Gigabyte GA-X99-Gaming G1-WIFI . Zotac GeForce GTX 980 AMP! 4GB SLi . Crucial M550 1TB SSD . LG BD . Fractal Design Define R2 Black Pearl . SuperFlower Leadex Gold 750w . BenQ GW2765HT 2560x1440 . CM Storm QF TK MX Blue . SteelSeries Rival 
i5 2500k/ EVGA Z68SLi/ FX 8320/ Phenom II B55 x4/ MSI 790FX-GD70/ G.skill Ripjaws X 1600 8GB kit/ Geil Black Dragon 1600 4GB kit/ Sapphire Ref R9 290/ XFX DD GHOST 7770 

The titanic battle rages on :o


 

How is max multithreaded performance at all relevant for Arma 3, which only takes advantage of 2-3 cores? Get your logic right, seriously; get a basic idea of how threads & processes work.

 

 

8350 at 5.4GHz bottlenecking a 7970: averaging around 25 fps.

4670K at 4GHz with FPS capped to 60: averaging 50 fps.

There you go. If graphics don't affect performance in Arma 3, then it's the CPU that's limiting performance, which means you're now agreeing that the 4670K does better. Stop being a fanboy for a second, could you? You're not helping anyone out with your misinformation.

 

No, you don't seem to understand. I'm talking about the reliability of that Tech Yes City guy, since he was showing benchmarks of programs using all cores with the i5 winning, although there are several CPU benchmarks showing the opposite.

Also, you don't seem to understand:

Arma 3 multiplayer FPS is heavily dependent on the server's CPU. My average fps with my Xeon in Arma 3 is no higher than 30, but it can go up to 60 on a good server where not a lot of stuff is happening. If that video is singleplayer, he is getting around the same FPS as me.

In the mission the i5 played, I often didn't even reach 25fps, so that scene may be GPU dependent. If you don't believe me, I can make a video about it.

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG


For gaming the 8350 is the way to go unless you can get your hands on an i7. The only i5 that beats the 8350 is the 4690K, and that's about it really.


No, for those games better single-core performance is vital. In TF2 my fps went up by 100 when going from the 8350 @ 5.1GHz to the 5820K at stock (3.3GHz).

 

 

:rolleyes:


TL;DR - See underlined.

 

 

My friend's 280X @ 1000MHz core clock, which is 70-80% equivalent to my 290 @ stock, gets 1/3 of my performance in the SAME game we play.

You can say World of Tanks is limited in its threaded nature, but THAT applies to ME too.

 

My 4690 VS HIS 8350 (both using 2133MHz memory, both with 8GB total, both with the game on SSDs)

90-100fps VS 31-40fps averages

 

This is one example I noticed when I went over to his place and saw his average was 31-35fps (at the time, on this map) with VERY VERY frequent rises and dips between 30-50fps. (He should be getting the framerates his card can provide...)

Later on I installed WoT, because he wants me to play it, and I was genuinely surprised to be at 90-100fps pretty consistently, all the time. In-game details are maxed for both systems, both at 1080p.

 

While his is still playable, that's so underwhelming!

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Going to go white girl:

I can't even.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


With an Intel CPU, you won't go wrong: it'll perform well in both multicore-friendly and non-multicore-friendly games.

 

With an AMD CPU, you take the risk of going wrong in games that aren't multicore-friendly.

 

I'd pay the premium to not have to deal with that risk.


Go with the FX 8350. It is a solid performer; granted, its single-core performance is lower than an Intel CPU's, but it will still perform well and WILL NOT bottleneck a graphics card as people claim. True bottlenecks come from older motherboard/CPU configurations.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


GPU still matters even when bottlenecked. You'll still see scaling between GPUs even with weaker CPUs limiting performance.

Completely wrong. 

Bottlenecking a single GPU: http://i.imgur.com/viwcf5j.png

Enabling SLI: http://i.imgur.com/tce7sMV.png

No difference.

Since we were comparing the performance between Intel & AMD in WoW, this is the closest benchmark:

(image: Haswell 4770K/4670K/4570 Crysis 3 benchmark)

Double the gain.

 

 

No, you don't seem to understand. I'm talking about the reliability of that Tech Yes City guy, since he was showing benchmarks of programs using all cores with the i5 winning, although there are several CPU benchmarks showing the opposite.

Also, you don't seem to understand:

Arma 3 multiplayer FPS is heavily dependent on the server's CPU. My average fps with my Xeon in Arma 3 is no higher than 30, but it can go up to 60 on a good server where not a lot of stuff is happening. If that video is singleplayer, he is getting around the same FPS as me.

In the mission the i5 played, I often didn't even reach 25fps, so that scene may be GPU dependent. If you don't believe me, I can make a video about it.

What a load of bullshit, really. The server's CPU has absolutely nothing to do with rendering frames on our side. In fact, a different server has nothing to do with your performance; some servers can be more highly populated, but that doesn't matter at all. The point is that the more people you have around you, the more intensive it gets on the CPU, just like in MMOs. Whoever's not on your screen won't be rendered, simple as that. Your point basically comes down to something like "the GPS server slows your car down", hahaha.

I've shown you a gameplay comparison of two different CPUs showing a difference of 100% on the same map; you can't deny it unless you're an AMD fanboy, which you are acting like now. Tech Yes City showed the same difference, and Hardwarepal showed the same difference.

Don't start on Tech Yes City's benchmarking reliability when you were sucking Teksyndicate's d*ck, the only "source" that ever said AMD > Intel. Do you really expect us to take your word when you have jack evidence to back your nonsense up? First you did benchmarks (not seeing any), then you have a Xeon & an i5; what's next, really?

"That scene may be GPU dependent"... seriously, just stop. You clearly have no idea when you're CPU or GPU limited. Stop posting here; you're getting proven wrong every time.


Go with the FX 8350. It is a solid performer; granted, its single-core performance is lower than an Intel CPU's, but it will still perform well and WILL NOT bottleneck a graphics card as people claim. True bottlenecks come from older motherboard/CPU configurations.

OMFG You actually posted that....

Did you even read any of the 1000 threads on this issue.....that prove you wrong...

 

Just these last 6 pages will prove you wrong if you read them... start at page one... work your way forward..

Tell me then that there is no bottleneck from the 8350 in ANY game... You said "WILL NOT" without specifics... I'm guessing you mean all games?

 

If you're gonna make a post like that, cite a source or factual evidence <-- which I need not do, as there are MANY MANY MANY facts in this forum/thread alone that you can easily click, read & search for in less than 10s.

 

HOLY CRAP!

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


snip!

 

snap!

 

snip!

 

snap!

 

snap!

 

snip!

Guys, all this arguing is useless; this has been proven countless times. The AMD Piledriver architecture is old and inefficient, and those CPUs just plain suck at gaming for the most part, and YES, THEY DO VASTLY LIMIT THE PERFORMANCE OF HIGHER-END GPUs IN JUST ABOUT EVERY GAME OUT THERE... I can't wait for the day AMD FINALLY admits defeat and RETIRES them from the market so that threads like this FINALLY stop rising again and again; it's just plain stupid... now stop it!!

MOD PLEASE LOCK THREAD THERE'S NO MORE KNOWLEDGE TO BE GAINED HERE FOR ANYBODY.

I'm pissed. I spent almost $300 at the beginning of 2014 on a freaking AMD FX-8320 and related parts, and I learned it the hard way. So long as uninformed people keep claiming it's a viable option for gaming, more misinformed people will fall for it (which I was; I must admit I was the worst of all. I had no idea how much Intel had improved over the years, and I assumed AMD was still competitive... after all, how could an 8-core, 125W TDP, freaking 4GHz beast be slow as shit at playing games and have godawful single-threaded performance, huh?!... well, guess what: it is godawful). There is a better option from Intel at every price point, except IMHO for ULTRA low budget, where even an i3 is out of the question; there AMD has the 860K, which is good for its super low price... other than that, AMD should call it a day and leave the shelves; they do more harm than good.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


In B4 Locked & another thread with the same topic...

:)

But you're right... back and forth, and I'm totally guilty, but facts must be stated (even in an endless debate).

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Go with the FX 8350. It is a solid performer; granted, its single-core performance is lower than an Intel CPU's, but it will still perform well and WILL NOT bottleneck a graphics card as people claim. True bottlenecks come from older motherboard/CPU configurations.

O really.

TDLx2vT.png

F0tebUQ.png


O really.

TDLx2vT.png

There you go, to all of you freaking noobs in denial, there you have it... Haswell has nearly DOUBLE the AMD FX's single-threaded performance... that's how good Intel got at mastering this sh!t... TWICE THE PERFORMANCE, literally... now how could you even argue over this? The FX would need to be a $65 CPU that comes with a free motherboard and a couple of games to be a viable option, for god's sake!!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Not everyone uses single-threaded applications, though.

Remember a wise man once said, "You'll most likely hear/see more bad reviews from products than good, because if they get a good product, they won't bother to write a review, and if they got a bad product, they'll complain about the product" ~ SoftenButterCream


Not everyone uses single-threaded applications, though.

Everyone does, actually...

All apps use multiple single threads to do their bidding.

Some more than others. - http://en.wikipedia.org/wiki/Thread_%28computing%29
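
A minimal illustration of the point: every process starts with one thread, and a "multithreaded" app just spawns more of them:

    import threading

    def worker(n):
        print(f"thread {n} doing its share of the work")

    # Every app has at least this one thread.
    print("main thread:", threading.current_thread().name)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # A game whose heavy work sits on one or two threads is bound by per-core
    # speed, no matter how many cores the CPU has.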

 

 

 

That chart backs up my thoughts on purchasing my CPU :)

I read reviews on it and saw that in gaming it's TOP TIER for a quad-core without HT.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


 

Completely wrong. 

Bottlenecking a single GPU: http://i.imgur.com/viwcf5j.png

Enabling SLI: http://i.imgur.com/tce7sMV.png

No difference.

Since we were comparing the performance between Intel & AMD in WoW, this is the closest benchmark:

(image: Haswell 4770K/4670K/4570 Crysis 3 benchmark)

Double the gain.

 

 

What a load of bullshit, really. The server's CPU has absolutely nothing to do with rendering frames on our side. In fact, a different server has nothing to do with your performance; some servers can be more highly populated, but that doesn't matter at all. The point is that the more people you have around you, the more intensive it gets on the CPU, just like in MMOs. Whoever's not on your screen won't be rendered, simple as that. Your point basically comes down to something like "the GPS server slows your car down", hahaha.

I've shown you a gameplay comparison of two different CPUs showing a difference of 100% on the same map; you can't deny it unless you're an AMD fanboy, which you are acting like now. Tech Yes City showed the same difference, and Hardwarepal showed the same difference.

Don't start on Tech Yes City's benchmarking reliability when you were sucking Teksyndicate's d*ck, the only "source" that ever said AMD > Intel. Do you really expect us to take your word when you have jack evidence to back your nonsense up? First you did benchmarks (not seeing any), then you have a Xeon & an i5; what's next, really?

"That scene may be GPU dependent"... seriously, just stop. You clearly have no idea when you're CPU or GPU limited. Stop posting here; you're getting proven wrong every time.

 

Have you never played Arma or DayZ or anything even closely related to Arma? You can have a €5000 PC, and if you play on a bad server that handles people's positions, the map, or vehicles badly, you're going to have bad fps. It's a long-known fact that the Arma games are pretty much the only games where fps is server-side as well.

The i5 run was singleplayer; I played that mission myself, and the other one was not the same mission. I'm not an AMD fanboy, I'm not an Intel fanboy, and I don't hate either of them, but saying bullshit is where it ends for me.

I have a Xeon, as you can see in my sig, and I never said I have an i5. I said that the scene you showed in the i5 benchmark is a mission I have played myself, and it's a scene where I, with an Intel processor, do not get more than 25fps. You're not proving anything, if you call that proving my arguments wrong, when you clearly have not played Arma singleplayer or multiplayer in your lifetime. I'm not talking about Teksyndicate other than in my first post. Tech Yes City's reliability is still questionable, as you even mentioned yourself; even the most Intel-biased website, cpuboss.com, says that the 8350 has better all-round performance when it comes down to benchmarks executed on the CPU, like Cinebench or Passmark, and still your "reliable source" Tech Yes City says that the 8350 performs worse in rendering, which is heavily dependent on the CPU's all-round performance as tested in programs like Cinebench.

 

I'll give you a benchmark of the Arma 3 singleplayer mission tomorrow.

 

Now I want you to buy Arma 3, test out different servers, and look at your fps. There's a reason servers have had "HighFPS" in their names since Arma II and DayZ.

 

Also, you need to fix your benchmark image; it kinda doesn't show up.

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG


ehhhh, @LinusTech

 

What do you say?

Don't come along with him closing the thread. Debates are debates; as long as it's not getting out of hand, that's the way humanity works and one of the only ways to solve problems. The other one would be killing the other guy :-(

My Rig: AMD Ryzen 5800x3D | Scythe Fuma 2 | RX6600XT Red Devil | B550M Steel Legend | Fury Renegade 32GB 3600MTs | 980 Pro Gen4 - RAID0 - Kingston A400 480GB x2 RAID1 - Seagate Barracuda 1TB x2 | Fractal Design Integra M 650W | InWin 103 | Mic. - SM57 | Headphones - Sony MDR-1A | Keyboard - Roccat Vulcan 100 AIMO | Mouse - Steelseries Rival 310 | Monitor - Dell S3422DWG

