Prove AMD's Superiority To Me

Suika

I played Guild Wars 2 with my FX at times when 200 people were on a world boss, and I did fine. I am calling shenanigans on that Skyrim graph, as mine is heavily modded with 4K textures and I still average over 90 FPS with a 2GB GTX 770, with my minimum FPS never dropping under 60. I played the ArcheAge beta, and I recall people on both Intel and AMD CPUs complaining about the frame rates, but as soon as Nvidia released graphics drivers, I myself had no problem. Granted, we never had 100+ people on the screen at any given time, but it was a beta, and not many people were doing the same things at the same time.

 

I can say that of the games in my Steam library, the performance difference between the 8320 and the i5-3570K is seldom noticeable. The only time I've seen miracle frame rates from the Intel is in certain MMO scenarios. In WoW, the 3570K managed 30+ more FPS than the 8320. That is to be expected from most MMOs.

 

I have no idea what went wrong in that BF4 graph, but it can't be accurate either. On 64-player servers, my stock 8320 still averaged 40-50 FPS with a GTX 770. Any source for these graphs? Do they include which drivers were used, or what patch each game was on? You pluralized "games", meaning multiple games were unplayable on the FX when compared to Intel, but I don't see many games in this graph that are "unplayable", except maybe Flight Simulator. I don't own many of these games, but the ones that I do own differ greatly from the performance listed in these graphs, and my PC is below their specs.

 

-MageTank

Look through all of these sources... the i3 is handing it to the FX8s and FX9s in so many games!

Benchmarks:

http://www.hardcorew...-4340-review/2/

http://www.hardwarep...8-games-tested/

http://www.tomshardw...cpu,3929-7.html

http://www.anandtech...w-vishera-95w/3

http://techreport.co...sor-reviewed/14

https://translate.go...v-test-gpu.html

http://pclab.pl/art57842.html

 

 

"To put it nicely, the FX-8370E is a true middle-of-the-road CPU. Using it only makes sense as long as the graphics card you choose comes from a similar performance segment.

Depending on the game in question, AMD’s new processor has the potential to keep you happy around the AMD Radeon R9 270X/285 or Nvidia GeForce GTX 760 or 660 Ti level.

A higher- or even high-end graphics card doesn’t make sense, as pairing it with AMD's FX-8370E simply limits the card's potential."

 

"This is a huge result – it wasn’t until we used a Haswell core CPU that the R9 280X  was able to deliver consistent frame times and a 60 FPS frame rate in Assassin’s Creed IV. All three AMD CPUs we used – even the FX 8350 – and the Ivy Bridge Core i3 would deliver a sub 60 FPS frame rate, with frame spikes throughout the benchmark run.

In this case, the Core i3 4340 allows the R9 280X GPU to run at maximum potential, just like the Core i5 (and Core i7 would)."

 

"Pop over to the gaming scatter, though, and the picture changes dramatically. There, the FX-8350 is the highest-performance AMD desktop processor to date for gaming, finally toppling the venerable Phenom II X4 980. Yet the FX-8350's gaming performance almost exactly matches that of the Core i3-3225, a $134 Ivy Bridge-based processor. Meanwhile, the Core i5-3470 delivers markedly superior gaming performance for less money than the FX-8350. The FX-8350 isn't exactly bad for video games—its performance was generally acceptable in our tests. But it is relatively weak compared to the competition.

This strange divergence between the two performance pictures isn't just confined to gaming, of course. The FX-8350 is also relatively pokey in image processing applications, in SunSpider, and in the less widely multithreaded portions of our video encoding tests. Many of these scenarios rely on one or several threads, and the FX-8350 suffers compared to recent Intel chips in such cases. Still, the contrast between the FX-8350 and the Sandy/Ivy Bridge chips isn't nearly as acute as it was with the older FX processors. Piledriver's IPC gains and that 4GHz base clock have taken the edge off of our objections.

The other major consideration here is power consumption, and really, the FX-8350 isn't even the same class of product as the Ivy Bridge Core i5 processors on this front. There's a 48W gap between the TDP ratings of the Core i5 parts and the FX-8350, but in our tests, the actual difference at the wall socket between two similarly configured systems under load was over 100W. That gap is large enough to force the potential buyer to think deeply about the class of power supply, case, and CPU cooler he needs for his build. One could definitely get away with less expensive components for a Core i5 system."

 

"The FX-8370E stretches its legs a little in terms of minimum frame rates, particularly in SLI, however it is handily beaten by the i3-4330."

 

"Average frametimes did not do AMD’s processors any justice either. As we already said the game was fluid with i7 and i5’s, and somewhat playable with the i3 processor line. When we switched to FX CPUs not only did we have worse framerate but the gameplay was simply put, laggy."

 

I don't believe your Guild Wars 2 FPS, nor your Skyrim FPS. 40-50 FPS with a stock FX8 in 64-player BF4 is not good.

 

This is the problem with you AMD guys: it's always "this was my result", with no evidence. There is tons of evidence out there showing these games are unplayable. My friend with an FX8 can't play ArcheAge, ARMA 3, or DayZ, games that I love playing and he can't.

 

It all comes down to this: why buy a processor that only lets you play 4 out of 5 games, and still bottlenecks in 3 out of 5, when you can get a processor for the same price that plays 5 out of 5 games with no bottlenecks? Just look at those graphs, from multiple sources. The i3s are giving better frame rates with SLI 980s than FX-8350s! Multiple sources. In the majority of games (50% or greater is what majority means) the FX will be okay, but why have okay when you can have excellent for the same price, with an upgrade path, no bottlenecks, and lower electricity cost?
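On the "lower electricity cost" point: the TechReport quote above measured over 100 W more at the wall for the FX-8350 system under load. A rough back-of-the-envelope sketch follows; the 100 W delta comes from that quote, while the hours per day and the price per kWh are assumptions, so plug in your own numbers:

# Rough yearly running-cost difference for ~100 W extra wall draw while gaming.
extra_watts = 100            # extra wall-socket draw under load (from the quote above)
gaming_hours_per_day = 3     # assumed load hours per day
price_per_kwh = 0.15         # assumed electricity price per kWh (adjust for your region)

extra_kwh_per_year = extra_watts / 1000 * gaming_hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, about {extra_cost_per_year:.2f} per year extra")
# roughly 110 kWh/year, on the order of 15-20 per year with these assumptions

Not a huge amount on its own, but it stacks with the bigger PSU/case/cooler point made in the same quote.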

 

This forum is littered with previous and current FX users who have had awful results, throttling, or bottlenecks with their FX processors, only to make the switch to Intel and say, "Oh my god, I was so wrong. I'm sorry to anyone I recommended the FX to; I had no idea my minimum frame rates would improve so much and gameplay would be so much better."

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites

I've never faced a problem with MMOs, even on my A10-6800K. And trust me, I have played essentially every MMO worth playing (I develop for the private-server community). Easily more than 70% of MMOs were written during the Athlon 64 and Pentium 4 era, and IPC has improved substantially on both sides since then.



I'll redownload both Skyrim and GW2 just to prove this point. I too know the performance difference between the Intels and the FX CPUs; there is a reason I use an Intel CPU in my daily rig. However, you people claiming the FX cannot play something at playable frame rates is absurd. I'll even download ArcheAge again, since that is one of the games you are still claiming is unplayable on the FX. Not only will it play these games at playable frame rates, I'll do it at the stock 3.5GHz clock of the 8320.

 

-MageTank



Skyrim is locked at 60 FPS, and my Core 2 Duo @ 3.12GHz stock (slower than an FX-8350 @ 4GHz) sustains that just fine with the RealVision mods (and the ones recommended with it) at 1080p. An FX CPU won't have any problems with games such as Skyrim. (A 2GHz Core 2 Duo is the minimum recommended on the game's case.)

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites


8350 at 4.4GHz;

G3258;

https://www.youtube.com/watch?v=Kf3vtZrZGPA



I had a Core 2 Quad laptop at 2.5GHz that could handle Skyrim just fine on medium with an ATI Radeon HD 4650. My PC runs the RealVision ENB and those 4K texture mods (my favorite way to play Skyrim visually) with the mods that remove ambient lighting, which makes the caves super scary. I do still notice the minimum frame rates, but every single person who owns an FX is aware of this flaw and has learned to cope with it by now, or has moved on to Intel. My point is, 90% of the time it's extremely playable. This is why I do not trust benches from random people; it's almost as if they do not play the game, or they test it at a time when driver optimizations and patches are outdated.

 

@Faa Scroll up to where I specifically used WoW as an example in which Intel gets far more frames than AMD. You are arguing with yourself at this point.

 

 

 

-MageTank


8350 at 4.4GHz;

-snip-

G3258;

-snip-

The videos don't show what the system config was. I do not believe them.

 

Edit: Overheating system components would explain the FPS, if that is truly an FX-series CPU (I have my share of issues with my northbridge barely coping with the 970's bandwidth, e.g. it overheats when the 970 is at full load).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites


When you play Skyrim, make sure you have lots of mods.  I never said Skyrim was unplayable by the way, just that Intel does it better.

When you play ArcheAge, you will have to go into one of the world events that happen multiple times a day, such as Halcyona, Grimghast, Crimson Rift, Hasla, etc. Those are the places where 100+ people per faction show up. This is a massive part of the mid-to-endgame, not something you can easily test just by reinstalling it real fast.

Go play ARMA3, or DayZ.

 

Or, you can check around forums, youtube videos, everywhere to see evidence that these specific games are not going to play at acceptable levels.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites

The videos don't show what the system config was. I do not believe them.

What are you talking about?  The system config is in both of them.

 

Click the links to the actual youtube pages and look at the descriptions.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites


Why should you have to deal with bad minimum framerates when the equally priced Intel equivalent doesn't have this problem?

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites

What are you talking about?  The system config is in both of them.

 

Click the links to the actual youtube pages and look at the descriptions.

Actually showing the config in the video, before or after the game is started, is what I'm talking about.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

Actually showing the config in the video, before or after the game is started, is what I'm talking about.

It's right there in the description...

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites


Then we agree that Skyrim is playable on both CPU platforms, yay! I agree, Intel does it better; I have said that in almost every post I have made in this thread. My entire point is that AMD CPUs can play the same games at playable frame rates, but Intel is a more enjoyable experience. I never made it high enough in ArcheAge, so it will take me some time to get far enough to test it properly, but I can do GW2 and Skyrim today. I have never played ARMA 3 or DayZ (I saw how glitchy DayZ was, and decided to avoid it entirely).

 

Either way, I'll look into it, since I'll be swapping my parts back to the AMD rig for this test anyway.

 

-MageTank

 

EDIT: @Faceman

 

The CPU is 2 years old, and I got it for $110 brand new, along with a 990FXA motherboard for another $110. I could not beat that price-to-performance at the time, which is the only way I even shop for PC parts. I ignore brands for the most part and go off reviews at the time of purchase. The 8320 looked good, and when I got it, it performed the tasks I required of it. Nowadays the Intel chips do it better, and I would not suggest anyone choose an FX CPU over the current Intel CPUs, for the specific reasons you and I have both given. However, anyone who got their FX cheap enough and is currently enjoying what it does for them does not need to upgrade.

 

That's the point I have been trying to make, which so far has been a futile attempt.


It's right there in the description...

Is it in the video itself (i.e. bringing up CPUID or Speccy), so it can be confirmed whether or not those are the actual specifications? No.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

Is it in the video itself (i.e. bringing up CPUID or Speccy), so it can be confirmed whether or not those are the actual specifications? No.

So I guess we shouldn't take Linus's or any other reviewer's word for it when they show us their test system without a live feed of that system with CPU-Z up. I understand that you want concrete evidence to that effect, but that's just not the way people do it the vast majority of the time. Even in those really detailed tests on AnandTech, they list the components used and maybe a screenshot, but no live shot of anything.

 

All reviewers say the same thing, though: games perform better on Intel, so I'm not sure why you would question it.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."

Link to comment
Share on other sites

Link to post
Share on other sites


I never ever had a game that was not playable on my FX.

People who claim that certain games are unplayable on an FX are just dumb.



If there are games that would be considered "unplayable" (i.e. below 24 FPS), they belong to an epic genre called MMOs, where massive online raids take place. I did see some videos where an FX-8350 was struggling to get 12 frames per second (Guild Wars 2)... but I have to agree with you, 99% of the games out there will run between 30 and 60 FPS for the most part... the well-optimised games will even see above 80 FPS... but the problem is that current high-end cards can output way more than 80 FPS...



Sorry, but I still have my FX 8320 at stock 3.5GHz, and I handle GW2 just fine, even in extremely large WvW situations. The only MMOs my FX does badly in are some of the older ones, like Perfect World. The Angelica engine was awful anyway, so it is probably not the best example. The fact that I can retain 30+ FPS in WvW on ultra settings when three servers are fighting for control of a single location should say a lot. Again, the time that tests are conducted also means a lot. Games and drivers alike get optimizations over time.

 

-MageTank



I'm looking for the video right now... give me a sec.



 

 

This is a GTX 760 with an FX 8320 clocked at 3.7GHz. Like I have been trying to say, GW2 has received some heavy optimizations, making it well beyond playable on the FX. The people before me saying they did not believe my frame rates simply have not gamed on their FXs, or do not own one. It is easy to dismiss things without actually testing and experiencing them yourself.

 

Hopefully people will stop saying the FX is unplayable in certain games, because I have not found one yet where it was "unplayable".

 

-MageTank


 

 


Fair enough, the game might indeed have seen improvements. I can't find the video anymore; it was about 6 months ago...

And by the way, I'm not bashing the AMD FX CPUs. I've used one, I know what they are capable of, and I think that when paired with a GPU from a similar performance segment they do pretty well.



What graphics settings do you run, and what is your culling range? Throwing everything on max and turning off culling, my i7-4960X is reduced to 19 FPS in the worst-case scenarios, and that's paired with a pair of GTX Titan Blacks. Mind you, part of the problem is always the internet itself, but still.

 

Edit: I did find the later post with the video of the FX 8320. I admittedly haven't hopped on since my fall term started, but that's good! ArenaNet finally getting some young blood in there to optimize the engine is great.


Here in the UK, the only issue with Intel CPUs is the price. If you don't need maximum frame rates or efficiency, then AMD is more cost-effective. On cheaper gaming builds I'd rather spend the difference on the GPU.

 

Here are some recent prices from Scan to illustrate the difference:

FX 6300   £79

i3 4160     £87

FX 8320   £111

FX 8350   £127

i5 4460s   £141

i5 4690k   £184

 

If you need the performance, you have to pay for it. Remember, though, that if you are going for a high-specification build, the extra spent on the CPU is only a small percentage of the overall build cost.



 

That does not change the fact that a similarly priced i5 system will probably net you more, which is what the whole discussion is about; it's what it's always been about.

Not how bad the FX is, but how meager the performance is in comparison.

