FX8350 vs i5 3570K (GTX 670) --> Teksyndicate

To clarify, the 3570K is about $10-15 more expensive in Australia than the FX-8350.

Firstly, I'm not an AMD fan or an Intel fan, I'm a fan of whichever chip is better at the price-point that it has been given. However, there were a few things I found to be a little concerning in this video.

The numbers seem much lower than they should be -- for example with Crysis Warhead, my 3570K with my 7970 gets almost 60 fps at stock speeds. Already the "41fps" they state at stock seems a little fishy. With Arma 2, somehow the 3570K got 25fps at stock? Benchmarks elsewhere show the 3570k getting over 100fps at Ultra@1080p, but with no AA and no AF. Is it possible the 8350 might do better with AA/AF than the 3570k?

"Far Cry 3 is another game that seems to favor the AMD"

This is not what we should be looking at for benchmarks. When a 3960X gets only 36 frames per second in Far Cry 3, I'm afraid I find it hard to believe that the 8350 is getting 57. In these instances, I think the game itself seems to be specifically optimized for the AMD CPU.

I think the result difference you're likely to see in games, generally, is similar to that of the Metro 2033 results. That is to say, the 3570k will win in every category for most games, but not by much (a couple of frames).

Overall verdict: In the instances when the 8350 actually won, it won by a LOT. To me, I see these chips as more or less equal -- the 8350 isn't a true eight-core; more like a slightly-better-than-with-hyperthreading quad core (this is why the 3570k still wins in some rendering/editing scenarios). They are both excellent chips, but it's generally not disputed that a 3570k beats out the 8350 in most scenarios.

Like Toby says, I want them to be correct, but something about the way Logan factually states everything makes me less likely to believe him. I'm opting for the idea that -- more or less -- the AMD processor is optimized better in the benchmarks that it won, because of how much it won by. Double the frames per second? Please; the two processors are not that different. If the 8350 is getting double the fps in a game compared to a 3570K, either it's slightly (at least) more optimized for that CPU or you have done something wrong. Real world difference between the two CPUs is win-some-lose-some, but not by a lot.

Let's look at Arma 2, overclock / stock clock:

AMD: 51.92 / 37.8

Intel: 25.56 / 13.12

I refuse to believe that the above benchmark is an indication of CPU performance. It isn't. It's an indication of how the game is optimized. The processors are more or less comparable; a benchmark like that seems illegitimate.
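For what it's worth, the size of that gap is easy to quantify from the figures above; a quick sanity check (numbers as quoted in this thread):

```python
# Sanity-check the quoted Arma 2 results: how big is the claimed AMD lead?
amd = {"oc": 51.92, "stock": 37.8}
intel = {"oc": 25.56, "stock": 13.12}

oc_ratio = amd["oc"] / intel["oc"]
stock_ratio = amd["stock"] / intel["stock"]

# Roughly 2x overclocked and nearly 3x at stock -- far outside the
# single-digit-percent spread most reviews showed between these chips.
print(f"overclocked: {oc_ratio:.2f}x, stock: {stock_ratio:.2f}x")
```

If the claim were merely "a few frames either way," both ratios would sit near 1.0; a 2-3x spread is exactly what makes the result look like an optimization or configuration artifact rather than raw CPU performance.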

I could be wrong, but that just seems wrong. As an advocate for neither AMD nor Intel, I can happily say that the 8350 wins in some games. I just can't believe it isn't due to CPU optimization in Arma 2 specifically.

Exactly. Neither chip is THAT much better than the other that it would result in double the frame rate.

I wonder how long this whole "Liars!" thing will go on. I guess we should just run our own rigs and compare with others... Someone will make the thread, and we'll be required to take screenshots in certain spots with Fraps open to even enter the results basket. Let's see what we really need.


@toby logan? smug? lol you must be new here...

As for all the butthurt: this was a comparison of how the CPUs would react in certain games, so CPU optimization doesn't matter. If you play Far Cry and want the highest possible fps and stability, you go with the results, period.

Logan and Wendell stated multiple times that they couldn't care less what the results are. They stated quite clearly that they all run Intel. So if anything underhanded were afoot, wouldn't they try to fib the results? We all need to relax; the Tek guys are here to help the community, and shitting on them just makes you look like a smug fanboy. Oh, and btw, 1440p isn't some exotic resolution; I use two such monitors with my 3930K and SLI 660 Tis.

Well shit, the forum's less than a month old. We're all pretty new here. I think I saw one other video by him months and months ago and had much the same impression... What's your implication? :P

Edit: Also, not directed at you, but the constant proclamations of "we don't care, we love both our children equally!" actually make things less convincing. Bullshit they don't have a preference. Everyone has a preference. I have a preference... I firmly prefer AMD, but I'll be owning an Intel this time 'round. That doesn't mean I like Intel as much as AMD. Not to say that I think they're fiddling the results; it's just one of those things. :)


I believe their benchmarks are legit, I honestly do. They're just trolling the trolls with the smugness. They have a pretty enjoyable Tek podcast, imo.


Quote: "@toby logan? smug? lol you must be new here... [...] So if anything underhanded was afoot, wouldn't they try to fib the results? [...] 1440p isnt some exotic resolution, I use two with my 3930k and sli 660ti's"

While 1440p isn't exotic by any means, most users can't or won't pay for a decent 1440p monitor. They can easily be as expensive as an Extreme Edition i7.

The problem is that I could make my i7-960 with SLI GTX 550 Tis look better at 1080p than a 7870 GPU paired with the 8350 CPU, by going out of my way to pick PhysX-heavy games and maxing out my second card to handle an insane amount of PhysX that most AMD cards wouldn't handle well.

The problem is that it's CHERRY PICKING, and it's not even a fair 'average' of recently released games. It's well known that some games favor AMD or Intel CPUs, or AMD or Nvidia GPUs. I don't think of him as smug so much as a git.

Yes, it's good to know which games the AMD performs better on. Though honestly, 1440p isn't mainstream gaming; hell, it's not even mainstream for HD cable or satellite yet. On average, the only users who have it are into high-end editing, where they need the best monitors with the best color reproduction.

A real-world gaming performance test would be low to moderate AA at 720p and 1080p, because those monitors are the best bang for the buck, and quite a lot of people don't find it worth cranking AA up, even if their GPU and CPU are powerful enough not to notice a thing.

The other problem is when moronic followers ignore the massive number of flaws, overlook the cherry picking, and just accept AMD as the better gaming CPU, when all in all the i5 and the AMD work equally well for gaming (with a competent GPU, and settings that aren't trying to strangle the life out of it every second), with MINOR differences. All he did was exaggerate the differences, which no normal user would see.

To be honest, I couldn't tell much of a difference at 1080p between 16x and 2x AA. Games with bad textures just had them highlighted EVEN more, and games with great textures sometimes had the effect ruined by being too crisp.

The thing that gets people 'butt hurt' is that people actually take him seriously. It took people yelling at him for his benchmarks to use the same SSD across all the platforms. Then he refused to test the 2011-socket CPU, which tells us it must have beaten the AMD in all respects (as it should have, being a higher-tier CPU).

The test results are VERY questionable as well. Those CPUs aren't so different that one would score 100% better in one game, then lose to the same CPU in another, newer game.


Quote: "I wonder how long this whole 'Liars!' thing will go on. I guess we should just run our own rigs and compare with others..."
I don't think many people are calling them liars, just that they fucked up somewhere. A little saying I picked up from another area of interest I have is "extraordinary claims require extraordinary evidence". They've made a claim that qualifies as relatively extraordinary, so of course people will be suspicious until other sources corroborate it.

Interesting article when it comes to CrossFire:

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407.html

If I'm going to trust anyone with benchmarks, it's going to be TTL, not Teksyndicate. I watch them more for interesting news and a few laughs than anything.

No, you're right. I'm actually quite surprised that a $330 Intel CPU beat a $200 AMD CPU. Wow, Teksyndicate was wrong the whole time; he's probably being paid off by AMD.

This video was hilarious. The tests were legit and they even said they were surprised at the results. It's not like they're endorsed by AMD nor are they AMD fanboys.

Most of the passive insulting was directed at some redneck who posted an Intel fanboy video. They did some testing of their own games at 640x480, found that Intel wins at some and AMD wins at some, and then went on to talk about how Intel is better because it costs more, and AMD is a stupid choice because Intel costs more and is therefore better.


Quote: "This video was hilarious. The tests were legit and they even said they were surprised at the results. [...] some redneck that posted an Intel fanboy video that did some testing of their own games at 640x480..."

Grow up. No one plays at 640x480 unless they have to, or the game supports nothing higher. The whole set of tests was just outright flawed.

Anyone who's got money to blow on a 1440p monitor isn't going to be using those CPUs unless they only want to game, and then their monitor is a waste (it would have been cheaper to get two or three 1080p monitors and do a multi-monitor setup).

I can't name one person who legitimately plays with max AA and every other option maxed out.

In real-world examples, those two CPUs are neck and neck, because people who play games to play a game, not to benchmark or show off their machine, don't crank all the settings to insane levels just because it's possible. They turn them up to look good, then stop at about 40-60 FPS. The real FPS hounds will use even lower settings so they can get ultra-high FPS for 120+ Hz monitors.

Truth: both are good choices for 'real world gaming', and a 7870 or a GTX 670 is a damn solid choice as well; overall they give the same performance once you start taking into account more than just 10-15 PC video games. (Let's not forget a lot of PC games are console ports, so there isn't much performance to be gained from a good number of them, because they were designed with weaker hardware in mind.)

The pitiful part is that AMD is losing because they can't compete in the high-end market, while the budget and mid-range CPUs have very similar performance unless you're doing something rather heavy like editing. The other sad part is that AMD is losing because it's focusing mostly on gaming GPUs, and the workstation-grade GPUs that are poor for gaming (different brands under Nvidia) but meant for other tasks have little to no competition from AMD.

I actually like AMD and Intel, but people like that do a lot of damage to AMD, because they are setting people up for an unrealistic situation where it's only ahead in XYZ, and none of the other key features of Intel CPUs were compared or considered.


Quote: "This video was hilarious. The tests were legit and they even said they were surprised at the results. [...] some redneck that posted an Intel fanboy video that did some testing of their own games at 640x480..."

Regarding 1440p monitors, a lot of people jumped on those Korean-branded 1440p monitors that were cheap-ish (cheaper than most "high end" 1080p monitors). YOU might not have bought one, but that doesn't mean it wasn't a hot item at one point or another. Linus even talked about it on one of his live streams when someone on Twitter asked about them.

You can't name one person who plays with all settings maxed? I do, on single-player games: I leave everything on max if my system can handle it, otherwise I tone it down from there. For competitive play, everything is turned down to the lowest settings, obviously. That doesn't change the fact that I play some games at max, when I can. Granted, I don't have a 1080p monitor; I only run at 1680x1050. Realize that most "gamers" just casually play games and don't give a crap about getting better FPS over pretty graphics.

I also don't understand what you're trying to prove in your post. I have owned AMD and Intel in the past and present, and I recommend Intel or AMD to people depending on their budget and needs. So if you're trying to tell me that they both have their strengths, I'm sure I already know.

You're probably not familiar with the video that Logan was poking fun at. I'm not going to link it, because that just generates more traffic and thus more revenue for the idiots who made it, but read my second paragraph and that sums up the video. No, I don't think that testing at 640x480 is a good testing methodology, nor did I imply it was. If you really want the link, PM me and I'll give it to you, but be warned: it's a lot of bullshit, and your head might hurt after watching for a few minutes. I actually enjoyed Logan's video; it was freaking hilarious and informative, but more hilarious because of the jokes they made towards the neckbeards. Whether or not _ALL_ of his results were sound, all anyone can take away from this video is that AMD CPUs are not the complete junk everyone has been calling them since the day-1 tests.


Just had a run around Skyrim (Whiterun, same track that Linus does) on a 3570K with a GTX 670 and a fully modded-out Skyrim. I get an average of around 100 fps at 1080p, 25 frames more than Logan, with no overclocked CPU. My GPU is the Gigabyte OC'd version. Even allowing the OC'd GPU maybe 5 to 10 frames, something is still wrong with his figures.


Just watched it. I have a 3570K @ 4.7, and it is amazing. Though I do have an AM3+ board with a 955 @ 3.9 on it too, which is also amazing; it does everything I ask. I think I'll go support AMD and upgrade my AMD rig; I don't want to see them go underwater much more than they already are. 5.2 GHz, here I come :D

lol, watch Jack Hammer come out soon... or Rocket Ship. Tug Boat? I'm loving those code names from AMD.

So I'm not the only one; their code names, like Bulldozer and Trinity, blow away Intel's. Damn.

Just want to add this to the thread, here is a post from tomshardware about the review: "Of course, if AMD had excitedly recognized good progress and tried to charge the same $245 it thought FX-8150 was worth a year ago, I’d be setting FX-8350 aside as quickly as I did with last year’s model. Instead, the company is asking for less than $200. That puts the FX-8350 on par with Intel’s Core i5-3470—a multiplier-locked part that it outperforms in a great many demanding desktop apps. In those same applications, the FX is usually able to beat the $230 Core i5-3570K, too. It’s only when you look back at the single-threaded stuff that AMD continues to get creamed."
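Another way to read that excerpt is in performance per dollar. A toy sketch: the prices follow the quote above (FX-8350 just under $200, i5-3570K at $230), but the scores are purely hypothetical placeholders:

```python
# Prices follow the quoted Tom's Hardware excerpt; the scores are made-up
# placeholders, purely to illustrate the perf-per-dollar comparison.
parts = {
    "FX-8350":  {"price": 199, "score": 100},
    "i5-3570K": {"price": 230, "score": 103},
}

for name, p in parts.items():
    value = p["score"] / p["price"]
    print(f"{name}: {value:.3f} points per dollar")
```

Even if the i5 scores a few percent higher in absolute terms, the cheaper chip can still come out ahead on this metric, which is the crux of the pricing argument in the quote.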


Quote (the opening post in full): "The numbers seem much lower than they should be -- for example with Crysis Warhead, my 3570K with my 7970 gets almost 60 fps at stock speeds. [...] I just can't believe it isn't due to CPU optimization in Arma 2 specifically."

Just want to confirm everyone is on the same page: he has all of his settings cranked up to MAX (including AA).

Quote: "Just had a run around Skyrim (Whiterun, same track that Linus does) on a 3570K, GTX670, with a fully modded out Skyrim. [...] still something is wrong with his figures."

Logan posted his graphical mods here: http://teksyndicate.com/forum/general-discussion/more-shots-fired-tek-syndicate-crew/130978?page=1

I would be interested to see your fraps log with those graphical mods specifically. I know the Serious HD 2048px mod makes my 6950 wheeze at 1080p.


The 8350 is a cheaper option overall and does really well in performance. I am building a frugal rig with my friend and this is what we chose.

^ Nah, a multi-GPU setup won't get bottlenecked by the FX-8350.

Sauce: http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club/6650#post_18997642

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Quote (the opening post in full): "The numbers seem much lower than they should be -- for example with Crysis Warhead, my 3570K with my 7970 gets almost 60 fps at stock speeds. [...] I just can't believe it isn't due to CPU optimization in Arma 2 specifically."

A simple launch parameter on Arma 2, "-exThreads" and "-cpuCount", allows the game to take advantage of more cores (2 cores being the stock built-in default). 8 cores vs 4 cores could be the reason for such a massive boost, especially with the über-intensive soldier AI.
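For reference, those flags go in the game's launch options (Steam's "Set Launch Options" or a shortcut target). A sketch; the install path and the exact values shown are illustrative, not taken from the video:

```
"C:\Games\ArmA 2 Operation Arrowhead\arma2oa.exe" -cpuCount=8 -exThreads=7 -noSplash
```

-cpuCount tells the engine how many cores to use, and -exThreads reportedly controls the extra worker threads; without them the engine may default to a conservative thread count, which would hide a CPU's advantage in core count.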

export PS1='\[\033[1;30m\]┌╼ \[\033[1;32m\]\u@\h\[\033[1;30m\] ╾╼ \[\033[0;34m\]\w\[\033[0;36m\]\n\[\033[1;30m\]└╼ \[\033[1;37m\]'


"All your threads are belong to /dev/null"


| 80's Terminal Keyboard Conversion | $5 Graphics Card Silence Mod Tutorial | 485KH/s R9 270X | The Smallest Ethernet Cable | Ass Pennies | My Screenfetch |


I would like to see Linus benchmarking the FX 8350 again... The results from Logan's video were really surprising.


My god those guys are dislikeable. I want them to be right and their smugness still repulses me.

My first question would be why AMD's been silent this whole time. If their CPU is more capable than Intel's staple gaming CPU that's recommended in 90% of builds, why have they not made some noise? If they've been unjustifiably losing money hand over fist to the 3570k, why didn't they say anything?

You do realize that when AMD totally wiped the floor with Intel's Pentium architecture during the early 2000s, 80% of the desktop market share was still held by Intel.

Intel's strongest point is their marketing.


Quote: "I would like to see Linus benchmarking the FX 8350 again... The results from Logan's video were really surprising."

I doubt he'd do that; he seems like he has already made up his mind about it.

Even on the live-stream he dismissed it way too quickly.


Quote: "My god those guys are dislikeable. I want them to be right and their smugness still repulses me. [...] If they've been unjustifiably losing money hand over fist to the 3570k, why didn't they say anything?"

Before my time I'm afraid. I'm not really sure what that has to do with my post though, unless you're saying Intel actively censors AMD.

Quote: "I doubt he'd do that, he seems like he has already made up his mind about it. [...] Even on the live-stream he dismissed it way too quickly."

He was being a bit close-minded :/



Quote: "This video was hilarious. The tests were legit and they even said they were surprised at the results. [...] some redneck that posted an Intel fanboy video that did some testing of their own games at 640x480..."

I've seen both of his videos on this, and the selection of games is seriously suspect.

In his first video he completely fucked up the benches, introducing more variables than is reasonable for such a comparison test.

In his second video, he dropped the 2011 socket and changed up his games.

That's not to mention that some of the games were running with mods. (While a lot of gamers do that, it's still not something that should be included, because a mod can add yet another variable that may run better on one platform or another.)

Also, love how he skips over which OS was used, because clearly every OS plays the same game equally and there has NEVER been a difference. *coughs hard* We assume it was Windows 7 because of what he said about the patches for the AMD CPU, but it was never clear whether said hotfixes, which introduce core-level changes to how the OS schedules threads, were removed when the Intel CPU ran its tests.

Logan's 'testing' is flawed from the ground up in that respect: leaving out critical data and introducing more variables, while not considering the OS variables. This is why synthetic benches are more reliable (when they aren't pulling weird shit), followed by a few real-world examples to show the comparison. That, or using games with built-in benchmark modes.


I agree, the 8350 is a great CPU. It can outperform the 3570K (obviously not at everything, but it can). The clock on the 8350 was higher, but clock for clock the 3570K is faster. It'll be interesting to see Steamroller. Hopefully a 28nm process and each core getting its own decoder will give it a significant jump in performance, ideally with no excess power consumption. We could end up with something that not only competes with Ivy but perhaps Haswell. I'm not going to get my hopes up, though. Let's just hope AMD come through and create some competition :)

AMD needs to tighten the memory latency as well to beat Intel...
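To make the "clock for clock" point concrete, here is a small sketch; the clocks are the chips' rough stock base figures, while the scores are entirely invented for illustration:

```python
# Hypothetical scores; divide by clock to compare per-clock (IPC-ish) efficiency.
chips = {
    "FX-8350":  {"score": 100.0, "clock_ghz": 4.0},
    "i5-3570K": {"score": 98.0,  "clock_ghz": 3.4},
}

for name, c in chips.items():
    per_ghz = c["score"] / c["clock_ghz"]
    print(f"{name}: {per_ghz:.1f} points per GHz")

# The chip with the lower absolute score can still win per GHz,
# which is what "clock for clock the 3570K is faster" means.
```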

The Internet was invented by cats. Why? Why else would it have so many cat videos?


I use an 8350 and love it, but I will admit that the i5 is a beastly option. Having said that, there are a few good things about the 8350. One is that even when playing BF3 using 6 cores, you will still have 2 cores to cover stuff running in the background, and it overclocks like a beast. But the power draw is immense. If you can afford an i5, get it; but if you are a little budget-restrained, I would highly recommend the 8350.

I love gaming, PC building, and talking tech.

