What do AMD CPUs have over Intel?

No, you posted delusions. Whoever thinks an 8350 is more future-proof than an i7, especially Haswell i7s and Ivy Bridge-E, is delusional. Period.

 

Please read the post above you and the post I linked to. I was talking about the i5 processor that was quoted. Also, please read the whole thread before you start flaming people.

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


Yeah, we should ask techfan@ic for an article on why 8350s are 4 times faster.

 

He claimed that the 8350 walks all over the 4770k.

Amusing...


Please read the post above you and the post I linked to. I was talking about the i5 processor that was quoted.

Stop lying.

CperkVH.png

Told you you wouldn't get any further than twisting your BS.

 


There is a reason for the huge price difference between the AMD 8350 and the Intel 4770K. How can you even compare these two?!

 

Spoiler

CPU: Intel Xeon X5660 @ 4.2 GHz | RAM: 6x2 GB 1600MHz DDR3 | MB: Asus P6T Deluxe | GPU: Asus GTX 660 TI OC | Cooler: Akasa Nero 3

SSD: OCZ Vertex 3 120 GB | HDD: 2x640 GB WD Black | Fans: 2x Corsair AF 120 | PSU: Seasonic 450 W 80+ | Case: Thermaltake Xaser VI MX | OS: Windows 10
Speakers: Altec Lansing MX5021 | Keyboard: Razer Blackwidow 2013 | Mouse: Logitech MX Master | Monitor: Dell U2412M | Headphones: Logitech G430

Big thanks to Damikiller37 for making me an awesome Intel 4004 out of trixels!


Stop lying.

CperkVH.png

Told you you wouldn't get any further than twisting your BS.

 

 

You should read some of the BS that you have been posting before judging someone's opinion as "BS"; you're clearly out to just flame the other arguments, even though I gave the reasoning for my opinion. :D

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


I don't think it's BS; I just think he's ignorant and doesn't know any better.


Even though I gave the reasoning for my opinion. :D

I thought you proved it instead of reasoning it? Another 180° turn, back to the same BS.

 

 

You should read some of the BS that you have been posting before judging someone's opinion as "BS"; you're clearly out to just flame the other arguments, even though I gave the reasoning for my opinion. :D

You can't be subjective about which CPU performs better. Use some logic; come back in 10 years when you have a full beard and amuse us with how many GPUs you run in SLI or CrossFire and how much FPS you get with 3x 25K monitors on that future-proof YOLO 8320. I bet you would deny that there's no higher number than 9.


Wait, what? An 8350 does the job in 65% of the 4770k's time? Please give me that source, because I have never seen that one, like, ever. I am literally searching right now, and the only thing I can find is that an 8350 is faster than a 3570k when rendering,

 

 

and a ton of reviews show that it can beat the i5 but not the i7. A few months ago I saw 3D CAD design and 3D modeling benchmarks; that is where AMD really did well: almost as good as a 4770k, but a lot cheaper, especially considering current prices...

Oh yeah, heavy Photoshop editing works great as well.

 

This one is also interesting: http://www.guru3d.com/articles-pages/core-i7-4770k-review,15.html

 

Now, I am not bashing the 8350 here, but for that 65%-of-the-time claim you had better have a good source, because I can't really find it.

In a later post we speculated about codec differences, and after chatting with the YouTuber, we found out he was using an antiquated codec on Windows Movie Maker circa 2009. After updating, the two chips were on pretty much equal footing.

 

The video was subsequently removed, and it wasn't well tagged in the first place. I stumbled upon it while looking at 4790k speculation videos in late May. That's the other reason you likely didn't find it.

 

Side note: YouTubers who don't tag their videos properly suck.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Some people could argue that they offer more cores for lower prices, because they compare them to an i5 CPU, which has no Hyper-Threading. So really, to be honest, those CPUs are good for... I don't know, a stove, maybe?


 

I thought you proved it instead of reasoning it? Another 180° turn, back to the same BS.

 

 

You can't be subjective about which CPU performs better. Use some logic; come back in 10 years when you have a full beard and amuse us with how many GPUs you run in SLI or CrossFire and how much FPS you get with 3x 25K monitors on that future-proof YOLO 8320. I bet you would deny that there's no higher number than 9.

 

25K? Good lord, you people really need to read the research literature. 4K displays are about the limit of how much more your eye can pick up. What is really interesting about 4K and 8K displays is the aspect ratios supported, including 16:10. Some orientations are more pleasing to the eyes, but the overall detail you can pack in is reaching its limit at a standard sitting distance.

 

4K doesn't even make sense for movie theaters. I can't fathom us going beyond 4K monitors except as a gimmick.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Hi everyone, I'm new here. Reading this thread I just see a lot of fanboys. This should be a good discussion, but it has turned into fighting; it sounds like Apple vs. Samsung fanboys. In my noob PC experience, AMD is cheaper and Intel isn't, AMD is better at some things and Intel at others, and vice versa. Stop fighting, dudes :).

 

So far, I don't think this forum overall is a good place for intelligent discussion.

 

In this forum, intelligent discussion = ad hominem attacks combined with hurt feelings.

"It seems we living the American dream, but the people highest up got the lowest self esteem. The prettiest people do the ugliest things, for the road to riches and diamond rings."- Kanye West, "All Falls Down"

 


You would be lucky to find a forum where people discuss things intelligently, especially hardware.

 

+1

"It seems we living the American dream, but the people highest up got the lowest self esteem. The prettiest people do the ugliest things, for the road to riches and diamond rings."- Kanye West, "All Falls Down"

 


25K? Good lord, you people really need to read the research literature. 4K displays are about the limit of how much more your eye can pick up. What is really interesting about 4K and 8K displays is the aspect ratios supported, including 16:10. Some orientations are more pleasing to the eyes, but the overall detail you can pack in is reaching its limit at a standard sitting distance.

 

4K doesn't even make sense for movie theaters. I can't fathom us going beyond 4K monitors except as a gimmick.

It was a joke. I don't know why you want to argue about this. And 4K, or even a 4K monitor, has never been the limit of how much our eyes can pick up; it's not about resolution alone, it's about resolution in relation to the size of the screen. There's a certain PPI wall beyond which our eyes won't notice a difference, depending completely on how good your eyes are. 4K would look worse on a 250" TV than on a 10" screen. http://lawlzawu1a.blogspot.be/2014/04/what-ppi-does-human-eye-have.html
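The screen-size point above is easy to put in numbers: the same 4K resolution gives wildly different pixel densities depending on the diagonal. A minimal sketch (the function names are mine, and the 1-arcminute acuity figure is a common textbook approximation, not a number from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

def max_useful_distance_in(ppi_value, acuity_arcmin=1.0):
    """Viewing distance (inches) beyond which adjacent pixels blur together,
    assuming roughly 1 arcminute of visual acuity (a common approximation)."""
    pixel_pitch_in = 1.0 / ppi_value
    theta = math.radians(acuity_arcmin / 60.0)
    return pixel_pitch_in / math.tan(theta)

# Same 3840x2160 resolution, very different density:
print(round(ppi(3840, 2160, 250)))  # 250" TV   -> ~18 PPI
print(round(ppi(3840, 2160, 10)))   # 10" tablet -> ~441 PPI
```

So the "PPI wall" isn't a property of 4K itself; it only emerges once you fix the screen size and how far away you sit.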


You should read some of the BS that you have been posting before judging someone's opinion as "BS"; you're clearly out to just flame the other arguments, even though I gave the reasoning for my opinion. :D

 

FX8320 Vs. i5 3570k Vs. i5-4670k Aggregate Comparison

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


I'd say that AMD has the upper hand when it comes to budget components (ignoring the higher power consumption), and in Germany they are much cheaper compared to Intel and Nvidia... to the point where, if budget plays a role, you really don't have any alternative to AMD. We Germans generally get screwed when it comes to computer stuff.

Also, AMD products seem to have hidden strengths in weird areas.

In America, I'd guess that AMD doesn't really win anywhere when gaming is the only thing that matters.


 

In America I'd guess that AMD doesn't really win anywhere when Gaming is the only thing that matters.

That's just not true. We have servers and supercomputers as well as office PCs, and in all three areas Intel just has the better performance, except in a few niche cases.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


That's just not true. We have servers and supercomputers as well as office PCs, and in all three areas Intel just has the better performance, except in a few niche cases.

Yes, Intel's lower power consumption plays a huge role in this. I wasn't even considering servers and the like with that statement though.

 


According to Jayz, it would take about 18 years for the extra power cost of an AMD PC on your electricity bills to equal the money you'd spend on the more expensive Intel CPUs. So practically, they are cheaper and have similar performance.

My PC:

Spoiler

MOBO: MSI B450 Tomahawk Max, CPU: AMD Ryzen 5 3600, Cooler: BeQuiet! Dark Rock 3, GPU: Gigabyte GTX 1050ti D5 4G, Ram: 16GB (2x8) HyperX Fury DDR4, Case: NZXT S340, Psu: Be Quiet! Pure Power 11 600W , HDD's: WD 1TB Caviar Blue, WD 256GB Scorpio Blue, WD 2TB Caviar Blue  SSD: Sandisk SSD PLUS 240Gb


According to Jayz, it would take about 18 years for the extra power cost of an AMD PC on your electricity bills to equal the money you'd spend on the more expensive Intel CPUs. So practically, they are cheaper and have similar performance.

Jayz's math is flawed; he constantly forgets that Intel has vastly superior low-energy idle states.

 

Put the 4770k against the 8350, for instance: http://cpuboss.com/cpus/Intel-Core-i7-4770K-vs-AMD-FX-8350

 

By my count that's only 5 years to completely recoup the cost, and even though we're moving to DDR4 and PCIe 4, which somewhat negates the argument from a macroarchitecture perspective, CPU improvements these days are small and getting smaller. It's less and less worth it to upgrade every 3 years, except for the aforementioned reasons, which are events every 6-10 years.
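Whether the payback period comes out at 5 years or 18 is just arithmetic: the price premium divided by the yearly electricity savings. A minimal sketch (the wattage, price, and tariff figures are purely illustrative assumptions, not numbers from the thread):

```python
def payback_years(price_delta, watt_delta, hours_per_day, cost_per_kwh):
    """Years until a pricier but more efficient CPU recoups its premium
    through lower electricity bills."""
    kwh_per_year = watt_delta * hours_per_day * 365 / 1000.0
    yearly_savings = kwh_per_year * cost_per_kwh
    return price_delta / yearly_savings

# Illustrative inputs: $140 premium, 60 W higher draw under load,
# 4 hours of load per day, $0.12 per kWh.
print(round(payback_years(140, 60, 4, 0.12), 1))  # -> 13.3
```

The result swings hugely with the assumed load hours and electricity price, which is exactly why two people can honestly arrive at 5-year and 18-year figures from the same hardware.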

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I'm curious as to how a discussion about CPUs turned into one about Jews, but I'm too lazy to look.

Dude, I read it. It was comical.

Spoiler

Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi  – RAM: 4 x 16 GB G. Skill Trident Z @ 3200mhz- GPU: ASUS  Strix Geforce GTX 1080ti– Case: Phankteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB PSU: EVGA 1000P2– Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor - Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB Mouse: G502 Rgb & G Pro Wireless– Sound: Logitech z623 & AKG K240


Nazis were mentioned; Godwin's law, mate.

I was at work when one of my colleagues mentioned Godwin's Law. I burst out laughing.

Spoiler

Cpu: Ryzen 9 3900X – Motherboard: Gigabyte X570 Aorus Pro Wifi  – RAM: 4 x 16 GB G. Skill Trident Z @ 3200mhz- GPU: ASUS  Strix Geforce GTX 1080ti– Case: Phankteks Enthoo Pro M – Storage: 500GB Samsung 960 Evo, 1TB Intel 800p, Samsung 850 Evo 500GB & WD Blue 1 TB PSU: EVGA 1000P2– Display(s): ASUS PB238Q, AOC 4k, Korean 1440p 144hz Monitor - Cooling: NH-U12S, 2 gentle typhoons and 3 noiseblocker eloops – Keyboard: Corsair K95 Platinum RGB Mouse: G502 Rgb & G Pro Wireless– Sound: Logitech z623 & AKG K240


Intel's technology allows them to deliver the same amount of processing power on fewer cores than an AMD processor, but some software takes advantage of more cores.

