Core i3 7350K Overclockable Dual Core Review

30 minutes ago, MageTank said:

I've been trying to figure out what possible excuse they could have for wanting to show 4k results on a CPU review, but I honestly cannot think of any. The sad part is, people are going to buy this CPU, thinking it will be no different than an i7 in gaming because they decided to show 4k results (and mostly average framerates too). When the people buy this product, and use it on their budget 1080p setups, they are going to wonder why their i3 is stuttering hard in titles like BF4 and GTA 5. 

They actually tested the i5 7400, but the point still stands: with a GPU bottleneck and no minimums in the charts, this really tells us next to nothing.

 

Looking at Hardware Unboxed's tests, the i3 7350K at 4.8GHz was almost consistently beaten by a... stock i5 4670K, even though they didn't take proper frame time analysis into account either. Can't wait for Digital Foundry to get their hands on it.
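To make the averages-vs-minimums point concrete, here's a minimal sketch (made-up frame-time data, not numbers from any review) of how an average-fps chart can hide exactly the stutter described above:

```python
# Sketch: why average fps hides stutter. Frame-time data below is
# hypothetical, chosen only to illustrate the point.

def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from per-frame render times."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)      # slowest frames first
    slowest_1pct = worst[: max(1, len(worst) // 100)]  # the worst 1% of frames
    low_ms = sum(slowest_1pct) / len(slowest_1pct)
    return 1000 / avg_ms, 1000 / low_ms

smooth = [16.7] * 1000                  # steady ~60 fps run
stutter = [14.0] * 990 + [120.0] * 10   # higher average, but with big hitches

print(fps_stats(smooth))   # ~60 fps average, ~60 fps 1% low
print(fps_stats(stutter))  # ~66 fps average, but 1% low collapses to ~8 fps
```

The stuttery run actually posts the *better* average, which is why a chart with only averages can make a dual core look fine.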

If you want to reply back to me or someone else USE THE QUOTE BUTTON!                                                      
Pascal laptops guide


40 minutes ago, MageTank said:

I've been trying to figure out what possible excuse they could have for wanting to show 4k results on a CPU review, but I honestly cannot think of any. The sad part is, people are going to buy this CPU, thinking it will be no different than an i7 in gaming because they decided to show 4k results (and mostly average framerates too). When the people buy this product, and use it on their budget 1080p setups, they are going to wonder why their i3 is stuttering hard in titles like BF4 and GTA 5. 

 

This has been my issue with LTT for a while now. They are quick to jump to these conclusions without giving their methodology a second thought, then proclaim it as a fact to their viewerbase. When you point it out to them, nothing changes. No re-do's on the videos, and if you are extremely lucky, you get a pseudo-statement on the subject that is basically them not admitting fault, and saying the end result is still the same (without ever really showing that evidence either). 

 

I only hope people are smart enough to check multiple review sources before making a purchase decision. Luckily, some of the YouTube comments caught on to this as well, and are being highly thumbed up for others to see.

They did the same shit in their Skylake review. By doing a GPU benchmark at 4K they claimed Skylake wasn't really any better than Sandy Bridge. Same shit in their "RAM speed doesn't matter" video, running essentially a GPU benchmark on a 660 Ti by using tons of MSAA. LTT benchmark videos are pretty useless; Digital Foundry and Gamers Nexus are far superior now.


1 hour ago, Ryan_Vickers said:

Usually I don't really have anything to complain about but here's the quote of my comment from YouTube...

 

Nah, you'd want to run the game at ultra settings, since draw distance, shadows, number of objects, etc. affect the CPU. Dropping to 720p ultra would be best with a single 1080.


9 minutes ago, SteveGrabowski0 said:

Nah, you'd want to run the game at ultra settings, since draw distance, shadows, number of objects, etc. affect the CPU. Dropping to 720p ultra would be best with a single 1080.

Perhaps. My point is they took the step of picking a high-powered GPU in order to avoid a bottleneck, but then loaded it down with 4K ultra... makes no sense.
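A toy model of why this matters: treat each frame's cost as roughly max(CPU time, GPU time). Every number below is invented for illustration, not measured:

```python
# Toy bottleneck model: frame time ~ max(CPU frame cost, GPU frame cost).
# All per-frame costs here are hypothetical, chosen only to show the shape
# of the problem.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower of CPU/GPU gates each frame."""
    return 1000 / max(cpu_ms, gpu_ms)

cpu_cost = {"dual core": 8.0, "quad core i7": 5.0}   # ms of CPU work per frame
gpu_cost = {"720p": 4.0, "1080p": 6.0, "4K": 25.0}   # ms of GPU work per frame

for res, g in gpu_cost.items():
    line = ", ".join(f"{c}: {fps(m, g):5.1f} fps" for c, m in cpu_cost.items())
    print(f"{res:>5}: {line}")
```

At 4K both CPUs land on an identical 40 fps because the GPU gates every frame; only at 720p does the gap a CPU review is supposed to measure actually show up.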

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


@LinusTech Can't even remove the formatting for dark theme users on their OWN forum... that's just the icing on the cake that is their lack of proper testing methodology.


3 hours ago, AxelRantila said:

What is up with your thumbnails as of late? This wasn't what I subscribed to in the first place.

If you are going to say "it's because of YouTube's new algorithm", then sorry, but that tells me absolutely nothing if you aren't going to expand on it in more detail.

 

They use this shitty thumbnail to generate more comments and "discussion" on their video.


I think with this comparison you also have to take into account the cost of the cooling hardware needed for the OC on the 7350K. The 7400 will achieve those results using the stock Intel cooler, which in my view has been plenty adequate at default clock speeds for several generations of Intel processors now. The i3 is hooked up to an AIO cooler, adding substantial cost.


Weird. I watched Jayz's video earlier today and it showed MASSIVE differences between the i3, i5, and i7 in gaming. You guys shouldn't have tested at 4K, since that creates a GPU bottleneck before a CPU bottleneck, I think.

EDIT: just rewatched Jayz's video a bit. He did test at 1080p and indeed got big differences, but those were at framerates above 120fps, which makes it a little less relevant. People with 120Hz or higher monitors will probably not buy an i3.

An i5-7500 should still be a better buy for gaming, though. 4 full cores.

3 hours ago, shadowbyte said:

jesus fucking christ what is that unholy abomination of a thumbnail

I am wondering if he is sucking CPUs in or spitting CPUs out.


I'm sure they have already heard the complaints about the thumbnails, but instead they're keeping them to generate more controversy and discussion. Or to "troll" us, knowing that we hate them. Either way, I've been an LTT fan since long before he started his own company, and I have to say the videos are just getting worse and worse. Now we're getting misinformation along with the shitty thumbnails.


1 hour ago, Stefken89 said:

EDIT: just rewatched Jayz's video a bit. He did test at 1080p and indeed got big differences, but those were at framerates above 120fps, which makes it a little less relevant. People with 120Hz or higher monitors will probably not buy an i3.

Just aiming for a resolution the 1080 could draw at 60fps-ish would have been a start. On two of the tests they only got 25 and 28 fps, which for CPU purposes is useless information; almost any CPU from the last 6 years could play those games at 25-28 fps.

And there is a reason for using 1080p and getting ridiculous frame rates: it simulates a GPU upgrade, and the CPU load of games increasing over time.


4 hours ago, Dash Lambda said:

... Okay.

Okay...

Now I really get what people are talking about with the new thumbnails.

This is the first video from LTT I decided not to watch because of the thumbnail.

You didn't miss much, pretty half-assed really.

 

tl;dr: get an i5 7400 instead.

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


It's sort of ironic that on the WAN Show recently Linus was talking about how it's pointless for Samsung to refuse to call QLED TVs "quantum dot", going on about how his viewers are smarter than to be tricked by that, but then he pulls shit like this.

I don't have a complaint with the thumbnail; it's entertainment news. NBD. 

 

Also, why test vs an i5-7400? Surely it would be better to compare to the i5-6500, which has huge market share but is still a value option?  


Regarding the thumbnail: I don't understand the complaints. I'm glad the CPUs are coming out of his mouth and not his arse.

LTT needs to do some Linux kernel compilation benchmarks and VM benchmarks. Some people need to buy computers for work.

             ☼

ψ ︿_____︿_ψ_   


Note to Linus: please improve reviews before someone decides that this forum board makes a good springboard for their own hardware reviewing career. :P

My eyes see the past…

My camera lens sees the present…


Disregarding their terrible thumbnails, which make them look childish and unprofessional, their reviews have been shallow for a while. The GPU reviews have been pretty half-assed too. Their rationale is that if you want in-depth reviews, you can go to other sites. That's pretty arrogant of Linus to say.


5 hours ago, Wolther said:

I'm confused. What makes that i3 so good? o.O Don't they both have the same IPC? If so, shouldn't it be (cores × utilization) × clock speed? Or am I missing something?


Same IPC, but the i3 is clocked at 5GHz in their tests, and the tests were skewed by running 4K instead of 1080p. Had they run 1080p, the 7350K would have limited performance far sooner than the GTX 1080 would.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


3 minutes ago, Drak3 said:

Same IPC, but the i3 is clocked at 5GHz in their tests, and the tests were skewed by running 4K instead of 1080p. Had they run 1080p, the 7350K would have limited performance far sooner than the GTX 1080 would.

I know that. But the i5 should've been higher in the CPU tests too (not just the games); it had more total clocks than the i3 if you add up all the clocks from every core.


1 minute ago, Wolther said:

I know that. But the i5 should've been higher in the CPU tests too (not just the games); it had more total clocks than the i3 if you add up all the clocks from every core.

Parallelization doesn't scale 100%, and Hyper-Threading helps narrow the gap between the dual core and the quad core.
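The "add up all the clocks" intuition overstates the quad core's edge for exactly this reason. Amdahl's law gives a rough upper bound on multi-core speedup; the parallel fractions p below are assumptions for illustration only:

```python
# Amdahl's law: best-case speedup on n cores when a fraction p of the
# work parallelizes perfectly (the rest stays serial). The p values
# below are illustrative assumptions, not measurements of any workload.

def speedup(p, n):
    return 1 / ((1 - p) + p / n)

for p in (0.5, 0.8, 0.95):
    print(f"p={p}: 2 cores -> {speedup(p, 2):.2f}x, 4 cores -> {speedup(p, 4):.2f}x")
```

Even at p = 0.8, four cores only manage 2.5x over one, so a quad core's real-world edge over a fast dual core is smaller than raw "total GHz" suggests.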

 



9 minutes ago, Drak3 said:

Parallelization doesn't scale 100%, and Hyper-Threading helps narrow the gap between the dual core and the quad core.

 

In Cinebench though? I thought that was supposed to utilize everything fully.


1 minute ago, Wolther said:

In Cinebench though? I thought that was supposed to utilize everything fully.

Cinebench is fine for rough ideas, but no benchmark tests everything completely. That's why we don't rely on a single benchmark for most things.

Cinebench also benefits from Hyper-Threading. In the video I linked below, JayzTwoCents clocked the 7350K, 7600K, and 7700K at 4.7GHz. The 7700K (effectively the 7600K with Hyper-Threading and 2MB of additional L3 cache) was a fair deal better than the 7600K in Cinebench.

Also, note how he tested at 1080p with DX11 and got fairly expected results.



Lol, I bought my 4670K brand new for less than this costs.

 

ncix4670k.jpg

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333mhz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


Seriously, what is with the thumbnails? These are even worse than the CES ones. I'm determined not to watch this video, since I really don't want to reward them for using these absurd clickbait thumbnails.


Honestly, with the way they've been handling thumbnails, and like others have said, their testing methodology... I'm seriously thinking of unsubscribing from Linus Tech Tips.

I'd probably stay subbed to Tech Quickie and Channel Super Fun.

Currently focusing on my video game collection.

It doesn't matter what you play games on, just play good games you enjoy.

 

