Watch Dogs will not run on Dual-core CPUs

The game is capped at 82 FPS? That's strange...

Would you mind pasting a link to the full article, please? :)



Watch Dogs probably has a "special" version for the Xbone and PS4 with limited AI activity, no physics, Garry's Mod-level graphics, all running at a good 30 frames per second at a beautiful 790p.

remember resolution is just a number!



remember resolution is just a number!

A number that tells you how nice a picture will look on a screen, but just a number ;)



.....

Watch_Dogs is absolutely hammering my Core 2 Quad Q8200. All the cores run at 90-99%, with an average (total CPU usage) of around 93%, while my AMD 6870 sits at 35% utilization and barely breaks a sweat. Well, at least I can see that the engine has excellent CPU scaling, unlike AC4, which puts 80% of the load on the first 2 cores and leaves the other 2 doing nothing.
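If anyone wants to sanity-check numbers like these on their own rig, here's a minimal sketch (assuming the third-party psutil package is installed) that logs per-core load in a second window while the game runs:

```python
# Minimal per-core CPU load logger (pip install psutil).
# Run it alongside the game to see whether the load really is
# spread across all cores or stuck on one or two of them.
import psutil

for _ in range(30):  # ~30 one-second samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    avg = sum(per_core) / len(per_core)
    print(" ".join(f"{c:5.1f}%" for c in per_core), f"| avg {avg:5.1f}%")
```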



82 isn't the cap, just where the GPU hit its limit, I suppose. That's what they say, I guess.

But here's the link http://www.techspot.com/review/827-watch-dogs-benchmarks/page5.html

I love how the R9 290 and GTX 780 are freaking neck and neck, not a single FPS of difference between the two of them!



Watch_Dogs is absolutely hammering my Core 2 Quad Q8200. All the cores run at 90-99%, with an average (total CPU usage) of around 93%, while my AMD 6870 sits at 35% utilization and barely breaks a sweat. Well, at least I can see that the engine has excellent CPU scaling, unlike AC4, which puts 80% of the load on the first 2 cores and leaves the other 2 doing nothing.

Yes, this is the pattern for all the next-gen games to come... better use of multi-core CPUs is something we should all enjoy, unless you own a dual-core CPU lol :P



I expect some additional game patches and driver updates, but it's nice that the game is using multitasking/multithreading. That saved the bacon for AMD FX owners :D
I definitely need to re-test the game with the tweak SkilledRebuilds provided.

EDIT: Most of these benchmarks use average fps values, but as I noted before, I can have a pretty nice avg fps while the drops to minimum fps are terrible...
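That average-vs-minimum point is easy to quantify if you log frame times. A rough sketch (assuming a plain text file with one frame time in milliseconds per line; adjust the parsing to whatever your capture tool actually writes):

```python
# Average fps vs. 1% lows from a frame-time log (one ms value per line).
import sys

with open(sys.argv[1]) as f:
    frame_ms = [float(line) for line in f if line.strip()]

fps = sorted(1000.0 / ms for ms in frame_ms)
lows = fps[: max(1, len(fps) // 100)]          # worst 1% of frames

print(f"average fps : {sum(fps) / len(fps):.1f}")
print(f"1% low fps  : {sum(lows) / len(lows):.1f}")
print(f"worst frame : {fps[0]:.1f} fps")
```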


Look at that Core i3, knocking at the 8350's back door. :P Such an awesome budget gaming CPU. It generally costs less than an FX-6300 (which it typically outperforms) and barely holds back a high-end GPU like the R9 290X.

Again, I stand firmly behind the fact that when it comes to gaming, the number of cores in a CPU does not matter. What does matter is IPC per core, architecture, cache and thread scheduling.
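To put some (made-up, purely illustrative) numbers behind that, here's a toy Amdahl's-law-style model of why two fast cores can hang with eight slower ones unless nearly all of the per-frame work is parallel:

```python
# Toy model, not a benchmark: relative throughput when a fraction `p`
# of the per-frame work scales across cores and the rest stays serial.
# `speed` is per-core throughput (roughly IPC x clock) vs. a baseline.
def relative_fps(cores: int, speed: float, p: float) -> float:
    serial = (1.0 - p) / speed
    parallel = p / (speed * cores)
    return 1.0 / (serial + parallel)

for name, cores, speed in [("2 fast cores", 2, 1.8), ("8 slower cores", 8, 1.0)]:
    for p in (0.5, 0.8, 0.95):
        print(f"{name:>14} p={p:.2f} -> relative fps ~{relative_fps(cores, speed, p):.2f}")
```

With half the work serial the fast cores win outright, at p=0.8 it's roughly a wash, and only near p=0.95 does the eight-core chip pull clearly ahead.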



Look at that Core i3, knocking at the 8350's back door. :P Such an awesome budget gaming CPU. It generally costs less than an FX-6300 (which it typically outperforms) and barely holds back a high-end GPU like the R9 290X.

Again, I stand firmly behind the fact that when it comes to gaming, the number of cores in a CPU does not matter. What does matter is IPC per core, architecture, cache and thread scheduling.

Yes, but the difference between an i3 and an i7 isn't compelling either... the difference is 6 fps... a very expensive 6 frames, I should add :D

Also note that the FX processors have unlocked multipliers, in contrast to the i3, and only a fool would run these processors at default frequencies...


Look at that Core i3, knocking at the 8350's back door. :P Such an awesome budget gaming CPU. It generally costs less than an FX-6300 (which it typically outperforms) and barely holds back a high-end GPU like the R9 290X.

Again, I stand firmly behind the fact that when it comes to gaming, the number of cores in a CPU does not matter. What does matter is IPC per core, architecture, cache and thread scheduling.

And if you look at prices, the FX-8320 costs less than a Core i3-4350 and the FX-6300 costs MUCH less than even the cheapest i3-4130!

 

Both AMD chips are unlocked and will outperform ANY Core i3 in any multi-threaded games and applications; you can stand behind the i3 all you want, but you're wrong.

The FX-8320 will clock at 4.6GHz or more and totally outperform the Core i3 in any MODERN game there is.

 

The FX-6300 is $109: http://pcpartpicker.com/part/amd-cpu-fd6300wmhkbox

The FX-8320 is $134: http://pcpartpicker.com/part/amd-cpu-fd8320frhkbox

 

On the CPU side, the Core i5 and FX-8000 series processors can get the most out of a high-end GPU, or near enough. The FX-6000 as well as the Core i3 series also perform well, while those with a FX-4000 series CPU will lose quite a bit of performance and AMD APU/Phenom II users are out of luck. Likewise, Haswell-based Pentium and Celeron processors won't do that well in Watch Dogs either, if that comes as any surprise.

http://www.techspot.com/review/827-watch-dogs-benchmarks/page6.html

 

It's only a matter of time before a Core i3 gets totally overloaded in modern games like these; its multi-threaded performance is just way too low.

https://docs.google.com/spreadsheet/ccc?key=0AlC81MjwelBgdEZNV3l6aHl1eUNwSUR4Rml0MXMzN1E&usp=sharing#gid=0



I think the reason the Wolfenstein team wanted a high-end CPU requirement is that the game engine uses heavy compression, which is demanding on the CPU. It's also why the game's textures look like ass. Watch Dogs doesn't use the same crappy engine that Wolfenstein does, so I don't understand why they demand such a high-end CPU. The game does have really bad technical problems, though, such as tearing, frame-rate and frame-time issues. It also doesn't look as good as the footage we saw of the game last year, which is a shame. I just hope a patch is in the works to give us the better textures and fix the visual problems.



Look at that Core i3, knocking at the 8350's back door. :P Such an awesome budget gaming CPU. It generally costs less than an FX-6300 (which it typically outperforms) and barely holds back a high-end GPU like the R9 290X.

Again, I stand firmly behind the fact that when it comes to gaming, the number of cores in a CPU does not matter. What does matter is IPC per core, architecture, cache and thread scheduling.

My 8320 was actually less expensive than an i3, so I'm pretty happy with its performance.


:(

Catalyst 14.6 basically just threw a big red Gaming Evolved hammer at my shit. <- for the lols, the GPU is fine.

Someone should also take this image, put the AMD Gaming Evolved logo on the hammer, and saturate the red some more ;)


Just moved to W8.1 (cos of the BF4 boost and all that jazz, meaning also a recent install) and updated from 14.4 (which also software-hammered my W7 install, but worked great on W8.1). This comical little kindergarten BSOD

has had 3 chances to start up properly and hasn't, so it gets the DDU chop, and 14.4 installs perfectly fine.

So 14.6... it's gone... wasn't tested, wanted the improvements... have to wait :(

Wasn't your average BSOD either: proper memory-image garbledness like those 3D depth paintings/images... except I didn't wanna stare at this one too long :/

1080p, 8xAA, Ultra (every option set to the right, config edited for my PC's spec) (first playable scene, in the locker room)

Catalyst 14.4 - First 15 seconds of the GPU-rendered locker room scene: 22 fps, then it jumps to 24-26 fps, then sits around 30 fps for a little while, then varies between 24-31 fps <- initial level load and smoothing-out period.

*Placeholder for when 14.6 actually works and I'll put comparative results here (PC details are in my sig, although I am running a 1050MHz core clock).

 

Obviously I won't be using 8xAA (even with the "up to 35% extra" that 14.6 brings) but will most likely use 4xAA if it stays above 55 fps.

If not, 2xAA/SMAA and SweetFX or something like that to cover up the semi-uglies :)

I already applied the pagefile tweak before I started looking at the FPS, so there's also that...

Loving the game (as I never really played GTA for extended periods to finish it or "get into it"), and I've been playing this off and on for the last two hours now; pretty awesome moments... & it performs quite well with this 14.4 set.



It is a well-known fact that i3 CPUs cannot play modern games. Everyone who says otherwise is a paid INTEL shill, those benchmarks are a lie, more cores is always better, I have proof, listen to me.

Have you not read the official UBISOFT announcement? Dual core cannot run this game, it is confirmed - http://tech4gamers.com/watch-dogs-will-not-run-at-dual-core-processors/ -. Do not listen to evil propaganda, do not look at benchmarks, listen to the truth. The truth is in the cores.

Peace.


It is a well-known fact that i3 CPUs cannot play modern games. Everyone who says otherwise is a paid INTEL shill, those benchmarks are a lie, more cores is always better, I have proof, listen to me.

Have you not read the official UBISOFT announcement? Dual core cannot run this game, it is confirmed - http://tech4gamers.com/watch-dogs-will-not-run-at-dual-core-processors/ -. Do not listen to evil propaganda, do not look at benchmarks, listen to the truth. The truth is in the cores.

Peace.

The Core i3 does have hyperthreading, so it will run Watch Dogs fine on medium/high settings... no doubt, but if you scroll back a bit in the thread and check the prices for the FX-6300 and FX-8320, the i3 is really not worth it; it costs more than those and won't perform anywhere near them.

The benchmarks seem pretty accurate to me: an FX 8-core @ 4.2GHz will feed an R9 290X or 780 Ti, and the same goes for a Core i5-4670K, which is what was expected... the FX-6300 needs a good speed bump to keep up, but it does well, as expected... the Core i3 can't be overclocked and runs on 2 fast hyperthreaded cores, so it sort of keeps up, which was also to be expected, and the Core i7 will outrule them all forever, again as expected.



A notice to all the peeps reading this thread:

The new Catalyst 14.6 Beta seems very unstable. I tried numerous times to install it, but it only ever ended in a BSOD or an install error. Your experience may vary.


And if you look at prices, the FX-8320 costs less than a Core i3-4350 and the FX-6300 costs MUCH less than even the cheapest i3-4130!

 

Both AMD chips are unlocked and will outperform ANY Core i3 in any multi-threaded games and applications; you can stand behind the i3 all you want, but you're wrong.

The FX-8320 will clock at 4.6GHz or more and totally outperform the Core i3 in any MODERN game there is.

 

The FX-6300 is $109: http://pcpartpicker.com/part/amd-cpu-fd6300wmhkbox

The FX-8320 is $134: http://pcpartpicker.com/part/amd-cpu-fd8320frhkbox

 

On the CPU side, the Core i5 and FX-8000 series processors can get the most out of a high-end GPU, or near enough. The FX-6000 as well as the Core i3 series also perform well, while those with a FX-4000 series CPU will lose quite a bit of performance and AMD APU/Phenom II users are out of luck. Likewise, Haswell-based Pentium and Celeron processors won't do that well in Watch Dogs either, if that comes as any surprise.

http://www.techspot.com/review/827-watch-dogs-benchmarks/page6.html

 

It's only a matter of time before a Core i3 gets totally overloaded in modern games like these; its multi-threaded performance is just way too low.

https://docs.google.com/spreadsheet/ccc?key=0AlC81MjwelBgdEZNV3l6aHl1eUNwSUR4Rml0MXMzN1E&usp=sharing#gid=0

 

 

My 8320 was actually less expensive than an i3, so I'm pretty happy with its performance.

 

 

The Core i3 does have hyperthreading, so it will run Watch Dogs fine on medium/high settings... no doubt, but if you scroll back a bit in the thread and check the prices for the FX-6300 and FX-8320, the i3 is really not worth it; it costs more than those and won't perform anywhere near them.

 

The benchmarks seem pretty accurate to me: an FX 8-core @ 4.2GHz will feed an R9 290X or 780 Ti, and the same goes for a Core i5-4670K, which is what was expected... the FX-6300 needs a good speed bump to keep up, but it does well, as expected... the Core i3 can't be overclocked and runs on 2 fast hyperthreaded cores, so it sort of keeps up, which was also to be expected, and the Core i7 will outrule them all forever, again as expected.

 

OK - all of you, let's back it up for a sec and look at the specifics. In the benchmarks posted here: http://www.techspot.com/review/827-watch-dogs-benchmarks/page5.html the specific CPUs tested were the i3-4130 and FX-8350. Not the more expensive i3s (which I wouldn't buy or recommend anyway) and not the 8320 (which I don't deny is a very good CPU for the price).

 

That being said, the current prices for these CPUs on PCPartPicker, as they stand right now, are as follows:

 

i3-4130 = $118

FX-8350 = $189

 

So yeah, this i3 is not more expensive than an 8350. It is actually the contrary, and by quite a bit. The FX-6350, which matched this i3, is currently listed at around $139, and the 8320 is priced really well at $134 - still more than the entry-level i3 used in this comparison. So you can't simply say "my FX-8320/8350 cost less than an i3," because not all i3s are priced the same. Details, people, details! ;)
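Putting the thread's own numbers together (the prices above and the roughly 3 fps ultra-preset gap mentioned earlier), the premium works out like this - a quick sketch that treats the 3 fps gap as approximate:

```python
# Quick dollars-per-extra-frame check using the prices quoted above
# and the approximate 3 fps average gap at ultra from the TechSpot data.
i3_4130_price = 118
fx_8350_price = 189
fps_gap = 3  # approximate

extra_cost = fx_8350_price - i3_4130_price
print(f"extra cost           : ${extra_cost}")
print(f"cost per extra frame : ${extra_cost / fps_gap:.0f}")
```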

 

The FX-6300 is a far better value than the 6350 at just $109 - but there are still several reasons I, personally, would prefer and recommend this i3 (the 4130) over the 6300 specifically for gaming. Also, this is not the only game that has shown that a high core count does not automatically equal better performance, and this is a game said to require a CPU with 4+ threads/cores. You also have to think about it from the perspective of looking forward. The only upgrade path for an FX-6300 machine is the 8320/8350, which requires certain 970-chipset motherboards or 990/990FX motherboards with an adequate power phase design. These are generally pricier, though there are a few exceptions. With Intel you can go with pretty much any LGA1150 motherboard you want and drop in anything from a Pentium to an i7 or Xeon with zero issues, leaving a good chunk more of your budget available for a stronger GPU on your initial build. I'm not saying to buy the cheapest motherboard, but you can get some very good quality, feature-filled Intel boards at really good prices.

 

I am not an Intel fanboy; I am simply stating facts. Considering what the i3-4130 is and how well it performs at that price point, even stacked up against more expensive CPUs with twice the core count in a modern triple-A game, it is a very impressive CPU. Anyone looking to game on a budget who wants the best performance for their money should look very closely at it and consider its actual performance in gaming, not how many cores it has.



Just to add to my last post... 

 

There are a lot of factors and variables that affect gaming performance, and it varies from game to game. I realize Watch Dogs may not be "properly optimized" yet and that that might be partly giving an edge to the i3 - knowing Intel's strong per-core grunt. But that is also a testament to its capabilities. Many games that are years past release are still not well optimized (at least on PC) *cough* Skyrim *cough*.

 

When comparing CPUs within certain price points, it's difficult to say one performs better than the other as a blanket statement, because though it may be true in certain games, it may be the opposite in others. What I would suggest is to ultimately ignore brand and core count, look at the performance of all the CPUs within your budget's price point, and see which performs best - overall - in the types of games you'll be playing the most.

 

Getting a bit off-topic, I know. The bottom line is that Watch Dogs, as it currently stands, can run on a dual-core CPU and does so very well on the sub-$120 i3-4130.



...and so it does, and even better on a $134 overclocked FX-8320.



Does it annoy anyone else that devs push these ridiculous system requirements only to have them not be true? I mean, I was expecting it because it's just marketing garbage, but come the fuck on: this site http://www.techspot.com/review/827-watch-dogs-benchmarks/page5.html shows an i3 coming 3 FPS behind an 8350 on Ultra, for fuck's sake. They said it won't run on dual-core CPUs, but there it is, chugging away right behind the top-range chips. This reminds me of the CoD: Ghosts devs putting in that 6GB requirement, but when the game was cracked and the requirement was stripped, it was found that it uses just 2GB.



...and so it does, and even better on a $134 overclocked FX-8320.

No denying that; however, if you consider that an OC'd 8320 probably performs about the same as a stock-clocked 8350, then we're only looking at about a 3 fps difference. Is that really worth it when you have to spend more on a motherboard that can OC an 8320, plus the extra cost of a decent CPU cooler? Is ~$50+ extra worth those few extra frames? I guess that's a question that boils down to personal preference and budget.

 

Does it annoy anyone else that devs push these ridiculous system requirements only to have them not be true? I mean, I was expecting it because it's just marketing garbage, but come the fuck on: this site http://www.techspot.com/review/827-watch-dogs-benchmarks/page5.html shows an i3 coming 3 FPS behind an 8350 on Ultra, for fuck's sake. They said it won't run on dual-core CPUs, but there it is, chugging away right behind the top-range chips. This reminds me of the CoD: Ghosts devs putting in that 6GB requirement, but when the game was cracked and the requirement was stripped, it was found that it uses just 2GB.

The i3 is hyper-threaded, which means it has far more efficient thread scheduling (it can handle 4 threads at once) versus a straight dual-core CPU like the Pentium G3220 further down the list. That being said, even the Pentium still runs the game at acceptable frame rates with an R9 290X. That's an odd CPU/GPU combo you wouldn't normally see, but the fact remains that the game can very well be run on dual-core machines.

 

This is why I repeatedly advocate that it doesn't matter how many cores a CPU has. What matters is overall performance (per-core IPC combined across all cores).
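A quick way to see the hyper-threading point for yourself (again assuming the psutil package is available): an i3 of this era reports 2 physical cores but 4 logical CPUs, which is why it isn't a "plain" dual core.

```python
# Physical cores vs. hardware threads - a Haswell i3 shows 2 and 4.
import psutil

print("physical cores :", psutil.cpu_count(logical=False))
print("logical threads:", psutil.cpu_count(logical=True))
```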



No denying that; however, if you consider that an OC'd 8320 probably performs about the same as a stock-clocked 8350, then we're only looking at about a 3 fps difference. Is that really worth it when you have to spend more on a motherboard that can OC an 8320, plus the extra cost of a decent CPU cooler? Is ~$50+ extra worth those few extra frames? I guess that's a question that boils down to personal preference and budget.

 

The i3 is hyper-threaded, which means it has far more efficient thread scheduling (it can handle 4 threads at once) versus a straight dual-core CPU like the Pentium G3220 further down the list. That being said, even the Pentium still runs the game at acceptable frame rates with an R9 290X. That's an odd CPU/GPU combo you wouldn't normally see, but the fact remains that the game can very well be run on dual-core machines.

 

This is why I repeatedly advocate that it doesn't matter how many cores a CPU has. What matters is overall performance (per-core IPC combined across all cores).

Well, even Wikipedia states that IPC doesn't really matter to the end user:

 

For users and purchasers of a computer system, instructions per clock is not a particularly useful indication of the performance of their system.

And I have to agree with them. IMO an i7 has less than double the IPC of an FX-8350, and it still only puts out a few more frames for twice the price...

Long story short, IPC matters if you like your benchmark numbers higher; otherwise it's a nice thing to have, but it usually doesn't affect much.
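For what it's worth, the textbook relationship behind this argument (with illustrative numbers only) is:

\[
T_{\text{thread}} \;=\; \frac{N_{\text{instructions}}}{\text{IPC} \times f_{\text{clock}}}
\qquad\Longrightarrow\qquad
\frac{T_A}{T_B} \;=\; \frac{\text{IPC}_B \times f_B}{\text{IPC}_A \times f_A}
\]

So at the same clock, a core with roughly 1.7x the IPC (an illustrative ratio, not a measured one) finishes the same worker thread in about 60% of the time - but that only shows up as extra fps when the game is actually limited by that thread rather than by the GPU.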


Well, even Wikipedia states that IPC doesn't really matter to the end user:

 

And I have to agree with them. IMO an i7 has less than double the IPC of an FX-8350, and it still only puts out a few more frames for twice the price...

Long story short, IPC matters if you like your benchmark numbers higher; otherwise it's a nice thing to have, but it usually doesn't affect much.

IPC certainly does matter. If an i5 had the same per-core IPC as an 8350, it would perform far worse. The fact that it has close to twice the IPC per core is why it performs about the same in games optimized for 4+ threads and better in games optimized for 1-2 threads. An i3, with roughly the same IPC as an i5/i7 and only 2 cores, is able to hang with an 8350 partly because of its high IPC and partly because of hyperthreading. So yeah, IPC matters. Alongside IPC you need sufficient cache and efficient scheduling; both AMD and Intel have that, but it works differently in each type of CPU. So yes, there is more to it than IPC, but IPC does have a significant impact on CPU performance, which translates to better gaming performance (higher FPS), which does matter to the end user.

The difference in performance is not much - almost splitting hairs when we're talking about 3 FPS either way. So it really doesn't matter whether you go with an i3 or an 8320 if the end result is the same; that I'm not disputing. The fact is, a big part of the reason the i3 can perform like an 8320/8350 is its higher IPC. So you can't say IPC doesn't matter.



IPC certainly does matter. If an i5 had the same per-core IPC as an 8350, it would perform far worse. The fact that it has close to twice the IPC per core is why it performs about the same in games optimized for 4+ threads and better in games optimized for 1-2 threads. An i3, with roughly the same IPC as an i5/i7 and only 2 cores, is able to hang with an 8350 partly because of its high IPC and partly because of hyperthreading. So yeah, IPC matters. Alongside IPC you need sufficient cache and efficient scheduling; both AMD and Intel have that, but it works differently in each type of CPU. So yes, there is more to it than IPC, but IPC does have a significant impact on CPU performance, which translates to better gaming performance (higher FPS), which does matter to the end user.

The difference in performance is not much - almost splitting hairs when we're talking about 3 FPS either way. So it really doesn't matter whether you go with an i3 or an 8320 if the end result is the same; that I'm not disputing. The fact is, a big part of the reason the i3 can perform like an 8320/8350 is its higher IPC. So you can't say IPC doesn't matter.

IPC does matter to CPU performance, and it's something the manufacturer should care about when engineering the product, but it's not something the end user should even be bothered with.

What the user should care more about is the general performance of the processor.

IPC is just a number! :P

