Posts posted by maccatackar
-
Just now, PCNoobie said:
Your system looks fine. Is there any driver problem?
I dunno, all my drivers are up to date.
-
1 minute ago, PCNoobie said:
What are your system specs?
i7 4790K
Gigabyte Z97 Gaming G1 WiFi
16GB HyperX 1866MHz RAM
Asus RX 480 8GB
-
So how do I do it? I've seen people do it in videos. Do I have to get an older BIOS for my motherboard?
-
Hey everyone, I'm about to try overclocking my CPU for the first time. My motherboard is an ASRock Z170 Pro4. What are the recommended settings and such for overclocking my CPU?
Would be great if someone could help me.
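As general background (not specific to this board or chip): on these platforms the core clock is the base clock (BCLK) times the CPU multiplier, and a common approach is to raise the multiplier one step at a time, stress-testing at each step. A minimal sketch, assuming an illustrative 100 MHz BCLK and made-up multiplier steps, not recommended settings:

```python
# Rough sketch of how core clock is derived when overclocking.
# The 100 MHz BCLK and the multiplier range are illustrative
# assumptions, not tuned values for any specific CPU.

BCLK_MHZ = 100  # base clock; boards of this era typically default to 100 MHz

def core_clock_ghz(multiplier: int, bclk_mhz: float = BCLK_MHZ) -> float:
    """Core clock in GHz = base clock (MHz) x multiplier / 1000."""
    return bclk_mhz * multiplier / 1000

# Step the multiplier up gradually; stress-test at each step
# before going further.
for mult in range(40, 46):
    print(f"x{mult} -> {core_clock_ghz(mult):.1f} GHz")
```

The idea is just arithmetic: a x45 multiplier on a 100 MHz base clock targets 4.5 GHz; voltage and stability testing are the hard part and depend on the individual chip.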
-
2 hours ago, K0MP4CT said:
Looks good, although unless you're planning on upgrading to an unlocked chip later on, a Z170 motherboard is just adding cost for no reason.
Yes, planning on getting an i7 6700K.
-
28 minutes ago, ybriK said:
Drop the Z170, get either a B150 or H170 and an i5 6400, unless you plan to buy a used 6700K in 2-3 years' time. You can decide if 5-7% of raw performance (which doesn't translate to gaming) is worth the extra spend on a 6500 (imho it's not).
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-6500-vs-Intel-Core-i5-6400/3513vs3512
http://www.cpubenchmark.net/compare.php?cmp[]=2578&cmp[]=2599
Yes, planning on getting an i7 6700K.
-
Hi guys, would I be better off with an AMD FX 8350 or an Intel i5 6400?
-
I am planning on getting an R9 380X to go along with it as well.
-
2 hours ago, El Diablo said:
Don't get that Skylake.
Go for something with way more cores, even an 8-core AMD.
Trust me, when games start using asynchronous scheduling you will thank me.
The Skylake has a higher clock speed, however it can only process, let's say, one instruction per cycle.
The AMD 8-core will do more due to actually having separate registers, control units and arithmetic logic units.
The Skylake won't, so the older AMD (if you are on a budget) will take you way further, especially in DX12 games where every core can now talk to the GPU instead of only one in DX11 and before.
Does that look good to you, man?
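The cores-versus-clock trade-off in the quote above can be made concrete with a naive upper bound: peak throughput roughly equals cores x clock x instructions per cycle (IPC). All figures below are made-up round numbers for illustration, not benchmarks of any real chip:

```python
# Illustrative sketch of the cores-vs-IPC trade-off.
# Peak throughput here is the naive upper bound cores x clock x IPC;
# the clock and IPC values are invented round numbers, not measurements.

def peak_throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    """Naive peak instruction rate, in billions of instructions/second."""
    return cores * clock_ghz * ipc

# Hypothetical 4-core high-IPC chip vs hypothetical 8-core lower-IPC chip:
quad = peak_throughput(cores=4, clock_ghz=4.0, ipc=2.0)  # 32.0
octa = peak_throughput(cores=8, clock_ghz=4.0, ipc=1.0)  # 32.0
print(quad, octa)
```

On paper the two come out equal, which is exactly why the argument is contentious: real games rarely scale across all cores, so strong per-core performance usually wins in practice.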
-
Just now, Alexokan said:
Hard to say from this side of the screen.
You're running 2 monitors? What's your GPU vram usage?
How do I check that?
-
Just now, Alexokan said:
You're really not going to be blown away by that upgrade.
Your current processor is still fairly capable.
But why would I be getting frame drops and stutters when I play games? Is that my GPU?
-
Just now, Ronnie76 said:
To be honest that seems pretty cheap for AUD. Get an H170 board though, to save you money.
I was gonna get that Z170 as it handles faster RAM speeds.
-
Just now, Ronnie76 said:
Yeah. Performance-wise the card is extremely old, but I wouldn't expect a bottleneck; the age of the two components matches. You really should upgrade everything.
If I was to upgrade everything, what's your best choice, a Skylake or a Haswell?
-
Just now, Ronnie76 said:
A 570 is like a 950-960 nowadays.
It's a Gigabyte GTX 570: http://www.gigabyte.com.au/products/product-page.aspx?pid=3685#ov
-
Just now, Ryan_Vickers said:
Generally speaking, you want a GPU bottleneck - that's how most people's systems tend to be, and it results in a smoother experience; CPU bottlenecks tend to cause very erratic frame times and thus stutter.
That's what's happening, frame rate drops and all. I really wanna solve the problem.
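The point about erratic frame times can be quantified: average FPS can look fine while the slowest 1% of frames (the "1% low") reveals the stutter. A minimal sketch using made-up frame-time data, not measurements from any real game:

```python
# Quantifying "stutter": average FPS can hide erratic frame times,
# so tools also report the FPS implied by the slowest 1% of frames.
# The frame-time lists below are invented illustrative data.

def average_fps(frame_times_ms):
    """Average FPS from a list of per-frame render times in milliseconds."""
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_1_percent_fps(frame_times_ms):
    """FPS implied by the slowest 1% of frames (the '1% low')."""
    slowest_first = sorted(frame_times_ms, reverse=True)
    count = max(1, len(slowest_first) // 100)
    slow = slowest_first[:count]
    return 1000 / (sum(slow) / len(slow))

smooth = [16.7] * 100                # steady ~60 fps, every frame alike
stutter = [12.0] * 99 + [120.0]      # higher average, but one long spike

for name, times in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: avg {average_fps(times):.0f} fps, "
          f"1% low {worst_1_percent_fps(times):.0f} fps")
```

The "stutter" run actually has the higher average FPS, yet its 1% low collapses to single digits, which is the erratic-frame-time pattern a CPU bottleneck tends to produce.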
-
Just now, sunil6512 said:
Are you plugged into the GPU? Because if your display is plugged into the motherboard, it will run off CPU graphics and cause high usage.
I have a dual screen setup off the GPU ahaha.
-
Just now, WellHello said:
If it runs at 100%, the CPU is bottlenecking; it's not the GPU's fault. Also, what is the GPU usage?
How do you find out?
-
Just now, HatsuneMilku said:
He forgot to mention he lives in Australia.
Sucks to be me ahaha.
-
Just now, thekeemo said:
Absolutely it can. It can handle almost everything, at least on high, usually better.
I'll do some research and find a cheap one, thanks dude. Most things are expensive for me as I live in Australia.
Black Ops 3 won't run (in PC Gaming)
Didn't think of that.