Reputation Activity
-
Pocketmouse reacted to Vejnemojnen in What cpu to buy?
Well, B250 boards don't support 8th-gen CPUs... they're LGA 1151 v1, not 1151 v2.
-
Pocketmouse got a reaction from Leanora in What cpu to buy?
You have a lot of options to consider. If you're just concerned about CPU speed, you could always go for an i7-7700K, which comes with a price tag of just under $350 (conveniently). But if you're looking for something newer, here's a build I put together for under $350 that will significantly boost your performance as well:
Specs:
CPU: AMD Ryzen 5 2600X $179.99
Motherboard: AsRock B450M Pro4 $75.61
Memory: 16 GB (2 x 8 GB) Corsair Vengeance LPX DDR4 3200 MHz $84.99
Total: $340.59
https://pcpartpicker.com/user/Pocketmouse/saved/#view=9Zmx6h
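As a quick sanity check on the build total, the quoted component prices can be summed directly (names and prices taken from the list above; this is just arithmetic, not a PCPartPicker lookup):

```python
# Sanity-check the build total from the quoted prices above.
parts = {
    "AMD Ryzen 5 2600X": 179.99,
    "ASRock B450M Pro4": 75.61,
    "Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4-3200": 84.99,
}

total = round(sum(parts.values()), 2)
print(f"Total: ${total:.2f}")  # → Total: $340.59
```

Note the listed prices actually sum to $340.59, not $340.49 as originally posted.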
I used DDR4-3200 because Ryzen CPUs perform better with faster memory, and you only had 2400 MHz memory. I also opted for memory instead of an aftermarket cooler, since the 2600X comes with the Wraith Spire cooler; while it isn't top-of-the-line, it's much better than Intel's stock cooler, or (in the case of recent generations) no cooler at all. Of course, you can always swap the memory out for a cooler if you deem that more important.
-
Pocketmouse reacted to Semper in Air Cooler for pc case that has clearance of 163mm
You can usually find the maximum height of tower coolers on the technical specification section of the webpage for your case. Directly compare that to the listed height of the DRP4, and you'll have your answer.
The 275Q, for example, has a max cooler height of 170 mm.
The DRP4 has listed dimensions [L x W x H] of 145.7 x 136 x 162.8 mm, so in that case it fits with a little over 7 mm to spare.
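The comparison above is a one-liner; a minimal sketch using the numbers from the post (the 275Q's 170 mm clearance and the Dark Rock Pro 4's listed 162.8 mm height):

```python
# Does the cooler fit? Compare the cooler's listed height
# to the case's maximum tower-cooler clearance.
CASE_MAX_COOLER_MM = 170.0  # 275Q max cooler height (from the case spec page)
DRP4_HEIGHT_MM = 162.8      # Dark Rock Pro 4 listed height (L x W x H = 145.7 x 136 x 162.8)

fits = DRP4_HEIGHT_MM <= CASE_MAX_COOLER_MM
margin = CASE_MAX_COOLER_MM - DRP4_HEIGHT_MM
print(f"Fits: {fits} (margin: {margin:.1f} mm)")  # → Fits: True (margin: 7.2 mm)
```

For the 163 mm case from the thread title, the same check still passes, but only by 0.2 mm, which leaves no margin for RAM heatspreaders or mounting tolerances.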
-
Pocketmouse reacted to jorenc in How much room should I give my GPU's fans to properly cool it?
Enough room = yes. If you do a vertical mount, it will run hotter, but unless your GPU is clocked to where every degree counts, you'll experience no issues whatsoever. Modern GPUs are great at handling heat. Even if it were pressed up against the tempered glass, you're not even going to get close to 70C.
Imo, the GPU down a slot looks worse than tubes pressing against a GPU in the top slot.
Listen to Luke, "Just do it. It'll probably be ok."
-
Pocketmouse reacted to Mira Yurizaki in How long ago could NVIDIA cards have supported Freesync?
FreeSync, or Adaptive-Sync as it's called in the standard, was an optional feature of DisplayPort 1.2a.
-
Pocketmouse reacted to Princess Luna in How long ago could NVIDIA cards have supported Freesync?
Among the reasons NVIDIA finally allowed FreeSync was probably to keep the GTX 1080 Ti a better product than the Radeon VII; in other words, competition saved the day, as always.
-
Pocketmouse reacted to Stefan Payne in How long ago could NVIDIA cards have supported Freesync?
...the support from the entertainment industry, and the wide availability of FreeSync-compatible displays.
When Samsung announced that their upcoming TVs would support FreeSync, it was over for NVIDIA.