A84Gamer

Member
  • Posts

    7
  • Joined

  • Last visited

A84Gamer's Achievements

  1. Hey guys, I've got a computer that's over 10 years old. I'm planning to upgrade, but before I do, I'm curious whether you could help me identify the potential bottleneck between my CPU and GPU when doing renders in Blender. Here are my specs as reported by CPU-Z, followed by some additional thoughts:
     Intel Core i7-920 at 2.7 GHz (Bloomfield), 4 cores / 8 threads
     EVGA Nvidia GTX 750 Ti
     Asus P6T Deluxe Version 2
     24GB of DDR3, triple channel
     Some thoughts worth mentioning: whenever I do a Blender render with the CPU and/or GPU enabled, the activity monitor reports 100% CPU usage, but the GPU can barely reach 25% usage. If I updated my system to a Ryzen 9 5950X with 16 cores and 32 threads, do you think renders would make use of the GPU more effectively, since I'm increasing not only core count but frequency as well? I realize that pairing a 5950X with a 750 Ti is probably laughable, but with the GPU shortage being what it is, that's the scenario I'm preparing for (see the Cycles device-check sketch after this list). Thanks a bunch.
  2. You have all been great and very helpful, thanks a bunch. Since I kept my current PC for over 10 years, I might do the same with this one, so I'm probably going to be investing quite a bit in it and wanted to make sure I didn't waste any money. The only thing I'm really annoyed about is the horrible timing, with the graphics card market being what it is. I currently have a 750 Ti and haven't had any luck with Newegg shuffles. The cost of some of the cards on the OEM sites that don't say out of stock is difficult to accept as well. I watched a Gamers Nexus review of the 3080 Ti in particular, and the price-to-performance increase that Steve pointed out was very discouraging. For what it's worth, to give a little background, I'm going to be using this for 3D art, so if I was even considering a 3080 Ti, I might as well go for the 3090 to get 24GB of VRAM. My understanding from the research I've done is that there is little point in having all those CUDA cores if I don't have the appropriate amount of VRAM to pair with them.
  3. Hi, I've got a few questions about the Zen 3 Ryzen 5000 BIOS update on some motherboards. I'm thinking of building a new PC to replace my current one, which is over 10 years old. My concern is that if I end up with a motherboard that isn't already on the latest BIOS for Ryzen 5000 compatibility, will I even be able to get into the BIOS in the first place to run the update? It's not like I have a B550/X570 board with a Ryzen 3000 already running on it. For that matter, would a B550/X570 board on an older BIOS be able to boot to Windows at all with a 5000 series CPU installed? In other words, would it run, just with the latest Ryzen 5000 features/enhancements disabled on the older BIOS? Thanks in advance for your help, it's much appreciated.
  4. Ah, good catch. Thank you. FYI for anyone else who has this question: I contacted Netgear and the advisor said you can use something like this to extend it as a mesh: https://www.netgear.com/home/products/networking/wifi-range-extenders/EAX80.aspx It's safe to assume that the router you are connecting the extender to must be Wi-Fi 6, so keep an eye out for that. Thanks again.
  5. Hey there, I have the Netgear Nighthawk AX6 6-Stream AX4300 WiFi 6 Router. Is it possible to connect a Nighthawk Mesh Satellite to it so that it extends my network the way a mesh network does? Or is the AX4300 not designed to function as a mesh base station in the first place? I don't want to simply use a range extender, since it doesn't offer the same benefits as a mesh node. Thanks!
  6. My monitor is a Dell 3007WFP-HC. Here's a link to the spec sheet: http://www.dell.com/downloads/emea/general/ultrasharp 3007wfp-hc_en.pdf According to that, the max resolution is 2560x1600 at 60 Hz.
  7. So I did a little research and found my answer... kind of, which brings me here for your advice. It's about that time that I'm thinking of buying a new GPU to replace my GTX 750 Ti, and I thought to myself, what if I just went balls to the wall and bought a 1080? Not that I'll actually go through with that kind of purchase, but academically, I'm curious whether my system could even utilize its full power or whether it would be a waste of money. My first concern was my Asus P6T Deluxe V2, because of its PCI-E 2.0 slot. Below is a link to an article I found which basically said that, the way PCI-E scaling works, even in a 16x 2.0 slot there will be around a 1-2% performance drop at most, which really isn't noticeable (the bandwidth arithmetic behind that is sketched after this list). https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/ However, here's where I get a little worried: the article pointed out that the findings were based on systems running Sandy Bridge or newer, and I'm running Nehalem. So what do you think, would a GTX 1080 be a dumb purchase if paired with the Nehalem-based Core i7-920?
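
For post 1 above, here is a minimal sketch of how to confirm that Cycles is actually set to render on the GPU, run from Blender's Python console. It assumes a Blender 2.8x/3.x build and the CUDA backend (the 750 Ti is a CUDA-capable Maxwell card); treat it as a diagnostic aid under those assumptions, not a definitive fix.

    import bpy

    # Point Cycles at the CUDA backend (assumption: NVIDIA card with CUDA support).
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "CUDA"

    # Refresh the device list and enable every detected device (CPU and GPU).
    prefs.get_devices()
    for dev in prefs.devices:
        dev.use = True

    # Tell the active scene to render with the GPU rather than the CPU.
    bpy.context.scene.cycles.device = "GPU"

    # Sanity check: print what the render will actually use.
    print([(d.name, d.type, d.use) for d in prefs.devices])

The same settings are available in the UI under Edit > Preferences > System > Cycles Render Devices, plus the Device dropdown in the render properties.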
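And for post 7, a quick back-of-the-envelope calculation of why dropping from PCI-E 3.0 to 2.0 barely shows up in the TechPowerUp scaling results: even the older link still moves a lot of data. The per-lane rates and encoding overheads below are the published theoretical figures; real-world throughput is somewhat lower.

    # Theoretical one-way bandwidth of a PCIe x16 slot by generation.
    GENS = {
        # generation: (GT/s per lane, encoding efficiency)
        "PCIe 1.1": (2.5, 8 / 10),     # 8b/10b encoding
        "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
        "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
    }

    LANES = 16
    for gen, (gtps, eff) in GENS.items():
        gb_per_s = gtps * eff * LANES / 8   # usable Gbit/s across the link -> GB/s
        print(f"{gen} x{LANES}: ~{gb_per_s:.2f} GB/s each way")

    # PCIe 2.0 x16 ~ 8.00 GB/s vs PCIe 3.0 x16 ~ 15.75 GB/s; per the linked
    # article, a GTX 1080 rarely needs more than the slower link provides,
    # hence the measured ~1-2% difference.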