
Man

Member
  • Content Count

    159
  • Joined

Awards


This user doesn't have any awards

About Man

  • Title
    Member
  • Birthday 1990-09-13

Contact Methods

  • Reddit
    u/Devgel
  • Twitter
    @MrDevgel

Profile Information

  • Gender
    Male

System

  • CPU
    Core 2 Duo E8400
  • Motherboard
    Intel Q35 Express
  • RAM
    2 x 2GB DDR2 800MHz
  • GPU
    Nvidia GT440 (1GB)
  • Case
    Dell Optiplex 755
  • Display(s)
    Asus VH222
  • Keyboard
    Dell SK-8135
  • Operating System
    Windows 8.1 Pro


  1. I'll look into it, thanks!
  2. The card in question is an Inno3D iChill GTX 770 HerculeZ X3 Ultra (quite a mouthful, I know!). It appears to be in immaculate condition but has only 2 Gigs of vRAM, which is problem number 1. Problem number 2 is the system. I've an old Dell T3500 (X58), and while it has a dedicated 6-pin and a strong enough triple-rail PSU (525W), this power-hungry beast requires both an 8-pin and a 6-pin power connector, so I'll be using a twin-Molex to 8-pin converter cable. Another problem is the length. This GPU is almost exactly a foot long and I'm not sure I'll manage to squeeze it inside that chassis. Lastly, the T3500 doesn't have any exhaust fan, while this card has a TDP of 230W at peak, possibly more since it's factory overclocked. The games I'm primarily interested in are Hitman 2, RE2 Remake, Fallout 4 and "The Outer Worlds", preferably at 60FPS @ 1080p. Sure, the vRAM is a HUGE limitation, but I won't mind playing at medium texture resolution. So can anyone with a Dell T3500 shed some light on this topic? Thanks in advance.
  3. SLI is pretty much dead, mate. I was puzzled when I first heard of SLI (and Crossfire), and even more puzzled by the way it disappeared. SLI was the staple of any premium gaming machine only a few years ago, and now no one even talks about it! I distinctly remember seeing a video on YouTube in which a fellow demonstrated the power of twin GTX950s and went on to boast about how SLI is here to stay, how it's the future of PC gaming, how haters are gonna hate and so on. And to be honest, I kinda agreed with him. Back then, twin GTX950s in SLI weren't just comparable to a GTX970; they were also cheaper (by $100 or more) and even a bit more energy efficient. Not to mention, almost all modern games of the time were well optimized for dual GPU setups (Fallout 4, GTA-V etc.). It would be nice if one could squeeze the performance of an RTX2060 Super out of two 1650 Supers in SLI!
  4. Yes, probably. But it isn't necessarily a bad thing considering the 2060S can push most games at 100FPS. If you want, you can squeeze out a dozen or two more frames by going easy on anti-aliasing. AA is one of the biggest GPU resource hogs, and you can easily minimize the GPU bottleneck by disabling it altogether. Less load on the GPU per frame = more frames per second = more work for the CPU = less of a GPU bottleneck. AA @ 1080p is pretty much worthless in most cases and I, for one, tend to disable it completely. But then again, my eyesight isn't as good as it used to be!
  5. Halogen lights are a brilliant idea! EXTREMELY cheap and MUCH safer than resistors, which tend to go boom! I found 2 x 100W 12V halogen bulbs going for $2 each: even cheaper than resistors, and I won't have to build a heatsink for them either. Solved!
  6. 150W is by no means the official limit of a 6-pin. I shall never buy into the "6-pin = 150W" nonsense unless someone gives me a good reason for it. I admit I'm by no means a PSU expert or an electronics engineer, but you don't have to be an expert to see the point. I've seen a solo Molex running at 8 amps on 12V without overheating, and that's just 1x 12V wire paired with 2x GND wires. A 6-pin, on the other hand, has 3x 12V wires paired with an equal number of GND wires. The extra two pins you see on an 8-pin are both GND wires as well (there's no additional 12V), which are (probably) there to reduce circuit resistance and in turn improve the current, and therefore the wattage, available at a given voltage (Ohm's Law), or at least that's what my admittedly limited knowledge suggests. And if a solo Molex can supply 96W, I'm sure a 6-pin will manage to push at least 130W or more easily, plus anywhere from 45-65W from the PCIe slot itself. Most GPUs with external power draw around 55W from the slot, or at least that's what their BIOS entries suggest. I think a 6-pin should be able to handle a ~180W GPU, no problem. (The rough arithmetic is worked out in the first sketch after this post list.)
  7. @huricanxl Precision T3500s are proprietary semi-ATX workstations. You can't just buy a random ATX-compatible motherboard and squeeze it inside; it simply won't fit. And even if you somehow manage it, the front I/O connector is 100% proprietary, and the dual intake fans, which also serve as CPU fans, will pose a major problem as well. And I'm just scratching the surface here. (I own a T3500 myself, so trust me on this one!) None of the online power calculators take the different rails into consideration, so they're useless for the most part. A 15K HDD from Seagate is rated at just 10W (5V 0.8A + 12V 0.5A); worst case is ~40W. An X58 motherboard is rated at 28.6W as per Intel; call it ~40W, mostly on 5V and 3.3V. A single DDR3 RAM stick is rated at 2.5W and the T3500 has 6 triple-channel RAM slots; call it ~30W for good measure. The twin intake fans are rated at 12V 0.9A; let's just say they draw ~30W. Optical drives draw ~20W at max RPM (a Google search suggests 1.2A @ 12V + 0.8A @ 5V = 18.4W); call it ~30W. The Xeon X5690 is rated at 130W; call it ~150W. Pair it all up with a power-hungry (but not too crazy) ~200W GPU. Total (the absolute worst case): 520W. Total (realistic worst case): ~446W. Your system will draw 446W only IF your HDDs, DVD, CPU, GPU, fans and whatnot operate at 100% load SIMULTANEOUSLY, which is next to impossible! That's not how modern systems operate, thanks to the dynamic voltages of the CPU and GPU. (The padded figures are tallied in the budget sketch after this post list.) P.S. I wouldn't put a high-TDP GPU in there, thanks to the complete lack of exhaust fans, except for the one inside the PSU. A <150W GPU is more its forte; something like a 1660 Super or an RTX2060 Super, tops. Never underestimate case airflow. Hope it helps.
  8. It's something that's hard to 'predict' without looking at the amperage distribution between the rails and voltages. The total rated output (300W in your case) is just one side of the picture. Take a look at these quote/unquote "450W" PSUs below. One's 12V rail is rated at 7A, the other's at 10A, so they can provide just 84W and 120W respectively @ 12V! Remember that the CPU and GPU, the two most power-hungry components in a modern ATX system, draw their power from the 12V rail(s). So... First, check the voltages at idle with a multimeter. All three main voltages (3.3V/5V/12V) should be within ±5% tolerance. If everything's okay, connect the PSU to the system, stress the CPU and test the voltages again. The reason you have to test again is voltage dip or 'sag' (look it up on Wikipedia for more info): voltage tends to drop gradually as the components start to draw more and more power. If the PSU passes both tests then it's 'most likely' fine. Note that I said 'most likely', as opposed to 'absolutely'. You also have to consider AC ripple voltage, which can only be measured with specialized tools, such as an oscilloscope or a high-speed (2MHz or more) multimeter with true RMS. (The rail wattages and tolerance windows are worked out in a sketch after this post list.) Hope it helps.
  9. Can you suggest a good, cost-effective dummy load with a power draw of 200-ish watts @ 12V?
  10. So, I've a couple of old PSUs lying around and I want to stress test them to see if their voltages stay within tolerance under low to moderate load. Mostly just for fun, although I might end up using them in some sort of project. In any case, I'm thinking about ordering a 100W 1.5Ω power resistor from Motherland China and connecting it to the 12V rail, which would put a theoretical load of ~96W @ 8A on a single Molex, but I'm not sure if that's going to be safe. Another option is to buy two resistors, each rated at 100W 2Ω, and connect them to Molex connectors on different rails, which sounds a little better and safer to me (72W x 2 = 144W @ 12V). (The Ohm's-law numbers are worked out in the dummy-load sketch after this post list.) The problem is that I haven't done anything like this before, so I need some expert advice. I'm by no means an electronics engineer, not even close! Just some crazy hobbyist playing around with old PSUs. I know I shouldn't but I can't help myself. Thanks in advance.
  11. I see. So they only work with X58 and X79 based motherboards?
  12. Okay, so I've done a little bit of research over the past hour and it appears that memory controllers in modern CPU architectures don't rely on the motherboard at all. They are entirely CPU-dependent, since the controller itself sits on the CPU instead of the northbridge. And since the motherboard officially supports Xeons, I think it should be able to handle ECC memory as well, right? My apologies for all these stupid questions. I'm totally clueless when it comes to system memory! Thanks!
  13. So I'm building a cheap dual-monitor setup for my friend and I'm a bit confused about the ECC vs. non-ECC situation. First, here are the specifications: CPU: Xeon E3-1220 (Sandy Bridge, Socket LGA-1155). Motherboard: Gigabyte GA-H61M (Rev. 1.2). GPU: Nvidia Quadro 600. Plus HDD, SSD, Wi-Fi card, etcetera. Memory: ? Here's the problem: while the motherboard officially supports some Sandy Bridge Xeon CPUs (including the E3-1220) according to the user manual, there's no mention of ECC memory anywhere. On the other hand, Gigabyte's official website mentions only this: I find it rather vague, to be honest. I wonder if it means the motherboard supports non-ECC memory "as well"? I mean, almost all motherboards support non-ECC memory, so why mention it?! In any case, here's the RAM I'm interested in: I think you can understand why I'm debating this! <$30 for 16 Gigs of RAM is quite tempting, to say the least. For comparison, non-ECC memory costs over twice as much, i.e. $60+ for 2 x 8GB 1,333MHz modules. TL;DR: Do you think ECC memory sticks will work with a Sandy Bridge Xeon paired with a cheap H61 motherboard? Thanks in advance!
  14. If you mean my favorite game on the Sega Genesis console, then the answer is this: if I had to pick either this or the legendary 'A Link to the Past', I'd pick Beyond Oasis any day! Superb graphics, excellent hack-and-slash gameplay, lots of weapons and combos, enormous bosses, knights in shiny armor, zombies, witches, puzzles... this game has everything! Of course, the stars of the show are the four spirits (fire, water, shadow and plant) which you can summon to make your adventures easier. And the ways you can summon them are just... well, let's just say it took my breath away as a kid! For example, you can summon the fire spirit from any fire source, be it torches, sparks, campfires, fire bombs, whatever! Likewise, you can summon the shadow spirit from any reflective surface like mirrors, shiny armor, rubies, ice, anything! The same goes for the water and plant spirits. It's a shame most people in the Western world aren't even aware of its existence.
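
Below are a few rough sketches (in Python, since the posts themselves contain no code) of the arithmetic referenced above. First, the connector maths from post 6: the 8A-per-contact figure is the Molex observation from that post rather than an official rating, and the 45-65W slot range is likewise the post's assumption.

```python
# Connector-capacity arithmetic from post 6 (assumptions, not official specs).

RAIL_V = 12.0             # +12 V rail
AMPS_PER_CONTACT = 8.0    # current seen on a single Molex 12 V wire in the post

molex_w = RAIL_V * AMPS_PER_CONTACT        # one 12 V contact  -> ~96 W
six_pin_w = RAIL_V * AMPS_PER_CONTACT * 3  # three 12 V contacts at the same per-contact current
slot_low, slot_high = 45.0, 65.0           # power the post assumes the PCIe slot provides
cautious_six_pin_w = 130.0                 # the post's deliberately low 6-pin estimate

print(f"Single Molex 12 V wire: ~{molex_w:.0f} W")
print(f"6-pin (3x 12 V contacts) at the same per-contact current: ~{six_pin_w:.0f} W")
print(f"Cautious 6-pin estimate + slot: ~{cautious_six_pin_w + slot_low:.0f}"
      f"-{cautious_six_pin_w + slot_high:.0f} W")
```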
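
Next, the padded worst-case budget from post 7, tallied as a quick sketch. Every per-component figure below is the rounded-up number used in that post, not a measurement, so the total simply reproduces the ~520W absolute worst case.

```python
# Padded worst-case power budget for a Dell Precision T3500, using the rounded-up
# figures from post 7. Each number is already padded above the component's rated draw.
budget_w = {
    "HDDs (15K Seagate, ~10 W rated)":     40,
    "X58 motherboard (28.6 W per Intel)":  40,
    "6x DDR3 sticks (~2.5 W rated each)":  30,
    "2x intake fans (12 V, 0.9 A each)":   30,
    "Optical drive (~18.4 W rated)":       30,
    "Xeon X5690 (130 W TDP)":             150,
    "GPU (power hungry, but not crazy)":  200,
}

total_w = sum(budget_w.values())
for part, watts in budget_w.items():
    print(f"{part:<40} {watts:>4} W")
print(f"Absolute worst case, everything at 100% at once: {total_w} W")
```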
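
Then the arithmetic behind post 8: the wattage a 12V rail can actually deliver for a given amp rating, and the ±5% windows the multimeter readings should stay inside at idle and under load. The 7A and 10A ratings are the ones quoted in that post.

```python
# Rail wattage and voltage-tolerance arithmetic from post 8.

def rail_watts(volts: float, amps: float) -> float:
    """Power available on a rail: P = V * I."""
    return volts * amps

# The two "450 W" units from the post, with 12 V rails rated at 7 A and 10 A.
for amps in (7.0, 10.0):
    print(f"12 V rail rated at {amps:.0f} A -> {rail_watts(12.0, amps):.0f} W")

# +/-5% windows a multimeter reading should stay inside, both at idle and under load.
for nominal in (3.3, 5.0, 12.0):
    low, high = nominal * 0.95, nominal * 1.05
    print(f"{nominal:>4.1f} V rail: acceptable range {low:.2f}-{high:.2f} V")
```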
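
Finally, the dummy-load numbers from posts 5, 9 and 10, worked out with Ohm's law. The halogen "hot resistance" entry is my own addition, derived from the bulbs' 100W @ 12V rating; a real filament's resistance varies a great deal with temperature, so treat it as a ballpark only.

```python
# Ohm's-law arithmetic for the 12 V dummy-load ideas in posts 5, 9 and 10.

def load_at_12v(resistance_ohms: float) -> tuple[float, float]:
    """Return (current in amps, power in watts) for a resistive load on 12 V."""
    volts = 12.0
    amps = volts / resistance_ohms           # I = V / R
    watts = volts ** 2 / resistance_ohms     # P = V^2 / R
    return amps, watts

loads = [
    ("100 W 1.5 ohm resistor on a single Molex", 1.5),
    ("100 W 2 ohm resistor (one of two, on separate rails)", 2.0),
    ("100 W 12 V halogen bulb (hot resistance ~ 12^2/100)", 12.0 ** 2 / 100.0),
]
for label, ohms in loads:
    amps, watts = load_at_12v(ohms)
    print(f"{label}: {amps:.1f} A, {watts:.0f} W")

# Two 100 W bulbs in parallel give roughly 200 W at ~16.7 A total,
# which lines up with the ~200 W @ 12 V target from post 9.
```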