
About Giganizer300PRO

  • Title
  • Birthday May 25

Contact Methods

  • Discord
  • Steam
  • Twitter

Profile Information

  • Gender
  • Location
    Slovenia [EU]


  • CPU
    i5 4690K @4.5GHz OC
  • Motherboard
    MSI Z97-G45
  • RAM
    Kingston Fury X (Red) 1x 8GB 1866MHz
  • GPU
    Asus Strix GTX 970
  • Case
    Fractal Design Arc R2 Midi
  • Storage
    1x Samsung 840 EVO + 1x Seagate Barracuda 2TB
  • PSU
    Corsair CX500M
  • Display(s)
    Samsung S24E390 + LG L1970HQ
  • Cooling
    Cooler Master Hyper 212 EVO w/ extra CM Fan
  • Keyboard
    Razer Blackwidow Ultimate 2014
  • Mouse
    Razer Deathadder 2013
  • Sound
    Sennheiser HD 598

Recent Profile Visitors

1,404 profile views
  1. Interesting. I'm assuming software RAID would work over PCIe...? Is software RAID really as effective as controller RAID? What I mean is: will it be nearly the same in this scenario (8 drives, speeds from 54 to 77 MB/s, hobby project for infrequently used storage), or are there drawbacks? Is it as smooth as having a RAID card? I also found some port multiplier cards, a lot of them at low price points. One converts 1 SATA port into 5 SATA ports and supports FIS-based switching, so could I take this and connect all 8 drives to the motherboard, 5 on the PM and 3 directly, and use the motherboard's integrated Intel RAID or software RAID with the drives? I read that a PM is transparent to the drive, but the host knows of the multiple drives, and with FIS-based switching I could output to all the drives at once. The controller in question is SATA 2, so does that mean each of the 5 drives connected to it could do up to 60 MB/s, or do real-world speeds vary?
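The bandwidth arithmetic behind that last question can be sketched quickly. This is a back-of-envelope estimate only: the ~300 MB/s payload figure (SATA 2's 3 Gb/s line rate minus 8b/10b encoding overhead) and the even split between drives are simplifying assumptions, since FIS-based switching shares the link dynamically rather than partitioning it into fixed slices.

```python
# Back-of-envelope check: aggregate bandwidth behind a SATA 2 port multiplier.
# SATA 2 signals at 3 Gb/s; after 8b/10b encoding that leaves roughly 300 MB/s
# of payload bandwidth, shared by every drive behind the multiplier.

SATA2_PAYLOAD_MBPS = 300   # ~3 Gb/s line rate minus 8b/10b encoding overhead
drives_on_pm = 5           # drives hanging off the port multiplier
drive_speed = 77           # MB/s, fastest drive mentioned in the thread

# If all 5 drives stream simultaneously, each gets at most a fair share
# of the single upstream link.
per_drive_share = SATA2_PAYLOAD_MBPS / drives_on_pm

# A drive can never exceed its own sequential speed, so the effective
# per-drive throughput is the lower of the two numbers.
effective = min(drive_speed, per_drive_share)

print(f"Fair share per drive: {per_drive_share:.0f} MB/s")
print(f"Effective per-drive throughput: {effective:.0f} MB/s")
```

Since these particular drives top out at 54-77 MB/s individually, the shared-link ceiling only bites when several of them stream at once, which is exactly the RAID case being asked about.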
  2. I want to hook up 8 drives to reach a reasonable capacity and higher speeds. There's no way for me to add 8 drives to my system without adding some other equipment/component.
  3. I only have 4 SATA ports free on the motherboard. Even if I wanted to test just 4 drives, I'd have to run out and buy SATA cables.
  4. Thank you for your help. I've looked at some used RAID cards. Is there anything I need to watch out for, or is it more or less a matter of whether it has the necessary ports? Some of the cards I found have SAS SFF-8087 ports. Is there a relevant difference between the two port types (I didn't see one)? Is there anything I should watch out for with other SAS connectors? From the cards I found, this one looks very promising: IBM ServeRAID M5015 (LSI 9260-8i). I also found some others: LSI L3-25121-79D, LSI L3-01144-01D, LSI SAS 9210-8i, LSI L3-01144-10A, 8708EM2. Is this the right direction? I also found an Intel RAID card (Intel SRCS28X). This one has SATA ports rather than SAS (I think). Would this also work?
  5. Wow. 200W at full, or at least semi-full (80-100%), load on both? I knew not to take power calculators too seriously, but that's a big difference; I didn't think they could be that far off. I guess I'll just test and see if it boots. Maybe I'll get some kind of meter to see the actual load.
  6. I have a Corsair CX500M (see full build in spoiler). I'm currently running 3 1080p monitors (2x 60Hz, 1x 75Hz). I have 2x 120mm fans on the CPU cooler and 3x 140mm fans in the case. The CPU is overclocked to 4.5GHz at 1.216V max (both variable, adjusting to load). I overclock my GPU as well, with the power limit maxed out at 120% and core voltage at +25 mV. I'm looking to add 8 hard drives as well as a PCIe RAID card to my system. I used some load calculators, and the extra drives add almost 100W to the estimated load. These drives would be in a RAID array for secondary storage of rarely used files. Here are the drives (11 in total, I would only be adding 8 of these): - 2x WD1600AAJS - 7x ST3160812AS - 1x ST3160815AS - 1x WD1600AAJS My question is: can I get away with adding all these drives and the RAID card to my system without changing the PSU? When the CPU and GPU are under load (gaming), the drives would most likely not be in use (if that's what it took, I'd make sure of it). How much over the PSU's rating can I go and still boot, and how much when under load? Would not stressing the drives at the same time as the CPU and GPU help with the load on the PSU? Would it help if I got rid of one monitor or some fans?
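As a sanity check on that question, the worst-case draw can be tallied by hand. Every wattage below is an assumed ballpark figure for the components described in this build, not a measurement; a wall meter would give the real numbers.

```python
# Rough (assumed, not measured) full-load figures for the build in question.
loads_w = {
    "i5-4690K @ 4.5 GHz (OC)": 110,
    "GTX 970 (120% power limit)": 175,
    "motherboard + RAM": 40,
    "fans (5x)": 10,
    "SSD + existing HDD": 10,
    '8x extra 3.5" HDDs (~8 W each, active)': 64,
}

total = sum(loads_w.values())
psu_rating = 500  # Corsair CX500M

print(f"Estimated worst-case draw: {total} W of {psu_rating} W "
      f"({total / psu_rating:.0%} of rating)")

# Note: simultaneous spin-up of 8 drives briefly draws far more on the 12 V
# rail (often 20-30 W per drive), so staggered spin-up, if the controller
# supports it, matters more than the steady-state total here.
```

Monitors draw from the wall, not the PSU, so dropping one wouldn't change the PSU load; idling the drives during gaming, on the other hand, would directly reduce the simultaneous draw.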
  7. All the EK 1070 (except for the Asus RGB) and GTX Founders Edition blocks seem to be compatible. The card listed in there is the "Inno3D INNO3D GeForce GTX 1070 Ti X2" (lacks the V2 at the end), but it seems like this should be pretty much the same thing. Maybe check to make sure before ordering. Store link: EKWB
  8. I was talking about a dedicated GPU; I should've been more clear. Integrated GPUs aren't exactly the same as dedicated GPUs, since they get their memory from system RAM. Apart from the Ryzen series, an integrated GPU can be taken pretty much for granted in the consumer market these days. And even then, we have the Ryzen APUs.
  9. If it's really only web surfing, then no. You don't need a GPU for web surfing, even if making a beefy computer. Just get a decent CPU and plenty of RAM. If there are some other tasks/uses you think you might need a GPU for, then it depends on what those are. Sometimes integrated graphics on something like an Intel can get you pretty far.
  10. If you want to hit 240 fps at 1080p with everything on low, I wouldn't upgrade at all. I'm assuming the games you play are esports-like and not very demanding. But if you were to upgrade, in your case I'd probably go for a 1080 Ti, get it cheap (used or something), and save some money.
  11. This only works if you enter a key and OEM keys will not work on this site. If you need to download without entering a product key first, you will need a third party tool or website. I found this thing with a quick search, but I've never used it before.
  12. Thank you all for your help so far. I have now checked all the drives. They're not actually all from 2010; some are also from 2008, but most are from 2007. There are 11 of them, all 160 GB, as follows: - 2x WD1600AAJS (2010) - 7x ST3160812AS (2007) - 1x ST3160815AS - 1x WD1600AAJS (2008) They're all SATA 2, 7200 RPM with 8 MB of buffer. They all have listed internal data/transfer rates of 100 MB/s. That should be their max speed (maybe less in a real-world scenario, and not the same for reads and writes), right? My question is, how can I expect these speeds to stack up in a RAID 0 or RAID 10 configuration? In RAID 0, are the real-world speeds of the array actually n (number of drives) times the real-world speed of an individual drive, assuming all drives perform identically? Or, for example, say I took 8 drives and put them in RAID 10: would the performance of the array be 4 times that of a single RAID 1 pair (so if one RAID 1 pair could read at 100 MB/s, could the entire RAID 10 array read at 400 MB/s)? Or are there drop-offs? I've checked my computer, and my motherboard (MSI Z97-G45) supports RAID 0, 1, 5 and 10. It has 6 SATA 3 ports, 4 of which are available. I can fit 9 more drives in my case. It's here that I obviously run into an issue: how can I connect more than 4 drives to my motherboard? I'm trying to get away with this as cheaply as possible. Maybe I could get some "cards" or something that has SATA 3 on one end and multiple SATA 2s on the other? I've looked a bit into this, and if I understand correctly, depending on the configuration of such a thing, I might not be able to communicate with all of the drives at once (which would impact the performance of the RAID array). Another option might be a PCIe SATA card. In that case, would I have any trouble setting up drives connected to that PCIe card and drives connected directly to the motherboard SATA in the same RAID array?
What could I do to keep as much performance as possible (I wouldn't want to bottleneck or slow down the array, because the whole point of this is to get these old drives to work a bit faster)? Are there any other areas or components I need to check, or that could be affected by having this array in the system? Also, in terms of reliability, I might do RAID 10. In any case, this would serve as some extra dump storage for stuff I would rather keep, but would otherwise delete if I was forced to free up space. Even if the drives start dying in a couple of months or a year, that's totally fine. Part of why I want to do this is also to see how it works. That's why I don't want to spend too much to get this set up.
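The scaling question above reduces to simple arithmetic. This is a theoretical sketch: the 0.9 efficiency factor is an assumption standing in for controller overhead and the array running at the pace of its slowest member, and real arrays (especially motherboard or software RAID) can land lower than this.

```python
# Theoretical sequential-read scaling for RAID 0 and RAID 10, with an assumed
# fudge factor for real-world losses (controller overhead, slowest-drive
# limits). All numbers here are illustrative, not benchmarks.

def raid0_read(n_drives, drive_mbps, efficiency=0.9):
    # RAID 0 stripes reads across all n drives.
    return n_drives * drive_mbps * efficiency

def raid10_read(n_drives, drive_mbps, efficiency=0.9):
    # RAID 10 stripes across n/2 mirrored pairs. Many implementations can
    # also balance reads across both halves of each mirror, so n/2 is the
    # conservative floor rather than the ceiling.
    return (n_drives // 2) * drive_mbps * efficiency

print(f"8 drives, RAID 0:  ~{raid0_read(8, 100):.0f} MB/s theoretical")
print(f"8 drives, RAID 10: ~{raid10_read(8, 100):.0f} MB/s theoretical")
```

Either figure would exceed a single SATA 2 link (~300 MB/s), which is why the connection topology (port multiplier vs. PCIe HBA vs. native ports) matters as much as the RAID level for this array.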
  13. If I understand correctly, you have the RAM already purchased. If so, don't worry too much about it. If you're still choosing, go for something faster; 2400 is quite slow even for a budget build. If you can, get faster, but if you already have it, it's fine. Unless there's a specific reason for the NVMe drive, I would recommend dropping it from the build completely. Go SATA, and add an HDD for extra storage (if you need it). With the money you save you can double down on SATA SSD(s), put it towards a monitor, or just keep it, whatever you want. Most everyday users don't need NVMe, but it's a personal-preference thing. As far as the M.2 heatsink goes, there isn't really a reason to worry unless the PC is in an abnormally warm environment or something in the computer is causing it to heat up more than usual (but still, it should be pretty much fine unless there's a heat gun aimed at it). If you want to know more or make sure, I'd suggest looking up the recommended and maximum operating temperatures. As far as the monitor goes, I wouldn't recommend shipping it from a different country. You should buy it locally and eat the added costs. There's just too much risk of it breaking in transport or arriving with something not working properly. And if there's a problem with the panel, repairs usually cost almost as much as buying a new one. You don't want to be shipping it back if it's DOA either. I'm not sure how things are over there, but where I live monitors do have an added cost, sometimes quite extreme. Unfortunately that's just how it is, and gambling on shipping such a big product from somewhere else is not worth it, especially since with shipping and import fees paid, you'll in most cases pay about as much as you would locally.
  14. The platform you have picked here is totally fine. The only issue was the price. If you need a small form factor, that's OK, but if your case supports micro-ATX boards, you'd probably be better off with one of those. I can see why you're concerned with quality control. If you still want to save some of the money, you can get a slightly more expensive mATX board, maybe something with good reviews or from a quality product line. In an ideal scenario, you'd want your RAM to be at least 3000MHz. With a value build you can definitely go lower. Since you already have the 2400MHz RAM purchased, you shouldn't worry too much about it now and just stick with that (if you want more than 8 GB, get another identical kit). If you have a GTX 1080 and an R5 2600, you might want to invest in a better monitor as well. You could say a GPU is only as good as the monitor it outputs to, and with a 1080p 60Hz panel you are very limited. I would recommend a higher resolution or refresh rate, maybe both, depending on the budget. If I understand correctly, you're getting a 1TB WD Black NVMe drive? Where did you find that it runs hot? Putting heatsinks on M.2 drives can be counterproductive: the heatsink also cools the NAND flash, which actually handles writes better when it runs a bit warm, while it's the controller that benefits from cooling (see this for more info). Also, is there a particular reason why you need a 1TB NVMe drive? In most consumer systems it's better to go for SATA SSDs over NVMe to save money, because NVMe doesn't offer much of a performance increase in real-world scenarios (for example, you will boot faster, but the difference in loading most games will be minimal or non-existent). If you have any other questions, please link the entire build if possible.
  15. Do you need an ITX board? If not, I would recommend a standard ATX or micro-ATX board, because it will make things a lot cheaper. Like @fasauceome said, that is quite expensive for an AM4 board, at least for B450. Here's a list of B450 boards, sorted by price. Don't just go for the cheapest ones; go through a few that fit your budget and find something you like. Then make sure it's compatible and has everything you need. There are Biostar boards in there, so maybe don't pick one of those unless you really have to (not that there's anything wrong with them, I just feel safer picking one of the other consumer manufacturers, because Biostar is mostly server gear).