
Ashley MLP Fangirl

Member
  • Posts: 8,255

Posts posted by Ashley MLP Fangirl

  1. 7 hours ago, XNOR said:

    Did you see that this is exactly what Iran told the world and showed off? I can't say I like the LTT clickbait thumbnails, but this one was pretty much accurate. Iran said to have a quantum computer and showed the world this dev board.

    which was an obvious lie, so why clickbait it?

  2. 8 hours ago, RoseLuck462 said:

    Cool vid! I remember reading about these dual CPU MOBOs in PC mags back in the day, still want to build one either intel or AMD!

     

    For the vid itself I think the audio gets too low when Linus whispers…

    build an intel setup. if you want something dual-socket there's TONS of Xeon systems with Supermicro mobos for next to nothing on eBay.

    keep in mind to check the power connectors on those motherboards, since many are made for servers and may not match a normal desktop psu. you might also need some extra fans on things like the chipset heatsink; again, those boards are designed for servers where they get tons of airflow.

  3. i actually have a 2008 Mac Pro, ironically pretty close to that setup, just intel. 2 quad-core Xeon E5462s, DDR2 ECC ram... it actually performs surprisingly well if you adjust your expectations a bit.

    edit: i'm a bit further into the video and i just wanna say, my Mac Pro with just one gpu (a Radeon HD 7770) has no issue playing back even 1080p 60fps youtube lmfao

    as i said, it performs surprisingly well for its age. just shows how much further ahead Intel was at the time.

  4. 15 hours ago, Rarity said:

    thats interesting , cause them used mac pro`s and used working xserves are going for pretty decent  prices nowadays ..

     

    yes they do. i would recommend against the 3,1 though; ideally get a 4,1 or 5,1, since those use DDR3 ECC ram, which is much easier to get a hold of these days than DDR2 ECC.

  5. 34 minutes ago, r00tb33r said:

    I was wondering the same thing, since I was in Mac card business at one point.  I don't believe there was an EFI firmware for Maxwell cards.

     

    It's been many years, I do not remember if Macs boot without an EFI display.

    they do boot without a mac firmware card, you just don't get any display output until the OS and gpu driver load.

  6. 12 hours ago, Rarity said:

    one thing though , was the gpu used in the video one with a mac firmware , or a normal one off the shelf ? 

    that does not matter other than for getting a boot screen. once the OS/drivers load it behaves like any other card. i own a 3,1 Mac Pro and i've had a regular pc GT 1030 working in it.

  7. this video was a fail imo.

    an XServe 3,1 under the hood is effectively a Mac Pro 3,1 in server form. i own a 3,1 Mac Pro, and there's a LOT more that it can do than y'all showed.

    for a start, 10.11 is the latest officially supported macOS version, but it'll run all the way up to the latest release with patchers. with an nvidia card you'd want the High Sierra patcher and install that, since High Sierra is the latest macOS that runs nvidia cards.

    i actually had a 1030 working in mine on High Sierra, so theoretically it'll take any 10-series card.

    furthermore you can very easily boot and install Linux on 3,1 Macs, so you could have done that. all the hardware in my 3,1 Mac Pro works on Linux, so i don't see why an XServe would be a problem.

  8. have you stress tested the system? i mean running Cinebench and the Heaven benchmark simultaneously to see if it turns off?

     

    it could be as simple as a psu that's too weak for the hardware. slamming the machine with a load should show that. 

     

    if it does turn off, run Cinebench or Heaven on its own to see whether it still turns off when only the cpu or only the gpu is under load. if it only dies when both are loaded at the same time, it's likely your psu is too weak to run the cpu and gpu under full load together, like in a game.
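
    if you don't want to install anything for the cpu side, here's a rough python sketch that just slams every core for a few minutes. it's not a benchmark, it only exists to generate load, and it's my own stand-in, not something from the video. run a gpu load like the Heaven benchmark in another window at the same time:

    # rough all-core cpu load generator (python 3, standard library only)
    import multiprocessing as mp
    import time

    def burn(seconds):
        # spin on some float math until the time is up
        end = time.time() + seconds
        x = 1.0001
        while time.time() < end:
            x = (x * x) % 1e9

    if __name__ == "__main__":
        seconds = 300  # 5 minutes of full load
        workers = [mp.Process(target=burn, args=(seconds,)) for _ in range(mp.cpu_count())]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print("done - if the machine is still on, it survived the cpu-only load")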

  9. 18 minutes ago, Gnaah said:

    Hey,

    was just wondering if it is possible to have two monitors plugged into a GPU but only output display to one of those? My thinking is hooking up my computer to my 4k tv for when i want to relax and game on the couch and then when i want to sit at a desk, use my 2k ultrawide monitor. I was wondering though if they made switches that would cut the signal over HDMI/DP depending on where i wanted to game so i wouldn't have to render a monitor i wasnt using.

    any tips/advice would be useful.

    i assume you're running windows. 
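
    if so, you don't necessarily need a hardware hdmi/dp switch: you can just tell windows which display is active, same as the win+p menu does, and the display that's off isn't rendered at all. here's a rough python sketch, assuming windows 10 and its built-in DisplaySwitch.exe (that tool and its flags are my assumption, so check them for your windows version):

    # switch the active display using windows' built-in DisplaySwitch.exe,
    # the same thing the win+p menu calls
    import subprocess

    def single_display(target):
        # target: "internal" = first screen only, "external" = second screen only,
        #         "extend" or "clone" to use both
        subprocess.run(["DisplaySwitch.exe", f"/{target}"])

    single_display("external")  # e.g. send the picture to the tv only

    you can also just pick "show only on 1/2" under the windows display settings, which does the same thing manually.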

  10. 2 minutes ago, CommanderAlex said:

    I've never had issues with not installing RAID drivers as I've never had to use RAID for the many systems I've built over the years, which is one that the OP is asking whether it's necessary to install if they will never use RAID. 

    it's great you've not had issues, but depending on what kind of raid it is you can encounter them. i've had a motherboard with a hardware raid chip; without the driver, the ports connected to that chip wouldn't work at all, even with the drives in non-raid mode.

     

    the point here is to avoid such issues. a driver takes up so little space and so few resources that it's worth installing even if the device isn't used, just to avoid stupid problems down the line.

  11. i don't think you realize how drivers work. if you don't install them, Windows will just not have a clue what to do with that hardware. even if it's dormant and unused, that can cause problems, since the OS has no idea what it even is, so it doesn't know it's sitting unused either.

    you should ALWAYS install ALL drivers on a system, no matter if it's for hardware you're using or not. 

    for example, if you don't use your iGPU but have its driver installed, Windows knows it's there but unused and can put the chip in a low power state. that reduces power consumption and heat output. without the driver, the chip can stay active in a higher power state, because Windows doesn't know how to tell it to power down.
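
    if you want to quickly check whether windows is actually missing a driver for anything, here's a rough python sketch. it uses the third-party wmi package (pip install wmi), which is my own pick, not something mentioned in this thread:

    # list devices windows has no working driver for
    import wmi

    c = wmi.WMI()
    # ConfigManagerErrorCode 28 = "the drivers for this device are not installed"
    for dev in c.Win32_PnPEntity(ConfigManagerErrorCode=28):
        print(dev.Name, dev.DeviceID)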

  12. 1 minute ago, baK1ikan said:

    y---yeah i know. what i'm asking is whether DELETING stuffs will eat your TBW out or not.

     

    i got a Samsung 840 or something whatever. I figured if kept writing a certain amount of data every once in a while, it'd last me quite a long time. what i hadn't realized is, what if i kept deleting stuffs from it

    i don't think so. here's an article explaining it: 

    https://www.ontrack.com/en-us/blog/how-long-do-ssds-really-last

    basically what wears it out is deleting something and then writing new data in its place. that's what's measured by the write cycles myself and others have already mentioned.

  13. as far as i'm aware, not really. SSDs usually have a maximum total bytes written figure in their specifications, which you can look up for your model. you can then use a program like CrystalDiskInfo to see how much you've actually written to the drive.

    for example, here's my OS drive:
    [screenshot: CrystalDiskInfo readout showing total host writes for my OS drive]

     

    and here's what Samsung says:

     

    [screenshot: Samsung's rated write endurance for this drive]

     

    so with almost 30 terabytes written i'm a bit over a third of the way to the cap for guaranteed reliability (there's a quick sketch of that math at the end of this post).

    in any case, you always want backups of your stuff just in case, but this gives you a good indicator.
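
    here's that math as a tiny python sketch. the 30 TB is what CrystalDiskInfo shows for my drive; the rated TBW is just a placeholder number, so look up the real figure for your exact model:

    # back-of-the-envelope ssd endurance check
    total_written_tb = 30   # what CrystalDiskInfo reports for the drive
    rated_tbw = 80          # PLACEHOLDER - use your drive's spec sheet number

    used = total_written_tb / rated_tbw
    print(f"{used:.0%} of the rated write endurance used")
    print(f"roughly {rated_tbw - total_written_tb} TB of guaranteed writes left")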
