
Aw_Ginger_Snapz

Member
  • Posts

    96
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About Aw_Ginger_Snapz

  • Birthday Jan 13, 2000

Profile Information

  • Gender
    Male
  • Interests
    Computers...(duh), Drumming, Music, College, Engineering.


  1. Hey everyone, I just have a quick question regarding a very weird config for Optane. Here's the situation: I have an Acer Nitro 5 laptop (Model No. AN515-51-55WL) with an i5-7300HQ and a GTX 1050 Ti. It came with a 256GB Intel SSD (some weird OEM SKU). There is nothing wrong with the stock SSD other than it's just too small. When I got the laptop, the first thing I did was add a 1TB Seagate FireCuda hybrid drive and upgrade it to 16GB of RAM. Four years later, both drives are getting a bit full. Because of the size of the main SSD, I have had to move a lot of the programs I use on a daily basis, such as SolidWorks, the Autodesk suite, Hopsan, and more, and the time to launch those programs is just way too high now. Let's just say I can make a cup of coffee in the time it takes some of them to launch. Also keep in mind that I don't do any gaming on this machine. So here's my question: I have an Intel Optane drive that I salvaged from an 8th-gen Dell ultrabook, and I know it's perfectly functional. It's an Intel Optane Memory H10 with Solid State Storage (P/N: 019RWR), meaning it's one of those weird OEM-only SKUs that puts both Optane and an SSD on one M.2 drive. Does anybody know off the top of their head if this drive would be compatible with this laptop? The Intel website just says Optane will work with 7th-gen CPUs onwards, which is pretty broad. It would be a massive waste of time to tear apart this laptop only to find out it doesn't work, but I would like to use this drive because it would cost me nothing. Is it possible to use the SSD portion of the drive with Optane disabled? I know the obvious answer is that it's not compatible and that I should just go buy a larger SSD or HDD for the laptop, but this thing is so close to the end of its useful life that I'd rather invest in a new laptop than put more money into it, especially since it struggles with large assemblies. Also, let me know if you're aware of any good (and preferably free) SSD cloning software (there's a rough sketch of what I mean by cloning after these posts). Thanks for your time. :D
  2. The Phenom II X6 you mentioned was at a fairly good price. If you're on a budget, I suggest you upgrade your CPU, especially since the motherboard and memory are good. Plus, there wouldn't be a lot of additional cost.
  3. Her boot drive is an SSD, but it's running out of space, and I was planning to upgrade her HDD anyway. The reason for the upgrade is that I need her current hardware for other machines. Also, even before I put the questionable hard drive into that machine, I was having stability issues at random times: a fresh copy of Windows would randomly hard-reboot for no apparent reason. This isn't so much of an issue now, however.
  4. I thought the same thing, but there is little to no information on this topic. I know that the old Socket FM(x) platform was pretty hit or miss with multi-display setups.
  5. Hey peeps, I am thinking of upgrading my mom's desktop that she uses for her business. The reason is that I need some of her current hardware for different computers/applications, and I have also had some stability issues with the platform. For the replacement, I was looking at the Ryzen 3 2200G because of its unbeatable price at Microcenter at the moment, and I am very happy with my experience with 1st-generation Ryzen. Because my mom isn't gaming, I was thinking of going with a cheap A320 motherboard such as the Gigabyte GA-A320-S2H. This motherboard has the two connectors I need for the two monitors she uses (DVI-D and VGA; monitor details in the spec list below). The hurdle is figuring out whether this motherboard (or even the platform) has multi-monitor support. So my question is this: does anyone know if Ryzen APUs support multi-display setups? Her workloads typically consist of video chatting (Skype and WeChat), web browsing (Firefox with an ad blocker for security reasons), content viewing on websites such as YouTube and Netflix, editing Word documents, and managing payroll and taxes through Excel, QuickBooks, and TurboTax. Her current specs are as follows: AMD FX-8320E (stock clocks); MSI GMA720-FX (I could be wrong on the model; this board is a P.O.S. and needs to go regardless); 8GB of HyperX DDR3-1866 (2x4GB DIMMs); EVGA GTX 750 Ti SC; an OCZ SSD; a very questionable 500GB HDD (which I plan to replace with a 1TB FireCuda SSHD); Windows 10 Pro; and two Dell UltraSharp P170S monitors, each with 1x VGA and 1x DVI-D Dual Link connector. The reason for this configuration is that the CPU, motherboard, and memory were on sale at Microcenter at the time, the monitors were free from my dad's job (they were just throwing them away), and the GPU was my old graphics card, which I have since upgraded. The parts I mainly want out of the old setup are the CPU, RAM, and GPU.
  6. If money is no object, then sure, go for it. Something to keep in mind, however, is that AM4 will continue to see new CPUs released until 2020. If I were you, I would just wait until the next-generation Intel/AMD CPUs come out. Considering how Intel screwed over anyone who bought Skylake/Kaby Lake by making Coffee Lake incompatible, they might do it again with the next generation they bring to the table, limiting your upgradeability on a future platform. That is, unless money really is no object for you and you have no problem buying the best you can get every generation. I would just stick with the R5 1600, so if/when it becomes obsolete, you can upgrade to a future AM4 CPU with a little more kick without the hassle of buying a new motherboard and cooler (if you're using the stock cooler, that is).
  7. Do you have a Microcenter near you? They have some killer deals on CPU/mobo bundles! Honestly, I would go with either a better Gigabyte board or an MSI board. My Asus B350 Prime got bricked because something went bad with the BIOS. After that random death, and the horrendous experience of sending it in for a replacement, I don't recommend Asus motherboards anymore. I bought a Gigabyte AB350-Gaming motherboard because of its "Dual BIOS" feature. Honestly, the board is very barebones and lacks quite a few features compared to the Asus one. I would buy either the higher-end Gigabyte AB350-Gaming 3 (a little more expensive than the AB350-Gaming) or an MSI B350 Tomahawk. Something to keep in mind is that Gigabyte's software in Windows is HORRIBLE! It feels like driving an old beat-up car, because you never know if/when it is going to crash. This goes for most of Gigabyte's software in general; it's all horrible. The interface is clunky and looks like one of those junkware programs like "PC Optimizer Pro" with its "Install Center". Hell, it reminds me of preinstalled bloatware on an OEM system, like HP Support Center. If that wasn't enough, the install disk asks if you want to install Norton Anti-Virus. What a joke! And good luck controlling fan speeds or changing fan profiles in Windows, because the fans run at whatever speed they feel like! Bottom line: they need to hire better software developers.
  8. While yes, ASICs use many of the same components, the silicon architecture, power design, and memory footprint can all vary. Like I said in the original post, this is all a speculated scenario; I don't know what kind of architecture either company would use, so the only answer I can give is "it depends". The most limiting factor would most likely be the memory. After all, we're in a global DRAM shortage because of the mobile phone market.
  9. No. The idea is for these ASIC machines to be a separate device from a GPU, not for AMD/Nvidia to stop selling regular GPUs. The point is for them to make ASICs that encourage miners to stop buying GPUs and instead buy a standalone device that does nothing but mine cryptocurrency, lowering the demand for GPUs and making them cheaper as a result.
  10. I agree. I just can't see AMD consciously continuing to compete in the graphics card market if they continuously lose money there. After all, why would a business take R&D money away from a market it can compete in (CPUs) and instead invest it in a market it can't compete in (GPUs)?
  11. If you can get a used RX 480 at a good price (which won't be hard to find once cryptocurrency mining inevitably crashes), and you don't have to go out of your way for a motherboard or PSU to set it up, then you should consider it. My experience with Crossfire has been pretty solid, although I can't speak for anyone else. I used to run a Radeon HD 7970, then bought a second one three years later at a great price for Crossfire. Some games didn't scale well, but most of them scaled very well. About two years later, I upgraded to a GTX 1070 because I had two other computers that needed GPUs, so I no longer run them in Crossfire. Honestly, I wouldn't listen to anyone who hasn't experienced SLI or Crossfire first-hand, because there's more to it than just scaling and raw performance. In my opinion, it's the utility and versatility of SLI and Crossfire that make them worth your time. They're not good for future-proofing a system, but they're a great, cost-effective way of keeping a system relevant after a few years. Like I said earlier, when you eventually upgrade to a more powerful single card years from now, you can keep your two RX 480s around and use them in other machines that need a video card for light gaming, video playback, productivity applications, or just a boost over integrated graphics. The point is, if you already have a motherboard and PSU that support Crossfire, and you hold off on buying the second card until later, then you may definitely want to consider Crossfire on the RX 480. Remember, there's no rush in your case, because the RX 480 is still a very good card right now. I would just wait for cheap used cards to show up before you Crossfire.
  12. The only way I could recommend doing this is if you can get the second card for an insane deal and then immediately resell it for a profit thanks to the cryptomining craze. Other than that, you're wasting your time.
  13. I just hope that whatever profits AMD makes from the crypto craze will be enough to sustain them and let them reinvest enough into R&D to keep up. Not gonna lie, Vega was pretty underwhelming in gaming performance. I'm sure this will sound all too familiar, but let's hope the next architecture (Navi) won't disappoint. That said, I love AMD cards for their feature set: FreeSync, Crossfire, and their driver software, for instance. As someone who already has a GTX 1070, I cannot wait for the used market to be flooded with cheap AMD cards so I can finally use FreeSync on my monitor. I have no loyalty to either Nvidia or AMD; I just don't want market share to become so polarized to either side that we end up with a monopoly. (For those of you asking why I bought a FreeSync monitor if I own an Nvidia card: I got a good deal on a 144Hz monitor that just happened to have FreeSync, and I couldn't afford the extra $200 for a G-Sync display.) I'm gonna break this down like a 5th-grade presentation: why don't Nvidia or AMD invest in developing ASICs? They could have a foothold in both markets, and miners wouldn't take as many cards away from the gaming marketplace. They have enough money to invest in R&D, plus the technical expertise in silicon architecture, power efficiency, software design, and marketing. Their brands could appeal to casual miners who would otherwise buy off-the-shelf GPUs, especially if they developed an intuitive software interface to manage the ASIC hardware. Finally, the used market would not be flooded with used GPUs after a crypto crash. A flooded used market would harm both AMD and Nvidia, because people would buy used cards instead of new ones, hurting both companies' income and taking away money that could otherwise be reinvested into R&D for new products. And if the used market doesn't get flooded, consumers aren't harmed either: new GPUs would stay affordable because miners would buy the ASICs instead of ordinary GPUs, preventing the hyperinflation of graphics card prices we see today. This is vastly oversimplified, very theoretical, and has no chance of actually happening. But it's fun to dream...
  14. This reminds me of the days when you could get Socket LGA 771 Xeons to run on Socket LGA 775 motherboards with the help of a little tape from eBay... The good ol' days...
  15. If you're on a budget, I'd recommend the Gigabyte AB350-Gaming. It's what I'm rocking now, and it's been rock-solid for pretty much everything. I got my R7 1700 to 4.0 GHz, no problem; I couldn't even do that on my Asus X370 Prime board.
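
Regarding the SSD cloning question in post 1: here is a minimal sketch of what block-level cloning software boils down to, assuming Linux-style raw device access. The device paths are hypothetical placeholders, and real free tools (Clonezilla, for example) add partition handling, resizing, and error recovery on top of this.

    # Toy block-level clone: copy raw bytes from one disk to another.
    # WARNING: the device paths are hypothetical placeholders; pointing
    # this at real disks overwrites everything on the destination.
    SRC = "/dev/source_disk"       # hypothetical source device
    DST = "/dev/destination_disk"  # hypothetical destination device
    CHUNK = 4 * 1024 * 1024        # copy in 4 MiB chunks

    with open(SRC, "rb") as src, open(DST, "r+b") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:          # reached the end of the source
                break
            dst.write(block)

A straight block copy like this requires the destination to be at least as large as the source; cloning a smaller drive onto a larger one leaves the extra space unpartitioned until you expand it afterwards.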