fordy_rounds

Everything posted by fordy_rounds

  1. It'd also make it easier to find the results, as we can just bookmark the one sheet.
  2. This depends (at least in part) on the software. Can the software's simulations be distributed across two systems? If so, that certainly seems reasonable - although remember it's not just CPU/MB/RAM costs; PSU, GPU, and storage costs have to be factored in as well (depending on your needs you may be able to get away with a cheap GPU in at least one of the systems, but it's still an added $50-100 over the cost of a single system).
  3. Thanks so much! I had thought about doing this (I had poked around at the advanced control a little bit) but wanted to see if there was a better way. I've got it set now to Full power, so the GPU is going strong, with the CPU limited to 6 threads and coming out around 80C. I'll experiment with adding one thread at a time and see what happens, but I don't really want to go above 80 for long periods of time.
  4. So, with folding month going on, of course I want to fold as much as possible. Unfortunately, my CPU (R5 3600) doesn't want to cooperate - on the Medium setting, temps are getting near 90C and my case fans are really loud. (I have a better CPU cooler in the mail, but for now I'm on the stock cooler.) When I set the slider to Light, temps (and volume) are much more reasonable. However, on Light, the GPU only folds when the system is idle; since I'm not using my GPU heavily (mostly just Chrome), it seems like I'm wasting the GPU's power. Is there any way to tweak the settings so that the CPU stays on Light but the GPU still folds all the time?
  5. I'm not affiliated with AMD, so I can't give a definitive answer, but: 7/12/14nm is a rough label for the transistor size and has nothing to do with die size; I suspect the Zen+ dies used in the 2700X pack more transistors into roughly the same size die as the Zen dies used in the 1200. This actually makes sense, as it would allow AMD to use the same tooling for die-cutting and packaging. Since your eyes can't see at the 12-14nm level, the dies end up looking the same. Good point, I hadn't thought about Athlon....
  6. They look identical because, on a macro (visible) scale, they are. The manufacturing process isn't perfect; after manufacturing, each die is thoroughly tested before being packaged into what you buy as a CPU. Zen 2 (as an example) uses 8-core chiplets, with one or two chiplets per package on desktop parts. When the chiplets are tested, here are a few (simplified) examples of what happens: if all 8 cores work properly, the chiplet can be packaged as an 8-core CPU (such as a 3700X), or paired with another good chiplet for a 16-core CPU. If only 6 cores work properly, it's sold as a 6-core CPU (such as a 3600), or paired with another 6-core chiplet for a 12-core CPU. If only 4 cores work properly, it can become a 4-core CPU. If fewer than that work, the die gets thrown out. So really, yes, it doesn't cost any more or less to produce the cheaper CPUs (although the high-end 12- and 16-core CPUs do cost more, as they have more chiplets); in fact, producing the cheaper CPUs reduces the overall manufacturing cost for all CPUs, since it means less waste (without the cheap parts, only the perfect chiplets would be usable and everything else would get thrown out).
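Purely as an illustration of that binning idea, here's a toy Python sketch (the core counts and product tiers are my simplification for the sake of the example, not AMD's actual binning rules):

```python
# Toy sketch of binning: every chiplet comes off the same production line,
# gets tested, and the number of working cores decides which product it
# can feed.  The tiers below are a simplification, not AMD's real matrix.

def bin_chiplet(working_cores: int) -> str:
    if working_cores == 8:
        return "8-core CPU, or paired with another good chiplet for a 16-core CPU"
    if working_cores >= 6:
        return "6-core CPU, or paired for a 12-core CPU"
    if working_cores >= 4:
        return "4-core CPU"
    return "scrapped"

for cores in (8, 7, 6, 5, 3):
    print(f"{cores} working cores -> {bin_chiplet(cores)}")
```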
  7. Oh, that is odd. On mine, the first drive still booted fine on its own, so I just had to do a clean install on the second drive. I'm not sure how to help, then.
  8. Don't bother with bcdedit. I had a similar issue and it never fixed it (and at one point it left the machine needing my installer USB just to boot, lol...). Your best fix, as mentioned by others, is to plug the new SSD (and nothing else) into a computer (desktop or laptop, shouldn't matter) and use the USB to reinstall.
  9. You buy a USB ARGB hub. Do a search on Amazon or Newegg; there are a few models out there. They usually have an internal USB connector for control.
  10. Samsung does not "own the rights" to USB-C; it's a standard from the USB-IF, which is a multi-company consortium (and Apple has a person on its board of directors!). But yes, Apple would definitely get less money, since you can buy USB-C cables without "Made for iPhone" certification but not Lightning cables (and companies pay Apple a ton for MFi certification). And while I'm not certain this is still true, I know Samsung used to produce the screens for iPhones, so they got some of the money from every iPhone sold anyway.
  11. There might be others, but I use Ninite.
  12. I have a set of 3 fans already ordered; 2 will go as front intake, one will replace the current fan as rear exhaust.
  13. Agreed, but I can plan in parallel. Right now I have a single 8GB stick of 3200MHz CL16 (it was what I could afford at the time, since I also had to get the motherboard and CPU... I was upgrading from a 10-year-old box with DDR3 RAM). I'm planning to get a matching stick ($30 on Newegg) next month when my budget resets (my wife and I keep each other on fairly strict budgets, which is sometimes annoying, but it keeps me from blowing a paycheck on parts and keeps her from blowing a paycheck on books, so it works out). So that is definitely already being planned for.
  14. I need help making a decision. I'm thinking about using this bracket to vertically mount my GPU; it's the only bracket I've found that stands any hope of fitting into my 4-slot Silverstone PS15. But I can't decide, so here are the pros and cons I've come up with.
      Pros of vertical vs. cons of horizontal:
      • I can see the front of the GPU, vs. only seeing the side and backplate. (My card has no RGB or anything, but I'd still like to see the design of the front instead of a plain black backplate.)
      • Short expansion cards can (probably) fit behind the GPU, vs. being right below it. (This is a major factor in the next two points.)
      • Better airflow if I add another PCIe card (and I will probably add a WiFi card), vs. another card blocking part of the left-side fan and reducing airflow to the graphics card. Additionally, the card would be further from the glass than it now is from the PSU, further improving airflow.
      • I might be able to use two additional add-in cards (in slots 3 and 4), vs. being restricted to only one (the GPU blocks slot 3, so only slot 4 is available). (This is minor, since right now I only have plans for a wireless card.)
      Cons of vertical vs. pros of horizontal:
      • Cost: the vertical mount is $40, vs. horizontal being free.
      • Risk: there are some unknowns, such as how much space I'll actually have behind the GPU (it should be enough to fit LP cards, but I'm not 100% certain), vs. knowing exactly how everything fits.
      • Complexity: a PCIe riser adds another element that could fail and/or cause problems, vs. the simplicity of having the GPU plugged straight into the MB.
      • Restricted CPU cooler upgrade options (tower coolers would interfere with the GPU, as seen below), vs. more options (though I'm still case-limited to 154mm height).
      Additional good-to-knows: I'm using the Gigabyte B450M DS3H, which, unfortunately, uses expansion slot 1 for the M.2 drive, pushing the GPU into slot 2 (as seen below). Slot 3 is x1 (but blocked by a horizontal GPU) and slot 4 is x16 length but wired for x4. Also, while the bracket looks like it should take up only 4 slots, it angles downward quite a bit, so I think I'll have to modify it (i.e. cut it) to fit into only 3; thus my posting in the Modding forum.
      A note on WiFi (because that's a major player here): going wired isn't an option right now. I've been using a USB WiFi dongle, but I'm getting tired of its shenanigans and want to upgrade to PCIe. I've looked into M.2 adapters, but an M-key to A/E-key adapter is $23 in a terrible red on Amazon or $50 in black from hwtools.net, and all the M.2 WiFi cards are A- or E-key. (Also, they don't come with a PCIe bracket, so I... just have antenna cables hanging out of my case, I guess? There's no good solution here.) Plus I may want to upgrade to NVMe storage in the future, so I'd like to leave the M.2 slot empty for now. I also have an Amazon gift card that would cover the cost of the bracket but is limited in its use for other things (for example, the matching second stick of RAM I'll be ordering soon isn't available on Amazon, only on Newegg).
      To aid in the decision, I've taken photos of my current build and of the GPU mocked up in the vertical position (using my graphing calculator as an obviously temporary brace). (I'll be replacing the current exhaust fan with a set of white-frame fans as soon as they come in the mail.)
      [photo] The original (current) setup, with a standard horizontal mount.
      [photo] All the same parts, but mocked up to show a vertical mount.
      What do you guys think? Is it worth it to go vertical?
  15. Another indicator of how terrible this is: compare input wattage to output wattage. (Note: I'm oversimplifying this.) It says it takes as much as 5A of input, which, at 240V, is 1200W. But it can only output a maximum (if you could stress all the rails all the way, which you probably can't) of 231W. That's only about 20% efficient. (Compare that to the 80%+ efficiency of the 80 Plus units that are usually recommended....) Even if it's only drawing 3A, that's 32% efficient. And where does the other 70-80% go? Heat. Which is why some people call this sort of PSU a fire starter. Now, to be fair, the 3-5A is a rating for the wall circuit supplying it, not a statement of how much current the PSU will actually draw. (For comparison, my 500W PSU specifies a minimum input of 8A@100V, or 800W; 500/800 is only about 62%, but the unit is 80+ Bronze certified, which just means it will never actually need to pull the full 800W. So the input power rating doesn't tell the whole story, efficiency-wise, because of built-in safety factors.) But nevertheless, I wouldn't expect very good efficiency out of this at all.
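For what it's worth, here's a quick Python sketch of that arithmetic, using the wattage and current figures quoted above (treat it as ballpark math, not a real efficiency measurement - actual PSU efficiency varies with load):

```python
# Rough efficiency arithmetic for the numbers quoted above.
# "Efficiency" here is just output watts / labelled input watts.

def efficiency(output_w: float, input_current_a: float, input_voltage_v: float) -> float:
    """Return output power as a fraction of the labelled input power."""
    input_w = input_current_a * input_voltage_v
    return output_w / input_w

# The no-name unit: 231 W max output, labelled for up to 5 A at 240 V.
print(f"{efficiency(231, 5, 240):.0%}")   # ~19% (the "20%" above is rounded)
print(f"{efficiency(231, 3, 240):.0%}")   # ~32% even if it only draws 3 A

# My 500 W unit: labelled minimum input of 8 A at 100 V.
print(f"{efficiency(500, 8, 100):.0%}")   # ~62% -- but that label is a
# circuit-sizing rating, not what the PSU actually draws at full load.
```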
  16. Android isn't at all locked down the way that iOS is. If you don't like the icons, it's not hard to change them (though app support may vary). Behind the scenes, Android is (sort of) Linux. There are even apps that expose the CLI, though you have to root the tablet to do very much with it - but that's not hard to do.
      Well, sort of, but Android isn't what you seem to think it is. It's not like Windows or iOS, where there's only one version; it's much more like Linux. Each OEM can modify it the way they want (compare, for example, any Samsung Android device with the Google Pixel series; Samsung is famous for shipping a heavily modified Android), and if you want more freedom, you can install something like LineageOS. You don't even have to use the Google Play Store (in fact, for licensing reasons, Lineage doesn't include the Google apps; you have to install them separately); there are other Android app stores out there.
      Where did you get this info? Unless things have changed a lot (admittedly, my tablet is 8 years old), that's not true. I have a rooted Android tablet (Samsung Galaxy Tab 2.0 7") and the only function that doesn't work is the IR (which I never used anyway). Rooting doesn't disable the touchscreen; installing another distro (e.g. Lineage) doesn't disable the touchscreen.
      All in all, as an 8-year Android user and rooter, I'd say that a) you seem to have some misconceptions about what Android is and should do more research before spending any money; and b) Android is, by far, the most Linux-like OS you'll find in a tablet form factor, and probably your best option given your stated use cases.
  17. This solution is probably overkill, but you could run a VM with a lightweight Linux distro (assign it to, say, only one or two cores of your CPU, so that it's not resource intensive) for the chat and music, then assign one mouse to the VM and one to the main OS.
  18. That's what I was thinking. VGA is an analog signal, so it takes some special conversion.... Since all the monitors apparently only have DVI and VGA, OP should be using DVI cables with either two DP->DVI adapters, or one DP->DVI and one HDMI->DVI adapter, and keep everything digital.
  19. Absolutely. It always starts at 5V, and increases to the higher voltages only after a negotiation with the device being charged. And as for current, think of current rating as an upper limit, not a guarantee. In other words, a 5V/3A charger will provide 3A only if the load device (your computer/phone) needs it; it won't push 3A on a device that only needs 2, but it can provide 3A to a device that needs it. Thus, you can always use a power supply that's rated for higher current than your device, but not one that's lower. (Voltage, on the other hand, must match; if it doesn't, you risk under-powering if it's too low or boom if it's too high.)
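As a purely illustrative sketch of that rule of thumb (a toy model I'm adding here, not how any real charger actually negotiates), something like:

```python
# Toy model of the rule of thumb above: the device decides how much
# current to draw, the charger's amp rating is only a ceiling, and the
# voltage has to match.

def charge(charger_volts: float, charger_max_amps: float,
           device_volts: float, device_draw_amps: float) -> float:
    """Return the current that actually flows, in amps."""
    if charger_volts != device_volts:
        raise ValueError("voltage mismatch: under-power, or boom")
    # The charger never pushes more than the device asks for,
    # and the device never gets more than the charger is rated for.
    return min(device_draw_amps, charger_max_amps)

print(charge(5, 3, 5, 2))  # 2 A: a 3 A charger on a 2 A device is fine
print(charge(5, 2, 5, 3))  # 2 A: an underrated charger can't keep up,
                           # which is why you shouldn't go lower-rated
```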
  20. I don't have an answer, but you're more likely to get one if you post your system info - e.g. laptop or desktop, brand and model if prebuilt, specs if you built it yourself - so that people here know what you're working with.
  21. They're not really proprietary; I've seen them before. They're just an intermediary so you can use either the common 2+1 ARGB connector (2nd picture, left) or Gigabyte's 3-pin ARGB connector (2nd, center). And many cheap MBs don't support ARGB (mine doesn't either), so you'd need a controller or hub, as others have stated.
  22. Something like this: https://www.amazon.com/Updated-2020-Version-HDMI-Splitter/dp/B0822HWM4L/ref=sr_1_5?dchild=1&keywords=hdmi+splitter&qid=1601587449&sr=8-5
  23. Ok, I see. Also, I found this video which seems to start in the middle of a presentation, but is an HP rep explaining that the weird add-in board serves two purposes: it is more compact than a single large 2-socket motherboard, allowing for a smaller case form factor, and it makes it so that the motherboard is common with the single-processor Z4 series, thus reducing engineering and manufacturing costs.
  24. It's not that old; I use a Z640 for work. I've never opened the case though. I probably have one of those, and don't even know it.... I do know for sure it's a dual CPU, so it probably has it. I think it's fascinating. What's its connection to the main board? Is it PCIe or something else? And I don't know about individual parts, but I do know that the machine as a whole is very heavy....