
FAQBytes

Everything posted by FAQBytes

  1. I know I already commented on the sff net site, but hot diggity damn am I glad this is a thing. Can't wait to get rid of this Node 202. It was a nice stepping stone, but I'm ready for something much nicer. When are you expecting first units to go out? I have cravings to upgrade my CAD box.
  2. @Shunsushi You're lucky you caught me when you did. I haven't been on in a while. You can use a screwdriver to short the power pin and ground on the SSI 24-pin front panel header (the pins denoted BC). If you have any questions, feel free to ask. My Email is FAQBytes (at) gmail (etc), as I don't check here very often anymore.
  3. I'd say no since most extend past the actual fan space.
  4. I feel like some of these are misnomers. DX -> CUDA Acceleration, GEFORCE -> CUDA Cores. I have no issue with either of these statements, despite how much I hate that they're true. Even though AMD cards do not support CUDA, they are not "pointless": they often provide much greater IO/$ than NVidia if CUDA acceleration isn't necessary. Unfortunately Matlab and SolidEdge greatly benefit from CUDA, so NVidia has forced my hand for GPU selection in my next build. I agree that you get better price/performance with consumer equipment, but for medium businesses, rack space and real estate often matter more than raw hardware cost. There also isn't the same level of support for consumer equipment as there is for professional, and the peace of mind and guarantees from the OEM are well worth the extra price. The reason they get "Pro" cards is driver support that "unlocks" other card capabilities (again, it frustrates me, but I somewhat understand), plus better support than consumer cards; more modern "pro" cards are also moving towards ECC memory. I understand your business likely isn't at the stage that requires "professional" cards, but there are situations that necessitate them. The guarantee that larger businesses get with Xeon and Epyc CPUs is well worth the price to them. If your business primarily relies on desktops for employees rather than offloading, I can see how generic consumer equipment would be more ideal. However, an AMD-NVidia setup would be more cost-effective than an Intel-NVidia setup when going with consumer equipment, since it seems you require CUDA acceleration. Again, you're paying nearly double for equivalent multicore performance (since that seems to be the debate here) with Intel's i9 versus Ryzen, so much of what you have taken issue with defeats itself.
You talk about cost-effectiveness, then you say AMD is dead in the water. If you truly cared about pure cost-effectiveness, you wouldn't even be worried about the current generation.
  5. And the difference between 3.9 GHz and 4.0 GHz is worth ~200% of the price? And obviously I don't know how to run a business. If I did, I would just throw money at my problems to make them go away, because that worked for car manufacturers, right? /s But if you run your own business, why are you talking about standard GPUs (I hate the fact that "business" and "gaming" drivers are exclusive, but money is money, eh?) and Threadripper, and not Epyc?
  6. Please do let me know what you are doing that benefits so greatly from 2 additional cores. Also: "All of the i9 chips feature base clock speeds of 3.3GHz, reaching up to 4.3GHz dual-core speeds with Turbo Boost 2.0 and 4.5GHz with Turbo Boost 3.0. And speaking of Turbo Boost 3.0, its performance has also been improved in the new Extreme Edition chips to increase both single and dual-core speeds. Rounding out the X-Series family are the quad-core i5-7640X and i7 models in 4-, 6- and 8-core models." https://www.engadget.com/2017/05/30/intel-core-i9-extreme/ 4.3 GHz is only on 2 cores.
  7. Again, this is "prosumer", which is mostly rendering and encoding. Sure, more cores help, but if you're complaining about 16 cores not being enough, nothing will ever be enough. A 32-core/64-thread Epyc won't be enough for you even in a 64/128 dual-socket setup, I presume (because AMD). By the way, at the scale you're talking about, things are generally offloaded to co-processors or GPUs (provided that you actually have workloads capable of such). Also, think of price point: 10 cores/20 threads at 4.3 GHz versus 16 cores/32 threads at 3.9 GHz, both at $999 USD.
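To put numbers on that 10/20 @ 4.3 versus 16/32 @ 3.9 comparison, here's a back-of-envelope "cores × clock" sketch. This deliberately ignores IPC, memory bandwidth, and boost behavior, so treat it as a rough throughput proxy only:

```python
# Naive aggregate-throughput comparison for the two ~$999 parts above.
# "core-GHz" is not a real benchmark; it just shows the multicore gap
# before IPC and boost clocks are taken into account.
def aggregate_ghz(cores: int, base_ghz: float) -> float:
    """Total base-clock cycles per second across all cores, in GHz."""
    return cores * base_ghz

intel_10c = aggregate_ghz(10, 4.3)   # 10 cores at 4.3 GHz -> 43.0 core-GHz
amd_16c   = aggregate_ghz(16, 3.9)   # 16 cores at 3.9 GHz -> ~62.4 core-GHz
```

By this crude measure the 16-core part has roughly 45% more aggregate base clock at the same price, which is the whole multicore argument in one line.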
  8. Heh... "Only" 16 cores. Even with running a personal UnRaid pentesting server, I hardly ever fully utilized 16 cores (2x E5-2680s). It's a superniche area, and not releasing pricing isn't the end of the world considering that it is launching within the next month. We'll see how Vega turns out.
  9. If it's not a price consideration, the S8+ hardware-wise. I don't like Shamsung's UI, but there are workarounds even without root.
  10. As someone who has dealt with wireless communications, I am skeptical about the battery life claim. While my projects have been over kilometers, most of the other sensors would still kill a coin cell in a matter of hours. I understand Bluetooth is far more power efficient, as it works over only a few feet with much smaller bandwidth. They're essentially claiming 2920 hours of lifespan (2 hours * 365 days * 4 years). https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/IWS20201320wireless20power20consumption.pdf
Using Wattage = Amperage * Voltage, the transmit consumption works out to about 4.5 mA * 3.3 V = 14.85 mW. The CR2032 (standard coin cell) has an energy density of 198 mWh/g and typically contains 0.109 g of Li: http://data.energizer.com/pdfs/cr2032.pdf 198 mWh/g * 0.109 g = 21.582 mWh. So on transmit power alone, that gives 21.582/14.85 = 1.45 hours of on time, which sounds about right to me. However, similarly to what I have to do with the XBee (since I run it on a solar-powered glider), you can put it to sleep. Sleep draw is 0.00078 mA * 3.3 V = 0.002574 mW, so 21.582/0.002574 = 8384.6 hours of idle time. Alright, so we know it won't die if it's just 'on'. (Keep in mind this is just the Bluetooth; there is still a microprocessor running all this, which also takes some power. Since I don't know which one, I can't calculate for that.) Since we are talking about several years, self-discharge is also a concern, but I don't have information on that at the moment.
Let's see how much is left for the actual sending: sleeping for all 2920 hours costs 0.002574 mW * 2920 hours = 7.516 mWh, leaving 21.582 - 7.516 = 14.066 mWh. 14.066/14.85 = 0.947 hours of sending capacity. The duty cycle is then 0.947/2920 = ~0.032%. That's one hell of a tight duty cycle considering most people's typing speed. The time of one connection is 1.15 s = 0.000319 hr, so 0.947 hours / 0.000319 = ~2966 individual connections of about a second each. So if you're typing constantly, don't expect a 4-year lifespan; they're essentially expecting about one send instance per hour from what I'm gathering here. Ehhh... I'm skeptical to say the least, especially if you are a fast typist. Not to mention this isn't taking into account any other electronic components.
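The whole energy budget above fits in a few lines. This sketch assumes the CR2032 figures from the Energizer datasheet and the ~4.5 mA transmit / ~0.78 µA sleep currents implied by the numbers in the linked paper:

```python
# Back-of-envelope battery budget for the "4 year" Bluetooth claim.
CELL_MWH = 198 * 0.109           # 198 mWh/g * 0.109 g of Li ≈ 21.58 mWh total

TX_MW    = 4.5 * 3.3             # ~4.5 mA at 3.3 V while transmitting ≈ 14.85 mW
SLEEP_MW = 0.00078 * 3.3         # ~0.78 µA at 3.3 V asleep ≈ 0.0026 mW

CLAIMED_HOURS = 2 * 365 * 4      # "2 hours/day for 4 years" = 2920 h

sleep_cost_mwh = SLEEP_MW * CLAIMED_HOURS       # energy burned just idling
tx_budget_mwh  = CELL_MWH - sleep_cost_mwh      # what's left for sending
tx_hours       = tx_budget_mwh / TX_MW          # total airtime available (~0.95 h)

duty_cycle = tx_hours / CLAIMED_HOURS           # fraction of time you can transmit
```

Under a second of airtime available per day of claimed use is why the duty cycle comes out so tight.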
  11. Do you notice stuttering or low FPS? Utilization isn't necessarily representative of performance. Ideally, you want both running at 80-100% unless you are framecapping.
  12. I'll be honest, if you can tolerate the touchshiz on Samsung phones, or are willing to flash them (if you still can; I haven't kept up with Shamsung since I got rid of my S2) or run a launcher, they're far superior to the 3T, even the S7 Edge if you get the Snapdragon variant (I'm not mixing that up, am I?). Don't get me wrong, OnePlus makes great phones; they're just outclassed if you can afford the Sammy. If you're looking at the 3/3T, take a look at the ZTE Axon 7 (what I got instead), Huawei Mate 9 (most expensive of the bunch and my least familiar) and Xiaomi Mi 5S (another I was looking at).
  13. I can vouch that laptops have had hyperthreading for several generations now; it's more power efficient to fully utilize fewer cores than to underutilize more of them.
  14. Actually, try removing the one with the fans nearby and see if that is a 3.5" bay. The black 'clips' are for holding full-length expansion cards: the back of the card slides into them, which stops it from moving side to side and prevents damage to the PCI-e slot and the card itself.
  15. I believe the hard drives go in the middle photo bottom left. You should be able to remove the drive tray and install them there. The top bar is to hold heavy peripheral cards in place and prevent them from sliding out, especially during transportation.
  16. It may also be an ARK case, as the internals look very similar, and they also provide a similar looking case. If you need to know something specific, I may be able to help with that.
  17. Looks like it is a SuperLogics 4u chassis. http://superlogics.com/industrial-computers/rack-mount-pc-computer/SL-4U-AH110M-HA/267-5893.htm?gclid=CjwKEAjwlpbIBRCx4eT8l9W26igSJAAuQ_HG5JHPkG78k4L27R5R7gylq56yRps7TqK_Ja7fSFKx4RoCHQvw_wcB This isn't the one, but it is close. EDIT: On second glance, not so much...
  18. Yeah, that's compatible.
  19. Yeah, that's an SSI EEB board, the same form factor as my motherboard (in description). Finding a case for it is a royal pain in the arse, as there aren't that many, and any that exist are large, bulky, and expensive. I got a super cheap rack-mounted server for $200 that came with 12 500GB hard drives, so I put it in that; it's just super inconvenient moving it around while I'm still at college. So your choices are basically a rack-mount server case, a dedicated Intel case, or something like this: https://www.amazon.com/Phanteks-Enthoo-Chassis-PH-ES614P-TG-Titanium/dp/B00YCX0XR6/ref=sr_1_2?ie=UTF8&qid=1493592394&sr=8-2&keywords=ssi+eeb+case All of which are large.
  20. I got my ZTE Axon 7 (not the mini) from eBay in December for $300, and that was when it was still relatively new. I'd also recommend the OnePlus 3/3T (or 2 if you can't find the 3 at your price) in addition to the choices you listed above. I have yet to regret purchasing from either. One word of warning for the Axon 7 is the lack of a good tempered-glass screen protector, which is definitely my one complaint about it. The audio is great, especially when playing out loud, and it can get loud enough to rival my friend's boom pod or whatever they're called. The camera is good, but it's no S6 or S8; they had to sacrifice somewhere. Slow-mo is cool for stuff like rocket launches or minor diagnostics, but can be gimmicky otherwise. I just run Evie Launcher since it was very close to CM, but the default launcher isn't the worst.
  21. As long as nothing it comes in contact with contains any aluminum it should be fine. The alloyed layer should, for the most part, insulate the rest of the copper. It will cause minor pitting when the low-melting alloy is removed, but it shouldn't be too awful as most gallium/Ga-In TIM manufacturers recommend copper heat-sinks.
  22. http://www.ebay.com/itm/Ultra-Compact-Turnigy-XT60-Serial-Series-Battery-Connector-Adapter-/231109036111?hash=item35cf2c084f:g:RfQAAOxypNtSnoKg This is a series connector that would get you the increased voltage.
  23. To answer the title question, it would be in parallel. You would have two 2S batteries in parallel, so the voltage does not change: you'd still have 7.4 volts, not 14.8 volts, while the capacity doubles.
  24. https://www.google.com/amp/lifehacker.com/147855/geek-to-live--automatically-back-up-your-hard-drive/amp
  25. https://www.amazon.com/Command-Refill-Strips-Medium-9-Strips/dp/B0014CQGW4/ref=sr_1_3?ie=UTF8&qid=1490412494&sr=8-3&keywords=3m+command+strips Err... Get the larger pack. Each one is rated at 3 lbs. EDIT for clarification: Get the VESA mount, but instead of screwing them to the wall, use the command strips.