
About Eniqmatic

  • Birthday 1993-04-15

Profile Information


  • CPU
    AMD 1100T 3.8GHz
  • Motherboard
    Crosshair V Formula
  • RAM
    Corsair Dominator
  • GPU
MSI GTX 760 Twin Frozr
  • Case
    Zalman Z11+
  • Storage
    256GB 840 EVO SSD, 1TB WD
  • PSU
    Corsair AX750
  • Display(s)
    Samsung UE32F5000
  • Cooling
Antec Kühler
  • Operating System
    Windows 7

Recent Profile Visitors

1,741 profile views
  1. Nice, I bought an AX200 as an upgrade for my laptop a few weeks ago, I didn't really care if it was certified at the time but nice to see it is!
  2. Total price is less than $20. You are correct, those devices exist and obviously they work well; I saw many of them in my research. The problem is that they are often designed to sense a single person, since hospital beds are usually single beds. This isn't a deal-breaker, but it's a nice feature to be able to tell how many people are in the bed, not just whether someone is in the bed. Hopefully that makes sense! Appreciate the comment, it was something I looked into!
  3. It's this one, I didn't make it: https://www.thingiverse.com/thing:3564942 I made the bed enclosure, but sharing that would be useless since it's unique to my bed!
  4. Hey guys! Thought I'd share my bed occupancy sensor (yes, I know, another bed occupancy sensor based on load cells!) and hopefully make it easy for anyone else who wants to replicate it. It uses those cheap 50kg load cells under each corner of the bed, an HX711 board, a Wemos D1 Mini and of course Home Assistant!

     One thing I found hard to find information on was how to mount the sensors. I tried a few things but ended up 3D printing casters/holders for my bed legs and letting the load cells slot into them; this stops the bed sliding around on the cells and interfering with the reading. This method keeps everything secure and ensures each cell stays in exactly the same position. See attached pictures.

     So what do I use this for? It has greatly improved my automations:
     - Ensuring that when we are both in bed, every light and media player is turned off, the door locks are set and the house alarm is set to "armed home".
     - I have lights that are automated through Home Assistant when motion is detected; I add a condition to those automations not to trigger when we are both in bed, so that our dogs don't set lights off all night.
     - During night hours, the bathroom light comes on at a very dim brightness if one of us exits the bed. Great for not being blinded when getting up for the bathroom during the night!

     I've created a full guide here. Github code: https://github.com/EverythingSmartHome/mqtt-bed-sensor
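A minimal sketch of the occupancy logic a sensor like this needs: sum the readings from the four load cells and classify the total against weight thresholds. All threshold values and names below are illustrative assumptions, not taken from the linked project (which runs on the Wemos D1 Mini and publishes over MQTT):

```python
# Hedged sketch of bed-occupancy classification from four corner load cells.
# TARE_KG, ONE_PERSON_KG and TWO_PEOPLE_KG are made-up example thresholds;
# a real install would calibrate them against the actual bed and occupants.

TARE_KG = 5.0         # residual weight of bedding/mattress after taring
ONE_PERSON_KG = 40.0  # minimum total indicating one occupant
TWO_PEOPLE_KG = 90.0  # minimum total indicating two occupants

def occupancy_state(corner_readings_kg: list[float]) -> str:
    """Classify the summed reading from the four 50 kg load cells."""
    total = sum(corner_readings_kg)
    if total >= TWO_PEOPLE_KG:
        return "two"
    if total >= ONE_PERSON_KG:
        return "one"
    if total >= TARE_KG:
        return "unknown"  # something on the bed, but below one-person weight
    return "empty"

print(occupancy_state([0.5, 0.5, 0.3, 0.2]))      # empty
print(occupancy_state([20.0, 18.0, 17.0, 15.0]))  # one
print(occupancy_state([35.0, 34.0, 33.0, 36.0]))  # two
```

Publishing the resulting state string over MQTT (as the linked project does) then lets Home Assistant use it as a condition in automations.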
  5. Sorry, I meant: wired or Wi-Fi? I assume wired? What site are you using to test?
  6. Yes, that's what I'm trying to suggest: a smaller SSD pool, perhaps 2 SSDs together, and a large many-disk HDD array?
  7. Would having a smaller SSD-only pool to work on for that project, then transferring it off afterwards, work?
  8. 2. In this case a ZFS SLOG (a dedicated ZIL device) combined with an L2ARC might be of use: synchronous writes are committed to the fast SSD log first, and the data is later flushed from memory to the big pool. This might give you a better explanation: https://www.45drives.com/wiki/index.php?title=FreeNAS_-_What_is_ZIL_%26_L2ARC Would an all-SSD array be too expensive?
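For reference, attaching a log (SLOG/ZIL) and a cache (L2ARC) device to a ZFS pool looks roughly like this. The pool name and device paths are assumptions for illustration; these commands need root and real block devices, so treat this as a sketch rather than something to paste in:

```
# Create a raidz pool of spinning disks (names are examples):
zpool create tank raidz sdb sdc sdd sde

# Add a fast SSD as the dedicated ZIL (SLOG) for synchronous writes:
zpool add tank log nvme0n1

# Add another SSD as an L2ARC read cache:
zpool add tank cache nvme1n1

# Verify the pool layout:
zpool status tank
```

Note that the SLOG only helps synchronous writes and the L2ARC only helps reads, which is why testing with the real workload (as suggested elsewhere in the thread) matters.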
  9. 1. Yes, that's right. In terms of how to understand it: you have your NAS, which has a filesystem local to the machine (for all intents and purposes let's call it local; there are distributed filesystems such as Gluster and Ceph, as you mentioned above, but let's stick to basics). Then you have your "sharing protocol", which sits on top of the filesystem and is used to share that local filesystem with remote systems over a network connection (again, not network-only, but let's stick to basics). Then you have your remote machine communicating with that "sharing protocol" - hopefully that makes sense? CIFS is probably good but can act weird sometimes. I think it will serve you best; iSCSI might be another option?

     2. Most filesystems have a way of using a cache disk (typically an SSD) to handle this situation, but again, testing is needed here. ZFS has an L2ARC which could help. The other option is to have two pools: one small SSD array that you can work on directly, and your traditional slower spinning-disk array that the data can be transferred to after the job is finished (I don't know the workload, but see if that fits your situation).

     3. ZFS is a "self-healing" filesystem, which does integrity checks of the data and can automatically correct corruption. It's worth reading up on; it's quite a complicated and interesting beast. It would be classed as a "software RAID" solution, but don't be scared if you have read things like "Intel software RAID is terrible and no one should use software RAID" - there were articles like that going around, but ZFS is a completely different beast altogether and more of an enterprise filesystem.

     4. What do you mean by tiered storage? If you can expand on this I can maybe help. I realised I forgot to answer your 4th question in the first post, but you've kind of answered it since: 10G is not a requirement, 1G will work, as will 100Mb, but as you've already identified, anything less than 10Gb networking will most likely be your bottleneck.
  10. What is your setup you are using to test with?
  11. Hello!

      1. Forgive me if my assumption is wrong, but I think you are a little confused here. You ask which filesystem would allow smooth access to files from Linux and Windows, but the filesystem is not where you need to look for this; what you want is technically the sharing protocol on top of it - CIFS/NFS/iSCSI etc. - which should work regardless of the underlying filesystem. Anyway, to answer the question: CIFS/SMB is probably the best choice since you are using Windows. You will need to install Samba on Linux and do a little configuration, but it's pretty easy.

      2. Unfortunately no one can really answer that question; it "depends" - on the workload and the hardware. The best thing to do is actually set something up, test different workloads and scenarios with different filesystems, record your results and go from there.

      3. Again, it "depends" - ZFS would like direct access to the drives, so either via onboard SATA ports or through an HBA. You need to work out which route you're going for before that question can be fully answered!

      Hope that helps a little!
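To make the Samba suggestion concrete, a share is defined in `/etc/samba/smb.conf` and a Samba password is set per user. The share name, path and username below are assumptions for illustration:

```ini
; Hedged example share for /etc/samba/smb.conf
; (share name, path and user are placeholders, not from the thread)
[storage]
   path = /mnt/tank/storage
   browseable = yes
   read only = no
   valid users = youruser
```

After adding the share, create a Samba password for the user with `sudo smbpasswd -a youruser` and restart the `smbd` service; the share is then reachable from Windows as `\\<server>\storage`.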
  12. Ubiquiti LiteBeam or similar will do these easily.
  13. There is a policy that stops you connecting to a PC without a password - I'm not sure why they implemented that, but they did. Try setting a password on the second PC and trying again. Do you have admin rights on both?
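The policy in question is presumably Windows' "Accounts: Limit local account use of blank passwords to console logon only", which blocks network logons for accounts with no password. Setting a password, as suggested above, is the safer fix, but the policy can also be disabled via its registry value (run as administrator; note this weakens security):

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LimitBlankPasswordUse /t REG_DWORD /d 0 /f
```

Setting the value back to 1 restores the default behaviour.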