WereCatf

Member
  • Content Count

    7,784
  • Joined

  • Last visited

Reputation Activity

  1. Like
    WereCatf got a reaction from Letgomyleghoe for a status update, Just compiled a custom firmware for my Prusa i3 MK2S - printer. The original firmware   
    Just compiled a custom firmware for my Prusa i3 MK2S printer. The original firmware uses a baud rate of 115200 to communicate with OctoPrint, which means large G-code files take FRICKING FOREVER to transfer to the SD card from OctoPrint. I wanted to speed this up, so I went and set the baud rate to 921600 and compared the speed against the original: the original firmware takes 2 minutes 44 seconds with the G-code test file, while my custom firmware takes 2 minutes 11 seconds.

    Not exactly a huge win, and over 2 minutes is still a lot, but... well, it did shave half a minute off, so I guess it's still an improvement.
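    In case anyone wants to try the same, the change itself is tiny; here's a rough sketch of the process, with the caveat that the exact file holding the define may differ between Prusa-Firmware versions (so grep for it first), and OctoPrint's serial settings need to be set to the same baud rate afterwards:
    # Rough sketch only -- the define may live in a different header depending on
    # the firmware version, so grep for it instead of trusting the path below.
    git clone https://github.com/prusa3d/Prusa-Firmware.git
    cd Prusa-Firmware
    grep -rn "define BAUDRATE" .

    # Bump the value from 115200 to 921600 in whichever file grep found...
    sed -i 's/#define BAUDRATE 115200/#define BAUDRATE 921600/' Firmware/Configuration.h

    # ...then build and flash with the Arduino IDE or whatever toolchain your
    # firmware version uses, and set OctoPrint's serial baud rate to 921600 as well.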
  2. Like
    WereCatf got a reaction from The_Vaccine for a status update, Just compiled a custom firmware for my Prusa i3 MK2S - printer. The original firmware   
  3. Like
    WereCatf got a reaction from piratemonkey for a status update, Just compiled a custom firmware for my Prusa i3 MK2S - printer. The original firmware   
  4. Like
    WereCatf got a reaction from Slottr for a status update, I'm not exactly a Python-wizard and it took me a while of reading, but I managed to p   
    I'm not exactly a Python wizard and it took me a while of reading, but I managed to publish my first project to PyPI: pyTimedInput

    Why? I needed a way of getting input from the user with a timeout, so a script can continue doing its thing without blocking forever if no input is forthcoming. It seems quite a few people have a need for something similar, so there's a tiny chance someone else will find this useful as well.
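    What I needed was basically a prompt that gives up after a while; for comparison, the rough equivalent of the idea in a plain shell script looks something like this (just an illustration of the concept, not of the package itself, which is for Python code):
    #!/bin/bash
    # Ask for input, but give up after 5 seconds so the script can carry on.
    # "read -t" returns a non-zero status when the timeout expires.
    if read -r -t 5 -p "Continue with the defaults? [Y/n] " answer; then
        echo "Got an answer: ${answer:-Y}"
    else
        echo            # move past the unanswered prompt
        echo "No input within 5 seconds, carrying on with the defaults."
    fi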
  5. Like
    WereCatf got a reaction from The_Vaccine for a status update, I'm not exactly a Python-wizard and it took me a while of reading, but I managed to p   
  6. Like
    WereCatf got a reaction from piratemonkey for a status update, I'm not exactly a Python-wizard and it took me a while of reading, but I managed to p   
  7. Like
    WereCatf got a reaction from piratemonkey for a status update, My dreams tend to be quite wild, often centering around the various traumatic events   
    My dreams tend to be quite wild, often centering on the various traumatic events of my past and thus being very emotionally heavy -- I just started this day with one and I'm depressed as all hell and, quite frankly, having thoughts of self-harm again. Anyway, some of my wilder dreams can also be quite entertaining and have some fun twists to them; for example, at the beginning of the night I had a Cyberpunk-inspired dream.

    In Cyberpunk, practically all humans are cyborgs with all sorts of fancy computery augmentations and whatnot. In my dream, however, androids were the dominant, sentient race on Earth, not humans, and these androids also loved augmentations. Alas, the augmentations they loved were made of human fleshy bits: patches of skin, hair, bones, even internal organs -- bones, fat and hair being on the cheaper end, patches of skin more expensive and growing more expensive per area, patches of skin with custom tattooing even more so, then internal organs, and the ultimate status symbol was a human heart ticking away in one's chest!

    Due to the popularity of these augmentations, the androids had been hunting humans rather vigorously, and as our numbers were dwindling rapidly, the government had to step in and start regulating human-hunting. This caused a lot of friction and all sorts of trouble and, obviously, a black market and illegal hunting groups going around.

    That's about as much as I can remember of the dream. It may just be because it was my dream and not someone else's, but I find the idea of androids augmenting themselves with flesh a fun, quirky twist and I'd love to see someone run with the idea and make a movie or a mini-series out of it. (With me getting a truckload of dosh as compensation for coming up with the idea!)
  8. Like
    WereCatf got a reaction from Letgomyleghoe for a status update, My dreams tend to be quite wild, often centering around the various traumatic events   
  9. Like
    WereCatf got a reaction from Mateyyy for a status update, My dreams tend to be quite wild, often centering around the various traumatic events   
  10. Like
    WereCatf got a reaction from Cyberspirit for a status update, There don't seem to be particularly many people here who are into actual hobby electr   
    There don't seem to be particularly many people here who are into actual hobby electronics and I haven't seen anyone else mention anything about it, but... I am *totes* excited about Espressif's upcoming ESP32-C3! RISC-V + WiFi, with Espressif's popular and pretty good SDK? Yes, please!

  11. Like
    WereCatf got a reaction from piratemonkey for a status update, There don't seem to be particularly many people here who are into actual hobby electr   
  12. Like
    WereCatf got a reaction from soldier_ph for a status update, There don't seem to be particularly many people here who are into actual hobby electr   
  13. Like
    WereCatf got a reaction from 2FA for a status update, I've got a PFsense-router as my main, Internet-facing router, which is then followed   
    I've got a pfSense router as my main, Internet-facing router, followed by a big switch to which all of my devices are connected, including one Zyxel WiFi router upstairs and one Linksys WRT1900ACS WiFi router downstairs. Both of the WiFi routers are running OpenWrt.

    Now, I've had this feeling that something's off with the Linksys: e.g. GeForce Now and Project xCloud both complain about high latency when connected to it, but not when connected to the Zyxel. Speedtest doesn't show any issues and I'm getting ~500 Mbps out of the Linksys, so I couldn't figure out the problem. Well, just a few days ago I happened to be using Parsec and realized that Parsec shows network latency and... when connected to the Zyxel it shows about 0.9 ms, whereas with the Linksys it shows 30-60 ms of network latency! That's fricking terrible! No wonder it has always felt so off!

    It took me a good while and hours of messing around to figure out what was wrong and how to fix it. Nothing seemed to work, not even a full reset, until I found a thread on GitHub where someone was complaining about similar issues. Apparently there's a feature called A-MSDU that's supposed to work fine and is part of the 802.11ac spec, but it doesn't work right on the Marvell 88W8864 WiFi chipset with the mwlwifi driver -- after disabling A-MSDU, I finally got the expected 0.9 ms latency over WiFi on the Linksys as well!

    If anyone reading this ever stumbles upon the same issue, the fix in my case was to add the following two lines to a startup script:
    echo 0 > /sys/kernel/debug/ieee80211/phy0/mwlwifi/tx_amsdu
    echo 0 > /sys/kernel/debug/ieee80211/phy1/mwlwifi/tx_amsdu
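    Those echoes don't survive a reboot on their own, which is why they need to go into a startup script; on OpenWrt one convenient place is /etc/rc.local, something along these lines (assuming debugfs is mounted and the mwlwifi radios are already up by the time it runs -- if not, a longer delay or a hotplug script is needed):
    # /etc/rc.local -- OpenWrt runs this once at the end of boot; keep the final "exit 0".
    # The sleep is a crude way of making sure both radios exist before poking them.
    sleep 10
    echo 0 > /sys/kernel/debug/ieee80211/phy0/mwlwifi/tx_amsdu
    echo 0 > /sys/kernel/debug/ieee80211/phy1/mwlwifi/tx_amsdu
    exit 0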
  14. Informative
    WereCatf got a reaction from Skiiwee29 for a status update, I've got a PFsense-router as my main, Internet-facing router, which is then followed   
  15. Like
    WereCatf got a reaction from Ziondaman for a status update, Eposvox mentioned that Noctua is apparently planning to make a desk-fan. I really, RE   
    Eposvox mentioned that Noctua is apparently planning to make a desk fan. I really, REALLY hope it's going to be that odd, brown "Noctua colour"; if it is, I'll absolutely buy one. Heck, I might buy a couple! It's just nerdy enough to suit my odd tastes 😅
  16. Like
    WereCatf got a reaction from The_Vaccine for a status update, Eposvox mentioned that Noctua is apparently planning to make a desk-fan. I really, RE   
  17. Agree
    WereCatf got a reaction from Cyberspirit for a status update, Eposvox mentioned that Noctua is apparently planning to make a desk-fan. I really, RE   
  18. Like
    WereCatf got a reaction from soldier_ph for a status update, I've had trouble with my OnePlus Bullets Wireless 2 Bluetooth - headset on my desktop   
    I've had trouble with my OnePlus Bullets Wireless 2 Bluetooth headset on my desktop for a while now, where the audio would occasionally cut out completely for a couple of seconds at random. Also, attempting to use the headset's microphone would just lock Windows 10 up. A full reinstall of Windows didn't help, and neither did any driver updates, disabling power management or anything else. So, a while ago I bought a PCIe WiFi 6 + Bluetooth 5.1 adapter based on Intel's AX200NGW off of eBay and installed it in my desktop yesterday.

    I haven't had any need to test the WiFi capabilities of the card, but Bluetooth works great; the audio ain't cutting out anymore, the microphone works, and the signal carries further than it did with the previous USB stick I used for Bluetooth. The card, with an included screwdriver, two antennas, WiFi 6, Bluetooth 5.1 and all, cost less than a single Bluetooth 5.x USB stick, so it was not a bad buy at all. I definitely recommend it to anyone reading this who needs a good Bluetooth solution.
  19. Like
    WereCatf got a reaction from The_Vaccine for a status update, I've had trouble with my OnePlus Bullets Wireless 2 Bluetooth - headset on my desktop   
  20. Like
    WereCatf got a reaction from Ziondaman for a status update, Just finished watching The Queen's Gambit -- what an amazing show! I started watching   
    Just finished watching The Queen's Gambit -- what an amazing show! I started watching the first episode yesterday in the background as I was doing something else and I completely forgot about what I was doing and just sat there, glued to the screen, watching 4 episodes in a row; I totally love it when shows have that effect on you!
  21. Like
    WereCatf got a reaction from soldier_ph for a status update, Just finished watching The Queen's Gambit -- what an amazing show! I started watching   
  22. Like
    WereCatf got a reaction from The_Vaccine for a status update, Well, getting CUDA 10.2 toolkit working in an LXC-container running Ubuntu 20.10 Groo   
    Well, getting the CUDA 10.2 toolkit working in an LXC container running Ubuntu 20.10 Groovy wasn't fun. For one, CUDA 10 apparently is no longer available in 20.10's repos, only CUDA 11. Secondly, installing CUDA 10 via NVIDIA's repos insists on trying to install NVIDIA's drivers inside the container, which obviously isn't going to work, so you have to jump through a bunch of hoops to work around that!
     
    1. Set up the NVIDIA repo settings.
    2. Install all the dependencies required by the metapackage cuda, except for anything that nvidia-cuda-drivers requires and the package itself.
    3. Proceed with apt download cuda-10-2 cuda-demo-suite-10-2 cuda-runtime-10-2.
    4. Install the downloaded debs with dpkg --force-all -i <the debs from step 3>.
    5. Remove the dependency on nvidia-cuda-drivers for cuda-runtime-10-2 in /var/lib/dpkg/status.
    6. Use apt-mark hold nvidia-cuda-runtime-10-2, so apt update won't undo what you did above and try to install nvidia-cuda-drivers again!
    7. You'll also likely find out that the CUDA 10 toolkit doesn't compile with GCC 9 or newer, so you'll need to set up GCC 8 with apt -y install gcc-8 g++-8 && update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 8 && update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-8 8
    I really wish nvidia-cuda-drivers wasn't a hard dependency; this would've been much easier!
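    The same steps collected into one place, mostly for future reference; treat this as a rough sketch rather than a copy-paste script, since the repo setup and the dpkg status edit still have to be done by hand:
    # 1-2) Set up NVIDIA's CUDA apt repo in the container and install the cuda
    #      metapackage's dependencies by hand, skipping nvidia-cuda-drivers and
    #      everything only it pulls in (no one-liner for that, sadly).

    # 3) Grab the debs without letting apt resolve the driver dependency...
    apt download cuda-10-2 cuda-demo-suite-10-2 cuda-runtime-10-2

    # 4) ...and force-install them.
    dpkg --force-all -i cuda-10-2_*.deb cuda-demo-suite-10-2_*.deb cuda-runtime-10-2_*.deb

    # 5) Edit /var/lib/dpkg/status by hand and drop nvidia-cuda-drivers from the
    #    cuda-runtime-10-2 entry's dependencies.

    # 6) Keep apt update/upgrade from undoing the above.
    apt-mark hold nvidia-cuda-runtime-10-2

    # 7) CUDA 10's nvcc doesn't get along with GCC 9 or newer, so default to GCC 8.
    apt -y install gcc-8 g++-8
    update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 8
    update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-8 8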
     
    PS. Yes, I know this isn't exactly the best possible place for this kind of stuff, but I'm documenting it here anyway until I can be bothered to whip up my own WordPress blog, so someone facing the same issue may still find these instructions via Google.
  23. Like
    WereCatf got a reaction from Ziondaman for a status update, Well, getting CUDA 10.2 toolkit working in an LXC-container running Ubuntu 20.10 Groo   
  24. Informative
    WereCatf got a reaction from Jurrunio for a status update, Well, getting CUDA 10.2 toolkit working in an LXC-container running Ubuntu 20.10 Groo   
  25. Informative
    WereCatf got a reaction from Hackentosher for a status update, I've been emailing back-and-forth with Kingston for a bit regarding the 250GB Kingsto   
    I've been emailing back and forth with Kingston for a bit regarding the 250 GB Kingston A2000 NVMe drive I recently bought; the drive suffers from an issue with power management under Linux that causes a hard system crash, which is obviously not a very nice thing. The crash can be worked around by adding "nvme_core.default_ps_max_latency_us=5500" to the kernel command line, e.g. via /etc/default/grub under Ubuntu, but that means the drive won't ever use its lowest power mode. Also, having to apply such a workaround at all obviously should not happen -- a less technically-inclined person would be tearing their hair out and could lose important data! I, personally, can deal with the workaround, but by nature I can't help being bothered by the thought of anyone who won't know to use it.

    Kingston's folks have been very nice about this whole thing, immediately taking my ticket with the seriousness one could expect, asking me for more details about when and how I can trigger the issue, and they are now working on a fix. Apparently they have to talk to the manufacturer of the controller, so it's not a very easy fix, but I can say I've been very happy with Kingston's handling of it and their communication with me so far. Good customer support and handling of issues like this is something I value very highly, and my view of Kingston has definitely gotten a good bump up.
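    For anyone hitting the same crash, this is roughly how the workaround gets applied on Ubuntu; a minimal sketch, and the sed line assumes the stock "quiet splash" GRUB_CMDLINE_LINUX_DEFAULT, so just edit the file by hand if yours differs:
    # Append the workaround to the kernel command line in /etc/default/grub.
    sudo sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nvme_core.default_ps_max_latency_us=5500"/' /etc/default/grub

    # Regenerate the GRUB config and reboot for it to take effect.
    sudo update-grub
    sudo reboot

    # After the reboot, verify the parameter is actually in use:
    cat /proc/cmdline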