



About PCn00b3000

  • Birthday 1996-06-07

Profile Information

  • Interests
    Tinkering, IT and general computer research, writing poetry... Just PC Master Race Things ;)
  • Biography
    I'm just another SrA in the world's greatest Air Force
  • Occupation
    2T251; Air Transportation USAF


  • CPU
    Ryzen 7 1700 | 8-Core, 16-Thread 3.7GHz
  • Motherboard
    GIGABYTE GA-AB350-Gaming
  • RAM
    Corsair Vengeance LPX 16GB (2x8GB) DDR4 DRAM 3000MHz
  • GPU
    Zotac GTX 1070 AMP! Edition
  • Case
    Corsair Carbide Clear 400C Compact Mid-Tower
  • Storage
    Crucial MX300 275GB M.2 | 6TB WD | 4TB Seagate
  • PSU
    EVGA SuperNOVA 750 G2 | 80+ GOLD | 750W
  • Display(s)
    50" AVOL 1920x1080 Flatscreen
  • Cooling
    Wraith Spire LED CPU Cooler
  • Keyboard
    Corsair Vengeance K65
  • Mouse
    Logitech G300
  • Operating System
    Windows 10 Pro

Recent Profile Visitors

1,669 profile views
  1. I've tried every other forum and Oculus support, but found nothing helpful, so I'm giving this forum a shot. Long story short, I've run into numerous points where the system stops tracking my controllers while I'm in-game. I can still rotate my digital hands in any direction, but the system can't track my head movements, and my controllers stick at the point where tracking was lost. I don't know what else to try; I've tried everything I could possibly find online in forums and tutorials.

     I bought my Oculus Rift S back in August, and it *IMMEDIATELY* bugged out on me by losing tracking in both controllers after about 30 minutes of playing. I returned the first one and got a replacement, hoping that all I had was a defective unit. Come to find out, a week into using the new unit, history repeated itself and I was faced with the same tracking issues. The pass-through mode was only static, tipping me off that the issue might be with the USB ports on my PC. I got a powered and a non-powered PCI expansion card, but that didn't fix it. So, one failed fix and $65 later, I tried the typical "turn off power-limiting settings in Windows" trick. Nothing. My PC is up to date, Nvidia drivers installed and updated, power management set to "f*ck you PC, you're my slave: DO WHAT I SAY"... and nothing is fixed.

     I'm about 4 days away from returning this POS to the Best Buy I got it from and giving up on VR. I had such high hopes for it after hearing about it from colleagues and friends who had one. Unfortunately, this has brought me to the point where I'm debating whether or not to just full-on DESTROY the thing out of rage. Has anyone else had the same issue in the past? Oculus claims it's a firmware issue, but I find that very hard to believe. The Rift S isn't NEW tech, so it shouldn't bug out as often as it does! I've got to say, in my 23 years on this planet, I've never been so disappointed and rage-filled over anything else tech-related in my life.
  2. I'm using MariaDB for that, and my system is configured properly so that it's recognized by Nextcloud. I wonder if I need to make any config changes there at all
  3. All of the commands were accepted, and I now see this (attached photo), so that worked well. For some reason, though, Nextcloud is still not showing the test images and files I put onto the drive. I've restarted apache2 as well, so I'm not sure what else to do.
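One thing worth checking at this step: Nextcloud only lists files it has indexed in its own database, so files copied onto the drive outside of Nextcloud won't appear until a scan is run. A minimal sketch, assuming Nextcloud is installed at /var/www/nextcloud (adjust the path to your install):

```shell
# Re-scan external files so Nextcloud's database picks them up.
# The install path /var/www/nextcloud is an assumption.
cd /var/www/nextcloud
sudo -u www-data php occ files:scan --all
```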
  4. Okay. I have access to the drive through my Ubuntu user profile, and Nextcloud is accepting the local mount point and saving the settings. I used the following commands, in this exact order, on a newly formatted drive (EXT4):

     sudo chown -R www-data:www-data /mnt/sdb1
     sudo chmod -R 0750 /mnt/sdb1
     sudo chown -R talaftw /mnt/sdb1
     sudo chmod -R 0770 /mnt/sdb1

     After that, my profile can add and edit the files on the drive. Now, Nextcloud (within the control panel) will not show any of the files that are present on the drive, even with the permissions.
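A quick way to sanity-check this state, assuming the drive is at /mnt/sdb1: the last two commands left everything owned by talaftw with mode 0770, so the web server user (www-data) only has access if it owns the files or belongs to the owning group. Checking both from the terminal:

```shell
# Show current owner, group, and mode of the mount point
# (the path /mnt/sdb1 is an assumption -- adjust to yours):
stat -c '%U %G %a %n' /mnt/sdb1
# Try listing the drive as the web server user:
sudo -u www-data ls -l /mnt/sdb1
```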
  5. Looks like I've got some more reading to do... I don't even know what to put in as my user profile besides "root", since that doesn't work either
  6. Does anyone know the exact command I can paste into the terminal? I'm completely new to attaching hard drives to a cloud service like this
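For reference, a minimal sketch of mounting a drive from the terminal; the device name /dev/sdb1 and mount point /mnt/sdb1 are assumptions, so check yours with lsblk first:

```shell
lsblk -f                        # list disks, partitions, and filesystems
sudo mkdir -p /mnt/sdb1         # create the mount point directory
sudo mount /dev/sdb1 /mnt/sdb1  # mount the partition there
# To mount it automatically at boot, a line like this goes in /etc/fstab:
# /dev/sdb1  /mnt/sdb1  ext4  defaults  0  2
```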
  7. So, once I get Nextcloud to have ownership, I should do the same with my user profile?
  8. I'm nearly finished setting up my Nextcloud server, but I have a slight problem. When I give Nextcloud permission to access my local drive (4TB Seagate external drive), my Ubuntu user profile no longer has the ability to make changes to the drive, let alone access the content on the drive at all. These are the commands I'm using to give Nextcloud full permissions on the mounted disk:

     sudo chown -R www-data:www-data /mnt/sdb1
     sudo chmod -R 0750 /mnt/sdb1

     My friends suggested I use this command to give my Ubuntu profile (TalaFTW) access to the drive:

     sudo chown talaftw -v /mnt/sdb1

     However, when I use the second command, the files within the drive still can't be accessed, or some can be and others can't. My question is: is there any way to give both Nextcloud and my Ubuntu profile permission to do everything we need, to basically be dual gods of the drive? Shared permissions that won't conflict with each other? I've asked on other forums and none of them would give me a straight, simple-to-follow answer. Thank you very much in advance. Cheers
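One common pattern that gives both accounts access (a sketch; the path /mnt/sdb1 and username talaftw come from the post above): chown replaces the owner each time it runs, so rather than two competing chowns, set the owner and group together and let the group carry the web server's access:

```shell
# You own the files; the web server's group (www-data) shares them.
sudo chown -R talaftw:www-data /mnt/sdb1
# Full access for owner and group, nothing for everyone else:
sudo chmod -R 0770 /mnt/sdb1
# setgid bit on directories so files created later inherit
# the www-data group automatically:
sudo find /mnt/sdb1 -type d -exec chmod g+s {} +
```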
  9. This might be a simple question for some of you out there; at least, that's my current hope. I'm currently attempting to get Nextcloud to recognize my EXT4-formatted external hard drive, so that I can have space to host content for my members. In the "Configuration" slot, I'm assuming they're asking for the disk's mount point, but I'm too new to Linux to know how to find it correctly. I simply put /mnt in, and it accepted it. In the content area of the site, I am able to see a folder with the correct subfolders. However, when I click on them to see what's inside, they're all empty. I'm not sure how to make the content visible and have it "streamed" (for lack of a better term) straight from the hard drive. Does anyone have any suggestions on what I could do? Thanks in advance
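For finding the actual mount point from the terminal, two standard commands cover it (device names like /dev/sda1 are placeholders; yours will differ):

```shell
# Tree of all disks, their filesystems, and where they're mounted:
lsblk -o NAME,FSTYPE,LABEL,MOUNTPOINT
# Mount point of one specific device:
findmnt -n -o TARGET /dev/sda1
```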
  10. I'm following the video above to connect my Raspberry Pi to my external HDD. In the video, he explains that we'll need the drive's mount point to give it permissions to be accessed in Nextcloud. He can see the mount point in the OMV control panel > File Systems (1st attached photo), but I cannot (2nd attached photo). I'm fairly sure the mount point for my external HDD is /srv/dev-by-label-VIP, but nothing works when I put that path into Nextcloud's external storage app. Is there a terminal command I can use to see exactly where the actual mount point is? I've searched all over the forums and can't seem to find out. I've mounted the drive, but the mount point isn't displayed for me in the OMV control panel.

      I ANSWERED MY OWN QUESTION SMH: if the mount point column doesn't show for you, click the "mounted" bar and select "columns". From there, you can select to show the "mount point" column. Sometimes I surprise myself with how I can ask a question but immediately get the answer by simply looking around
  11. None of the software solutions I've looked at work properly, so I'm hoping you guys can help me with something. I planned to use the free space on the disk I used to set up an OpenMediaVault server to also hold the content, but obviously within Windows it's not mountable. I had the idea of dual-booting Win10/Peppermint 10 or Ubuntu, but for some reason I've not been able to do so without the installer freezing. So, the last thing I thought to do is find a way to mount an Ext3-formatted disk within Windows to make the content upload a bit more practical, as I'll need to put up updates often. Does anyone have any ideas that'll work? Thanks so much
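One option that might be worth a look before resorting to dual-boot, assuming a recent Windows 10/11 build with WSL2 installed: Windows can attach a physical disk to WSL and mount its Linux filesystem from there (the Linux ext4 driver also reads ext3). The drive number below is a placeholder; check yours in Disk Management first:

```shell
# Run from an elevated PowerShell/CMD prompt on Windows.
# \\.\PHYSICALDRIVE1 and the partition number are assumptions.
wsl --mount \\.\PHYSICALDRIVE1 --partition 1 --type ext4
# The filesystem then appears inside WSL under /mnt/wsl/...
# Detach the disk when finished:
wsl --unmount \\.\PHYSICALDRIVE1
```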
  12. Wow! I guess I didn't look too far outside of where they host it for you. I'll have to reinstall that plugin and give it another shot. Thanks so much for clarifying! Cheers mate
  13. Nextcloud was what I looked into for a little bit, but it seems like they have a cap on the storage space available for use with them. Also, I plan on branching out on my Discord server, and the 50-member limit is literally limiting. Not to mention the prices are outrageous lol. I was also wondering if I could just host the content straight from the 6TB external hard drive I have, instead of having it on a company's radar. I don't get the point of "shared folders" if they can't be used for remote access
  14. Long preface short: I wanted to see if I could use my Raspberry Pi 3 B+ to host content for members of my Discord server. I wasn't sure how it'd work until I found out about OpenMediaVault. I've had the Pi sitting around and wanted to see if I could use it in place of a dedicated server (I'll only have 40, give or take, members accessing content on the "hopeful" media host). I have the Pi configured, and I know the basics of using OpenMediaVault and of using the terminal to make updates and change settings: I just want to know if this was all in vain, if I can't share access with members of my server who are outside of my home network.

      Does anyone know of a plugin I could install or a process I could follow to configure my content-host server to let Discord members view only shared folders? I'm extremely new to this and I'm banking on this working; I've tried using Mega.nz, but when members share links around, the links along with my Mega accounts always seem to get disabled, even though we're not sharing anything NSFW or inappropriate. Trolls ruin good things, so a self-hosted "cloud" server is what I wanted to try out. Thanks!

      P.s. I haven't begun configuring security measures, so guidance and forum posts that might help me will also be greatly appreciated. Right now, the host server is offline because of this ~
  15. I've considered using a RAID array for the active archives, then cloning that to various other drives that would be placed off-site. Luckily, speed isn't a huge deal to me with something like an archive drive or array