Search the Community
Showing results for tags 'http'.
-
What I'll be explaining from beginning to end is the setup and network installation of a server you can host (be that bare-metal or inside a virtual machine) that lets you install various Linux distributions onto clients over the network instead of from USB, CD, or other local media. This is a rather advanced topic, so it's expected that you already have at least a fundamental understanding of certain networking concepts (IP addresses, subnets, default gateways, routers, and protocols like DHCP, HTTP, FTP, and SSH), what Linux is, and how to install it. That said, if you need additional help with networking or with installing Linux, feel free to leave a comment and I (or someone else) will help you wherever you're stuck. For the purposes of this installation I'll be using Ubuntu Server 20.04 inside a VM on a server on my network. Index: 1. Network Setup 2. TFTP & HTTP Setup 3. iPXE Setup 4. Main.ipxe Setup
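Since the index ends at the Main.ipxe stage, here is a hedged sketch of what such a script can look like. This is an illustration only: the server IP, file paths, and kernel arguments below are placeholders, not values from this guide.

```text
#!ipxe
# Hypothetical main.ipxe fetched over HTTP by the iPXE bootloader.
# 192.168.1.10 and the /ubuntu paths are placeholders.
:menu
menu Network Boot
item ubuntu  Install Ubuntu Server 20.04
choose target && goto ${target}

:ubuntu
set base http://192.168.1.10/ubuntu
kernel ${base}/vmlinuz initrd=initrd ip=dhcp url=${base}/focal-live-server.iso
initrd ${base}/initrd
boot
```

The DHCP server points clients at the TFTP server for the iPXE binary, which in turn chainloads a script like this over HTTP.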
-
- (i)pxe
- net install
- (and 4 more)
-
Hello, I am trying to enable HTTPS for remote access, which requires an SSL certificate. To do this I have been using Caddy, but I'm open to using something else. Currently my setup is an Ubuntu Server, and I use Docker + Portainer for Jellyfin. I am new to servers, so I can't figure out how to do this. I can't seem to find any tutorials, and I am a little confused by the documentation on Jellyfin's website.
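For what it's worth, a minimal Caddyfile for this kind of setup can be very short: Caddy v2 obtains and renews a Let's Encrypt certificate automatically as long as ports 80 and 443 reach it. The domain and container address below are placeholders, and this assumes Jellyfin is on its default port 8096:

```text
# Hypothetical Caddyfile; replace the domain and the backend address.
jellyfin.example.com {
    reverse_proxy 192.168.1.50:8096
}
```

With that, Caddy terminates TLS and forwards plain HTTP to the Jellyfin container, so nothing needs to change inside Jellyfin itself.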
-
As stated in the title, I want to know how to set up an HTTP server. My main questions are: What programs do I use to set up the server? What operating system should I run on it? I need the server to support PHP and SQL; those are my only requirements. Thanks for the help in advance.
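One common answer for exactly this pair of requirements is the classic LAMP stack: Linux, Apache, MariaDB/MySQL, PHP. A hedged sketch using Debian/Ubuntu package names (an assumption; names differ on other distros):

```shell
# Hypothetical setup on Ubuntu/Debian; adjust packages for your distro.
sudo apt update
sudo apt install apache2 php libapache2-mod-php php-mysql mariadb-server
sudo systemctl enable --now apache2 mariadb   # start now and on every boot
```

After that, files in /var/www/html are served over HTTP, and .php files are executed by the module.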
-
Hi there, we have all seen the USB device called "USBKill" (see https://usbkill.com/). Do we have an equivalent that works over HTTP, for example? Regards, I<3C
-
Alright, I was going to try to format this all fancy, but I gave up pretty quickly. Hi, let's get to the story and the issue I want to resolve/discuss. I am a college sophomore, and I have a dorm room. I pay around $100/month for the "Premium Internet Package", giving me 100Mb/10Mb speeds (with a nice, low ping) to the RJ-45 jack in my room, connected to my little router for Wi-Fi on my laptop and phone. But they decided they could control website access. I have no contract with them saying I won't proxy my way out; no agreements at all. So here's my situation: I want to get past the filtering, past the VPN blocking, past the well-known-proxy blocking. YouTube is blocked. Do you know how hard it is to wait until I come home on weekends to watch LTT? So, the long and short of it is that I cannot connect to the VPNs or proxies available to me. But I thought of something, and I want you to hear me out. I have a laptop sitting at home (with my grandparents) that can easily be left on, connected to power and the Internet. So I thought: could I host my OWN proxy and put it in my Android's Wi-Fi settings to bypass all of the filtering? (Yes, I know proxies slow down traffic.) I'm not the brightest bulb in the basket, but I am CompTIA A+ certified, and I believe I have a pretty decent understanding of ports and networking. The laptop is running Windows (and I cannot change that), so I would use it as the actual proxy server. My Android smartphone is an SGS5, soon to be upgraded to a OP3T. Any constructive responses are welcome; thank you for reading and for any help you might offer. ~I hate college so much...
-
Hello everyone, I have a little issue I can't figure out. I am in no way an expert in networking. My issue is that I want to access my GitLab server from anywhere, both over HTTPS and over SSH for git operations. I did figure out how to configure HTTPS, but I can't make SSH work in any way; I can access the server and use git normally over HTTPS only. I run Unraid on my server with a GitLab Docker instance and NginxProxyManager as the reverse proxy. My configuration is like this: on my router, ports 22, 80 and 443 are open and each forwarded to my main server's IP on a different port. NginxProxyManager listens on the three ports forwarded by the router. I have my own domain with a CNAME for gitlab.mydomain.com, and GitLab listens on https://gitlab.mydomain.com (of course that's just an example domain name). I tried forwarding port 22 directly to my GitLab server as well as to the nginx proxy container, and nothing works. Whatever I try, the GitLab server either returns "not authorized" or asks for the git user's password (I have an SSH key configured). For reference, everything worked before I put my GitLab server behind my domain and a reverse proxy: locally, on a specific IP, SSH access worked like a charm. I know I might not be clear in every way, but I don't know what to do to fix the issue. Thanks
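One thing worth noting: NginxProxyManager proxies HTTP(S) traffic, so SSH on port 22 can't be routed through it like a normal proxy host; it has to be forwarded straight to the GitLab container, often on a nonstandard external port. A hedged sketch of the client side, assuming a hypothetical external port 2222 forwarded directly to the container's port 22:

```text
# Hypothetical ~/.ssh/config entry on the client machine.
# Assumes the router forwards external port 2222 to the GitLab
# container's port 22, bypassing NginxProxyManager entirely.
Host gitlab.mydomain.com
    Port 2222
    User git
    IdentityFile ~/.ssh/id_ed25519
```

With an entry like this, plain `git clone git@gitlab.mydomain.com:group/repo.git` uses the right port and key automatically.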
-
I could use some help from anyone who is familiar with the HTTP server built into the ESP32 Arduino core. I keep running into a problem where the server responds to a GET request with "Header fields are too long for the server to interpret". A detailed explanation of my issue can be found on GitHub: https://github.com/espressif/arduino-esp32/issues/2983 If someone prefers to view it here on the forums instead, here is a copy-paste of it.
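If it helps, the usual culprit for that exact message in ESP-IDF's esp_http_server component is the compile-time header buffer limit. A hedged sketch of the relevant sdkconfig line; the option exists in ESP-IDF, but whether you can change it under the precompiled Arduino core without rebuilding the core (or moving to ESP-IDF directly) is another matter:

```text
# sdkconfig: raise the per-request header buffer (default is 512 bytes).
CONFIG_HTTPD_MAX_REQ_HDR_LEN=1024
```

Anything that shrinks the client's request headers (fewer cookies, shorter URLs) works around the same limit from the other side.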
-
How do I pass a username and password in a GET request with Java's HttpURLConnection? I did some searching and found this: https://stackoverflow.com/questions/27150388/how-to-pass-username-and-password-in-get-request-in-java However, it does not work. They also seem to use a different Base64 library than I do; I just use the standard java.util.Base64. The comment suggesting you put the credentials straight into the encodeToString call does not work either: String encoded = Base64.getEncoder().encodeToString(("test:server1").getBytes()); connection.setRequestProperty("Authorization", "Basic " + encoded); I've tested my webserver, and I can log in via the Firefox login prompt. Please help. Code snippet: https://hastebin.com/coqaladoke.cs
-
So I did a small packet capture on my Wi-Fi interface using Wireshark while downloading an image file. In this pcap I did not see a single HTTP request; it was mostly TLSv1.2 records and some TCP segments. Then I realized the site I downloaded the image from was using HTTPS. So I found a website that serves an image over plain HTTP and downloaded it while Wireshark was running, but still not a single HTTP request. I did observe a lot of SSDP packets in the capture saying NOTIFY * HTTP/1.1. I would like to know if there's something wrong with my procedure, or an explanation of what's going on if this is normal. Thank you
-
Good evening. I have been wanting to build myself a storage server for quite some time now. The main functions of the server will be: mass storage of my girlfriend's videos/photos (she is a photographer/video creator); storage for her to work from on her laptop/desktop; internet-attached storage (I want access to the server's files over the internet from anywhere, from any device (HFS?)); a Plex server for in-house streaming of music and movies; the option to host the occasional game server (currently playing ECO); redundancy (I will also be running an off-site backup like Backblaze monthly); and in the neighborhood of 20 TB of storage. Fortunately I just got an older Dell T320 from work (I work in the dental industry). Its hard drives have failed, and even though Dell sent us two 1 TB SAS drives, we couldn't rebuild the OS. The customer ended up buying himself a new server, seeing as this one is from late 2014. Long story short, the server contains: 32 GB of ECC RAM, a Xeon E5-24xx v2 (not quite sure which), and an 8-port SAS controller (SATA-compatible). I'll include the service tag if any of you fancy checking out the specs: http://www.dell.com/support/home/uk/en/ukbsdt1/product-support/servicetag/b29ld22/research?lwp=r What I'm wondering is: What OS should I go with? (I am quite comfortable with Linux, and I work with Windows Server on a daily basis.) What kind of RAID setup should I run? (I currently have 2 x 4 TB WD Red NAS drives, and I can buy as many more as required.) What kind of backup schedule would be sufficient? What software/solution should I use for accessing the files online? (I did try HFS on an old PC I set up as a server, and that seemed OK.) How would I go about partitioning efficiently? And upgradeability, in case I need to expand the storage down the line. Any feedback is appreciated, and I am open to any suggestions! Thanks a bunch in advance //Even
-
Hi! First off, by "date" I mean the date and time in an HTTP response. I'm coding a web server in C++ for fun, and I'm building a library to help with things I really don't want to code more than once, one of them being the date in the specific format HTTP requires, without using ctime directly. I'm almost done, but now I wonder: what is that header actually for? I've dug through Chrome back and forward and haven't been able to find anything that uses the date; I tried sending an arbitrary one and nothing happened. Of course I want to do everything right, but I'm actually clueless as to why one would ever need this, as it doesn't seem to register anywhere on the client side, and any decent server could handle it internally. Thank you.
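For context, the Date header mostly matters to caches rather than to rendering: intermediaries compare it against Expires and use it when computing a cached response's age, which is why faking it changes nothing visible in the browser. A minimal sketch of producing the required IMF-fixdate format in C++ (the function name is my own, not from the poster's code):

```cpp
#include <cassert>
#include <ctime>
#include <string>

// Format a Unix timestamp as an RFC 7231 IMF-fixdate,
// e.g. "Sun, 06 Nov 1994 08:49:37 GMT".
std::string http_date(std::time_t t) {
    std::tm tm_utc{};
    gmtime_r(&t, &tm_utc);  // POSIX thread-safe UTC conversion
    char buf[64];
    std::strftime(buf, sizeof buf, "%a, %d %b %Y %H:%M:%S GMT", &tm_utc);
    return buf;
}
```

The day and month names must be English; the default "C" locale guarantees that, so avoid calling setlocale() with anything else before formatting.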
- 3 replies
-
- http
- web server
-
(and 2 more)
Tagged with:
-
Hi. I've coded a minimal C++ web server, and now I want to start adding more things and make it as smooth and polished as possible. I'm starting with caching. I can send the typical response headers that will make the browser come back with either an If-None-Match (if I'm doing ETags) or an If-Modified-Since (if I'm using dates). What I'm asking is: which one is more efficient? I honestly don't see an easy way to implement ETags. They could be really useful for long-term caching of pages that won't change for a long period, but is it really worth it? I see no really simple way of doing it. I could store the tag in file comments (which would defeat the purpose, since you'd have to read the file to get it), or I could keep a global structure, like an array or some other object, holding every page's ETag, which would make it really hard to update pages while you're developing or editing them, because you'd either have to change that variable manually or build something to check whenever the file was modified. This would be nice for pages that are updated only occasionally and requested often, but in any other case it would be clunky. Not to mention that long-term caching can't really be that useful; what are the chances you're going to keep a page cached in a browser for, say, 1 or 2 months? The second option is Last-Modified + If-Modified-Since, checking the file's metadata for its last-modified date on every request. That doesn't seem like much of a compromise until you start getting a lot of requests, and I don't know at what point it stops being worth it compared to ETags. I figure it's still way better than sending the whole file every time, but that's not really the alternative anyway. I could just go ahead and code the database program, which would be optimized and able to handle ETags properly, but that could take a while and I want to understand this first.
Maybe I could do a sort of hybrid: configure each page as ETag or Last-Modified and handle it accordingly. But I need someone's opinion. I'm inexperienced; I don't know at what point storage is fast enough to handle reading metadata that often, or whether I should make it more scalable by keeping those ETags in memory, and whether that would even help. Thank you.
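One middle ground worth sketching: derive the ETag from the same stat() call you would need for Last-Modified anyway. Both strategies then cost exactly one metadata read per request, and nothing has to be tracked in memory or hidden in file comments. A hypothetical helper (the name and tag format are my own, not a standard):

```cpp
#include <cassert>
#include <cstdio>
#include <string>
#include <sys/stat.h>

// Build a cheap ETag from a file's mtime and size. One stat() per
// request -- the same metadata read Last-Modified needs -- but with
// ETag semantics and no hashing of file contents.
std::string etag_for(const std::string& path) {
    struct stat st {};
    if (stat(path.c_str(), &st) != 0)
        return "";  // caller simply omits the ETag header
    char buf[64];
    std::snprintf(buf, sizeof buf, "\"%llx-%llx\"",
                  static_cast<unsigned long long>(st.st_mtime),
                  static_cast<unsigned long long>(st.st_size));
    return buf;
}
```

Editing a file changes its mtime, so the tag updates itself during development with no manual bookkeeping; this is roughly the scheme several mainstream servers use for static files.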
-
Hi. I'm coding a web server in C++, reading up on request headers and testing different browsers. I am building a class that stores and generally works with the request headers' info. Everything has been pretty straightforward and working wonders, but then I came across Accept-Encoding. Firefox and Chrome both accept gzip and deflate. I don't understand what I'm supposed to do about this; sending data straight through sockets has worked so far. I've found a lot on decoding, but I can't seem to find any info on how to encode. And what are these encodings anyway? I've heard of UTF-8 and such, but this is completely unknown to me. Should I encode anything? How would I do so? Thank you.
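To clear up the mechanics: Accept-Encoding is about optional compression (gzip and deflate are zlib-family compression formats, nothing to do with character sets like UTF-8), and a server is always allowed to respond uncompressed, which is exactly why sending raw bytes has worked so far. A minimal sketch of choosing a content coding (the function name is my own; naive substring matching, so a real parser would also honour q-values like "gzip;q=0"):

```cpp
#include <cassert>
#include <string>

// Pick a Content-Encoding given the request's Accept-Encoding value.
// "identity" means "send the bytes as-is" and is always a legal choice;
// compressing is an optimisation, never an obligation.
std::string pick_encoding(const std::string& accept_encoding) {
    if (accept_encoding.find("gzip") != std::string::npos) return "gzip";
    if (accept_encoding.find("deflate") != std::string::npos) return "deflate";
    return "identity";
}
```

If you do return "gzip", you compress the body (e.g. with zlib) and set Content-Encoding: gzip plus the compressed Content-Length; otherwise nothing changes.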
-
Hi, I'm coding a C++ web server, and I have yet another question. I've been working through the HTTP protocol, but now I would like to understand the process browsers go through to obtain the images they put on the final page. I'm guessing the img tag is found, a URL is formed from src, the image is downloaded directly from the server, and the browser then displays it in the page. Given that's probably how it works, I would like to know how the rest happens. I know what regular HTTP request headers and responses look like, and I'm assuming for an image it's just about the same. But how does it work in practice? Do I just open the image and start sending it raw, packet by packet, after the response headers? Should I send the image with the response I sent in the first place, and if so, how? What exactly happens when the browser sees an img tag and realizes it needs an image from that server, at that URL? And what exactly is the server supposed to do about it? Thank you for your time.
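On the wire it really is just a second, ordinary GET: the browser parses the HTML, finds the img tag, resolves src to a URL, and requests it like any other resource. The server answers with a status line, headers, a blank line, then the raw image bytes, and Content-Length tells the browser where the body ends. A minimal sketch of assembling such a response (the function name is my own):

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Assemble the full byte string for an image response: status line,
// headers, blank line, then the raw image bytes. std::string is
// binary-safe, so `body` can hold arbitrary bytes read from the file.
std::string image_response(const std::string& body, const std::string& mime) {
    std::ostringstream out;
    out << "HTTP/1.1 200 OK\r\n"
        << "Content-Type: " << mime << "\r\n"
        << "Content-Length: " << body.size() << "\r\n"
        << "\r\n";
    return out.str() + body;  // append the bytes unmodified
}
```

You then write the whole string to the socket (in as many send() calls as needed); TCP handles the packetization, so there is no per-packet framing for you to add.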
-
I am wondering how you can do this, so I can make my router's internal IP address redirect to, let's say, Google, to stop other people on my network from attempting to log into it. Would I have to host the site on my server, or can I just redirect to Google directly?
-
From Google's official security blog: https://security.googleblog.com/2016/09/moving-towards-more-secure-web.html Nothing to really criticize here. Definitely an important step, and long overdue. It annoys me how some websites still don't use HTTPS, especially when passwords are being transmitted. I hope the markings will be bigger than just a small grey banner, though. Let's Encrypt should make it easy enough, and SSL certificates are not that expensive if you don't need something like wildcards, so I don't really see an excuse not to use SSL here.
-
I have a peculiar problem that I, for the life of me, can't figure out. I've tried who knows how many browsers, but Chrome is what I mainly use, and it can't manage to use the majority of my connection. I have 30 down / 10 up, and using the test file from the AzureSpeed storage blob download test, every browser I've tried gets anywhere from 900 kB/s on down, usually bombing to below 300 kB/s. However, if I use the Citrio browser and set its download-priority setting to high, Citrio will download the same exact file from the same location at 3.4 MB/s (my full speed). No other browser will max out the connection. (Just to cover the bases: live streaming, Netflix and such are unaffected.) If anyone has any insight, please help. =(
- 2 replies
-
- web browser
- http
-
(and 1 more)
Tagged with:
-
I am looking to make a protected subdomain for my website that forces the user to authenticate via the HTTP 401 response (preferably over an encrypted connection so no one who shouldn't have access can get it). How would I implement this on my subdomain most efficiently? Specifically, I want only me and people with the password to be able to view the content of this subdomain.
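Assuming the subdomain is served by Apache (an assumption; nginx has an equivalent auth_basic directive), HTTP Basic authentication gives exactly this behaviour: the server answers 401 with a WWW-Authenticate header until valid credentials arrive. A hedged sketch of the .htaccess, with a placeholder password-file path (create that file with `htpasswd -c`):

```text
# Hypothetical .htaccess in the subdomain's document root.
AuthType Basic
AuthName "Private area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Since Basic auth only base64-encodes the password, serve the subdomain over HTTPS so credentials aren't readable on the wire.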
-
I recently joined the forum and was wondering why the site always defaults to HTTP even when I explicitly open it through HTTPS. Every link takes me back to HTTP.
-
Came home from school, only to find I can't use my wireless to its fullest potential. I noticed YouTube and other HTTPS sites such as Reddit and the OnePlus forums worked, but anything over http:// redirects me to the U-verse page. Sadly, I'm not as well versed in networking as I should be, so I thought you guys could help. I'll provide some info below. Router: Motorola NVG589. ISP: AT&T U-verse. Connection type: I believe it's DSL. Things I've tried: changing to Google's public DNS on my PC (apparently U-verse does not let you change the router's DNS through the gateway), disabling IPv6, restarting and rebooting the router, connecting through Ethernet only to get the same results, changing channels, checking cords. TEMPORARY SOLUTION: if I connect to my VPN, I have full browsing capabilities. I find this very strange. Any guess as to what's going on? I would really appreciate any help. EDIT: 200th post
-
I have this strange problem: sometimes, after I reboot my PC, I can't access sites over HTTP. Only HTTPS sites work, and rebooting helps; if not the first time, then maybe a second or third reboot does. Any help?