
Downloading a website

Hi Everyone,

My internet pack will expire today and I'm thinking to take a break from it for a while.

I'm learning android dev, and taking material from https://developer.android.com/guide

I want to download this website so I can learn offline. But this website section has multiple pages.

Do I have to save every page to PDF?

Please share if there is a way to download them or print them all without doing it manually 🙏


33 minutes ago, Parvesh Khatri said:

Please share if there is a way to download them or print them all without doing it manually 🙏

My understanding of the way websites work is that there are assets, and a bunch of code to say what to do with those assets. Generally any hyperlinks are removed on conversion to PDF, making navigation impossible. PDF isn't the only way to save stuff, though it is likely the only way you could print it. Firefox used to have a save feature which more or less saved a file folder that included both the assets and the code separately. I don't know if that would copy an entire site or not, though.

Not a pro, not even very good. I'm just old and have time currently. Assuming I know a lot about computers can be a mistake.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


I use Offline Explorer from MetaProducts.com - it's shareware, but should be fine for a 30-day trial or so.

You would have to configure the project properly first, to guide it a bit: specify the starting address (/guide in your case), how many sub-levels it should go, and within what folders it should stay (otherwise the software would scan all the links in the pages and keep downloading from anywhere on the site, and even download stuff from other websites).

For example, looking at the links on the left, you would want only the pages from /guide/something, but there are some exceptions (for example, "Device compatibility > Support different pixel densities" goes to /training/something/somethingelse, so you'd want to allow downloading from there as well, and there's also /topic/).
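The same scoping idea carries over to command-line mirroring tools. As a rough sketch (the exact directory list is an assumption based on the sections mentioned above; adjust it to what you actually browse), wget can be restricted to a few directory trees like this:

```shell
# Mirror only the /guide, /training, and /topic sections of developer.android.com.
# --mirror          : recursive download with timestamping (so re-runs resume)
# --convert-links   : rewrite links so the saved pages work offline
# --page-requisites : also fetch the CSS/JS/images each page needs
# -I                : restrict recursion to these directories only
wget --mirror --convert-links --page-requisites \
     -I /guide,/training,/topic \
     https://developer.android.com/guide
```

Without the `-I` restriction, the crawl would follow links into every other section of the site, which is exactly the runaway behavior described above.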

I'll give it a try and see how far I can take it... I'll get back to this thread within a few hours, maybe.


1 hour ago, Parvesh Khatri said:

Please share if there is a way to download them or print them all without doing it manually 🙏

HTTrack (on Windows) and wget (on Linux) are the most used by far. There are honestly way too many guides online, so I'm not going to google them for you. These tools are very configurable: you can choose to download whole websites or just sections, download or ignore assets to make it faster if you need text only, and many other things.
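For reference, HTTrack also works from the command line on Linux. A minimal invocation for this case might look like the following (the filter patterns are an assumption matching the /guide, /training, and /topic sections discussed above; tweak them as needed):

```shell
# Mirror the Android guide section into ./android-docs.
# The '+' filters whitelist URL patterns; everything not matched is skipped.
httrack "https://developer.android.com/guide" \
        -O ./android-docs \
        "+developer.android.com/guide/*" \
        "+developer.android.com/training/*" \
        "+developer.android.com/topic/*"
```

The `-O` flag sets the output directory, and the quoted `+pattern` arguments are HTTrack's scan-rule filters, which play the same role as the folder restrictions in a GUI project.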

DESKTOP PC - CPU-Z VALID | i5 4690K @ 4.70 GHz | 47 x 100.2 MHz | ASUS Z97 Pro Gamer | Enermax Liqmax II 240mm | EVGA GTX 1070Ti OC'd

HOME SERVER | HP ProLiant DL380 G7 | 2x Intel Xeon X5650 | 36GB DDR3 RDIMM | 5x 4TB LFF Seagate Constellation 7.2K | Crucial MX500 250GB | Ubuntu Server 20.04


1 hour ago, LionSpeck said:

HTTrack (on Windows) and wget (on Linux) are the most used by far.

I'm using HTTrack right now on Linux; it's stuck on downloading. Not your fault, but it's not working for me.


1 minute ago, Parvesh Khatri said:

stuck on downloading

Could be access denied, a slow connection, the server blocking your IP, any number of things. You could try to re-run it in verbose mode.
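If you end up switching to wget instead, its run can be captured to a log file for inspection; the flags below are standard wget options, and the log file name is just an example:

```shell
# Re-run with full debug output written to mirror.log, so you can see
# exactly where it stalls (DNS lookup, redirect, 403, robots.txt, etc.)
wget --mirror --convert-links --page-requisites \
     --debug -o mirror.log \
     https://developer.android.com/guide

# Watch progress from another terminal:
tail -f mirror.log
```

Stalls on this kind of mirror are often rate-limiting by the server; adding `--wait=1` between requests is a common workaround.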



You can use the HTTrack PC software to download the complete website.

