Download a website in Ubuntu

We will use wget to save an entire website. The syntax is:

wget <options> http://yoursitename.com

The following are useful options you can use with wget.

-c / --continue  Resume an interrupted download.
-nd / --no-directories  Don't create a directory hierarchy.
-P / --directory-prefix=  Save downloaded files to the specified directory.
-U / --user-agent=  Spoof the user agent string.
-r / --recursive  Go crazy :) (follow links and download recursively).
-l / --level=  Use 0 for infinite depth, or a number greater than 0 for limited depth.
-k / --convert-links  Modify links inside downloaded files to point to the local files.
-p / --page-requisites  Get all images, CSS, and JS files which make up the web page.
-np / --no-parent  Don't download parent directory contents.
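Putting several of these options together, here is a minimal sketch of a full mirroring run. The URL and the destination directory name are placeholders; substitute your own:

```shell
# Mirror a site for offline viewing, resuming if the transfer
# was interrupted. "http://example.com/" and "site-copy" are
# placeholder values, not real targets.
wget \
  --continue \
  --recursive --level=0 \
  --convert-links \
  --page-requisites \
  --no-parent \
  --directory-prefix=site-copy \
  http://example.com/
```

With --convert-links applied, the saved pages can be opened straight from the site-copy directory in a browser, with no web server needed.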

I normally use the following command to download a website:

wget -r --level=0 --convert-links --page-requisites --no-parent <url>

6 comments:

Anonymous said...

Thanks! I used this to save a copy of our website that we created to chronicle our trip to China to adopt our son. I want to use the space for other things now, but didn't want to lose the memories recorded on the website. One detail though: you list "-convert-links" as a parameter, but it needs to be "--convert-links" in order for the links to be changed to match the local file system. Thanks for the help!

Brett

Johnny said...

Is there some way to prevent some particular file format from being downloaded..? Like, the images etc.

Anonymous said...

I normally use following command to download a website.

wget -r -v --level=0 --convert-links --page-requisites --no-parent

because it is verbose, so you can see what is going on

shrek said...

Good job, friend. Really good for saving the HTML tutorials.

Thank you very much.

Anonymous said...

There is a typo in the example of the command you usually use. You put '-convert-links'. It should be '--convert-links' (two dashes instead of one). It doesn't convert the links unless this is right.

Thanks for this post.

Francois said...

That is fantastic! I am working on a project to create a presentation based on a website and hopefully this will help with the files I need to do that.

Thank you!