How to Download Entire Website Using WGET on Linux

Sometimes we want to download a website or make an offline version of it. On a Linux system you can do this with the wget command. It is handy when we want to keep an offline copy of a useful website or tutorial so we don't need to visit it again. The command for downloading an entire website is as follows:

$ wget \
   --recursive \
   --no-clobber \
   --page-requisites \
   --html-extension \
   --convert-links \
   --restrict-file-names=windows \
   --domains yourwebsite.com \
   --no-parent \
     www.yourwebsite.com/destination/root



With this command, wget downloads everything under www.yourwebsite.com/destination/root and saves it into a directory named after the host (./www.yourwebsite.com/) in the current working directory.
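If the site is large or objects to rapid crawling, you may also want to slow wget down. Below is a minimal sketch using the same placeholder domain; it adds a pause between requests and caps the download speed (the values 2 and 200k are just examples to adjust):

$ wget \
   --recursive \
   --no-parent \
   --wait=2 \
   --random-wait \
   --limit-rate=200k \
     www.yourwebsite.com/destination/root

Here --wait sets the delay in seconds between retrievals, --random-wait varies that delay, and --limit-rate caps the bandwidth used.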


The meanings of the options are as follows (a shorter equivalent command is shown after the list):
  • --recursive : download the entire website.
  • --no-clobber : don't overwrite existing files (useful when an interrupted download is resumed).
  • --page-requisites : get all the elements that compose the page (images, CSS, etc.).
  • --html-extension : save files with the .html extension.
  • --convert-links : convert links so that they work locally, offline.
  • --restrict-file-names=windows : modify filenames so that they also work on Windows.
  • --domains yourwebsite.com : don't follow links outside yourwebsite.com.
  • --no-parent : don't follow links outside the directory destination/root.

Reference: Linux Journal
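Most of these long options also have short forms, so the whole command can be typed more compactly. A rough equivalent, assuming GNU Wget (note that -m enables --mirror, which uses timestamping instead of --no-clobber, so changed files are re-downloaded rather than skipped):

$ wget -m -k -E -p -np \
   --domains yourwebsite.com \
     www.yourwebsite.com/destination/root

Here -m is --mirror, -k is --convert-links, -E is --html-extension (called --adjust-extension in newer releases), -p is --page-requisites, and -np is --no-parent.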
