Wget [basic]

Homitsu
  12 years ago

 

- INTRO -

GNU Wget (or just Wget, formerly Geturl) is a computer program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from World Wide Web and get. It supports downloading via the HTTP, HTTPS, and FTP protocols.

Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom in popularity of the Web, which led to its wide use among Unix users and its distribution with most major Linux distributions. Written in portable C, Wget can be easily installed on any Unix-like system and has been ported to many environments, including Microsoft Windows, Mac OS X, OpenVMS, MorphOS, and AmigaOS.

It has been used as the basis for graphical programs such as GWget for the GNOME Desktop and KGet for the KDE Desktop.

[Wikipedia]

 

To test the command lines you must:

  • have an internet connection;
  • be able to create new folders in your home folder;
  • be able to open a terminal;
  • be able to change into a folder from the terminal ["cd" command]; a short example follows this list.
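
For example [the folder name "Wget-test" is just a placeholder, any name will do]:

mkdir ~/Wget-test
cd ~/Wget-test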

 

- BASIC USE -

The following command line downloads an entire website, rewrites the links so that they work locally, and records each transaction in a log file.

wget -m -k -K -E http://www.yoursite.xy/ -o /home/yourmintusername/Wget-log
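
In this command:

  • -m [--mirror] turns on recursion and time-stamping, which is what mirroring needs;
  • -k [--convert-links] rewrites the links in the downloaded pages so they work locally;
  • -K [--backup-converted] keeps a copy of each original page before its links are converted;
  • -E [--adjust-extension] adds the ".html" extension to pages that need it;
  • -o writes the log to the given file instead of printing it on the terminal.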

 

The following command line downloads all the JPEG images of a site into a folder. You must create the folder before launching the command and move inside it with "cd".

wget -r -l1 --tries=inf --no-parent -N -A "*.jpg" http://www.yourjpegsite.xy/
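
Here -r with -l1 limits the recursion to one level, --tries=inf retries failed downloads indefinitely, --no-parent keeps Wget from climbing above the starting directory, -N enables time-stamping so unchanged files are not downloaded again, and -A "*.jpg" accepts only files matching that pattern.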

 

The following command line is great for backing up your blog [it works perfectly with Tumblr]. You must create the folder "Blog-BackUp" before launching the command and move inside it with "cd".

wget -m -k -K -E --random-wait -r -p -e robots=off -U mozilla http://blog2backup.xyz
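
The flags not covered above: --random-wait varies the pause between requests, -p [--page-requisites] also fetches the images, style sheets and other files needed to display each page, -e robots=off makes Wget ignore the site's robots.txt, and -U mozilla sends a browser-like user-agent [some sites refuse the default Wget identification]. The -r is already implied by -m.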

 

- MORE INFO -