So you need to download huge files over a slow connection, and your browser isn’t doing a very good job. You need partial-file and resume support. You hate fancy GUIs that eat memory. You need to mirror an entire blog or site. You need a powerful download manager that does all this and more.
If any of the above sounds familiar, then wget is definitely for you.
GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely used Internet protocols. It can resume aborted downloads, using the FTP REST command and the HTTP Range header.
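A few invocations covering the use cases above — resuming, mirroring, and throttling on a slow link. The URLs are placeholders; the flags are standard wget options:

```shell
# Resume a partial download (-c / --continue)
wget -c https://example.com/big-file.iso

# Mirror an entire site (-m / --mirror: recursion + timestamping)
wget -m https://example.com/blog/

# Throttle bandwidth so the download doesn't saturate a slow connection
wget --limit-rate=200k https://example.com/big-file.iso
```

All three can be combined in a single command; wget simply applies each option to every URL it retrieves.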
The -c (--continue) option continues getting a partially-downloaded file. This is useful when you want to finish a download started by a previous instance of Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
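The resume offset is nothing more than the byte length of the local file, which you can check yourself. A minimal sketch, using a 1 KiB dummy file as a stand-in for a real partial download:

```shell
# Create a fake "partial download" of exactly 1024 bytes
printf '%1024s' '' > ls-lR.Z

# wget -c asks the server to resume from an offset equal to the local
# file's size -- here 1024, sent as "REST 1024" over FTP or as an
# HTTP "Range: bytes=1024-" request header.
offset=$(wc -c < ls-lR.Z)
echo "resume offset: $offset"
```

Note that this only works if the local file really is a prefix of the remote one; if the remote file changed since the first attempt, the result will be corrupt.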
More info here: