Do you need to download huge files over a slow connection? Is your browser not doing a very good job? Do you need partial-download and resume support? Do you hate fancy GUIs that eat memory? Do you need to mirror an entire blog or site? Do you need a powerful download manager that does all this and more?
If you answered yes to any of the above questions, then wget is definitely for you.
GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It can resume aborted downloads, using REST and RANGE.
The -c option tells Wget to continue getting a partially-downloaded file. This is useful when you want to finish a download started by a previous instance of Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget will assume that it is the first portion of the remote file, and will ask the server to continue the retrieval from an offset equal to the length of the local file.
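The same approach extends to the other needs mentioned at the start. As a sketch (example.com is a placeholder host; the flags are standard Wget options), a download over a flaky, slow connection can be made to retry indefinitely by combining -c with --tries=0 and a network timeout:

wget -c --tries=0 --timeout=60 http://example.com/huge-file.iso

To mirror an entire blog or site, --mirror turns on recursion and timestamping, --convert-links rewrites links so they work locally, --page-requisites fetches the images and stylesheets each page needs, and --no-parent keeps Wget from wandering above the starting directory:

wget --mirror --convert-links --page-requisites --no-parent http://example.com/blog/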