When trying to determine the best fit for a download manager, I eventually decided upon using wget for a number of reasons. After looking at my example, you may decide to give it a try.
Some of the reasons I liked this solution:
- No need to integrate a download manager with a browser (or several browsers).
- Ability to log in to a remote site.
- Easy integration with CRON or AT, so that downloads can be scheduled for evenings and weekends.
- It also seemed quite simple!
First, a note for those behind a proxy server: entering the following line in a terminal window defines the proxy that wget will use (wget does not automatically pick up the proxy specified in your browser). Obviously, if you don't use a proxy for your browser, you won't need this command.
tux:~ # export http_proxy="http://your-proxy.your-domain.com:1234"
Where "your-proxy.your-domain.com:1234" would be the proxy settings used in your browser.
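It's worth confirming the variable actually took effect before running wget. A quick sanity check, using the same placeholder proxy address as above:

```shell
# Define the proxy for this shell session (placeholder address).
export http_proxy="http://your-proxy.your-domain.com:1234"

# Print it back; this should echo the proxy URL you just set.
echo "$http_proxy"
```

Remember that `export` only affects the current shell session; if you log out, you'll need to set it again (or add the line to your shell startup file).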
For Single Use--Right Now!
After logging in to the destination site, right-click each link you want, copy it, and paste the links one at a time into a text file; surround each link with single quotes and put a space between links. If it is a one-time download, paste the links you just copied after the user and password info in the command below. That command downloads the desired files into the present working directory on the Linux host. (Note that the entire command is on one line and simply wraps here. I'm not sure of the length limit, but I have used this with what seem to me to be incredibly long strings.) If you want to schedule it for later, either once or many times, you can paste the full command into a file and run it as a bash script (below) via AT or CRON.
tux:~ # wget --http-user=yourlogin --http-password=yourpassword 'url.domain.com/path/to/first/file' 'url.domain.com/path/to/second/file' 'url.domain.com/path/to/third/file' 'url.domain.com/path/to/fourth/file' 'url.domain.com/path/to/fifth/file'
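If the list of URLs grows long, wget can also read them from a file with its `-i` option instead of taking them all on the command line. A sketch, assuming a file named `urls.txt` (the name is arbitrary):

```shell
# Put one URL per line in a plain text file; no quoting is needed here,
# since each line is treated as a single URL.
cat > urls.txt <<'EOF'
http://url.domain.com/path/to/first/file
http://url.domain.com/path/to/second/file
EOF

# Then point wget at the file with -i; the credentials work the same
# as in the inline version above:
#   wget --http-user=yourlogin --http-password=yourpassword -i urls.txt
```

This also makes it easy to edit the download list without touching the rest of the command.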
Sample bash script to run while you are at the beach or skiing with your family. (beach.sh)
#!/bin/bash
##Change to the directory the files should be downloaded to (path is an example).
cd /path/to/download/directory
##The command specified above starts the download.
wget --http-user=yourlogin --http-password=yourpassword 'url.domain.com/path/to/first/file' 'url.domain.com/path/to/second/file' 'url.domain.com/path/to/third/file' 'url.domain.com/path/to/fourth/file' 'url.domain.com/path/to/fifth/file'
##To fetch another batch into a different directory, cd again and repeat
##the wget command with the URLs for that location, and so on.
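Before handing the script to AT or CRON, it's worth making it executable and giving it a quick syntax check. A sketch, using a stand-in script body (the `/tmp` path and echo are just for illustration; your real script would hold the wget commands):

```shell
# Create a stand-in beach.sh (replace the body with your real commands).
cat > beach.sh <<'EOF'
#!/bin/bash
cd /tmp || exit 1
echo "downloads would start here"
EOF

chmod +x beach.sh   # make the script executable
bash -n beach.sh    # syntax-check it without downloading anything
```

`bash -n` parses the script without executing it, so a stray typo shows up now rather than at noon on Saturday.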
If it is a job you will want to run occasionally, and the URLs may change from time to time, use the AT command to schedule the "beach.sh" bash script. That is my use here: I'll run it starting noon tomorrow (Saturday) (until ?) and download 50 GB or more of ISOs and such while Internet usage is low. CRON would be the choice if you wanted to download the same files on a regular schedule, but I will use AT here.
I want it to start when I'm at the beach and out of cell range!
Start the AT daemon, schedule the time, and then enter your command into it.
tux:~ # at noon tomorrow
at> bash /path/to/beach.sh
##Exit the at> prompt with a <Ctrl-D> to queue the job.
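For the CRON route mentioned above, a crontab entry along these lines would run the script every Saturday at noon (the path, log file, and times here are examples, not from the article); add it with `crontab -e`:

```shell
# minute hour day-of-month month day-of-week  command
# (day-of-week 6 = Saturday)
0 12 * * 6 /home/you/beach.sh >> /home/you/beach.log 2>&1
```

Redirecting output to a log file lets you check what happened while you were away.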
Tomorrow you will be at the beach or skiing while this runs for you. I hope you enjoy the beach/skiing and find that this has worked for you when you get back. :0)
Disclaimer: As with everything else at Cool Solutions, this content is definitely not supported by Novell (so don't even think of calling Support if you try something and it blows up).
It was contributed by a community member and is published "as is." It seems to have worked for at least one person, and might work for you. But please be sure to test, test, test before you do anything drastic with it.