
Re: Re: downloaders for linux



On Fri, 23 Mar 2001, Biju Chacko wrote:

 |On Thu, Mar 22, 2001 at 04:22:47PM +0530, Atul Chitnis wrote:
 |> dump all the URLs into a text file one after the other,  then run
 |> wget -b -i <listfilename>
 |
 |Will they be downloaded parallely or serially?
They will be downloaded serially... but you have the option of resuming an
interrupted download, etc.
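
For example, if a download gets cut off halfway, something like this should
pick it up where it left off (the URL is just an example):

wget -c http://tutorials/javatut1.html

curl has a similar --continue-at (-C) option, IIRC.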

 |While on the topic -- can wget do stuff akin to Getright's segmented downloads?
 |That is, download different parts of the same file from multiple FTP sites? I
 |don't know whether this actually makes a difference...

You can also try curl, which is more feature-rich than wget. With curl you can
download a whole series of files...

like http://tutorials/javatut1.html - http://tutorials/javatut100.html

using a single command like

curl http://tutorials/javatut[1-100].html -o javatut#1.html

It also supports SSL, cookies, and what not!
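
For instance, to send a cookie along with a request (the cookie name/value
here are made up):

curl -b "session=abc123" -o javatut1.html http://tutorials/javatut1.html

and -c cookies.txt makes it save whatever cookies the server sets into a file.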

(You can even download from sites that don't allow hotlinking! Though I think
this can be done with wget too.)
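
The hotlinking trick is basically faking the Referer header; something along
these lines should work with either tool (the URLs are just examples):

curl -e http://tutorials/index.html -o javatut1.html http://tutorials/javatut1.html
wget --referer=http://tutorials/index.html http://tutorials/javatut1.html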

Kingsly



	        .:: Kingsly John                ICQ 14787510 ::.
               --------------------------------------------------
            .:: Linux 2.4.2 #8 Wed Feb 28 12:41:38 IST 2001 i686 ::.
            --------------------------------------------------------