
Re: Re: downloaders for linux

On Thu, Mar 22, 2001 at 04:22:47PM +0530, Atul Chitnis wrote:
> Here is a better way:
> dump all the URLs into a text file one after the other,  then run
> wget -b -i <listfilename>
> and go away ;-)
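A minimal sketch of the quoted workflow (the file name `urls.txt` and the example URLs are placeholders, not from the original mail). Note that `-b` only backgrounds wget and logs to `wget-log`; the URLs in the list are still fetched one after another:

```shell
# Dump the URLs into a text file, one per line (hypothetical examples).
cat > urls.txt <<'EOF'
ftp://ftp.example.org/pub/file1.tar.gz
http://www.example.org/file2.tar.gz
EOF

# -b : go to background immediately, logging to wget-log
# -i : read the list of URLs from the given file (fetched serially)
# Guarded so the sketch is harmless on a machine without wget installed.
command -v wget >/dev/null && wget -b -i urls.txt || true
```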

Will they be downloaded in parallel or serially?

While on the topic -- can wget do anything akin to GetRight's segmented downloads?
That is, download different parts of the same file from multiple FTP sites? I
don't know whether this actually makes a difference...
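For what it's worth, a sketch of what "segmented" means: a GetRight-style client splits a file of known size into byte ranges and fetches each range from a different mirror (e.g. via HTTP `Range:` headers or FTP `REST`). wget has no such mode; the range arithmetic below is just a hypothetical illustration:

```shell
#!/bin/sh
# split_ranges: print inclusive start-end byte ranges that partition a
# file of SIZE bytes into PARTS segments, one segment per mirror.
# (Illustrative only -- the function name and output format are made up.)
split_ranges() {
    size=$1; parts=$2
    base=$((size / parts)); extra=$((size % parts))
    start=0; i=0
    while [ "$i" -lt "$parts" ]; do
        len=$base
        [ "$i" -lt "$extra" ] && len=$((len + 1))
        echo "$start-$((start + len - 1))"
        start=$((start + len))
        i=$((i + 1))
    done
}

# Each printed range would become one request, e.g. "Range: bytes=0-24".
split_ranges 100 4
```

Whether it helps depends on where the bottleneck is: it wins when each mirror throttles per connection, and does nothing when your own link is already saturated.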


Biju Chacko        | biju@xxxxxxxxxxx (work)            
Exocore Consulting | biju_chacko@xxxxxxxxx (play)
Bangalore, India   | http://www.exocore.com