http downloading of big files
Mark Purcell
mark at purcell.aaa.net.au
Mon Sep 10 23:29:11 EST 2001
On Mon, Sep 10, 2001 at 11:02:43PM +1000, simonb at webone.com.au wrote:
> Well,
> I'm trying to download a 50MB file
> with http...
> How can I get wget (or anything)
> to resume where it (inevitably) stopped
> last time?
Why don't you try `wget -c`?
-c  --continue-ftp
       Continue retrieval of FTP documents, from where it was
       left off.  If you specify "wget -c
       ftp://sunsite.doc.ic.ac.uk/ls-lR.Z", and there is
       already a file named ls-lR.Z in the current directory,
       wget will continue retrieval from the offset equal to
       the length of the existing file.  Note that you do not
       need to specify this option if the only thing you want
       is wget to continue retrieving where it left off when
       the connection is lost - wget does this by default.
       You need this option when you want to continue
       retrieval of a file already halfway retrieved, saved by
       other FTP software, or left by wget being killed.  The
       -c option is also applicable for HTTP servers that
       support the `Range' header.
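For HTTP, resuming hinges on that `Range' header: the client checks how many bytes it already has on disk and asks the server for only the remainder. Here is a minimal Python sketch of the mechanism wget -c relies on, run against a throwaway local server; all names in it (RangeHandler, resume_download, the file names) are illustrative, not part of wget itself.

```python
# Sketch of what `wget -c` does over HTTP, against a throwaway local
# server that honours the Range header.  All names here are
# illustrative, not part of wget.
import http.server
import os
import threading
import urllib.request

PAYLOAD = b"0123456789" * 1000  # the "big file" being served


class RangeHandler(http.server.BaseHTTPRequestHandler):
    """Minimal server that supports `Range: bytes=N-` requests."""

    def do_GET(self):
        rng = self.headers.get("Range")
        if rng and rng.startswith("bytes="):
            start = int(rng.split("=")[1].rstrip("-"))
            body = PAYLOAD[start:]
            self.send_response(206)  # Partial Content
            self.send_header(
                "Content-Range",
                f"bytes {start}-{len(PAYLOAD) - 1}/{len(PAYLOAD)}",
            )
        else:
            body = PAYLOAD
            self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass


def resume_download(url, dest):
    """Fetch url into dest, resuming from any partial file on disk."""
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url)
    if start:
        req.add_header("Range", f"bytes={start}-")  # ask for the tail only
    with urllib.request.urlopen(req) as resp:
        # 206 means the server honoured Range: append the missing tail.
        # 200 means it ignored Range: rewrite the file from scratch.
        mode = "ab" if resp.status == 206 else "wb"
        with open(dest, mode) as out:
            out.write(resp.read())


if __name__ == "__main__":
    srv = http.server.HTTPServer(("127.0.0.1", 0), RangeHandler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{srv.server_port}/big-file"

    dest = "partial.bin"
    with open(dest, "wb") as f:
        f.write(PAYLOAD[:3000])  # pretend the first attempt died early
    resume_download(url, dest)

    assert open(dest, "rb").read() == PAYLOAD
    os.remove(dest)
    srv.shutdown()
```

The 200-vs-206 check is the important part: a server that does not support Range will reply 200 with the whole file, in which case appending would corrupt the download, so the sketch starts over instead.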
Mark
More information about the linux mailing list