[NBLUG/talk] FTPing large files
jezra at jezra.net
Thu Jan 7 18:39:42 PST 2010
On Thu, 07 Jan 2010 17:52:49 -0800
sean machin <smachin1000 at gmail.com> wrote:
> Hi All,
> I'm writing a python script for my Centos server which (among other
> things), tries to FTP a large (7GB) archive image to another server
> across the WAN.
> My script calls the curl program to do the upload, but curl always
> seems to fail after a few hundred MB.
> Any ideas on how best to transfer this large file? I do not have SSH
> access BTW so can't use scp.
> Thanks :)
Personally, I would use wget -c ftp://file/to/download for a
transfer that is likely to be interrupted. The "-c" flag tells wget
to pick up where it left off, so re-running the same command will
fetch only the bytes that are still missing.