[NBLUG/talk] FTPing large files
smachin1000 at gmail.com
Fri Jan 8 08:30:47 PST 2010
Thanks for all the responses. I am using curl to do the FTP upload, to
my personal web account hosted by 1&1. My home ISP is Comcast cable.
Here's a sample of the exact command I run and the error message I get.
[sean at localhost ~]$ curl -T /tmp/backup/svn.tar.gz -u uxxxxx:xxxxx
  % Total    % Received % Xferd  Average Speed   Time    Time     Time
                                 Dload  Upload   Total   Spent    Left
  2 6497M    0     0    2  157M      0   133k 13:52:17  0:20:12
curl: (55) select/poll returned error
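One workaround worth trying (a sketch only, not tested against 1&1's server) is to wrap curl in a retry loop using its `-C -` option, which asks the FTP server how much of the file it already has and resumes the upload from that offset, much like wget's `-c` does for downloads. The host, path, and credentials below are placeholders; the `runner` parameter exists only so the loop can be exercised without a live server:

```python
import subprocess
import time

def upload_with_resume(local_path, url, user_pass,
                       max_tries=20, delay=30, runner=subprocess.call):
    """Retry a curl FTP upload until it completes.

    `-C -` makes curl query the server for the size already stored
    and continue from that offset, so each retry only sends the
    bytes the previous attempt didn't finish.
    """
    cmd = ["curl", "-C", "-", "-T", local_path, "-u", user_pass, url]
    for _ in range(max_tries):
        if runner(cmd) == 0:   # curl exits 0 on a completed transfer
            return True
        time.sleep(delay)      # give the connection a moment to recover
    return False
```

Note that some shared hosts don't support the FTP REST command; in that case curl may refuse to resume and each retry would start from zero, so it's worth a small manual test first.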
> I think the right focus is why it's failing. That's very odd, as I've
> used FTP to transfer multi-gigabyte files without difficulty.
> Do you have an error message that FTP is generating? Which FTP
> program are you using? Is the destination server a generic web host?
> They might be quashing the connection purposefully.
> - Chris
> jezra wrote:
>> On Thu, 07 Jan 2010 17:52:49 -0800
>> sean machin <smachin1000 at gmail.com> wrote:
>>> Hi All,
>>> I'm writing a python script for my Centos server which (among other
>>> things), tries to FTP a large (7GB) archive image to another server
>>> across the WAN.
>>> My script calls the curl program to do the upload. However, curl
>>> always seems to fail after a few hundred megabytes.
>>> Any ideas on how best to transfer this large file? I do not have SSH
>>> access BTW so can't use scp.
>>> Thanks :)
>>> talk mailing list
>>> talk at nblug.org
>> Personally, I would use wget -c ftp://file/to/download for a
>> download that is likely to be interrupted. The "-c" flag allows wget
>> to pick up where it left off, so re-running wget -c
>> ftp://file/to/download will only fetch the missing bytes.