Download a large number of files using lftp - a 9998-file issue

I use lftp to transfer files between two servers. There is a folder with more than 20,000 files on the remote server, but the lftp client transferred just 9998 files.
Are there any config settings that could solve this problem? Or is something wrong with my command?

I have searched a lot, but still have no idea. Thank you for helping me out~

P.S. I use the command mirror -c remote_folder for that transfer.
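
For reference, the full session looks roughly like this (USER, PASS, and the host are placeholders):

lftp -u USER,PASS ftp.example.com

Then at the lftp prompt:

mirror -c remote_folder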

Check for:

  1. Are you out of disk space?
  2. Are you getting any error messages when you run lftp? If so, post them.

1/ Enough disk space.
2/ No error messages are output.

And I found that the result of cls | wc -l is 9998, while the result of ls -l | wc -l is 10000. I don't know the internals, but I think lftp decides which files to download based on the result of cls | wc -l: it thinks there are just 9998 files, so it downloads only 9998 files.
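
To compare the two listings directly, both can be redirected to local files from inside lftp and counted there. A quick sketch, with placeholder file names:

cls -1 > cls_list.txt
ls > ls_list.txt
!wc -l cls_list.txt ls_list.txt

The ! prefix runs a local shell command, so the counts come from the saved listings rather than the live connection.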

My guess is that lftp is facing some sort of file download limit, FD limits, and so on. Sometimes the FTP server also puts limits on how many files one can download, due to OS limits. Are those files in the same directory or in sub-directories? I tried to test it against my local FTP server, but it has at most 8970 files, so I am unable to reproduce your problem:

ls -R | wc -l

Are you aware of the strace command and its usage? It can help debug such weird issues. Another option is to look into lftp's debug command:

lftp ftp.somewhere.dom

Now type:

help debug
debug -o error.txt -c -p t

Now run your commands and see what the error.txt log says when it breaks at mirror, cls/ls, or any other lftp command.
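
The same debug logging works non-interactively by feeding the commands on stdin, so the log captures the whole mirror run. A rough sketch (USER, PASS, host, and path are placeholders):

lftp -u USER,PASS ftp.somewhere.dom <<'EOF'
debug -o error.txt -c -p t
cd /public_html/wp-content/uploads/2018
mirror -c 10
EOF

If that shows nothing useful, running the whole client under strace (for example, strace -f -o lftp.trace lftp ...) records every system call for later inspection.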

My guess is that lftp is facing some sort of file download limit, FD limits, and so on. Sometimes the FTP server also puts limits on how many files one can download, due to OS limits.

The FTP server is a GoDaddy server, and I checked with FileZilla, which listed the whole set of 20,000+ files, so I think the FTP server does not limit the number of files listed.

Are those files in the same directory or in sub-directories?

The target folder is flat, with 20,000+ images and no sub-folders; it is a WordPress site that organizes folders in a date format, such as 2018/08, 2019/09.

debug -o error.txt -c -p t

I am running it right now and waiting for some output~ :smiley:

lftp a@b:/public_html/wp-content/uploads/2018> mirror -c 10
Total: 1 directory, 9998 files, 0 symlinks
New: 9998 files, 0 symlinks
951119231 bytes transferred in 2649 seconds (350.7K/s)
lftp a@b:/public_html/wp-content/uploads/2018> ls -l 10|wc -l
10000
`ls -l 10' at 0 [Changing remote directory...]

As you can see, 9998 images were downloaded, but there are 24697 images in the 2018/10 folder in total.
And no error log output. :joy:

I also tried this command:

wget ftp://ip:port --ftp-user=a --ftp-password=b -r -m --tries=10

The result is also 9998.

It is possible that there is a limit in the lftp client itself. Try downloading files whose names start with a, b, c, … z and then the digits 0, 1, and so on. Use a for loop as follows:

for w in {a..z}
do
  lftp -e "mget $w*;exit" -u USER,PASS ftp.somewhere.dom/public_html/wp-content/uploads/2018/
done

For files starting with 0…9:

for w in {0..9}
do
  lftp -e "mget $w*;exit" -u USER,PASS ftp.somewhere.dom/public_html/wp-content/uploads/2018/
done

Remember to use three loops (or combine the ranges into one, as sketched after this list):

  • {a..z}
  • {A..Z}
  • {0..9}
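
All three ranges can also be combined into one loop. A sketch using the same placeholder USER, PASS, and host as above:

for w in {a..z} {A..Z} {0..9}
do
  lftp -e "mget ${w}*;exit" -u USER,PASS ftp.somewhere.dom/public_html/wp-content/uploads/2018/
done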

Another option is to get a list of sub-directories and mirror them one by one. Basically, write a small script and be done with it.
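
For example, something along these lines, assuming the date-named WordPress layout described earlier (USER, PASS, and the host are placeholders):

for d in 2018/{01..12}
do
  lftp -e "mirror -c /public_html/wp-content/uploads/$d $d;exit" -u USER,PASS ftp.somewhere.dom
done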

Are you maybe out of inodes on one side or the other?
That doesn't normally happen with large files, but with many small ones it is sometimes the case.

df -hi

I think it is better to generate a tar file on the source server and transfer that one big file instead of 20,000 files.
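
With shell or cPanel access on the source server, a rough sketch (paths, USER, PASS, and host are placeholders):

# on the source server:
tar -czf uploads-2018.tar.gz -C public_html/wp-content/uploads 2018
# then fetch the single archive:
lftp -e "get uploads-2018.tar.gz;exit" -u USER,PASS ftp.somewhere.dom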

Well, I finally downloaded them after tarring the folders one by one; manual, but it worked.
Maybe in the coming days I will find the reason, but there is not enough time for now.

Thank you for your help~~~ Nice to discuss with you, nixcraft.

I don't have permission to SSH into the GoDaddy server right now, so I can't check the inodes there, but I know the server the FTP client runs on has not reached its inode limit, at least.

Thank you for your tip~ Have a nice day, sk1u!

Yes, I chose to tar them through cPanel in the end; luckily I still have the permission to do the tar operation.

Thank you for your help~ Have a nice day, Diego_Spano~

