
Possible Duplicate:
what is the fastest and most reliable way of transferring a lot of files?

I am currently attempting to transfer over 1 million files from one server to another. Using wget seems extremely slow, probably because it starts a new transfer only after the previous one has completed.

Question: Is there a faster non-blocking (asynchronous) way to do the transfer? I do not have enough space on the first server to compress the files into a tar.gz and transfer them over. Thanks!

Nyxynyx

2 Answers


Put the files on a hard drive and ship it via FedEx, UPS, DHL, etc.

Michael Hampton

  • Run one rsync process per directory until you saturate your network link. Script it so that a new rsync process is triggered when the previous one ends.
  • or, run one rsync process per unique first character of the filename, using include rules.
  • or, run one rsync process per unique first + second character combination of the filename, using include rules.

Basically, it all comes down to rsync: run however many parallel processes it takes to saturate your network link. A rough sketch of both approaches follows.
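For example, assuming bash on the source host, files under /data, and user@dest as the destination (all hypothetical; substitute your own paths and host), something like this would parallelize the transfers:

    # One rsync per top-level directory, up to 8 running at a time.
    # Assumes the directory names contain no whitespace.
    ls /data | xargs -P8 -I{} rsync -a /data/{}/ user@dest:/data/{}/

    # Or, for a flat directory: one rsync per leading filename character,
    # using include/exclude rules to split the work. Extend the character
    # set if your filenames start with anything other than a-z or 0-9.
    for c in {a..z} {0..9}; do
        rsync -a --include="${c}*" --exclude='*' /data/ user@dest:/data/ &
    done
    wait    # block until all background transfers finish

Tune the job count (the -P8 above, or how many characters you launch at once) until the network link is the bottleneck rather than per-transfer startup overhead.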

EightBitTony