
I am trying to tar a huge directory (about 600 GB), and sometimes my SSH connection drops. When I run the tar command again, it overwrites the previous file and starts the whole archive from the beginning.

Is there some parameter that allows my tar file to be resumed after some problem?

2 Answers


Is there some parameter that allows my tar file to be resumed after some problem?

Nope.

What you should do, though, is run your command from within a terminal multiplexer like screen or tmux. That way, if your connection drops, the process keeps running and you can reattach to the session when you log back in.
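For example, with tmux it might look like this; the session name "backup" and the paths are just placeholders, not anything from the question:

# Start a named session and run tar inside it
tmux new-session -s backup
tar -czf /mnt/storage/backup.tar.gz /path/to/huge_dir

# If the SSH connection drops, log back in and reattach:
tmux attach -t backup

With screen the equivalent is screen -S backup to start (detach with Ctrl-a d) and screen -r backup to reattach after reconnecting.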

EEAA

Over ssh, I would use rsync instead. This transfers the directory tree as-is rather than as a single archive, but it only transfers new or changed files, which is likely what you want anyway. A 600 GB transfer will probably suffer a few connection drops, and rsync recovers from these automatically when rerun.
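The core idea, stripped down to a single command (the paths and host below are placeholders):

# --partial keeps partially transferred files, so rerunning this same
# command after a dropped connection picks up roughly where it left off
rsync -av --progress --partial /mnt/storage/huge_dir/ backupuser@backup.example.com:backups/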

I've used variants of this script over time. I've forgotten where it originally came from, so if the author turns up, I would like to give credit.

#!/bin/bash

### ABOUT
### Runs rsync, retrying on errors up to a maximum number of tries.
### Simply edit the rsync line in the script to whatever parameters you need.

# Trap interrupts and exit instead of continuing the loop
trap "echo Exited!; exit;" SIGINT SIGTERM

MAX_RETRIES=50
i=0

# Set the initial return value to failure so the loop runs at least once
false

while [ $? -ne 0 ] && [ $i -lt $MAX_RETRIES ]
do
 i=$(($i+1))
 rsync -avz --progress --partial -e "ssh -i /home/youngian/my_ssh_key" /mnt/storage/duplicity_backups backupuser@backup.dreamhost.com:.
done

if [ $i -eq $MAX_RETRIES ]
then
  echo "Hit maximum number of retries, giving up."
  exit 1
fi
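A natural way to combine this with the other answer is to run the retry script itself inside a detachable session, so the loop keeps going even if your own SSH session to the machine drops (the script and session names here are just examples):

tmux new-session -d -s backup './rsync-retry.sh'
tmux attach -t backup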