
I'm using wget to mirror some files across from one server to another. I'm using the following command:

wget -x -N -i http://domain.com/filelist.txt

-x = Because I want to keep the directory structure

-N = Timestamping to only get new files

-i = To download a list of files from an external file, one on each line.
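For reference, the list file is just plain text with one URL per line, e.g. (the second path is a made-up example):

```
http://domain.com/path/to/file.zip
http://domain.com/path/to/another-file.txt
```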

Small files, such as the 326 KB file I'm testing with, download just fine.

But another file that is 5 GB stops after downloading only 203 MB (it is always 203 MB, give or take a few kilobytes).

The error message shown is:

Cannot write to ‘path/to/file.zip’

(The quote marks around the path showed up as strange characters in my terminal. I'm using PuTTY on Windows, which may or may not have something to do with it. I presume it's unrelated.)

The full response is as follows (I have replaced the paths, IP and domain name):

--2012-08-31 12:41:19--  http://domain.com/filelist.txt
Resolving domain.com... MY_IP
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 161 [text/plain]
Server file no newer than local file ‘domain.com/filelist.txt’

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
The sizes do not match (local 213004288) -- retrieving.

--2012-08-31 12:41:19--  http://domain.com/path/to/file.zip
Connecting to domain.com|MY_IP|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5502192869 (5.1G) [application/zip]
Saving to: ‘domain.com/path/to/file.zip’

 3% [====>                                ] 213,003,412  8.74M/s  in 24s

Cannot write to ‘domain.com/path/to/file.zip’

It doesn't seem to make any difference whether the path directory already exists or is created on the fly.

Does anyone have any idea why it's stopping and how I can fix it?

Any help would be most appreciated.

EDIT: I have also tried a plain wget with no input file, renaming the output file. This time it downloads a little over 3 GB and then gives the same "cannot write" error.

wget -x -N http://domain.com/path/to/file.zip -O files/bigfile.zip

9 Answers


You will get this error if you are out of disk space. Run df and you will see whether the filesystem you're writing to is at 100%.
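As a sketch, you can also compare free space against the expected download size before starting; the current directory and the 6 GB threshold below are examples, not from the answer, and `df --output` assumes GNU coreutils:

```shell
# Free bytes on the filesystem holding the download directory (GNU df).
avail=$(df --output=avail -B1 . | tail -n 1)
# Rough requirement for a 5.1 GB file, plus some slack.
needed=$((6 * 1024 * 1024 * 1024))
if [ "$avail" -lt "$needed" ]; then
    echo "Not enough space: $avail bytes free, $needed needed"
fi
```

On systems without GNU df, parse portable `df -P` output instead.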


It is a problem with a long URL. I faced it too, so I shortened the URL with bit.ly. Works like a charm!


I was getting the same error on this command:

sudo wget -O - https://nightly.odoo.com/odoo.key | apt-key add -

The problem was that sudo applied only to wget, not to apt-key after the pipe. I solved it with:

sudo su
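An alternative that avoids opening a root shell is to put sudo on the command that actually needs it, since a leading sudo never carries across a pipe (same key URL as above):

```shell
wget -qO - https://nightly.odoo.com/odoo.key | sudo apt-key add -
```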

First, try:

cd ~

to get into a directory you can write to before running the wget command.


Finally figured it out, and it was a disk space issue. It's a problem with 1&1 Cloud Servers; more about it here: http://www.mojowill.com/geek/1and1-dynamic-cloud-server-disk-allocation/


I was doing something similar to:

wget -x -N -i http://domain.com/filelist.txt

I was receiving:

--2016-12-09 07:44:23--  https://www.example.com/dir/details?abc=123&def=456
Resolving www.example.com (www.example.com)... 1.2.3.4
Connecting to www.example.com (www.example.com)|1.2.3.4|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
details?abc=123&def=456: No such file or directory

Cannot write to ‘details?abc=123&def=456’ (Success).

In my equivalent filelist.txt file I had a URL like:

https://www.example.com/dir/details?abc=123&def=456

So to debug I tried creating the same file wget was trying to create:

touch "details?abc=123&def=456"
touch: cannot touch ‘details?abc=123&def=456’: No such file or directory

Voila! It looks like the ? was the problem. Good practice, though, is to remove all special characters from file names; imagine what an unescaped & would do.
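A minimal sketch of that practice, mapping the troublesome characters to underscores (the substitution policy here is just an example; adjust the character set to your needs):

```shell
# Turn a URL into a filesystem-safe local name by replacing ? & and /
# with underscores.
url='https://www.example.com/dir/details?abc=123&def=456'
name=$(printf '%s' "${url#*//}" | tr '?&/' '___')
echo "$name"   # www.example.com_dir_details_abc=123_def=456
```

wget can also do this for you: `--restrict-file-names=windows` escapes characters such as `?` and `*` in the local file names it creates.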


I just added a - (the stdin marker) to the tar command after the pipe from wget.

I had

wget https://example.com/path/to/file.tar.gz -O -|tar -xzf -C /path/to/file

then changed it to

wget https://example.com/path/to/file.tar.gz -O - | tar -xzvf - -C /path/to/file
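The difference matters because without the `-`, tar's `-f` consumes the next word (`-C`) as the archive name. A self-contained illustration of the same pipe with a local archive (the demo paths are throwaway examples):

```shell
# Create a tiny archive on stdout and extract it from stdin via a pipe;
# "-" after -f tells tar to use stdout/stdin as the archive.
mkdir -p demo/src demo/out
echo hello > demo/src/a.txt
tar -czf - -C demo/src a.txt | tar -xzf - -C demo/out
cat demo/out/a.txt
```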

I also got this error when accidentally trying to wget a file into a folder in the macOS Trash. Moving the folder out fixed it, since the Trash is read-only.


If it starts saving a large file and writes 203 MB of it, I would suspect that you have either a full file system on the receiving end or the network connection is timing out.

You can use df -h on the receiving server to see if the filesystem is full.

Check out this answer for timeout issues with wget:

https://stackoverflow.com/questions/2291524/does-wget-timeout

Also, retry the transfer that failed and omit the -N timestamping option.

Also, run ulimit -a to see if there is a file-size limit on the receiving server.
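A quick sketch of those checks combined (the current directory stands in for the download target):

```shell
# Per-process file-size limit; "unlimited" means no cap.
fsize=$(ulimit -f)
# Human-readable free space on the filesystem holding this directory.
df -h .
echo "file-size limit: $fsize"
```

When you retry, `wget -c` resumes from the partial file instead of starting over.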