I ran into this problem on a FreeBSD box today. It turned out to be an artifact of vi (not vim; I'm not sure whether vim would cause the same problem): the file was consuming space but hadn't been fully written to disk.
You can check that with:
$ fstat -f /path/to/mount/point |sort -nk8 |tail
This lists the open files on that file system and sorts them numerically (-n) on the 8th column (the key, -k8), showing the last ten (largest) entries.
In my case, the final (largest) entry looked like this:
bob vi 12345 4 /var 97267 -rwx------ 1569454080 rw
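For reference, fstat's header line (which sort and tail will have pushed out of view) looks roughly like this on my box, which is how I know the eighth column, SZ|DV, is the size in bytes:
USER     CMD          PID   FD MOUNT      INUM MODE         SZ|DV R/W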
This meant process (PID) 12345 was consuming 1.46G (the eighth column, in bytes, divided by 1024³) of disk even though du didn't notice it. vi is horrible at viewing extremely large files; even 100MB is large for it, and 1.5G (or however large that file actually was) is ridiculous.
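If you'd rather not do that division in your head, bc from the base system checks it quickly (the number is just the size field from my fstat line above):
$ echo "scale=2; 1569454080 / 1024^3" |bc
1.46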
The solution was to sudo kill -HUP 12345 (if that hadn't worked, I'd have tried a plain sudo kill 12345, and if that also failed, the dreaded kill -9 would have come into play).
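Spelled out as a sequence (12345 standing in for whatever PID fstat reported; the sleeps just give the process a moment to go away before escalating):
$ sudo kill -HUP 12345   # SIGHUP first; classic vi preserves the buffer for recovery and exits
$ sleep 5
$ sudo kill 12345        # plain SIGTERM if it's still running
$ sleep 5
$ sudo kill -9 12345     # SIGKILL, the last resort: no cleanup at all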
Avoid text editors on large files. Sample workarounds for quick skimming:
Assuming reasonable line lengths (the awk sampler is unpacked below):
{ head -n1000 big.log; tail -n1000 big.log; } |vim -R -
wc -l big.log |awk -v n=2000 'NR==FNR{L=$1;next}FNR%int(L/n)==1' - big.log |vim -R -
Assuming unreasonably large line(s):
{ head -c8000 big.log; tail -c8000 big.log; } |vim -R -
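The awk sampler above is terse, so here it is again spread out with comments (same command; n=2000 is roughly how many lines to keep, and it assumes big.log has at least n lines, otherwise int(L/n) is 0 and the modulo fails):
wc -l big.log |awk -v n=2000 '
    NR == FNR { L = $1; next }   # first input (the wc output on stdin): remember the total line count
    FNR % int(L/n) == 1          # second input (big.log): keep roughly every (L/n)th line
' - big.log |vim -R -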
These use vim -R in place of view because vim is nearly always better ... when it's installed. Feel free to substitute view - or vi -R - instead (keep the trailing - so the editor reads from the pipe).
If you're opening such a large file to actually edit it, consider sed, awk, or some other programmatic approach.
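For example, a simple in-place substitution with sed works a line at a time instead of pulling the whole file into memory (FreeBSD's sed needs an explicit, here empty, backup suffix after -i; the pattern and filename are only placeholders):
$ sed -i '' 's/oldstring/newstring/g' big.log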