33

I'm looking for a quick way to compare directory contents. Is it possible to do an md5sum (or equivalent checksum) of an entire directory?

Using Ubuntu Linux

pufferfish

10 Answers

43

Sure - md5sum directory/*
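Since the goal is comparing two directories, one quick option built on the same idea is to diff the per-file checksum listings (a sketch using bash process substitution; /directory1 and /directory2 are placeholders, and the file names must line up):

diff <(cd /directory1 && md5sum *) <(cd /directory2 && md5sum *)

Any line in the diff output is a file whose checksum, or presence, differs between the two directories.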

If you need something a little more flexible (say, for directory recursion or hash comparison), try md5deep.

apt-get install md5deep
md5deep -r directory

To compare a directory structure, you can give it a list of hashes to compare against:

md5deep -r -s /directory1 > dir1hashes
md5deep -r -X dir1hashes /directory2

This will output all of the files in directory2 that do not match directory1.

This will not show files that have been removed from directory1 or files that have been added to directory2.
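To also catch files that are present in directory1 but missing from directory2, one option is to run the same comparison in the opposite direction:

md5deep -r -s /directory2 > dir2hashes
md5deep -r -X dir2hashes /directory1

This second pass lists the files in directory1 whose hashes do not appear anywhere in directory2.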

Argyle
Shane Madden
29

If you'd like to see what's different (if anything) between two directories, rsync would be a good fit.

rsync --archive --dry-run --checksum --verbose /source/directory/ /destination/directory

This will list any files that are different.
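If you also want the dry run to report files that exist only in the destination, adding --delete makes rsync list them as deletion candidates (nothing is actually removed while --dry-run is in effect):

rsync --archive --dry-run --checksum --verbose --delete /source/directory/ /destination/directory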

JakePaulus
14

I think I answered this one before with this answer:

find . -xtype f -print0 | xargs -0 sha1sum | cut -b-40 | sort | sha1sum

gives: b1a5b654afee985d5daccd42d41e19b2877d66b1

The idea is that you hash all the files, cut out just the hashes (one per line), sort them, and hash that list, yielding a single hash. This does not depend on the names of the files.
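If you do want file names and paths to affect the result, a small variation (assuming GNU sort's -z option for NUL-separated input) is to sort the path list first and keep the "hash  path" lines in the stream that gets hashed:

find . -xtype f -print0 | sort -z | xargs -0 sha1sum | sha1sum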

Dan D.
5

The cfv application is quite useful: not only can it check and create MD5 checksums, it can also handle CRC32, SHA-1, torrent, par, and par2 files.

To create a CRC32 checksum file for all files in the current directory:

cfv -C

To create an MD5 checksum file for all files in the current directory:

cfv -C -t md5 -f "current directory.md5sums"

To create a separate checksum file for each subdirectory:

cfv -C -r

To create a "super" checksum file containing files in all subdirectories:

cfv -C -rr
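To later verify files against one of the checksum files created above, cfv's default mode (without -C) is testing; assuming the MD5 file name used earlier, something like:

cfv -f "current directory.md5sums"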
Alicja Kario
4

I used hashdeep, as explained in this askubuntu answer: Check the correctness of copied files:

To calculate the checksums:

 $ cd <directory1>
 $ hashdeep -rlc md5 . > ~/hashOutput.txt

To verify and list the differences:

 $ cd <directory2>
 $ hashdeep -ravvl -k ~/hashOutput.txt .
 hashdeep: Audit passed
    Input files examined: 0
   Known files expecting: 0
           Files matched: 13770
 Files partially matched: 0
             Files moved: 0
         New files found: 0
   Known files not found: 0

This has an advantage over md5deep in that it will show renamed (moved), added, and removed files, and it avoids the problem with zero-length files pointed out at the bottom of http://www.meridiandiscovery.com/how-to/validating-copy-results-using-md5deep.

Paul Gear
Argyle
3

This worked for me (run it from within the directory you are interested in):

md5deep -rl . | awk '{print $1}' | sort -n | md5sum
cat pants
1

You could create MD5 sums of every single file, order these checksums alphabetically, and hash them (with or without newlines). Since MD5 is a cryptographic hash, hashing the hashes works just fine.

There has to be a consistent order, otherwise you will get different results for identical directories.

And keep in mind that adding any file to one directory will completely change the result, even if it is just a .directory or .DS_Store file.
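A minimal shell sketch of this approach (content-only, so file names do not influence the result):

find . -type f -exec md5sum {} + | awk '{print $1}' | sort | md5sum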

1

As a specific case, let's say you want to copy some files from directory1 to directory2 and then verify a successful copy using an MD5 comparison.

First, cd to directory1 and type:

find -type f -exec md5sum "{}" \; > ~/Desktop/md5sum.txt

which will create a reference file containing an md5 sum for each file in directory1. Once this is done, all you have to do is cd to directory2 and type:

md5sum -c ~/Desktop/md5sum.txt

The program md5sum fetches each path from the md5sum.txt file, computes the md5sum of that file in the destination folder and then compares it with the sum it has stored in the file.

After the process is complete, you will get a summary reporting how many checksums did not match, if any.
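For illustration only (the file names below are placeholders, and the exact wording varies by coreutils version), a run with one mismatch looks roughly like this:

./photos/a.jpg: OK
./photos/b.jpg: FAILED
md5sum: WARNING: 1 computed checksum did NOT match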

Joel
1

I needed to verify the integrity of backups/mirrors containing a large number of files, and ended up writing a command-line program called MassHash. It's written in Python. A GTK+ launcher is also available. You may want to check it out...

http://code.google.com/p/masshash/

0

One-liner:

find directory -exec md5sum {} \; 2>&1 | sort -k 2 | md5sum

This lists all files and directories, computes an md5sum for each, and then takes the md5sum of that combined output.

The tricky bit is that md5sum cannot compute a sum for a directory itself; instead it reports md5sum: dir/sub_dir: Is a directory on standard error. The 2>&1 redirects that message to standard output so it becomes part of the hashed stream.

laimison