phaolo: But.. isn't MD5 also weaker as a checksum for data integrity vs other methods?
(I thought that collisions were just non-unique matches)
It's a trade-off between CPU load and security. Depending on the attack model, a more primitive hash may turn out to be more efficient. For example, the chance that a random bit-flip produces a hash collision is unrealistically small, so even a weak hash will catch it.
To make an analogy: sure, you can shoot a fly out of the sky with a MOAB, but the MOAB also increases cost, weight, etc. A regular swatter suffices for the fly. I can't imagine someone purposely forging a garbage-filled file whose hash exactly matches your file's, then finding a way to slip it into your system, just to trick the bit-rot detection... CPU time is why some anti-rot file systems even use crc32, an even lazier choice than md5.
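A minimal sketch (Python stdlib only, file contents and the flipped byte are arbitrary choices of mine) of the bit-rot case: flip a single bit and even the "lazy" crc32 changes, same as md5.

    import hashlib
    import zlib

    data = bytearray(b"some file contents that might rot on disk")
    before_crc = zlib.crc32(data)
    before_md5 = hashlib.md5(data).hexdigest()

    data[7] ^= 0x01  # simulate bit-rot: flip the lowest bit of byte 7

    print(before_crc != zlib.crc32(data))               # True
    print(before_md5 != hashlib.md5(data).hexdigest())  # True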
Of course, if you expose your file system to the outside world, things change. In particular, if the filesystem is distributed
and there is a chance to combine bit-rot detection with protection against forgery, then the use of SHA-2 justifies itself. But for a purely local filesystem, a more complex hash may (benchmarking rocks!) introduce unwanted and unjustified lag/overhead in everyday work. Security almost always works against usability.
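And since benchmarking rocks, here's a rough sketch to check the overhead claim on your own hardware (buffer size is an arbitrary pick of mine; real numbers will vary by CPU and by how your filesystem chunks data):

    import hashlib
    import time
    import zlib

    buf = b"\x00" * (256 * 1024 * 1024)  # 256 MiB buffer standing in for file data

    def timed(label, fn):
        start = time.perf_counter()
        fn(buf)
        elapsed = time.perf_counter() - start
        print(f"{label:8s} {len(buf) / elapsed / 1e6:8.0f} MB/s")

    timed("crc32", zlib.crc32)
    timed("md5", lambda b: hashlib.md5(b).digest())
    timed("sha256", lambda b: hashlib.sha256(b).digest())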