For checksumming a large number of files, I'd use existing tools. This is easy to do on Linux, and it should work just as well under WSL:
Code:
find DIR -type f -print0 | xargs -0 -n1 -P8 checksum >checksum.txt
(Replace checksum with your checksumming program, e.g., sha256sum.)
That will checksum all files under the directory "DIR", up to 8 at a time in parallel, and it will very likely finish sooner than writing a custom program for the job. On my computer, sha256sum takes about 8 seconds to checksum a 2.5 GB file. I'm guessing your collection is in the neighborhood of 500 GB, i.e., roughly 200 such files, which works out to about 1600 seconds (26m 40s) of CPU time, or about 200 seconds (3m 20s) of real time if the checksums are computed in parallel on 8 CPUs. Of course, this assumes sha256sum is the bottleneck and not the disk/flash I/O speed; but either way, even the non-parallel checksummer could have finished already in the time since you first asked.
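One more thing worth knowing: if you use sha256sum, the checksum.txt produced above is in the format accepted by sha256sum's -c option, so you can re-verify the files later. A quick end-to-end sketch (the DIR layout and file names here are just for illustration):

```shell
# Set up a couple of sample files under DIR (illustration only)
mkdir -p DIR/sub
printf 'hello\n' > DIR/a.txt
printf 'world\n' > DIR/sub/b.txt

# Checksum everything under DIR, up to 8 files in parallel
find DIR -type f -print0 | xargs -0 -n1 -P8 sha256sum > checksum.txt

# Later: verify each file against the recorded checksums;
# prints "<path>: OK" (or FAILED) per file, non-zero exit on any mismatch
sha256sum -c checksum.txt
```

The non-zero exit status on mismatch makes the -c step convenient to use in scripts.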