Mmmmm.
http://www.redhat.com/archives/rhl-l.../msg03266.html
http://linuxgazette.net/102/piszcz.html
If what they're saying is true, then a directory holding 25K files is going to make removing a single file really expensive, since each deletion can mean a linear scan of the directory's file list. I don't know whether you can predict that order, but you might be able to manipulate the order in which you delete files in your favour. Also, blowing away the entire directory in one hit may be more efficient than removing each file individually.
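If you want to check that on your own filesystem rather than take the linked threads' word for it, here's a rough Python sketch comparing per-file unlink against removing the whole tree in one call. It's a probe, not a verdict: the numbers depend heavily on the filesystem (e.g. ext3 without dir_index vs ext4/XFS), and N is kept small here; push it toward 25K to match the scenario above.

```python
import os
import shutil
import tempfile
import time

N = 1000  # number of files; raise this for a realistic test


def make_dir_with_files(n):
    # Build a scratch directory containing n small files.
    d = tempfile.mkdtemp()
    for i in range(n):
        with open(os.path.join(d, f"f{i:05d}"), "w") as f:
            f.write("x")
    return d


# Strategy 1: unlink each file individually, in listing order.
d1 = make_dir_with_files(N)
t0 = time.perf_counter()
for name in os.listdir(d1):
    os.unlink(os.path.join(d1, name))
os.rmdir(d1)
per_file = time.perf_counter() - t0

# Strategy 2: blow away the entire directory in one hit.
d2 = make_dir_with_files(N)
t0 = time.perf_counter()
shutil.rmtree(d2)
one_hit = time.perf_counter() - t0

print(f"per-file unlink: {per_file:.4f}s, rmtree: {one_hit:.4f}s")
```

Note that rmtree still unlinks file by file under the hood; it mostly saves you the path lookups, so any gap you measure tells you how much of the cost is directory-list manipulation versus everything else.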
zacs7's idea looks a lot better IMO. By processing the files while there are still fewer of them in the directory, you minimise the extra work spent manipulating the directory's file list.
I would suggest further research on "benchmarking filesystems".