So, in order to speed up my research I decided to build a four or five node ethernet 'cluster' and run my software on each. Naturally I've given in and decided to reprogram it using networking code to distribute the processing. The idea of having four 2 GHz boxes parsing English makes my mouth water, doesn't it yours?
Anyway, long story short: I've been trying to get a single server / multiple client setup going, where the clients are processing nodes and the server just sends requests and writes results to a centralized place. I have a global variable for total sent/received bytes on both the server and client sides. The problem is that when I have more than one client connecting to the server, the total of bytes received and sent on the server is lower than the total of bytes sent by the clients. I have a feeling this has something to do with mutual exclusion, and I tried isolating the send/receive requests on the server side, but that didn't seem to work. By the way, this is in Java. So anyway, I hope someone can shed some light. Bubba? Sunlight? Kermi? Thanks!
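For what it's worth, the mutual-exclusion hunch is plausible: if each client connection is handled on its own thread and they all increment a plain `long` counter, increments can be lost because `total += n` is not atomic. Here's a minimal sketch (the class and field names are made up, not from the attached code) showing how an `AtomicLong` keeps the shared total exact, where a bare `long` would typically come up short under concurrent updates:

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch: several "client handler" threads all bump a shared byte counter.
// With a plain long the read-modify-write races and updates get lost;
// AtomicLong (or a synchronized block) makes each increment atomic.
public class ByteCounterDemo {
    // hypothetical shared counter, standing in for a global "totalReceived"
    static final AtomicLong totalReceived = new AtomicLong();

    public static void main(String[] args) throws InterruptedException {
        int clients = 4;
        int bytesPerClient = 100_000;
        Thread[] handlers = new Thread[clients];
        for (int i = 0; i < clients; i++) {
            handlers[i] = new Thread(() -> {
                // simulate receiving bytesPerClient bytes, one at a time
                for (int b = 0; b < bytesPerClient; b++) {
                    totalReceived.addAndGet(1); // atomic; no updates are lost
                }
            });
            handlers[i].start();
        }
        for (Thread t : handlers) t.join();
        // Exact total: 4 clients x 100,000 bytes each
        System.out.println(totalReceived.get()); // prints 400000
    }
}
```

If the counts still don't line up after the counter is made thread-safe, it may instead be that `InputStream.read(buf)` returns fewer bytes than were sent in one call, so the server needs to loop until it has read the full message rather than counting one read per send.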
=-{da}-=
// there ain't no source code, it's in the attached file! nice new feature to enforce code tags guys... nice...