I am trying to determine the best way to accomplish a task that is simple in concept but tricky in practice.
I've been working on a new project for quite some time and have written a custom framework from the ground up. It's more or less a (Client --> Server --> Server <-- Client) system.
So far I have the basics done. Here's the process:
- I expect to have one client connect to my listening server.
- That client will periodically send me a string of IP Addresses (10 or more at a time).
- When I receive the data, I split it up accordingly and query the IPs just received.
All of that I can handle, and the code for it is about 95% complete. The IP addresses I receive belong to other servers, which I will then attempt to query and gather data from.
However, the problem is I may need to query up to 100 or even 1000 servers, and I don't want to wait for each one to finish in turn, in case one is unresponsive and I get stuck for the full timeout length before moving on.
With that said, I want to query as many servers as fast as possible. And I don't care which order they come back in. As soon as I receive the data I just need to save it to a file (no further communication to the original client is required).
I am writing this in C on Linux, so is there even a way to do that? Would I need to use the fork() system call, or write my own custom socket pool to handle multiple outgoing connections at once?