Help (again).
I'm doing something horrible and bad: coding a program that uses a new TCP communications protocol. It reads a string from a client and then does something with it. This is a stupid question, but how do I know when to stop reading from the socket?
What we have doesn't work. We're sending the byte length of the string first, then the string itself, but there's a disparity: the client thinks the string is, say, 747 bytes long, while the server thinks it has only read 742 bytes, so it hangs waiting for the rest.
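For what it's worth, here's a minimal sketch of the approach I understand to be standard (length-prefix framing). The key points, as I understand them, are that `recv()` on a TCP socket may return fewer bytes than requested, so you have to loop until you've read the full count, and that the prefixed length must be the length of the *encoded bytes*, not the character count of the string. The function names (`recv_exact`, `send_string`, `recv_string`) are just mine for illustration:

```python
import socket
import struct

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, looping because recv() may return fewer."""
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:  # peer closed the connection mid-message
            raise ConnectionError("socket closed before full message arrived")
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

def send_string(sock: socket.socket, text: str) -> None:
    data = text.encode("utf-8")
    # 4-byte big-endian length prefix of the ENCODED bytes, then the payload.
    # sendall() keeps sending until everything is written.
    sock.sendall(struct.pack("!I", len(data)) + data)

def recv_string(sock: socket.socket) -> str:
    (length,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, length).decode("utf-8")
```

Is something like this the right idea? A mismatch like 747 vs. 742 seems like it could come from either a single `recv()` being treated as the whole message, or the length being computed on the string (characters) while the bytes on the wire were encoded differently.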