BlaM asked:
Detect if a client lost its socket connection to the server application

I have a server application that sets up a socket, waits for a connection, and then processes the received data (socket(), bind(), listen(), accept()).
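
A minimal sketch of that setup sequence (the port number and variable names here are placeholders, not taken from my actual code):

      #include <netinet/in.h>
      #include <sys/socket.h>
      #include <cstring>

      int listensock = socket(AF_INET, SOCK_STREAM, 0);   /* TCP socket */

      struct sockaddr_in addr;
      memset(&addr, 0, sizeof(addr));
      addr.sin_family = AF_INET;
      addr.sin_addr.s_addr = htonl(INADDR_ANY);           /* any local interface */
      addr.sin_port = htons(4711);                        /* placeholder port */

      bind(listensock, (struct sockaddr *)&addr, sizeof(addr));
      listen(listensock, 5);                              /* backlog of 5 pending connections */

      int clientsock = accept(listensock, NULL, NULL);    /* blocks until a client connects */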

After accept() returns, I end up in this loop:

      while (1) {
            fd_set fds;
            struct timeval tv;
            char buf[MAX_PACKET_SIZE];
            int size;

            FD_ZERO(&fds);
            FD_SET(clientsock, &fds);
            tv.tv_sec = 0;
            tv.tv_usec = 0;   /* zero timeout: select() just polls and returns */

            if (select(clientsock + 1, &fds, NULL, NULL, &tv) > 0) {
                  /* data is available: read it and echo it back */
                  size = recv(clientsock, buf, MAX_PACKET_SIZE - 1, 0);
                  if (size > 0) {
                        buf[size] = 0;   /* NUL-terminate before printing */
                        cout << buf;
                        send(clientsock, buf, size, 0);
                  }
            }
      }


That's okay - but how do I detect a lost connection? At the moment I end up in an endless loop with no way of knowing whether the client is still there. In the final product I have to return to waiting for a new connection, but first I need to know when the initial connection has ended.
BlaM (Asker): Oh yes, by the way: it's a Linux application ;) - forgot to mention.
ASKER CERTIFIED SOLUTION
itsmeandnobodyelse (Germany):

This solution is only available to members of Experts Exchange.
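
The accepted answer itself is not visible above, but the standard POSIX way to detect a dropped TCP connection, which would fit the asker's "simpler than I thought" reply, is to check the return value of recv(): it returns 0 when the peer has closed the connection cleanly, and -1 on error. A closed socket is also reported as readable by select(), so the loop notices the disconnect promptly. A minimal sketch along those lines, reusing clientsock and MAX_PACKET_SIZE from the question (a reconstruction of the general technique, not the expert's actual text):

      #include <unistd.h>   /* for close() */

      while (1) {
            fd_set fds;
            struct timeval tv;
            char buf[MAX_PACKET_SIZE];
            int size;

            FD_ZERO(&fds);
            FD_SET(clientsock, &fds);
            tv.tv_sec = 1;    /* wait up to a second instead of busy-polling */
            tv.tv_usec = 0;

            if (select(clientsock + 1, &fds, NULL, NULL, &tv) > 0) {
                  size = recv(clientsock, buf, MAX_PACKET_SIZE - 1, 0);
                  if (size <= 0) {
                        /* 0: the peer closed the connection cleanly;
                           -1: an error such as a connection reset.
                           Either way, the connection is gone. */
                        break;
                  }
                  buf[size] = 0;
                  cout << buf;
                  send(clientsock, buf, size, 0);
            }
      }
      close(clientsock);    /* then loop back to accept() for the next client */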
BlaM (Asker): Thanks. That was simpler than I thought ;)