I have a server that receives connection requests from clients. This server makes use of the asynchronous Socket.BeginReceive and Socket.EndReceive methods. The code is pretty similar to the code found here.

In my case, after calling Socket.BeginReceive I need a timeout: if the client holds on to the connection but does not transmit any data at all for a fixed amount of time, I want to terminate the connection.

  • How do I go about terminating the connection in this scenario?
  • What is the best way of coding a timer?

Accepted Answer

Just call the socket's Close() method. The callback method will run pretty quickly after that, and you'll get an ObjectDisposedException when you call the EndReceive() method. Be prepared to catch that exception.
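
As a minimal sketch of what that looks like in the receive callback (a fragment only; it assumes the Socket was passed as the async state object, which isn't stated in the question):

```csharp
private void ReceiveCallback(IAsyncResult ar)
{
    var socket = (Socket)ar.AsyncState;        // assumption: the socket is the async state
    try
    {
        int bytesRead = socket.EndReceive(ar); // throws ObjectDisposedException once Close() has run
        // ... process bytesRead bytes, then call BeginReceive again ...
    }
    catch (ObjectDisposedException)
    {
        // The timeout logic closed the socket; abandon this connection.
    }
}
```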

You probably don't want to block the thread that called BeginReceive, so you'll need a System.Threading.Timer or System.Timers.Timer to detect the timeout. Its callback should call Close(). Beware of the inevitable race condition this causes: the timer's callback will still run if the response was received a microsecond before the timer expired, and you'll close the socket even though you got a good response. The next call to BeginReceive() will then fail immediately.
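
Here is a hedged sketch of one way to wire this up, extending the callback sketch above. The ConnectionState type, the 30-second timeout, and the Closed flag guarded by a lock are illustrative choices, not part of the original answer; the flag keeps the code from calling BeginReceive() on a socket the timer has already closed, although data arriving exactly at the deadline can still be dropped, which is the inevitable race described above.

```csharp
using System;
using System.Net.Sockets;
using System.Threading;

class ConnectionState
{
    public Socket Socket;
    public byte[] Buffer = new byte[4096];
    public Timer IdleTimer;                      // fires when no data arrives in time
    public readonly object SyncRoot = new object();
    public bool Closed;                          // set once the socket has been closed
}

class Receiver
{
    const int IdleTimeoutMs = 30000;             // illustrative 30-second idle timeout

    public void Start(Socket client)
    {
        var state = new ConnectionState { Socket = client };

        // One-shot timer; it is re-armed after every successful receive.
        state.IdleTimer = new Timer(OnIdleTimeout, state, IdleTimeoutMs, Timeout.Infinite);

        client.BeginReceive(state.Buffer, 0, state.Buffer.Length,
                            SocketFlags.None, ReceiveCallback, state);
    }

    void OnIdleTimeout(object obj)
    {
        var state = (ConnectionState)obj;
        lock (state.SyncRoot)
        {
            if (state.Closed) return;            // a receive completed just before the deadline
            state.Closed = true;
            state.IdleTimer.Dispose();
            state.Socket.Close();                // pending EndReceive now throws ObjectDisposedException
        }
    }

    void ReceiveCallback(IAsyncResult ar)
    {
        var state = (ConnectionState)ar.AsyncState;
        int bytesRead;
        try
        {
            bytesRead = state.Socket.EndReceive(ar);
        }
        catch (ObjectDisposedException)
        {
            return;                              // the idle timer closed the socket
        }

        lock (state.SyncRoot)
        {
            if (state.Closed) return;            // lost the race against the timer

            if (bytesRead == 0)                  // client closed the connection cleanly
            {
                state.Closed = true;
                state.IdleTimer.Dispose();
                state.Socket.Close();
                return;
            }

            // ... process bytesRead bytes from state.Buffer here ...

            // Re-arm the idle timer and wait for the next chunk.
            state.IdleTimer.Change(IdleTimeoutMs, Timeout.Infinite);
            state.Socket.BeginReceive(state.Buffer, 0, state.Buffer.Length,
                                      SocketFlags.None, ReceiveCallback, state);
        }
    }
}
```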

Written by Hans Passant