gayouj
Newcomer
This one seems to be pretty weird. I'm using async TCP sockets in my program. I have one class called ConnectionManager (implemented on the server machine), which has this NetMain method running on its own thread:
Code:
Private Shared Sub NetMain()
    Dim TcpListen As TcpListener = New TcpListener(m_ListenPoint)
    Try
        TcpListen.Start()
    Catch ex As Exception
        RaiseEvent Exception(ex)
        Exit Sub ' No point polling a listener that failed to start
    End Try
    ServerGroup = New EthServerPool()
    Do
        If (TcpListen.Pending) Then
            If (ServerGroup.UpperBound < m_MaxClients) Then
                ' Pending guarantees a connection is queued, so AcceptSocket won't block here
                ServerGroup.AddSocket(TcpListen.AcceptSocket())
                ServerGroup.Current.RxThreshold = m_RxThreshold
            End If
        End If
    Loop Until m_Stop
End Sub
ServerGroup is just a Vector-style class that I wrote that deals specifically with TCP server sockets. The ConnectionManager itself deals with maintaining this list of connections, including raising events to signal when something has happened on each connection; it can also detect client disconnects and takes care of the business of tearing down that connection.
When the ConnectionManager detects that a client has disconnected from one of the servers in the list, it calls a Delete method on that particular server. The Delete method executes the following code:
Code:
If (Not (m_Socket Is Nothing)) Then
    m_Socket.Shutdown(SocketShutdown.Both)
    m_Socket.Close()
    m_Socket = Nothing
End If
I'm not at any point removing that element from the ServerGroup, however. So if a client connects and then disconnects later, that entry in the ServerGroup list will still exist; its internal Socket object will just be set to Nothing.
Here's where things get weird:
I'm testing the program on my machine right now (just connecting to localhost). Everything is using the same port: the server is listening on port 1000 and all clients are connecting to port 1000.
I'm noticing in this instance that when I connect two or more clients to the server, they're all being given the exact same socket reference by the TcpListener's AcceptSocket method (I've stepped through the code at run time and confirmed that each socket has the same handle).
This is creating some very bizarre behavior. When I connect one client and then disconnect it, things are fine. But when I connect a second client, it is given a new slot in the ServerGroup list, and I can see by stepping through the code that the socket object for ServerGroup's index 0 element is no longer Nothing - it's a reference to the same socket that was returned for ServerGroup's index 1 element.
The only reason I can come up with for this happening is that the end points for each socket are not unique. Both sockets have the exact same host and port for their server and client end points, so I'm guessing that TcpListener is just serving up a reference to the same internal resource.
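For reference, here's the kind of diagnostic logging I've been thinking of adding at accept time to verify whether the handles and end points really are identical (just a sketch - the Accepted variable is only for illustration, and this would replace the AddSocket line in the NetMain loop above):
Code:
' Diagnostic sketch: log identifying details of each accepted socket
Dim Accepted As Socket = TcpListen.AcceptSocket()
Console.WriteLine("Handle: {0}, Local: {1}, Remote: {2}", _
    Accepted.Handle, _
    Accepted.LocalEndPoint.ToString(), _
    Accepted.RemoteEndPoint.ToString())
ServerGroup.AddSocket(Accepted)
If the remote end points differ between the two log lines (localhost clients normally get different ephemeral ports), then the connections are distinguishable at the socket level even if the handles look the same in the debugger.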
I would like to be able to differentiate between connections better than this in my software, and was wondering if anyone else has run into this and thought their way around it. Really, I'm just trying to understand the internals of TcpListener better so that I can engineer a better solution.
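In case it helps frame the question: one workaround I'm considering is tagging each accepted connection with my own ID rather than relying on the socket reference or handle at all (sketch only - ConnectionId is a hypothetical field I'd add to my server class, and m_NextId is a new shared counter):
Code:
' Hypothetical: assign a unique ID per accepted connection
Private Shared m_NextId As Integer = 0

' Inside the accept branch of NetMain:
ServerGroup.AddSocket(TcpListen.AcceptSocket())
m_NextId += 1
ServerGroup.Current.ConnectionId = m_NextId
That way each ServerGroup entry stays uniquely identifiable even after its socket is closed and set to Nothing.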
Also, it's not an option to delete the ServerGroup element after the socket has been released. I want to be able to maintain them after the disconnect occurs and just dispose of the socket itself.
Thanks in advance for any help.