Hello everyone, thanks for reading!


I have a little question about TCP sockets and how to decide which port to use for a connection. So far I have simply defined port 3000 as the default: a local service (the client) connects to my server at localhost:3000. Since a TCP connection is identified by the source IP, source port, destination IP and destination port, I figured it was pretty unlikely that the combination would repeat. However, unlikely doesn't mean impossible, and of course it happened: TeamViewer (a remote desktop app) also opens a local socket, and (yes!) it also uses port 3000...
The result is a SocketException when I try to start my local TcpListener: "Only one usage of each socket address (protocol/network address/port) is normally permitted."
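For reference, here is a minimal sketch of the kind of listener setup that runs into this (simplified, not my exact code):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class Server
{
    static void Main()
    {
        // Hard-coded default port, same idea as in my real setup.
        var listener = new TcpListener(IPAddress.Loopback, 3000);
        try
        {
            // Start() throws if another process (TeamViewer in my case) already owns port 3000.
            listener.Start();
            Console.WriteLine("Listening on 127.0.0.1:3000");
        }
        catch (SocketException ex) when (ex.SocketErrorCode == SocketError.AddressAlreadyInUse)
        {
            Console.WriteLine("Port 3000 is already in use: " + ex.Message);
        }
    }
}
```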


Now, it would be easy to just try to open the listener on, say, 3001 or 3002 or whatever port is available (something like the sketch after this list), but
1st, the client side (which may be local or remote) would then have to cycle through a range of ports to find the server, which could make connecting slow (in computer terms), and
2nd, how big a range should I set to avoid the (veeeeery unlikely, I know, but still possible) scenario where those ports are also in use by other apps? A small range would mean faster cycling through the options; a large range would make collisions with other apps less likely.
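Just to make concrete what I mean by cycling through a range, here is a rough server-side sketch (the StartOnFirstFreePort helper and the 3000-3009 range are made up for illustration):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class Server
{
    // Hypothetical helper: walk a small range and keep the first port that binds.
    static TcpListener StartOnFirstFreePort(int firstPort, int lastPort)
    {
        for (int port = firstPort; port <= lastPort; port++)
        {
            var listener = new TcpListener(IPAddress.Loopback, port);
            try
            {
                listener.Start();
                Console.WriteLine("Listening on port " + port);
                return listener;
            }
            catch (SocketException)
            {
                // Port is taken by some other app; move on to the next candidate.
            }
        }
        throw new InvalidOperationException(
            "No free port between " + firstPort + " and " + lastPort);
    }

    static void Main()
    {
        var listener = StartOnFirstFreePort(3000, 3009);
        // ...accept clients here; the client would have to probe the same range,
        // which is exactly the part I'd like to avoid.
    }
}
```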


Is there a standard way to know which port another app might be using? Or a recommended way to set up my server/client sides so they can find each other? I've been reading some TCP standards and couldn't find much so far, other than an initial handshake followed by a port switch, like FTP does, but that assumes a connection is already established.


Any ideas? Thanks for any help!