I am using a Windows Server 2003 machine as a terminal server. The server has two network cards: one for access to the local area network and one for a dedicated, secure connection to another terminal server. The purpose of the setup is to allow users on the LAN to log in to a remote desktop session on the local terminal server, from which they can connect to the remote terminal server.
The problem is that the connection to the remote terminal server does not always succeed. When I disconnect the LAN connection, the remote connection works every time. It seems that when both network interfaces are connected, Remote Desktop does not "know" which one to use. I want to set it up so that traffic to the remote terminal server uses only NIC 1, while everything else uses NIC 2. Is this a static route? How and where do I set this up?
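For context, one common way to pin traffic for a single destination to a specific NIC on Windows Server 2003 is a persistent host route added with the built-in `route` command. The sketch below uses hypothetical addresses (remote terminal server at 10.10.10.5, NIC 1's gateway at 10.10.10.1, NIC 1's interface index 0x10002); the real values would come from `ipconfig` and `route print` on the server.

```shell
rem Hypothetical values - replace with your own:
rem   10.10.10.5      = remote terminal server's IP
rem   255.255.255.255 = host mask (route applies to this one address only)
rem   10.10.10.1      = gateway reachable via NIC 1
rem   0x10002         = NIC 1's interface index (list indexes with "route print")

rem -p makes the route persistent across reboots
route add -p 10.10.10.5 mask 255.255.255.255 10.10.10.1 if 0x10002

rem Verify the route was recorded under "Persistent Routes"
route print
```

A related point worth checking: only one NIC should carry a default gateway. If both NICs have one configured, Windows may pick the wrong interface for outbound traffic, which matches the intermittent behavior described above.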
Windows Networking · Network Management · Network Architecture
8/22/2022 - Mon