nicolausj

asked on

Blocking websites for Remote Desktop users

We have about 10 dumb terminals that connect to our Terminal Server. We want to block certain websites while allowing others.

I have made changes to the host file on the server to block the websites we don't want, and it works on the server, but when I go to a terminal and log into a remote session the sites can be accessed.

The terminals themselves don't have internet, only when using remote desktop can users get to the web.

Any ideas?
Don

nicolausj

ASKER

Not really what I was looking for. We only need to block websites on our thin clients, not "all" of our computers. Some staff still need access to sites we wish to block on our thin clients.
Have you tried flushing the DNS cache and then checking to make sure the changes from the hosts file are being read?

ipconfig /flushdns <-- flushes the DNS cache
ipconfig /displaydns <-- displays everything in the DNS cache, including the hosts file entries
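Another way to confirm the hosts file is actually being consulted is to resolve a name through the operating system's resolver, which checks the hosts file before DNS. A minimal Python sketch (run on the Terminal Server itself; `localhost` is used here only as a sanity check that hosts entries are honored):

```python
import socket

def resolves_to_loopback(hostname):
    """Resolve a hostname through the OS resolver (which consults the
    hosts file first) and report whether it maps to 127.0.0.1."""
    try:
        return socket.gethostbyname(hostname) == "127.0.0.1"
    except socket.gaierror:
        # Name did not resolve at all
        return False

# A standard hosts file maps "localhost" to 127.0.0.1, so this
# should print True; substitute a blocked site to test your entries.
print(resolves_to_loopback("localhost"))
```

Running this for each blocked hostname shows whether the entry is taking effect for the account you are logged in as.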
DNS looks good on the server. Everything I have blocked using the hosts file is resolving to 127.0.0.1.

Is it possible the RDP sessions aren't picking up the hosts file? Do I need to share it, or is there a place I can include it in the Remote Desktop settings?
As long as the security on the hosts file is set so those users can read/execute the file, you won't need to worry about sharing it. Are you doing this as an admin account or as a user? Can you list examples of what you're trying to block? Can you show us the hosts file?
Configure them to use a false proxy 127.0.0.1, then add the sites to be allowed to the exceptions list.
proxy.jpg
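If you go the false-proxy route, the browser proxy settings live per-user in the registry, so they can be pushed out via a logon script or Group Policy. A sketch of the relevant keys (the values here are illustrative — the exception list and hostnames are placeholders you would replace with the sites your users should still reach):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
; Point the browser at a dead proxy so nothing resolves by default
"ProxyEnable"=dword:00000001
"ProxyServer"="127.0.0.1:80"
; Semicolon-separated list of hosts that bypass the dead proxy,
; i.e. the only sites users can actually reach (placeholders)
"ProxyOverride"="*.example-intranet.com;intranet"
```

Because the keys are under HKEY_CURRENT_USER, you can apply them only to the thin-client user accounts and leave other staff unaffected.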
namol: I have configured the server's hosts file using the administrator account, and I will check that all users have read access to it.

namol: all users have read and execute permissions on the hosts file.
The hosts file is basically the following:

127.0.0.1       www.youtube.com
127.0.0.1       www.facebook.com
127.0.0.1       www.hotmail.com

It blocks all the sites fine when sitting in front of the Terminal Server (administrator account), but when using RDP the sites aren't blocked.

Also, when looking at the hosts file I noticed some of the files on the server appear in blue font and the rest are all black. Does this mean those files are being shared with the thin clients?
Never mind... it means the files are compressed.
I would remove the www portion from the hosts file. That just blocks that specific hostname, but all of those sites have more than one, such as images.facebook.com etc.
If they use VPN, you can setup a small DNS server to block those requests automatically (or give bogus responses)

if not, configure a custom DNS on that server with read permissions ONLY for those who remote in to that particular machine.

A bit of a hack but it could work. If this doesn't make any sense let me know, I'll give you a complete breakdown of what I mean in detail.
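For the small-DNS-server approach, a server that supports domain-level overrides (dnsmasq is used here as an example; any DNS server with equivalent options works) can block a whole domain and every hostname under it — something a hosts file, which only matches exact names, cannot do. A sketch of the config:

```
# dnsmasq.conf fragment — return 127.0.0.1 for the blocked domains
# and every hostname under them (www., m., images., etc.)
address=/youtube.com/127.0.0.1
address=/facebook.com/127.0.0.1
address=/hotmail.com/127.0.0.1
```

Point only the thin-client sessions (or the Terminal Server) at this DNS server and the blocks apply regardless of which subdomain a user tries.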
As namol said, www. is not restrictive enough; on top of that, there are different domain suffixes such as .ca, .hk, .co.uk, etc.
They don't use VPN, they are thin clients (dumb terminals) within our building.

On the server itself it does seem to be restrictive enough, but I'll remove the www. and see if the clients' RDP sessions are affected.

But why are the Remote Desktop sessions to the Terminal Server not pulling the server's hosts file for the sessions? The clients were set up with Windows XP Embedded, but it doesn't make sense that I would need to mess around with the clients' hosts files when the session runs via RDP on the server...
Removing the www. in front of the websites actually allowed the server to surf to the sites we wanted to block. When I put the www. back, the sites were blocked again.

Any ideas why my thin clients' RDP sessions aren't being affected by the hosts file on the server?

Thanks
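That result is consistent with how the hosts file matches names: entries are exact hostnames, not patterns, so an entry for facebook.com does not cover www.facebook.com, and vice versa. A rough Python model of the lookup behavior (illustrative only — not the actual resolver code):

```python
# Illustrative model of hosts-file lookup: exact, case-insensitive
# hostname matching -- no wildcards, no suffix matching.
HOSTS = {
    "www.facebook.com": "127.0.0.1",
    "facebook.com": "127.0.0.1",  # each variant needs its own line
}

def hosts_lookup(hostname):
    """Return the mapped address, or None to fall through to real DNS."""
    return HOSTS.get(hostname.lower())

print(hosts_lookup("www.facebook.com"))     # exact entry -> "127.0.0.1"
print(hosts_lookup("images.facebook.com"))  # no entry -> None, DNS is used
```

This is why each subdomain you want blocked needs its own 127.0.0.1 line, and why removing www. unblocked the sites.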
Have you tried modifying the client's hosts file and trying again?
I'm still not seeing how modifying the client's hosts file is going to change anything... but I did look, and not one of the thin clients has a hosts file.
Alright, I just ran another test. I used Remote Desktop to the server and logged in as the local administrator. When trying to surf to www.google.ca, the hosts file worked and stopped me from getting to the site. If I use one of our user accounts, I can surf to the restricted websites.

Does this mean there is a permissions issue? Or could the remote users have been setup incorrectly on the server?
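To rule out a permissions problem, you can inspect (and if necessary fix) the ACL on the hosts file from an elevated command prompt on the server. The commands below assume the remote users are in the local Users group — substitute the actual group your thin-client accounts belong to:

```
:: Show which accounts can read the hosts file
icacls %SystemRoot%\System32\drivers\etc\hosts

:: Grant read access to the local Users group (run elevated)
icacls %SystemRoot%\System32\drivers\etc\hosts /grant Users:R
```

If the user accounts can already read the file, the difference between admin and user sessions likely lies elsewhere (for example, per-user proxy settings).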
Never mind, I figured out the problem.
ASKER CERTIFIED SOLUTION
nicolausj