
Dedicated PCs for staff to access the Internet (to contain malware, ransomware, etc.)

Despite having a proxy in place (which blocks numerous categories of sites, such as
social networking, public email (Gmail, Yahoo, etc.), shopping, and video sites), plus
URL filtering by Proofpoint and AV for email, we are still getting ransomware
& phishing compromises.  Education has not helped.

In many cases, users click on attachments or links received via email.

So I suggest that only 'commonly trusted' sites needed for work be permitted for
users to browse directly from their PCs. If they need to browse further or
do a Google search, they have to remote into a couple of 'dedicated PCs' to
browse the Internet: these few dedicated PCs will be hardened and possibly run
IOC-based detection (like OSSEC's) & other protection, but in the event of a
compromise, it's limited to these 'dedicated PCs'.

Drive sharing from these PCs to users' regular PCs (which users use to
access our internal systems) is prohibited; file transfer goes through,
say, TightVNC's file transfer feature.

What does everyone think of this?  Is it effective at stopping ransomware?

It will be cumbersome, but I guess this sort of "reverse jump host" could stop
the spread of compromises, ransomware, etc.

Or users could RDP to these dedicated PCs with encryption, but with the local
resources options in RDP disabled, to further prevent data leaks etc.

If users download files, they will be made aware that files can be wiped
out in the event of an infection, when we'll need to reformat the PCs.

Should these PCs join the AD domain, or stay standalone to further help
stop any infection spread?  I thought standalone would be better.

Is it more secure to create local accounts on these dedicated PCs, or to use
domain accounts (if integrated into AD)?

Geert G, Oracle DBA
Top Expert 2009

Have you ever considered what can be done against malicious internal IT staff?
Developers can be your best asset and your worst nightmare.

There are always ways to work around security.

When someone says their PC is protected against any cyberattack simply by not connecting it to the Internet, I often ask if it's coffee-proof...

Pouring a coffee into a system nearly always brings it down.

What will you do next... prohibit coffee?

Consider letting the worst offenders help you protect the system.
Even have them work with you for a day; they might see the light and what their actions cause.
Top Expert 2014
If you go the route of standalone PCs, I would:

1. Not let them join the domain.
2. Not share usernames/passwords with the domain.
3. Allow no file transfer at all. If a user transfers a malicious file that AV doesn't detect yet, you have another ransomware infection in your production environment. If users need a specific file, the IT department needs to check the source and the reason they need it, then retrieve it for them and check it for sanity.
4. Run those standalone systems read-only, e.g. as virtual machines that roll back on every reboot, except when you update and patch. So if they get compromised (or you suspect they might be), just reboot and it's OK again.
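The rollback in point 4 can be scripted. Assuming VirtualBox as the hypervisor (the VM and snapshot names below are illustrative, not from the original post), a minimal sketch:

```python
import subprocess

def build_restore_cmd(vm: str, snapshot: str) -> list:
    """Build the VBoxManage command that reverts a VM to a clean snapshot."""
    return ["VBoxManage", "snapshot", vm, "restore", snapshot]

def restore_clean_state(vm: str = "BrowsePC", snapshot: str = "clean-baseline") -> None:
    # Revert the browsing VM to its known-good snapshot, then start it again.
    # Run this on every reboot (or on suspected compromise); re-take the
    # snapshot only after patching, so the baseline stays current.
    subprocess.run(build_restore_cmd(vm, snapshot), check=True)
    subprocess.run(["VBoxManage", "startvm", vm, "--type", "headless"], check=True)
```

Other hypervisors (or tools like Deep Freeze, mentioned further down) offer the same revert-on-reboot behavior natively.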
Dr. Klahn, Principal Software Engineer
In my experience, trying to make anything foolproof only lets the fools demonstrate how extremely ingenious they can be.  Carelessness isn't solvable by education and it isn't solvable by throwing up roadblocks.

So long as there are no consequences for carelessness and somebody else has to clean up the mess, it will continue unabated.  Negative feedback is how control systems keep processes within the specified limits.

"The IT budget is out of control due to ongoing computer infestations.  The vast majority of these infestations were preventable.  Starting Monday, any system found to be infested due to user carelessness will result in the responsible individual being docked pay equal to the amount of time needed for IT to remedy the situation, without exception.  This policy applies to all employees, including supervisors and management."  That'll solve the problem.  There will be massive complaining, especially by managers who think rules don't apply to them, but infestations will decrease drastically.
Most Valuable Expert 2015
Use application whitelisting and disable macros from running. That way, only programs that you have approved as safe can run. So when someone clicks on a compromised attachment, or visits an infected URL, the program or macro that starts the virus won't be able to run.
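On the "disable macros" part: this is normally pushed via Group Policy, and the underlying registry value is `VBAWarnings`, where `4` means "disable all macros without notification". A hedged example for Word 2016 (the key path varies by Office version and application, so repeat it for Excel, PowerPoint, etc. as needed):

```reg
Windows Registry Editor Version 5.00

; VBAWarnings = 4 disables VBA macros without notification.
; Shown for Word 2016 (Office key 16.0); mirror this under
; ...\16.0\Excel\Security, ...\16.0\PowerPoint\Security, and so on.
[HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Word\Security]
"VBAWarnings"=dword:00000004
```

A value of `3` (disable except digitally signed macros) is a less disruptive alternative if some workflows legitimately need signed macros.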
Top Expert 2014
@rindi, this is a good point.

dont forget to block cmd, powershell and other command line tool for the users.

unfortunately this wont help against injected code that runs inside whitelisted applications.
(but motsly the malware comes as own executables, so the app whitelisting will help a lot)
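The whitelisting idea boils down to "deny anything whose hash isn't on the approved list". In production you would use AppLocker or similar rather than a script, but as a conceptual sketch (file names here are made up):

```python
import hashlib
import tempfile
from pathlib import Path

# Hypothetical allowlist of SHA-256 hashes of approved executables.
APPROVED_HASHES = set()

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def approve(path: Path) -> None:
    """IT adds a vetted binary's hash to the allowlist."""
    APPROVED_HASHES.add(sha256_of(path))

def may_execute(path: Path) -> bool:
    """An unknown binary (e.g. a freshly dropped ransomware payload) is denied."""
    return sha256_of(path) in APPROVED_HASHES

# Demo: the approved file may run, the unknown drop may not.
d = Path(tempfile.mkdtemp())
good = d / "approved_app.exe"
good.write_bytes(b"known-good binary")
approve(good)
bad = d / "dropped_payload.exe"
bad.write_bytes(b"unknown binary")
print(may_execute(good), may_execute(bad))  # True False
```

This is also why hash-based whitelisting catches dropped executables but not code injected into an already-approved process, as noted above.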


Let's leave aside internal risks, because so far all our compromises are due to users
opening malicious emails or visiting unsafe sites: we're getting hundreds every month.

How do we disable macros, and does it help specifically against ransomware/CryptoLocker?

Btw, MS RDP is limited to 2 sessions, so I guess we have to use TightVNC etc.


Btw, which zipping/encryption tool does ransomware use, or do they come with their
own flavors?  If the Windows built-in zip is used, I'll consider disabling the Windows
zip/encryption tool on those PCs used to access the Internet.
Most Valuable Expert 2015
Top Expert 2014
Many/most use their own encryption, but some have already been sighted using the available zip/rar encryption.
Exec Consultant
Distinguished Expert 2019
Q1: Drive sharing does not prevent ransomware or even reduce the attack surface. It may backfire, as ransomware can encrypt mapped or unmapped network shares, including mapped cloud shares, as long as the share is a recognized "drive" on the machine.

Q2: Ransomware also spreads via RDP by brute-forcing its way into the remote machine, so keep strong login credentials; otherwise these measures increase the attack surface. As a whole, the "jump host" does not readily provide containment, though it adds deterrence against a direct leak if the use of external storage devices is restricted and the backend servers are hardened and monitored closely for alerts and anomalies.
Q3: You may consider Deep Freeze, such that after each user session and reboot, the image is reverted to the last retained clean snapshot for common use by all users. There are add-ons to retain data, though I suggest that be handled separately on a dedicated machine rather than a shared multi-user machine. If infected, go back to the clean original snapshot for assurance, and keep patches and signatures up to date (the snapshot should not be too outdated, e.g. two generations old).

Q4: You need to pull out the network cable instead; whether standalone or domain-joined, the machine still depends on the network connection. As mentioned, an infected machine must be isolated immediately to prevent cross-infection; physical containment is a safer and faster means of reducing damage to other systems, especially the backend servers (AD, file server, etc.).

Q5: Apply the least-privilege principle, rather than focusing on local vs. domain accounts. As long as normal users are not default "super" admins (or all users / Everyone being Power Users, etc.), it minimizes an exploit kit's chances of further exploiting vulnerable, unpatched applications like Office, Adobe Suite/Flash, etc., leading to a callback that brings in or "drops" other malware, including ransomware, though it is not foolproof.
btan, Exec Consultant
Distinguished Expert 2019
For "which zipping/encryption tool does ransomware use, or do they come with their own flavors?"

Typically it will leverage the OS zip where possible to keep it "light", e.g. Bart:
the files it encrypts include important productivity documents such as .doc, .docx, .xls, and .pdf, among others. When these files are detected, the infection changes the extension to .bart.zip, so they can no longer be opened. This ransomware places its targeted files in individual zip archives and applies password protection to these archives.
But there are cases of WinRAR or rar being used to compress the encrypted files. The strongest approach is still to encrypt rather than just password-protect a zip, which I see more as a "convenience" for aggregating the ransomed files: it makes it easy to know where the encrypted files are and to decrypt them later after opening up the archive...
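A quick IOC sweep for the .bart.zip extension described above could be scripted like this (the directory and file names are illustrative only):

```python
import tempfile
from pathlib import Path

# Bart renames victims' files from name.ext to name.ext.bart.zip
RANSOM_EXTENSION = ".bart.zip"

def find_ransomed_files(root: Path) -> list:
    """Recursively list files carrying the Bart ransomware extension."""
    return sorted(p for p in root.rglob("*") if p.name.endswith(RANSOM_EXTENSION))

# Demo on a throwaway directory: one ransomed file, one untouched file.
d = Path(tempfile.mkdtemp())
(d / "report.docx.bart.zip").write_bytes(b"")
(d / "notes.txt").write_bytes(b"")
hits = find_ransomed_files(d)
print([p.name for p in hits])  # ['report.docx.bart.zip']
```

The same pattern extends to any extension-renaming family; tools like OSSEC (mentioned in the question) can watch for such renames in real time instead of sweeping after the fact.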