Ever wonder what it's like to get hit by ransomware? "Tom" gives you all the dirty details first-hand – and conveys the hard lessons his company learned in the aftermath.
First things first. My name isn't really Tom, and I don't really work for a company named Metal Fabricators Anonymous. But all of the other story details I'm going to tell you are indeed true.
So yeah, my name is Tom and I'm a software application specialist at Metal Fabricators Anonymous in Michigan. We're a mid-sized manufacturing company with about 150 employees spread between our main plant in Michigan and our second location in Florida. I oversee our Enterprise Resource Planning (ERP) application and most of the other day-to-day software, focusing primarily on implementation, configuration and management of our ERP-related business processes as well as custom reports, user training and user support. I share an office with a colleague (and best friend) who's in charge of hardware and network configuration and security.
The not-so-normal day at the office
One weekday not that long ago, things started out pretty normal. Coffee, morning music, checking email and getting into the daily routine – until the phones suddenly began ringing off the hook and our email started blowing up. It happened quickly and all at once. Apparently, people on our manufacturing floor could no longer view PDFs of our blueprints, which they need to manufacture our products. In addition, our Quality and Engineering departments were getting Windows error dialog boxes when trying to open Excel files from our shared network drives.
It took a minute to figure out what was happening, but we soon noticed that many of the files on our shared drives had been renamed. We also noticed that, besides the files we couldn't open, there were some new readme files in our shared folders. Reviewing the readme files and doing a quick Google search revealed that we'd been hit with ransomware. The first thing we needed to do was locate and disable the suspect workstation, and the file properties of the renamed, encrypted files pointed us right to it. I think we actually called the user and had him literally rip the Ethernet cable out of the wall jack.
Stop and breathe
OK. Take a deep breath. What just happened here? How do we deal with it? Then we remembered our backups. This would be the quickest and easiest way to fix it. We had nightly backups of all shared network drives on our NAS device. We'd lose a few hours of work corporate-wide, but at least we would be back up quickly.
The responsible user's workstation hadn't been backed up, but there wasn't much there, so we weren't worried about that casualty. We didn't have the most robust backup software either (an Acronis competitor, which I won't name), but something is better than nothing, right? My colleague started digging into the backup files and kicked off the recovery. We were feeling great until the restore process started to fail. Apparently, the backups weren't getting validated and hadn't completed successfully for the previous six months.
The recovery plan that we didn’t have
Well, OK. How bad was the damage? The encrypted files had been renamed with a fairly consistent convention, so we were able to use Windows search to see how many there were. A thousand files showed up. Then five thousand. Twenty thousand files. Fifty thousand. Then up over 100,000 files. This wasn't good, and it became apparent that paying the ransom was going to be our best option. This led to nearly a week of trying to acquire almost $4,000 in bitcoins, which is a surprisingly frustrating challenge when you have no bitcoin purchase history and don't have a validated account with an existing bitcoin exchange organization.
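For anyone sizing up similar damage, a short script is faster and more reliable than clicking through Windows search. Here's a minimal sketch in Python; the `.encrypted` suffix is just a stand-in, since every ransomware strain uses its own renaming convention:

```python
from pathlib import Path

def count_encrypted(root, suffix=".encrypted"):
    """Recursively count files under root whose names end with the
    ransomware's rename suffix (hypothetical '.encrypted' by default)."""
    return sum(1 for p in Path(root).rglob(f"*{suffix}") if p.is_file())

# Example: count_encrypted(r"\\fileserver\shared") against a mapped
# or UNC share, after substituting the actual suffix from the renamed files.
```

Running something like this per share also gives you a per-folder tally, which helps when you later reconcile which files decrypted cleanly and which didn't.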
Out of desperation, we actually ended up finding a local, "back-alley" bitcoin dealer. I won't get into the details, but it was sketchy – not to mention expensive and financially risky. After paying the ransom to the hacker, the other risky thing was watching the hacker's decryption tool run on our network while not really knowing what it was up to. We obviously have financial and identity-related data on our network. There was also some fallout with a few hundred files that never quite decrypted properly.
How did we get into such a mess? It turned out that a user had received an email with an attached PDF that included a bad link. The PDF looked like a legitimate invoice or quote request, so the user made the mistake of clicking on a link in the document.
From bad to worse
Seems like we’re in the clear, right? Wrong. The story gets worse. We started validating our backups going forward, but within four weeks we were hit with a second attack. This one actually infiltrated the backup files on our NAS and encrypted them as well, which was something we didn't even realize was possible. The back-alley bitcoin guy was trying not to laugh when he came into the office to cash in ... for the second time.
This second attack was initiated by someone who managed to log onto our server via Windows Remote Desktop Connection. We had issues with the decryption script (once we finally paid for it), as it didn’t support the operating system we were running. Again, we ended up with some file fallout. Of course, my mind goes back to the first time we ran the decryption script. It seems a little coincidental that the "second hacker" jumped onto our network so easily, and then knew exactly how to get into our secured NAS, doesn't it?
These attacks cost us tens of thousands of dollars in lost production time and made us look foolish and incompetent to our fellow employees, customers and suppliers. In addition, it cost us nearly $8,000 in ransom, which in turn undoubtedly helped the hackers continue to perpetuate these malicious attacks. It cost us nights and nights of working to clean things up. It required some very uncomfortable conversations with our CFO, COO and the company owner.
Lessons learned (the hard way)
In the end, we learned some very important lessons the hard way. We learned to utilize quality backup software like Acronis, which is ultra-secure and utilizes secure cloud backup storage alongside physical backups. We also moved our physical backups offline and off-site. And we're now running a new firewall that has multiple layers of security protection.
We fell asleep at the wheel and had a very rude awakening. And here's the real-life issue: I know we're not the only company that's become complacent with backups. Most of us are overworked and overtasked, so it's easy to put things like backups at the bottom of our priority list, especially if we go months or even years without having to rely on them. What we learned, however, is that some time, money, and proactive planning on the front end is MUCH easier and MUCH less expensive than trying to react on the back end of an unforeseen attack on your data.