“My worst nightmare.”  That’s how John Doe (the name we’ve agreed to call him), an IT manager with 25 years of experience in the field, described the first of three ransomware attacks on his employer. The incident cost the company hundreds of thousands of dollars and halted its operations for weeks, revealing the cracks in what they thought was a solid security plan.

John works for the Montreal branch of a manufacturing company. In November he spoke to Jean-Michel Vanasse at Cybersecurity 20/20, an event presented by IBM and NOVIPRO, a firm that specializes in IT and cloud computing solutions for businesses. “Nobody wants to talk about this kind of incident. We’re glad someone came forward to share their experience,” said Yves Paquette, president of NOVIPRO.

“It started with employees reporting that they couldn’t connect to our system remotely or access their files,” explained John. He quickly realized that the company had been hit with a ransomware attack, in which data is encrypted to prevent employees from accessing it and then the attackers demand a ransom to be paid on the dark web in exchange for the decryption key. The damage was considerable. In less than an hour, the hackers had seized multiple servers, including company databases and staff emails.

Paying the ransom was never considered, since there was no way to be sure the criminals would hold up their end of the bargain. John says his company never even bothered checking how much the attackers wanted or what the payment terms were.

From bad to worse: when your recovery plan fails

In situations like this, the best thing you can do is to disconnect from the network and rebuild the servers based on your backup copies. The company thought it was well protected. It had daily on-site backups, other copies on remote servers, and even more copies off the network for maximum security.

“But the most recent backups didn’t work,” explains John. “The only copies that hadn’t been encrypted were those that were disconnected from the network and stored remotely, but we soon found that we couldn’t restore them.” They panicked. None of their relatively recent backups would allow them to recover all of their data. In fact, the most recent intact copy was eight months old. The company was forced to comb through each backup copy to see what could be salvaged, and then piece the fragments together.

“It was a puzzle for all of us,” laments John.

The attack revealed a major flaw in the company’s security plan: they were creating backup copies, but they weren’t verifying them. “A plan that isn’t tested isn’t a real plan.” It’s a mistake they won’t make again.

The consequences of the attack

Ultimately, the email server was the hardest hit and they lost eight full months of data. John managed to retrieve some messages from employee workstations, but the rest was gone forever. Rebuilding the company’s central database required manually recreating transactions from hard copies—two to six weeks’ worth, depending on the department.

They were able to gradually resume routine operations two weeks after the attack and it took a month for the business to be fully operational again. In the meantime, hundreds of employees were paid to essentially do nothing while the rest were only partially productive. John says the financial losses are hard to quantify, but they “definitely amount to hundreds of thousands of dollars.”

Two more attacks

John never found out how the company was attacked or why. In all likelihood, it was simply an attempt to extort money from anyone who clicked on a phishing email sent at random.

The manufacturer’s U.S. parent company fell victim to similar incidents on two subsequent occasions. And even though the company was better prepared and the damage was less severe, the intruders still got in.

“A company can improve its posture with more rigorous staff training and backup policies, but you can never reduce the risk to zero,” says John. “It just takes one click to bring down the entire house of cards.”