Sony failed its employees twice: first when it skimped on security, and again when it refused to protect them after the breach. Tech companies made deep cuts this year, and the employees who remain feel pressured to overwork. Meanwhile, those same companies may not be putting the resources into keeping their systems, and the people behind them, safe. Those making the decisions that put a company and its employees at risk need to be held responsible. Upending the course of someone's life and safety should come at a great cost.
What Happened at Insomniac?
Hackers from the Rhysida group took 1.67TB of data from Sony-owned game studio Insomniac Games with a phishing attack. They stole game assets, story details and spoilers, future game roadmaps, internal communications including emails, Slack messages, and recorded meetings, and employee information.
The employee information included "HR documents" and personal details such as home addresses, disciplinary reports, and more. The confidential information on employees was enough to steal their identities or to stalk them.
Software engineers were already under considerable pressure in 2023 due to unprecedented layoffs. Game developers are also frequent targets of harassment campaigns, especially women, racial and religious minorities, and LGBTQ+ people, groups at heightened risk simply because they make entertainment. With personal information and Slack conversations exposed, it may be easy to figure out which of those groups a given employee belongs to.
Stress from work shouldn’t include having a target on your back, and it should never include your own company helping arm your attackers.
Rhysida demanded 50 Bitcoin, approximately $2 million. The data was put up for auction, and about 2% of it sold. The rest, as Rhysida had promised, was released publicly after the one-week deadline the group gave Sony and Insomniac Games expired. The group claims the attack itself took a mere 20-25 minutes to carry out.
Security Failures Are Often a Choice
Obviously no company executive wakes up and decides to have their company get hacked. They do, however, make decisions that lead to one. Those decisions can include:
- cutting employee headcount,
- not making time for "tech debt," the work that makes systems more secure, more robust, and easier to work on,
- not giving developers time to write tests and perform security checks,
- not hiring people who can test regularly, and
- choosing flashy products over the necessary, unglamorous work of making systems more reliable.
But it doesn't stop there. Companies should be performing their own penetration testing ("pentesting"), where either an internal "red team" or hired engineers try to break into the company's systems to help identify vulnerabilities. Companies need to ensure they're encrypting all data, both in transit and at rest. They need to ensure that permissions lock users out of data they don't need. That means not lumping everything into the same Google Doc group, but using groups and permissions so that someone with access to game storyboards doesn't also have access to employee records. Sensitive data should only be reachable over a VPN, and only after authentication that includes a second factor.
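To make "encryption at rest" concrete, here's a minimal Python sketch using the Fernet interface from the widely used cryptography package. The record shape and key handling are illustrative assumptions, not anything from Insomniac's actual stack; the point is simply that a stolen file is worthless to an attacker who doesn't also hold the key.

```python
# A minimal sketch of application-level encryption at rest, using the
# third-party "cryptography" package (pip install cryptography). The
# record contents are made up for illustration; in production the key
# would live in a secrets manager or KMS, never alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # never hard-code or commit this
cipher = Fernet(key)

record = b'{"employee_id": 42, "home_address": "..."}'

stored_blob = cipher.encrypt(record)     # what actually lands on disk
recovered = cipher.decrypt(stored_blob)  # only possible with the key

assert recovered == record
```

The same principle scales up: full-disk encryption, database-level encryption, and managed key services all exist so that exfiltrated files aren't automatically readable files.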
The weakest link in most security is the human element. Companies need to ensure they're not only training their employees, but testing them. Employees should know not to trust emails sent from external sources. These messages should be filtered for most employees to begin with, as many people never need external communications at work. Once you've made phishing difficult, educate all employees on what it is and what to do if they suspect it. There should be a well-staffed IT department that can respond immediately with help. Occasionally, pentesters should try to crack these employees' accounts, and the results should be used to improve education and strengthen security.
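As one concrete example of that filtering, here's a minimal Python sketch that flags mail whose From address falls outside the company's own domain. The domain name and warning text are hypothetical, and a real deployment would enforce this at the mail gateway, alongside SPF/DKIM/DMARC checks, quarantines, and allow-lists, rather than in a script.

```python
# A minimal sketch of flagging external-sender email so recipients see a
# warning. "example-studio.com" is a made-up internal domain; real
# systems apply this rule at the mail gateway, not per message in code.
from email import message_from_string
from email.utils import parseaddr

INTERNAL_DOMAINS = {"example-studio.com"}  # hypothetical company domain

def is_external(raw_message: str) -> bool:
    """Return True when the From address is outside the internal domains."""
    msg = message_from_string(raw_message)
    _, address = parseaddr(msg.get("From", ""))
    domain = address.rsplit("@", 1)[-1].lower() if "@" in address else ""
    return domain not in INTERNAL_DOMAINS

sample = "From: IT Support <helpdesk@totally-legit-support.net>\n\nClick here to reset your password..."
if is_external(sample):
    print("[EXTERNAL] Be cautious with links and attachments in this message.")
```

Most commercial mail platforms can apply this kind of external-sender banner with a configuration rule; the value is that a convincing "IT support" phish arrives visibly marked as coming from an outsider.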
These efforts cost money. Ransom costs money. Unfortunately, if a company would rather throw its employees under the bus than pay up, it's the workers who end up bearing those costs. A company that accepts responsibility can at least make that burden easier on its people.
Why Paying Ransom Usually Works
It might seem strange to suggest a company should pay for data. After all, the data was copied from their servers; it could be copied again. What guarantee does a company have that a hacker group won't share the data after getting its money?
Marketing.
The Rhysida group only wants to make money. They didn't attack Insomniac because they had any issue with the company or its games; they just saw an easy target. They were right. Had things gone to plan, they would have gotten $2 million for just 25 minutes of hacking.
Let's say they sell the data on the dark web to random buyers. Maybe they make $5,000? That's far less than they could get from a billion-dollar company. The Rhysida group made sure its name was on the communications. They have created a brand. Yes, branding for hackers. If they get a reputation for taking payments and leaking the data anyway, they'll never receive another payment. Keeping their word is what lets them make millions from large companies instead of a few hundred dollars from less skilled hackers and scammers who want to use the data to extort and steal from the victims.
There's precedent for this too. Beyond the more newsworthy hacks, people have their machines locked up by ransomware daily. Hackers use a trojan to encrypt the contents of a victim's drive and instruct them to send some cryptocurrency to get it decrypted. Victims pay. The hackers could just leave the drive encrypted, locking the victim out forever, but if they did that, no one would ever pay them. So, instead, they put on their best customer service voices and fix your computer.
On the more newsworthy front, hospitals have been targets of these ransomware attacks. Because patient information, including financial information, is so valuable, hospitals will often pay the hostage takers to keep that private data secured. Hacking groups have, largely, held up their end of the bargain. They're not multi-billion-dollar corporations who would just let your data walk out the front door; after all, they have standards.
Accountability Now
Executives, vice presidents, project managers, even individual contributors and the person or people who handed their credentials to phishers: they all made choices that led here. Obviously not everyone is directly responsible. The lesson is to do better and to make things right, now. Proper planning that makes time for tech debt, security checks, testing, and education, combined with robust security policies that break up data storage and encrypt everything, makes a company safer. Not perfectly safe. You can never be perfectly safe. But if you prioritize the security and privacy of your company and your employees, you don't suffer hacks like this every few years.
You certainly don’t just let the thieves share the data for less than your CEO’s annual bonus either.
Sources/References:
- Lawrence Abrams, Bleeping Computer
- Kenneth Andersen, Meto
- Samuel Axon, Ars Technica
- Nicole Carpenter, Polygon
- CBS News
- Sarah Fielding, Engadget
- George Szalai, The Hollywood Reporter