My parents were recently trying to sell a couch on Facebook Marketplace. They nearly got scammed. My mom was more cautious with the second person who messaged her, sending me the woman’s profile. Within minutes, I had sent back the woman’s full name, her husband’s name, their address, their family members, the cars they drive, a few hobbies they may be into, their sports teams, and more. I couldn’t have asked them to give me as much information about themselves as fast as I could find it online.
Many of us live in a post-privacy world. The United States has not made privacy a protected right for its citizens. Some might point to our Bill of Rights, but that only protects against unwarranted searches by the government. For profit? Your life is fair game.
That’s why we’ve seen a dramatic increase in the number of data breaches and leaks over the past few years. Personal information became the currency of many corporations, with nearly every company trading in it. There’s so much personal information out there, in the wild, and it’s usually not protected well. Companies store credentials, personal information, and more on servers that don’t use encryption. They don’t ensure that data is protected behind user groups, so that a product manager doesn’t have access to the database and an app developer doesn’t have access to sales receipts. Companies are putting data at risk with bad policies and poor training. Does a company verify that another company has decent data standards before selling it information? It doesn’t have to.
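Storing credentials in plaintext is exactly the kind of negligence described above, and the fix has been standard practice for decades. A minimal sketch of the alternative, using only Python’s standard library (the function names here are my own, for illustration, not from any particular company’s code):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted scrypt hash so the plaintext never touches disk."""
    salt = os.urandom(16)  # a unique random salt per user defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash and compare in constant time to avoid timing leaks."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

With this approach, a server breach exposes only salts and slow-to-crack hashes rather than usable passwords. That companies still skip this baseline is part of the negligence at issue.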
Companies aren’t taking security seriously. They don’t have to. A leak might have been damaging for a company in the past, with consumers wary of using their credit cards at a retailer that recently had a leak, and therefore less likely to shop there at all. Today, consumers don’t change their habits after a leak. Breaches are just too common to keep track of. On top of that, there are no laws ensuring that companies which hold our private data also protect it.
Since the market can’t change this behavior, we need to change the law. Massive violations of consumer security need to lead to sentencing, and even the end of companies with particularly heinous violations. We need to protect consumers’ privacy the same way we’d protect their bodily safety, because often they’re one and the same.
Privacy is Safety
In December, 23andMe, the DNA collection and testing company, disclosed it had suffered a data breach. Seven million people were affected, their DNA ancestry information leaked to hackers. Those hackers gained access to 23andMe’s systems in April of 2023 and went unnoticed for months, only being discovered in October. The company may have found out about the breach only because the hackers leaked the information, including the home addresses and birth dates of 1 million users with Ashkenazi Jewish ancestry.
Even when 23andMe finally told people their data had been hacked, which took over a month, it didn’t tell them they may have been targeted for their heritage.
A lawsuit against 23andMe claims the hackers targeted people with Ashkenazi Jewish and Chinese ancestry. A commenter asked the hackers for accounts with Chinese ancestry, and the hackers responded with 100,000 accounts, saying they had data from 350,000 accounts. 23andMe allegedly did not tell users they could be targeted for their race or heritage.
Antisemites, racists, and other hateful people could target their attacks based on nationality. And while 23andMe had all of this data in one place, cross-referencing between leaks could easily suss out this information in other ways.
It’s a gold mine for anyone looking to attack people based on their race, religion, heritage, or nationality.
Marginalized Groups, Again, More Affected
Across social networks, the victims of these attacks are often people who spoke out about social issues affecting racial or religious minorities, women, or LGBTQ+ people. Marginalized groups are often the target of hackers. With companies doing little to protect this information, and many even selling it, these attacks are easy to carry out. It’s a fear many have online, but none more than marginalized groups, who wonder if, by speaking out, they may endanger themselves or their families.
The 23andMe attack seemed to be a large grab of data, but the hackers focused on releasing data about particular groups of people, making it easy for white nationalists to access. These hacks hurt everyone, but they clearly affect marginalized groups more. Attackers do this because they know just how dangerous this information can be in the wrong hands. It’s a targeted attack, and companies holding this data improperly made it possible.
Dangerous Data
At companies where bad security policies gave employees broad access to data like users’ locations, stalkers have used their victims’ GPS locations to follow their targets. Google collects enough location data to plot out where you have been and even predict where you’ll be at any given time. That information could be invaluable to stalkers.
Financial records can allow someone to steal your identity, ruin your life, or even use your information to go after family members who might trust a message seemingly coming from their loved one. Health records could be used for blackmail, or again used to improve attacks on family members.
Losing your privacy isn’t just an annoyance. Poor data security could lead to emptied bank accounts, ruined credit scores, eviction, ruined reputations, and far worse. Some victims of doxxing have found armed members of SWAT teams kicking down their doors. Some have even been killed in such “swatting” attacks thanks to America’s trigger-happy police.
Data is a weapon, and companies hold the key to using it against you. Unfortunately, they’re not working hard enough to protect you.
Legal Repercussions Necessary
Some companies have never had a large or devastating leak; they know they would lose their business if hacked. This includes reputable security companies and password managers. Other companies know they won’t have to worry as much. The law isn’t as good at punishing companies as it needs to be. Fines are small and easy to write off as a cost of doing business, perhaps even lower than the cost of moving to better security.
The most heinous leaks, ones where the company was clearly negligent, should lead to harsh penalties: fines large enough to bankrupt the company, and arbitrators empowered to break companies into smaller ones to prevent such widespread data handling. CEOs and CTOs need to be held personally liable. Community service, security education programs, and even jail time could be appropriate. For companies that negligently leak the most vital information, like Social Security numbers, a corporate death sentence, completely revoking their right to do business in the country, is necessary.
Mistakes happen; to err is human. But often a leak isn’t an innocent mistake. If you drink and drive, you knowingly endanger the lives of others. If you store unencrypted information about people on your servers, you knowingly endanger them. That’s different from a data admin falling victim to a clever phishing attack. Poor policies should be considered criminally negligent. Accidents happen. Knowingly endangering people isn’t an accident.
We can go about this in a variety of ways. The first is to set minimum security requirements for companies, with third-party auditing for large companies and self-assessment for smaller ones. Educate them, and force them to keep their users safe. This could include letting users know what data is stored at all times, letting them delete that data at any moment, encrypting all data at rest and in transit, and locking data down so only a few employees can even gain access to it. It certainly includes regular training to help employees identify and halt attacks.
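Locking data down to a few employees is, at its simplest, a deny-by-default permission check. A toy sketch of the idea, with hypothetical role names and datasets of my own invention:

```python
# Deny-by-default role permissions: each role sees only the data sets
# it needs. These role and dataset names are illustrative, not from any
# real company's system.
ROLE_PERMISSIONS = {
    "app_developer": {"crash_logs", "feature_flags"},
    "product_manager": {"usage_metrics", "feature_flags"},
    "db_admin": {"customer_database"},
}

def can_access(role: str, dataset: str) -> bool:
    """Unknown roles and unlisted datasets get nothing by default."""
    return dataset in ROLE_PERMISSIONS.get(role, set())
```

Under a scheme like this, a product manager simply has no path to the customer database, which is the segregation described earlier: a phished account only exposes what that one role could see.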
Our lawmakers are out of touch with technology and the dangers data poses. We need laws protecting privacy and security, with harsh penalties. Otherwise, anyone will be able to find all of your data starting from little to no information about you. From facial recognition to text analysis, identifying a person from minimal information is getting easier every day. If we don’t force companies to lock our data down now, we’ll never get our privacy back.
For many—if not most—Americans, it’s already too late.
Sources:
- Katie Canales, Business Insider
- Bree Fowler, CNET
- Edward Helmore, The Guardian
- Beth Maundrill, Infosecurity Magazine
- Mariella Moon, Engadget