Many Apps Record Your Every Action Without Securing Your Information: A Developer’s Perspective

Three screenshots from the Air Canada app. These screenshots contain user data and were sent to servers without encryption.

Unmasked passwords captured by the Air Canada app

I’m a software developer during the day and… well, often the night as well. I’ve done a wide variety of software development, from games to automated tests, PC to Android, and everywhere in between. But it’s those automated tests I want to talk about. I’ve already lost a large number of readers with that, but stick with me here. Some of the most creative projects I’ve worked on had to do with testing. For those who don’t know, automated testing is a form of end-to-end testing. Basically, you run your app or webpage somewhere and automate interaction with it using a different program. This can tell you if you have any bugs or problems.

Some of the best features I worked on were things to make diagnosing a problem easier. A simple failure message isn’t enough. My tests would log every interaction, create their own steps to reproduce bugs, and would pair each of those with a screenshot. We would even do optical character recognition on screenshots to pull out important text, toast messages, or other difficult to quantify items. This was incredibly useful for finding bugs, figuring out how to replicate them, and getting us on the fast track to fixing them.
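As an illustration of that pattern, here is a minimal sketch of a test harness that pairs every step with a screenshot. Everything here is hypothetical, not a real framework’s API; `Driver` stands in for an automation tool like Appium or Selenium.

```python
class Driver:
    """Stand-in for a real UI automation driver (e.g. Appium, Selenium)."""
    def tap(self, element_id):
        pass  # a real driver would locate and tap the element

    def screenshot(self, path):
        return path  # a real driver would write an image file here

class SteppedTest:
    """Wraps each action so a failure comes with reproduction steps
    and a screenshot for each one."""
    def __init__(self, driver):
        self.driver = driver
        self.steps = []  # (description, screenshot path) pairs

    def step(self, description, action):
        action()  # perform the interaction
        path = f"step_{len(self.steps):03d}.png"
        self.driver.screenshot(path)  # pair the step with a screenshot
        self.steps.append((description, path))

t = SteppedTest(Driver())
t.step("Tap the search bar", lambda: t.driver.tap("search_bar"))
t.step("Tap the first result", lambda: t.driver.tap("result_0"))

for desc, shot in t.steps:
    print(f"{desc} -> {shot}")  # e.g. "Tap the search bar -> step_000.png"
```

A real harness would then feed each screenshot through OCR to extract toasts and other hard-to-query text, as described above.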

That technology was separate from the apps we released on Google Play or the App Store. None of the companies I’ve worked for ever had something that logged customer interactions the way I was logging our automated tools.

But some companies do.

All Your Screen Are Belong to Us

Some companies record every interaction you have on a webpage or app in the form of a screenshot. That’s not necessarily wrong. In fact, there are many benefits for the user. However, if screen recording is not done properly, it’s a huge security and privacy mess. It’s hard enough to get developers to write their own tests and diagnostic software. You can bet they didn’t do screen recording for analytics and testing purposes correctly.

A number of apps on the App Store feature screen recording for analytics, customer service, and testing purposes. Unfortunately, some were transmitting that information unencrypted and without telling users, and it has leaked app users’ personal information. Here’s what happened, why it’s wrong, which apps were affected, and what will be done about it.

The Benefits of App Recording

I’m sure many of you are more concerned about the security implications of sending your screen across the internet every time you tap it than about how this will help you, but stick with me for a bit. There are many reasons to support this kind of screen recording. But you can always skip to the next major section, “Where Developers Went Wrong,” if you just want to be angry.

Customer Service

Let’s say you’re using your favorite paid app and it crashes every time you try to do something important. You send an email off to customer service but they don’t know what the hell you’re talking about. A few weeks or months later, one of their developers stumbles onto the glitch and it’s finally fixed. However, you’ve already deleted the app, stopped using it, or found that alternatives aren’t as good as what you used to have.

But let’s replay that with screen recording. You make a complaint and the customer service rep looks up the last screen recording of your app session when it crashed. They quickly see how you had the problem, and a developer has picked up the ticket, ready to fix it for the next release. You’re thanked, and the next time you update your app, you know that you had a part in fixing that bug for millions of people.

Analytics

So, you know when you first start texting someone you really like? Every message you send throws butterflies into your stomach. Is she going to like this? Can they respond right now? Will he laugh? Maybe they ghost you in the end. It hurts, but, dating, like app development, is a learning experience. You figure out what works, what doesn’t, and you get back out there. You 2.0, better, faster, smarter. Release notes: the latest patch removes that bug where the human would randomly start talking about its keyboard collection for twenty minutes.

App developers do this kind of analysis on our creations as well. But we don’t want to wait for the 1-star app reviews to come pouring in to discover an issue. We don’t want to find out that no one can figure out how to use our “new and improved” search bar until it’s too late. We want to make sure our users are happy right away and solve problems before we lose them. While that would be extremely creepy and needy in relationships, it works out quite nicely in software development.

We do this through analytics. Developers measure which features get a lot of use, which users really like the app, and which parts of the app they like best. We learn which parts of the app we can remove, improve, or revert. By looking at the parts of our app or webpage that get a lot of traffic, or are used by our target audience, we can ensure the app works exactly how our users want it.

Screenshots to the Rescue

Taking screenshots of user interactions can tell us more than event logs. Rather than just “The user searched for X,” we’ll know that “The user searched for X because they couldn’t see X was already on their home page.” Screenshots show us exactly what’s going on, enabling us to see how our apps are really being used. The result is slick interfaces that feel intuitive, because we designed them specifically with our users in mind.

Limitations on What’s Recorded

Now, these tools aren’t actually recording video. That would be far too much data constantly going back and forth between servers. No one wants that. Instead, it’s screenshots of interactions, and it might not even be all interactions. Perhaps we don’t want to see the information you enter during the sign-up flow; we only want to see how people interact with a new ad placement, or a new video player. Do people know they can adjust the video quality? Can they find that setting easily?

These interactions tell us more than typical logging does. Yes, we all do logging. Every website, app, and game you’ve ever used is doing logging. We’re already tracking users’ every click, every message, every search, every like, all to find errors and find ways to improve the experience of the app. Taking screenshots of those interactions gives us the why behind a particular action. It teaches us far more than a log entry like “User tapped on the search bar twice.”
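For comparison, the conventional logging described here usually amounts to small structured event records. This is a hypothetical sketch; the event names and fields are invented, not any real analytics SDK:

```python
import json
import time

def log_event(stream, name, **props):
    """Append one structured analytics event to an in-memory stream.
    A real SDK would batch these and send them to a server."""
    stream.append({"event": name, "ts": int(time.time()), **props})

events = []
log_event(events, "search_bar_tapped")
log_event(events, "search_submitted", query_length=7)

# Each record says *what* happened, never *why* the user did it.
print(json.dumps(events, indent=2))
```

Notice how thin these records are compared to a screenshot: the log knows a search was submitted, but only an image of the screen reveals that the item searched for was already visible.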

We’re not taking video. However, we do need to ensure we’re not collecting information you wouldn’t want recorded. You likely don’t want your personal information included in those screenshots. You don’t want unencrypted passwords or credit card numbers stored anywhere. Let’s face it: you want the benefits of these screenshots without a sloppy implementation leaking your personal information, photos, passwords, and more onto the internet. And this is where developers messed up.

Where Developers Went Wrong

Data Unmasked and Unencrypted

If you’re going to take screenshots of users’ interactions with your app, there are two key things you should do. First, mask private information. Developers must place a black box over any field that can contain personal data: form fields users fill out, plain-text password views (shown so users can confirm they haven’t made a typo), banking numbers, and other such information. This requires extra work on their part. They’d have to label every single text field where personal data could live, which demands significant diligence.
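The masking step itself is simple once fields are labeled. Here is a minimal sketch, assuming the developer maintains a list of screen regions flagged as sensitive; in a real SDK those rectangles would come from tagged views in the UI hierarchy, and the screenshot would be an actual bitmap rather than a grid of numbers.

```python
def redact(pixels, sensitive_regions, fill=0):
    """Black out each (x, y, width, height) region in a pixel grid
    before the screenshot ever leaves the device."""
    for x, y, w, h in sensitive_regions:
        for row in pixels[y:y + h]:
            for i in range(x, min(x + w, len(row))):
                row[i] = fill
    return pixels

# A 4x8 stand-in "screenshot" of nonzero pixels; the region below
# covers a hypothetical password field.
shot = [[1] * 8 for _ in range(4)]
masked = redact(shot, [(2, 1, 4, 2)])  # x=2, y=1, 4 wide, 2 tall

print(masked[1])  # -> [1, 1, 0, 0, 0, 0, 1, 1]
```

The hard part is not this function; it is the diligence of tagging every view that can ever display personal data so it ends up in `sensitive_regions`.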

The second option is likely the one most developers would choose instead (but should do alongside masking): encryption. Before transmitting the images from users’ phones to your servers, encrypt them on the device. This prevents them from being intercepted in a “man-in-the-middle” attack, and it ensures you’re storing the images encrypted on your servers.
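To show the shape of that pipeline, here is a deliberately toy illustration: the XOR one-time pad below is only a stand-in so the example stays self-contained. A real app should use the platform’s crypto APIs (for example AES-GCM) with proper key management; every name here is illustrative.

```python
import secrets

def encrypt(screenshot: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR with a random single-use key of equal length.
    Stands in for real on-device encryption such as AES-GCM."""
    assert len(key) >= len(screenshot)
    return bytes(b ^ k for b, k in zip(screenshot, key))

decrypt = encrypt  # XOR is its own inverse

shot = b"password: hunter2"            # what an unmasked screenshot leaks
key = secrets.token_bytes(len(shot))   # random per-upload key
wire = encrypt(shot, key)              # this is what crosses the network

assert wire != shot                    # an interceptor sees only ciphertext
assert decrypt(wire, key) == shot      # the key holder recovers the image
```

The point of the sketch is where encryption happens: on the device, before the bytes touch the network, so neither a man-in-the-middle nor the storage server ever sees plaintext.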

The problem? Most developers were doing neither. That means plain-text passwords, addresses, banking information, credit cards, and more were transmitted over the internet inside images. It’s impossible to say how many people were exposed. If you used any of these apps and they weren’t encrypting or masking your private data, it’s all but certain that data will reach hackers at some point. Using optical character recognition (OCR), it would be incredibly easy to index all of this plain-text information. A hacker could capture every bit of data you’ve ever entered into one of these apps.
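To see how little effort that indexing takes, assume an attacker has already run OCR over a dump of intercepted screenshots. Sifting the output for card-shaped numbers is a few lines of standard-library Python; the sample strings below stand in for OCR output and are invented.

```python
import re

# Hypothetical OCR output from three intercepted screenshots.
ocr_output = [
    "Card number 4111 1111 1111 1111 Exp 04/27",
    "Welcome back, Jane",
    "Passport No. AB1234567",
]

# 13-16 digits, optionally separated by spaces or hyphens,
# i.e. the shape of a payment card number.
card_re = re.compile(r"\b(?:\d[ -]?){13,16}\b")

hits = [line for line in ocr_output if card_re.search(line)]
print(hits)  # -> ['Card number 4111 1111 1111 1111 Exp 04/27']
```

The card number shown is the standard “4111…” test number. Real attackers would layer similar patterns for emails, passports, and addresses on top of the same OCR pass.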

If your data is stored in unencrypted plaintext somewhere in the world, someone will find it eventually.

Missing Privacy Policies

Even if these companies were storing your information properly, they weren’t telling you they were collecting it. Their privacy policies contained no clear information about these practices. Many used third-party tools like Glassbox, which doesn’t ask its clients to update their privacy policies. Other companies simply didn’t think the analytics and diagnostic information they were collecting through screenshots, rather than logs, was worth telling users about.

Regardless, even if you were the kind of person who reads privacy policies (you’re not), you wouldn’t have found out they were recording your every move. That now exposes both Apple and these companies to legal issues.

Third Party Tool Glassbox

Developers can integrate Glassbox’s tools into their apps to gain insights on customers and, as we’ve discussed, take screenshots of their activities. Glassbox makes integrating these features easy, but it also introduces issues.

For example, Glassbox stores the images on its own servers unless client companies pay to host them on theirs. This means Glassbox holds a large repository of images that include credit card numbers, usernames, addresses, full names, banking information, passport numbers, and far more. If a hacker were to hit Glassbox’s servers, they’d have a treasure trove of personal information. Glassbox claims the images are encrypted, but we know they’re not encrypted before leaving a user’s device, so in transit they’re effectively unencrypted.

Even if a developer decides to use their own servers instead of Glassbox’s, the information is still transmitted without encryption. Someone on an unprotected WiFi network, a victim of a malicious router in a man-in-the-middle attack, or the company itself could leak user data. Glassbox’s software doesn’t enable this encryption, and a developer might have to go to great lengths to add it themselves, depending on how open Glassbox’s SDK is.

Glassbox also does not inform developers that they’ll have to update their privacy policies. They don’t force their clients to do this either. While forcing developers to update their privacy policies would be the right thing to do, it would likely increase the effort required to implement their SDK. As a result, Glassbox chooses to allow developers to do whatever they want with their privacy policies, largely leaving users in the dark.

Air Canada’s Data Breach

Testimonials from Air Canada employees touting their love of Glassbox

Air Canada, victim of a hack, was a Glassbox user.

In 2018, Air Canada, a Glassbox user, leaked piles of customer data. Hackers were able to obtain personal information, including the full details of a person’s passport, names, email addresses, various travel account numbers, photos, addresses, and more. It’s an extreme privacy leak that will have identity theft ramifications for the victims for years to come. The leak only affected mobile users.

The fact that this only affected mobile users is suspicious. If it had been a breach of Air Canada’s servers, where they were storing this information without encryption, it would have also impacted desktop users. Instead, it only hit mobile users, the ones who would have been exposed by screenshot-based diagnostics and analytics. Glassbox could have leaked this information, or at least enabled Air Canada to leak it.

How’d Air Canada Leak Data?

Two screenshots of the same screen. In one, the credit card numbers were hidden. In the other, they were not.

Intercepted screenshots from Air Canada were not always masked. Source: The App Analyst via TechCrunch

Initial reports on the Air Canada leak did not say how hackers gained access to this information. Initially, we suspected the hack involved scraping passwords and usernames leaked from other sites and using them to access people’s accounts and personal information. But that would have worked for desktop and mobile users alike. Air Canada did not explain how hackers got their hands on mobile users’ accounts specifically.

Air Canada claimed they did not see improper password usage, which implies the hack did not use passwords. Chester Wisniewski, a cybersecurity specialist who spoke with CBC, stated, “I suspect hackers stumbled across a bug in the API.” I’m not a cybersecurity expert myself, but I have some experience in the matter. I agree: this doesn’t sound like a password hack; it sounds like some part of the Air Canada app used an insecure API or SDK. Knowing what we now know about Glassbox, it’s possible these screenshot-based analytics were the cause of the hack.

What Apple Wants

Apple doesn’t want to put a stop to the practice of taking screenshots of users’ actions. Combined with diagnostic logs, it’s incredibly useful. Glassbox isn’t the only developer making screen-session recording software. Many companies do it in-house, either for diagnostics and logging or just for internal testing. These are incredibly useful tools that help us make better apps.

However, Apple does want developers to be clear with their users. It’s forcing developers to either remove their screen recording software or clearly describe it in their privacy policies. Apple’s requests seem to vary between developers, so it may treat some violations, like transmitting unencrypted screenshots, as more serious breaches of its guidelines than simply failing to disclose that screenshots were being taken.

Apple has strict rules against collecting user information without informing users. These apps clearly violated those rules.

“Protecting user privacy is paramount in the Apple ecosystem. Our App Store Review Guidelines require that apps request explicit user consent and provide a clear visual indication when recording, logging, or otherwise making a record of user activity. We have notified the developers that are in violation of these strict privacy terms and guidelines, and will take immediate action if necessary.”

– Apple spokesperson to TechCrunch

What Google Wants

Google has not threatened to ban these app developers from Google Play. They have not banned the practice either. Google Play developer guidelines state that developers must disclose data collection practices to the users. This seems to be a violation of that policy, but, depending on the individual privacy policies of these apps, they could already have language that Google finds sufficient. Google seems to believe that existing privacy policies and the poor security within these apps are fit for their service. You might think otherwise.

What do WE Want?

So what should we ask for? Certainly not for developers to get rid of screen recording software. It’s extremely useful for diagnosing problems, customer service, testing, and analytics. It helps us make your apps better and it makes getting help for problems much easier as well.

However, we should ask for four main things:

  1. Encrypt all screenshots on device, and transmit and store them encrypted.
  2. Never capture sensitive data in screenshots at all.
  3. Disclose the practice clearly in the app’s privacy policy.
  4. Give users the option to turn it off.

These four simple rules would allow developers to continue to collect vital diagnostic information while respecting privacy, security, and users’ choices. Implementing them, whether with in-house recording or some third-party APIs, would not be difficult. The most difficult, and most vital, part is encrypting the screenshots. Some third parties may make this incredibly tricky, if not impossible. Developers relying on those third-party tools would have to find replacements, make their own (which, admittedly, would not be difficult either), or simply go with the second rule and never take screenshots of personal information.

These are not big asks. App users deserve to know that their data is being handled with care, that they’re in control of their devices, and that they’re in control of what information is made public. Unless we push Apple and Google to enforce stricter guidelines than their current rules, we won’t get the security and privacy we need from our apps. You can send Apple feedback through its feedback website, and Google has a feedback form as well. Ask them to enforce better privacy policies. You can also share stories like this one with your friends to better explain the issue and create advocates out of everyone.

There’s room for compromise here; we just need to tell Apple, Google, and app developers what we want.

