Apple Bans Apple Watch Developer Before Copying Design

FlickType's swipe Apple Watch keyboard next to Apple's. Apple's reads "Copy that," which is fitting.

Believe it or not, the official Apple image says “copy that.” Are they oblivious or rubbing it in?

FlickType was an app made for predictive typing on the Apple Watch and iPhone. The iPhone version was specifically made to help blind and low-vision users. The app was a hit both with Apple Watch users, who saw it as a better way to type on the Apple Watch than Apple's Scribble, and with blind and vision-impaired users. But behind the scenes, the developer had always struggled with Apple. Apple would deny updates to his iPhone keyboard because it required either VoiceOver or Full Access to work, though neither is against the rules. Apple flip-flopped on its decision, eventually allowing the app, then banning it.

The same goes for his Apple Watch keyboard app. Apple initially blocked it in 2019, claiming it didn't want keyboards on the Apple Watch, even though other apps with the feature, and at one point his own keyboard, had been approved. Later, Apple would approve it again.

Then Apple got on stage this week and showed off the Apple Watch Series 7 and its new "QuickPath" keyboard, which looked and functioned much like FlickType. As if that weren't bad enough, Apple had considered acquiring FlickType before banning it, then copying it.

Now FlickType's developer is suing Apple for targeting his app with takedowns and later copying it.

Blocked and Sherlocked

FlickType's developer, Kosta Eleftheriou, doesn't want to go into too many details about his two lawsuits against Apple. However, it's not hard to figure them out. The first concerns the approval and removal of the Apple Watch FlickType keyboard. Initially, it wasn't allowed because Apple claimed it wasn't going to allow keyboards on the App Store for the Apple Watch. This wouldn't have been much of a problem if A) Apple weren't in talks to acquire FlickType, and B) Apple hadn't already approved other keyboards. Apple did eventually approve the app, and you can still find FlickType on the App Store for the Apple Watch. However, the developer may not be able to update it. After all, the iPhone version went through similar hoops. The iPhone FlickType keyboard can speak what you're typing, which requires either VoiceOver or Full Access. Apple continued to approve, then deny, his app. Eventually, he gave up developing it.

Apple went from seeing demos of FlickType to releasing a very similar typing experience with “QuickPath” on the Apple Watch Series 7. That’s the second lawsuit.

This is something Apple does often: it sees a popular third-party app, copies the functionality, and releases the feature itself. It's called "Sherlocking," after Apple copied a popular app, Watson, with an update to its own app, Sherlock. The update killed Watson, as no one would buy it anymore. Eventually, Sherlock would become Spotlight Search. That's right: one of the most commonly used parts of macOS navigation started as an app Apple copied from another developer.

The difference this time is that Apple was considering acquiring the app and had asked for demos before releasing their own version.

Apple’s Human Error Problem

One of the most persistent issues for FlickType developer Eleftheriou was that Apple wasn't consistent with its reviews. For example, Apple approved the iPhone app after seeing that it required either Full Access or VoiceOver to function. However, each release was a guessing game. Would this reviewer understand that? Would they look at previous reviews? Or would they simply deny the app, and the app's appeal? It turns out it's frequently the latter. Apple has a serious human-error problem when it comes to App Store approvals. Many apps on the App Store are scams, generating millions of dollars, and Apple does little to stop them. Apple profits from those scams, so the incentive to fix things just isn't there. But it's also a problem of having too many apps to review: human reviewers make frequent mistakes.

Dangerous Mistakes

This is especially dangerous in light of Apple's proposed CSAM scanning tools. For a quick recap, these look for child sexual abuse material (CSAM) on your device. They don't scan in the cloud; they scan your iPhone's storage directly. This is problematic for many reasons. First, countries could use it to scan for any image, including political content, LGBTQ+ content, photos from protests, and so forth. Second, there are false positives. Within a day of having access to the tool, researchers found at least one natural false positive: two completely different images that matched anyway. If matches happen often enough, a human reviews blurred thumbnails of your photos to see if they might be CSAM before sending them off to the proper authorities. Apple backed down due to outcry from customers, security experts, and privacy advocates, but may still release the feature at a later date.
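To see why "completely different images" can match, consider how fuzzy image matching works in general. The sketch below is not Apple's NeuralHash (that is a neural-network-based hash); it's a toy "average hash," a much simpler perceptual-hashing technique, shown only to illustrate the principle: because the hash deliberately ignores most detail, visually different images can collide.

```python
# Toy perceptual hash (average hash), for illustration only -- NOT
# Apple's NeuralHash. Pixels brighter than the image's mean become
# 1 bits, the rest 0. Any two images with the same bright/dark layout
# hash identically, no matter how different their actual pixel values.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Two clearly different 4x4 "images": one high-contrast black/white,
# one made of two barely distinguishable grays.
img_a = [[250, 250, 5, 5]] * 4
img_b = [[140, 140, 120, 120]] * 4

print(average_hash(img_a))
print(average_hash(img_b))
print(average_hash(img_a) == average_hash(img_b))  # True: a collision
```

Real perceptual hashes are far more sophisticated, but the trade-off is the same: the fuzziness that lets them match resized or recompressed copies of an image is exactly what makes false positives between unrelated images possible.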

Apple's employees can't even review an app properly when the instructions for doing so come in the app's description. A false positive with CSAM scanning could ruin a person's life. Local police wouldn't treat someone accused of child abuse kindly, and proving one's innocence could take months or even years. Fighting technology that has already found you guilty is extremely difficult. This isn't the kind of thing you want to get wrong.

Many Things Wrong with Apple

This one issue highlights many problems at Apple. The Apple Watch Series 7 is such a small upgrade that Apple tried to block another keyboard app from the App Store so people would have to upgrade to get a keyboard. Apple also Sherlocked yet another developer after considering purchasing his product and receiving demos. And the whole episode shows how bad Apple's human reviews can be.

Apple likely thought none of these items would be a problem for their public perception and brand image. When people thought of Apple as a titan that could do no wrong, that was easy. We know better now. Apple has some changes to make.
