I Love How NYC Came Together to Hate on an AI Companion Product

Reading Time: 6 minutes.
A drawing of a "friend" poster, defaced. The definition for friend is crossed out with text reading "an actual person, you losers"

(That’s a fake URL, obviously don’t go there)

If you’re a New Yorker, you’ve probably seen the ads for something called “Friend” in the subways. It seemed you couldn’t find a train car or station without one of the low-information ads. From the first moment I saw one, I just knew it had to be for AI. It mentioned nothing except a fake dictionary definition for the word “friend” on a white page with an overexposed image of something circular. It looked so full of itself and purposeless, I just knew it was AI. I felt the subtle urge to deface it with my trusty sharpie. I didn’t (officer), but the urge is clearly one I was not alone in feeling.

I haven’t seen a single one of these ads without some kind of defacing or damage in weeks.

I’m so proud of you, New York.

NYC is a dense city. We’re close to humanity all the time, and many of us have large, sprawling friend groups. Many of us are artists, even if only as a hobby. That makes this the worst possible place to try to push an AI companion: artists hate AI that steals our work, and we have no need for fake friends. Plus there’s one more factor: New Yorkers don’t give a fuck. We jaywalk, flip off cops, argue with cab drivers who blow red lights, jump turnstiles, and curse out tourists and transplants who stop walking in the middle of a sidewalk. New York forces you to speak up a little, so we do.

Of course we defaced those ads. And while New York may have been just the wrong place to try to push an AI “friend,” these kinds of products simply aren’t popular. Thankfully. Because AI that’s always listening is not a precedent we want to set, but it is becoming more common. That’s why we have to call it out as loudly as we can. Especially with a sharpie in a train station.

What is this “Friend” Garbage?

Friend is an AI service that uses a chatbot “friend” to help you feel less lonely. The ads show an overexposed photo of a white pendant that looks like an oversized AirTag. In many ways, it feels like an ad for an iPod from the mid-2000s, a pristine design that’s supposed to evoke the idea of “the future.” But is the future of friendship a one-sided affair with an AI?

The Friend hardware costs $129, listens to what you’re saying, and simulates a “friend” wherever you go. The company is a small startup, and it says “memories” are stored encrypted on the device, cannot be recreated, and won’t be erased by updates. Still, some users found it would “forget” things like their name, and at one point it completely “forgot” everything and had to be rebooted. It apparently “listens constantly” and uses Google’s Gemini to generate responses. The pendant is always listening, but you can also press its button to ask it questions, and it’ll send generated text to your iPhone (and only an iPhone) via an app.

It’ll also send you messages unprompted at times, kind of like how a real human friend can text you or interject in a conversation. The pendant listens to and processes the data of anyone around you, at all times, without their consent. Apparently it can even tease or bully you over what it’s hearing. AI has allegedly helped teenagers kill themselves, so of course there’s a chance it could be abused. Friend says no one under 18 is supposed to use the device, but since it’s always recording, it will inevitably collect data from people of all ages who never consented to that collection.

“I couldn’t find satisfactory environments to test the always-listening Friend; the concerns about digital eavesdropping made it too much of a gamble.”

– Kylie Robison on the difficulty of testing the Friend, for Wired

The creator says he based the personality of the Friend off of his own personality, which Wired writers describe as “brash” and “snarky.” They described the tone of the Friend as “opinionated, judgy, and downright condescending at times.”

It’s hard to describe the Friend AI pendant without exhibiting those qualities yourself.

The recordings, it seems, are not stored exclusively on the device. The Friend pendant requires a constant internet connection through your phone to do anything. The company claims it does not currently sell data “to third parties to perform marketing or profiling,” and says it has no plans to do so in the future. However, it does state that it uses data for research and personalization, and that it may hand data over to authorities. That implies the company collects data from the Friend pendant, or from any interactions with it. It’s hard to say just how much data could be collected without consent, or what that data includes, because even users can’t see what part of a conversation Friend is replying to, let alone what data has been collected. Friend says its users are liable for complying with local laws, but if they don’t know everything that’s being recorded, and Friend doesn’t make that kind of data available, how could they know whether they’ve violated wiretapping laws? Without that transparency, users could be taking on real risk.

“I realized that not everyone wants to be my friend.”

– Avi Schiffmann, on reducing the snarkiness of his Friend wearable

It sounds like it’s a toxic friend who is antagonistic, might gossip about you behind your back, and everyone gets upset when you bring them around. A real party pooper. If that’s what any of your actual friends are like, you deserve better.

Why is it Everywhere?

“The audience completes the work. Capitalism is the greatest artistic medium.”

– Avi Schiffmann, CEO and creator of Friend

Everyone, even those of us who typically don’t pay much attention to the ads around us, knows the Friend ads. That’s because the startup dumped over $1 million on them. Imagine having $1,000,000 and spending it on being mocked by everyone in one of the world’s largest cities. What a waste. That money bought more than 12,000 ads, most of them in subway cars, though some take up whole panels in subway stations and even line the walls of station walkways.

And of course, we hate it.

Schiffmann claimed in an interview with Adweek—after the fact—that it was on purpose, stating “I know people in New York hate AI … probably more than anywhere else in the country. So I bought more ads than anyone has ever done with a lot of white space so that they would socially comment on the topic.”

He had also mentioned trying to emulate Apple’s design, which similarly relies on a lot of white space, so the ads were probably going to look this way regardless of motivation. You could be forgiven for believing he’s coping with the negative attention after the fact. Because is someone who sets out to antagonize a whole city really the kind of person you’d want designing an artificial friend? Is the person who wants to bother you on West 4th Street really someone you want hanging around your neck all the time? Why would you be interested in a product that gets your attention through trolling?

AI Data Collection is the New Social Media Spying

Schiffmann says he doesn’t collect data for marketing purposes and doesn’t intend to. The Friend AI companion is, supposedly, going to be just that: a chatbot made to sound like a friend. However, other companies may not share that motivation, and normalizing this kind of “always-on” recording can only have negative consequences for privacy. We can’t let these become normal parts of our lives. I already wrote about how no one wearing Meta’s smart glasses is allowed around me. The same goes for any of these lonely loser lanyards.

AI will inch its way into our lives. But we can’t let it come from companies that want constant involvement in our lives or that may sell our data. AI should be a tool we can use and put away, not something that leeches off a part of us. Even if you could completely trust an always-listening AI pendant not to share the data, do you want to normalize technology that’s always listening? Some companies will want that data for marketing purposes, if they’re not collecting it already, and normalizing it will just expose us to more of that tracking.

If Your Friends are AI, You Have No Friends.

Go make friends. I know, listen, I know, it’s not easy. I’m a socially anxious person, though people often have a hard time believing that. I mask it well. It took until adulthood for me to start actually making friends. Even then, after school, with no rigid structures to give us time to socialize, we have to create those situations for ourselves. It can be challenging. No one teaches you how to open up your life to new social connections. But everyone’s searching for the same things. Humanity wants to socialize. You just have to put yourself out there enough to find people looking for new friends. There’s a “loneliness epidemic”; trust me, a lot of people are looking for friendship.

Find your people. Think about the things that bring you joy and find other people who enjoy the same things. Friendships and relationships will happen naturally from there. For me, it was music. For you, it might be a media fandom or perhaps a climbing gym. Sometimes it’s just a trivia night at a local bar. I once went to a meetup for a niche series, knowing I was the only one of my friends going. I reminded myself that everyone there would have at least one thing in common. I had a blast and made some new friends. One ran a board game night, another activity I could use to meet new people. Another worked at a comic store and frequently attended events for nerdy fandoms. You’ll find friends and build networks if you let the world see you. You don’t need AI for that.

AI won’t mourn you when you die. And if you spend your life dedicated to AI over humanity, no one else will either. You’re in this life now, go enrich it by meeting others. Real people. Because no matter how good AI gets, it’ll never replace real human connection, and you’re probably worth actually connecting with.
