Snapchat’s My AI is aptly named, in that it serves Snapchat’s interests far more than yours. Presented by the company as an in-app feature to help users find everything from trivia answers to gift ideas, it’s riddled with privacy and safety concerns. If you—or your kids—are going to use it, here’s what to keep in mind.
First, the privacy issues. For My AI to work as intended, you have to give it access to your location and other personal data. That’s a trove of information for the company and for any entities it chooses to share that data with. And it’s a tracking nightmare, or bonanza, depending on your perspective: Snapchat and its AI know where you’re going and what you’re doing with your phone.
Then there are the safety concerns. My AI can make inappropriate suggestions and supply answers that parents would rather their kids not have. The Washington Post reported in the spring that during tests the supposedly safe technology offered advice on how to pull off an illicit party without getting caught. “After I told My AI I was 15 and wanted to have an epic birthday party,” technology columnist Geoffrey Fowler wrote, “it gave me advice on how to mask the smell of alcohol and pot. When I told it I had an essay due for school, it wrote it for me.”
Even Snapchat urges caution, writing on its website, “We’re constantly working to improve and evolve My AI, but it’s possible My AI’s responses may include biased, incorrect, harmful, or misleading content. Because My AI is an evolving feature, you should always independently check answers provided by My AI before relying on any advice, and you should not share confidential or sensitive information.”
Yes, there are examples of My AI being put to benign and entertaining use. But engaging with it carries risks. Best to be aware of them.