Thursday, September 12, 2019

Facebook App to Keep Kids From Talking to Strangers Online Fails Its One Job


To absolutely nobody’s surprise, it turns out that letting the company with tons of privacy scandals run a messaging service for children might have been a bad idea. Now there are multiple reports that a flaw in the design of Facebook’s Messenger Kids app lets children talk to unauthorized users in group chats—aka exactly what the app was built not to do.
The app works like this: Once a parent has approved a contact, children as young as six are free to chat with that person through video, texts, silly gifs, etc. That works fine when the conversation is one-on-one, but Messenger Kids also allows group chats, and that’s where permissions get tricky.
Thanks to a bug in the app, a kid could be invited to a group chat by an approved friend, but the other users in that chat needed no such approval from the kid’s parent. Messenger Kids didn’t check whether everyone in the chat was pre-approved to talk to one another, resulting in thousands of children talking to strangers on the internet through an app designed to stop that from happening, a Facebook representative told Gizmodo.
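To make the failure concrete: the safe behavior is to validate every pairing of participants before anyone joins a group thread, not just the inviter and invitee. Here’s a minimal sketch of what that pairwise check might look like—all names and data structures below are invented for illustration, not Facebook’s actual code:

```python
# Hypothetical sketch of the pairwise approval check the bug skipped.
# Everything here is invented for illustration, not Facebook's code.

# Parent-approved contacts, keyed by child user ID.
approved_contacts = {
    "kid_a": {"kid_b", "kid_c"},
    "kid_b": {"kid_a", "kid_c"},
    "kid_c": {"kid_a", "kid_b"},
    "stranger": set(),
}

def pair_is_approved(user1, user2):
    """True only if each user's parent has approved the other user."""
    return (user2 in approved_contacts.get(user1, set())
            and user1 in approved_contacts.get(user2, set()))

def can_join_group(new_member, members):
    """A new member may join only if approved with every current member."""
    return all(pair_is_approved(new_member, m) for m in members)

# The reported bug, in these terms: only the inviter/invitee pair was
# checked, so a stranger could ride into a chat via a mutual friend.
group = {"kid_a", "kid_b"}
print(can_join_group("kid_c", group))     # True: approved with everyone
print(can_join_group("stranger", group))  # False: the check blocks them
```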
The Verge confirmed with the company that it had been alerting users and quietly closing such group chats for the past week. “We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats,” a Facebook representative further explained in a statement provided to Gizmodo and other news outlets. “We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety.”
How long such an important and seemingly obvious loophole has been in Messenger Kids is anyone’s guess. But controversy has surrounded the app since its inception.
Ever since Facebook launched the service back in 2017, many child health advocates have loudly voiced their disapproval of it. Nearly 100 of them signed a letter asking Facebook CEO Mark Zuckerberg to delete the app, citing multiple studies showing that increased screen time can cause stress, negative body image, and sleep deprivation. Facebook later addressed some of these concerns by adding a “Sleep Mode” so parents could control how much time their children spent on the app, but others remained.
Namely, moderation and privacy, and whether Facebook is capable of successfully enforcing either. The last few years have seen the company fail to properly identify hate speech on its platform, accidentally leak the identities of moderators to suspected terrorists, and reveal a massive security breach. And I don’t think news of this latest privacy misstep among Facebook’s most vulnerable users will do much to lift the company’s plummeting scores in user-trust surveys either. Not that you’d be able to tell by its surging stock price.
Update 11:52pm, July 22: A Facebook spokesperson responded to our request for comment with a statement also provided to The Verge and CNET. We’ve updated the story to reflect the company’s response.
