- Jan 6, 2017
Social media giant’s Messenger Kids app allows children to chat and text.
Facebook set the age limit for user accounts at 13 and older many years ago, but that hasn’t stopped an estimated 20 million-plus underaged users from joining the platform anyway. Now, in what amounts to throwing in the towel, Facebook has launched a pared-down version of its Messenger app that lets kids video chat and text both with each other and with approved adults.
The failure potential is limitless. For two young users to “friend” each other and use the app, both kids’ parents have to approve the connection. But who verifies that the parent accounts are genuine and actually linked to the child? And that’s before considering the potential for adults to create their own child account and an associated parent account in order to target unsuspecting young users whose real parents aren’t as savvy about privacy concerns.
Tech entrepreneurs are just getting younger and younger.
Ads and data
Speaking of privacy, the far more likely threat is that the children’s accounts will be monitored and targeted for advertising and data gathering, both of which Facebook currently says it won’t do. The company has also stated it won’t use a child’s activity to target the parents’ accounts with ads, but Facebook’s past record on similar promises suggests this might not play out as stated.
Start ’em young
Of course, the more nefarious angle is that Facebook wants to hook ’em while they’re young so that they’ll graduate to their own fully functioning Facebook accounts when they’re older. That seems quite plausible given the perception among young people that Facebook is where their moms and grandmothers hang out.
Supply and demand
To be fair, though, it’s entirely possible that Facebook is simply acknowledging the high volume of underaged (and therefore ill-prepared) users and creating a “safe space” for them on the platform. It might seem like throwing in the towel, but it could also be an investment in putting boundaries in place to protect young users. As with every other website and platform out there, it’s ultimately the parents’ job to monitor their kids’ use, discuss safe behaviors, and pull the plug if things go wrong.