Meta strengthens the protection of minors on its platforms


Meta, the company behind Instagram and Facebook, has announced new measures to enhance the safety of minors on its platforms. One of the key changes is that minors under 16 (or under 18 in certain countries) will now be unable to receive messages or be added to group chats by users they don’t follow. This is aimed at preventing unwanted contact with strangers. In addition, Meta is introducing stricter message settings for teens, limiting the type and number of direct messages people can send to someone who doesn’t follow them.

Online safety for minors

These measures are part of Meta’s efforts to provide more age-appropriate experiences for minors and to give parents more control over their teens’ online experiences. Meta says it is strengthening its online safety protections for minors and shielding them from harassment.

Meta’s changes include:

  • Block private messages from strangers by default: Teenagers under the age of 16 (under 18 in some countries/regions) will not be able to receive private messages from strangers by default, nor can they be included in group chats. This initiative is intended to reduce the exposure of young people to potential online groomers.


  • Upgraded parental supervision: The parental supervision function will be further strengthened. In addition to being notified of changes to their children’s safety and privacy settings, parents can now approve or deny these changes, for example, preventing a teen from changing their account from private to public.

  • Protection from inappropriate images: Meta is developing a new feature designed to protect users from indecent or inappropriate images sent by existing connections. The feature will work within encrypted chats and will prompt senders to avoid sending such content.

These new measures are part of a series of efforts Meta has made over the past year to protect minors. The company has previously been accused of turning Facebook and Instagram into “marketplaces for child predators” due to algorithm issues. Meta hopes that these tighter restrictions and stronger parental controls will effectively reduce illegal behaviour on the platform. It also hopes it will create a safer and healthier online environment for teenagers.

Final Words

Meta’s latest measures mark a significant step towards enhancing online safety for minors on its platforms. By restricting messaging and group chat capabilities for users under 16 (or 18 in some regions), Meta aims to shield young users from potential online risks and unwanted interactions with strangers. The stricter message settings and enhanced parental supervision underscore the company’s commitment to fostering age-appropriate experiences and empowering parents to oversee their teens’ online activities. With these changes, Meta seeks to address concerns over child safety, strengthen protections against online harassment and inappropriate content, and ultimately cultivate a safer digital ecosystem for teenagers worldwide.

