Social Media Parental Control Guidelines



CNN Business

More than a year ago, social media companies came under the spotlight for how they protected or failed to protect their youngest users.

In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat and Instagram faced tough questions from lawmakers about how their platforms steer younger users toward harmful content, damage mental health and body image (especially among teenage girls), and lack adequate parental controls and protections to prevent it.

The hearings came after whistleblower Frances Haugen disclosed Instagram’s impact on teens in the so-called “Facebook Papers,” prompting the companies to vow to change. The four social networks have since introduced more tools and parental control options designed to better protect younger users. Some have also made changes to their algorithms, such as defaulting teens to less sensitive content, and have increased their moderation efforts. But some lawmakers, social media experts and psychologists say the new solutions are still limited and more work needs to be done.

“More than a year after the Facebook Papers dramatically exposed the abuses of Big Tech, social media companies have taken only small, slow steps to clean up their practices,” Sen. Richard Blumenthal, chairman of the Senate Consumer Protection Subcommittee, told CNN Business. “Trust in Big Tech is long gone, and we need real rules to keep kids safe online.”

Michela Menting, director of digital security at market research firm ABI Research, agrees that social media platforms “offer very little material to counter the ills their platforms bring.” Their solutions put the onus on guardians to activate various parental controls, such as those designed to filter, block and limit access, as well as more passive options, such as monitoring and surveillance tools that run in the background, she said.

Alexandra Hamlet, a clinical psychologist in New York City, recalls that about 18 months ago she was invited to a roundtable discussion about ways to improve Instagram, especially for younger users. “I don’t see many of our ideas implemented,” she said. She added that social media platforms need to “continue to improve parental controls, protect young people from targeted advertising, and remove content that is objectively harmful.”

The social media companies mentioned in the article either declined to comment or did not respond to requests for comment about criticism that more needs to be done to protect younger users.

Currently, guardians must learn how to use parental controls, while also being aware that teens can often bypass these tools. Here’s a closer look at what parents can do to help their children stay safe online.

After the fallout from the leaked documents, Meta-owned Instagram suspended its much-criticized plan to release a version of Instagram for children under 13 and focused on making its main service safer for younger users.

It has since introduced an education hub for parents with resources, tips and articles from experts on user safety, and rolled out a tool that allows guardians to see how much time their kids spend on Instagram and set time limits. Parents can also receive updates about which accounts their teens follow and which accounts follow them, and can be notified if their kids update their privacy and account settings. Parents can also see which accounts their teen has blocked. The company also offers video tutorials on how to use the new supervision tools.

Another feature encourages users to take a break from the app after a predetermined amount of time, suggesting they take a deep breath, write something down, check a to-do list or listen to a song. Instagram also said it is taking a “stricter approach” to the content it recommends to teens, actively nudging them toward different topics, such as architecture and travel destinations, if they linger on any one type of content for too long.

Meta’s Family Center provides supervision tools and resources, such as articles and advice from leading experts. Meta spokesperson Liza Crenshaw told CNN Business: “Our vision for Family Center is to eventually allow parents and guardians to help their teens manage their experiences across Meta technologies, all from one place.”

The center also offers a guide to Meta’s VR parental supervision tools, developed with ConnectSafely, a nonprofit that helps kids stay safe online, to help parents discuss virtual reality with their teens. Guardians can see which accounts their teen has blocked and access supervision tools, as well as approve their teen’s requests to download or purchase apps that are blocked by default based on their rating, or block specific apps that may not be suitable for their teen.

In August, Snapchat launched a parental guide and hub designed to give guardians a deeper understanding of how their teens use the app, including who they talked to in the last week (without revealing the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens must opt in and give permission.

While this is Snapchat’s first official foray into parental controls, it previously put in place some safety measures for younger users, such as requiring teens to be mutual friends before they can start communicating with each other and prohibiting them from having public profiles. Teen users have the Snap Map location-sharing tool turned off by default, but can use it to share their real-time location with a friend or family member even while the app is closed, as a safety measure. Meanwhile, the Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with certain people.

Snap has previously said it is working on more features, such as the ability for parents to see which new friends their teens have added and to confidentially report accounts that may be interacting with their kids. It is also developing a tool that will give younger users the option to notify their parents when they report an account or piece of content.

The company told CNN Business it will continue to enhance its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to improve the tool over time.

In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as likely to contain mature or complex themes. It also rolled out a tool designed to help people decide how much time they want to spend on TikTok. The tool lets users set regular screen time breaks and provides a dashboard detailing how many times they opened the app, a breakdown of daytime and nighttime usage, and more.

The popular short-form video app currently offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can link their TikTok account to their teen’s and set parental controls, including how long the teen can spend on the app each day; restrict exposure to certain content; decide whether the teen can search for videos, hashtags or Live content; and choose whether their account is private or public. TikTok also offers its Guardian’s Guide, which highlights how parents can best protect their kids on the platform.

In addition to parental controls, the app restricts younger users from accessing certain features, such as livestreaming and direct messaging. When teens under the age of 16 are ready to post their first video, a pop-up also appears asking them to choose who can watch it. Push notifications are disabled after 9 p.m. for users aged 13 to 15, and after 10 p.m. for users aged 16 and 17.

The company says it will do more to raise awareness of its parental controls in the days and months ahead.

Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over how difficult it is to report questionable content and how easily strangers can connect with younger users.

In response, the company recently updated its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk to teens about online safety. Existing parental controls include an option to prevent minors from receiving friend requests or direct messages from people they don’t know.

Still, it’s possible for minors to connect with strangers on public servers or in private chats if a stranger is invited by someone else in the room, or if a channel link is dropped into a public group that the user visits. By default, all users, including those aged 13 to 17, can receive friend invitations from anyone on the same server, which then allows them to exchange private messages.


