
How to block graphic social media posts on your kids’ phones


By Samantha Murphy Kelly, CNN

New York (CNN) — Many schools, psychologists and safety groups are urging parents to disable their children’s social media apps over mounting concerns that Hamas plans to disseminate graphic videos of hostages captured in the Israel-Gaza war.

Disabling an app or implementing restrictions, such as filtering out certain words and phrases, on young users’ phones may sound like a daunting process. But platforms and mobile operating systems offer safeguards that could go a long way in protecting a child’s mental health.

Following the attacks on Israel last weekend, much of the terror has played out on social media. Videos of hostages taken on the streets and civilians left wounded continue to circulate across various platforms. Although some companies have pledged to restrict sensitive videos, many are still being shared online.

That can be particularly stressful for minors. The American Psychological Association recently issued a warning about the psychological impacts of the ongoing violence in Israel and Gaza, and other research has linked exposure to violence on social media and in the news to a “cycle of harm to mental health.”

Alexandra Hamlet, a clinical psychologist in New York City, told CNN people who are caught off guard by seeing certain upsetting content are more likely to feel worse than individuals who choose to engage with content that could be upsetting to them. That’s particularly true for children, she said.

“They are less likely to have the emotional control to turn off content that they find triggering than the average adult, their insight and emotional intelligence capacity to make sense of what they are seeing is not fully formed, and their communication skills to express what they have seen and how to make sense of it is limited comparative to adults,” Hamlet said.

If deleting an app isn’t an option, here are other ways to restrict or closely monitor a child’s social media use:

1) Set up boundaries

Parents can start by visiting the parental control features built into the mobile operating system on their child’s phone. iOS’ Screen Time tool and Android’s Google Family Link app help parents manage a child’s phone activity and can restrict access to certain apps. From there, various controls can be selected, such as limiting app access or flagging inappropriate content.

Guardians can also set up guardrails directly within social media apps.

TikTok: TikTok, for example, offers a Family Pairing feature that allows parents and guardians to link their own TikTok account to their child’s account, restrict the child’s ability to search for content, limit content that may not be appropriate for them, or filter videos with certain words or hashtags out of their feeds. These features can also be enabled within the app’s settings, without needing to sync up a guardian’s account.

Facebook, Instagram and Threads: Meta, which owns Facebook, Instagram and Threads, has an educational hub for parents with resources, tips and articles from experts on user safety, as well as a tool that allows guardians to see how much time their kids spend on Instagram and set time limits, which some experts advise considering during this time.

YouTube: On YouTube, the Family Link tool allows parents to set up supervised accounts for their children, set screen time limits or block certain content. YouTube Kids also provides a safer space for kids, and parents who decide their kids are ready to see more content on YouTube can create a supervised account. In addition, autoplay is turned off by default for anyone under 18 and can be turned off at any time in Settings for all users.

2) Talk to your kids about social media

Hamlet said families should consider creating a policy in which every family member agrees to delete their apps for a certain period of time.

“It could be helpful to frame the idea as an experiment, where everyone is encouraged to share how not having the apps has made them feel over the course of time,” she said. “It is possible that after a few days of taking a break from social media, users may report feeling less anxious and overwhelmed, which could result in a family vote of continuing to keep the apps deleted for a few more days before checking in again.”

If there’s resistance, Hamlet said parents should try to reduce the time their kids spend on apps right now and come up with an agreed-upon number of minutes of use each day.

“Parents could ideally include a contingency where in exchange for allowing the child to use their apps for a certain number of minutes, their child must agree to having a short check in to discuss whether there was any harmful content that the child had exposure to that day,” she said. “This exchange allows both parents to have a protected space to provide effective communication and support, and to model openness and care for their child.”

3) What companies are doing

TikTok: A TikTok spokesperson, who said the platform uses technology and 40,000 safety professionals to moderate content, told CNN it is taking the situation seriously and has increased dedicated resources to help prevent violent, hateful or misleading content on the platform.

Meta: Meta similarly said it has set up a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to monitor and respond to the situation. “Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation,” Meta said in a statement. “We’ll continue this work as this conflict unfolds.”

YouTube: Google-owned YouTube said it is age-restricting thousands of videos that do not violate its policies but may not be appropriate for viewers under 18 (this may include bystander footage). The company told CNN it has “removed thousands of harmful videos” and its teams “remain vigilant to take action quickly across YouTube, including videos, Shorts and livestreams.”

The-CNN-Wire
™ & © 2023 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
