Instagram will start notifying parents if their children repeatedly search for suicide and self-harm content.
However, a leading online safety charity criticised the update as "flimsy", saying it risks leaving parents "panicked and ill-prepared" for the difficult conversations that may follow.
In the coming weeks, parents who have parental supervision set up on their children's Instagram accounts in the UK, US, Australia and Canada will start receiving the notifications.
The messages will be sent if an underage user repeatedly searches, in a short space of time, for phrases promoting suicide or self-harm, phrases suggesting a teen wants to hurt themselves, or terms like "suicide" or "self-harm".
The alerts will be sent through email, text or WhatsApp, as well as via a notification on Instagram, if the parents are signed up to the platform's optional supervision setting.
As well as an alert about what the under-18 has been searching for, parents will also be given the option to see "expert resources" to help them approach "sensitive conversations with their teen", according to Meta.
Soon, alerts will also be sent if a youngster is talking to Meta AI about suicide or self-harm.
Search terms relating to suicide and self-harm should already be blocked on Instagram and guardrails exist within Meta AI to stop harmful discussions and instead signpost helpful organisations.
However, the Molly Rose Foundation (MRF) says its research has shown suicide and self-harm content is still available on the app.
"This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good," said Andy Burrows, chief executive of the charity.
"Every parent would want to know if their child is struggling, but these flimsy notifications will leave parents panicked and ill-prepared to have the sensitive and difficult conversations that will follow.
"Our research shows Instagram's algorithm still actively recommends harmful depression, suicide and self-harm material to vulnerable young people and the onus should be on addressing these risks rather than making yet another cynically timed announcement that passes the buck to parents."
Meta says it removes content that promotes suicide or self-harm, shows graphic imagery, or depicts methods or materials associated with them, and goes further for teens by hiding content that discusses these topics altogether.
It also says it blocks many search terms related to suicide and self-harm, and directs anyone searching for this content to local organisations for support.
Instagram's "teen accounts" for under-16s were introduced in 2024 and require a parent's permission to change settings, with an extra layer of monitoring that can be enabled with the child's agreement.
By default for these accounts, Instagram turns on many privacy settings for all under-18s, and children aged 13 to 15 can only adjust these features by adding a parent or guardian to their account.
Meta is currently facing a significant lawsuit in the US, where it is being accused of creating addictive apps that harm young people's mental health.
It denies the claims and Meta chief executive Mark Zuckerberg told the court last week the company's aim has always been "to try to build useful services that people connect to".
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
(c) Sky News 2026: Instagram to notify parents if teens search for suicide content