Parents have been warned to keep their children off social media app TikTok after a video of a man taking his own life went viral.

Clips of the incident, which was originally live-streamed on Facebook, have circulated on other online platforms this week. TikTok have said they are working to remove the videos and are banning users who try to share the clip on the platform.

How are TikTok trying to stop the video from being shared?

The video-sharing app said it was using human reviewers as well as automated systems to detect and block the clip from being shared.

What have TikTok said?

A TikTok spokesperson said: “On Sunday night, clips of a suicide that had originally been live-streamed on Facebook circulated on other platforms, including TikTok.

“Our systems, together with our moderation teams, have been detecting and blocking these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.

“We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.

“If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Centre.”

What have Facebook said about the clip?

Facebook confirmed it was aware of the footage and has been blocking further attempts to share it.

A Facebook spokesperson said: “We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time."

'Growing concerns about graphic content on social media'

The incident comes amid ongoing concern about how social media platforms handle content linked to self-harm, and the damage such content can cause, particularly to younger users.

Fears about the impact of social media on vulnerable people have been heightened by cases such as that of 14-year-old schoolgirl Molly Russell, who took her own life in 2017 and was found to have viewed harmful content online.

Molly’s father Ian, who now campaigns for online safety, has previously said the “pushy algorithms” of social media “helped kill my daughter”.

The Government is currently preparing its Online Harms Bill, which will introduce stricter regulation for internet companies and social media platforms, with large fines and other penalties for those who fail to protect their users.

What have the NSPCC said?

Andy Burrows, head of child safety online policy at the NSPCC, said: “It is an important challenge to win this cat and mouse game, as tech firms try to take down this horrific content while malicious actors continue to spread it.

“After the Christchurch terror attack was livestreamed and spread widely, this is a test of whether industry is working across platforms, and has rapid response arrangements in place to take down live and recorded video as consistently as they do with still images.

“Situations like this underline that platforms have a duty of care to act on harmful content, and that Government must make progress on the Online Harms Bill this autumn to hold companies and bosses to account.”

Where to go if you need help

To get emotional support, call the Samaritans 24-hour helpline on 116 123, email jo@samaritans.org, visit a Samaritans branch in person or go to the Samaritans website.