Parents are being warned to watch out for several social media accounts alleged to be inciting children in Suffolk into sexual activity.

Suffolk police are investigating a number of Instagram and Kik accounts thought to be contacting young people and attempting to incite them into sexual activity.

Officers have confirmed an investigation is underway after a school in Lowestoft sent a letter to parents warning them about four potentially dangerous accounts.

Three of the accounts are linked to the picture-sharing site Instagram. The fourth account is linked to the messenger app Kik.

St Margaret’s School, in Lowestoft, which sent the letter on January 21, advised parents to discuss the accounts with their children.

The letter stated: “Suffolk Police have asked that you discuss these accounts with your children and/or check to see whether they have had any contact.

“If contact has been made, please send an email to OperationWombat@suffolk.pnn.police.uk.”

The letter added that parents and children alike should refrain from making contact with any of the accounts while the police investigation is ongoing.

Farlingaye High School, in Woodbridge, sent a letter to parents citing three of the same accounts in December last year.

A police spokesman said: “We can confirm that we are investigating inappropriate social media activity by specific social media accounts.

“There is currently no identified immediate physical risk to any individuals and enquiries are ongoing.”

A spokesman for Instagram has since confirmed that the accounts linked to its site have been disabled for breaching community guidelines.

He said Instagram does not allow child exploitation or grooming and has developed powerful tools – such as PhotoDNA – to fight inappropriate content on its platforms.

He added that anybody who sees content which violates Instagram community guidelines should flag it to moderators, who review reports 24/7.

Members of the public can also report alleged grooming activity through the Help Centre contact form.

Kik did not respond to a request for comment.

Instagram has recently come under fire after a grieving parent claimed the social media site “helped kill” his daughter.

Ian Russell, father of 14-year-old Molly, said her suicide was possibly driven by “disturbing content” she saw on social media.

The Health Secretary and West Suffolk MP Matt Hancock has since written to internet giants, urging them to act.

Mr Hancock said he was “horrified” to learn of Molly’s death, and feels “desperately concerned to ensure young people are protected”.

In his letter to Twitter, Snapchat, Pinterest, Apple, Google and Facebook (which owns Instagram), he said: “I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

“It is time for internet and social media providers to step up and purge this content once and for all.”

He added that the Government is developing a white paper addressing “online harms”, and said it will look at content on suicide and self-harm.

“I want to work with internet and social media providers to ensure the action is as effective as possible. However, let me be clear that we will introduce new legislation where needed,” he said.

A spokeswoman for Instagram said: “We are undertaking a full review of our enforcement policies and technologies around self-harm, suicide and eating disorders.

“As part of this, we are consulting further with mental health bodies and academics to understand what more we can do to protect and support our community, especially young people.

“While we undertake this review, we are taking measures aimed at preventing people from finding self-harm-related content through search and hashtags.”