Facebook moderators were instructed not to remove extreme, abusive or graphic content from the platform even when it violated the company’s guidelines, an undercover investigation has found.

While nudity is almost always removed, violent videos involving assaults on children, racially charged hate speech and images of self-harm among underage users all remained on Facebook after being reported by users and reviewed by moderators.

“These revelations about Facebook’s content moderation are alarming, but not surprising,” said Julian Knight, a member of the Digital, Culture, Media and Sport Select Committee.

The Conservative MP for Solihull said: “Facebook has recently committed to reducing fake news and improving privacy on its platform, which is welcome.

“But they don’t seem as committed to sacrificing profits made from extreme content, as is demonstrated by Channel 4’s investigation.”

Facebook, the world’s biggest social network with more than two billion users, called the practices “mistakes” which do not “reflect Facebook’s policies or values”.

The revelations come in an investigation by Channel 4’s Dispatches programme in which a reporter worked at Cpl Resources, Facebook’s largest centre for UK content moderation.

Over a six-week period between March and April this year, the reporter attended training sessions and filmed conversations in the Cpl offices in Dublin.

A particularly shocking video featured in the programme showed an adult man punching and stamping on a screaming toddler.

Moderators marked the video as disturbing – meaning users must click to view it – and allowed it to remain online, going on to use it in training sessions as an example of acceptable content.

One moderator filmed in the programme said: “If you start censoring too much then people stop using the platform. It’s all about money at the end of the day.”

Facebook told Dispatches the video should have been removed by moderators.

Roger McNamee, an early investor in Facebook who has since become highly critical of its impact on society, described such videos as the “crack cocaine” of the company’s product.

Mr McNamee said: “It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform.

“Facebook understood that it was desirable to have people spend more time on site. If you’re going to have an advertising-based business, you need them to see the ads so you want them to spend more time on the site.”

Facebook’s vice-president of global policy solutions, Lord Allan, disagreed.

“There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material,” the former Liberal Democrat MP said.

“But I just don’t agree that that is the experience that most people want and that’s not the experience we’re trying to deliver.”

In one training session filmed by Dispatches, the group is shown a cartoon of a woman drowning a young white girl in a bathtub, accompanied by the caption “when your daughter’s first crush is a little negro boy”.

The trainer says such images should be ignored by moderators and allowed to remain online.

Facebook later told Dispatches the picture violated its policies on hate speech and that it would investigate the incident.

In another session, the reporter and another moderator review a video which says “Muslim immigrants” should “f*** off back to (their) own country”.

The moderator said the video should be ignored because “they’re still Muslims but they’re immigrants, so that makes them less protected”.

When asked if such posts were classed as hate speech, Lord Allan said they were “right on that line” but “we’ve not defined it as hate speech if you are expressing a view about the government’s immigration policy”.

The investigation also found that extremist pages with large followings were given special consideration, in the same manner as pages for governments and large news organisations.

The Facebook page for jailed far-right leader Tommy Robinson is given such protection, meaning frontline moderators cannot directly remove material which may violate policies.

Instead it is referred to a more senior reviewing team at Facebook known as “Cross Check”.

“Obviously they have a lot of followers so they’re generating a lot of revenue for Facebook,” one moderator said about the fascist Britain First page, before it was deleted when deputy leader Jayda Fransen was convicted of racially aggravated harassment in March.

Speaking on the programme, Lord Allan said: “This is not a discussion about money, this is a discussion about political speech.”

In a statement released ahead of the programme’s broadcast, Lord Allan said: “It’s clear that some of what is shown in the programme does not reflect Facebook’s policies or values, and falls short of the high standards we expect.

“We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention.

“Where we know we have made mistakes, we have taken action immediately. We are providing additional training and are working to understand exactly what happened so we can rectify it.”

A Government spokeswoman said: “While some online platforms have taken important steps to improve safety, we have long-held concerns about their effectiveness in enforcing their own terms and conditions and ensuring UK citizens are not exposed to harmful content.

“That’s why we are working with tech companies, children’s charities and other stakeholders to develop laws that tackle the full range of online harms, set clear responsibilities for platforms and make sure there is improved support for users online.”

– Inside Facebook: Secrets Of The Social Network will be broadcast on Channel 4 at 9pm on Tuesday July 17.