Social media firms will be required to agree legally-binding terms and conditions to prevent a repeat of scandals like the suicide of teenager Molly Russell under duty of care plans being considered by Boris Johnson.
The tech giants will face multi-million pound fines by the new online harms regulator Ofcom if they breach the agreements, and the prospect of being forced to suspend services unless they can remedy the failings.
Culture Secretary Oliver Dowden – who has presented the plan to Number 10 with Home Secretary Priti Patel – has pledged the firms’ codes to tackle content such as self-harm and eating disorders will have to be “meaningful” and vetted by the regulator to ensure they are “proper” and “effective.”
The current proposals are thought to stop short of criminal sanctions against the firms for breaches over “legal but harmful” content like self-harm videos, but named executives will be held accountable for companies’ policies and face fines and disqualification for breaches.
Criminal sanctions will be reserved for illegal online material such as child abuse and terrorism, with the legal responsibilities of social media firms to meet their duty of care on such content spelled out in codes of practice.
Ms Patel has pressed for tough sanctions against such abuses in the knowledge that even fines set at a maximum of four per cent of global turnover will not deter companies like Facebook, which has banked reserves of more than $50 billion and whose profits have thrived through the pandemic.
Mr Dowden has told Cabinet colleagues that the measures must be robust enough for him “to be able to look victims in the eye.”
Digital minister Caroline Dinenage, who met Ian Russell, whose daughter Molly took her own life after viewing self-harm content, said the “chilling” insight into his experience had strengthened her resolve to ensure the legislation was “fit for purpose” and not “watered down.”
The Telegraph understands Mr Russell pushed for Ofcom to have powers to inspect and enforce changes to social media algorithms that are spreading dangerous content.
The proposals, set out as a response to the consultation on last year’s white paper, are expected to be published after the US elections, once agreed by the Prime Minister.
Some campaigners fear intense lobbying by the tech giants and Tory MPs concerned about potential threats to free speech could lead to the plans being watered down.
“Regulation will fail if it’s tough on paper but weak in reality. Unless the Government commits to strong financial and criminal sanctions, there isn’t the deterrence value to make some of the biggest firms in the world take notice,” said Andy Burrows, the NSPCC’s head of child safety online policy.
“Legislation must compel platforms to proactively prevent abuse, and impose a duty on them to cooperate with the regulator and disclose information on whether their sites are safe.”
The demands are backed by Mr Russell, who believes senior tech executives should face the threat of criminal prosecution in cases where there are serious breaches of the duty of care as fines would not be enough of a sanction.
The Government is expected to draft a “tight” duty of care bill early next year that will lay down the sanctions and investigative powers of the new regulator but leave the scope of the duty of care on legal harms to secondary legislation to be voted on by MPs.
Mr Dowden told MPs earlier this month: “Teenage girls are particularly susceptible to self-harm and eating disorder websites, and so on. So I have in my mind this regime addressing that, whilst at the same time operating in a free society.”
Social media companies will be required to provide “clear and accessible” mechanisms for users to report harmful content and to have it taken down.
The proposals also currently include powers for the regulator to block tech firms guilty of serious breaches of the duty of care from operating in the UK, a measure targeted at smaller operators.