Peers have called for modifications to social media algorithms to prevent them from promoting “harmful” content, including “Andrew Tate videos.” The House of Lords has voted 240 to 168 to implement a range of amendments to the government’s Online Safety Bill. These amendments aim to bring about comprehensive reforms that prioritize children’s safety online.
Baroness Kidron, a crossbench peer, has advocated for amendments to the legislation that would prohibit social media companies from using algorithms to steer children and young people toward potentially harmful content. Websites use algorithms to determine which images, videos, or other content users are most likely to engage with.
Lady Kidron, a prominent figure in the field of children’s rights and the founder of the 5Rights Foundation, has shed light on a concerning issue involving social media influencer Andrew Tate. According to Lady Kidron, teenage boys have been unintentionally directed toward Andrew Tate’s videos by a seemingly innocuous algorithm known as “content-neutral friend recommendation.”

The mechanism behind this algorithm is believed to encourage youngsters to view Andrew Tate’s content solely because other 13-year-old boys, who share similar interests, have previously engaged with his content on the platform. Lady Kidron expressed her worries about how these algorithms are shaping the online experiences of young individuals.
The Need for Algorithmic Responsibility and User Well-being
By relying on the assumption that teenagers with similar characteristics and preferences will have comparable interests, social media platforms inadvertently promote the content of controversial figures like Andrew Tate to unsuspecting audiences. This algorithmic approach fails to consider the potential negative impacts of such content on impressionable young minds.
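The mechanism Lady Kidron describes resembles basic collaborative filtering: a platform recommends content purely because similar users engaged with it, without ever inspecting the content itself. The sketch below is a hypothetical, simplified illustration of that idea (the user and creator names, the `recommend` function, and the overlap-based similarity score are all assumptions for illustration, not any platform’s actual system):

```python
from collections import Counter

# Hypothetical interaction data: each user maps to the set of
# creators whose content they have engaged with.
interactions = {
    "user_a": {"c1", "c2", "c4"},
    "user_b": {"c1", "c2"},
    "user_c": {"c2", "c3"},
}

def recommend(target, interactions, top_n=3):
    """Suggest creators the target hasn't seen, weighted by how much
    each other user's engagement history overlaps with the target's.
    Note: nothing here examines what the recommended content actually
    is -- the ranking is entirely 'content-neutral'."""
    seen = interactions[target]
    scores = Counter()
    for other, items in interactions.items():
        if other == target:
            continue
        overlap = len(seen & items)  # similarity = shared engagements
        for item in items - seen:    # score only unseen creators
            scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("user_b", interactions))  # → ['c4', 'c3']
```

Because `user_a` shares two engagements with `user_b`, whatever else `user_a` watched is pushed to the top of `user_b`’s recommendations, regardless of what that content contains. This is the structural point the peers raise: harm can emerge from the ranking mechanism itself, not only from individual pieces of content.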
Lady Kidron’s concerns highlight the delicate balance that needs to be struck between the desire for personalized recommendations and the responsibility to protect young users from harmful content. She emphasizes the need for platforms to reassess and refine their algorithms to ensure they prioritize the well-being of their users, especially the vulnerable and impressionable teenage population.
Tate has gained notoriety in recent years for expressing controversial views in his online videos. These include his belief that women should primarily be homemakers, his assertion that rape victims bear some responsibility for their assaults, and his stated preference for dating 18-year-olds rather than women over 25 because they have had fewer sexual experiences.
Continuing her speech, Lady Kidron stated, “To push hundreds of thousands of children towards Andrew Tate for no other reason than you benefit commercially from the network effect is a travesty for children and it undermines parents.”
The peer disagreed with the government’s argument that all harm stems from content. Instead, she argued that harm can also arise from the structure and setup of companies. She remarked, “In a world of AI, immersive tech, and augmented reality, is it not dangerous and indeed foolish to exclude harm that might come from a source other than content?”
Baroness Harding’s Concerns: Recognizing Non-Content Harms in the Online Safety Bill
Conservative Party peer Baroness Harding supported Lady Kidron’s proposals, providing an additional example of how “harm” can extend beyond content. She described her experience using technology to monitor her teenage daughter during a school trip to the USA as “brilliant.” However, she expressed concern about the ease with which predators could exploit the same tools, stating that it sends a shiver down her spine. Baroness Harding emphasized the need for ministers to acknowledge that non-content harm poses a genuine and immediate danger.
Having led the NHS Test and Trace program during the pandemic, Lady Harding expressed her worry that the Online Safety Bill does not explicitly recognize non-content harms as real harms. She cautioned that if this is not addressed within the legislation, there is a significant risk of ambiguity in the future.
Lord Parkinson, the government’s culture minister, had urged fellow peers to vote against the proposed amendments, arguing that they could potentially weaken the Online Safety Bill. Commenting on the vote, he said: “The bill’s online safety objectives include that regulated services should be designed and operated so as to protect people in the United Kingdom who are users of the service from harm, including with regard to algorithms used by the service, functionalities of the service, and other features relating to the operation of the service.”