On Monday, a new law came into force in the UK that requires social media companies such as Meta, Google, and TikTok to take down harmful content or face substantial fines. The law, known as the Online Safety Act, was passed in 2023, and the country's communications regulator, Ofcom, has now published a list of 130 priority offences that social media platforms must tackle to avoid penalties.
These rules cover serious issues such as terrorism, human trafficking, and the sharing of child sexual abuse material. But they go further: Ofcom also wants platforms to curb racial hatred and hatred based on religion or sexual orientation. Such content is already illegal in the UK, but the law's guidelines give social media platforms clear instructions on what to remove.
What makes this law tricky is that some offences are complicated to identify. Some may hinge on conversations between users, or on circumstances where the characteristics of the people involved, such as their age, gender, or identity, have to be taken into account. This means social media platforms n...