4 Feb 2025

Do big tech companies have a ‘duty of care’ for users? A new report says they do – but leaves out key details

  • External Article
  • child sexual abuse material (CSAM)
  • child sexual exploitation
  • child sexual exploitation material (CSEM)
  • cybercrime
  • digital child exploitation
  • social media
  • The Conversation

Large social media companies should have to proactively remove harmful content from their platforms, undergo regular “risk assessments” and face hefty fines if they don’t comply, according to an independent review of online safety laws in Australia.

This “digital duty of care” would put the onus on tech companies to address a range of specific harms on their platforms, such as child sexual exploitation and attacks based on gender, race or religion.