
Tech companies should be held accountable for child sexual abuse material

This week, US Senators on the Senate Judiciary Committee questioned the CEOs of major social media companies about online sexual exploitation and social media’s impact on young people, demanding they explain their abject failure to protect children from the threat of sexual predation. Tech companies are enabling and emboldening predators and perpetrators of child sexual abuse (CSA).

Companies around the world fall short in safeguarding children and grapple with a range of challenges on social media. Concerns about sexual predators, addictive features, suicide and eating disorders, unattainable beauty ideals, and the prevalence of bullying pepper the conversation whenever we discuss social media companies and the naked profiteering they pursue.

In the same week, Artificial Intelligence (AI) generated images of Taylor Swift spread across the internet, and warnings from the Australian Federal Police flooded social media alerting parents that the metadata contained in popular Back To School photos was being harvested by perpetrators. It is clear there is a long way to go before we can trust these organisations alone with our children.

CEO of IGFF, Clare Leaney, said: “When Elon Musk purchased the platform formerly known as Twitter, he loosened moderation policies in what was thought at the time to be a deliberate attempt to tank the business and to restore the First Amendment to Americans – seemingly rendering the rest of the planet the 51st state. Subsequently, this week, the formerly profitable business previously known as Twitter had to block user searches for the AI-generated Swift images. Retroactive acts like this protect no one; they give cover to and embolden perpetrators. Tech companies are enabling perpetrators. We can do better.”

Tech companies should be held accountable for child sexual abuse material, and victims should have every right to sue tech platforms and app stores that deliberately and systematically violate their rights. In every instance, victims should be believed and supported. Declarations of continued investment and work on “industry-wide efforts” to protect children are hollow and not nearly enough.

Agencies like the Centre for Digital Wellbeing, a policy research and design centre that highlights and warns about the impacts of technology, have been warning Australians for years about this escalating situation. They have consistently formulated policy responses and developed initiatives to help users engage in healthy, ethical and safe digital practices.

-ends-


For all media enquiries please contact Joe Stroud, Chief Operating Officer, Head of Government Relations and Media:

E: joe@igff.org.au

About In Good Faith Foundation

In Good Faith Foundation is a national charity and support service providing advocacy services to individuals, families and communities impacted by institutional abuse for over 20 years.

Read the ABC article here: https://www.abc.net.au/news/2024-02-01/meta-tiktok-x-ceos-grilled-at-us-senate-child-safety-hearing/103412442 


About the Author: Megan
