
By John Glenday, Reporter

March 18, 2021

You may not be anywhere near the office water cooler right now, but we still want to spotlight the most talked-about creative that should be on your radar. Today, we’re talking about a powerful campaign from the Canadian Centre for Child Protection, which is hijacking Twitter’s 15th birthday to demand a clampdown on depictions of child sexual abuse.

Victims of child sexual abuse have hijacked Twitter’s 15th-anniversary celebrations by relating the devastating emotional toll wrought by their struggle to ensure the platform removes recordings of their abuse.

A group of survivors have come together under the auspices of the Canadian Centre for Child Protection (C3P) to recount the hours of self-monitoring and reporting required to push Twitter and other online platforms into action.

Twitter has been singled out for action, with the network performing badly on the issue relative to its peers. C3P has commissioned a no-holds-barred video to provide an outlet for those caught in a never-ending battle they can’t win.

Using actors to protect the anonymity of participants, the short video presents testimonies from survivors.

Voicing a situation experienced by many survivors, one participant said: “From infancy, until I was 15, I was trafficked and used in child sexual abuse material (CSAM) which continues to be shared widely across the internet. I spend hours every day searching for my own content, reporting thousands of accounts and posts sharing CSAM.

“When platforms don’t actively look for or prevent this content from being uploaded, the burden falls on me to have these images removed. Each time one account gets taken down, five more take its place. It’s like a hydra, a monster that I can never defeat.

“I shouldn’t have to go looking for images of my own abuse. This isn’t my job.”

The campaign coincides with research authored by C3P, which ranked Twitter as the poorest-performing online platform for ease of reporting CSAM. Of particular concern is the fact that Twitter does not have an option to report CSAM directly from a tweet or direct message, nor any functionality for reporting specific users who share such material.

In 2015, Twitter partnered with the Internet Watch Foundation, an anti-child abuse charity, to automate the process of pulling indecent images identifiable by unique ‘hash’ codes.

