John Tanagho, Executive Director of IJM’s Center to End Online Sexual Exploitation of Children, is rallying the child protection community to think big: “It’s time for a zero-tolerance approach to OSEC, or what I call a CSAM wipeout,” he says.
IJM’s vision is for any instance or hint of CSAM—child sexual abuse material—to be instantly blocked. A CSAM wipeout, as John explains in his Fast Company article, means such content cannot be produced, rendered, uploaded, stored, or shared on any device.
John suggests that “[w]hile most enacted or debated laws focus on detection and reporting, it is time to explore a more preventative approach that protects the privacy of CSAM survivors and prevents ongoing child sexual abuse.”
With this more aggressive approach, we can expect a strong deterrent effect, especially as more and more governments and corporations join arms against perpetrators past, present, and future.
Leading the way in preventive efforts are companies like global hotel chain Marriott International, which blocks access to known CSAM websites through an innovative partnership with the Internet Watch Foundation and Cisco.
In the past, protecting children without compromising user privacy was a seemingly impossible balancing act.
But advancements in technology and thoughtful policymaking have opened the door to solutions that achieve online safety while protecting user privacy:
In the tech sphere, on-device machine learning lets tech companies detect harmful content such as CSAM without compromising user privacy. These systems analyze data locally on the user's device, so sensitive information is never transferred to external servers.
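To make the idea concrete, here is a minimal, purely illustrative sketch of on-device matching: an image is reduced to a compact hash locally, then compared against a locally stored blocklist of known-bad hashes, so no image data ever leaves the device. This is not any vendor's actual system; real deployments use robust perceptual hashes (such as PhotoDNA or PDQ) or trained on-device classifiers, and the simple average-hash and threshold below are assumptions for illustration only.

```python
def average_hash(pixels, size=8):
    """Compute a 64-bit average hash from an 8x8 grayscale grid (values 0-255).

    Runs entirely on-device: only this compact hash is ever compared,
    never the underlying image.
    """
    assert len(pixels) == size * size
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, threshold=5):
    """True if the hash is within `threshold` bits of any known-bad hash.

    The blocklist distributed to devices contains only hashes, never images,
    so the check reveals nothing about the user's other content.
    """
    return any(hamming_distance(image_hash, h) <= threshold for h in blocklist)

# Hypothetical example: hash an 8x8 image (bright top half, dark bottom half).
known = average_hash([200] * 32 + [20] * 32)
blocklist = {known}

# A near-duplicate (e.g. a slightly re-encoded copy) differs by a few bits
# but still matches within the threshold.
near_duplicate = known ^ 0b111  # 3 bits flipped
print(matches_blocklist(near_duplicate, blocklist))
```

The design point this sketch illustrates is the one in the paragraph above: detection happens against local data with a locally held blocklist, so privacy and safety are not in tension at the architectural level.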
Concerning policymaking, IJM’s Center to End OSEC has seen the fruits of collaboration with demand-side governments and other like-minded groups. The Center has helped facilitate public consultation with survivors as well as experts, enabling policies to be survivor-informed. Now, online safety laws are being implemented in Australia, the UK, and the Philippines. We are seeing progress, albeit slower, in the continuing online safety discussions in the U.S., Canada, and the European Union.
With advances in both technology and policymaking, we are at the dawn of a much safer online environment for children, one that also respects the privacy rights of all users.
IJM advises governments and policymakers to make on-device CSAM prevention a requirement. Having on-device safety features built into all tech products will accelerate progress toward a CSAM wipeout.
With the balance between user privacy and online safety now within reach, the time is ripe to take a proactive stance against OSEC and OSAEC.
It's not one or the other: both user privacy and online safety can be achieved.
Parents, educators, corporations, and lawmakers all have roles to play in creating a digital world where children can explore and learn safely. Open dialogue and proactive measures are key to achieving this balance. Here are some practical action points:
Stock image. The child featured here is not an OSAEC victim/survivor.
John Tanagho emphasizes that it is “when companies elevate online safety side-by-side with profit and user privacy, and when governments put resources behind their global pledges” that CSAM will not stand a chance.
For a zero-tolerance approach to CSAM to be successful, we need safety tech and big tech firms to responsibly lead the way to develop and explain innovative tools that secure protection and privacy. And with the help of policymaking, online safety will be a non-negotiable, right up there with user privacy and profitability.
And finally, we believe these efforts will succeed when governments and corporations work together, urgently and vigilantly, towards a shared vision of a CSAM-free society.