Building a Database of CSAM for AOL, One Image at a Time

Community Signal

Release Date: 10/18/2021

If you work in content moderation, or with a team that specializes in it, then you know that the fight against child sexual abuse material (CSAM) is a challenging one. The New York Times reported that in 2018, technology companies flagged a record 45 million online photos and videos of child sexual abuse. Ralph Spencer, our guest for this episode, has spent more than 20 years making online spaces safer and combating CSAM, including as a technical investigator at AOL.

Ralph describes how, when he first started at AOL in the mid-’90s, the work of finding and reviewing CSAM was largely manual. His team depended on community reports, and all of the flagged content was reviewed by hand. Eventually, this manual review led to the creation of AOL’s Image Detection Filtering Process (IDFP), which reduced the need to manually review the actual content of CSAM. Working with the National Center for Missing and Exploited Children (NCMEC), law enforcement, and a coalition of other companies, Ralph saw his team’s work evolve. He shares what he considered his own metrics of success for this work and the challenges that he sees for today’s platforms.
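The core idea behind a filter like IDFP is matching incoming images against a database of hashes of known material. The sketch below is a minimal illustration of that idea, not AOL’s actual implementation: the hash set and function names are hypothetical, and real systems typically pair exact hashes with perceptual hashes (such as PhotoDNA) so that re-encoded or resized copies still match.

```python
import hashlib

# Hypothetical set of hashes of known images, standing in for a database
# maintained with NCMEC. Exact MD5 matching is shown only to illustrate
# the lookup; production systems also use perceptual hashing.
KNOWN_HASHES = {
    "9e107d9d372bb6826bd81d3542a419d6",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-image set."""
    digest = hashlib.md5(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Because only hashes are compared, no human has to view the image for a match to be detected, which is what allowed teams like Ralph’s to move away from manual review of every report.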

The tools, vocabulary, and affordances for professionals working to make the internet safer have all improved greatly, but in this episode, Patrick and Ralph discuss the areas that need continued improvement. They discuss Section 230 and what considerations should be made if it were to be amended. Ralph explains that when he worked at AOL, the service surpassed six million users. As of last year, Facebook had 2.8 billion monthly active users. With a user base that large, and such a concentrated hold on how many people communicate, what will the future hold for keeping children, workers, and everyone else who uses these platforms safe?

Ralph and Patrick also discuss:

  • Ralph’s history fighting CSAM at AOL, both manually and with detection tools
  • Apple’s announcement to scan iCloud photos for NCMEC database matches
  • How Ralph and other professionals dealing with CSAM protect their own health and well-being
  • Why Facebook is calling for new or revised internet laws to govern its own platform

Our Podcast is Made Possible By…

If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.

Big Quotes

How Ralph fell into trust and safety work (20:23): “[Living in the same apartment building as a little girl who was abused] was a motivational factor [in doing trust and safety work]. I felt it was a situation where, while I did basically all I could in that situation, I [also] didn’t do enough. When this [job] came along … I saw it as an opportunity. If I couldn’t make the situation that I was dealing with in real life correct, then maybe I can do something to make a situation for one of these kids in these [CSAM] pictures a little bit better.” –Ralph Spencer

Coping with having to routinely view CSAM (21:07): “I developed a way of dealing with [having to view CSAM]. I’d leave work and try not to think about it. When we were still doing this as a team … everybody at AOL generally got 45 minutes to an hour for lunch. We’d take two-hour lunches, go out, walk around. We did team days before people really started doing them. We went downtown in DC one day and went to the art gallery. The logic for that was like, you see ugly stuff every day, let’s go look at some stuff that has cultural value or has some beauty to it, and we’ll stop and have lunch at a nice restaurant.” –Ralph Spencer

How organizations work with NCMEC and law enforcement to report CSAM (28:32): “[When our filtering tech] catches something that it sees in the [CSAM] database, it packages a report which includes the image, the email that the image was attached to, and a very small amount of identifying information. The report is then automatically sent to [the National Center for Missing and Exploited Children]. NCMEC looks at it, decides if it’s something that they can run with, and if it is … they send the report to law enforcement in [the correct] jurisdiction.” –Ralph Spencer
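The reporting flow Ralph describes, where a match triggers an automatically packaged report containing the image, the email it arrived in, and minimal identifying information, can be sketched roughly as follows. The structure and field names here are illustrative assumptions, not AOL’s or NCMEC’s actual schema.

```python
from dataclasses import dataclass, field
import datetime

def _utc_now() -> str:
    """Timestamp the detection in UTC, ISO 8601 format."""
    return datetime.datetime.now(datetime.timezone.utc).isoformat()

# Hypothetical report structure mirroring the described flow: matched
# image reference, the email it was attached to, and a small amount of
# identifying information, ready for automated submission to NCMEC.
@dataclass
class CsamReport:
    image_hash: str
    email_message_id: str
    account_identifier: str
    detected_at: str = field(default_factory=_utc_now)

    def to_submission(self) -> dict:
        """Serialize the report for automated forwarding to NCMEC."""
        return {
            "image_hash": self.image_hash,
            "email_message_id": self.email_message_id,
            "account_identifier": self.account_identifier,
            "detected_at": self.detected_at,
        }
```

The key design point in the quote is automation: the report is assembled and sent without a human in the loop, and NCMEC decides whether to route it to law enforcement in the correct jurisdiction.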

When “Ralph caught a fed” (37:37): “We caught the guy who was running the Miami office of [Immigration and Customs Enforcement]. He was sending [CSAM]. … That one set me back a little bit. … I remember asking the guy who started the team that I was on, who went on to become an expert witness. He worked in the legal department, and his job basically was to go around the country and testify at all the trials explaining how the technology that caught these images worked. I said, ‘I got an email about this guy from ICE down in Florida, was that us?’ He’s like, ‘Yes, that was you.'” –Ralph Spencer

Facebook’s multiple lines of communication offer multiple avenues for content violations (45:08): “Zuckerberg is running around talking about how he’s trying to get the world closer together by communicating and increasing the lines of communication. A lot of these lines just lead to destructive ends.” –Ralph Spencer

About Ralph Spencer

Ralph Spencer has been working to make online spaces safer for more than 20 years, starting with his time as a club editorial specialist (message board editor) at Prodigy and then graduating to America Online. He’s wrestled with some of the most challenging material on the internet.

During his time at AOL, Ralph was a terms of service representative, a graphic analyst, and a case investigator before landing his final position as a technical investigator. In that position, he was in charge of dealing with all issues involving child sexual abuse material (CSAM), then referred to as “illegal images” by the company. Ralph oversaw the daily operation of the automated processes used to scan AOL member email for these images and the reporting of these incidents to the National Center for Missing and Exploited Children (NCMEC) which, ultimately, sent these reports to the appropriate law enforcement agencies.

The evidence that Ralph, and the team he worked with in AOL’s legal department, compiled contributed to numerous arrests and convictions of individuals for the possession and distribution of CSAM. He currently lives in the Washington, DC area and works as a freelance trust and safety consultant.

Related Links

Transcript

Your Thoughts

If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.