![[assets/covers/content-moderation.jpg]]

Content moderation is the practice by which platforms decide what speech, images, and behaviour are permitted on their services. It is not a neutral technical process but an act of governance, and platforms have long tried to obscure this behind the language of community standards and automated systems.

## Moderation is constitutive, not incidental

Every platform moderates. The question is only how, by whom, and according to whose values. The Napalm Girl controversy makes this unavoidable: Facebook removed Nick Ut’s photograph, the Norwegian Prime Minister reposted it in protest and had her copy removed too, and global press coverage eventually forced a reversal. The reversal is the tell. Principled systems don’t reverse under political pressure; editorial operations do.[^coti-p10]

What makes the case so useful is that it wasn’t a grey area at all. Facebook had a *specific policy* on *this specific image*, drilled into moderators during training as a canonical example of content to remove, despite its historical significance, because it depicted a naked child, in distress, photographed without her consent. That is not an algorithm failing to understand context. It is a deliberate institutional decision that happened to be wrong.[^coti-p11]

## The scale problem

Screening millions of posts on a case-by-case basis every week is structurally impossible, which means moderation is always de facto rule-based. Rules encode cultural assumptions dressed up as universal ones. The child nudity prohibition sounds like bedrock (near-universal across societies, Gillespie notes), but applying it to a Vietnam War atrocity photograph exposes its limits immediately. The rule was built for one kind of case and breaks on another. Platforms know this and apply it anyway, because consistency at scale requires abstracting away exactly the context that makes hard cases hard. See [[Norm Conflict]].[^coti-p11]

The consent question is where this gets genuinely difficult. The Napalm Girl was photographed during a military operation; consent was neither sought nor possible. The same rule that might sensibly protect child nudity in a domestic context collapses when applied to [[Photojournalism|war photography]]. If consent is the bar, documentary photography of civilians in extremis becomes almost impossible as a practice. Facebook’s policy didn’t engage with this; it just applied the rule.[^coti-p11]

## Equal enforcement

The Norwegian Prime Minister reposting the photograph is a small, clarifying moment: a head of government trying to route around a private company’s content policy, and failing. Politicians should be held to the same rules as everyone else (perhaps higher), not granted special immunity because of who they are. Twitter/X’s approach with verified accounts went the other way. Equal enforcement is the only principled position, and it’s the one [[Platform Power|platforms]] find hardest to maintain because it offers no political upside.

## Selected passages

> ‘Nor was it an error: in fact, Facebook had a specific policy on this specific image, which it had encountered before, many times. It was later reported by Reuters that the famous photo “had previously been used in training sessions as an example of a post that should be removed. [...] Trainers told content-monitoring staffers that the photo violated Facebook policy, despite its historical significance, because it depicted a naked child, in distress, photographed without her consent.”’
>
> *Custodians of the Internet*, p. 11

## Appearances

- *Custodians of the Internet*, Tarleton Gillespie (2018), Ch. 1 ‘All Platforms Moderate’, pp. 10–11

[^coti-p10]: [[Custodians of the Internet (2018)]], p. 10 · *‘even the prime minister of Norway herself, reposted the photo to Facebook, only to have it quickly removed.’*
[^coti-p11]: [[Custodians of the Internet (2018)]], p. 11 · *‘Nor was it an error: in fact, Facebook had a specific policy on this specific image, which it had encountered before, many times. It was later reported by Reuters that the famous photo “had previously been used in training sessions as an example of a post that should be removed. . . . Trainers told content-monitoring staffers that the photo violated Facebook policy, despite its historical significance, because it depicted a naked child, in distress, photographed without her consent.”’*