Gillespie’s central argument is that content moderation is not incidental to platforms; it is constitutive of them. Every platform moderates; the only questions are how, by whom, and according to whose values.

## Chapter notes

### Chapter 1: All platforms moderate

The opening chapter works through the Napalm Girl controversy in forensic detail. Facebook removed Nick Ut’s 1972 photograph of Kim Phuc; the Norwegian Prime Minister reposted it in protest and had her copy removed too; global press coverage forced a reversal. Gillespie uses this as an entry point into the structural impossibility of consistent moderation: Facebook had a *specific policy* on *this specific image*, drilled into content moderators in training sessions. The policy failed not because the system broke down but because it worked exactly as designed, and the design was wrong.

The chapter establishes two key tensions that run through the whole book: (1) ‘cultural and legal prohibitions against underage nudity are firm across nearly all societies’, yet applying that norm to a Vietnam War atrocity photograph is a category error; and (2) a global platform must impose a single content regime on images that every serious news editor has always treated as requiring contextual judgement. The *New York Times* published the photo only after internal debate; Facebook applied a rule.

Nixon’s reported response (wondering aloud whether the image had been faked) appears only as a footnote in Gillespie, but it opens onto a much larger point: powerful actors have always tried to neutralise documentary evidence by attacking its authenticity rather than engaging with what it shows.

## Further reading

- David Stenerud, ‘The Girl in the Picture Saddened by Facebook’s Focus on Nudity’, *Dagsavisen*, 2 September 2016.
- Mike Ahlers, ‘Nixon’s Doubts over Napalm Girl Photo’, *CNN*, 28 February 2002.
## Linked concepts

- [[Content Moderation]]
- [[Platform Power]]
- [[Norm Conflict]]
- [[Photojournalism]]
- [[Richard Nixon]]