Do Western Societies Have a Rape Culture?

Unchallenged
May 29, 2021

The entertainment industry has been instrumental in feeding rape culture.

Journalist Harriet Williamson points out that rape is commonly normalised through rape jokes and depictions of sexual violence in television shows and films. Popular American TV dramas such as Game of Thrones have featured sexual violence prominently in their plots.

Sources