Originally posted by Robtard
TIL: Rape culture now exists, when before it was just a "Leftist" lie, but only in Hollywood.
How clever do you think this is? We received new information about Hollywood, and it changes the perception when you find out that this business (one that really permeates many other industries, considering everything it takes to make a film) had sexual predators, and that a ton of people knew and kept quiet.