Just goes to show how ridiculous it is that Hollywood always seems to be preaching to the rest of us about morals, especially considering the terrible impact its industry has on violence against women and gun violence in general. This isn't an accusation; it's a proven fact.