Let me reiterate: no question. Let me also add the demeaning way women are treated in the entertainment industry, not just as actresses, but also in the types of movies being made, which themselves demean women.
Some liberals (especially those in Hollywood) like to present themselves as the moral authority, always taking the high ground on social issues, when time and time again it's only a façade. The bottom line is that Democrats are tied to Hollywood and the entertainment industry in general, and if women's rights groups think this is okay, then they're living by a double standard.