Greetings! To start off, I'm not American and I don't have much familiarity with Playboy as a magazine or a company; the only reason I know of it at all is from movies and TV shows.
I've always thought Playboy was a chauvinistic magazine that objectified women and was owned by a not-so-nice man. So when I went to their site, I was a little surprised to find articles like "The Manosphere Isn't Even Having Fun" or "Losers Who Wear Smart Glasses Are Filming Women Without Consent."
Can someone explain to me whether I've been wrong about Playboy, or whether they're actually progressive?


i'm not an expert on playboy, but i have to think "feminist" would be quite a stretch… "progressive," maybe. that said, with big adult entertainment publishers like playboy, it's all consensual; no one's getting their pictures taken against their will. the men (and women) who consume that media are there for T&A, and the talent are there for the money; everybody wins. the fact is that some women enjoy showing off their bodies, and if they can get paid boatloads of money to do it, all the better.
all of that starts breaking down the further down the ladder you go. an example is the girlsdoporn scandal, where women were coerced into making videos they were told would never be released domestically. the whole thing ended with the owners of the business getting decades in prison, as it should have.
So essentially Buzzfeed, but it ended worse?