I wish men would understand what it’s like to live in a world where women’s bodies are constantly sexualized and used to make others feel inferior. Our whole society is built around making women feel bad about themselves so that they consume. There isn’t a single place you can look where you won’t find a woman’s body being used to sell a product. It’s hard being bombarded constantly with pictures and videos of what you’re supposed to be and how you’re supposed to look. It sucks. And it constantly makes you feel inferior.