So for the past few weeks, the Discovery Channel has been airing a special called "Life."
It is amazing to see how nature survives and thrives.
I have noticed, though, that in nature it is usually the males who impress the females. Male animals have elaborate dances, rituals, bright colors, and other ways to attract and compete for mates.
Up until recently, this pattern held true in the human world as well: males would compete for and try to impress females. Somehow, though, this has been turned around, and I feel like women are now the ones pulling out all the stops to get men to notice them. It goes beyond just clothes and makeup, to very public displays of private areas, plastic surgery, and extreme competition between girls that now sometimes even erupts in physical violence.
When did this shift occur? Is it real, or am I misreading things? And why?