Men didn't give women their freedoms; women fought and died to win them.
Women have contributed to Western civilization for as long as men have, you silly bint. Or do you think that women sat on their asses, doing nothing whatsoever, for hundreds of years and only began to pull their weight when they got voting rights?
What is it they tear down, exactly? I can only judge by what's around me, but the trend here is that spaces with only women tend to be cleaner and fresher, while spaces with only men are sometimes downright filthy.
The world has never been in better shape than it is right now, btw.