Now I notice people here saying that the reason women have become "masculine" is that the men in society haven't lived up to the societal norms of being "a man," so women have filled that role. That's not how I see it. In many cases, particularly in the West, women have advocated, specifically through the feminist movement, the idea that a woman can do all of men's roles and does not need a male figure in her life. They also pointed out that they themselves wanted those jobs for their own sense of empowerment, basically a false sense of equality pursued by taking up the roles men in society previously held. I have no idea why the blame is now shifted onto men for the advocacy of these women, which has made it so that both men and women do certain things, for example shopping, which is what this thread was originally about.
You are totally right about Western women, but when it comes to Somalis, who mostly came to the West around 1991, they were forced into it.
Also, they say that the housewife/working-man arrangement is a myth in the West and that most working-class White women have always worked.
The feminist movement was more about getting equal pay and better job opportunities.