You assume, then, that someone (or a group of someones) arbitrarily decided what the "proper" gender roles are. That's false. The gender roles assigned to men and women are biologically driven. It's a fact, for example, that nursing infants receive important disease immunities from their mothers; obviously, a father playing "mother" can't provide that protection. Consequently, women have the role not only of childbearing but also of child nurturing.
As far as men wearing pants and women wearing dresses go, are you unaware of the numerous situations in which women wear pants: on a ranch, in a factory, in certain sporting events, on construction projects, and so on?
BTW: Your claim that gender roles are linked to male dominance is correct in an overarching sense. Why? Men are, on average, physically stronger than women. It isn't any more complicated than that.