For many decades, professions have carried the suffix "-man" in their titles: policeman, fireman, businessman, and so on. Jobs have also been stereotyped as belonging to one sex, with doctors usually assumed to be male and nurses female. Over time, however, women have entered these so-called "man jobs" and shown that they can do them just as well. As a result, many professions have changed their titles, replacing "waiter" or "waitress" with "server," and women are taking on more senior roles, including CEO positions. Still, women are often underpaid and frequently passed over for higher-level jobs. I believe women deserve the chance to prove that they can take on leadership roles and perform as well as any man, or perhaps even better.