
Gender Roles in America: Women in the Workplace

Women in the Workplace

While there has been significant progress made in creating a fair and equal playing field for women in the workplace, 2020 and the COVID-19 pandemic highlighted the disproportionate expectations for women and men in the workplace. Moreover, it highlighted the underlying gender roles that our society continues to uphold. For this discussion, I want you to read the following two articles and listen to the associated podcast:

- Women in the Workplace 2020
- COVID-19 threatens to reverse five years of progress for women in Corporate America

After reviewing the links above, provide a summary of how you believe gender and societal gender roles have impacted women in the workplace in 2020. In your review:

- Define gender and its role in American culture.
- What roles have women typically held in American culture? How has that evolved over the past 20 years?
- Where do you see women's roles in the workplace 5 years post-pandemic?

Gender Roles in America

The World Health Organization defines gender as "the socially constructed characteristics of women and men, such as norms, roles, and relationships of and between groups of women and men" that vary from one society to another (World Health Organization, 2019). In the United States, gender has often been treated as binary: male or female. Thus, there are roles, norms, and behaviors that American society expects from men and women.

Traditionally, American women were expected to handle domestic duties: caring for the children, cooking for the family, and keeping the home clean. Marriage was therefore a significant milestone for women, as they needed a husband to work and provide for them. Households survived on a single income, with the man (husband or father) responsible for the family's financial well-being while the woman managed the home.

However, this has changed as roles have become more integrated. More families now rely on dual incomes, with both spouses earning a living to support the family. The emergence of daycare allows mothers to leave their children in a safe place and go to work. This has greatly benefited women, who can now raise families without depending on men. Single mothers work just as hard in various professions to raise their families as they see fit, and men are increasingly sharing the responsibility of raising children.

Post-pandemic, I believe women will gain even greater representation in the corporate world. This is especially likely given the workplace flexibility that COVID-19 has brought, enabling women to fulfill their corporate duties remotely. Extended maternity leave is no longer as necessary, which increases women's availability. In this way, more women will land senior roles in the workforce and, with proper planning, hold them over long periods.


World Health Organization. (2019, June 19). Gender.
