How was life similar and/or different for American women before, during, and after the war? Did the war change the role of women in American society?



Answer:

Before the war, women generally could not get jobs; the prevailing social view was that a woman's place was to stay home and keep the household running. During the war that changed: with most of the men away, someone had to work in the factories, so women took over those jobs. After the war, however, they were pushed out of the factories so the returning men could have work.

The war greatly changed the role of women in America. Before the war, women were largely confined to domestic life: they were discouraged from participating in public life and were expected to concern themselves mostly with family matters. During the war, however, women took on roles they had never attempted before. They joined the workforce in large numbers and often became the providers for their families while the men were away fighting.

After the war, many women did not want to return to a purely domestic life. Feminism gained traction, and women achieved greater visibility in the public sphere. They also gained more legal rights and a new role in American society.
