
How did the role of women in the United States change during and after World War I?



Answer:

During WWI (1914-1918), large numbers of women were recruited into jobs vacated by men who had gone to fight in the war. New jobs were also created as part of the war effort, for example in munitions factories. After the war, many of these jobs returned to men, but women's wartime contributions strengthened the case for suffrage, and in 1920 the 19th Amendment granted American women the right to vote.
