Question
How did World War I change women's roles in the United States?
Women received greater educational opportunities.
Women fought alongside men in the military.
Women replaced men in the workforce.
Women earned more money than men.
During World War I, many men left their jobs to serve in the military, and women took over those positions in the workforce. Women did not receive greater educational opportunities as a direct result of the war, they did not fight alongside men in large-scale combat roles at that time, and they did not earn more money than men.
Answer
Women replaced men in the workforce.