America in World War 1
When World War 1 erupted, the United States initially maintained a position of neutrality. However, growing German hostilities eventually prompted President Woodrow Wilson to ask Congress for a declaration of war against Germany, officially drawing the United States into the conflict.