The United States Goes To War
From independence to the beginning of the twentieth century, the United States maintained a tradition of isolationism: America stayed out of world affairs and expected the world to stay out of American affairs. This changed with the onset of World War I.
Although the U.S. aimed to remain neutral, Germany’s unrestricted submarine warfare led to the sinking of American ships and the deaths of numerous Americans. This and other provocations, such as the Zimmermann Telegram proposing a German–Mexican alliance, proved unacceptable, and the U.S. declared war on Germany on April 6, 1917.
American contributions were not confined to the battlefield: economic production increased, women entered the workforce, dissent against the war was stifled, and American citizens purchased bonds to finance the war. The United States’ mobilization and entry into the conflict boosted the Allies’ morale, sapped German will to fight, and helped bring about the armistice of November 1918.