US War Economy
Arguably, ever since entering World War II, the United States has operated a war economy. Starting or fostering wars became, independently of geopolitical considerations, a "good" business proposition; the early 1940s marked the beginning of an era of systematic wars for profit, with war serving as the ultimate capitalist enterprise. The extraordinary war effort of World War II turned the United States into a giant global arms factory supplying the wars in Europe and the Pacific. That effort was even, cynically, credited as the main factor in ending the Great Depression that began in 1929.