War with Spain Changed America for the Worse (And We Knew It Would)