American Imperialism

“American imperialism” is a term that refers to the economic, military, and cultural influence of the United States internationally.

Learning Objectives

Define American imperialism

Key Takeaways

Key Points

The late nineteenth century was known as the “Age of Imperialism,” a time when the United States and other major world powers rapidly expanded their …