When did the American Empire begin?

I’m reading Paul Cooper’s Fall of Civilizations. He gives starting and ending points for each, usually when a major king was enthroned. Ex.: Khmer 802-1431.

That got me to wondering what date a future historian would give to the USA. 1776, declaring independence? 1783, the Treaty of Paris? 1789, the start of the permanent government? Something else, like taking territory from Mexico in 1845?

A poll follows.

  • 1776 - Declaration of Independence
  • 1783 - Treaty of Paris
  • 1789 - Inauguration of Washington
  • Other

I’d strongly argue against any earlier date, but if you want to make your case for 1763, when we were on the winning side in a world war, or anything else, let’s hear it.

I voted “other.” While the U.S. can trace its lineage as a nation back to 1776, it wasn’t until the end of the 19th or early 20th century (arguably as late as the 1940s) that the nation really became a world power, which is what I’d picture being termed an “empire.”

I don’t think the term can possibly apply until 1803 at the earliest, with the Louisiana Purchase.

But I think I would actually place it at 1819, when we forcibly convinced the Spanish to cede Florida to us, having already built a small but real navy and used it in the War of 1812.

Agree with the previous. “Empire” means something specific, and the U.S. did not qualify at its founding.

Is it an empire now?

“The 19th century represented a clear decline for the Spanish Empire, while the United States went from becoming a newly founded country to becoming a rising power.”

Interesting historical change in your use of the word “empire”. Before the British, every empire was a localized empire. Even the European colonizing nations like Spain, France, and the Netherlands were never considered world empires. Empires, yes. Was Spain referred to as a world empire? I don’t remember seeing it.

And is America a “world” empire? Russia and China are major powers of their own, and would certainly be considered localized empires in historic terms. (The Khmer were hemmed in by the Dai Viet, Champa, Thai, and Ayuthayan Empires.) The US certainly hasn’t conquered them; it hasn’t even fought them. The countries it did fight, like Germany and Japan, are happily independent, with their own foreign policies.

I’m going to duck for now the question of whether America is an empire at all, though. Too many people think it is. I’ll see where the thread goes.

Yeah, I’m honestly not sure it is, either, but I was trying to answer your question without fighting the hypothetical too much.

I’m going with their participation in WW II as, IMO, they gained control/coordination/management of the West as well as the ability to project power (to varying degrees) anywhere.

I would give credit to Polk and the Mexican War in 1846. It was the first unvarnished land grab from what a 19th Century American would consider a sovereign nation.

In terms of a global overseas empire, there is a credible argument that the American empire started with the General Sherman incident and the subsequent Korean Expedition of 1871, the first instance of military invasion of a sovereign nation that did not share contiguous borders with the continental United States, although the goal was to force capitulation of the Joseon regime rather than to annex land or establish colonies. (Forcing Spain to cede Florida and Puerto Rico, or purchasing land from France or Russia, can’t really be viewed as imperialism per se…if you ignore the fact that both the US and the European colonial powers essentially ignored the sovereignty of the peoples who had occupied those areas with some degree of continuity going well back into the pre-Columbian era.)

Certainly the US annexation of the Philippines in 1898 and the subsequent war of 1899-1902 to topple the fledgling First Philippine Republic was an act of imperial aggression, although the 1898 annexation of Hawaii, following the 1893 overthrow of the internationally recognized Hawaiian Kingdom in a coup d’état by private business interests (albeit in alignment with general American policy), could be regarded as the first blatant act of imperial reach. The subsequent acquisition of the French Panama Canal project in 1903, the US occupation of Haiti in 1915, the various fuckeries that the United States engaged in throughout Latin America under the collective historical term of the “Banana Wars”, and then all of the post-WWII covert fumbling all over Central and South America, Cuba, Grenada, et cetera, are all clear evidence of imperial proclivities, although the aspiration of the United States was not to create an empire of colonies or acquire land, but to gain influence over governments, strategic access to resources and favorable military basing, and control of global trade, a hegemony that it maintained into the early 21st Century.

Of course, the United States started grabbing sovereign territory to claim as its own, and forcing capitulation of adjacent nations and peoples to increase its wealth and control, before it even became the federated nation it is today, and it continued to do so until it went as far westward as it could go, then turned its attention to the south and outward across both bordering oceans. So, in a real sense, it has always been an empire, one that inherited its colonial ambitions from its predecessor.

Stranger

Before that, in 1853, Commodore Perry sailed to the Ryukyus and the Bonin Islands south of the main Japanese islands, and claimed territory for the United States before sailing into Tokyo Bay and intimidating the Japanese into signing favorable trade agreements. Not exactly an invasion, but the same goals were accomplished with (a show of) military force.

It depends on how one defines “empire”. Is a monarch, whether titled emperor or something else like king, khan, sultan, tsar, etc., a requirement? Do the acquired territories have to be in some way not a full part of the imperial power? If so, then the United States isn’t and likely never will be an empire. If not, then I would argue for one of the early 19th century dates mentioned above, like the Louisiana Purchase or the acquisition of Florida.

I don’t think the form of government is really a factor. The element that stands out to me is that the land isn’t taken to make way for the expansion of the “empire’s” population, but is acquired for the sake of exploiting that land’s existing people, resources, trade, and militarily advantageous location.

If so, then we might have had brief periods of being a small empire, when we had conquered but then left places like the Philippines, Japan, and Cuba, but nothing close to large or major. Given our method of expansion, admitting states as full members of the union rather than ruling them the way Spain ruled Mexico or Britain ruled India, we’re not an empire by that definition.

When Spain lost control of Cuba in 1898, Columbus’ remains, which had been moved there for some reason, were transferred to Seville in Spain. Kicking Columbus out of the Western Hemisphere is an act some ancient emperor would approve of as a sign that he was now the sole ruler.

I think you’re mostly right about that post-1960. Hawaii and Alaska were territories for a long time that lacked the autonomy of statehood. I believe current US territories have all had the opportunity to put referendums before their people concerning sovereignty, statehood, or the status quo.

My wife was just telling me about an article she read about Columbus’ remains. Was his body expelled by the US, or was it evacuated by Spain the same way it was rescued from the slave revolt in Haiti a hundred years earlier?

An expansionist land empire like the Mongols? That qualified on the day the Jamestown party realized there was no silver to dig up and take home, so they’d better start planting tobacco. After that it was Drang nach Westen all the way.

An overseas trading empire like the British/Dutch/French East Indies companies, where Mercantilism dictated that only their ships could use these ports? That happened shortly after the Battle of Manila Bay, when the German navy landed troops of its own until Admiral Dewey chased them off.

But at that point it was still British colonists doing the expanding. IIRC, the first American expansion of that sort was Andrew Jackson and the Trail of Tears. The only difference is that rather than declaring Florida (or Texas, or California, etc.) to be a colony, we admitted them as a full part of the country. IMHO that still qualifies us as an empire, just not of the same type as the British, Spanish, French, Dutch, etc.