420blazeit69,

And lastly, WWII wasn’t a war of conquest for the US… Calling the US’ actions in Japan “Imperialism” destroys any credibility you may have otherwise had.

The U.S. declaring war on Japan after Pearl Harbor was not imperialism. But after the war, when the U.S. turned Japan into a vassal state and kept a ton of military bases throughout the Pacific (supplementing those from its initial phase of empire building), that is imperialism.
