I chortle at the inference that Americans have somehow expunged their colonial origins.

How can you say that we have expunged our colonial origins? Our schools teach that history, but there's no reason to keep calling us colonies. I find it rather offensive to call any sovereign nation a colony. "Colony" implies that a nation is still subject to, or closely associated politically with, another, superior country. I think most people are aware that we made a little break in that connection about 225 years ago. We fought a vicious war and proved our superiority. (No offense intended.) From what I understand, New Zealand and Australia were granted their sovereignty by parliamentary legislation rather than by war, but that doesn't make them colonies either. I see the term colony as derogatory when applied to other countries, especially when we "colonies" were the ones saving the UK in the World Wars.