I am very sure that this is not the truth, though you may well be honest in mistakenly proposing it. I have too much affection for your country and its people to contemplate the possibility that what you have written really represents their attitude towards the rest of the world. But I'm glad you have been induced to speak your own personal feelings openly. It's always good to get it out.
Why? See, this is what we keep asking you, and you don't answer and get twisted about it. I'm married to a European. My sister is married to an Indian who was raised in England. My ex-husband is from Egypt. My best friends are from Pakistan, etc. etc. Around we go.
I have a lot of love for Europe as well. But over the years I've noticed this ridiculous snooty-pants attitude from Europeans who denigrate Americans for thinking they are the center of the world, etc.
And while this may be true to some degree, >>>> read this part three times >>>> AT LEAST IT'S BASED ON SOME REALITY.
America IS the most influential country in the world. Yes, China is up there and very important. But as I've pointed out before, our country is less than 300 years old, and it is up against China, which has been around for thousands of years.
If you go to any country in the world, they can tell you about America. Why?
The world knows all about America and we're used to it. It's just the way it is. It's just the way it happened.
It seems that Europeans cannot handle this concept. They get outraged when Americans are not interested in the rest of the world in the same way. Many Americans I know are content to live simple lives and never leave their own zip code, just doing their thing. As a result, the goings-on in the world have absolutely no meaning or influence in their lives at all.
Now for me, I would curl up and die if my life were like that. Most of the people I know personally are very interested in the world, in history, and whatnot.
But the question I ask you YET AGAIN, which will not be answered because it has not been answered this entire thread, is this:
What does Europe do that makes it so relevant and important to Americans?
Because most of what I hear is that it's famous in HISTORY, not that it's doing anything all that fantastic now.
China pulls the interest of Americans because of trade.
Europe tends to pull the interest of Americans as "tourists," if you know what I mean.
So please explain what's going on in England and Europe that requires America to sit up and take notice.
So far, I've got the UN, allies in invasions, etc. Then what?
Next consider Canada, and again: what is Canada doing that is so important that it would influence Americans?