I am having an argument with my brother over this. He thinks that because the USA is able to conquer other countries, that makes it the better country. He is a brainwashed little shyt thanks to my Republican father. He has never left my parents' house; he lives at home in the southern USA with my father, who has also never left the country.

I told him I'd rather move to Japan or Singapore and get out of this horrible country for good! I don't want to be here anymore! I have been around the world, and I have studied other countries and their governments (for fun). If my kids' father would let me leave this country, I would go to Japan and be an English teacher like I have always dreamed of doing. I know I will be leaving once my kids become adults.

I understand taking pride in your country, but there is a limit where it just becomes ignorance, and I am sick of that ignorance being shoved in my face on a constant basis. The people at work roll their eyes and say that people around the world should be thankful that "America is so generous" when they talk about the USA going into other countries and forcing its views on the population. Really? Does this not sound like the conquering that other countries did in the past? Does it not sound like when Germany, Italy, and Japan tried to take over the world in WWII? Sure, the USA is doing it differently, but I am sick of the attitude! The USA isn't generous! It isn't going into these countries out of the goodness of its heart!

Really, do you think the USA is BETTER than every other country on this planet? That the USA should be a central power over all the other countries?
Do you think every country should be exactly like the USA in all of its ways?