America and Football
America has four major sports leagues: the NFL, NHL, NBA, and MLB.
But with the USA's growing success in football (soccer), having topped a World Cup group that contained a usually dominant English side, will the sport get a foothold in American culture and become one of the majors?
Post your opinions: what do you think will influence its acceptance or rejection, and so on?
I personally think it will, especially with the USA doing so well in the World Cup at the moment. They are playing like a top side, not like a side from a country where football is only around the fifth most played sport. The USA has great potential to be a devastating football nation if the game is fully embraced and accepted as the world game it is.

The USA needs to stop kidding itself and start playing real world-class sports. The World Series is the annual championship of the highest level of professional baseball in the United States and Canada; doesn't it strike you as a little silly to call your top sporting event the "World" Series when it only involves two countries? The majority of the world doesn't even play baseball, and the countries that do rarely treat it as a major sport. The same goes for the NHL: ice hockey is played more widely around the world, but it is still a minor sport in comparison to football. The NFL is, once again, played only in America and Canada. The NBA covers another sport that is played in some countries around the world but is really only embraced in America.
The USA is dominated by sports that only it really plays, so it can claim to be the best at them largely because NOBODY ELSE THINKS THEY ARE GOOD GAMES. Being genuinely good at football would therefore be a legitimate ego boost and is really the best way for the US to go in terms of sport.