Just over 40% of Americans report that religion plays a very important role in their lives,[1] a proportion unique among developed countries
end partial quote from:
https://en.wikipedia.org/wiki/Religion_in_the_United_States
Why do you think that religion is more important in the United States than in other developed countries?
My answer would be that it is the hardships Americans endured in settling this country that made them tougher in many ways than people in other developed countries. Though Native American civilizations were here, they were mostly wiped out by the diseases that White men and women brought with them, mostly from Europe. Native American immune systems hadn't been hardened by thousands of years of big-city life, where sanitation problems killed millions over the 3,000 to 5,000 years of urban living in the Middle East and Europe. So Native Americans hadn't lost millions and millions of people to disease over 5,000 years, and when settlers came to America, those diseases almost completely wiped out the 70 million plus Native Americans; many tribes were reduced to 5% or 10% of their original numbers.
This allowed settlers to establish themselves here with far less trouble than they would otherwise have had. Also, having nowhere to turn but God, and living remotely as many Americans did, forced them to rely on God and religion in order to face the many, many hardships they endured from around 1620 onward, when the first Pilgrim settlers succeeded in surviving here in what became the U.S., where all the earlier groups had died.
So, it is the hardships that early Americans faced that made us, as a group, as religious as we were and still are, which is MUCH MORE religious than any other developed country on earth.