Friday, February 12, 2010

Is the United States of America a Christian Nation officially?

Not officially, but Christianity is the biggest religion in the US.
The United States promotes no religion; it's a free country.


It's called separation of church and state. If you were truly an American, you would already know this.
Not anymore.


It was once blessed, but now you have abortions, gays, adultery, porn, materialism; I could go on and on.
Not any more.