The President of the United States, Barack Obama, stood on the world stage and proclaimed to the inhabitants of the earth that the United States is not a Christian nation, to the shock and dismay of Christians everywhere. If the President meant that the U.S. has gradually turned its back on Christ and His Bible, few would disagree, but that was not his meaning. Is the United States of America a Christian nation?