Ever since America has turned away from its Godly foundation, it has lost the right to call itself a Christian nation.— Florida Made (@RichardMIATL) April 23, 2016
America was founded on Christian principles, hence the phrase "one nation under God." As time has progressed, America has drifted away from the very foundation that caused it to be blessed. Look around at what's going on and tell me whether America is still a Christian nation: same-sex marriage, sexual perversion, greed, people stepping on each other to get ahead, abortion, and more. Last time I checked, none of those things should be going on in a "Christian nation." They became prevalent because America has turned its back on God.

When Adam and Eve disobeyed God and ate the forbidden fruit, sin entered the world, and a by-product of sin is sickness and disease. The only way America can become a Christian nation again is if believers get serious about asking God to heal this land. Unsaved folks aren't going to ask God to heal America; many have no concept of spiritual matters. So it has to be the responsibility of believers to pray and fast for this nation.

As I see it, some believers can join the church choir, but they can't fast and pray for this nation. When their pastor calls for corporate prayer time, it's crickets and tumbleweeds. Some believers (myself included) like to have prayer time at home, and that's good. But when it comes time for corporate prayer for this nation, believers should have no problem coming together in one accord.

Until America turns back to God, it cannot call itself a Christian nation. From what I see, there's nothing Christian going on in this country.