What has happened to Christianity in America?

Wednesday, June 13, 2018


What has happened to Christian churches in America? Christian churches, perhaps especially the mainline denominations, have been infected with the error of doubting the inspiration, inerrancy, and authority of the Bible.


