Was America Really Founded as a 'Christian Nation'?
Most Americans believe the U.S. was founded as a Christian nation. But is this merely a myth? Larry talks with Professor Kevin Kruse, author of a new book arguing that 'Corporate America' invented 'Christian America,' and about how that idea has defined and divided our politics ever since.