
The idea that the United States was established as an explicitly Christian nation, and that the close relationship between Christianity and the state must be protected—and in many respects restored—in order for the United States to fulfill its God-given destiny.
