The idea that the United States was established as an explicitly Christian nation, and that the close relationship between Christianity and the state must be protected (and in many respects restored) in order for the United States to fulfill its God-given destiny.