United States non-interventionism primarily refers to the foreign policy the United States pursued from the late 18th century through the first half of the 20th century, whereby it sought to avoid alliances with other nations in order to prevent itself from being drawn into wars unrelated to its direct territorial self-defense. Neutrality and non-interventionism found support among both elite and popular opinion in the United States, though that support varied with the international context and the country's interests. At times, as during the interwar period, the degree and nature of this policy were better known as isolationism.