Many Americans would love to say that the United States is not and has never been an empire. Empire is what the British, French, Romans, and Mongols did, not the American republic, which was born of a revolt against imperial rule. Some scholars, such as Samuel Flagg Bemis, treat the U.S. seizure of the former Spanish empire at the end of the 19th century as the “great aberration” from the American norm of opposing empire. Since the Vietnam War era, however, most U.S. historians have broken with this view, and the idea that the United States was and still is an empire now dominates the discipline. Yet calling U.S. foreign policy “imperial” has become more of a political litmus test than a useful analytical term, leaving debates over American empire feeling stale.