1941 in the United States

Events from the year 1941 in the United States. At the end of the year, the United States officially entered World War II by declaring war on the Empire of Japan following the attack on Pearl Harbor.