Sighs in WW2
I've seen that movie. America did all the fighting by itself
If Japan had never attacked the USA, it would never have entered the war.
The USA did not declare war on Germany after Pearl Harbour, only on Japan. It was Germany that declared war on the USA.
Both of which are totally understandable.
The USA joined the Allies because it was forced to, not because it decided to.
Why did America care that Germany declared war on them?