- April 18, 2013 - 8:33 pm in Film
Hollywood at Arms: How the Movies Take America to War
The American film industry has always been fond of portraying war. From its earliest days, telling the story of America’s wars has been a Hollywood staple. The depiction of armed conflict…