Hollywood's Role
What is Hollywood's role in US policy and politics? (Generalities abound here ... I know it's wrong, but I don't care.)
Hollywood, on the whole, has become very politically active against the current White House administration, from Michael Moore to many others. This activity, and a comment I read today on Captain's Quarters, made me wonder: what is Hollywood's real role in US politics and government?
Many Hollywood personalities have pushed President Bush's buttons since he was first elected over Al Gore. The 9-11 attacks offered a slight respite from the undermining of the administration, but the aftermath and the war in Iraq have reinvigorated those who feel they have an axe to grind. Political involvement and speaking your mind are not only rights but responsibilities, if you believe the lines coming out of Hollywood.
What I have a problem with is Hollywood's subsequent use of these events to make money. Will there be movies made about elements of the Iraq war? Absolutely. Will there be movies made about the events of 9-11? For sure (although further down the line, I think). What will the goal of these movies be: to educate the masses, or to entertain and make money? I think we all know it is the latter. Does that not make the criticisms of the White House seem hypocritical?
