Hollywood is fake, full of plastic both ON and OFF the body! I hope the world doesn't think we are all like that! :0(
How has Hollywood shaped the way the rest of the world thinks about America? If you're an American, how do you feel about the way your lifestyle is presented on film?
In saying "Hollywood", one assumes, of course, that you mean the entertainment industry. Suffice it to say an unrealistic and often unflattering image of America and Americans is portrayed, for the obvious purpose of ratings and profit. We are depicted as cowboys, smug materialists, or vain drama queens. Yet it is even worse within our agenda-driven, sensationalist news media. Small wonder other countries see Americans in a negative light.
All told, we Americans are rather poorly served in the image department, as few, if any, real residents of the US live an existence anywhere near comparable to what is presented in our entertainment venues.
I am content. Hollywood is Hollywood! It's a movie industry, and I know many people don't rely on Hollywood to study a country. But surely an image does get produced, and my view is the opposite of all the answers here so far. Many of the images presented in Hollywood are real, combined with the individual opinions, praise, and criticism of the production crew. If people form a positive or negative idea of an entire country from one or a few movies, regardless of the professional aspects of the filmmaking process, that is their fault, not Hollywood's.
The most positive image we can take from Hollywood movies is the exercise of our freedom.
For me, the only negative part of Hollywood today is the "sexualization" factor.
I cannot understand the yin-and-yang perspective of "fake" that so many agreed with! I respectfully disagree, and I would say that the way some parts of our society just scream criticism at certain targets, such as Hollywood, celebrities, politics, and politicians, is so disappointing.