Best Documentaries About Hollywood Icons


Every year, when the Academy Awards air on television and computer screens around the world, audiences and critics alike joke about Hollywood's tendency to reward films about its own industry. While fiction films tackle the subject of Hollywood again and again, documentaries remind audiences of the origins and histories of the American entertainment industry. These documentaries offer insight not only into the actors who built Hollywood, but also into the societal impact of movies as a whole. It is impossible to deny how deeply movies shape everyday life, whether in the contemporary era of Marvel blockbusters or the nickelodeons of the early 1900s.



