Hollywood is so fake, and people need to realize that celebrities are just people. You don't need to be born into something, have money, or buy whatever product someone is hawking at you.
The downfall of the industry actually seems to be good for art. I think the industry will find its way once it downsizes, shifts its focus away from its greed-based origins, and begins to support creative visions that speak to our times and shifting ideals.