Few People Know The Real Reason Behind Hollywood Becoming The Film Capital Of The World
By Natasha Kayes | Published January 12, 2024
Natasha Kayes
Author
I was born and raised in sunny Southern California and will never tire of the West Coast, although I spent several years living in Southeast Asia, about as far from California as you can get. Wherever I am in the world, I love straying from the beaten path, experiencing local life, and discovering hidden gems - camera in hand. The beach is my happy place and when I am not there (or writing), you will usually find me baking, watching movies, and cuddling my pugs. I have traveled around the country and around the world, and it never, ever gets old. Being able to combine my passion for travel and my love of writing is nothing short of a dream.
You hear “movies” or “film industry” and you think Hollywood, right? They are virtually synonymous today, but the fact is that this is not where the movie industry started and, if things had gone a little differently, we would think of New Jersey as the film capital of the world. What is more, Thomas Edison was – unintentionally and indirectly – responsible for the establishment of Hollywood’s film industry. Keep reading to learn the intriguing *real* history of Hollywood as we know it.
Did you already know the history of Hollywood's film industry? It seems that no matter how well we know our home state, there is always something new and interesting to learn, and we just can't get enough! If you feel the same, dive in and check out more Southern California history and fun facts. And if you are a California native (and proud of it!), you might love the cool duds at Wear Your Roots. See something you can't live without? Be sure to use coupon code CALIFORNIA10 for a discount!