What's up, beautiful people of L.A.! I'm a college sophomore who, after years of living through harsh Northeast winters, has decided the east coast is NOT for me. I'd rather have warm weather year-round, visiting the beach whenever I please and all that. I also feel as if the cold weather makes me depressed. So yeah, I'm 100% sure I want to move to California after I graduate, but I feel as though my reasoning isn't entirely sound.
I guess this question is for those who moved from the east coast to the west. Was it everything you expected? Did you experience any culture shock? Is the climate really as good as I'm anticipating? Are there any negatives to living on the west coast in general? I don't want to move to Cali based only on what I've seen in movies and magazines. I don't have the means to visit L.A. at this point in my life because I'm a broke college student, lol. Thanks, guys!