I'm sick of all of the "public friendly", "made for everybody" mainstream movies. They bore me.
I'm looking for movies that are a bit dark and surprising. I'll give you an example: I thought Fight Club was excellent. I remember they didn't even market what the movie was really about; the twist was a surprise.
Are there any more movies like that?