• TheEnglishMajor
    +5

    As an American, when I hear "American literature," I immediately taste dust and alcohol.

    The subjects range from the Great Depression to the Vietnam War and the great American road trip, but the most popular decade is the 2000s.

    So, for whatever reason, the 2000s being the most-mentioned decade completely threw me, and that surprise bothers me. I shouldn't be so startled by the inclusion of recent literature under the label "American"!

    • cuttysark
      +4

      I agree. To my mind it conjures the American Dream, with thoughts of the dusty West, and always racism interwoven with it all. But I wonder if this is due to the literature we were exposed to when we were given these sorts of classifications. Certainly our school collections only offered The Great Gatsby, To Kill a Mockingbird and The Catcher in the Rye out of all the American novels.