One of the greatest things about reading nonfiction is learning all kinds of things about our world that you never would have known otherwise. There’s the intriguing, the beautiful, the appalling, and the profound. What nonfiction book or books have impacted the way you see the world in a powerful way? Is there one book that made you rethink everything? Do you think there is a book that should be required reading for everyone? (Rebekah of )
talk to each other. Bowling Alone by Robert D. Putnam shared statistics that revealed the
frayed and battered state of the American community. I read both of these twenty years ago.
Love Your Enemies and Alienated America are books I ran across about four years ago; both alerted me to the ways in which our community has grown even weaker and our civil discourse even more strident.
In the last two years, I've focused on learning how to talk with those I disagree with. To that end, I've read I Think You're Wrong, But I'm Listening and I Never Thought of It That Way.
I welcome any other book recommendations you might share with me.