The Thammasat University Library has newly acquired a book that should be useful for students interested in philosophy, ethics, psychology, sociology, civics, political science and related fields.
Moral Tribes: Emotion, Reason, and the Gap Between Us and Them is by Joshua Greene.
Professor Joshua Greene teaches psychology at Harvard University in Cambridge, Massachusetts, the United States of America; his research combines neuroscience and philosophy.
The TU Library collection also includes other books about community morality.
His book examines the paradox that morality helps people in a community to cooperate peacefully, but does not encourage comparable harmony with members of other groups.
Morality is meant to promote cooperation, but human brains evolved to cooperate only with a select few: members of our own groups, usually in the context of personal relationships. They did not evolve to extend that cooperation naturally to all groups.
When the members of a group pursue only their individual interests, the result can harm the interests of all of them. One example might be businesses in the fishing industry that overfish the oceans until fish stocks run short. If these people learn to limit their self-interest by agreeing to follow shared rules, the common good can be preserved.
Professor Greene states that morality requires us sometimes to put our group ahead of ourselves, but this same instinct also inspires us to put our own group ahead of other groups.
We feel obligations toward, and associations with, members of our own community, but not toward outsiders. Conflicts arise wherever the values and interests of different communities clash, on scales both international and local.
Professor Greene proposes that to resolve this issue, we need common values that all humans can agree on, even if they sometimes contradict what our common-sense morality urges us to do. If we aim at maximizing happiness as an impartial goal, that goal may conflict with instinctive individual morality and with our obligations to our own community.
Yet this approach may be the basis for universally valid standards of conduct.
In an interview posted online, Professor Greene stated:
When we talk about “rights,” it is, I think, really just a way of insisting our gut reactions are correct. If we can take a step back and simply describe where we’re coming from, and what our concerns and values are, that might be more effective than making high-minded declarations about who has what rights. […]
The tendencies that I’ve seen are general human tendencies, more apparent in some cultures than others, but not tied in any strong way to gender.
There are interesting differences between nations like the U.S. that are individualist and more collectivist ones like India. Research by Joe Henrich and colleagues indicates that many cultural differences stem from how people earn a living. In “Moral Tribes,” I discuss how in smaller societies where people hunt on their own, they don’t expect much from other people, and they don’t give much themselves. In small-scale societies where people need elaborate cooperation to survive — doing things like whale hunting, for example — people tend to be much more generous. […]
We’ve run many studies with financial stakes. In one set of experiments conducted with David Rand and Martin Nowak, we had people play the “public goods game.” Each participant gets $10. They can decide whether to keep their money, put all their money, or just a portion into a common pool. The money put in gets doubled and is equally divided among the four people. If you’re selfish, you simply keep your money and get a share of whatever the other people put in. That’s how you maximize your own payoff. But, if you care about the group as a whole, you put all your money into the pool.
In one of these experiments, we put people under time pressure. If your gut reaction is to be cooperative, then time pressure should make you give more, and having time to think, “Well, maybe I shouldn’t. I’ll lose my money,” will make you give less. By contrast, if your first instinct is to be selfish, you may initially think, “I want to keep my money,” but then after deliberating you might say, “Well, maybe I should put the money in the pool for the greater good.” A third possibility is that there are no competing impulses. People will just do what they wholeheartedly want to do. We don’t see this in all populations, but at least in some people, the more automatic reaction is to be cooperative, and further thinking makes people less likely to contribute. […]
I think that, in general, people have prosocial emotions. This is the core of morality.
We have negative and positive social emotions. We can apply them to our own behavior and we can apply them to the behavior of other people. If I care about the person I’m interacting with — someone I love, a friend, or just a stranger towards whom I have good will, my gut tells me to be cooperative, and doing that is rewarding. Likewise, I’ll feel guilty if I’m not cooperative.
We apply this thinking to others as well. If [in the public goods game] you put your money in, you’ll have the gratitude of the others who benefit. If you don’t, then you’ll have their contempt and their scorn. These social emotions motivate us to be cooperative — to think about other people’s well-being instead of just our own. Fortunately, we live in a society in which being exceptionally selfish is also shortsighted. Here, anti-social people tend not to do as well in the long run. But there are places where if you’re cooperative and trusting, you’ll lose.
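The payoff arithmetic of the four-player public goods game Professor Greene describes is easy to work through. Below is a minimal sketch in Python of the rules as he states them in the interview; it is an illustration only, not the researchers' experimental code, and the constant and function names are our own illustrative choices.

```python
# Payoff arithmetic for the four-player "public goods game" described above.
# A minimal illustrative sketch, not the researchers' experimental code;
# the rules follow Professor Greene's description in the interview.

ENDOWMENT = 10.0   # each player starts with $10
MULTIPLIER = 2.0   # money put into the common pool is doubled
N_PLAYERS = 4      # the doubled pool is split equally among four players

def payoffs(contributions):
    """Return each player's final payoff, given what each contributed."""
    assert len(contributions) == N_PLAYERS
    assert all(0 <= c <= ENDOWMENT for c in contributions)
    share = MULTIPLIER * sum(contributions) / N_PLAYERS
    return [ENDOWMENT - c + share for c in contributions]

# If all four contribute everything, each turns $10 into $20:
print(payoffs([10, 10, 10, 10]))  # [20.0, 20.0, 20.0, 20.0]

# A lone free-rider keeps $10 plus a $15 share, while cooperators get $15:
print(payoffs([0, 10, 10, 10]))   # [25.0, 15.0, 15.0, 15.0]
```

The sketch makes the tension concrete: contributing nothing maximizes your own payoff whatever the others do, yet everyone ends up better off when all four contribute.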
Here is a brief excerpt from Moral Tribes: Emotion, Reason, and the Gap Between Us and Them:
This book is an attempt to understand morality from the ground up. It’s about understanding what morality is, how it got here, and how it’s implemented in our brains. It’s about understanding the deep structure of moral problems as well as the differences between the problems that our brains were designed to solve and the distinctively modern problems we face today. Finally, it’s about taking this new understanding of morality and turning it into a universal moral philosophy that members of all human tribes can share.
This is an ambitious book. I started developing these ideas in my late teens, and they’ve taken me through two interwoven careers—as a philosopher and as a scientist. This book draws inspiration from great philosophers of the past. It also builds on my own research in the new field of moral cognition, which applies the methods of experimental psychology and cognitive neuroscience to illuminate the structure of moral thinking. Finally, this book draws on the work of hundreds of social scientists who’ve learned amazing things about how we make decisions and how our choices are shaped by culture and biology. This book is my attempt to put it all together, to turn this new scientific self-knowledge into a practical philosophy that can help us solve our biggest problems.