It’s Christmas Day and King Arthur’s knights are gathered at the round table, celebrating with the king and queen. The Green Knight, a kind of giant tree-man, rides in on horseback, interrupts the party, and challenges anyone present to a game. Here are the rules: The challenger will be allowed to deliver a blow that the Green Knight will receive without resisting. But first the challenger must agree to seek out the Green Knight in one year’s time and receive without resisting the same blow from him.
This is the set-up to the epic poem "Sir Gawain and the Green Knight," from the late fourteenth century, which I read in a couple of translations after seeing the recent film adaptation, The Green Knight. I’d never read the poem before, but the more I thought about its opening scene, the more I thought: how weirdly precise … and how familiar.
My brother and I, when we were kids, would take turns punching each other in the stomach. I recall, too, a scene from a comedy about insanely competitive college ballplayers, in which teammates take turns thumping each other’s knuckles with a flicked middle finger until the skin over their fists is flaming and their eyes are watering with pain. This is a boys’ thing; this speaks to what it supposedly means to be manly—brave, tough, strong.
Still, the round table knights, big burly fighting men, hesitate to take the Green Knight up on his proposal. Winning a contest of this sort depends on believing that one’s capacity to dish out and take punishment outweighs that of one’s opponent, in this case a giant who resembles a tree. Who would accept such a challenge? Let me rephrase that. What kind of stupid would a person have to be to take part in a game like that? In answer, I’ll simply say that my brother was two years older than I was and a lot stronger, and although, in our stomach-punching game, I never made it past the first round, neither did I ever refuse to play, and in fact, playing was, as often as not, something I myself initiated.
I may have been acting on an impulse that reaches far back into mythic time. It turns out that “Sir Gawain and the Green Knight” is not the only ancient tale that involves a similar challenge. Scholars have called it “the exchange game.”
Gawain, Arthur’s nephew, seated by his side, steps up and accepts the challenge. Making the most of his first-turn advantage, he takes a sword and beheads the Green Knight in a single stroke. To the surprise and horror of everyone in the hall, the Green Knight stands, picks his head up off the ground, reminds Gawain of the bargain, and rides away with his head in his arms.
More commonly in Arthurian studies, this particular version of the exchange game is referred to as the “Beheading Game.”
In my last post, I wrote about Robin Wall Kimmerer’s book, Braiding Sweetgrass: Indigenous Wisdom, Scientific Knowledge, and the Teachings of Plants and its concept of reciprocity: the biosocial bond between human and non-human life created by the giving and receiving of gifts and the gratitude and obligation it engenders. I wrote, too, about Kimmerer’s project of restoring healthy relations between human and non-human nature as re-story-ation. “Stories,” Kimmerer writes,
are among our most potent tools for restoring the land as well as our relationship to land. We need to unearth the old stories that live in a place and begin to create new ones, for we are storymakers, not just storytellers. All stories are connected, new ones woven from the threads of the old (341).
Back in the 1980s, physicist and cosmologist Brian Swimme called on artists, poets, mystics, and nature lovers to tell a “Cosmic Creation Story” that connected the big bang, the Gaian emergence of atmosphere and soil, cell symbiosis, the music of Mozart and other works of human genius in a continuous line of creative adventure. More recently, in his 2021 book The Nutmeg’s Curse, Amitav Ghosh argues that lasting solutions to our various crises must be rooted in “a common idiom and a shared story—a narrative of humility in which humans acknowledge their mutual dependence not just on each other, but on ‘all our relatives'” (242). These are only two examples of many who believe, as Kimmerer does, that new and repurposed stories are crucial to disrupting our current trajectory of collapse and to building a constituency for transformation.
Kimmerer’s background gives her access to a storehouse of Indigenous myth to draw on and interpret for present times, but she warns against the “wholesale appropriation” of Indigenous stories, advising that “an immigrant culture must write its own new stories of relationship to place” (344). On the other hand, I seem to remember at some point in the book—I couldn’t find the location in my notes, and, unfortunately, Braiding Sweetgrass includes no index—Kimmerer reminds her readers that all peoples have a past to draw on that is indigenous to somewhere. The implication is that even the immigrant community she is referring to must have some claim—however attenuated, however muddied by subsequent adjustments—to pre-modern myth and to a time when human groups lived more reciprocally with, as Ghosh puts it, ‘all our relatives.’
It was with all this in mind that I saw David Lowery’s 2021 film, The Green Knight, much lauded by critics but not, I think, very widely seen. A few of those critics remarked on the film’s ambiguity, noting that two viewers might walk away with very different opinions as to the film’s topic and message. Be that as it may, I’ll continue with the story.
A year passes, and to make good on his promise, Gawain journeys to find the Green Knight. As Christmas Day approaches, he comes upon a castle where he’s welcomed by a Lord and Lady. The Lord encourages Gawain to stay a few days and soon proposes a playful exchange. The Lord will hunt during the day and bring back whatever he kills as a gift to Gawain. In return, Gawain will stay in the castle with the Lady and her maids and will give to the Lord whatever good things he receives there. Gawain doesn’t quite understand the point of the proposition, but he agrees.
The hunt, the castle—these episodes make up the bulk of the poem. Like many literary works from an oral age, they play to a wide audience. The descriptions of the hunt are vivid and exciting. They are all about danger, daring, and action. The castle passages are of a different order. The Lady is intent on seducing Gawain. Will he be virtuous and rebuff—and perhaps insult—the Lady? Or, given the likelihood that he will soon be beheaded by the Green Knight, will he take a pleasure offered while he can?
These aren’t the only dilemmas. If Gawain does accept what the Lady is offering, will he make good on the agreement with the Lord and return the same to him? In these castle passages, we have a moral dilemma; we have the ancient question of how one should behave knowing that death is unavoidable and that one’s time on earth may be cut short at any moment. We also have the components of farce.
The director David Lowery makes a few wise adjustments and additions to this basic story. Foremost, he makes Gawain a young man, not yet a knight, a lazy youth who has skated along on his privilege. His story, therefore, becomes a bildungsroman, his journey about the formation of character. That gives the story an arc less discernible in the poem. Without losing the framework, Lowery also keeps a stricter control of tone, emphasizing the existential qualities and deemphasizing the low comedy.
But let’s think about the story in Kimmerer’s terms. First, it’s a tale from the immigrants’ indigenous tradition, reaching back to the early medieval period, with components—particularly, the exchange game motif—that surely go even further into the past. Second, the two games Gawain is involved in—one with the Green Knight, the other with the Lord—are inverted versions of each other, each dramatizing the matter of reciprocity. Both the poem and film render the Golden Rule in ways that transform it from a platitude into a problem of giving and receiving. As a human being in the world, within a household, within a wider ecology: Are you willing to take as you give? Are you willing to give as you take? Are you willing, in other words, to truly engage with others in a reciprocal relationship?
Braiding Sweetgrass is a work of cultural ecology, and my discussion of the film is in the same spirit of critique. What I mean is, Kimmerer’s survey of our “broken” ecology, our bad faith relationships within the web of life, aims mostly at the realm of culture—questions of how stories and ideas shape our account of ourselves as living beings within a living world. Traditionally, cultural critiques have themselves been criticized as shallow, facile, romantic, quietist, anti-modern, and, for these reasons and others, too easily recruited by the forces of political reaction. Whatever the original sources for the Green Knight story may be, the poem which has come down to us certainly bears the marks of a patriarchal, Christian society. On the other hand, the nature of myths is that they can contain the materials for their own subversion. This paragraph, in other words, means to raise a number of new questions. But this post is long already, so for now at least, I’ll leave it here.
A version of this essay first appeared in Society for US Intellectual History.
Swimme, Brian. “The Cosmic Creation Story,” in The Reenchantment of Science, ed. David Ray Griffin (Albany, NY: SUNY Press, 1988), 47-56.
At some point, intellectual historians will have to reckon with the phenomenal success of Robin Wall Kimmerer’s book, Braiding Sweetgrass: Indigenous Wisdom, Scientific Knowledge, and the Teachings of Plants. When they do, they may place it among the most important works of its kind, up there with Walden, say, or Silent Spring. Now is probably not the time. First published in 2013, it is at this writing number three on the New York Times bestseller list of non-fiction books in paperback, a list it has appeared on now for 131 weeks.
What accounts for the book’s success? Certainly, a genre exists for lyrical nature writing. But it appears that Braiding Sweetgrass has crossed over to a wider audience. In the midst of this era of multiplying, accelerating crises, there is something emotionally stabilizing about Kimmerer’s book, and I think that can be attributed to her central concept: reciprocity.
Kimmerer is a professor of botany, trained in universities and mainstream science. But her concept of reciprocity comes from her background as a member of the Citizen Potawatomi Nation and her training in Traditional Ecological Knowledge (TEK). In the creation story with which she begins the book, Skywoman falls from above in a beam of light. The creatures in the darkness catch her, care for her, make for her a home of mud, and she reciprocates with the bundle of seeds she carries in her hand. In this way the earth was made, “not by Skywoman alone, but from the alchemy of all the animals’ gifts coupled with her deep gratitude” (4). An alchemy of reciprocal gifting, in other words—you receive a gift, you’re grateful, and you give a gift, and that creates a bond. Kimmerer later gives this a more systems science description: new arrangements are created, old ones transformed, by a joining of “obligate symbiosis” (343).
Indeed, Kimmerer is braiding together a kind of intellectual symbiosis between TEK and a systems science-informed biology that pushes back against conceptions that have long persisted in popular thinking about life and how it works. We are used to the idea of human life as essentially a struggle against a hostile environment. We are used to the idea that what is exceptional about human beings is that they are inherently ‘out of balance’ with their environment, that some sort of parasitic selfishness is the essence of human nature. In contrast to this, Kimmerer encourages readers to imagine what “beneficial relations” between humans and their environment “might look like” (6). What she describes is less a struggle than a kind of letting go. “A gift comes to you,” Kimmerer writes,
through no action of your own, free, having moved toward you without your beckoning. It is not a reward; you cannot earn it, or call it to you, or even deserve it. And yet it appears. Your only role is to be open-eyed and present. (23-24)
This passage comes from a chapter that draws on Lewis Hyde’s 1979 book, The Gift: Imagination and the Erotic Life of Property. But I’ve heard the theological concept of grace similarly described. Certainly, the familiar transfer of religious ideas to ecological thinking is present in Kimmerer’s language. From the Indigenous perspective, of course, there was never a division between the two. Her TEK is as spiritual as it is empirical.
My father-in-law, wholly secular, a mathematician, used to say that for him a walk in the forest was a sufficient substitute for church. That’s not an unusual sentiment. Familiar too are claims, expressed by many, that they ‘love’ nature, that they love the natural world. The idea of reciprocity provides for the rarer situation; it provides a way to conceive of nature as, in Kimmerer’s words, receiving people’s love and loving people back (122). If you’re going to think of yourself in a reciprocal relationship with an ecosystem in the way Kimmerer means it, you’re going to have to allow yourself to think about ecosystems as having spirits—or minds—of their own. You’re going to have to partake in some animism.
That’s asking a lot. For many secularists and religious folk alike, that’s taking a step onto uncomfortable ground. “How, in our modern world,” Kimmerer asks, “can we find our way to understand the earth as a gift again, to make our relations with the world sacred again?” (31).
She answers by telling stories. Some are personal stories about her experiences with her children in the garden or with her students in the woods. Others, like the story of Skywoman, are Indigenous myths, repurposed for the present day. For Kimmerer, these myths are important and relevant to our moment because they come from a time when people could still hear and interpret the teachings of “other species,” especially plants. The wisdom of plants, she writes, is
apparent in the way that they live. They teach us by example. They’ve been on the earth far longer than we have been, and have had time to figure things out. They live both above and below ground, joining Skyworld to the earth. Plants know how to make food and medicine from light and water, and then give it away.
Plants know, in other words, how to live reciprocally, and “we need to learn to listen.”
Behind this prescription is a direct case for narrative as the primary method of conveying the foundational ideas that shape a society’s imaginary. Drawing on Gary Nabhan’s construction, Kimmerer describes her project of ecological restoration as “re-story-ation.” Our relationship with the land we live on is “broken” because the dominant story we tell about it is in error. That story was brought by the immigrants from Europe to justify their domination, and it continues to inform our institutional structures and shape our responses to crisis (9-10, 31).
Kimmerer is not alone in this viewpoint. Agreement about the source and the time of the wrong-turning is widespread among those scholars concerned with ecological breakdown, mass extinctions, global warming—those scholars who see these matters as the meta-crisis, the broader habitat, so to speak, in which our political and social ills are nested, nurtured, and grow.
Writing these words brings to mind a video clip presented at one of the hearings on the January 6, 2021 attack on the Capitol. This clip showed long-time, right-wing Republican Party operative Roger Stone taking the official oath for some militant organization. “I am a Western chauvinist,” Stone vowed, “and I refuse to apologize for creating the Modern World.” The way this militant group has chosen to articulate and perform the debate is crude, narrow, and toxic. Still, the group’s general reading of the history aligns, at least in broad terms, with those scholars who associate the de-legitimization of animist belief systems and the advancement and institutionalization of an extreme dualism with the rise of the modern world and the philosophies of the West.
Kimmerer, for her part, defends the practice of Western science. She defends its practitioners, for whom the actual work of science—of “revealing the world through rational inquiry”—is an “often humbling” and “deeply spiritual pursuit.” But Kimmerer makes a distinction between scientific practice and “the scientific worldview.” The latter uses the products of science and technology “to reinforce reductionist, materialist, economic and political agendas.” The scientific worldview is destructive because it sustains “the illusion of control” and “the separation of knowledge and responsibility.” Kimmerer’s “dream” is that “the revelations of science framed with an Indigenous worldview” will lead to “stories in which matter and spirit are both given voice” (345-46).
Who will tell these stories, Kimmerer asks (9). For me, someone who has invested so much time and much of his living in hearing, reading, and thinking about stories, that’s the intriguing question. What are these stories and where are they? Who will tell them? Who is telling them?
A version of this essay appeared in Society for US Intellectual History.
When the semester is in full swing and non-work-related reading time is shrinking, a service called Audm, which shows up on my pod-catcher, delivers recorded readings of articles from publications such as The New Republic, The New Yorker, and The New York Times Magazine. Now I can “read” while driving, cooking, and doing chores around the house.
A recent selection was a February 2022 essay by Arthur C. Brooks, called “How to Want Less,” from The Atlantic. The content could not have better fit the occasion: Brooks’ piece was not so dense as to make multi-tasking impossible, yet it was strewn with a wide range of references from the history of ideas. Sartre, Buddha, Aquinas. The “hedonic treadmill.” The Tao. How nice to make use of one’s education while also prepping for dinner. Brooks writes about his daughter in college. I have a daughter in college, too.
Granted, there’s not much new in Brooks’ argument. Human beings are wired to seek satisfaction, and yet the very same wiring makes satisfaction impossible to achieve. “We crave it, we believe we can get it, we glimpse it and maybe even experience it for a brief moment, and then it vanishes.” It’s “the greatest paradox of human life.”
Before his daughter went away to school, Brooks explained the situation to her, and she found the prospect gloomy. He assured her that happiness was nevertheless possible. We have millennia of human wisdom to draw upon, the intellectual work of those who’ve confronted the paradox, artists and philosophers, scholars and social scientists, and sages both religious and secular. If inner craving is unquenchable, we can achieve a kind of happiness via “free will and self-mastery” and the help of ideas.
Good advice! “The key to life is a low overhead” has been my preferred version of the basic maxim. Fewer wants lessen the need to labor for wages. Yet there was something in Brooks’ construction that, well, left me a little unsatisfied.
One of the reasons Brooks’ presentation is so familiar is how heavily it leans on “evolutionary biological imperatives.” Brooks calls on the age-old struggle of mind over matter, intelligence versus “several millions of years of evolutionary biology,” “a battle against our inner caveman.” This way of framing the paradox emphasizes the separation between mentality and physicality so favored by modern Western thought. A common thread among many scholars of the ecological imagination is that this “great separation” is at the root of our interrelated emergencies: environmental, economic, social, and political. Do Brooks’ constructions work to reinforce the mind/body, modern/primitive separation? Do they reproduce the dominant, underlying conception that nature and the body are governed by blind, material forces and that the human mind is an immaterial ghost that longs to break free from the physical?
I don’t know—it feels like a quibble. Life couples symbolic and material systems, and there is a real distinction between them, despite how interrelated they are. And certainly, plenty of thinkers within the ecological tradition find a relationship, as Brooks does, between freedom and self-restraint. Prince Kropotkin, Tolstoy, William Morris, Gandhi, Lewis Mumford, and Paul Goodman are among those Theodore Roszak includes within a “subterranean tradition of organic and decentralist economics” in his 1973 introduction to Small is Beautiful by E. F. Schumacher. Of course, Schumacher could be included on the list, as could Ivan Illich and Cornelius Castoriadis. Similarly, and more recently, Giorgos Kallis, degrowth advocate and author of Limits (2019), writes that freedom is “not the unobstructed pursuit of desires.” It is the “conscious reflection on” and “mastery” of them (105).
But mastery. Would it be nitpicking to take pause at that word, too? In his brilliant 2021 book, The Nutmeg’s Curse: Parables for a Planet in Crisis, Amitav Ghosh ties the great separation to the emergence of modern geopolitics. As they expanded their imperial projects around the globe in the sixteenth century, western nations sought to claim possession of territories and monopolize markets, and to do so, they claimed possession of and monopolized the faculties of thought and intention. Entities traditionally considered mindful, with subjectivity and the ability to act on intention—whether they were plants, animals, persons, or ecosystems—were reconceived as mindless, soulless, “inert.” This laid the groundwork for their exploitation. Animist or vitalist philosophies had to be stamped out at home and abroad. Groups and peoples who attributed mind to non-human life were conceived as demonic or as primitive. They were hunted down as witches, exterminated in genocidal attacks. Nature and people were subordinated. Nature and people were “mastered.”
Do we pay something of the imperial mindset forward when we use such terms? And if “mastery” is tainted, wouldn’t “self-mastery” be tainted, as well? What would be a more suitable alternative?
My first thought was “self-care,” just because there has been so much talk about it since the pandemic. We were urged to let ourselves off the hook when it came to the demands of the job, to quiet the interior voice that scolds us into pushing through the stress. I’m a supporter of the concept. I’m favorable to policy ideas, some of them coming from the degrowth community, to reduce the length of the workweek. How much could we decelerate the world’s metabolism, not to mention our own hedonic treadmills, if we all knocked off early and devoted the extra hours to restorative rather than productive activities?
“Self-care” isn’t bad, but I might prefer “self-artistry.” If I have faith in anything, I suppose it’s the faith that we can trust the ways and levels at which art can communicate, inside and outside of consciousness, and that it’s in these ways and at these levels that the foundational change we need most can occur. We’re lucky to have those special ones, whose artistry in some medium, whether god-gifted or doggedly earned, seems to allow them to communicate in the ways I’m talking about as easily as exhaling a breath. But that’s no reason to surrender the field completely to them.
Brooks misses his daughter. They were close. I’m not sure I read this word-for-word in the essay, but I can imagine how, before she went away to school, he would try to share his hard-earned knowledge with her, perhaps to help her prepare for adulthood, and how she’d undercut the pedantry with a look. Lately, they’ve taken to texting each other a photo every once in a while—message-free, just a random image from one of the day’s quiet moments. One might speak of how mindfulness, a certain quality of presence, can touch a small gesture like this with meaning. But Brooks says the practice makes him happy, and that’s easy enough to understand.
I’m not very artful with my flip phone. But lately I’ve learned how to take a photo with it and attach the photo to a text.
A version of this essay was first published in the Society for US Intellectual History Blog.
Quiz scores and discussion board posts suggest that my students appreciate those items on the syllabus that can be watched or listened to rather than read. The Uncivil and Throughline podcast series have been rich sources of materials. Popular with students and with the instructor, too, was the Throughline episode on Billie Holiday, her nemesis, federal narcotics cop Harry J. Anslinger, and her iconic song, “Strange Fruit.” Of course, I knew the song, and I was aware of the 2001 David Margolick book on the topic. But I didn’t know about Anslinger’s Javert-like persecution of Holiday, which he conducted from the time of her first performances of the song to the time of her death in the Metropolitan Hospital in New York.
The story is wrenching; it packs a punch. Placed on a timeline of African American history, it resonates contextually and foreshadows things to come. It illustrates the racist character of the wars on drugs of later decades. Like others, Holiday acted in a wartime and postwar ideological climate that offered leverage in the struggle for Black freedom. Her insistence on performing “Strange Fruit”—despite how uncomfortable the song made some feel, despite how ‘divisive’ some considered it to be—was a resolute stand against Jim Crow. Not only did she serve time for taking this stand, she also put her career and her very life on the line. These were lonely choices that made the way less lonely for those who made similar choices in the decades that followed.
As a bonus, we have the artifact, the song itself, indestructible, reproducible, and experience-able—synchronously with others—in about four minutes’ time.
Only recently was I able to appreciate another of Holiday’s signature songs, “God Bless the Child.” Of course, I knew this song, too. My first exposure was almost certainly to the Blood, Sweat & Tears version, a top-ten hit in 1969, which I heard on the radio growing up. Something about that version made me tune the song out. Maybe it was the exuberant arrangement and vocal performance, which overshadow the lyrics. Maybe I was too young to take them in. In any case, I apparently took little notice of “God Bless the Child” whenever I heard it covered by others—or even when I eventually heard Holiday’s own recordings.
Then, not long ago, I came across a version on a “lost” Bobbie Gentry album, also recorded in the late 1960s. After failing to follow up on the massive commercial success of “Ode to Billie Joe” (1967), Gentry was taking a stab at becoming a jazz chanteuse, “a sort of Mississippi Julie London,” as the writer of the liner notes puts it. She planned an album of standards and interpretations of recent pop hits, and included “God Bless the Child,” but the project was abandoned, and for five decades, these recordings were unavailable. Hearing Gentry’s version, somber, minimally arranged—perhaps because it was never finished—I grasped the subtlety of the chord progression as if for the first time. For the first time, I grasped the idea of the song.
Holiday, with co-composer Arthur Herzog, Jr., contributed that idea. As the story goes, she’d gone to her mother for a loan and an argument ensued. Holiday crafted her mother’s refusal into the now well-known lyric:
Mama may have
Papa may have
But God bless the child that’s got his own
In his book about degrowth, Less is More, Jason Hickel uses a phrase that I mentioned in my last post, “theory of being.” I’m a little wary of the term. The concept Hickel is referring to is one many scholars find necessary to express and find a variety of ways to do so. Worldview. Mindset. Fundamental orientation. Ethos. Belief system. “The way we make sense of the world.” The systems theorist Gregory Bateson favored the term epistemology, which he used to emphasize the essential links he identified between ways of knowing, ways of learning, and ways of being or behaving. It’s hard to nail the concept down, in other words, and it takes one pretty easily into the weeds. But it seems to me that ultimately, a worldview, a mindset, a theory of being is a working vocabulary or set of metaphors used at a semi-conscious level to account for the operations of reality. It tells us who we are, where we are, how we are, why we are.
Hickel uses the term theory of being when discussing the rise of capitalism in early modern Europe. That rise, Hickel argues, required not only violence in the form of enclosure and the manufacture of scarcity but a change in the way people understood themselves as beings in nature. That change evolved and intensified until it came to dominate across a broad society, and it reproduces itself in the forms and structures of that society, both symbolic and material. It implies itself everywhere, even in songs. As Hickel and many others see it, the dominant theory of being in our society is mechanistic, atomistic, and reductionist. This theory holds that individual self-interest is primary and that all other life is exploitable to that end. These ideas disrupt all kinds of relations, including those between parent and child, so that in a society where it dominates, even a child must have “his own.”
If you analyzed every song for its theory of being, you’d run into trouble pretty quickly. I suspect the analysis would not be very productive. But what do you do with a song like “God Bless the Child,” a song that makes its “this-is-the-way-of-things” statement so plainly?
The theory of being expressed in “God Bless the Child” has a long pedigree in popular song. One of the very first American hits, at least as we understand the word today, was the song “Nobody,” recorded in 1905 by the renowned vaudeville performer Bert Williams, a mixed-race West Indies immigrant, a Black man who performed in blackface. The song comes out of a period of intensive, polyglot urbanization, and refinements in the edifice of racial caste. Its lyric expresses extreme social and economic anonymity. Most relevant to my topic is the sentiment expressed in the refrain:
I ain’t never done nothing to nobody,
I ain’t never got nothing from nobody, no time
And until I get something from somebody, sometime,
I don’t intend to do nothing for nobody, no time
Revisiting this song, I couldn’t help but think about a recent read, Robin Wall Kimmerer’s 2013 bestseller, Braiding Sweetgrass. The book juxtaposes an Indigenous American’s theory of being with the dominant European import. One of Kimmerer’s key concepts is “reciprocity.” Reciprocity represents the loving exchange of gifts between organisms that constitutes the symbiotic relationships from which ecosystems emerge and flourish. “The breath of plants gives life to animals and the breath of animals gives life to plants,” she writes, in one especially fundamental expression of reciprocity in nature. “My breath is your breath, your breath is mine. It’s the great poem of give and take, of reciprocity that animates the world” (344). In stark contrast to this, the singer in “Nobody” gives nothing, gets nothing. This organism exists in a state of no relation with the rest of the world. When scholars such as Hickel describe our society’s dominant theory of being as atomizing, this is pretty much what they mean.
A related, more pointed sentiment is expressed in the middle eight of “God Bless the Child”:
Money, you’ve got lots of friends
Crowding ’round your door
When you’re gone and spending ends
They don’t come around no more
It is the way of things, the song says, that friendship is purely transactional. To put this in Kimmerer’s idiom, transactional relationships are those starved of the natural history of symbiosis that constitutes reciprocal relationships; they are monotone, stripped of the complexity and mutual interests of a thriving ecosystem. When scholars such as Hickel describe our society’s dominant theory of being as reductionist, this is one aspect of what they mean.
“When you have money, you have friends” is an American pop song trope, surely. It signals the sort of hard-nosed cynicism expressed commonly in blues. It’s a pose, a posture, a self-protecting, self-atomizing tactic struck often in the performance of songs, including performances of “God Bless the Child.” That we often call this cynicism “realism” indicates how dominant the dominant theory of being is. What thinkers such as Kimmerer are proposing is that the theory of being is the root of things, that this is where the kind of long-term change we need today must take place.
To my ears, the Blood, Sweat & Tears version leans toward cynical posturing. The effect is to endorse or reinforce the dominant theory of being, to reproduce it once again. But what about the Bobbie Gentry version, or, let’s say, the 1962 Lou Rawls version, which probably influenced Gentry, a big Rawls fan? These performances sound to me more like objections, protests, not against something as specific as lynching, but against the dominant theory of being itself.
They indicate a troubling tension. Many aspects of our lives are characterized by reciprocity—within our families, among our friends and close communities. But there is no denying the destructive influence the dominant theory of being has over these relations, over the way we see the world, and how we exist in it collectively.
An earlier version of this essay was posted on the Society for US Intellectual History blog.
In his new book, The Web of Meaning (New Society, 2021), Jeremy Lent writes about the Buddhist concept of “dependent origination,” which describes how “all that arises depends on everything that came before it” (253). I hope I’m not doing too much damage to the concept by understanding it to mean that everything originates from and exists within a context of relations, which for historians, it seems to me, is doctrinal. How could the discipline do what it does without a belief in this concept?
And not historians alone. One of the greatest challenges to storytellers of all kinds is figuring out where to begin. Struck by the insight that he’d begun his story too early, Hemingway famously deleted the first chapter or two from The Sun Also Rises, and the novel was probably better for it. On the other hand, I remember enjoying the long, stage-setting prologue of Richard Russo’s Empire Falls a good deal more than I did the remaining chapters. The point is that all stories have roots that wind back indefinitely; all narratives begin in medias res.
Non-fiction books of persuasive prose have their own ways of setting an argument in context. “Capitalism: A Creation Story” is the title of the first chapter of economist Jason Hickel’s 2020 book, Less Is More: How Degrowth Will Save the World. In forty swiftly moving pages, Hickel surveys 500 years of history, drawing on Marx, Weber, Polanyi, the Frankfurt School, and E. P. Thompson, as well as more recent feminist takes on the rise of the modern: Carolyn Merchant’s 1980 classic, The Death of Nature, and Silvia Federici’s Caliban and the Witch (2004).
Hickel’s story is set in Europe, and it has two parts. The first part begins with the peasant revolts in the thirteenth and fourteenth centuries. Elites responded with enclosure. Shared or autonomous spaces, relatively abundant, were violently appropriated, made to be scarce, and those who had drawn livings from them were now forced to compete over the ever-shrinking remainder. Colonization repeated the pattern overseas; slavery applied it to human bodies. At first they called the result “improvement.” Now they call it growth, and it’s the justification and overarching goal of all macro-economic activity.
Simultaneous with the West’s transition in economics was a change in its “theory of being” (31). This is the second part of Hickel’s creation story. I was more familiar with this one. He calls it “the great separation.” Europeans came to think of themselves as separate from nature; the mind came to be seen as separate from the body. Bacon and Descartes are identified as major perpetrators, though (due to dependent origination) they were only mining veins of ore running back to Plato and beyond. Bacon’s desacralized nature and Descartes’ dualism replaced the ages-old animist position, which recognized the embeddedness of spirit in matter and saw the whole of nature as intelligent and alive.
This theory’s historical function, however, was to justify the thingification of land and bodies that the enclosure pattern necessitated. The logic of dualism allowed humans to monopolize mind, to see themselves as supreme over the non-human world, and the result was unrestricted plunder. The survival of the fittest. The selfish gene. Homo economicus and the management of scarcity. None of this is natural, Hickel argues. It’s “the product of five centuries of cultural re-programming” (74).
Having set the stage, Hickel devotes the remainder of his book to arguing the need for a postgrowth economics, to explaining the degrowth concept in detail, to defending it from misunderstandings and criticisms, and to describing alternatives to the growth imperative and the path ahead. This is, of course, the meat of the book, and its chapters work as intended, but my heart was in the backstory.
Jeremy Lent’s The Web of Meaning, like its predecessor, The Patterning Instinct (2017), elaborates on many aspects of the great separation, taking the story back to prehistory and the evolution of the human brain. The first separation, one might say, was the development of the prefrontal cortex. Humankind’s awesome powers of cooperative creativity and destructiveness were born in the split between what Lent calls animate consciousness and conceptual consciousness.
I read Lent’s books slowly and with pleasure, pausing at almost every page to examine the extensive footnotes, because he weaves these backstories together with all my other special interests: the intellectual history and contents of systems science, support for its insights in social scientific research—including the postgrowth economics of scholars such as Hickel—and the traditional environmental knowledge of indigenous peoples around the world. To this not unusual merger, Lent adds something of his own: the correspondence between the ecological imagination, which is supported by scientific insights developed mostly in the twentieth century, and Neo-Confucianism, a philosophical synthesis of traditional Confucianism, Buddhism, and Taoism initiated some eight centuries earlier, during the time of the Song dynasty.
The Song-era philosophers grappled with a problem that had concerned the Greeks: the persistence of form in the midst of flux. Drawing on the traditional thought of the I Ching, they understood that “the entire universe is comprised of a dynamic flow of energy and matter called qi (pronounced chee).” As they pondered this dynamism, “they realized that while everything was composed of qi, the principles by which the qi was organized were just as important. The word they used for these principles was li, which originally referred to the swirling patterns visible in a piece of jade.” For Lent, li is a key concept, and like dependent origination, one at home in the ecological imagination. When the Neo-Confucians wanted to understand the li of a plant, Lent explains, they investigated “its relationship to everything else around it: the soil, other plants, the weather, its own history and the broader context of space and time beyond the plant’s immediate environment.” Their investigation of li led them to “a deeply integrated understanding of how humans relate to the natural world, how core values arise from human embeddedness in nature and how there is no ultimate distinction between what is material and spiritual” (The Web of Meaning, 93-94).
Throughout our history, Westerners have turned to Eastern thought from time to time for alternative, corrective wisdom. Lent is doing more than that. He’s laying out a backstory of world-historical proportions. He identifies the divergence between West and East in the Axial Age. Lent acknowledges the advances in moral philosophy across civilizations that we associate with this period. Nevertheless, it was during this time that the West constructed perception on the basis of “a split cosmos, a split human,” while the Far East built on the notion of a “harmonic web of life.” In short, Confucius and the Taoists went one way while the rationalists, monotheists, and religious dualists went another. (See Part Three of The Patterning Instinct.) That set the stage for the great separation of the early modern West, which in turn set the stage for our present crisis. All this backstory serves a purpose. Lent sees contemporary systems science and Song-era Neo-Confucianism as two pillars, set widely apart in history, that together might support a world-cultural transition to an ecological civilization, a second axial age.
One is rightly wary of grand narratives that serve to organize centuries of data into useful interpretations. On the other hand, we currently find our society on several trajectories of cascading collapse, and the collective action necessary to respond requires a basic orientation. To put it simply, we need a story that tells how we got here, one that provides for a radical change of course. More than that, the story requires widespread and immediate dissemination. I find myself wondering where and how that dissemination would take place. I’m an educator. Should I be teaching The Great Separation, and if so, where? In what department or division does it belong, under what prefix and course number? Should Western Civ be brought back into the core curriculum, this time with a major tonal shift? Maybe Economics is the best avenue, but from the perspective of Ecological rather than Classical Economics, and with short, accessible books by Jason Hickel or Kate Raworth as the texts. Or should there be some interdisciplinary course that mixes systems science, economics, and the history of ideas?
And here’s a related question. If certain partisan factions are upset to the point of violence over stories in our schools that spotlight white supremacy, how would they react to stories that spotlight human supremacy, which goes back further and is even more foundational than is our civilization’s structurally embedded hierarchy of race?
A version of this essay was published on the Society for US Intellectual History blog.
Once upon a time a series called Game of Thrones was broadcast on the television machine. It was one of those remote and long-lost events that occurred before The Great Scrambling, during that vast stretch of time reaching back to the first civilizations, which historians know as Antiquity.
“Wait!” a reader may object. “I watched that series and remember it well.” The reader would be technically correct. But that just goes to show that a) history is not an exact science and b) some passed through the Great Scrambling with clearer heads. Others of us were too busy surviving the reign of a deranged child-king and the invasion of zombie hordes from the Global North to recall a televised fiction with similar plot elements.
Some archival work was necessary to piece together this past. Here are a few of the items that I uncovered.
1. On April 25, 2015, at the White House Correspondents’ Dinner, an early connection was made between Game of Thrones and contemporary American politics. President Obama was at the mike along with “Luther,” his “Anger Translator,” played by comedian Keegan-Michael Key. The bit allowed the President to lightly mock the press, thanking them for “two weeks of non-stop Ebola coverage,” and for a series of predicted “Obama’s Katrinas,” none of which came to pass. Comfortable as he neared the end of two terms, Obama/Luther mentioned the 2016 election, still nineteen months away. He named the most likely candidates and paired each with a joke. The Jeb Bush joke was mild, the Cruz joke skewering. It was the Hillary Clinton joke that made reference to the popular HBO series. Luther, strutting, his fingers interlaced across his chest, spoke of Clinton’s fearsome ability to raise funds. “Khaleesi is coming to Westeros!” he said.
To this, I’ll just add a few comments. First, Game of Thrones was by this time in its fifth season, and all the pieces of its endgame were in place. Dany Targaryen—Khaleesi of the Dothraki, Mother of Dragons—would travel across the Narrow Sea with her armies to claim the Iron Throne. It was hers rightfully, that is to say, by blood, but she had also earned it. She had grown into a strong leader, become a champion of the enslaved. She had passed through the fire, literally. What is more, the show’s audience was behind her. How great that a woman, long abused and underestimated, would rise in this man’s world and bring to it a higher politics!
Second, and this hardly need be said, the joke hinged on an association in the popular mind between Dany’s story and Clinton’s. There were those of us who might have aligned more closely with the progressive wing of the Democratic Party in terms of policy but who really wanted to see a woman president. I don’t know what to say about this. The cells in my brain where this thinking existed have since been replaced by scar tissue.
Finally, judging by the available video, the joke received a mixed response from its audience. Why? Was the reference too obscure? Professionals working at the top of their fields have less time for television, supposedly. Did this perceived signal from Obama himself that Clinton was the presumptive nominee make them shift uncomfortably in their chairs? I won’t hazard a guess. This is history, after all. Most of the data is unrecoverable.
2. On April 11, 2016, a Game of Thrones think-piece by Clive James was published in The New Yorker. Although the primary season was well underway, James drew no connection to the political scene. His goals were loftier: What explained Game of Thrones’ immense popularity? James had been a reluctant viewer whose ability to suspend disbelief was typically restricted by “a total embargo of dragons.” But like so many others, he soon overlooked the trolls, giants, magic, and all the other trappings of fantasy fiction, and got caught up in the moral battle. “The whole thrust of the show,” James wrote, “is to give us a world where the law is not yet formed,” a world governed by nothing but “the lawless interplay of violent power.”
Game of Thrones’ power players were ruthless, as was the narrative itself, which proved its own ruthlessness by its willingness to kill off its purported heroes. “Everyone in the show is dispensable,” James wrote, “as in the real world.” The one exception, James explained, was Tyrion Lannister, the character played by Peter Dinklage. He was
the epitome of the story’s moral scope. His big head is the symbol of his comprehension, and his little body the symbol of his incapacity to act upon it. Tyrion Lannister is us, bright enough to see the world’s evil but not strong enough to change it.
James captured much that was true about Game of Thrones, but his conclusion was hasty. A full year had passed since the White House Correspondents’ Dinner, and only now were new episodes in release. The series’ endgame was still in play. Evil might not yet triumph. James claimed that the appeal of the show was its “raw realism,” and certainly, the ruthlessness of the storytelling raised the stakes. But what kept people watching wasn’t a consensus that this rawness was real; it was the expectation that, in the end, viewers would be reassured that it wasn’t.
Is this really the way the world is? That was the question the show asked over and over, and that it would keep asking until the end.
Many are tempted to answer yes, that life is nothing but a nasty scramble for domination, with a few big winners on one side and the rest of us on the other. The evidence in support of this answer is overwhelming, and mainstream science provides a basic rationale. In a 2018 essay collection, titled What Are We Doing Here?, novelist Marilynne Robinson summarized this rationale: humans were “locked in a perpetual cost-benefit analysis, unconsciously guided by a calculus of self-interest somehow negotiated at the level of genome.” We were deceived to think otherwise, the science claimed. But the question is contested. Robinson’s essays contest it. Lots of people think otherwise. This is what I mean by the appeal of Game of Thrones. We thought otherwise and had to watch to the end to find out if we were deceived.
3. On June 1, 2017, George R. R. Martin, the author on whose series of novels Game of Thrones was based, was quoted in Esquire, saying, “I think Joffrey is now the king in America.” Joffrey was the show’s first and most despised villain, a disturbed boy installed unexpectedly on the throne. Vainglorious and cruel, Joffrey loved and even craved power because it shielded him from seeing himself as he really was: a coward, a bully, a bore, and a creep. He was also weirdly blond.
It doesn’t matter that Joffrey had been killed off in season four, way back in 2014. A direct comparison between this fictional creation and the former US president seems almost too obvious to make. Still, I have no recollection of making it at the time, not even as baby-Trump-in-diaper balloons began to show up at protest gatherings around the world. Trauma does funny things to the brain. It can disturb the most elementary of calculations; it can scramble the very experience of time.
4. Between April and May, 2019, the final season of Game of Thrones was broadcast to widespread disappointment. Part of this can be explained by delays in production and a decline in quality. The producers had been running ahead of Martin’s unfinished novel series for some time, and now there seemed an absence of vision as to how to conclude. All the time the narrative had invested in characters and their relationships was felt to be squandered in a headlong rush to the finish.
Disappointment, too, coalesced around the ending. Khaleesi came to Westeros, but she wouldn’t win the throne, after all. Those who thought Dany’s story was about the feminization of power were let down to find her representative of something else, something more familiar. A Robespierre-type, or perhaps a Stalin, Dany got paranoid, went crazy with her dragons, and started setting the populace on fire. Why end a story with so many interesting, varied, and powerful women on such a note of classic conservatism? Dany was supposed to be the anti-Joffrey, and this turning of the tables left many confused, especially since, in our actual lives, we were suffering under the government of the paranoid fringe.
No, the winner of the Game of Thrones wouldn’t be Dany, or Sansa, or the indispensable Tyrion, or even Jon Snow, but instead—and unexpectedly—the younger Stark brother, Bran. Bran, an adolescent in the show’s first season, was left paralyzed from the waist down after an attempted murder. Physicality was unavailable to him, so he had to train his intellectual gifts. The human reality of Game of Thrones might have been one of naked self-interest, but beneath this thin surface was an animistic world of forest sprites and non-human wisdom. Bran plugged into those enchanted depths. Tutored by some sort of tree man, Bran learned to see the past, the future, and everything in between. Because he could not focus on every instant simultaneously, he had to steer his attention from one data source to another, and then to visualize it, to form an interpretation. His powers—unwieldy, imprecise—were those of curating the endless texts libraried inside his head. Bran was, in short, the Humanities.
Or to state it more broadly, he represented the academy—that complex apparatus by which the record of human experience and understanding is remembered, organized, sheltered, passed down, disseminated, and applied to conditions of ever-present change. In the Borges story, “Funes the Memorious,” the boy Funes is thrown from a horse and paralyzed. He develops “infallible” perceptive abilities and a perfect memory. Borges understood the importance of the academy. He also understood its fragility and the near-impossibility of its task. “My memory, sir, is like a garbage heap,” Borges has his character say. He is cursed to live the abolishment of the general by the specific in all its exponential abundance. “His own face in the mirror, his own hands, surprised him every time he saw them.”
“The Great Scrambling,” like any period marker, is an attempt to chart change amidst the bombardment of events. Are things different now than they used to be? It sure seems that way. As I remember it, the academy, when not being pressed to make a case for its usefulness, was viewed mostly as a harmless indulgence, residue from some obsolete past. The joke about the liberal arts major and the fast food industry was good for a chuckle in almost any American setting. Yet during the Great Scrambling, historians and other humanist scholars stepped forward as witnesses to and interpreters of our collective ordeal. We needed their expertise. Heather Cox Richardson’s “Letters from an American”—just to name one obvious example—served as a lifesaver for people from all walks of life, as something solid to hold onto as the floor beneath us rocked and seized. So I want to say it’s reasonable that the final hero of the most popular television series of its kind would be the personification of this service.
Meanwhile, generations of scholarship, reduced to an umbrella term, Critical Race Theory, are being twisted into a racial dog whistle. A controversy surrounds Professor Nikole Hannah-Jones, her tenure, and the 1619 Project. Not long ago in Austin, Texas, the lieutenant governor canceled a talk by the authors of a book of history at the Bullock Texas State History Museum, near the state’s flagship public university. That the governor is touting something he calls “the 1836 Project” testifies to the stature of its model and namesake. The Great Scrambling was a disruptor of paradigms. What once seemed an amorphous, blanket disrespect for the academy has been replaced by a cluster of pointed assaults on it, each traceable to a particular source and tinged with desperation.
A version of this essay was published at the Society for US Intellectual History blog.
President Obama’s remarks at the White House Correspondents’ Dinner, 2015: https://www.youtube.com/watch?v=HkAK9QRe4ds
Clive James, “Thrones of Blood,” The New Yorker, April 11, 2016: https://www.newyorker.com/magazine/2016/04/18/the-raw-appeal-of-game-of-thrones
Marilynne Robinson, What Are We Doing Here? (Farrar, Straus and Giroux, 2018), 52.
Logan Hill, “Kill the boy … let the man be born,” Esquire, June 1, 2017: https://classic.esquire.com/article/2017/6/1/kill-the-boy
Jorge Luis Borges, Labyrinths (New Directions, 2007), 59-66.
Letters from an American: https://heathercoxrichardson.substack.com/
Fifty years ago last month, Curtis Mayfield and a four-piece band appeared at the Greenwich Village nightclub Paul Colby’s Bitter End. A small room, a small band, an intimate atmosphere. In the recording made of the performance, one hears the sound of the audience in the individual sounds that compose it—pairs of hands clapping, appreciative shouts, calls from here and there: “Right on!” “Right on!” When Mayfield speaks to the crowd, his tone is mild, relaxed, conversational. He doesn’t have to raise his voice even when the band continues to play behind him.
At one point, introducing “We’ve Only Just Begun,” a song not associated with him, Mayfield acknowledges the odd choice. “A lot of folks think this particular lyric may not be appropriate for what might be considered ‘underground,’” he says. “But I think underground is whatever your mood or feelings may be at the time so long as it’s the truth.”
“We’ve Only Just Begun” had itself only just begun. It had started as a jingle in a bank commercial a year earlier; currently it was a smash hit for a new group called The Carpenters. This sort of data can be discovered on the internet in a matter of seconds. More difficult to decipher would be what Mayfield meant by the term “underground.” This was an artist who, as principal singer and songwriter for The Impressions, had had numerous hits on the R&B and pop charts for the better part of a decade.
But these matters don’t necessarily play into the feeling I have when I hear those scattered shouts of “Right on!” and when I hear the particular pronunciation Mayfield gives to the phrase so long as it’s the truth. The specificity of these aural artifacts transports me not only into the room but into the political-cultural moment. It may be the closest thing to time travel that I’m able to know.
In his anthology of African American social and political thought, Let Nobody Turn Us Around (2009), Manning Marable divides the Black Freedom movement of 1955-1975 into two roughly decade-long phases: “the campaign for desegregation and the struggle for Black Power” (352). Mayfield has a presence in both phases. His hits with The Impressions, “Keep on Pushing” and “People Get Ready,” released in 1964 and 1965 respectively, belong to the first phase, the years of its great legislative victories. These songs emphasize the moral legitimacy of the movement and the religious underpinnings behind its commitment to integration and non-violent resistance.
The Impressions’ movement songs, while memorable, were a very small part of the group’s catalog. “We’re a Winner,” released in November of 1967, marks a transition. A seeming call for black pride, it preceded James Brown’s “(Say It Loud) I’m Black and I’m Proud” by nine months. Its chorus calls more explicitly for activism than do the group’s earlier songs. During the Bitter End set, Mayfield remarks on the political blowback he’d received for the song: “You may remember reading in your Jet and Johnson publications, whole lot of stations didn’t want to play that particular recording. Can you imagine such a thing?” To this Mayfield adds a rhymed couplet, his primary lyric mode:
I would say, like I’m sure most of you would say
we don’t give a damn, we’re a winner anyway.
“Right on! Right on!” shout a number of individuals in the crowd.
Mayfield, like any perceptive commercial artist, adjusted to the changing times. Starting with “We’re a Winner,” The Impressions took on a more socially conscious image and urban style. Gone was the formal, prosperous nightclub presentation. The photo on the cover of This Is My Country (1968) posed the vocal group in a crumbling cityscape; The Young Mods’ Forgotten Story (1969) showed them again on the street, in leathers and brimmed cloth caps.
At the same time, Mayfield began a soft break from his group. Curtis/Live! was his second solo record, following Curtis (1970), on which he had continued both this stylistic and topical transition and his experiments in psychedelic soul. Arrangements were complex, hand drums more prominent; he ran his guitar through a wah-wah pedal and other effects. But playing at the Bitter End, live with a small band, did not allow Mayfield the studio elaborations he’d been dabbling in. The sound is less busy, more organic. The players are wide-awake but never aggressive. They weave a warm, improvisational nest in which to hold Mayfield’s gentle falsetto. This distillation of Mayfield’s sound paved the way for his crossover triumph in 1972, the soundtrack to Superfly.
In describing the first phase of the Black Freedom movement, Marable includes the phrase, “the deracialization of popular music.” I’m not sure I understand what he means. Interracial exchange and appropriation—love and theft—had always driven the genius of American music, though the degree of racial segregation of performance sites and the market waxed and waned. Genres are largely market-driven and racialized; they are often the tail that wags the dog. On the other hand, 1955 is a pretty good marker for the birth of rock and roll, one of the most explicitly recognized moments of love and theft. A lot of change followed in its wake, both in terms of the development of the music, and the embrace of black artists by white audiences. Music was an unmistakable cultural component of the desegregation campaign, and its contribution was simultaneous with a kind of desegregation of the popular music market. If all of these aspects are included in what Marable means by “the deracialization of popular music,” then yes, the result was an enormous cultural achievement, a vast catalog of songs and recordings that can scarcely be overpraised.
In this sense, Curtis/Live! and Superfly—along with There’s a Riot Goin’ On (1971) and Fresh (1973) by Sly and the Family Stone, and What’s Going On (1971) by Marvin Gaye—are transitional. They are among the many jewels in the crown of this deracialization. But in them we can also see the beginning of a reversal. By the end of the decade, with a major inflection point coming with the anti-disco debacle, genres would become more racialized and politicized. Black music and white music would again be largely segregated on radio and in the marketplace.
This may have something to do with the theme of ambiguity Marable identifies in the second half of the Black Freedom movement, the struggle for Black Power. Black Power “replaced liberal integrationism as the dominant political ideology and discourse for many African Americans,” Marable writes. Yet it “never consolidated itself as a coherent social philosophy or strategy” (348). The united front of the desegregation campaign fragmented over the concept. But what did Black Power mean and what form of radicalism did it call for?
Marable indicates several “overlapping tendencies” that gathered in answer to these questions (349). Included are religious and cultural nationalisms, both of which Mayfield gestures toward lyrically on Curtis/Live! There’s the liberation gospel of “People Get Ready” and the emphasis on self-definition and solidarity in “We the People Who Are Darker Than Blue.” Marable sympathizes most with the tendency he calls revolutionary nationalism. Curtis/Live!’s most progressive sentiment may be the one expressed in “Stone Junkie,” the final cut on the record. The song points out the classist and racialized stigmatization surrounding the use of drugs and other stimulants, a critique that would become more pointed as the emerging War on Drugs took its toll.
Mayfield mostly floats atop these crosscurrents. Does he take any particular stand? Songs are different from persuasive prose. Compared to the speeches and essays that populate anthologies such as Marable’s, songs allow for much hedging and evasion. Yet in “Choice of Colors,” one of the socially conscious late Impressions tunes that Mayfield performs on Curtis/Live!, his message is a little too slippery. The song asks a series of questions, and neither their asking nor their answering is ever quite nailed down. It’s usually a strength that Mayfield’s rhymes don’t sound long thought-over, but in this case the artlessness loses some of its charm.
I can only understand the song in the context of those that address the same topic, songs such as “Mighty Mighty (Spade and Whitey)” and “(Don’t Worry) If There’s a Hell Below, We’re All Gonna Go.” On these songs, Mayfield carries the ideals of the first phase’s united front into the ideologically contentious second period. He makes some legitimate and productive concessions to the greater aggressiveness of Black Power and to its demands for full autonomy. I’ve mentioned the style change and the music’s complex, improvisational space. Significant too is the greater grittiness, the cursing, and the appropriation of racial epithets for new purposes. All this points to the future. But on what Mayfield seemed to see as the question of the day—the racial separatism advocated by some currents of Black Power—he upholds some form of King’s integrationist message, its spiritual core, and its insistence that the destinies of all Americans are intertwined.
A version of this essay appears on the Society for US Intellectual History blog.
Many dislike John Irving’s fiction for its cruel implausibilities, but my friend wanted to push back on that opinion. Implausible things happen all the time, and people are regularly traumatized by them. The comment made sense, coming from this friend, whose recent experience is indeed worthy of a John Irving novel. It was his last remark that struck me. “Everyone needs therapy, more or less.”
The words set in motion a lightning sequence of thoughts. If my friend was right, and everyone got the therapy they required—just to achieve some baseline of health—society would need to find a way to train and employ a great number of therapists. And would that be such a bad thing? I mean therapy not as one more consumer item designed to feed insatiable impulse but therapy broadly understood as any work aimed at the restoration of health. What would our lives be like if more of us worked at jobs that served to heal rather than to wound? What if our economy were organized not for the extraction of value from living systems but for the restoration of their health?
Hardly a beat had passed, and I don’t remember exactly how I responded, except that I used the term “restorative economics.” It piqued my friend’s interest. He wanted to know what it was and asked for something he might read that would explain it.
It was then I realized that I’d used the term as if it were a thing when I wasn’t really certain that it was. Or if it was a thing—a lesser-known discipline, perhaps, a research field, an item you could look up on Wikipedia—it wasn’t the thing I was referring to. It wasn’t the thing I was referring to because it couldn’t encompass all that was behind what I meant by it. And behind what I meant by it was several years’ worth of reading a variety of texts, some fairly disparate, and the connections among them, connections not gathered under any particular term or in any particular place except in me, in my own mind, as the reader of those texts.
Readers are nodes in a network of ideas. Readers are essential workers.
The work I did when I got home was to assemble a list, just to understand my process. I’d never used the term before, nor planned to. Why restorative? I stopped at item seven but could have kept going.
1. Chapter 6 of Kate Raworth’s Doughnut Economics: Seven Ways to Think Like a 21st-Century Economist (2017). In this chapter, titled “Create to Regenerate,” Raworth describes the concept of a Circular Economy. One illustration she uses looks something like a butterfly with two wings, one marked “Regenerate” and the other, “Restore.” Raworth’s book is among many that ask this question: What if the purpose of economic activity were not growth but healing?
2. The “restoration story” George Monbiot tells in the first chapter of his book, Out of the Wreckage: A New Politics for an Age of Crisis (2017). “There is something deeply weird about humanity,” Monbiot writes. “We possess an unparalleled sensitivity to the needs of others, a unique level of concern about their welfare, and a peerless ability to create moral norms that generalize and enforce these tendencies” (14). Monbiot represents a strain of postmodernism when he argues that modern ideologies have overshadowed this understanding of what human beings are. That understanding must be restored, Monbiot argues. “Through invoking the two great healing forces–togetherness and belonging–we can rediscover the central facts of our humanity: our altruism and mutual aid” (25).
3. Paul Hawken’s description, in Blessed Unrest (2006), of the international ecological movement. He calls this movement (which barely registers in the US), the “largest social movement in all of human history.” Its participants are “willing to confront despair, power, and incalculable odds in order to restore some semblance of grace, justice, and beauty to this world.” The movement proposes a regime of words beginning with the letter R: “restore, redress, reform, rebuild, recover, reimagine, reconsider” (4).
4. Jason Hickel, author of Less Is More (2020), on what life might be like if the economy was designed for healing rather than growth: “People would be able to work less without any loss to their quality of life, thus producing less unnecessary stuff and therefore generating less pressure for unnecessary consumption. Meanwhile, with more free time people would be able to have fun, enjoy conviviality with loved ones, cooperate with neighbors, care for friends and relatives, cook healthy food, exercise and enjoy nature, thus rendering unnecessary the patterns of consumption that are driven by time scarcity. And opportunities to learn and develop new skills such as music, maintenance, growing food, and crafting furniture would contribute to local self-sufficiency.”
Under such a new framing of economic life, “We would not have to feed our time and energy into the juggernaut of ever-increasing production, consumption and ecological destruction. The economy would produce less as a result, yes – but it would also need much less. It would be smaller and yet nonetheless much more abundant … but public wealth would increase, significantly improving the lives of everyone else.”
When I went back and re-read this passage, I read public wealth as mental health.
5. Daniel Christian Wahl’s project in Regenerative Cultures (2016). Wahl also proposes a regime of R’s: “restorative design,” to restore healthy self-regulation to local ecosystems; “reconciliatory design,” to reintegrate humans into “life’s processes and the unity of nature and culture”; and “regenerative design,” to create cultures “capable of continuous learning and transformation in response to, and anticipation of, inevitable change.”
6. The entry for “care” in Degrowth: A Vocabulary for a New Era (2015). Its first paragraph reads, “Care is the daily action performed by human beings for their welfare and for the welfare of their community. Here, community refers to the ensemble of people within proximity and with which every human being lives, such as the family, friendships or the neighborhood. In these spaces, as well in the society as a whole, an enormous quantity of work is devoted to sustenance, reproduction, and the contentment of human relations. Unpaid work is the term used in feminist economics to account for the free work devoted to such tasks. Feminists have denounced for years the undervaluation of work for bodily and personal care, and the related undervaluation of the subjects delegated to undertake it, i.e. women. Feminists continue to affirm the unique role that care has in the well-being of humans. … [C]are is fundamental in the support the mental, physical and relational integrity of each and every human being.”
7. Restorative justice, a field that explores and promotes legal modes of atonement for crimes, recent and historical. Individual human beings aren’t the only living systems that require restoration. Social groups, too, have suffered damage and deserve reparations.
Right now I’m flashing on the work being done to remove a mountain of asbestos shingles that had been allowed to accumulate in South Dallas and affect the lives and health of the residents of the African American community nearby. It is a classic case of environmental injustice that is finally being addressed and at least partially rectified. When I think of the work of removal being done now, and the years of work that have led to this moment—the reading, the reporting, the organizing and advocating—I think of it as noble work, as therapy, broadly understood.
First pages of book chapters and introductions are formatted differently than interior pages. For instance, at the top of the first page of the introduction to Daniel Belgrad’s The Culture of Feedback: Ecological Thinking in Seventies America (University of Chicago Press, 2018), are a few extra inches of blank space. Below that is the word “Introduction,” in large type, and then, in a smaller, special type, an epigraph—several lines of lyric from a 1972 song by Funkadelic. At this point, there isn’t much room left on the page, but Belgrad makes it count with these two brief paragraphs:
We speak casually of improving a course of action by getting some feedback, as if that were the most natural thing in the world. But the idea of feedback itself has a history. During the Second World War, “feedback” developed as a term to refer to the dynamics of self-regulating mechanical systems, which correct their actions by “feeding” some effects “back” into the system as input to influence later actions. Due to the ability of such systems to self-correct, or “learn,” they could be considered intelligent.
Conversely, systems theory, which developed to describe how such systems worked, came to define intelligence itself as the ability to self-correct in response to feedback. Redefining intelligence this way—not as a uniquely human faculty produced by consciousness, but as the property of a system governed by feedback loops—eventuated in new ways of thinking about the varieties of intelligence found in nature. This is what I mean by ecological thinking.
I responded to these sentences with both joy and consternation. How many words had I committed to paper, how many lines and paragraphs had I sweated through, and never stated the case and its relevance so clearly? That was the consternation. The joy was mostly in that last sentence. With his articulation of “ecological thinking,” Belgrad confirmed the basic insight that had guided my own work. I was all in. But then again, he probably had me at Funkadelic.
All in, yes, but not without at least one arched eyebrow. The ideas of anthropologist and systems theorist Gregory Bateson, the central figure in my own book, are prominent in The Culture of Feedback, and two scholars, looking at some of the same evidence, can’t be expected to agree in every particular.
Later in the introduction, Belgrad quotes a line from Bateson’s essay, “Social Planning and the Concept of Deutero-Learning,” which Bateson gave at the Second Symposium on Science, Philosophy and Religion in Their Relation to the Democratic Way of Life in New York in September of 1941. The question on the table at that symposium was to what degree the knowledge created by the social sciences should be applied to the fight against Nazi Germany, especially in regard to using that knowledge to fashion effective pro-democratic, anti-fascist propaganda.
Some argued that if the Western democracies wanted to defeat the fascists, they ought to use every tool in their bag. But both Bateson and Margaret Mead, then married, advised that the reckless use of anthropological learning for something as instrumental as defeating an enemy could undermine the very values they wished to defend. Rather than embracing the premise that ends justified means, Mead proposed that ends and means be integrated, so that democratic values were present at every moment in the process of working toward the goal. Bateson restated Mead’s proposal this way: “that we discard purpose in order to achieve our purpose.”
This is the quote Belgrad uses, and it winds up carrying significant weight in the book overall. The paradoxical character of the quote, and especially the way it represents a willingness to cede the desire to control natural processes in order to cultivate a healthier alignment with them, “would become,” Belgrad writes, “a central tenet of the culture of feedback” (12).
Yes, something like this idea would become a tenet in thinking in terms of feedback and in thinking ecologically. But that wasn’t what Bateson was really after when he wrote these words. For him, the construction was mostly rhetorical. By restating Mead’s earlier proposal to integrate means and ends with a phrase that owed more to poetry than science, Bateson wanted his listeners to sit up and take notice. Having gained their attention, he then delivered a largely technical contemplation of how difficult it would actually be “to discard purpose to achieve purpose,” especially for a society whose commitment to “purpose and instrumentality” was both stronger and deeper than its commitment to self-government and the human rights of individuals (Steps to an Ecology of Mind, 159-60). This contemplation involved a reframing of “value” by way of a theory of learning, and laying out that theory was Bateson’s primary objective in the piece—but that’s another topic.
It’s not that Belgrad’s use of the Bateson quote isn’t legitimate. “Social Planning and Deutero-Learning” is an important work in the development of Bateson’s thought. That makes it important, too, to the development of what Belgrad calls the culture of feedback—and important to ecological thought, generally. That’s how significant Bateson is to this history of ideas. Both Belgrad and I give attention to the essay in our respective books and quote some of the same lines, including the one in question. The difference between how we do so says something about the practice of scholarship.
Here’s how I see that difference. In regard to the historical evidence—Bateson’s essay, in this case—Belgrad stands at a greater distance. That position allows him to telescope time and make a broader claim. My stance is closer in: from that vantage, the culture of feedback doesn’t yet exist. The closer-in position allows for insights not only into the essay’s main ideas but also into the relationship between Mead and Bateson, their difference in styles, the concerns and attitudes that drew them together, the concerns and attitudes that were pulling them apart. The story becomes one of persons as well as of ideas.
Having made that observation, let me hasten back to unreserved praise for The Culture of Feedback. For one thing, I admire how readable it is. Data from the philosophical, the political and the aesthetic sit side by side in these pages, with nary a creak in the prose. It’s thoroughly researched and comes with all the expected academic apparatus, yet the book reads weightlessly. Belgrad’s claims are straightforward and clearly demonstrated.
In one chapter, for example, he shows how the ideas related to the culture of feedback were applied in a series of musical experiments. For these experimental artists, “music was a way of integrating sounds into a natural, open system, organized by feedback relations rather than having been put in order by the composer’s dictatorial authority” (111). Under the heading of what was sometimes called “acoustic ecology,” some sought to design music-making and listening processes that didn’t contain a blueprint of their outcome—that discarded purpose to achieve purpose, as it were. Involved in these experiments, too, were efforts to include input from the environment in the acts of composition, performance, and reception—input that was by its nature undetermined and ever-changing.
Before reading Belgrad, I was unaware of names such as Pauline Oliveros, Max Neuhaus, La Monte Young, Steve Reich, Terry Riley, and Charlemagne Palestine, but I was familiar with Brian Eno, who built on and incorporated a number of these concepts and methods in recordings with titles such as Music for Airports and Music for Films. With these and other recordings, Eno opened a pathway for what he called “ambient music” through the semi-popular to the mainstream.
Belgrad’s chapter on experimental and ambient music made me dig out my copy of Eno’s first LP in this series, from 1975, called Discreet Music. On the back of the cover was an “operational diagram,” looking very similar to those that Donella Meadows used repeatedly in Thinking in Systems: A Primer. As Eno explains in the liner notes, the diagram shows how he fed “two simple and compatible melodic lines of different duration” into his recorder while continually feeding them back in the recorder on a delay loop. He didn’t know how it would all turn out but, after designing the process, became “an audience to the result.”
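Eno’s diagrammed process is, at bottom, a feedback algorithm, and it can be caricatured in a few lines of code. The sketch below is my own toy analogy, not a reconstruction of Eno’s actual signal chain; the melodies, delay length, and feedback level are invented for illustration. Two looping lines of different, non-matching lengths are summed, and an attenuated copy of the output from a few steps back is folded into the present mix, so echoes accumulate in patterns never written down in advance.

```python
# Toy analogy for a tape-delay feedback system (illustrative only;
# not Eno's actual setup on Discreet Music).

def delay_loop(melody_a, melody_b, steps, delay=4, feedback=0.5):
    """Sum two looping melodies; feed an attenuated copy of the
    output from `delay` steps ago back into the current mix."""
    out = []
    for t in range(steps):
        # loops of different lengths slowly drift out of phase
        mix = melody_a[t % len(melody_a)] + melody_b[t % len(melody_b)]
        if t >= delay:
            mix += feedback * out[t - delay]  # the "tape loop"
        out.append(mix)
    return out

# Lengths 3 and 5 only realign every 15 steps, so the combined
# pattern repeats far more slowly than either loop alone.
result = delay_loop([1, 0, 0], [0, 2, 0, 0, 0], steps=12)
```

Like Eno, the caller designs the process and then becomes “an audience to the result”: nothing in the code specifies the output sequence, only the rule by which the system feeds its past back into its present.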
Interesting, too, is how Eno describes his motivation for these experiments as a personal disinclination to intervene: “It was a point of discipline to accept this passive role, and, for once, to ignore the tendency to play the artist by dabbling and interfering.” This brought to mind Heidegger’s concept of “releasement” (which I’ve written about previously on this blog), as an alternative to the active sort of intervention that is the typical response to an urgent need. Eno acts at the level of premise, rather than at the level of behaviors allowed within the scope of premise, one might say. He’s not writing or even playing music. He’s making change in what music is understood to be.
How marvelous to discover that music can be that.
Ecological thinking encourages interventions like this in the economy and other fundamentals—especially the thinking done by degrowth and other postgrowth communities. These are present-day advocacies, but what they are talking about isn’t exactly new. I’ll conclude with a sentence from Belgrad that incorporates a quote from Ervin Laszlo’s 1972 book, Introduction to Systems Philosophy:
“The Western world tends to offer the values of affluence as the panacea for all social ills,” [Laszlo] observed; but this behavior had resulted in an unsustainable level of resource consumption. Therefore, now “progress must be redefined, and that means a new system of values.”
A version of this essay appeared on the Society for US Intellectual History blog.
My first social gathering since the quarantine was a meeting of our record club, which I’ve mentioned before on this blog. We met outdoors, across a wide deck, four couples, well-separated and following recommended precautions. Each couple brought a separate speaker, and we played our selections from our phones. Our theme: what music has helped you during the pandemic?
It wouldn’t be a hard question to answer–not for me. Facing a school shut-down and a week to shift to online instruction, I happened to find myself in possession of Fela Kuti’s entire recorded catalog, a recent gift, some twenty-seven hours of music. I ripped the songs to my computer, arranged them on a playlist, and set the playlist on repeat. During what turned out to be more than a month of days spent in front of the computer, this was my soundtrack. I listened through a few times in chronological order, then a few more times through on shuffle. At some point, I started rating the songs as they came up, later sorting them according to rating so as to listen to my favorites first, and so on, in various configurations.
Fela Anikulapo Kuti was born in 1938 to a prominent Nigerian family. His father was a minister and educator, his mother an activist for women’s rights and other causes, widely respected, a real personage in her country. Fela was educated in London. He was to study medicine, like his brothers, but switched to music, strongly influenced by modern jazz, particularly Miles Davis. He formed a band with drummer Tony Allen, playing something very close to highlife, a West African genre popular since well before decolonization.
A turning point came in 1969 when a benefactor sent Fela and his band to Los Angeles. Fela was reading The Autobiography of Malcolm X and socializing with members of the Black Panther Party. He formed an intellectual and romantic relationship with party member Sandra Smith (later Izsadore). While Smith schooled him on the movement, Fela represented to Smith something authentically African at a time when many American blacks were committing to pan-Africanism, third world alliances, and a cultural return to roots. The result was a transatlantic exchange of music and ideas that would have a lasting significance.
Back in Nigeria, Fela and Allen formed a new band, Africa 70, and began to play a new kind of music, which they called afrobeat. Afrobeat retained components of highlife but also incorporated percussive and chant elements of traditional African styles as well as the various currents of postwar jazz. Lyrics were sung mainly in pidgin English, which made them more accessible across the continent and signaled a consciousness of social class. Critical, too, was the influence of James Brown–as an innovator, a bandleader, and a personality. Fela had followed Brown’s music for some years, but he also may have seen Brown and his band play live during his American sojourn.
A live recording of Brown that year in Augusta, Georgia, offers an idea of what Fela would have heard. Having established a new genre of his own–“a brand-new bag,” today called funk–Brown led a crack band of some fourteen members, including three drummers, and a road crew, staff, and entourage of similar size. The James Brown Orchestra crisscrossed the country, slaying auditoriums nightly, playing their vamp-based songs at breakneck tempos, making a sound never before heard. Important, too, as an influence on Fela, was James Brown’s August 1968 hit, “Say It Loud–I’m Black and I’m Proud.” The political element in Brown’s music would be fairly short-lived. For Fela, the merger of politics and music endured.
“Gentleman,” “Water Get No Enemy,” “Go Slow”: as the weeks went by, my favorites list from the discography grew. Choosing which one to play for the record club was mainly a problem of length. The whole concept of the club, in which we went around the circle, each person taking a turn, was built on the tacit understanding that a three- or four-minute song was the norm. A Fela cut, in contrast, tends to run up toward twenty minutes. Typically, it will begin with a slow build-up of instruments– guitar, bass, second guitar, percussionists–vamping on a one- or two-measure rhythmic pattern. Then comes a series of jazz choruses, improvised solos by saxophone or keyboard, intercut with R&B-style melodic heads played by a full horn section. At about the halfway mark, recognizable verses begin, and then vocal choruses, a call-and-response between Fela and the back-up singers. Now fully mature, the song serves its teaching function, and Fela delivers his political and social commentary.
Maybe it was merely coincidental that a) the virus had sentenced me to long hours at the computer, and b) I had this enormous body of music available to consume. Maybe that was just a happy opportunity. On the other hand, when the workday was over, after I’d taken in a painful dose of the evening news, after dinner was finished and the dishes done, it was my handful of Fela vinyls, procured back in the 1980s, that invariably found their way to the turntable. I’d been obsessed with artists before. This was different. This was more a kind of therapy. I began to wonder if there wasn’t something particular that this music supplied that met what the moment required.
Fela’s lyrics showed him to be in rebellion not only against the emotional and structural legacies of imperialism but also against Nigeria’s post-colonial, military government, awash in petro-dollars. Fela, his large band, his dancers, his family–which included numerous wives–and the many others associated with his organization, lived on a compound in a poor section of Lagos, Nigeria’s largest city. The compound included living spaces, a studio, a nearby performance hall, and a health clinic that served the neighborhood. Dress was casual, herb-smoking a daily practice. Fela named his commune the Kalakuta Republic, after a jailhouse where he’d been incarcerated, and declared it to be an independent state, outside the jurisdiction of the authorities.
“The government regarded Kalakuta as an affront,” scholar Randall Grass writes, “a first step toward incipient, secessionist anarchy, no joke in a country racked by civil war.” Fela seemed to like nothing better than to taunt the junta and its leader, General Olusegun Obasanjo. His massive 1976 hit, “Zombie,” depicted Obasanjo’s troops as mindless automatons. In February of 1977, soldiers and police invaded the compound. Inhabitants were beaten, raped. The buildings were set afire. Instruments and master tapes were destroyed. Fela was beaten almost to death, and his mother, the famous activist then in her late seventies, was thrown from a window. She died from her injuries the following year.
The effects of this assault on Fela should not be underestimated. He redoubled his resistance against Obasanjo. Numerous recordings addressed the attack head-on: “Unknown Soldier,” “Sorrow, Tears, and Blood.” Fela used music to process his trauma, though no music could have completely healed the physical and psychic wounds. In the new commune Fela established to replace Kalakuta, he put a coffin on the roof to memorialize his mother’s martyrdom. He began to speak regularly of receiving communications from her spirit.
The attack may have also encouraged an already stout megalomania. Like two other domineering, genre-inventing band leaders–I’m thinking not only of James Brown but also Bill Monroe–Fela faced rebellions and ultimatums within his own camp. Tony Allen left in 1979 and took many of Africa 70’s members with him. Fela brought in new talent, renamed the band, and continued to record for almost two decades, until his death from AIDS in 1997.
One of Africa 70’s last recordings, made during a transitional period following Allen’s departure, was “Coffin for Head of State” (1981). The record and its cover art recount General Obasanjo’s last day in office. A civilian government had been voted in. Fela removed his mother’s coffin from the rooftop display and carried it in a procession to Dodan Barracks, the General’s quarters. There he deposited the coffin as a final protest and reminder of the brutality of his loss.
The song starts, as usual, with interlocking bass and guitar figures. Fela plays a few stately chords on electric piano. The groove builds and releases, builds and releases, breaking down again and again to its initial, two-measure vamp. The lyrics are gospel-like when they finally kick in. “Almighty Christ our Lord,” Fela sings; “Amen, amen, amen,” the singers chant. Yet Fela wants to denounce Christianity–and Islam, too–as illegitimate imports, hypocritical covers for corrupt power. He mockingly imitates the sound of Latin and Arabic prayers. Then he’s walking, describing a journey across Africa.
I waka waka waka
I go many places
I see my people
Them dey cry cry cry
He notes the hurts, the abuses. General observations become personal ones.
Them steal all the money
Them kill many students
Them burn many houses
Them burn my house too
Them kill my mama
So I carry the coffin
I waka waka waka
“Coffin for Head of State” clocks in at almost twenty-three minutes. As with all great Fela songs, uncompromising sentiments are delivered inside a deeply soothing, hypnotic groove. “Ain’t it good to ya?” James Brown would say about music like this. As I’ve lived this pandemic, and as I’ve watched how our highest leadership has responded, its seemingly eager sacrifice of the weak, the disadvantaged, and the old to self-interest, Fela Kuti’s music has served as an anodyne, one that dilutes neither the anger nor the contempt.
A version of this essay appears on the Society for US Intellectual History Blog.
Both a recent documentary on Fela called Finding Fela! and another from 1982 called Music is the Weapon are available to stream from Kanopy and other services.
Randall F. Grass writes about Fela in The Drama Review, https://www.jstor.org/stable/1145717?read-now=1&seq=1#page_scan_tab_contents
Sandra Izsadore remembers Fela in the LA Weekly: https://www.laweekly.com/fela-kutis-lover-and-mentor-sandra-smith-talks-about-afrobeats-l-a-origins-as-fela-musical-arrives-at-the-ahmanson/