The English word “person” has a long and convoluted history. Though the word itself likely derives from the Latin persona, referring to the masks worn in theatre, its meaning has evolved over time. One of the biggest conceptual overhauls came in the 4th century AD, during church council debates over the concept of person as it related to the Trinity. Whereas the Greek fathers defined the Trinity as three hypostases, roughly translated as “substances” or “essences,” the Latin fathers saw them as one hypostasis that could be distinguished by the concept of persona. Because the Roman and Greek Churches each viewed the other as orthodox, they brushed off the difference in terms as semantics. Over time, this resulted in a conceptual conflation of the terms, with persona effectively coming to encapsulate both the “role” one plays and one’s “essence” or “character.”
It’s difficult to overstate our society’s fascination with Artificial Intelligence (AI). From the millions of people who tuned in every week for the new HBO show Westworld to home assistants like Amazon’s Echo and Google Home, Americans have fully embraced the notion of “smart machines.” As a peculiar apex of our ability to craft tools, smart machines are revolutionizing our lives at home, at work, and in nearly every other facet of society.
We often envision true AI to resemble us – both in body and mind. The Turing Test has evolved in the collective imagination from a machine that can fool you over the phone to one that can fool you in front of your eyes. Indeed, modern conceptions of AI bring to mind Ex Machina’s Ava and Westworld’s “Hosts,” which are so like humans in both behavior and looks that they are truly indistinguishable from humans. However, it seems a bit self-centered to me to assume that a being who equals us in intelligence should also look like us. Though, it is perhaps a fitting assessment for a being who gave itself the biological moniker of “wise man.” At any rate, it’s probably clear to computer scientists and exobiologists alike that “life” doesn’t necessarily need to resemble what we know it as. Likewise, “person” need not represent what we know it as.
Though we often take for granted that humans are persons, they are not exempt from questions surrounding personhood. Indeed, what it means to be a person is a largely unsettled question, even though we often speak of “people” and “persons.” Just as it’s important to ask if other beings might ever be persons, it is also important to ask if humans are ever not persons. In this pursuit, it’s crucial to separate the concept of personhood from notions of respect, love, and importance. That is to say, while a person may command respect, love, and importance, a being need not be a person to command them as well.
When the concept of personhood in humans comes into discussion, it is inevitably punted to the medical community, often in the context of abortion and end of life. When does the heart first beat? When can a fetus feel pain? When does the brain begin or stop producing electrical activity? There is no doubt that advancements in our understanding of human physiology have enlightened discourse on what it means to be both a human and a person. However, the question of personhood is all too often debated solely in light of Western medical contexts. This conflation of physiology and personhood is the same issue that was discussed in my previous post on primate personhood and will be revisited in my next post on artificial intelligence. To escape this quandary, we need to consider factors outside of physiology that are important to the concept of personhood, such as the social.
Savage Minds welcomes guest blogger Coltan Scrivner for the month of January. Coltan will be writing a series of posts on personhood from different disciplinary perspectives.
When I moved to Chicago for graduate school, one of the first things I did was go to the Lincoln Park Zoo. As with other zoos I’ve been to, I was most eager to visit the Great Ape exhibit. As always, after sitting and watching the chimpanzees for some time, I inevitably started to feel a bit guilty. There’s something about the chimps, with their eerily human-like behavior, that makes it feel wrong to be watching them in an enclosure.
You can get at the familiarity from a biological perspective by rattling off scientific facts like “they share 99% of our protein-coding genes,” or “our lineages split just 5-7 million years ago.” As a biological anthropologist, I am prone to do so. These things are often invoked to shed light on similarities between Homo sapiens and Pan troglodytes. Between species. Yet, even to someone who knows nothing of biology, there is still something about chimpanzees that rings familiar. Something about the way they behave, about the way they interact with other chimpanzees and their environment. You don’t need the biology or the genetics to begin to wonder if perhaps they should be considered as something more than animal. It’s clear they aren’t humans, but could they be individuals? Can a chimpanzee possess an understanding of a self, be a someone as opposed to a something; can they be “persons?”
This is the third in a series of guest blogs this November from the AAA Archaeology Division Executive Board detailing ideas generated at a retreat at the Amerind Foundation this past June. This post is by outgoing AD Secretary, Jane Eva Baxter.
As thousands of anthropologists make their way to Minneapolis to take part in the AAA Annual Meetings, it is worth thinking about the potential ways this organization might help to foster a more robust and inclusive anthropology that actively embraces all of the subfields in intellectual and not just structural ways. When the Executive Board of the Archaeology Division (AD) of the AAA met at Amerind in June, one of the major areas of discussion was how to leverage the resources available through the AAA to create a unique intellectual space among all the professional organizations available to archaeologists.
It’s important to provide a bit of context for this discussion. Most archaeologists do not see the AAA as their primary intellectual or professional home, but rather are more actively involved in the Society for American Archaeology, the Society for Historical Archaeology, the Archaeological Institute of America and/or the American Cultural Resources Association. The AAA is a secondary or tertiary membership for most current AAA AD members. The AAA is also the most expensive of these professional organizations to join, and as Patricia McAnany noted in last week’s post, the intellectual ties between archaeology and anthropology were disrupted significantly in the 1990s. These factors have resulted in a substantial reduction in AAA membership by archaeologists. Most of us who have retained our AAA membership have done so because of an enduring belief in the anthropological nature of archaeological inquiry and practice, and because we still find engaging with anthropology outside of our own subfield to be an enriching and nourishing intellectual experience.
U.S. presidential elections are extraordinary moments—ruptures in everyday time, full of transformative promise. Maybe. More than two decades ago, in her seminal essay on time, Nancy D. Munn wrote: “the topic of time frequently fragments into all the other dimensions and topics anthropologists deal with in the social world.” So, in the cacophonous 2016 U.S. presidential campaign, how do we perceive time and why might that matter?
Elections, embedded in cyclical time, are sometimes interpreted as pivotal events that shape longer histories. Such histories can be narrated as slow change, fast change, or stasis; crisis or normalcy; repetitive or linear process; progress or regress. Anthropologists are attuned as well to smaller-scale temporalities. They listen for different personal experiences of time and observe social configurations in which they nest.
Savage Minds welcomes guest blogger Angelique Haugerud.
“America is a shining example of how to hold a free and fair election, right?” asks Bassem Youssef, a comedian and former heart surgeon who is often referred to as “the Egyptian Jon Stewart.” Astute answers to that question about the condition of U.S. democracy often come from foreigners such as satirists, as well as my East African research interlocutors.
Like Jon Stewart and Trevor Noah (The Daily Show), Stephen Colbert (The Colbert Report), and John Oliver (Last Week Tonight), Bassem Youssef uses irony and satire to hold a mirror up to society, and to unsettle conventional political and media narratives. State political pressure forced termination of the popular satirical news show Youssef created in Egypt during the Arab Spring. He then moved to the United States, became a fellow at Harvard University’s Institute of Politics in 2015, and in 2016 started a new show in the United States called “Democracy Handbook” on Fusion TV. As foreigners, Youssef, John Oliver (British), and Trevor Noah (South African) wittily play off stereotypes of their own home regions as they comment on events in the United States—such as Trevor Noah’s Daily Show segment comparing the 2016 Republican presidential nominee to African dictators.
Climate change is the nightmare that keeps me up at night. The consensus seems to be that the world will be significantly different within my children’s lifetimes. Many places will be uninhabitable. Many if not most of the world’s great cities, which are built on waterfronts, will be flooded and destroyed by unpredictable weather events and rising oceans. The global refugee crisis will become much, much larger. The food supply will become uncertain. The American landscape and economy will be different in ways I cannot imagine, while India, where I conduct my research, will be a place exponentially more difficult for the millions of people already struggling to get by. There is a degree of uncertainty in these statements, albeit a hopeful uncertainty. Many of the predicted changes are already happening, faster than scientists had thought.
For me, climate change is a crisis so big it is hard to think about at all. Can anthropology help us think through a problem that leaves us feeling overwhelmed? I would argue that yes, anthropological thinking can tackle these thorny problems, and in fact, it’s one of the few approaches that can. The recent AAA Global Climate Change Task Force Report makes this clear, by pointing out anthropology’s unique view on historical and current adaptation. Here, I also want to look back and find some inspiration in the public anthropology of Margaret Mead, who did not hesitate to comment on thorny problems of her day.
Last month, a New York Post article about video games being like “digital heroin” for kids caused a bit of an uproar. The article describes a young boy losing interest in reading and baseball in favor of Minecraft, increasingly throwing tantrums until late one night his mother finds him in a catatonic state. Many have refuted this article as based on suspect evidence and even as a plug for the author’s addiction recovery center, noting the human tendency to treat new technologies—especially those used by children—with hysteria. It’s just the latest in the “screen time” debates.
But beyond scaremongering, what do screen time and immersion in digital worlds actually mean in terms of child rearing?
In my field site of Bangalore, south India, I found support among young female professionals for feel-good feminism—that is, messages of female empowerment in pop culture that do not seek to shift the status quo much. This kind of feminism is often used by advertisers to appeal to female customers, as in this much-talked-about detergent ad in which a father belatedly realizes the bad example he set for his daughter by not helping with housework, or this recent Nike ad featuring female athleticism in India, where few women participate in sports. The idea here seems to be that a general female empowerment can allow (middle and upper-class) women to push the boundaries of gender norms ever so slightly.
But how much deviance from gender norms is really possible? Deviance is a word not much used in contemporary anthropology anymore. It suggests a rigid norm that can be identified and described with a certainty few anthropologists would agree with now. It is also a term loaded with stigma. Who are the deviants?
Savage Minds welcomes guest blogger Rachel C. Fleming.
In my first introductory anthropology class of the year, I spoke a bit about the figures I consider “founding” to cultural anthropology, and asked if anyone had heard of them. Franz Boas, I inquired? After a pause, one woman tentatively asked, “Isn’t he the father of anthropology or something?” Yes, ok, close enough. She allowed that she had learned about Boas in another anthropology class. Bronislaw Malinowski? One hand went up in the back. A bearded young man said, “I’ve heard of him, but that’s probably because my girlfriend is an anthropology major.” Yes, that would explain it. And then I asked, Margaret Mead? Silence. I was frankly taken aback. I realize her popular appeal peaked from the 1920s through the 1960s, ancient history to this generation of students. However, she is consistently remembered in our field as possibly the most famous anthropologist to date. She wrote popular columns in national magazines about sexuality, gender, and childhood in the US. Coming of Age in Samoa was a massive bestseller and is still in print. The controversy over her research in Samoa was headline news in anthropology for years. The recent bestselling novel Euphoria fictionalizes her life.
Whatever you may think about Margaret Mead, we cannot dispute that she was a major early figure in what we now call public anthropology. With the efforts of anthropologists such as David Graeber, Barbara King, Tanya Luhrmann, Jonathan Marks, Carole McGranahan, and Paul Stoller, to name just a few, we have a growing voice in the public sphere, spurred along by social media. Yet I cannot help but feel nostalgic for a time when Mead was so well known that she was widely derided in the academy as a “popularizer.” Given the value of anthropological insight for current issues—a point we all strive to make in our classes and elsewhere—I suggest that we could learn from such a popularizer now. In this blog series I will thus reconsider Mead’s work on sexuality, childhood, gender, feminist anthropology, and public change by imagining what she might make of today’s world and the questions and crises we face.
Anthropologists seeking to communicate their research to general audiences are likely to work with fact-checkers. Here’s some advice on how to handle the process if you’ve been interviewed by a reporter.
I write a lot of emails that make me seem much less educated than I am. Why? I often work as a professional fact-checker.
In this capacity, it’s my responsibility to confirm the accuracy of the words someone else has written. I’m not conducting original research; I’m making sure that another writer got their facts right.
This usually entails contacting the experts the author chose to interview and asking them a series of questions to determine whether or not the wealth of information they provided to the author was adequately distilled into a handful of words. I frequently do this by rewriting the author’s article into a series of “yes” or “no” questions.
Years ago, I was fact-checking for a glossy magazine and wrote an email to a well-respected biological anthropologist who had been quoted in the story I was working on. I asked: “Did marriage evolve so that we can find someone to fall in love with, in order to reproduce?” I’d read enough Gayle Rubin to answer this question from the point of view of a cultural anthropologist. I had to remind myself that, as a fact-checker, my job was not to challenge the statement the scholar had made. My responsibility was to confirm that these were words this media-savvy scholar would have spoken. She answered with a simple “yes.”
After nearly three years of eating almost nothing but the watery beans and undercooked rice I was served while conducting research in Brazilian prisons, I couldn’t wait to hit the restaurants of New York City when I returned from the field. I was surprised to find that even the spiciest chana masala tasted bland. I was numb. Kind neighbors had to remind me to put on a coat when I left my apartment to walk to the library, even though the sidewalks were covered with ankle-deep snow. My nose didn’t even twitch when I was forced to wait for a train on a piss-drenched subway platform.
Well-meaning friends recommended therapy. Graduate advisors suggested writing as a strategy for self-care. I watched movies instead.
One night, I went out to see Ônibus 174, a slick documentary directed by José Padilha that tells the story of a Rio bus robbery that turned into a nationally televised hostage situation. The film manages to vilify poor black youths who turn to violence out of desperation, and the police officers who are tasked with keeping such violence out of the neighborhoods where privileged Brazilians like Padilha live. I left the movie theater with hot tears in my eyes and cried for six hours. Then I opened a brand new notebook and, for four straight hours, wrote about the seemingly endless reasons my fieldwork experiences led me to despise Padilha’s film.
No one but me will ever read those pages. The writing they contain is too raw to share. I confirmed this a few weeks ago, when I pulled out that notebook to verify that the writing was as awful as I remembered; it was. Sure, I’d vividly described a few places and had jotted down the kernels of thoughts that have since ripened, or that I am still cultivating. But, overall, the prose was too emotional and self-absorbed to be ethnographic.
I’ve thought of that private notebook when reading the texts of some emerging ethnographers who have recently studied violence in the field and have rushed to write publicly about their experiences before they’ve had the time to really think them through. While I commend such individuals for having the courage and the discipline to write, I also invite them to pause before publishing. Ethnographic writing can be a therapeutic exercise, but to be effective it must also be more.
Ethnographers of violence who are far, far more accomplished than I have argued that writing can help an anthropologist who has been emotionally taxed by fieldwork to recover. Even as the act of writing plunges the anthropologist back into the field, it also offers him or her a way to move beyond personal experiences of horror or fear to arrive at larger conclusions about the human condition. But the movement from therapy to theory is not as simple as this statement implies. It is only over time, and via multiple drafts, that writing permits the ethnographer to tease out the ways that intensely felt personal experiences of fear or suffering jarred their previous understanding and challenged them to rethink troubling problems and uncomfortable truths from unexpected angles.
When we read Philippe Bourgois, Michael Taussig, or Donna Goldstein—or many, many others who write about violence with style and grace—we don’t always notice the intellectual labor that went into producing their work. The grit and urgency of the writing belies its polish. Many of us aspire to write so vividly, so personally. Yet, it is crucial to note that when we read texts like In Search of Respect, Law in a Lawless Land, or Laughter out of Place, even though we feel the immediacy of the ethnographic encounter by being privy to the author’s thoughts and emotions while in the field, the enduring contribution of these texts lies in what their authors have told us about the people and the places they have studied, not in what the authors have revealed to us about themselves.
Moving from therapy to theory in writing about personal experiences of violence is intellectually demanding work. The difficulty of the task is exacerbated by the imperative to publish quickly and often. When still overwhelmed by the stresses and emotions of recent fieldwork, it is often easier (and more immediately rewarding) to write about the personal effects of what we experienced in the field. But allowing time and reflection to intervene between our ideas and the visceral and the emotional aspects of certain ethnographic encounters can enable us to better think through the ways that personal experiences of fear or suffering can illuminate larger patterns or problems. To put it simply: while ethnographic writing can offer catharsis, it should also offer critique.
Bourgois, Philippe. In Search of Respect: Selling Crack in El Barrio. Cambridge University Press, 2003.
Goldstein, Donna. Laughter out of Place: Race, Class, Violence, and Sexuality in a Rio Shantytown. University of California Press, 2013.
Taussig, Michael. Law in a Lawless Land: Diary of a Limpieza in Colombia. University of Chicago Press, 2005.
Theidon, Kimberly. “‘How Was Your Trip?’ Self-Care for Researchers Working and Writing on Violence.” Drugs, Security and Democracy Program, DSD Working Papers in Research Security. New York: Social Science Research Council, 2014.
Writing is never easy. Writing ethnographically about people who perpetrate violence is exceptionally difficult. Not only does the ethnographer have to cautiously avoid slipping into what we call “pornographic” representation, she (or he) must find a way to convey the humanity of people who do “inhuman” things, while also doing justice to the victims of their violence. Writing in the first person compounds these difficulties. How does one insert oneself, as ethnographer, into such a narrative?
In writing up my research on prison rapes and murders, I struggle with the competing desires of wanting to present myself as a likeable protagonist and wanting to honestly relate the ways that my ethnographic practice cannot help but become entwined with the forms of violence that I study. I also worry that as I try to navigate between these two treacherous poles of representation, my writing will be either disastrously self-exculpating or unnecessarily self-flagellating.
One solution to this problem might be to consider the ethnographer in the stories I write about violence as a character, rather than a robust and authentic representation of me. But, would doing so necessitate writing the violent events of my fieldwork as fiction? And would turning into ethnographic fiction events that I experienced as being too-real (and as having too-real consequences) be just another way to avoid confronting their ethical ramifications?
A simpler solution would be to pretend that the violence I either witnessed or experienced in the field did not happen at all. I would not be the first to elide physical violence in my ethnographic writing. In fact, I’ve admittedly written much less about the violent events that were central to my fieldwork than I have about the forms of structural violence that have shaped the ethnographic contexts in which I study, because I find doing so to be less fraught than writing about specific instances of physical aggression or pain. But blood, bullets, and torn flesh were so prevalent in my fieldwork that I would feel dishonest if I wrote them out of my work.
Another course I could steer in writing about my ethnographic encounters with perpetrators of violence would be to unequivocally position myself as observer rather than participant. But, to me, this would hearken back to the late nineteenth century, when ethnography was decidedly about “the other,” not about the complex relationships that entangle us with people we might—especially when acts of murder or torture are involved—prefer to refer to as “them.”
The choice I have made is to directly acknowledge both my discomfort with and my complicity in the violence that I study. The subsequent challenge I face is how to write this way without dipping into the egocentrism that, as my next post will discuss, sometimes plagues writing about ethnographic encounters with violence.
Fassin, Didier. 2014 “True Life, Real Lives: Revisiting the Boundaries Between Ethnography and Fiction.” American Ethnologist 41(1): 40-55.
Nader, Laura. 2011. “Ethnography as Theory.” HAU: Journal of Ethnographic Theory 1(1): 211-219.
Taussig, Michael. 2010. “Viscerality, Faith, and Skepticism: Another Theory of Magic.” In Walter Benjamin’s Grave, pp. 121-156. University of Chicago Press.
My work as an anthropologist in Brazil has drawn me into a historically layered matrix of racial, class-based, and gendered violence that I did not sufficiently understand when I entered the field. I am still working to understand it now. In my previous post I described how, when an off-duty police officer held a gun to my temple, he made it impossible for me to claim that I stood fully outside that matrix because I was a light-skinned foreigner. Still, I could not claim that I stood fully within the matrix because I was an anthropologist. The threat I faced was an exceptional moment in my life; such moments were likely to become quotidian for the three little boys who knelt with me in the cane.
In writing about the event, my goal was to foreground the matrix in which the violent encounter I described unfolded and to think through my liminal place within it. While I do assume responsibility for making the event I described possible, I am more interested in examining the larger structures and forces that create the conditions in which violence occurs than I am concerned with assigning individual blame for particular acts of violence.
Admittedly, it would have been expedient to cast myself as an innocent victim of an “other’s” violence. But to me, the more productive question to ask is: How have innocence and complicity become intertwined in a context where murder is too often understood to be an acceptable response to perceived disrespect?