Post List

Anthropology posts

  • October 29, 2010
  • 02:01 PM
  • 563 views

The Adoption of Altruism

by Eric Michael Johnson in The Primate Diaries in Exile

The latest stop in the #PDEx tour is being hosted by Barbara J. King:

Since animals, including humans, are merely ambulatory vehicles for their selfish genes, according to the dominant framework, it would be to one's benefit to care for a niece or cousin who lost her mother, but not for a stranger to whom one has no genetic relation. This is because any genes that promoted such altruism toward unrelated individuals would end up losing out by using up resources that didn’t perpetuate themselves. These “altruistic genes” would, however, be passed on and thrive if they were helping a kin member with a similar genetic makeup. In the currency of reproductive fitness, nepotism pays.

However, a study in the journal Primates by Cristiane Cäsar and Robert John Young reports on a case of adoption among a wild group of black-fronted titi monkeys (Callicebus nigrifrons) from the rainforests of Brazil.

Read the rest of the post here and stay tuned for the next entry in The Primate Diaries in Exile tour.

Reference: Cäsar, C., & Young, R. (2007). A case of adoption in a wild group of black-fronted titi monkeys (Callicebus nigrifrons). Primates, 49(2), 146-148. DOI: 10.1007/s10329-007-0066-x... Read more »

  • October 28, 2010
  • 05:43 AM
  • 774 views

Sons of the conquerors: the story of India?

by Razib Khan in Gene Expression


The past ten years have obviously been very active in the area of human genomics, but the domain of South Asian genetic relationships in a worldwide context has seen veritable revolutions and counter-revolutions. The final outlines are still to be determined. In the mid-1990s the conventional wisdom was that South Asians were [...]... Read more »

Gyaneshwer Chaubey, Mait Metspalu, Ying Choi, Reedik Mägi, Irene Gallego Romero, Pedro Soares, Mannis van Oven, Doron M. Behar, Siiri Rootsi, Georgi Hudjashov.... (2010) Population Genetic Structure in Indian Austroasiatic speakers: The Role of Landscape Barriers and Sex-specific Admixture. Mol Biol Evol. info:/10.1093/molbev/msq288

  • October 27, 2010
  • 11:38 PM
  • 1,076 views

Food for thought: Cooking in human evolution

by gregdowney in Neuroanthropology

Did cooking make us human by providing the foundation for the rapid growth of the human brain during evolution?  If so, what does this tell us about the diet that we should be eating, and can we turn back the culinary clock to an evolutionarily ideal diet?  A number of provocations over the last couple of weeks have me thinking about evolution and diet, especially what our teeth and guts tell us about how our ancestors got their food.
I did a post on this a while back at Neuroanthropology.net, putting up my slides for the then-current version of my ‘brain and diet’ lecture from ‘Human evolution and diversity,’ but I’m also thinking about food and evolution because I just watched Nestlé food scientist Heribert Watzke’s TED talk, The Brain in Your Gut. Watzke combines two intriguing subjects: the enteric nervous system, or your gut’s ‘second brain,’ and the evolution of diet.  I’ll deal with the diet, gastro-intestinal system and teeth today, and the enteric nervous system another day because it’s a great subject in itself (if you can’t wait, check out Scientific American).

This piece is going to ramble a bit, as it will also include some thoughts on the subject of diet and brain evolution sparked by multiple conversations: with Prof. Marlene Zuk (of the University of California Riverside), with Paul Mason (about Terrence Deacon’s article that he and Daniel wrote about), and following my annual lecture on human brain evolution as well as conversations today with a documentary crew from SBS.  So let’s begin the meander with Dr. Watzke’s opening bit on why he thinks humans should be classified as ‘coctivors,’ that is, animals that eat cooked food, rather than ‘omnivores.’

Although I generally liked the talk, I was struck by some things that didn’t ring quite right, including Dr. Watzke’s opening bit about teeth (from the online transcript):
So everyone of you turns to their neighbor please. Turn and face your neighbors. Please, also on the balcony. Smile. Smile. Open the mouths. Smile, friendly. (Laughter) Do you — Do you see any Canine teeth? (Laughter) Count Dracula teeth in the mouths of your neighbors? Of course not. Because our dental anatomy is actually made, not for tearing down raw meat from bones or chewing fibrous leaves for hours. It is made for a diet which is soft, mushy, which is reduced in fibers, which is very easily chewable and digestible. Sounds like fast food, doesn’t it.
Okay, let’s not be pedantic about it, because we know that humans, in fact, do have canines.  Watzke’s point is that we don’t have extended canines, long fangs that we find in most carnivorous mammals or in our primate relatives like chimps or gorillas.
The problem is that the absence of projecting canines in humans is a bit more interesting than a simple ‘eat plants = less canine development.’  In fact, gorillas are completely vegetarian, and the males, especially, have massive canines; chimpanzees eat a very small amount of animal protein (something like 2% of their caloric intake), and they too have formidable canines.  Our cousins don’t have their extended canines because they need them for eating – rather, all the evidence suggests that they need big fangs for fighting, especially the intraspecies brawling among males competing to reproduce.
Teeth of human (left), Ar. ramidus (middle), and chimpanzee (right), all males.
The case of chimpanzee canines is especially intriguing because, with the remains of Ardipithecus ramidus (a species potentially close to the last common ancestor of humans and chimps) now more extensively discussed, we know very old hominids didn’t have pronounced canines.  If the remains are indicative of our common ancestor with chimpanzees (and there’s no guarantee of that), then it’s not so much human canine shrinkage alone that’s the recent evolutionary development, but also the re-development of chimpanzee canines, probably due to sexual competition.
Even with all the possible points of disagreement, the basic point is that human teeth are quite small, likely due both to shifts in our patterns of reproduction and sexual selection and to changes in our diet.  Over the last few million years, one argument goes, our ancestors seem to have gotten more and more of their calories out of meat, at the same time that their teeth were getting less and less capable of processing food of all sorts (or, for that matter, of being effectively used as a weapon).
Hungrier and hungrier, with weaker jaws and smaller teeth
As I always remind my students in my lecture on human brain evolution, if big brains are so great, why doesn’t every animal have one? The answer is that big brains also pose certain challenges for an organism (or, if you prefer, ‘mo’ neurons, mo’ problums’).
The first and most obvious is that brains are hungry organs, devouring energy fast and relentlessly, especially as they grow.  The statistic that we frequently throw around is that the brain constitutes 2% of human body mass and consumes 25% of the energy used by the body; or, to put it another way, brain tissue consumes nine times as many calories as muscle at rest.  So, if evolution is going to grow the brain, an organism is going to have to come up with a lot of energy – a smaller brain means that an animal can both eat less and be more likely to survive a calorie drought.
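As a quick back-of-the-envelope check (this is only an implied average derived from the 2% and 25% figures quoted above, not an independent measurement), the relative energy density of brain tissue at rest works out to:

    \frac{\text{brain's share of resting energy}}{\text{brain's share of body mass}} \;=\; \frac{0.25}{0.02} \;=\; 12.5

That is, gram for gram, resting brain tissue draws roughly twelve to thirteen times the whole-body average.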
But hominin brain growth also presents a few other problems, which sometimes get underestimated in accounts of our species’ distinctiveness.  For example, natural selection had to solve a problem of excess heat, especially if big-brained hominids were going to do things that their big brains should tell them are ill advised, like run around in the hot sun. As your brain chews up energy, it generates heat, and the brain can overheat, a serious problem with sunstroke.  The good news is that somewhere along the line our hominin ancestors picked up a number of adaptations that made them very good at shedding heat, from a low-fur epidermis and facility to produce copious sweat to a system of veins that run from the brain, shunting away heat (for a much more extensive discussion, see Sharma, ed. 2007, or the work of anthropologist Dean Falk, including her 1990 article in BBS laying out the ‘radiator theory’).
Not only is our brain hungry and hot; our enlarged cranium also poses some distinctive challenges for our mothers, especially as bipedalism has narrowed the birth canal by slowly making the pelvis more and more basket-shaped (bringing the hips under our centre of gravity).  The ‘obstetrical dilemma,’ the narrowing of the birth canal at the same time that the human brain was enlarging, led to a bit of a brain-birth canal logjam, if you’ll pardon the groan-worthy pun (see Rosenberg and Trevathan 1995).
Although frequently presented as a significant constraint on brain growth (and I’m sure all mother... Read more »

Rosenberg, K., & Trevathan, W. (1995) Bipedalism and human birth: The obstetrical dilemma revisited. Evolutionary Anthropology: Issues, News, and Reviews, 4(5), 161-168. DOI: 10.1002/evan.1360040506

Suwa, G., Kono, R., Simpson, S., Asfaw, B., Lovejoy, C., & White, T. (2009) Paleobiological Implications of the Ardipithecus ramidus Dentition. Science, 326(5949), 69-69. DOI: 10.1126/science.1175824  

Wrangham, R. (2003) 'Cooking as a biological trait'. Comparative Biochemistry and Physiology Part A: Molecular & Integrative Physiology, 136(1), 35-46. DOI: 10.1016/S1095-6433(03)00020-5

  • October 27, 2010
  • 09:25 PM
  • 1,070 views

Archaeologists Unearth a "Vampire" Grave

by Dan Bailey in Smells Like Science

In the 1990s, archaeologists uncovered a grave in Connecticut dating from the mid-1800s that provided the first physical evidence of a historical belief in vampires in New England.... Read more »

  • October 27, 2010
  • 02:57 PM
  • 1,150 views

Where did all these primates come from? – Fossil teeth may hint at an Asian origin for anthropoid primates

by Laelaps in Laelaps

Where did anthropoid primates come from? This question has not been an easy one to answer. Since the early days of paleontology, various experts have proposed a slew of scenarios for the origins of the primate group which today contains monkeys and apes (including us), with different experts favoring various combinations of places, times, and [...]... Read more »

Bajpai, S., Kay, R., Williams, B., Das, D., Kapur, V., & Tiwari, B. (2008) The oldest Asian record of Anthropoidea. Proceedings of the National Academy of Sciences, 105(32), 11093-11098. DOI: 10.1073/pnas.0804159105  

K. Christopher Beard. (2006) Mammalian Biogeography and Anthropoid Origins . Primate Biogeography, 439-467. info:/10.1007/0-387-31710-4_15

Beard, K., Marivaux, L., Chaimanee, Y., Jaeger, J., Marandat, B., Tafforeau, P., Soe, A., Tun, S., & Kyaw, A. (2009) A new primate from the Eocene Pondaung Formation of Myanmar and the monophyly of Burmese amphipithecids. Proceedings of the Royal Society B: Biological Sciences, 276(1671), 3285-3294. DOI: 10.1098/rspb.2009.0836  

Jaeger, J., Beard, K., Chaimanee, Y., Salem, M., Benammi, M., Hlal, O., Coster, P., Bilal, A., Duringer, P., Schuster, M.... (2010) Late middle Eocene epoch of Libya yields earliest known radiation of African anthropoids. Nature, 467(7319), 1095-1098. DOI: 10.1038/nature09425  

  • October 25, 2010
  • 12:50 PM
  • 3,192 views

Anatomy of a Superstition: When Your Eye "Jumps"

by Krystal D'Costa in Anthropology in Practice


[Image caption] The eye sees all, and can possibly warn of danger in Trinidadian folklore. Credit: Wikipedia
Trinidadians have a rich collection of superstitions, many of which found their way to the island via colonialism. These beliefs reflect the ways ideas and explanations have been blended here—and elsewhere—in the face of globalization. There is one, however, that I have grown up with that seems unique to Trinidadians. It concerns an involuntary eye spasm known colloquially as when your eye "jumps." The superstition has multiple parts and meanings depending on which eye is affected:

  • If your right eye jumps, you are going to hear good news. If your left eye jumps, you are going to hear bad news (Roberts 1927: 161).
  • If your right eye jumps, someone is speaking well of you. If your left eye jumps, someone is saying bad things about you.* (If you think of the names of people you know, when you name the right person—who is speaking badly about you—your eye will stop jumping.) (Roberts 1927: 161)
  • If your right eye jumps, you'll see someone you haven't seen in a long time.
  • If your left eye jumps, a loved one/friend is doing something behind your back.
  • If your left eye jumps, a loved one/friend may be in trouble.

*There seems to be some confusion with this particular version of the superstition, since I have also seen/heard it reversed (i.e., right eye = someone speaking ill of you). It is included here in the parallel form to match the other suggestions.
There are additional variations on this theme, but all emphasize the dichotomy between the left and right eye in relation to bad versus good events. The eye has long figured in superstitious lore—for example, the idea of the "evil eye" may date to 600 BC, and since that only marks the earliest documented reference to the belief, it may in fact be older still. As a source of vision, awareness, and knowledge, it is no surprise that beliefs relating to the eye tend to suggest a forewarning.
Superstitions are often met with a certain degree of scorn. Rational folks are often quick to dismiss them. But still they lurk in the background until the opportunity arrives when they can suggest a potential "What if?" Historically, when discussing superstitions scholars (e.g., Matthews 1945; Roberts 1927) have categorized them as "primitive" beliefs of "simple" people, and overlooked the insights they may offer on the way people view the world. While many superstitions have religious or supernatural undertones, many others offer interesting observations on life in a particular location. And if you dig deep enough, there are sometimes suggestive details that can explain why some superstitions persist.
For example, in a collection of West Indian beliefs and superstitions, Basil Matthews (1945) discusses the Caniteel in Trinidad: a particular hour on a particular day between July 15th and August 15th during which any plants planted will fail to grow (141). No one knows for sure when the day or the hour actually occurs. What they do know is that during this period worms eat the heart of the plant. Trinidadian farmers view this period as a bad time. Many avoid planting on July 15th, and then plant on alternate days hoping to avoid the Caniteel. Some avoid planting altogether during this period. The farmers have connected a real event (the activity of the worms) with a superstition (don't plant, this period is bad).
The same may be the case for eye jumping. The phenomenon is largely harmless, but appears to be poorly understood by science. It is officially classified as benign essential blepharospasm (BEB), a phenomenon that can be disruptive in severe cases, causing functional blindness:

The condition is progressive with the early symptoms being irritation and discomfort in the eyelids causing an increase in the blink rate, which can progress over time to frequent, forceful involuntary and uncontrollable closure of the eyelids (Kowal et al. 1998: 123).

The condition is idiopathic, but researchers believe that it may be linked in part to fatigue, stress, eyestrain, and/or caffeine (Robb-Nicholson 2010: 8). In a health column in the Harvard Women's Health Watch, Dr. Celeste Robb-Nicholson advises a writer on ways to cope with "eyelid twitching":

There are several things you can do to ease the spasms. Close the eye and apply a warm compress—or try pulling gently on the lid. Get more sleep, and reduce your caffeine and alcohol intake. If the twitching occurs while you're reading or using a computer, relax your eyes occasionally by focusing on something in the distance. If your eyes are dry or irritated, use lubricant eyedrops (8).

Even in the less severe form, eye jumping can still be disruptive (or at the very least, irritating), marked by a fluttering sensation in the eyelid, twitching of the eye, or the repeated closing and reopening of the eyelid. And it can last anywhere from minutes to hours, or can occur intermittently over the course of several days. Perhaps its disruptiveness has contributed to its role in superstition. Let's consider the following:

  • Eye jumping may be caused by stress in some form.
  • Because it is disruptive, it is memorable.
  • When a negative or otherwise anticipated event occurs following an eye jumping episode, it can be easily connected to eye jumping because the phenomenon sticks in the mind of the afflicted.
Since Trinidadians appear to follow the traditional notions of right = good, left = bad, it may be that they are selecting events following experiences of stress that match the eye afflicted by BEB. So, for example, if they are anticipating speaking to a relative who has missed a telephone call, the anticipation may turn to worry, and they may experience BEB as a stress response. When the relative finally calls, the afflicted person may recall that their eye jumped and connect the two. This may also explain the fluidity in assigning events to the eyes. While Trinis largely follow the right/left dichotomy, they have been known to blur the line and simply say "My eye was jumping." It may also be that events that can be tied to the afflicted eye are more readily remembered. As with the Caniteel, Trinidadians have connected a real event (BEB) with a superstition (the eye afflicted by BEB can predict or warn of events).
Superstitions, however you view them, can be a source of comfort. They offer a way to take control of a situation and, in this case, to reaffirm ties—note that the eye jumping superstition is connected to loved ones. They can become deeply ingrained. When my eye jumps, I'm inclined to tell myself quite seriously to just "quit it." Meaning, quit worrying about it. I know that my stress levels are generally elevated when my eye jumps, but invariably, when the phenomenon persists, it opens the door for "What if." The event in itself also adds to my stress levels, creating a nagging sensation of worry that I refuse to openly acknowledge but seem to acknowledge in small ways. For example, my behavior changes slightly. I might call loved ones more frequently. And if I happen to learn of an event that occurred to one of them in this period, I find myself wondering about which eye was afflicted. Superstitions are persistent. It's one of the reasons they've survived time and travel.
Do you have a family superstition that crops up from time to time? Something your grandmother or mother said or did continuously? Something that you yourself came to believe for no explicable reason? With Halloween just around the corner, let's open the vaults and see what's lurking in the shadows of our minds.

Cited: ... Read more »

Kowal L, Davies R, & Kiely PM. (1998) Facial muscle spasms: an Australian study. Australian and New Zealand journal of ophthalmology, 26(2), 123-8. PMID: 9630292  

Matthews, B. (1945) West Indian Beliefs and Superstitions. The American Catholic Sociological Review, 6(3), 139. DOI: 10.2307/3707527  

Roberts, H. (1927) Louisiana Superstitions. The Journal of American Folklore, 40(156), 144. DOI: 10.2307/534893  

  • October 25, 2010
  • 11:12 AM
  • 813 views

How does an anthropological perspective contribute to our understanding of birth control? Part I

by Kate Clancy in Context & Variation

This is a heavily revised version of a series I wrote for my LEE Blog on biological anthropology and hormonal contraception. This post deals with contraindications for hormonal contraceptives.... Read more »

Burkman RT, Fisher AC, Wan GJ, Barnowski CE, & LaGuardia KD. (2009) Association between efficacy and body weight or body mass index for two low-dose oral contraceptives. Contraception, 79(6), 424-427.

Morin-Papunen L, Martikainen H, McCarthy MI, Franks S, Sovio U, Hartikainen AL, Ruokonen A, Leinonen M, Laitinen J, Järvelin MR.... (2008) Comparison of metabolic and inflammatory outcomes in women who used oral contraceptives and the levonorgestrel-releasing intrauterine device in a general population. American journal of obstetrics and gynecology, 199(5). PMID: 18533124

  • October 24, 2010
  • 10:16 PM
  • 557 views

Mesa Verde Water Control

by teofilo in Gambler's House

I’ve previously discussed water control technologies at Chaco, where they were particularly important given the extreme aridity of that area even by Southwestern standards.  There is abundant evidence, however, that water control was a widespread activity throughout the ancient Southwest, even in areas with more reliable water sources.  The best-studied water control systems have been [...]... Read more »

  • October 20, 2010
  • 08:51 PM
  • 1,201 views

Facing Death and Uncovering the Past

by Dan Bailey in Smells Like Science

A real-life Indiana Jones-style adventure story (with fewer whips) about a priceless archaeological discovery deep in the Guatemalan jungle.... Read more »

Saturno WA, Stuart D, & Beltrán B. (2006) Early Maya writing at San Bartolo, Guatemala. Science (New York, N.Y.), 311(5765), 1281-3. PMID: 16400112  

  • October 19, 2010
  • 08:09 AM
  • 1,013 views

Mashing up banana wild relatives

by Jeremy in Agricultural Biodiversity Weblog

Over at the Vaviblog is a detailed discussion (though not nearly as detailed as the paper) of a new paper outlining a new theory for the origin of the cultivated banana. Edible bananas have very few seeds. Wild bananas are packed with seeds; there’s almost nothing there to eat. So how did edible bananas come [...]... Read more »

  • October 19, 2010
  • 08:00 AM
  • 864 views

Banana domestication revisited

by Jeremy in The Vaviblog

Edible bananas have very few seeds. Wild bananas are packed with seeds; there’s almost nothing there to eat. So how did edible bananas come to be cultivated? The standard story is that some smart proto-farmer saw a spontaneous mutation and then propagated it vegetatively. Once the plant was growing, additional mutants would also be seen [...]... Read more »

  • October 19, 2010
  • 05:39 AM
  • 717 views

Did cavemen eat bread?

by Razib Khan in Gene Expression


Food is a fraught topic. In How Pleasure Works Paul Bloom alludes to the thesis that while conservatives fixate on sexual purity, liberals fixate on culinary purity. For example, is it organic? What is the sourcing? Is it “authentic”? Obviously one can take issue with this characterization, especially its general class inflection (large swaths of [...]... Read more »

Anna Revedin, Biancamaria Aranguren, Roberto Becattini, Laura Longo, Emanuele Marconi, Marta Mariotti Lippi, Natalia Skakun, Andrey Sinitsyn, Elena Spiridonova, & Jiří Svoboda. (2010) Thirty thousand-year-old evidence of plant food processing. PNAS. info:/10.1073/pnas.1006993107

  • October 19, 2010
  • 03:07 AM
  • 395 views

Wired to be Social

by Glialdance in Glial Dance

Humans are a social species: we interact with other people – aided by language – and exchange information on a daily basis. The effects of social isolation have been shown, and are predicted, to be very severe and “de-humanising” in many cases, with a long list of adverse effects on cognitive abilities and emotional stability. The question often posed when [...]... Read more »

Umberto Castiello, Cristina Becchio, Stefania Zoia, Cristian Nelini, Luisa Sartori, Laura Blason, Giuseppina D’Ottavio, Maria Bulgheroni, & Vittorio Gallese. (2010) Wired to be Social: the ontogeny of human interaction. PLoS ONE. info:/

  • October 18, 2010
  • 02:34 PM
  • 1,067 views

There are more things in heaven and earth, cobber, than are dreamt of in your philosophy

by Alun in AlunSalt

Studying astronomy in culture should be simple. There’s only so much that is visible to the naked eye, and it follows predictable patterns. Modern astronomy means that we can reconstruct what was visible anywhere in the world in human history, within certain bounds of error. If we know what happens when, then studying a culture... Read more »

Clarke, P.A. (2007) An Overview of Australian Aboriginal Ethnoastronomy. Archaeoastronomy: The Journal of Astronomy in Culture, 39-58. info:/

  • October 18, 2010
  • 10:04 AM
  • 590 views

Two DonorsChoose projects you must support: Girls are good at math, and Technology tools while pregnant

by Kate Clancy in Context & Variation

A plea to fund DonorsChoose projects, highlighting research on sexism in mathematics instruction.... Read more »

Alessandri SM, & Lewis M. (1993) Parental evaluation and its relation to shame and pride in young children. Sex Roles, 335-343. info:/

Fennema, E., Peterson, P., Carpenter, T., & Lubinski, C. (1990) Teachers' attributions and beliefs about girls, boys, and mathematics. Educational Studies in Mathematics, 21(1), 55-69. DOI: 10.1007/BF00311015

  • October 17, 2010
  • 07:20 AM
  • 1,772 views

settling the black death debate with ancient dna

by Greg Fish in weird things

While for most of us, it tends to be a given that the culprit behind the scourge known as the Black Death was the bubonic plague, a number of historians weren’t so sure. The reports from the time talked about the kinds of symptoms we’d expect from a bizarre hybrid of bubonic and hemorrhagic plagues, [...]... Read more »

Haensch, S., Bianucci, R., Signoli, M., Rajerison, M., Schultz, M., Kacki, S., Vermunt, M., Weston, D., Hurst, D., Achtman, M.... (2010) Distinct Clones of Yersinia pestis Caused the Black Death. PLoS Pathogens, 6(10). DOI: 10.1371/journal.ppat.1001134  

  • October 17, 2010
  • 02:00 AM
  • 981 views

More aDNA from the Black Death

by Michelle Ziegler in Contagions

    An international team has confirmed Yersinia pestis biomolecules in Black Death era* ‘plague pits’ (Haensch et al., 2010). Ancient DNA (aDNA) specific for Yersinia pestis and the Yersinial F1 antigen were discovered in skeletons from recognized plague pits in the Netherlands, England, and France. German and Italian skeletons tested positive for Y. pestis [...]... Read more »

Haensch, S., Bianucci, R., Signoli, M., Rajerison, M., Schultz, M., Kacki, S., Vermunt, M., Weston, D., Hurst, D., Achtman, M., Carniel, E., and Bramanti, B. (2010) Distinct clones of Yersinia pestis caused the Black Death. PLoS Pathogens, 6(10). info:/

Pusch CM, Rahalison L, Blin N, Nicholson GJ, & Czarnetzki A. (2004) Yersinial F1 antigen and the cause of Black Death. The Lancet infectious diseases, 4(8), 484-5. PMID: 15288817  

  • October 16, 2010
  • 05:21 AM
  • 2,129 views

The @#$% 2010 Ig Nobel Peace Prize: Pain files 1

by gregdowney in Neuroanthropology


The 2010 Ig Nobel Prizes were awarded recently by the Annals of Improbable Research, and a paper I read a while ago and wanted to comment on won the Ig Nobel for Peace! (Comment on, by the way, not because I thought it was Ig Nobel-esque, but because it was actually relevant to my work — what does that say about my research!?) Congratulations to Richard Stephens, John Atkins, and Andrew Kingston for the prize, awarded for their paper, ‘Swearing as a Response to Pain,’ in Neuroreport. I’ll blog specifically about their article below the fold.
Apparently, recipients didn’t get to try on bra-gas masks this year like at the 2009 Ig Nobels, but I’m sure much fun was had. Paper plane throwing, operatic songs about tooth bacteria, bat porn (well, they didn’t actually get to show the bat porn…), a little girl telling award winners she was bored to stop them from giving overly long acceptance speeches… sounds like something for the whole family.
There are a number of great-looking winners, including a brilliant engineering team who used remote-controlled toy helicopters to breathalyze whales to better sample their snot (yeah, ‘why didn’t *I* think of that?!,’ you’re saying…), a game-theory demonstration that hierarchical firms were better off simply promoting people randomly, a demonstration that oil and water in fact do mix together (the award was shared with BP for demonstrating the same effect on a much larger scale), and a too-long-ignored, forty-year-old contribution to epidemiology which pointed out the dangers of beards on microbiologists.
For more on all the papers that won, check out Christie Wilcox’s prize-by-prize rundown with links or a great discussion by Jeff Hecht at New Scientist. If you wanted to watch the ceremony, it was streamed, but I haven’t found it archived yet; I’m sure it will eventually be available at the Annals of Improbable Research YouTube channel.

Pain and swearing
In their paper in Neuroreport, Richard Stephens, John Atkins, and Andrew Kingston discuss research in which they tested whether swearing really helped pain resistance. They hypothesized, like good emotionally repressive scientists, that swearing was a maladaptive response to pain, exacerbating the pain by generating additional ‘negative thoughts.’ They first asked experimental subjects to list five words that people might say if they struck their thumb with a hammer (controls were asked for five descriptors for a table). Subjects then did a cold pressor test, in which they held an open hand in iced water; both groups were instructed to repeat a word from their list, seeing how long they could keep their hands immersed.
The test subjects who swore kept their hands in the water longer than the folks who muttered about tables, and reported less perceived pain from the cold pressor than did the control subjects, with some variation: ‘Although both sexes experienced a reduction in perceived pain in the swearing condition, females did so to a greater extent’ (1057). The table below shows some of the key results: not only did the swearing subjects hold their hands in longer, they experienced less pain according to their reports, and their heart rates spiked higher.
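To make that kind of comparison concrete, here is a minimal sketch in Python of how one might test whether immersion times differ between a swearing condition and a neutral-word condition in a repeated-measures design like the one described. The numbers below are invented for illustration only; they are not the paper's data, and the paired t-test is an assumed choice of analysis, not necessarily the one Stephens et al. used.

    # Hypothetical cold-pressor latencies (seconds) for the same ten
    # participants under two word conditions; all values are made up.
    import numpy as np
    from scipy import stats

    swearing = np.array([118, 95, 140, 102, 87, 160, 125, 99, 134, 110])
    neutral = np.array([84, 70, 121, 95, 60, 130, 101, 78, 120, 92])

    # Paired (within-subject) t-test: did the same people keep their
    # hands immersed longer while swearing than while repeating a
    # neutral word?
    t_stat, p_value = stats.ttest_rel(swearing, neutral)
    mean_gain = (swearing - neutral).mean()

    print(f"mean extra immersion time while swearing: {mean_gain:.1f} s")
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

A fuller analysis would also compare perceived pain ratings and the change in heart rate across conditions, as the authors report.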

The data not only disproved the researchers’ hypothesis (that swearing was maladaptive); they also revealed interesting sex differences and a link to increased heart rate, which I think is especially interesting. Stephens and co-authors suggest that the increased heart rate while swearing may indicate a heightened emotional response, like fear or aggression, that down-regulates the experience of pain through ‘classic fight or flight mechanisms’ (1060).
As the researchers conclude:
This study has shown that, under certain conditions, swearing produces a hypoalgesic effect. Swearing may have induced a fight or flight response and we speculate on a role for aggression in this. In addition swearing nullified the link between fear of pain and pain perception. (1060)
Minor issues
Of course, the research did confront the occasional outlier. After they asked all the experimental subjects to list the words a person would say when hit on the thumb with a hammer, ‘One participant was excluded because none of their suggested words were swear words’ (1057). Unfortunately, no footnote reveals what precisely this outlier thought a person would say if hit on the thumb with a hammer. (‘Rats’? ‘Fudge’? ‘Sugar’? ‘Fiddlesticks’?)
Trying to create a careful control for the swearing/non-swearing conditions also created some issues, forcing a control that meant that some of the cathartic value of swearing had to be sacrificed: ‘Participants were asked to maintain a similar pace and volume of word recital across conditions’ (1056). Subjects were also required to repeat the same word over and over again — anyone who swears like a stuck sailor knows that part of the relief comes from the stream of changing epithets shouted at the top of one’s lungs, the profane creativity of the novel compound vulgarity, ‘@#$%! *&^%-*&^%! $@*^^&^$#@! &^%$#@@!’ (although sometimes you just can’t manage anything more interesting than repeating the same choice epithet).
Vocal volume especially would seem to be an issue if the researchers are postulating that the analgesic mechanism was an effect of aggression downward mediating perceptions of pain. I’m not sure that swearing relatively quietly and at a measured pace would actually provoke a strong emotional dynamic, no matter what sort of dynamic you’re expecting.
Thinking through the dynamic of vulgar analgesia
The research is fun, of course, but I find it particularly intriguing because the experiment, in its own way, illuminates some of the neurological complexity underlying pain, including the possibility that top-down mechanisms—like coping techniques, conscious thought, or learned emotional responses—might modulate our experience. As a number of neuropsychologists have argued (and demonstrated empirically), pain involves a constellation of interacting neural mechanisms, some of them more conscious than others, rather than a single ‘pain centre’ (see Apkarian et al., 2005). Donald Price (2000), for example, reviews the relations among pain, feelings of ‘unpleasantness’, and ‘secondary pain affect,’ such as the emotional feelings of long-term worry or ‘suffering’ that may accompany the unpleasantness. As Price writes (2000: 1769):
Psychophysical studies demonstrate that pain sensation and pain unpleasantness represent two distinct dimensions of pain that demonstrate reliably different relations to nociceptive stimulus intensity and are separately influenced by various psychological factors.
One of the clearest examples of the way that a pain stimulus can be, to a limited degree, decoupled from sensations of unpleasantness is that subjects reliably report diminished pain if told that a pain is transitory and will produce no lasting effect, and report greater suffering if they believe that the pain will be long-lasting.
In some cases, the decoupling can produce a quite profound gulf between pain sensation and pain unpleasantness; in a study at a Forward Hospital in World War II, military surgeon Dr. Henry Beecher found that three-quarters of severely wounded men were not in enough pain to ask for pain relief, even when they were reminded that it was available (1946: 99). Beecher was surprised by the result, pointing out that even severe wounds to the chest and head, and broken bones, led only a minority of sufferers to report significant subjective pain.
In addition, the relation between unpleasantness and secondary pain affect – suffering or distress – also varies. A study of ethnic differences in pain perception, for example, conducted by Zborowski (1952), found that different ethnic groups felt varying levels of distress in relation to similar reports of pain, some groups worrying extensively about the future implications of pain and others curtailing expression of suffering. Pric... Read more »

  • October 15, 2010
  • 04:11 AM
  • 750 views

The rise and crash of civilizations

by Razib Khan in Gene Expression

One of the questions of interest in the study of the evolution of culture is whether there is a direction in history in terms of complexity. As I have noted before in the pre-modern era many felt that the direction of history was of decline. That is, the ancients were wise and subtle beyond compare [...]... Read more »

Currie, Thomas E., Greenhill, Simon J., Gray, Russell D., Hasegawa, Toshikazu, & Mace, Ruth. (2010) Rise and fall of political complexity in island South-East Asia and the Pacific. Nature. info:/10.1038/nature09461

  • October 13, 2010
  • 11:05 AM
  • 1,917 views

Seeking Authenticity in Facebook Profiles

by Krystal D'Costa in Anthropology in Practice


I was chatting with a friend who is in the process of job hunting the other day and he told me that he friended a recruiter on Facebook. Perplexed, I asked if he was concerned about the information the recruiter might see. "No," he said. "I'm not really the drunken reverie poster." He also does not use features such as lists to organize contacts and restrict access to parts of his profile.
This exchange suggests to me that the boundaries between online social networks are still in flux. So far, the general suggestion has been that users should use LinkedIn for professional contacts and keep Facebook for personal ones—or make use of the lists feature to set privacy settings accordingly. We've heard the horror stories about the times when friending a supervisor went astray (seriously, Google Facebook fired—there's even a group!), and we're learning that HR is increasingly reviewing the social media profiles of applicants before they're even invited for an interview. Sites like Facebook allow users to craft a personalized image of themselves—does this personalization suggest a more authentic self? And if so, does that make Facebook a more desirable point of contact for a more "complete" view of a person?
Researcher Soraya Mehdizadeh (2010) proposes that sites like Facebook and MySpace have contributed to the rise of narcissistic tendencies. As Facebook has surpassed MySpace in overall general use and has a wider base of potential contacts (e.g., recruiters and applicants, supervisors and employees), I'll focus this discussion on Facebook. In my opinion, it seems to be the more "serious" of the two—at least, it appears to be the more trustworthy for authentic representations, which we'll explore in a bit. First, what does Mehdizadeh mean by narcissism online?
She defines narcissists as individuals who seek superficial relationships with high-status individuals who can contribute to public glory (2010: 358). Online social networking sites encourage these sorts of relationships:

First, this setting offers a gateway for hundreds of shallow relationships (i.e., virtual friends), and emotionally detached communication (i.e., wall posts, comments). While these sites do indeed serve a communicative purpose among friends, colleagues, and family, other registered users can initiate requests to be friends, and one's social network often snowballs rapidly across institutions in this fashion (358).

One way this sort of relationship is achieved is through the presentation of an attractive self—the user must reveal something that encourages the connection. This connection may be emotionally appealing (e.g., a shared history: attending the same high school or college) or physically appealing (e.g., an enticing photo, a pleasant demeanor). The latter seems particularly important once the user moves past first-tier connections and begins to add connections from the second tier (i.e., friend of a friend) and beyond. According to Mehdizadeh, this opens the door for a showing of the "hoped-for possible self," which "emphasizes realistic socially desirable identities an individual would like to establish given the right circumstances" (358).
Facebook is a "nonymous" online environment—it requires the user to reveal herself. Users have to enter their real names, for example, although an increasing number of users are slightly tweaking their names to make themselves unsearchable. But it also requires users to place themselves within a physical network, whether that be an academic circle or a geographic one. And while a small number of users choose not to have a photo, many have some sort of pictorial representation associated with their name. Frequently this is a photograph, although in some cases avatars or other icons may be used; either way, the representation is true in that it represents the user in some way. That is to say, Jane Doe does not use Mary Smith's image as her own. Nonymous environments require a degree of truth. The personal nature of Facebook adds another layer to that sense of truth, which distinguishes the network from the likes of LinkedIn. Users are not free to simply pretend to be someone they are not.
Mehdizadeh correctly states that users are actively engaged in constructing their identity in nonymous settings:

While the nonymity of this environment places constraint on the freedom of the individual identity claims, this setting also enables users to control the information projected about themselves. In particular, users can select attractive photographs and write self descriptions that are self-promoting in an effort to project an enhanced sense of self. Furthermore, Facebook users can receive public feedback on profile features from other users, which can act as a positive regulator of narcissistic esteem (360).

Though she views the interaction between users and their connections as confirming narcissistic behaviors, I want to propose that Facebook relationships also help vet the image projected by calling attention to anomalies. For example, if a user posts a photo showing themselves in a new, unexplained location, someone will usually ask where the user was and what he was doing at the time.
In this sort of setting, it may be possible to expand your network to include business and other professional contacts. Would a drunken reverie poster get a pass here, because such posting would seem out of character? It may mean a more open social web.

Cited: Mehdizadeh S (2010). Self-presentation 2.0: narcissism and self-esteem on Facebook. Cyberpsychology, behavior and social networking, 13 (4), 357-64 PMID: 20712493
... Read more »

Mehdizadeh S. (2010) Self-presentation 2.0: narcissism and self-esteem on Facebook. Cyberpsychology, behavior and social networking, 13(4), 357-64. PMID: 20712493  
