Post List

Anthropology posts

  • November 9, 2010
  • 03:25 AM
  • 716 views

Genes To Brains To Minds To... Murder?

by Neuroskeptic in Neuroskeptic

A group of Italian psychiatrists claim to explain How Neuroscience and Behavioral Genetics Improve Psychiatric Assessment: Report on a Violent Murder Case.

The paper presents the horrific case of a 24-year-old woman from Switzerland who smothered her newborn son to death immediately after giving birth in her boyfriend's apartment. After her arrest, she claimed to have no memory of the event. She had a history of multiple drug abuse, including heroin, from the age of 13. Forensic psychiatrists were asked to assess her case and try to answer the question of whether "there was substantial evidence that the defendant had an irresistible impulse to commit the crime." The paper doesn't discuss the outcome of the trial, but the authors say that in their opinion she exhibits a pattern of "pathological impulsivity, antisocial tendencies, lack of planning...causally linked to the crime, thus providing the basis for an insanity defense."

But that's not all. In the paper, the authors bring neuroscience and genetics into the case in an attempt to provide a more "objective description" of the defendant's mental disease by providing evidence that the disease has "hard" biological bases. This is particularly important given that psychiatric symptoms may be easily faked, as they are mostly based on the defendant's verbal report.

So they scanned her brain, and did DNA tests for five genes which have previously been linked to mental illness, impulsivity, or violent behaviour. What happened? Apparently her brain has "reduced gray matter volume in the left prefrontal cortex" - but that was compared to just 6 healthy control women. You really can't do this kind of analysis on a single subject, anyway.

As for her genes, well, she had genes. On the famous and much-debated 5HTTLPR polymorphism, for example, her genotype was long/short; while it's true that short is generally considered the "bad" genotype, something like 40% of white people, and an even higher proportion of East Asians, carry it (a back-of-envelope calculation below shows just how little that tells us). The situation was similar for the other four genes (STin2 (SCL6A4), rs4680 (COMT), MAOA-uVNTR, DRD4-2/11, for gene geeks).

I've previously posted about cases in which a well-defined disorder of the brain led to criminal behaviour. There was the man who became obsessed with child pornography following surgical removal of a tumour in his right temporal lobe. There are the people who show "sociopathic" behaviour following fronto-temporal degeneration.

However, this woman's brain was basically "normal", at least as far as a basic MRI scan could determine. All the pieces were there. Her genotype was also normal in that lots of normal people carry the same genes; it's not (as far as we know) that she has a rare genetic mutation like Brunner syndrome, in which an important gene is entirely missing. So I don't think neurobiology has much to add to this sad story.*

We're willing to excuse perpetrators when there's a straightforward "biological cause" for their criminal behaviour: it's not their fault, they're ill. In all other cases, we assign blame: biology is a valid excuse, but nothing else is.

There seems to be a basic difference between the way in which we think about "biological" as opposed to "environmental" causes of behaviour. This is related, I think, to the Seductive Allure of Neuroscience Explanations and our fascination with brain scans that "prove that something is in the brain".
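To put that genotype point in numbers, here is a minimal Bayesian sketch. Only the ~40% carrier frequency comes from the post; the base rate and the case-carrier rate below are invented purely for illustration.

    # Minimal Bayes sketch: how much does a "risk" genotype carried by ~40%
    # of the general population shift the odds of anything? The 40% figure
    # is from the post; every other number is an invented illustration.

    def posterior(prior, p_genotype_in_cases, p_genotype_in_noncases):
        """Update a prior probability on a single genotype observation."""
        likelihood_ratio = p_genotype_in_cases / p_genotype_in_noncases
        prior_odds = prior / (1 - prior)
        post_odds = prior_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    prior = 0.001       # assumed base rate of the behaviour in question
    in_cases = 0.55     # assumed carrier rate among cases (generous)
    in_noncases = 0.40  # ~40% carrier rate in the general population

    print(round(posterior(prior, in_cases, in_noncases), 5))  # 0.00137

With a likelihood ratio of barely 1.4, the genotype leaves the posterior almost exactly at the base rate, which is the blogger's point in statistical dress.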
But when you start to think about it, it becomes less and less clear that this distinction works.

A person's family, social and economic background is the strongest known predictor of criminality. Guys from stable, affluent families rarely mug people; some men from poor, single-parent backgrounds do. But muggers don't choose to be born into that life any more than the child-porn addict chose to have brain cancer.

Indeed, the mugger's situation is a more direct cause of his behaviour than a brain tumour. It's not hard to see how a mugger becomes, specifically, a mugger: because they've grown up with role-models who do that; because their friends do it or at least condone it; because it's the easiest way for them to make money.

But it's less obvious how brain damage by itself could cause someone to seek child porn. There's no child porn nucleus in the brain. Presumably, what it does is to remove the person's capacity for self-control, so they can't stop themselves from doing it. This fits with the fact that people who show criminal behaviour after brain lesions often start to eat and have (non-criminal) sex uncontrollably as well. But that raises the question of why they want to do it in the first place. Were they, in some sense, a pedophile all along? If so, can we blame them for that?

Rigoni D, Pellegrini S, Mariotti V, Cozza A, Mechelli A, Ferrara SD, Pietrini P, & Sartori G (2010). How neuroscience and behavioral genetics improve psychiatric assessment: report on a violent murder case. Frontiers in behavioral neuroscience, 4 PMID: 21031162... Read more »

Rigoni D, Pellegrini S, Mariotti V, Cozza A, Mechelli A, Ferrara SD, Pietrini P, & Sartori G. (2010) How neuroscience and behavioral genetics improve psychiatric assessment: report on a violent murder case. Frontiers in Behavioral Neuroscience, 4:160. PMID: 21031162

  • November 8, 2010
  • 09:26 AM
  • 2,001 views

Fan Identity and Team Choice

by Krystal D'Costa in Anthropology in Practice

How does one become a fan? Choose an allegiance? Decide that you’re going to wear bright green, or purple and gold, or paint your face orange and black? In many cases, these allegiances are decided for us—handed down via familial loyalties or decided by geographic boundaries. I raised this question on Twitter a few weeks ago, and the results all indicated that team alliance is linked to one’s point-of-entry into fandom: if you begin watching Team A and learning about the sport via Team A, and your network is tied to Team A, then you’re likely to become a fan of Team A. And like all habits, longstanding fan ties are difficult to break.
But is it acceptable to support Team B if you live in Team A territory? Particularly if Teams A and B are rivals? Initial ties are important in this scenario. It’s fine if you move to the Midwest from Massachusetts and want to continue to support a New England team—you’re maintaining loyalty to your geographic origins, and that’s totally acceptable. There’s a reason for you to break with the group. But in the absence of relocation, can you support a team with no apparent ties to the location or network to which you belong?
S is a HUGE New England Patriots football fan. It runs counter to our network where the New York Giants and the New York Jets reign supreme—and an allegiance to either would apparently be preferable to siding with the evil Coach Belichick and his platoon of Patriots. He’s a Met fan in accordance with the reasons given for team attachment by others: he comes from a line of Mets fans, and was raised in close proximity to the former Shea Stadium. There is both a network connection and a geographic connection that ties him to this baseball team, but his football allegiance has raised more than a few eyebrows and subjected him to taunts and criticism from colleagues, friends, and family alike. In football, he’s a displaced fan.
Michael Miller argues that fandom is an outlet for expression that may be lacking in the fan’s day-to-day life (1997: 125). In these contests where there must be a winner, the game offers a finiteness that often is not attainable in the norms of the average fan’s life: workers do not “win” at the end of eight hours, and in relationships, there are no measures of “familial performance” (1997: 124). Furthermore, Miller argues:

The joy and beauty of being a fan ultimately derives from the fact that his/her allegiance can never be effectively challenged. A sports fan is never challenged for holding his/her views as he/she might be for being a Democrat, Republican, Liberal, Conservative, or Capitalist. These latter identifications are supposed to represent deliberate choices often reached through intellectual inquiry, investigation, thought, and decision. This is not the case with the fan, whose very being is characterized by an emotional attachment that cannot be rationalized. The fan is never required to justify his/her “faith” in a player or team (1997: 126).

But this is only true when you fit with the prevalent group—unless you have good reason (i.e., relocation) for breaking with the norms. S is constantly questioned and taunted when the Patriots lose. Which I suppose is to be expected in any town that has a deeply rooted sports tradition.


Pat Patriot, logo for the NE Patriots
from 1961 to 1992. © NE Patriots
S became a fan of the Pats when he was about eight years old, well before their string of Super Bowl wins in the 2000s. When asked today why he supports the team, he invariably replies, “It’s the Patriots!” The term to him is filled with nationalist imagery—“Aren’t we patriots?” is a favorite counter-question he likes to pitch. He believes they should be America’s team. He also says that as an eight-year-old he was swayed by the logo and the colors, but the Giants, a local team, share similar colors, so by this argument, were it not for the logo, he should have favored the Giants. But S was actually introduced to football outside of the normal familial team alliances, which could help explain his out-group association: his father, though a Giants fan, watched a lot of 49ers football by way of introducing him to the sport, and those early years of fandom must have been immensely influential. This experience may have allowed the normal bonds of fandom to be bent.
Over time athletes and teams come to represent the public they play for, and fans believe that they can sway the outcome of these matches through their actions. It’s why we don the gear, paint our faces, and persevere through times of loss. Sports are one means by which we leave our mark on the larger world—and this “we” includes the spectators, the fans who are participants in their own right. Sociologists Raymond Schmitt and Wilbert Leonard II (1986) raise the concept of the “postself” as a means of explaining this belief. The postself is “the presentation of his or her self in history” (1088). The idea is that our actions and choices—the teams we support and their performance—will impact how we are remembered.
The postself, Schmitt and Leonard argue, drives athletes, but because sports is a social experience, the postself of the athletes can possibly be extended to the participants:

Although we did not find unequivocal evidence that fans identified with the athlete’s postself, this does remain a distinct possibility. Caughey has emphasized the extent to which American fans identify with various types of media figures … “Through their simple connections with sport teams, the personal images of fans are at stake when their teams take the field. The team’s victories and defeats are reacted to as personal successes and failures.” Another investigation found that 28% of 1252 adult Americans said that they sometimes, often, or always fantasize that they are the competing athlete when watching their favorite sport (1099).

Support of a particular team allows us to become a part of that team’s victories and losses—our histories become intertwined. I’ll never forget Endy Chavez’s catch as a Met that kept hope alive in the NLCS in 2006. It was the catch heard ‘round the world. With so much uncertainty in our lives and in the world, perhaps these are the small ways in which we author our history and identity with some degree of control.
S has chosen a team whose logo and colors carry a message about him and his beliefs. It's no different than his decision to cheer for the Mets. In the latter instance, however, he has been handed a prepackaged view of what the team represents to the group and has adopted those views as his own. In the former, he has created his own representation. In both cases, these teams constitute his personal history, and will comprise his postself. And since sports are a large part of his life, his participation is something that will figure prominently in his biography, though it runs counter to the expectations of his peers.
In a post called Us, Them, and Non-Zero Sumness Patrick Clarkin did a fantastic job a few weeks ago analyzing intergroup conflict in sports. Drawing on Muzafer Sherif’s 1954 experiment, Clarkin leads the reader through a discussion on the way competition helps to exacerbate conditions of otherness. In situations where there must be a victor, such as in sports, a zero-sum condition emerges where the success of one group necessitates the failure of another group. In this setting, rivalries and conflicts emerge as a means of achieving the goa... Read more »

Miller, Michael. (1997) American Football: The Rationalization of the Irrational. International Journal of Politics, Culture, and Society, 11(1), 101-127.

Schmitt, R., & Leonard II, W. (1986) Immortalizing the Self Through Sport. American Journal of Sociology, 91(5), 1088. DOI: 10.1086/228387

  • November 7, 2010
  • 07:49 AM
  • 1,292 views

Life in the dark

by gregdowney in Neuroanthropology

My wife, along with her many other jobs – paid and unpaid – is the local director of a campus exchange program that brings US students to Wollongong, New South Wales.  Because of her background in outdoor education and adventure therapy, she does a great job taking visiting Yanks on weekend activities that get the students to see a side of life in Australia that they might not otherwise see.  From Mystery Bay on the South Coast, to Mount Guluga with an Aboriginal guide, to abseiling (rappelling) in the Blue Mountains, to surf lessons at Seven Mile Beach, I think she does a great job, and I frequently tag along to help and enjoy being reminded of the distinctiveness of my adopted home.
Abdulai Abubakari holds his infant child, Fakia. (Peter DiCampo/VII Agency)
Invariably, either at the beach or in the Blue Mountains, at night, students will confront a clear, dark Australian sky, staggered at just how many stars fill the darkness from horizon to horizon. I’ve seen the US students – well, not all of them get into it – just stand, necks craned backwards, and stare.  What they thought was darkness was actually full of innumerable points of light.
I’m sympathetic because I had a similar experience one clear night in the Chapada Diamantina (the Diamond Plateau) in Brazil, when I couldn’t believe how, given real darkness, desert-like humidity, and clear, pollution-free air, the sky was crowded with sources of light, just smeared with stars.  For the first time, I felt like I understood the name, the ‘Milky Way,’ because I could see the uninterrupted blur toward the centre of our galaxy.
I was reminded of my experience with seeing stars, as if for the first time, and the reactions of the American students when I stumbled across the photos of Peter DiCampo (click here for Peter’s website), an American freelance photographer and former member of the Peace Corps who volunteered in the village of Voggu in rural Ghana.  His photo essay, Full Frame: Life without lights, is up at Global Post, an online American newspaper launched at the start of 2009.  His beautiful photos of life by flashlight, candle and gaslight capture the atmosphere in this part of Ghana without electricity, and got me to thinking about artificial light and the way the sensory environment affects human development (additional photos at Peter’s personal website, including photos from darkness in Kurdistan).


Dark photos as activism
Peter’s photos are a form of social activism as well as both photojournalism and art.  As he describes, the images seek to convey life in Voggu concretely, drawing attention to the villagers’ condition without focusing entirely on deprivation or simply recapitulating familiar visual stereotypes (and I think he’s quite successful):
The villagers of Voggu are among the 1.6 billion people worldwide who live without electricity.
I had a simple plan: to photograph only with the light available, so that the reader can see only what the subjects are able to see….
I have no desire to contribute to a stereotypical view of Africa, presenting people as miserable and helpless — but I have every desire to use my photographs toward humanitarian means. How to reconcile the two?
The photographs – of night markets and kids reading the Qur’an and actors shooting a movie scene, as well as flashlight-lit portraits – don’t just bring us to a different geographical locale, but to a profoundly different sensory reality (as do the photos from Kurdistan that appear on his own website).  For most Americans and Australians, I suspect, living truly in the dark, with only tiny patches of light, would be a rarity.
Studies of nightglow, the light reflected back by a humid or polluted atmosphere from ground-level electric lighting, suggest that large portions of Europe, North America, and most urban areas are constantly swathed in low levels of ambient glow.  Nightglow effectively banishes night, shifting the daily cycle for everyone, and everything, living in these areas.
The effects of light at night
Navara and Nelson (2007), in the Journal of Pineal Research, review the diverse effects of artificial light on biological systems.  They discuss the extensive research on the negative effects of ambient nighttime light on animals, including disruptions to reproduction, migration, foraging behaviour and predation.  I won’t discuss these effects in any detail, but one of the sadder examples is that hatchling sea turtles often use the contrast between dark bush behind the beach and lighter horizon over the water to orient themselves when they are born.  Too much light on the inland horizon can confuse them so that they cannot find their way to water before predators get them.
Humans are also affected in a host of ways by nighttime light.  These light-derived conditions are widespread, even pervasive; as Navara and Nelson (2007: 216) review: ‘In 2001, the percentage of the world’s population living under sky brightness higher than baseline levels was 62%, with the percentages of US and European populations exposed to brighter than normal skies lying at 99%’ (Navara and Nelson cite Cinzano and colleagues’ 2001 atlas of the night sky).  For some of these populations, true night is never experienced, for artificial light is consistently brighter than a full moon.
The increasing prevalence of high-intensity artificial light that tends to be blue (rather than incandescent yellow) is especially troubling because light near this wavelength affects the pineal gland, which regulates melatonin production; long-term light exposure correlates with shrinkage of the pineal gland.  Just 39 minutes of incandescent light at night can cause melatonin synthesis to drop by 50% (Navara and Nelson 2007: 217).  As Korkmaz and colleagues (2009: 267) note, elevated melatonin levels correlate with darkness and have been referred to as ‘the chemical expression of darkness.’  Altered or disrupted production of melatonin has neuroendocrine effects on a range of bodily systems, including the metabolism of prolactin, glucocorticoids, adrenocorticotropic hormone, corticotrophin-releasing factor and serotonin.  Long-term exposure to light at night can contribute to chronic sleep deficit, especially through the effects on melatonin, with ‘countless’ other effects, according to Navara and Nelson’s (2007: 217) review.
Over the long term, disruption of the light-dark cycle can affect body composition, contribute to obesity, negatively impact gut efficiency, and otherwise disrupt metabolism, especially the moderation of energy uptake.  These changes have been linked to diabetes, heart disease, and other metabolic problems.  Moreover, chronic exposure to low levels of light at night can lead to oxidative stress, which can damage immune cells and even contribute to higher incidence of cancer and rates of physiological aging (see Navara and Nelson 2007 for research review).
Because of the diverse endocrine disruptions caused by ambient night light, some health advocates argue for a decrease in the number of lights, for a modification of design, or for a shift to using lights which function less in the blue-violet part of the spectrum, which seem to cause the most biological disruption.
The dark in Ghana
But Peter’s photos also struck me because I was fascinated by the way that perception, too, might be altered.  When I emailed Peter to ask his permission to use his photo in this post, I asked him how he felt his perceptions were affected by living in an area without streetlights and neon signs and all the other electricity-based technologies that transform our experiences of night.  He wrote back:
As far as perception or vision – it was obvious to me that I was the clumsiest person in town. My Ghanaian friends, it seemed, could simply see in the dark. Maybe they couldn’t read a book, but they knew who was coming when he or she was still a great distance away, and they didn’t stumble as they walked around at night. I, however, wasn’t there long enough to adjust, apparently!
I also asked Peter how living with dark night affected him more generally, his state of mind and overall experience of the Ghanaian countryside.  His answer highlights a range of issues:
... Read more »

  • November 6, 2010
  • 09:57 AM
  • 800 views

Evidence of an Extinct Tiger Found in Palawan

by bonvito in time travelling

Philip Piper et al. reported the discovery of Panthera tigris on the island of Palawan, Philippines. The team of archaeologists excavating Ille Cave near El Nido found the tiger bones in a “large human-derived animal bone assemblage dating to at least the early 11th millennium BP that included the remains [...]... Read more »

  • November 4, 2010
  • 10:14 AM
  • 1,710 views

C is for Cookie: Cookie Monster, Network Pressure, and Identity Formation

by Krystal D'Costa in Anthropology in Practice



Cookie Monster © Sesame Street
It’s not quite news that Cookie Monster no longer eats cookies. Well, he eats ONE cookie. After he fills up on vegetables! Vegetables!! Understandably, the public was outraged, and in response, Cookie felt the need to clarify: He still eats cookies—for dessert—but he likes fruit and vegetables too. Cookie Monster needed to reassert his identity, so he did what anyone would do: He interviewed with Matt Lauer.* The message was plain: He’s a Cookie Monster and Cookie Monsters eat cookies. They dream of cookies. They would bathe in cookies if they could. They can’t get enough of cookies. But can Cookie Monsters eat fruits and vegetables too?
Sure, I can understand why Cookie has been dissuaded from pursuing his pastry delights with abandon. We’re in the midst of an obesity epidemic and children are most threatened. As adults, they’ll face a number of complications, including increased risk for diabetes, heart disease, high blood pressure, sleep apnea, and other woes. Cookie Monster IS a role model. Sesame Street has been a beacon in children’s education for decades, and the habits children learn in their early years will likely follow them through their lives. So Cookie has given up his plate of delicious cookies, practices restraint, and eats more leafy greens in the hopes that young children will do the same. Now I watched Cookie Monster devour plates of cookies when I was growing up, and I have to tell you … I don’t tend to devour plates of cookies now as an adult. I learned healthy eating habits from my parents. I understood that Cookie Monster was meant to eat cookies in a way that I wasn’t. (And truthfully, how many of those cookies actually made it into his mouth anyway? Most wound up as crumbs all over his fur, which was more reason NOT to devour cookies as he did.) But I suppose that having Cookie model moderation may support parental messages at home. As I said, I get it.


Cookie Monster raps about health foods.
(They taste so good!) © Sesame Street
However, are we forcing our standards on Cookie? Forcing him to change who he really is? Cookie Monster actually began singing (really, rapping) about eating healthy foods in the late ’80s. There was no uproar then, perhaps because the shift seemed less radical. Who cared if he was a closet veggie eater? But is it acceptable for Cookie to change? Is Cookie Monster's identity caught in flux as a result of conflicting messages about who he is and who we as a society, as his network, expect him to be? And in all seriousness, what kind of message does this send to children? That they should repress who they are in favor of the norm? That there is an ideal to strive toward? That once they’ve established an identity, they can’t change? Cookie Monster’s cookie/vegetable dilemma provides a good opportunity to investigate the mechanisms of identity formation.
Daniel MacFarland and Heili Pals (2005) evaluated internal and external motives that inspire change, and determined that change to the network is the driving factor in identity shifts over time. MacFarland and Pals thoroughly dissect social identity theory (SIT) and identity theory (IT), competing models of identity change, and conclude that in both theories the actor perceives an inconsistency which leads to a change in identity. Essentially, the actor seeks to establish an identity because he or she believes that it fails to meet a standard. In the case of Cookie Monster, we have two standards in conflict: Cookie is a monster who eats cookies (internal), and Cookie needs to promote moderation as demanded by his larger network of fans and supporters and in keeping with social trends (external). We shall see how these theories lead to pressure for Cookie Monster to change.
SIT describes categories as a context for identity development where the individual responds to category traits and develops an identity that permits desirable group membership. Motivation for change therefore comes from the individual’s perception of how well he or she fits in with the established category. However, categories are not fluid:

Categories are more than labels; they act as constitutive rules or representational systems of meaning that are recognized by wide segments of a society. Categories establish expectations of and for behavior, and even suggest a narrative history of group membership … Category labels can reflect notions of status, permanence, size, and other meanings that influence the actor’s motive to improve his or her situation (MacFarland and Pals 2005: 291).

So categories like race and gender tend to shape identities in very concrete ways because it is hard to separate self from the visible traits of these categories (292). Cookie Monster is a monster who likes cookies. He is bound by this identity, and it places an expectation on him that he will behave in particular ways. He strives to be the best Cookie Monster he can be, as evidenced by the clip with Matt Lauer where he insists that he can have the cookie—after he has had the veggies.
IT suggests that social networks provide the context for identity development, which, in this case, is a response to shifting relationships:

According to IT, the self consists of a collection of “role identities.” Persons switch role identities depending on the salience of those identities to the context. According to Stryker and Burke (2000), network contexts create a hierarchy of salience among the various identities that constitute the self, and lead the actor to invoke the same role or to alter performances over time. Therefore the feedback from a relational context defines identity salience, and this in turn gives rise to motives for identity change (MacFarland and Pals 2005: 290).

C is for cookie.
From muppet.wikia.com
IT is external and fluid, driven by the changing needs of the network. Our society—Cookie Monster’s larger network—is oriented toward a certain image of acceptable eating—even though it has largely not been the norm—and Cookie has to shift to meet this expectation. However, what we have is an inconsistency between the ideal self (the self that the individual would like to be), the actual self (the self that the individual is), and the public self (the self that others perceive the individual to be). Cookie Monster would like to be able to eat cookies and vegetables (ideal), but feels that he has to eat more vegetables and downplay his cookie-eating tendencies (actual), while the public thinks that he is hypocritical for enjoying vegetables, though a subset wants him to eat more greens (public).
To understand what ... Read more »

  • November 3, 2010
  • 04:02 PM
  • 968 views

The Progressive Roots of American Anthropology (versus the Tea Party last time)

by Kevin Karpiak in Kevin Karpiak's Blog

Two seemingly unrelated events have occurred in my life over the last two days which have caused me to think. I spent the day yesterday helping out with the campaigns of some of the local candidates here in Southeastern Michigan. Obviously the overall effect was not as successful as I would have liked. I can’t say, [...]... Read more »

  • November 3, 2010
  • 03:25 PM
  • 827 views

Why It Takes Long-Term Thinking to Influence a Fetus

by David Berreby in Mind Matters


Low weight at birth is associated with all sorts of health troubles later in life, so it seems a great idea to give nutritional supplements to pregnant women in developing nations, to add some heft to their babies. Yet the results aren't impressive. (The anthropologist Christopher W. Kuzawa notes, for instance, that this review of 13 such programs found the average weight improvement for the babies was a paltry one ounce.) Which illustrates the state of work on "fetal origins"—the theory that pre-birth experiences in the womb have a powerful effect decades later on the adult mind and body. On the one hand, as Annie Murphy Paul writes in Origins, her fine new book about the field, the idea suggests that we should ensure that developing fetuses have a healthy environment. On the other hand, the work so far can't say how to do that.
The other day, at this conference I heard Kuzawa propose an explanation for some of this befuddlement: We can "tell" the baby-to-be that food is abundant by making sure its mother ate well last month; but its development may depend instead on how that mother ate through her entire life.
A keystone of much "fetal origins" work is that the developing infant responds to cues about the kind of world it will have to join. Is food scarce or plentiful? Is life anxiety-filled or calm? Do people live to be 90 around here, or die by 62? The mother's experience of life, translated into hormones, blood sugar, blood pressure and other chemical signals, helps determine which of the developing baby's genes are activated and how much, molding it to fit its future environment.
New parents easily imagine this fetus as a clueless investor, reacting every hour to the latest news flash. (Hence their neurotic fears about that one sip of alcohol, bite of sushi, or late-night fight that will ruin the budding child's life.) Instead, Kuzawa proposes, we should see the developing baby as a long-term player, looking for clues about its world on different time scales: months, years or even generations.
That perspective could provide a framework to organize and explain disparate pieces of data that Murphy Paul mentions. For instance, some research has found that a mother's malnourishment late in pregnancy puts her child at higher risk, decades later, for diabetes, while malnourishment in early gestation is a risk for heart disease. And the stresses of the Arab-Israeli war in 1967 seem to increase the risk of schizophrenia in adults who were gestated then—if their mothers were in the second month of pregnancy, but not the fourth or fifth. All this suggests there are distinct periods in a pregnancy, each one particularly sensitive to one environmental influence, but not others.
Kuzawa proposes to use evolutionary reasoning to find the hidden logic of these different windows. Doing so, he says, will probably require figuring out the time-frame of each cue that an embryo uses to prepare itself for its world.
Which brings us back to nutritional supplements: If fetuses were "tuned" to today's cues about their environment, then they should respond to maternal nutritional supplements much more than they do. But it stands to reason, Kuzawa said, that a creature that will live for 30, 40 or even 90 years should not prepare itself for the environment of next month. Next month may be way out of line with typical conditions. To prepare for eating in the mother's world, the fetus ought to find out what her whole life was like.
Some eerie facts of fetal-growth research seem to line up with the idea. Here are a couple Kuzawa cited. In this study of mothers and children in 1930s England, a woman's adult height was not a great predictor of her daughter's birth weight. Much better was the mother's height back when she herself was 7. And this study in Guatemala found that infants who grew faster in their first three years of life were those whose mothers had eaten better as children.
Kuzawa thinks our species may have evolved a mechanism for "telling" the fetus what to expect on average over decades or even generations. By basing its development on its mother's childhood condition (which means, Kuzawa points out, that it's reflecting its grandmother's experience, and so on backwards in time), the fetus takes a long-term average of conditions in its world. It can't be "fooled" by one rich harvest or a plague year. That makes sense from an evolutionary point of view. But it also means the fetus can't be "fooled" by our well-meaning attempts to guide it for the few months of its mother's pregnancy.
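As a toy illustration of that averaging logic (my own sketch, not Kuzawa's model), compare a fetus that reads only the nine months of pregnancy against one that averages over the mother's whole life; the units and durations below are arbitrary:

    # Toy illustration (not Kuzawa's model): a developmental cue computed as
    # a long-run average is nearly blind to a brief nutritional supplement.
    months_of_scarcity = [50] * 340    # ~28 years of poor maternal nutrition
    supplemented_pregnancy = [90] * 9  # nine well-fed months
    history = months_of_scarcity + supplemented_pregnancy

    short_cue = sum(history[-9:]) / 9        # reads only the pregnancy
    long_cue = sum(history) / len(history)   # averages the mother's life

    print(short_cue)  # 90.0 -> "food is abundant"
    print(long_cue)   # ~51.0 -> "food is scarce"; supplement barely registers

On the long-window reading, nine months of supplements move the cue by only about one unit, the same flavor of result as the "paltry one ounce" improvement mentioned above.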
How does this square with evidence that disasters and wars do have a big and immediate effect on the children gestated during the crisis? As Murphy Paul recounts, these "natural experiments" suggest that bad experiences have immediate effects. (Douglas Almond has found (pdf) that children whose mothers were pregnant during the 1918 flu pandemic were 15 percent more likely than near-peers to drop out of high school; they earned lower wages throughout life, and as older adults were 20 percent more likely to be disabled.)
Perhaps severe shocks overwhelm the usual pathways by which environment communicates with embryo, Kuzawa suggests. If that's so, then extreme cases like the 1918 flu pandemic or the Dutch Hunger Winter are a mixed blessing for fetal-origins research. On the one hand, they show dramatic effects which helped convince skeptics; but, on the other, they may not represent the way the system usually works.
For parents and policy wonks, too, Kuzawa's idea is a mix of good news and bad news. If the developing fetus is impervious to day-to-day ups and downs, our minor scrapes and flubs can't harm it. But that also means that our well-intentioned efforts won't help it much, either. At least, not until we take its own long view of life.
Kuzawa, C. (2005). Fetal origins of developmental plasticity: Are fetal cues reliable predictors of future nutritional environments? American Journal of Human Biology, 17 (1), 5-21 DOI: 10.1002/ajhb.20091
Martin RM, Smith GD, Frankel S, & Gunnell D (2004). Parents' growth in childhood and the birth weight of their offspring. Epidemiology (Cambridge, Mass.), 15 (3), 308-16 PMID: 15097011
Stein AD, Barnhart HX, Wang M, Hoshen MB, Ologoudou K, Ramakrishnan U, Grajeda R, Ramirez-Zea M, & Martorell R (2004). Comparison of linear growth patterns in the first three years of life across two generations in Guatemala. Pediatrics, 113 (3 Pt 1) PMID: 14993588 ... Read more »

  • November 3, 2010
  • 05:00 AM
  • 1,051 views

Paleo and Low-Carb Diets: Much In Common?

by Steve Parker, M.D. in Diabetic Mediterranean Diet Blog

My superficial reading of the paleo diet literature led me to think Dr. Loren Cordain was the modern originator of this trend, so I was surprised to find an article on the Stone Age diet and modern degenerative diseases in a 1988 issue of the American Journal of Medicine.  Dr. Cordain started writing about the paleo diet around 2000, [...]... Read more »

Kuipers, R., Luxwolda, M., Janneke Dijck-Brouwer, D., Eaton, S., Crawford, M., Cordain, L., & Muskiet, F. (2010) Estimated macronutrient and fatty acid intakes from an East African Paleolithic diet. British Journal of Nutrition, 1-22. DOI: 10.1017/S0007114510002679  

  • November 1, 2010
  • 10:29 PM
  • 1,223 views

Witchcraft or Psychedelic Trip?

by Dan Bailey in Smells Like Science

Were the Salem Witch Trials sparked by grain infected with toxic hallucinogens?... Read more »

  • November 1, 2010
  • 08:35 PM
  • 835 views

The diversity of values held by conservation scientists and why this matters

by Phil Camill in Global Change: Intersection of Nature and Culture


Right up there with climate change, biodiversity conservation is one of the most challenging issues at the intersection of nature and culture.  Part of this challenge arises because of genuine differences in how people value other species.
In an interesting forthcoming article in Conservation Biology, Chris Sandbrook and colleagues at Cambridge University argue that these value [...]... Read more »

Sandbrook, C., Scales, I., Vira, B., & Adams, W. (2010) Value Plurality among Conservation Professionals. Conservation Biology. DOI: 10.1111/j.1523-1739.2010.01592.x

  • October 31, 2010
  • 05:43 PM
  • 850 views

The Vampire in the Plague Pit

by Michelle Ziegler in Contagions

Amid the chaos of a mass grave of plague victims, the 2006-2007 summer project team from the Archeoclub of Venice got a surprise. Among the dead they found evidence of belief in the undead, fear of the vampire. So how do you stop the undead from feasting on the corpses in the mass grave?  The [...]... Read more »

  • October 31, 2010
  • 11:53 AM
  • 812 views

"Rebel access to [natural] resources crucially shapes armed civil conflict"

by Benno Hansen in Ecowar

How does rebel access to natural resources affect conflict? "How". Not "if". That is the question investigated by Päivi Lujala of the Norwegian University of Science and Technology, recently published in the Journal of Peace Research.

Or rather: Where previous research has either suggested a link or sought to explain it by an indirect effect through resource abundance tending to corrupt weak ... Read more »

  • October 29, 2010
  • 02:01 PM
  • 570 views

The Adoption of Altruism

by Eric Michael Johnson in The Primate Diaries in Exile

The latest stop in the #PDEx tour is being hosted by Barbara J. King:

Since animals, including humans, are merely ambulatory vehicles for their selfish genes, according to the dominant framework, it would be to one's benefit to care for a niece or cousin that lost their mother, but not for a stranger to whom there is no genetic relation. This is because any genes that promoted such altruism towards unrelated individuals would end up losing out by using up resources that didn’t perpetuate themselves. However, these “altruistic genes” would be passed on and thrive if they were helping a kin member with similar genetic makeup. In the currency of reproductive fitness, nepotism pays.

However, a study in the journal Primates by Cristiane Cäsar and Robert John Young reports on a case of adoption in a wild group of black-fronted titi monkeys (Callicebus nigrifrons) from the rainforests of Brazil.

Read the rest of the post here and stay tuned for the next entry in The Primate Diaries in Exile tour.

Reference: Cäsar, C., & Young, R. (2007). A case of adoption in a wild group of black-fronted titi monkeys (Callicebus nigrifrons). Primates, 49(2), 146-148. DOI: 10.1007/s10329-007-0066-x... Read more »

  • October 28, 2010
  • 05:43 AM
  • 791 views

Sons of the conquerors: the story of India?

by Razib Khan in Gene Expression


The past ten years have obviously been very active in the area of human genomics, but the domain of South Asian genetic relationships in a worldwide context has seen veritable revolutions and counter-revolutions. The final outlines are still to be determined. In the mid-1990s the conventional wisdom was that South Asians were [...]... Read more »

Gyaneshwer Chaubey, Mait Metspalu, Ying Choi, Reedik Mägi, Irene Gallego Romero, Pedro Soares, Mannis van Oven, Doron M. Behar, Siiri Rootsi, Georgi Hudjashov.... (2010) Population Genetic Structure in Indian Austroasiatic speakers: The Role of Landscape Barriers and Sex-specific Admixture. Mol Biol Evol. DOI: 10.1093/molbev/msq288

  • October 27, 2010
  • 11:38 PM
  • 1,097 views

Food for thought: Cooking in human evolution

by gregdowney in Neuroanthropology

Did cooking make us human by providing the foundation for the rapid growth of the human brain during evolution?  If so, what does this tell us about the diet that we should be eating, and can we turn back the culinary clock to an evolutionarily ideal diet?  A number of provocations over the last couple of weeks have me thinking about evolution and diet, especially what our teeth and guts tell us about how our ancestors got their food.
I did a post on this a while back at Neuroanthropology.net, putting up my slides for the then-current version of my ‘brain and diet’ lecture from ‘Human evolution and diversity,’ but I’m also thinking about food and evolution because I just watched Nestlé food scientist Heribert Watzke’s TED talk, The Brain in Your Gut. Watzke combines two intriguing subjects: the enteric nervous system, or your gut’s ‘second brain,’ and the evolution of diet.  I’ll deal with the diet, gastro-intestinal system and teeth today, and the enteric nervous system another day, because it’s a great subject in itself (if you can’t wait, check out Scientific American).

This piece is going to ramble a bit, as it will also include some thoughts on the subject of diet and brain evolution sparked by multiple conversations: with Prof. Marlene Zuk (of the University of California Riverside), with Paul Mason (about Terrence Deacon’s article that he and Daniel wrote about), and following my annual lecture on human brain evolution as well as conversations today with a documentary crew from SBS.  So let’s begin the meander with Dr. Watzke’s opening bit on why he thinks humans should be classified as ‘coctivors,’ that is, animals that eat cooked food, rather than ‘omnivores.’

Although I generally liked the talk, I was struck by some things that didn’t ring quite right, including Dr. Watzke’s opening bit about teeth (from the online transcript):
So everyone of you turns to their neighbor please. Turn and face your neighbors. Please, also on the balcony. Smile. Smile. Open the mouths. Smile, friendly. (Laughter) Do you — Do you see any Canine teeth? (Laughter) Count Dracula teeth in the mouths of your neighbors? Of course not. Because our dental anatomy is actually made, not for tearing down raw meat from bones or chewing fibrous leaves for hours. It is made for a diet which is soft, mushy, which is reduced in fibers, which is very easily chewable and digestible. Sounds like fast food, doesn’t it.
Okay, let’s not be pedantic about it, because we know that humans, in fact, do have canines.  Watzke’s point is that we don’t have extended canines, long fangs that we find in most carnivorous mammals or in our primate relatives like chimps or gorillas.
The problem is that the absence of projecting canines in humans is a bit more interesting than just, ‘eat plants=less canine development.’  In fact, gorillas are completely vegetarian, and the males, especially, have massive canines; chimpanzees eat a very small amount of animal protein (something like 2% of their caloric intake), and they too have formidable canines.  Our cousins don’t have extended canines because they need them for eating – rather, all evidence suggests that they need big fangs for fighting, especially intraspecies brawling among the males in order to reproduce.
Teeth of human (left), Ar. ramidus (middle), and chimpanzee (right), all males.
The case of chimpanzee canines is especially intriguing because, with the remains of Ardipithecus ramidus, a species potentially close to the last common ancestor of humans and chimps, now more extensively discussed, we know very old hominids didn’t have pronounced canines.  If the remains are indicative of our common ancestor with chimpanzees (and there’s no guarantee of that), then it’s not so much human canine shrinkage alone that’s the recent evolutionary development but also the re-development of chimpanzee canines, probably due to sexual competition.
Even with all the possible points of disagreement, the basic point is that human teeth are quite small, likely due both to shifts in our patterns of reproduction and sexual selection and to changes in our diet.  Over the last few million years, our ancestors seemed to have gotten more and more of their calories out of meat, one argument goes, at the same time that our ancestors’ teeth were getting less and less capable of processing food of all sorts (or, for that matter, being effectively used as a weapon).
Hungrier and hungrier, with weaker jaws and smaller teeth
As I always remind my students in my lecture on human brain evolution, if big brains are so great, why doesn’t every animal have one? The answer is that big brains also pose certain challenges for an organism (or, if you prefer, ‘mo’ neurons, mo’ problums’).
The first and most obvious is that brains are hungry organs, devouring energy very fast and relentlessly, especially as they grow.  The statistic that we frequently throw around is that the brain constitutes 2% of human body mass and consumes 25% of the energy used by the body; or, to put it another way, brain tissue consumes nine times as many calories as muscle at rest.  So, if evolution is going to grow the brain, an organism is going to have to come up with a lot of energy – a smaller brain means that an animal both can eat less and be more likely to survive calorie drought.
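As a quick sanity check on the 2%/25% statistic (the ~2,000 kcal/day resting budget below is my own round-number assumption, not a figure from the post):

    # Back-of-envelope from the figures above: a brain that is ~2% of body
    # mass but uses ~25% of resting energy burns, gram for gram, roughly
    # 12-13 times the whole-body average.
    brain_mass_fraction = 0.02
    brain_energy_fraction = 0.25

    print(brain_energy_fraction / brain_mass_fraction)  # 12.5

    # On an assumed ~2,000 kcal/day resting budget, that is ~500 kcal/day
    # going to the brain alone.
    print(brain_energy_fraction * 2000)  # 500.0

Seen that way, the accounting problem is vivid: every extra gram of brain costs an order of magnitude more to run than a gram of the average body.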
But hominin brain growth also presents a few other problems, which sometimes get underestimated in accounts of our species’ distinctiveness.  For example, natural selection had to solve a problem of excess heat, especially if big-brained hominids were going to do things that their big brains should tell them are ill-advised, like run around in the hot sun. As your brain chews up energy, it generates heat, and the brain can overheat, a serious problem with sunstroke.  The good news is that somewhere along the line our hominin ancestors picked up a number of adaptations that made them very good at shedding heat, from a low-fur epidermis and a facility to produce copious sweat to a system of veins that run from the brain, shunting away heat (for a much more extensive discussion, see Sharma, ed. 2007, or the work of anthropologist Dean Falk, including her 1990 article in BBS laying out the ‘radiator theory’).
Not only is our brain hungry and hot; our enlarged cranium also poses some distinctive challenges for our mothers, especially as bipedalism has narrowed the birth canal by slowly making the pelvis more and more basket-shaped (bringing the hips under our centre of gravity).  The ‘obstetrical dilemma,’ the narrowing of the birth canal at the same time that the human brain was enlarging, led to a bit of a brain-birth canal logjam, if you’ll pardon the groan-worthy pun (see Rosenberg and Trevathan 1995).
Although frequently presented as a significant constraint on brain growth (and I’m sure all mother... Read more »

Rosenberg, K., & Trevathan, W. (1995) Bipedalism and human birth: The obstetrical dilemma revisited. Evolutionary Anthropology: Issues, News, and Reviews, 4(5), 161-168. DOI: 10.1002/evan.1360040506

Suwa, G., Kono, R., Simpson, S., Asfaw, B., Lovejoy, C., & White, T. (2009) Paleobiological Implications of the Ardipithecus ramidus Dentition. Science, 326(5949), 69-69. DOI: 10.1126/science.1175824  

Wrangham, R. (2003) 'Cooking as a biological trait'. Comparative Biochemistry and Physiology Part A: Molecular & Integrative Physiology, 136(1), 35-46. DOI: 10.1016/S1095-6433(03)00020-5

  • October 27, 2010
  • 09:25 PM
  • 1,084 views

Archaeologists Unearth a "Vampire" Grave

by Dan Bailey in Smells Like Science

In the 1990s, archaeologists uncovered a grave in Connecticut dating from the mid-1800s that provided the first physical evidence of a historical belief in vampires in New England.... Read more »

  • October 27, 2010
  • 02:57 PM
  • 1,167 views

Where did all these primates come from? – Fossil teeth may hint at an Asian origin for anthropoid primates

by Laelaps in Laelaps

Where did anthropoid primates come from? This question has not been an easy one to answer. Since the early days of paleontology various experts have proposed a slew of scenarios for the origins of the primate group which today contains monkeys and apes (including us), with different experts favoring various combinations of places, times, and [...]... Read more »

Bajpai, S., Kay, R., Williams, B., Das, D., Kapur, V., & Tiwari, B. (2008) The oldest Asian record of Anthropoidea. Proceedings of the National Academy of Sciences, 105(32), 11093-11098. DOI: 10.1073/pnas.0804159105  

K. Christopher Beard. (2006) Mammalian Biogeography and Anthropoid Origins. Primate Biogeography, 439-467. DOI: 10.1007/0-387-31710-4_15

Beard, K., Marivaux, L., Chaimanee, Y., Jaeger, J., Marandat, B., Tafforeau, P., Soe, A., Tun, S., & Kyaw, A. (2009) A new primate from the Eocene Pondaung Formation of Myanmar and the monophyly of Burmese amphipithecids. Proceedings of the Royal Society B: Biological Sciences, 276(1671), 3285-3294. DOI: 10.1098/rspb.2009.0836  

Jaeger, J., Beard, K., Chaimanee, Y., Salem, M., Benammi, M., Hlal, O., Coster, P., Bilal, A., Duringer, P., Schuster, M.... (2010) Late middle Eocene epoch of Libya yields earliest known radiation of African anthropoids. Nature, 467(7319), 1095-1098. DOI: 10.1038/nature09425  

  • October 25, 2010
  • 12:50 PM
  • 3,259 views

Anatomy of a Superstition: When Your Eye "Jumps"

by Krystal D'Costa in Anthropology in Practice


The eye sees all, and can possibly warn
of danger in Trinidadian folklore.
Credit: Wikipedia
Trinidadians have a rich collection of superstitions, many of which found their way to the island via colonialism. These beliefs reflect the ways ideas and explanations have been blended here—and elsewhere—in the face of globalization. There is one, however, that I have grown up with that seems unique to Trinidadians. It concerns an involuntary eye spasm known colloquially as when your eye "jumps." The superstition has multiple parts and meanings depending on which eye is affected:

  • If your right eye jumps, you are going to hear good news. If your left eye jumps, you are going to hear bad news (Roberts 1927: 161).
  • If your right eye jumps, someone is speaking well of you. If your left eye jumps, someone is saying bad things about you.* (If you think of the names of people you know, when you name the right person—who is speaking badly about you—your eye will stop jumping.) (Roberts 1927: 161)
  • If your right eye jumps, you'll see someone you haven't seen in a long time.
  • If your left eye jumps, a loved one/friend is doing something behind your back.
  • If your left eye jumps, a loved one/friend may be in trouble.

*There seems to be some confusion with this particular version of the superstition, since I have also seen/heard it reversed (i.e., right eye = someone speaking ill of you). It is included here in parallel form to match the other suggestions.
There are additional variations on this theme, but all emphasize the dichotomy between the left and right eye in relation to bad versus good events. The eye has long figured in superstitious lore—for example, the idea of the "evil eye" may date to 600 BC, and since that marks only the first documented reference to the belief, it may in fact be older than that. As a source of vision, awareness, and knowledge, it is no surprise that beliefs relating to the eye tend to suggest a forewarning.
Superstitions are often met with a certain degree of scorn. Rational folks are often quick to dismiss them. But still they lurk in the background until the opportunity arrives when they can suggest a potential "What if?" Historically, when discussing superstitions scholars (e.g., Matthews 1945; Roberts 1927) have categorized them as "primitive" beliefs of "simple" people, and overlooked the insights they may offer on the way people view the world. While many superstitions have religious or supernatural undertones, many others offer interesting observations on life in a particular location. And if you dig deep enough, there are sometimes suggestive details that can explain why some superstitions persist.
For example, in a collection of West Indian beliefs and superstitions Basil Matthews (1945) discusses the Caniteel in Trinidad: a particular hour on a particular day between July 15th and August 15th during which any plants planted will fail to grow (141). No one knows for sure when the day or the hour actually occurs. What they do know is that during this period worms generally eat the heart of the plant. Trinidadian farmers view this period as a bad time. Many avoid planting on July 15th, and then plant on alternate days hoping to avoid the Caniteel. Some avoid planting altogether during this period. The farmers have connected a real event (the activity of the worms) with a superstition (don't plant, this period is bad).
The same may be the case for eye jumping. The phenomenon is largely harmless, but appears to be poorly understood by science. It is officially classified as benign essential blepharospasm (BEB), a phenomenon that can be disruptive in severe cases, causing functional blindness:

The condition is progressive with the early symptoms being irritation and discomfort in the eyelids causing an increase in the blink rate, which can progress over time to frequent, forceful involuntary and uncontrollable closure of the eyelids (Kowal et al. 1998: 123).

The condition is idiopathic, but researchers believe that it may be linked in part to fatigue, stress, eyestrain, and/or caffeine (Robb-Nicholson 2010: 8). In a health column in the Harvard Women's Health Watch, Dr. Celeste Robb-Nicholson advises a writer of ways to cope with "eyelid twitching":

There are several things you can do to ease the spasms. Close the eye and apply a warm compress—or try pulling gently on the lid. Get more sleep, and reduce your caffeine and alcohol intake. If the twitching occurs while you're reading or using a computer, relax your eyes occasionally by focusing on something in the distance. If your eyes are dry or irritated, use lubricant eyedrops (8).

Even in the less severe form, eye jumping can still be disruptive (or at the very least, irritating), marked by a fluttering sensation in the eyelid, twitching of the eye, or the repeated closing and reopening of the eyelid. And it can last anywhere from minutes to hours, or can occur intermittently over the course of several days. Perhaps its disruptiveness has contributed to its role in superstition. Let's consider the following:

  • Eye jumping may be caused by stress in some form.
  • Because it is disruptive, it is memorable.
  • When a negative or otherwise anticipated event occurs following an eye-jumping episode, it can be easily connected to eye jumping because the phenomenon sticks in the mind of the afflicted.
Since Trinidadians appear to follow the traditional notions of right = good, left = bad, it may be that they are selecting events following experiences of stress that match the eye afflicted by BEB. So, for example, if they are anticipating speaking to a relative who has missed a telephone call, the anticipation may turn to worry, and they may experience BEB as a stress response. When the relative finally calls, the afflicted person may recall that their eye jumped and connect the two. This may also explain the fluidity in assigning events to the eyes. While Trinis largely follow the right/left dichotomy, they have been known to blur the line and simply say "My eye was jumping." It may also be that events that can be tied to the afflicted eye are more readily remembered. Similar to the Caniteel, Trinidadians have connected a real event (BEB) with a superstition (the eye afflicted by BEB can predict or warn of events).
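A tiny simulation makes that memory-selection story concrete. Everything here is invented for illustration (the daily rates and the recall bias); it is a sketch of confirmation bias, not anything measured in the cited papers.

    import random

    # Illustrative only: eye twitches and bad news occur independently, but
    # hits (twitch followed by bad news) are remembered far better than
    # misses (twitch followed by nothing).
    random.seed(1)
    DAYS = 10_000
    P_TWITCH = 0.05    # assumed daily chance the eye "jumps"
    P_BAD_NEWS = 0.05  # assumed daily chance of hearing bad news

    hits = misses = 0
    for _ in range(DAYS):
        twitch = random.random() < P_TWITCH
        bad_news = random.random() < P_BAD_NEWS
        if twitch and bad_news:
            hits += 1        # memorable coincidence
        elif twitch:
            misses += 1      # forgettable non-event

    recalled_misses = misses * 0.10  # assume only 10% of misses are recalled
    print(hits / (hits + recalled_misses))  # ~0.3, vs. a true rate of 0.05

Even though the twitch predicts nothing, the remembered record makes it look like the omen "works" about a third of the time, which is plenty to keep a superstition alive.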
Superstitions, however you view them, can be a source of comfort. They offer a way to take control of a situation and, in this case, to reaffirm ties—note that the eye-jumping superstition is connected to loved ones. They can become deeply ingrained. When my eye jumps, I'm inclined to tell myself quite seriously to just "quit it." Meaning, quit worrying about it. I know that my stress levels are generally elevated when my eye jumps, but invariably, when the phenomenon persists, it opens the door for "What if." The event in itself also adds to my stress levels, creating a nagging sensation of worry that I refuse to openly acknowledge but seem to acknowledge in small ways. For example, my behavior changes slightly. I might call loved ones more frequently. And if I happen to learn of an event that occurred to one of them in this period, I find myself wondering about which eye was afflicted. Superstitions are persistent. It's one of the reasons they've survived time and travel.
Do you have a family superstition that crops up from time to time? Something your grandmother or mother said or did continuously? Something that you yourself came to believe for no explicable reason? With Halloween just around the corner, let's open the vaults and see what's lurking in the shadows of our minds.

Cited: ... Read more »

Kowal L, Davies R, & Kiely PM. (1998) Facial muscle spasms: an Australian study. Australian and New Zealand journal of ophthalmology, 26(2), 123-8. PMID: 9630292  

Matthews, B. (1945) West Indian Beliefs and Superstitions. The American Catholic Sociological Review, 6(3), 139. DOI: 10.2307/3707527  

Roberts, H. (1927) Louisiana Superstitions. The Journal of American Folklore, 40(156), 144. DOI: 10.2307/534893  

  • October 25, 2010
  • 11:12 AM
  • 826 views

How does an anthropological perspective contribute to our understanding of birth control? Part I

by Kate Clancy in Context & Variation

This is a heavily revised version of a series I wrote for my LEE Blog on biological anthropology and hormonal contraception. This post deals with contraindications for hormonal contraceptives.... Read more »

Burkman RT, Fisher AC, Wan GJ, Barnowski CE, & LaGuardia KD. (2009) Association between efficacy and body weight or body mass index for two low-dose oral contraceptives. Contraception, 79(6), 424-427.

Morin-Papunen L, Martikainen H, McCarthy MI, Franks S, Sovio U, Hartikainen AL, Ruokonen A, Leinonen M, Laitinen J, Järvelin MR.... (2008) Comparison of metabolic and inflammatory outcomes in women who used oral contraceptives and the levonorgestrel-releasing intrauterine device in a general population. American journal of obstetrics and gynecology, 199(5). PMID: 18533124

  • October 24, 2010
  • 10:16 PM
  • 570 views

Mesa Verde Water Control

by teofilo in Gambler's House

I’ve previously discussed water control technologies at Chaco, where they were particularly important given the extreme aridity of that area even by Southwestern standards.  There is abundant evidence, however, that water control was a widespread activity throughout the ancient Southwest, even in areas with more reliable water sources.  The best-studied water control systems have been [...]... Read more »
