Did our ancestors begin to stand on two legs because it gave them an advantage in beating up their rivals? At least, this is what David Carrier tried to find out in his most recent study, in which he looked at how hard people were able to punch when they stood upright and when they didn't.

First of all, how does someone come up with this kind of idea? Carrier explains that an upright stance is a common behaviour seen in other mammals when they want to threaten or fight their opponents, and that apes especially often display this kind of behaviour. And indeed, an upright posture is more effective when it comes to smacking people in the face, but does this mean that male-to-male aggression has anything to do with the evolution of human bipedalism?

It's funny that my last post was about how we build testable hypotheses in evolutionary biology and the kinds of problems you face while doing so, because this study makes some huge mistakes in this regard. First of all, the study relies only on data from present-day organisms. We have little knowledge of what our earliest ancestors (or their ancestors) even looked like, which makes it even more difficult to make any serious assumptions about how they behaved. Therefore, evolutionary models relying solely on behavioural evidence from extant animals are almost untestable via the fossil record. But we need to test those models with fossil evidence if we want to avoid telling "just so" stories. I have a mantra that I picked up from one of my teachers: "The past is a foreign country; they did things differently there." Surely we need observations of recent animals to build up our models, but they can never be a complete substitute for the fossil record.

Papers like this make me wonder if I might be getting something wrong in how I approach this field.
In my eyes it completely omits all standards of how to build a scientific theory in favour of making some wild assumptions about human evolution, and I don't understand how this can happen or how such stuff gets published in the first place.

References:
Carrier, D. (2011). The Advantage of Standing Up to Fight and the Evolution of Habitual Bipedalism in Hominins. PLoS ONE, 6(5). DOI: 10.1371/journal.pone.0019630... Read more »
A new paper published in PLoS ONE by David Carrier tests the hypothesis that bipedalism in humans evolved because it helps them to fight better. The first fatal flaw lies in the first sentence: "Many quadrupedal animals stand on their hindlimbs to fight." How, then, does this explain human uniqueness? Clifford Jolly wrote in The [...]... Read more »
An analysis of Magical Clothing in folklore.... Read more »
Juliette Wood. (1992) The Fairy Bride Legend in Wales. Folklore, 103(1), 56-72. info:/
What's tall, angry, and walks on two legs? Only hominin males, apparently.... Read more »
Carrier DR. (2011) The advantage of standing up to fight and the evolution of habitual bipedalism in hominins. PloS one, 6(5). PMID: 21611167
A satyr balances a Kantharos,
signed drinking cup (kylix)
of the potter Kachrylion, 520/10 BC,
Antiquities Berlin / Altes Museum.
Marcus Cyron © 2007.
I was always a great fan of Aristophanes' works. But Aristophanes should be something more than a highly appreciated ancient comedian; he is a remarkable source for ancient Greek day-to-day life and Athenian 'communal' culture, as well as
... Read more »
An examination of a recent paper on maternal breast size after pregnancy, and the Trivers-Willard hypothesis.... Read more »
Dimitrakakis C, Jones RA, Liu A, & Bondy CA. (2004) Breast cancer incidence in postmenopausal women using testosterone in addition to usual hormone therapy. Menopause (New York, N.Y.), 11(5), 531-5. PMID: 15356405
Galbarczyk A. (2011) Unexpected changes in maternal breast size during pregnancy in relation to infant sex: An evolutionary interpretation. American journal of human biology : the official journal of the Human Biology Council. PMID: 21544894
Helle, S. (2002) Sons Reduced Maternal Longevity in Preindustrial Humans. Science, 296(5570), 1085-1085. DOI: 10.1126/science.1070106
Hinde K. (2009) Richer milk for sons but more milk for daughters: Sex-biased investment during lactation varies with maternal life history in rhesus macaques. American journal of human biology : the official journal of the Human Biology Council, 21(4), 512-9. PMID: 19384860
Hrdy, S. (1990) Sex bias in nature and in history: A late 1980s reexamination of the “biological origins” argument. American Journal of Physical Anthropology, 33(S11), 25-37. DOI: 10.1002/ajpa.1330330504
Jasienska G, Nenko I, & Jasienski M. (2006) Daughters increase longevity of fathers, but daughters and sons equally reduce longevity of mothers. American journal of human biology : the official journal of the Human Biology Council, 18(3), 422-5. PMID: 16634019
Poretsky L, Seto-Young D, Shrestha A, Dhillon S, Mirjany M, Liu HC, Yih MC, & Rosenwaks Z. (2001) Phosphatidyl-inositol-3 kinase-independent insulin action pathway(s) in the human ovary. The Journal of clinical endocrinology and metabolism, 86(7), 3115-9. PMID: 11443175
"Braaiiiinnns ..." Zombies on the hunt for a meal, Night of the Living Dead.
May is Zombie Awareness Month—just in case you were wondering. Don’t roll your eyes: yes, we need a whole month of preparedness. I too was skeptical, but as the inimitable Christie Wilcox tweeted in response to my disbelief that May would be so used:
I think I must be. Prepared, that is. Surely the plethora of zombie movies, books, survival guides, and even exercise regimens have given me a sense of how to survive the coming zombie apocalypse. If you’ve seen even one zombie movie, I’d be willing to bet that you’re pretty prepared too. If you haven’t, go watch Zombieland. It provides a fair list of “rules” that should boost your chances of survival. For example, "When in doubt, know your way out" and "check the backseat" make a lot of sense. Then again, those might be things you should be doing anyway. And yet, they keep coming: Wikipedia lists seventeen zombie movies scheduled for release this year—and there are already films on the docket through 2014.
Zombies aren’t pretty creatures. Popular media depicts them in assorted states of decay. They shamble. They’re insatiable cannibals. And, well, they’re dead. So why can’t we get enough of them?
Folklore is home to a host of undead characters: mummies, skeletons, vampires, ghouls, and ghosts can be found under one name or another in almost all mythologies. Though closely linked with Vodoun magic and religion, the zombie is no exception: dybbuks, jumbies, djodjos, and duppies all bear some resemblance to the Haitian zombie, which is a composite of African beliefs transported to the Caribbean via the slave trade (1). Slaves from the Gulf of Guinea transported rites and rituals from the classical East and the Aegean, which took root following Haiti's revolution in 1804 (2). This zombie is a complex creature: though, like its cinematic counterpart, it lacks consciousness, it is a much more nuanced and manipulated figure. Anthropologist Wade Davis proposed that the Haitian zombie is a pharmacological product created by a bokor (3). A powder created from the toxins found in the puffer fish is administered to induce a lethargic sleep from which the afflicted may be roused and controlled. However, analysis of this powder has yielded frustratingly little information: ingredients appear to vary (including human remains, toads, lizards, millipedes, tarantulas, ground glass, and various plants), as does administration. For example, the powder may be strewn over the path frequented by the intended victim, or on his doorstep—this hardly seems very effective. Haitian traditions allow for the possibility of poisoning, but also posit a supernatural origin: a body is buried and resurrected without cause—it is simply called by name by a sorcerer and emerges without will, memory, or consciousness, ready to do the sorcerer's bidding (4). Typically, it will work for its creator, either performing labor or serving as a guard of some sort, and may be rented or loaned.
This corporeal zombie—distinguished from the spiritual zombie that Vodoun beliefs also permit—is the basis for the Hollywood zombie, which cannot be controlled and is bent on destruction. In both cases, the zombie is a shade of the human from whom it is derived; however, the degradation of its former humanity is precisely what makes it a horror we cannot turn away from. Much has been written about the metaphors inherent in Hollywood zombies and their ties to capitalism, the Other, and science and technology gone awry (5,6). Yet zombies capture our imagination because they are extensions of what we know to be human—they provide a glimpse into the breakdown of the social order.
Zombies are not meant to be. We engage in systematic mourning and funeral rites to remove the deceased and his remains from our immediate awareness. While we make allowances for the length of the grieving period, we support the bereaved with the belief (even if it is unspoken) that they will eventually cease to mourn as deeply in a visible way. But we also distance ourselves from the dead because they are a reminder of our mortality. They are gone, after all. And not only that, but their bodies begin to decay. Zombies force us to confront these sorts of issues. In death, the social order does not matter:

Society's infrastructure begins to break down, especially those systems associated with the government and technology. Law enforcement is depicted as incompetent and backwater (the local sheriff is a stereotyped yokel with a "shoot first" attitude), so people must fend for themselves instead. The media do what they can, broadcasting tidbits of helpful information and advice by way of the radio and television, but the outlook is fundamentally grim: Hide if you can, fight if you have to. In the end, the rigid structure of society proves little help; human survivors are left to their own devices with no real hope of rescue or support. Motley groups are forced into hiding, holing up in safe houses of some kind where they barricade themselves and wait in vain for the trouble to pass (7).

Infectious disease researchers have already determined that in the event of a zombie outbreak, humanity would be up the creek—to put it mildly. In all the models investigated, the collapse of civilization is imminent (8). All but the most aggressive quarantine strategies would fail, and when the dead can come back to life, well, it means that there is an endless source of recruits waiting to be called forth.
Albeit a bit tongue-in-cheek, researchers advise that in the face of a zombie apocalypse, quick, decisive action would be necessary: “the most effective way to control the rise of the undead is to hit hard and hit often” (9).
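The compartmental outbreak models referenced above are simple enough to sketch in a few lines. This is a minimal Euler-integrated "SZR" (susceptible, zombie, removed) model in the spirit of that work; the parameter values here are illustrative assumptions, not the published estimates:

```python
# Minimal sketch of an SZR ("susceptible, zombie, removed") outbreak model,
# in the spirit of the zombie-epidemiology work cited above. All parameter
# values are illustrative assumptions, not the published estimates.

def simulate(S=500.0, Z=1.0, R=0.0, beta=0.0095, zeta=0.05, alpha=0.005,
             dt=0.01, steps=10_000):
    """Forward-Euler integration of:
         S' = -beta*S*Z                        (transmission)
         Z' =  beta*S*Z + zeta*R - alpha*S*Z   (infection + resurrection - defeats)
         R' =  alpha*S*Z - zeta*R              (defeated zombies, who may rise again)
    """
    for _ in range(steps):
        dS = -beta * S * Z
        dZ = beta * S * Z + zeta * R - alpha * S * Z
        dR = alpha * S * Z - zeta * R
        S, Z, R = (max(S + dS * dt, 0.0),
                   max(Z + dZ * dt, 0.0),
                   max(R + dR * dt, 0.0))
    return S, Z, R

S, Z, R = simulate()
# Because the "removed" can rise again (zeta > 0), the susceptible
# population collapses: there is no stable disease-free state with Z = 0.
```

In this toy version, modest resistance (alpha > 0) only delays the collapse, which echoes the advice quoted above to hit hard and hit often rather than rely on quarantine.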
In these instances, the undead reflect concerns about mortality and social order. Zombies are our creation—whether in the Vodoun tradition or as the result of radiation or a viral outbreak, zombies rise because we make it possible for them to do so. But perhaps because they are former human beings, it is hard for us to imagine that society would come completely undone. In recent years, we have seen the rise of a smarter class of the undead, one that can organize. And though that organization seems to make its mission the eradication of "normal" people, these new zombies are more intimidating in that they retain a bit more of their former selves. The film version of I Am Legend gives us a class of zombie that is frighteningly fast, strong, and aggressive. They coordinate attacks and set traps—they can retaliate. Stephen King's Cell features undead who are organized around a leader, and who begin a process of "turning" normals (rather than simply eating them). These ... Read more »
Bishop, K. (2008) The Sub-Subaltern Monster: Imperialist Hegemony and the Cinematic Voodoo Zombie. The Journal of American Culture, 31(2), 141-152. DOI: 10.1111/j.1542-734X.2008.00668.x
I'm reading up on mandibular rotation, which is the change in orientation of the mandibular corpus relative to the rest of the skull during growth (the corpus is the horizontal part of your jaw that holds up your teeth; check out the shape changes in the mandibles in the blog header). So far as I can tell, the original classic paper on the topic is by Bjork (1955). Growth was studied by implanting metal pins into the jaws, then seeing how they move across ontogeny via X-rays (which were once called "roentgenograms," neat-o!). Here's a picture of the procedure, from Bjork (1955):

HOLY GOD WHAT DID THAT KID DO TO DESERVE THIS?! And although there must be a third person there, it sorta looks like there's a three-handed dentist wielding a hammer, a nail, and a kid's face. No wonder so many people are afraid of the dentist.
Reference
BJORK A (1955). Facial growth in man, studied with the aid of metallic implants. Acta odontologica Scandinavica, 13(1), 9-34. PMID: 14398173... Read more »
BJORK A. (1955) Facial growth in man, studied with the aid of metallic implants. Acta odontologica Scandinavica, 13(1), 9-34. PMID: 14398173
Neocortex matters more than social enhancements à la Facebook, says Robin "Dunbar's number" Dunbar in a recent article. Will these results get him a Nobel Prize?... Read more »
Pollet, T., Roberts, S., & Dunbar, R. (2011) Use of Social Network Sites and Instant Messaging Does Not Lead to Increased Offline Social Network Size, or to Emotionally Closer Relationships with Offline Network Members. Cyberpsychology, Behavior, and Social Networking, 14(4), 253-258. DOI: 10.1089/cyber.2010.0161
Pollet, T., Roberts, S., & Dunbar, R. (2011). Use of Social Network Sites and Instant Messaging Does Not Lead to Increased Offline Social Network Size, or to Emotionally Closer Relationships with Offline Network Members Cyberpsychology, Behavior, and Social Networking, 14 (4), 253-258 DOI: 10.1089/cyber.2010.0161 I recently happened to read this new article (access here). Its [...]... Read more »
Pollet, T., Roberts, S., & Dunbar, R. (2011) Use of Social Network Sites and Instant Messaging Does Not Lead to Increased Offline Social Network Size, or to Emotionally Closer Relationships with Offline Network Members. Cyberpsychology, Behavior, and Social Networking, 14(4), 253-258. DOI: 10.1089/cyber.2010.0161
Human infants have one important job during the first years of life, and that is to learn about the world and their culture from their parents and other caregivers. But what is learning? I've previously written that Hungarian developmental psychologists Gergely and Csibra have defined learning as the acquisition of new, generalizable knowledge that can later be used within a new context. Further, they have posited that evolution has prepared humans to learn generalizable knowledge from their caregivers. They proposed an elegant hypothesis: that a specialized innate pedagogy mechanism - called the pedagogical learning stance - is in place that allows an infant to retain generic information. This means that infants are able to learn information in a given instructional setting that they can later apply to a wide range of potential new situations.
This sort of cognitive system requires at least three things. First, the learner must understand the communicative intent of the teacher via ostensive cues. One such cue for humans is the use (by a parent or teacher) of infant-directed speech, or baby-talk. Second, the teacher and learner must be able to jointly use referential signals, such as eyegaze and pointing, in order to facilitate joint attention. Third, the learner must be able to understand the information content of the pedagogical interaction; that is, they must realize that they are getting relevant information for the given task.
There is good evidence that humans do, indeed, have innate pedagogy. But Gergely and Csibra take their claim a bit further. They claim: (1) that natural pedagogy is human-specific, (2) that natural pedagogy is universal among human cultures, and (3) that this sort of human social communication was explicitly selected for in evolution, rather than having emerged as a by-product of some other selection.
The remaining posts in this series on pedagogy will deal separately with each of these three claims. Today, we'll ask if pedagogy is human-specific, or if it is possible that it is shared with other animals.
Pedagogy is, in its simplest form, a kind of social learning that occurs via communication between two (or more) individuals and has as its outcome the transfer of knowledge or skills. There are abundant cases of both social learning and communication in non-human animals, of course. Are there any cases where social learning and communication combine in another species to allow knowledge transfer similar to human teaching? The short answer is: no. But let's examine this in some more depth.
Read the rest of this post... | Read the comments on this post...... Read more »
Calico Jack Rackham's Jolly Roger.
Blackbeard's Jolly Roger.
Walter Kennedy's Jolly Roger.
Emanuel Wynn's Jolly Roger.
Above: A sampling of pirate flags.
The NYTimes recently explored the "pirate brand" by tracing the emergence of the skull and crossbones—the Jolly Roger—as a symbol of terror on the high seas. The Times hails the ominous design as a magnificent exercise in collective hybrid branding, noting that economics drove pirates to adopt a version of this particular symbol to facilitate their intent to plunder. It's a fascinating discussion of the efficiency and power that good branding can deliver, but it overlooks the ways in which the power of the symbol as we recognize it draws in part on the acceptance and manipulation of the image by others.
Piracy has likely long been a feature of the open seas, following the earliest trade routes of the Aegean and Mediterranean. Cilicians were active in the Mediterranean and tolerated by the Roman Empire for the slaves they provided, and were only reined in when they gained such a presence as to become a threat to the Empire's grain supply in 67 BCE. The Senate approved "a comprehensive and systematic strategy and an astutely humane policy to the vanquished" to eliminate the Cilicians within a matter of months (1). Despite this historical legacy, the familiar skull and crossbones that many of us associate with piracy is a recent development, emerging in the late 17th century with the rise of the pirates of the Caribbean.
Following the discovery of the New World, the Caribbean quickly gained status as a center of trade with sugar, gold, and human capital flowing between the Old and New Worlds. The Spanish dominated the landscape but other colonial powers soon followed. Pirates, many of whom were drawn to the trade because it offered a chance to make a sustainable wage, found the waters of the Caribbean particularly attractive: largely unsettled, they would not be bothered by governing bodies; there were plenty of safe, natural harbors; and many opportunities to liberate spoils from the trade vessels of the Spanish (2). Tensions between Old World powers were not limited to their respective shores—traces of these conflicts echoed in the Western colonies, and the English, Dutch, and French sanctioned piracy—commissioning them as privateers—as a means of protecting their claims and controlling the goods in the region. These men were national heroes: defenders of the nation on the high seas. Their numbers included Francis Drake and Henry Morgan—hailed as Gentlemen of the seas.
Pirates have a bloodthirsty and lawless reputation. They're known for walking the plank, copious alcohol consumption, and lascivious tendencies, but these were skilled men drawn from maritime trades which had paid them poorly:

Merchant seamen got a hard, close look at death: disease and accidents were commonplace in their occupation, rations were often meager, and discipline was brutal. Each ship was "a little kingdom" whose captain held a near-absolute power which he often abused (3).
Some pirates had served in the navy, where conditions aboard ship were no less harsh. Food supplies often ran short, wages were low, mortality was high, discipline severe, and desertion consequently chronic (4). While privateers often had better food and pay and shorter shifts, the long arm of the law was sometimes unforgiving and held them to strict standards. Pirates, who seemed to have no loyalties to man or country, were able to set their own terms, albeit under the guise of crime. These seafaring groups were far from disorganized—they operated under strict codes of conduct that reflected a highly organized social order governing authority, distribution of plunder, and discipline. For example, spoils were systematically distributed:

Captain and quartermaster received between one and one-half and two shares; gunners, boatswains, mates, carpenters, and doctors, one and one-quarter or one and one-half; all others got one share each (5).

The Captain served at the mercy of the crew, and could be removed from his position for acts of cowardice, cruelty, or failure to act in the best interest of the crew. A council governed the crew, representing the highest authority aboard the ship. In many ways this order was necessary to the survival of piracy. This group knew that they were operating on borrowed time, on the edge of the hangman's noose. Though they could be commissioned, if caught by an opposing party they faced death. They literally needed to hang together, or they could find themselves hanging separately, which bred a sense of fraternity that spread among pirates and manifested in cooperative tendencies at sea and in port.
In this context, flags emerged as identifiers:

In April 1719, when Howell Davis and crew sailed into the Sierra Leone River, the pirates captained by Thomas Cocklyn were wary until they saw on the approaching ship "her Black flag," then "immediately they were easy in their minds, and a little time after" the crews "saluted one another with their Cannon" (6).

Though conflict between pirate bands was not unheard of, the groups were largely cooperative, even across national boundaries. And they would defend each other. For example, when survivors of the wrecked Whidah were jailed in 1717, pirates "acquired" a ship captain, whom they told "if the Prisoners Suffered they would Kill every Body [the pirates] took belonging to New England" (7).
A version of the Jo... Read more »
Anderson, JL. (1995) Piracy and World History: An Economic Perspective on Maritime History. Journal of World History, 6(2), 175-199. info:/
Burgess Jr., D. (2009) Piracy in the Public Sphere: The Henry Every Trials and the Battle for Meaning in Seventeenth‐Century Print Culture. Journal of British Studies, 48(4), 887-913. DOI: 10.1086/603599
Rediker, M. (1981) "Under the Banner of King Death": The Social World of Anglo-American Pirates, 1716 to 1726. The William and Mary Quarterly, 38(2), 203-227. DOI: 10.2307/1918775
The ruins of La Milpa lie at the top of a steep, slippery path that winds upward from a rutted dirt road in Belize’s Rio Bravo Conservation Area. After scrambling up this path for the first time, I found myself beneath a dense jungle canopy, in the midst of a shadowy ruin. Unlike many other large Maya sites, La Milpa has not been uncovered, reconstructed, and opened to tourists. Instead, it remains shrouded in a thick layer of dirt and a thousand years’ worth of jungle growth. As you enter La Milpa, it’s easy to feel as though you’re discovering it for the first time.... Read more »
Dunning, N., Scarborough, V., Valdez, F., Luzzadder-Beach, S., Beach, T., Jones, J. (1999) Temple mountains, sacred lakes, and fertile fields: ancient Maya landscapes in northwestern Belize. Antiquity, 73(281), 650-660. info:/
Ambiguous figures are drawings that seem to flip from being one thing to another.

Psychologists Melissa Allen and Alison Chambers recently showed these images to teenagers with autism in an attempt to find out whether they were able to perceive the effect normally: Implicit and explicit understanding of ambiguous figures by adolescents with autism spectrum disorder.

A leading theory of autism is weak central coherence - the idea that autistic people tend to be focussed on details, rather than the "big picture". This might predict that autism would interfere with the perception of these figures, because the ambiguity is all about the global, gestalt meaning: the details are fixed, but you can see them as adding up to two different things.

The autistic teens and a control group were shown the images and asked to copy them using a pen and paper. Then their drawings were rated for "duckness" or "rabbitness", or equivalent, by a rater who wasn't told which diagnosis the drawer had.

The results showed that the autistic group were able to perceive both interpretations of the figures, and were equally likely to report experiencing the "reversal" phenomenon in which the image seems to flip. However, when it came to the drawings, they were less biased by being told which interpretation to use. When the instructions said "Draw this rabbit" as opposed to "Draw this picture", controls tended to make their copy more rabbity, but autistic people copied it faithfully.

Beyond their relevance to autism, these kinds of pictures are interesting because they tell us something important about perception. You can't see these images for what they really are. They really are ambiguous - they're neither duck nor rabbit. They're both. However, our brains insist that they are one or the other, at any one time. They're duck, rabbit, duck, rabbit. But they never seem to be a "duckrabbit". Not for me, anyway.
Even though I know, in an abstract sense, that this is what they really are.Both "duck" and "rabbit" are things we've encountered a thousand times before. So we seem to be drawn to see them in those familiar terms. "Duckrabbits" are unheard of, outside psychology. Rather than sit on the fence, our perceptions fall into the well-worn grooves of our preexisting categories.Allen ML, & Chambers A (2011). Implicit and explicit understanding of ambiguous figures by adolescents with autism spectrum disorder. Autism : the international journal of research and practice PMID: 21486897... Read more »
Allen ML, & Chambers A. (2011) Implicit and explicit understanding of ambiguous figures by adolescents with autism spectrum disorder. Autism : the international journal of research and practice. PMID: 21486897
A new method conclusively solves an ancient linguistic riddle.... Read more »
G. Artioli, V. Nociti, & I. Angelini. (2011) Gambling with Etruscan dice: a tale of numbers and letters. Archaeometry. info:/http://onlinelibrary.wiley.com/doi/10.1111/j.1095-9270.2011.00317.x/abstract
G. Bonfante, & L. Bonfante. (2002) The Etruscan language: an introduction. New York University Press. info:other/7190 5539 3
New research on ancient Chinese burials identifies male and female slave sacrifices.... Read more »
H. Zhang, F. Liu, W. Liu, J. Du, X. Wu, X. Chen, & G. Liao. (2011) Sex identification of slave sacrifice victims from Qin State tombs in the Spring and Autumn Period of China using ancient DNA. Archaeometry. info:/10.1111/j.1475-4754.2010.00553.x
Back when I was a mere first-year biology student, the first thing we were taught was this:

DNA makes RNA makes Protein.

This is the Central Dogma of Molecular Biology, and it describes the intricate and beautiful process by which genes influence living things. The whole thing really is remarkable.

Unfortunately, some people in psychiatry seem to have forgotten this. Reading some of the literature, you would think that:

DNA makes DSM Diagnoses

Or, if you're feeling especially adventurous and conscious of the fact that diagnoses are not necessarily real entities:

DNA makes Symptoms (which add up to make DSM Diagnoses)

In fact, DNA has nothing to do with symptoms either, not directly. DNA makes proteins. Proteins interact with each other, and with all kinds of hormones and other signalling molecules, to control the growth and function of cells. Cells don't get symptoms. People get symptoms - and people are very complex systems made of billions of cells.

So it would be extremely weird if a particular genetic variant only ever caused one specific disease. That would mean that, whenever you have that variant, and regardless of any other variants or environmental factors, it will always mess up cell function such that it causes the same ultimate symptoms.

That does happen. There are lots of single-gene disorders - or to put it another way, single-disorder genes. But they may well be the exception. Rather, as Matthew State says in a short paper just out in Biological Psychiatry, the latest research suggests that genes that are linked to one psychiatric disorder are usually linked to lots of them, sometimes ones with quite different symptoms.

I previously wrote about the case of "The ADHD Gene" that's actually a gene for lots of stuff including, sometimes, ADHD. State focusses on the example of the gene CNTNAP2, variants in which have been linked to (deep breath): epilepsy, mental retardation, autism, social anxiety, schizophrenia and Tourette's.
Sometimes the same variant causes multiple different disorders in different people. Sometimes one variant causes one thing and protects against another, related, thing. Hmm.

As State says, one possibility is that any given mutation always causes the same symptoms; it's just that our diagnostic categories are imperfect, so the same symptoms get labelled as many different things. That's certainly true, but as he points out, there's a more radical possibility: the same variant might cause genuinely different symptoms.

mutations at single gene or locus may carry significant risks for truly divergent neurodevelopmental outcomes, neither demonstrating specificity for a clinically observable phenomenon nor conferring any reliable overlap among disparate behavioral phenotypes.

How? Well, suppose there was a variant, "pinker", that codes for a fluorescent protein that makes half of your brain cells glow bright pink. By itself, that wouldn't cause symptoms. No-one would even know.

Yet imagine another variant, "pinkophobe", that made cells refuse to communicate with pink cells. That wouldn't cause any symptoms either, by itself. But in conjunction with "pinker", it would cause serious problems: half of your cells would be effectively out of action.

But suppose you carried "pinker" and yet another variant, "welovepink", that made your cells respond much more strongly to pink cells. Then you would have the opposite problem. Half of your cells would be super-responsive to the other half, and that would probably cause epilepsy, amongst other things. You'd get symptoms, but they would be completely different symptoms from people who had "pinker" and "pinkophobe".

So what symptoms does "pinker" cause? It doesn't cause symptoms. It's just a gene. The symptoms come much later. "pinker" would be associated with all kinds of stuff, even though it has a very specific role. It just codes for one protein. Genes are pretty simple folk.
The complexity comes later.

This is a silly example, but maybe not so far-fetched after all. Neurons don't glow pink, but they do release neurotransmitters; and they don't have color preferences, but they do have receptors that respond to transmitters.

State MW (2011). The Erosion of Phenotypic Specificity in Psychiatric Genetics: Emerging Lessons from CNTNAP2. Biological Psychiatry, 69(9), 816-7. PMID: 21497679... Read more »
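The "pinker" thought experiment is easy to restate as code. This toy sketch just encodes the post's hypothetical variants and outcomes; none of these are real genes or phenotypes:

```python
# Toy model of the epistasis in the "pinker" thought experiment above.
# Variant names and outcomes are the post's hypotheticals, not real genetics.

def phenotype(variants):
    v = set(variants)
    if "pinker" not in v:
        return "typical"             # the modifier variants do nothing alone
    if "pinkophobe" in v:
        return "under-connectivity"  # half the cells are ignored
    if "welovepink" in v:
        return "epilepsy-like"       # half the cells are over-driven
    return "typical"                 # "pinker" alone is silent

# The same variant, different outcomes depending on genetic background:
print(phenotype(["pinker"]))                # typical
print(phenotype(["pinker", "pinkophobe"]))  # under-connectivity
print(phenotype(["pinker", "welovepink"]))  # epilepsy-like
```

An association study that only sees phenotype would link "pinker" to both outcomes, even though its molecular role is single and specific - which is exactly State's point.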
State MW. (2011) The Erosion of Phenotypic Specificity in Psychiatric Genetics: Emerging Lessons from CNTNAP2. Biological psychiatry, 69(9), 816-7. PMID: 21497679
Life as we know it has taken some strange courses. Of all the things an animal could do with its time, pretending to be an ant is apparently pretty popular. According to a review article in the latest Current Biology, there are probably over 2000 abhorrent species of myrmecomorphs (ant impersonators), including spiders, caterpillars, mites, beetles, and other types of arthropod biodiversity I'm not familiar with, that have come to resemble ants in some form or another.
It's interesting how and why different life forms have come to p-ant-omime. For example, in the picture above, (Maderspacher & Stensmyr 2011, Fig. 3) on the left side is the crab spider (Aphantochilus rogersi) mimicking ant species in the genus Cephalotes - which the spider comes upon unawares and then feeds upon (getting pwned on the right side of the photo). If imitation is the sincerest form of flattery, then mimicry must be the most malevolent means of creepy.
Or here's a treehopper (Cyphonia clavata, an insect and not a spider like the one above) that doesn't just disguise itself as an ant, but rather has a whole ant-shaped appendage bursting from its back in a disgusting perversion of the alien births in the Alien series (Maderspacher & Stensmyr 2011, Fig. 1). It is quite remarkable that a surprisingly common yearning to be perceived as an ant has resulted in convergent evolution of an ant-ish figure in a myriad of nature's more disgusting creations, not to mention in ants themselves.
Reference
Florian Maderspacher & Marcus Stensmyr (2011). Myrmecomorphomania. Current Biology, 21(9): R291-293. doi:10.1016/j.cub.2011.04.006... Read more »
Florian Maderspacher, & Marcus Stensmyr. (2011) Myrmecomorphomania. Current Biology, 21(9). info:/doi:10.1016/j.cub.2011.04.006
True story.... Read more »
Marzluff, J., Walls, J., Cornell, H., Withey, J., & Craig, D. (2010) Lasting recognition of threatening people by wild American crows. Animal Behaviour, 79(3), 699-707. DOI: 10.1016/j.anbehav.2009.12.022
A little over 2 million years ago there was a major divergence of hominids, leading on the one hand to our earliest ancestors in the genus Homo, and on the other to a group of 'robust' australopithecines, the latter group a failed evolutionary experiment in being human. In our ancestors, parts of the skull associated with chewing began to get smaller and more delicate, while the robust australopithecines increased the sizes of their crushin'-teeth and chewin'-muscle attachments...... Read more »
Cerling TE, Mbua E, Kirera FM, Manthi FK, Grine FE, Leakey MG, Sponheimer M, & Uno KT. (2011) Diet of Paranthropus boisei in the early Pleistocene of East Africa. Proceedings of the National Academy of Sciences of the United States of America. PMID: 21536914
McCollum, M. (1999) The Robust Australopithecine Face: A Morphogenetic Perspective. Science, 284(5412), 301-305. DOI: 10.1126/science.284.5412.301
Ungar PS, Grine FE, & Teaford MF. (2008) Dental microwear and diet of the Plio-Pleistocene hominin Paranthropus boisei. PloS one, 3(4). PMID: 18446200
Research Blogging is powered by SMG Technology.
To learn more, visit seedmediagroup.com.