Post List

  • August 11, 2011
  • 06:00 AM

Untangling MS gene by gene

by Suzanne Elvidge in Genome Engineering

Multiple sclerosis (MS) is an autoimmune disorder, and an international team of researchers has identified 57 different genes that could be associated with its development. These results are helping untangle the story of MS: carrying these genes increases the risk of developing MS. However, other factors may be involved, and some genes could be shared with other autoimmune diseases.
... Read more »

Sawcer, S., Hellenthal, G., Pirinen, M., Spencer, C., Patsopoulos, N., Moutsianas, L., Dilthey, A., Su, Z., Freeman, C., Hunt, S.... (2011) Genetic risk and a primary role for cell-mediated immune mechanisms in multiple sclerosis. Nature, 476(7359), 214-219. DOI: 10.1038/nature10251  

  • August 11, 2011
  • 05:30 AM

Towards or away from?

by David Winter in Careers - in Theory

Think about a recent job change that you made by your own initiative (rather than by force of circumstance, such as redundancy). Why did you change? Had you got so fed up with your previous job that you had to move to preserve your sanity? Or were you tempted away by the opportunities on offer [...]... Read more »

  • August 11, 2011
  • 05:30 AM

Did you hear the one about kids who eat candy being thinner?

by Yoni Freedhoff in Weighty Matters

I did.

I heard about it when Linda Bacon from HAES tweeted a link to a press release about it multiple times, calling it "Myth Busting". Knowing that Linda knows how to critically appraise a journal article, I figured it'd be worth reading the actual study.

I was wrong.

The study looked at a single day's 24-hour dietary recall collected from 11,182 children aged 2-18 years, and then compared candy intake to overweight and obesity status in those same children.

Now dietary recall is known to be fraught with error, especially when it comes to less than healthy foods.

So is there evidence of error here?

Well according to their results, only 30% of children have candy on a daily basis, where candy means a sugar candy or chocolate (more on that in a bit).

That sure sounds like an awfully small number.

And of the kids who actually admitted eating candy, how much were they found to be eating?

One chocolate bar worth for teens aged 14-18 and about 2/3 of a chocolate bar worth for kids aged 2-13.

That sounds like an awfully small number too.

Now maybe kids really don't eat candy any more. Maybe the world's changed more than I've envisioned and only 3 out of 10 children eat candy daily, and do so in rather tempered amounts. And maybe candy's not only not bad for you, but it's good for you, specifically good for you in regard to weight, in that this study found that the kids who reported eating candy were 22 to 26 percent less likely to be overweight or obese!

Of course the other possibility is that it's just an awful study that doesn't fairly lend itself to any conclusion whatsoever (pro or con).

And while we're on the awful-study angle: given that this is a study whose authors' conclusion, and whose public relations spin, is that candy's not bad for you and apparently is protective against overweight and obesity, I think it's probably also worth asking what wasn't counted as candy.

Cookies, freezies, ice cream, pudding, fruit roll-ups, cake, pie, etc. - none of those counted; only chocolate candy and sugar candy did. So what else wouldn't fit? Any other junk food - chips, pretzels and the like.


The only conclusion I'm able to fairly draw from this study is that those of us with any degree of Twitter influence really have to hold ourselves to a higher standard of retweeting. It's always tempting to retweet a press release or a blog post about a study that fits within our own confirmation biases, but before we do, we should really feel obligated to first read the actual study and evaluate it just as critically as we would those studies that don't fit neatly within our personal narratives.

O'Neil, C. E., Fulgoni III, V. L., & Nicklas, T. A. (2011). Association of candy consumption with body weight measures, other health risk factors for cardiovascular disease, and diet quality in US children and adolescents: NHANES 1999–2004. Food & Nutrition Research, 55. DOI: 10.3402/fnr.v55i0.5794

... Read more »

  • August 11, 2011
  • 04:06 AM

Split brains, autism and schizophrenia

by Kevin Mitchell in Wiring the Brain

A new study suggests that a gene known to be causally linked to schizophrenia and other psychiatric disorders is involved in the formation of connections between the two hemispheres of the brain. DISC1 is probably the most famous gene in psychiatric genetics, and rightly so. It was discovered in a large Scottish pedigree, where 18 members were affected by psychiatric disease.
The diagnoses ranged from schizophrenia and bipolar disorder to depression and a range of “minor” psychiatric conditions. It was found that the affected individuals had all inherited a genetic anomaly – a translocation of genetic material between two chromosomes. This basically involves sections of two chromosomes swapping with each other. In the process, each chromosome is broken, before being spliced back to part of the other chromosome. In this case, the breakpoint on chromosome 1 interrupted a gene, subsequently named Disrupted-in-Schizophrenia-1, or DISC1.

That this discovery was made using classical “cytogenetic” techniques (physically looking at the chromosomes down a microscope) and in a single family is somehow pleasing in an age where massive molecular population-based studies are in vogue. (A win for “small” science).

The discovery of the DISC1 translocation clearly showed that disruption of a single gene could lead to psychiatric disorders like schizophrenia. This was a challenge to the idea that these disorders were “polygenic” – caused by the inheritance in each individual of a large number of genetic variants. As more and more mutations in other genes are being found to cause these disorders, the DISC1 situation can no longer be dismissed as an exception – it is the norm.

It also was the first example of a principle that has since been observed for many other genes – namely that the effects of the mutation can manifest quite variably - not as one specific disorder, but as different ones in different people. Indeed, DISC1 has since been implicated in autism as well as adult-onset disorders. It is now clear from this and other evidence that these apparently distinct conditions are best thought of as variable outcomes that arise, in many cases at least, from disturbances of neurodevelopment.

Since the initial discovery, major research efforts of a growing number of labs have been focused on the next obvious questions: what does DISC1 do? And what happens when it is mutated? What happens in the brain that can explain why psychiatric symptoms result?

We now know that DISC1 has many different functions. It is a cytoplasmic protein - localised inside the cell - that interacts with a very large number of other proteins and takes part in diverse cellular functions, including cell migration, outgrowth of nerve fibres, the formation of dendritic spines (sites of synaptic contact between neurons), neuronal proliferation and regulation of biochemical pathways involved in synaptic plasticity. Many of the proteins that DISC1 interacts with have also been implicated in psychiatric disease.

This new study adds another possible function, and a dramatic and unexpected one at that. This function was discovered from an independent angle, by researchers studying how the two hemispheres of the brain get connected – or more specifically, why they sometimes fail to be connected. The cerebral hemispheres are normally connected by millions of axons which cross the midline of the brain in a structure called the corpus callosum (or “tough body” – don’t ask). Very infrequently, people are born without this structure – the callosal axons fail to cross the midline and the two hemispheres are left without this major route of communication (though there are other routes, such as the anterior commissure).

The frequency of agenesis of the corpus callosum has been estimated at between 1 in 1,000 and 1 in 6,000 live births – thankfully very rare. It is associated with a highly variable spectrum of other symptoms, including developmental delay, autistic symptoms, cognitive disabilities extending into the range of mental retardation, seizures and other neurological signs.

Elliott Sherr and colleagues were studying patients with this condition, which is very obvious on magnetic resonance imaging scans (see Figure). They initially found a mother and two children with callosal agenesis who all carried a deletion on chromosome 1, at position 1q42 – exactly where DISC1 is located. They subsequently identified another patient with a similar deletion, which allowed them to narrow down the region and identify DISC1 as a plausible candidate (among some other genes in the deleted region). Because the functions of proteins can be affected not just by large deletions or translocations but also by less obvious mutations that change a single base of DNA, they also sequenced the DISC1 gene in a cohort of callosal agenesis patients and found a number carrying novel mutations that are very likely to disrupt the function of the gene.

While not rock-solid evidence that it is DISC1 that is responsible, these data certainly point to it as the strongest candidate to explain the callosal defect. This hypothesis is strongly supported by findings from DISC1 mutant mice (carrying a mutation that mimics the effect of the human translocation), which also show defects in formation of the corpus callosum. In addition, the protein is strongly expressed in the axons that make up this structure at the time of its development.

The most obvious test of whether disruption of DISC1 really causes callosal agenesis is to look in the people carrying the initial translocation. Remarkably, it is not known whether the original patients in the Scottish pedigree who carry the DISC1 translocation show this same obvious brain structural phenotype. They have, very surprisingly, never been scanned.

This new paper raises the obvious hypothesis that the failure to connect the two hemispheres results in the psychiatric or cognitive symptoms, which variously include reduced intellectual ability, autism and schizophrenia. This seems like too simplistic an interpretation, however. All we have now is a correlation. First, the implication of DISC1 in the acallosal phenotype is not yet definitive – this must be nailed down and replicated. But even if it is shown that disruption of DISC1 causes both callosal agenesis and schizophrenia (or other psychiatric disorders or symptoms), this does not prove a causal link. DISC1 has many other functions and is expressed in many different brain areas (ubiquitously in fact). Any, or indeed all, of these functions may in fact be the cause of psychopathology.

One prediction, if it were true that the lack of connections between the two hemispheres is causal, is that we would expect the majority of patients with callosal agenesis to have these kinds of psychiatric symptoms. In fact, the rates are indeed very high – in different studies it has been estimated that up to 40% of callosal agenesis patients have an autism diagnosis, while about 8% have the symptoms of schizophrenia or bipolar disorder. (Of course, these patients may have other, less obvious brain defects as well, so even this is not definitive).

Conversely, we might naively expect a high rate of callosal agenesis in patients with autism or schizophrenia. However, we know these disorders are extremely heterogeneous and so it is much more likely that this phenotype might be apparent in only a specific (possibly very small) subset of patients. This may indeed be the case – callosal agenesis has been observed in about 3 out of 200 schizophrenia patients (a vastly higher rate than in the general population). Another study, just published, has found that mutations in a different gene – ARID1B – are also associated with callosal agenesis, mental retardation and autism. More generally, there may be subtle reductions in callosal connectivity in many schizophrenia or autism patients (including some autistic savants).

Whether this defect can explain particular symptoms is not yet clear. For the moment, the new study provides yet another possible function of DISC1, and highlights an anatomical phenotype that is apparently present in a subset of autism and schizophrenia cases and that can arise due to mutation in many different ... Read more »

Osbun N, Li J, O'Driscoll MC, Strominger Z, Wakahiro M, Rider E, Bukshpun P, Boland E, Spurrell CH, Schackwitz W.... (2011) Genetic and functional analyses identify DISC1 as a novel callosal agenesis candidate gene. American journal of medical genetics. Part A, 155(8), 1865-76. PMID: 21739582  

  • August 11, 2011
  • 03:15 AM

Do We Need Placebos?

by Neuroskeptic in Neuroskeptic

A news feature in Nature asks whether placebo controls are always a good idea: Why Fake It?

The piece looks at experimental neurosurgical treatments for Parkinson's, such as "Spheramine". This consists of cultured human cells, which are implanted directly into the brain of the sufferer. The idea is that the cells will grow and help produce dopamine, which is deficient in Parkinson's.

Peggy Willocks, a 44-year-old teacher, took part in a trial of the surgery in 2000. She says it helped stave off the symptoms for years, but the development of Spheramine was axed in 2008 after a controlled trial found it didn't work any better than a placebo.

The placebo was "sham surgery" i.e. putting the patient through a full surgical procedure, and making holes in their skull, but without doing anything to their brain.

It's cheap and easy to do a placebo controlled trial of a drug - all you need is a sugar pill. But with neurosurgery, it's clearly a lot more involved. A placebo has to be believable. Convincing sham surgery is expensive, time-consuming, and it has real risks, albeit small ones.

Is it ethical to put patients through that?

That, I think, can only be decided on a trial-by-trial basis. It depends on the likely benefits of the treatment, and whether the trial is scientifically sound. Obviously, it'd be wrong to do sham surgery as part of a flawed trial that won't tell us anything useful.

The Nature article, however, goes further than this, and suggests that placebo controlled trials may be unsuitable for testing these kinds of treatments, failing to detect a real benefit in some patients:
There are hints from some of the failed phase II trials that patients followed up beyond study endpoints might tell a more positive story. Some say, therefore, that sham controls are sinking the prospects of valuable drugs.

Anders Björklund, a neuroscientist at Lund University in Sweden who is collaborating with [Roger Barker of Cambridge], says that sham surgery can lead researchers to throw out a strategy prematurely if the trial fails because of technical or methodological glitches rather than a true lack of efficacy.

A patient advocate agrees:

According to Perry Cohen, who leads a network of patient activists called the Parkinson Pipeline Project, that’s exactly what is happening. He had always questioned the need for sham surgery, he says, but after the string of phase II failures, “We started saying, ‘Hey, this is a problem. These trials failed, but we know they are working for some people.’”...Cohen [says] that patients have different priorities and that researchers must take these into account. Researchers use placebo controls to weed out false positives. But for patients, the real ogre is the false negatives — which can sink a therapy before it has been optimized.

I'm not sure about this. If I had Parkinson's, I would certainly hate to miss out on a genuine cure because a trial had failed to recognize that it worked. But equally, I would not be happy to be given a rubbish treatment that would have failed a placebo controlled trial, but never got one, because of arguments like this.

Placebo controlled trials can fail to detect benefits if they are too short, too small, methodologically flawed, or whatever. Certainly, a trial can be placebo controlled, and still crap. But the answer is surely to do better trials, not no trials.

It may well be that we shouldn't rush to do placebo controlled trials until later in the development process, when the technique has been properly refined. But the history of medicine is littered with treatments that "we know work for some people" - that didn't.

Katsnelson, A. (2011). Experimental therapies for Parkinson's disease: Why fake it? Nature, 476 (7359), 142-144 DOI: 10.1038/476142a... Read more »

  • August 11, 2011
  • 01:46 AM

Advanced cancer and the ketogenic diet

by Lucas Tafur in Ketotic

A restricted ketogenic diet for cancer management.... Read more »

  • August 10, 2011
  • 10:59 PM


by Lachlan Jackson in Language on the Move

Having lived and taught English in Japan for more than fifteen years, until last night I’d thought I’d seen it all. That was until I stumbled across the もし彼氏が外国人だったら英会話 (What if my Boyfriend was a Foreigner English Conversation [my translation]) … Continue reading →... Read more »

Takahashi, Kimie. (2010) Multilingual couple talk: Romance, identity, and the political economy of language. D. Nunan, 199-207. info:/

  • August 10, 2011
  • 09:19 PM

Bad Beetle Karma

by bug_girl in Bug Girl's Blog

I realized after my interview last weekend that I had never actually covered Japanese Beetle Bags on my blog!  That omission must be remedied! I’m sure you’ve seen them–they are for sale all over.   The sad truth is that they don’t work. Sure, they fill up the bag-o’-death in a really satisfying way, but they [...]... Read more »

Gordon, F. C., & Potter, D. A. (1985). Efficiency of Japanese beetle (Coleoptera: Scarabaeidae) traps in reducing defoliation of plants in the urban landscape and effect on larval density in turf. J. Econ. Entomology, 774-778. info:/

  • August 10, 2011
  • 08:00 PM

Temporal precision in the LGN

by Patrick Mineault in xcorr

There’s a new paper just out in J Neurosci by Dan Butts et al. (2011) that offers some key insights into temporal precision in the lateral geniculate nucleus (LGN). The spike trains of LGN cells are remarkably regular; while a Poisson train has Fano factor (variance to mean ratio) of 1, and cortical neurons in [...]... Read more »
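The Fano factor mentioned in the excerpt is easy to demonstrate numerically. The sketch below (my own illustration, not from the paper; the gamma-interval model and all parameter values are assumptions) shows that Poisson spike counts have a Fano factor near 1, while a more regular train with gamma-distributed inter-spike intervals, loosely mimicking a refractory period, falls well below 1.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, window, trials = 20.0, 1.0, 100_000  # spikes/s, window (s), repeats

# Poisson process: count variance equals count mean, so Fano factor ~ 1.
poisson_counts = rng.poisson(rate * window, size=trials)
fano_poisson = poisson_counts.var() / poisson_counts.mean()

def count_spikes(shape, rate, window, rng):
    # Draw gamma-distributed inter-spike intervals with mean 1/rate,
    # then count how many cumulative spike times fall inside the window.
    isis = rng.gamma(shape, 1.0 / (shape * rate), size=int(rate * window * 3) + 50)
    return np.searchsorted(np.cumsum(isis), window)

# Gamma shape k=4 gives a sub-Poisson (more regular) train: for a renewal
# process the Fano factor approaches CV^2 = 1/k.
regular_counts = np.array([count_spikes(4, rate, window, rng) for _ in range(2000)])
fano_regular = regular_counts.var() / regular_counts.mean()

print(f"Poisson Fano factor: {fano_poisson:.2f}")  # close to 1
print(f"Regular Fano factor: {fano_regular:.2f}")  # well below 1
```

This is the sense in which LGN spike trains are "remarkably regular": their count variability sits below the Poisson benchmark of 1.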

Peter Dayan, & Larry Abbott. (2001) Theoretical Neuroscience. MIT Press. info:other/0262041995

Berry MJ 2nd, & Meister M. (1998) Refractoriness and neural precision. The Journal of neuroscience : the official journal of the Society for Neuroscience, 18(6), 2200-11. PMID: 9482804  

Butts DA, Weng C, Jin J, Yeh CI, Lesica NA, Alonso JM, & Stanley GB. (2007) Temporal precision in the neural code and the timescales of natural vision. Nature, 449(7158), 92-5. PMID: 17805296  

  • August 10, 2011
  • 07:36 PM

Pantomiming Primates

by Paul Norris in AnimalWise

When considering language abilities in non-human animals, it pays to keep in mind that spoken words are not the only path to sophisticated communication. For example, while great apes like chimpanzees and orangutans may be limited in their ability to … Continue reading →... Read more »

Russon, A., & Andrews, K. (2010) Orangutan pantomime: elaborating the message. Biology Letters, 7(4), 627-630. DOI: 10.1098/rsbl.2010.0564  

  • August 10, 2011
  • 06:13 PM

Pricing in Times of Disruption

by Daniel Dumke in SCRM Blog - Supply Chain Risk Management

Many articles, including my own research, show that companies tend to focus largely on risk mitigation measures concerning the supply side. Little is done to include demand-side risks or demand-side measures in the mitigation of supply chain risks. The study “Pricing During Disruptions: A Cause of the Reverse Bullwhip Effect” focuses on optimal pricing measures during a disruption, and so it helps to close that gap a little.

Rong, Y., Shen, Z.-J. M., & Snyder, L V. (2011) Pricing During Disruptions: A Cause of the Reverse Bullwhip Effect. SSRN. info:/

  • August 10, 2011
  • 04:15 PM

Should vaccinology embrace systems biology?

by Connor Bamford in The Rule of 6ix

Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius -- and a lot of courage -- to move in the opposite direction.
E. F. Schumacher (1911-1977) British Economist
Vaccines represent one of the most cost-effective methods around to prevent loss of life and disease in a whole range of animals, including humans. Over the last 200 years or so, we've become pretty adept at producing them, and so the science of vaccinology - or how to generate these complex pharmaceuticals - has led to the eradication (and near eradication) of many viral pathogens.

This is one network in your immune response following influenza vaccination (Kokke et al 2011), as identified through systems biology approaches. Knowledge of key mediators in these pathways may allow for the rational design of new vaccines - but is it worth it?

But it hasn't succeeded for a number of current killer viruses (respiratory syncytial virus and HIV, to name but two), and we have begun to think that maybe the method of 'isolate, attenuate, vaccinate', or the synthesis of single virus antigen molecules, isn't gonna cut it anymore. So what are we going to do?

We want to rationally generate vaccines - taking a wild-type, disease-causing isolate and, through some genetic engineering, making it sufficiently weak to generate an effective immune response in patients while not causing disease. Yet this is harder than it looks, and so some researchers are now turning to systems biology for a glimpse into how some of our most successful vaccines function, so that we can reconstruct these processes for the many viral pathogens we are yet to protect against.

*I have explored how we may attempt to do this from the virus's perspective (mumps virus versus vaccine, here) - more precisely: how come the vaccine strain is less deadly than the 'wild' strains? Both are valid approaches and probably just as difficult to carry out as each other.*

Systems biology affords us a chance to more fully understand the complexity of living systems. Through the collection of reams of data (DNA sequences, gene expression changes, protein levels and other 'omics' technologies), we are now able to adequately model what is going on in the organism or cell using increasingly common bioinformatic and statistical analyses. As in the quote I used above, it is not about making the study of life more complex; it is about recognizing that complexity and doing something to understand it better under a broader way of thinking. This allows us to 'see' changes and functional differences that we would never have observed had we designed the experiment from our a priori knowledge alone, and this global, holistic view may just be the savior that vaccinology needs now.

Example of the complex data collected during a typical 'systems' experiment - what does it all mean, and how can we find something important and worthwhile to study?

I am aware of a number of papers that are currently using this process as a primer to develop improved vaccine products (see here, here and a review here of virus vaccine examples). These groups - for example, Bali Pulendran's at the Emory Vaccine Center in Atlanta, USA - are interested in comparing the immune responses (humoral response, innate immunity and gene expression changes) of human subjects administered vaccines. The responses to the yellow fever virus vaccine, as well as to two types of influenza vaccine, have been examined, and through complex bioinformatic modelling the researchers were able to pull out some significant correlates of immune response. This, they hope, will aid the future testing of novel vaccines and facilitate a rational take on vaccine generation by identifying genes and proteins with a functional role in the immune response. This they have begun to do in some of their papers above - it is nice to see this kind of work being used as a basis for experimental biology.

These types of studies hail a new way of thinking about viruses, vaccines and the immune response to them; if only we can realize the power in taking a step back and looking at the diversity in each. And this type of work could be applied to any number of mechanisms such as vaccine safety or applying it to different tissues during an infection.

That said, this stuff isn't as easy, cheap or quick as you might think. But as each study generates so much data, might it not take but a few such investigations to lead us on the way to rationally attenuated and protective vaccines? So, should vaccinology embrace systems biology? I think that if you have the ability to do such a study - and from a pharma perspective the answer is definitely yes - then do it, as the more information we have at our disposal, the better position we are in. We await further results from these groups to see how well systems thinking stacks up against the human ingenuity that has worked so well in the past.

... Read more »

Nakaya HI, Wrammert J, Lee EK, Racioppi L, Marie-Kunze S, Haining WN, Means AR, Kasturi SP, Khan N, Li GM.... (2011) Systems biology of vaccination for seasonal influenza in humans. Nature immunology, 12(8), 786-95. PMID: 21743478  

  • August 10, 2011
  • 03:31 PM

Beware of Jurors Who Feel Downgraded

by Persuasion Strategies in Persuasive Litigator

By Dr. Ken Broda-Bahm - With Standard & Poor's recent decision to deny the U.S. a AAA credit rating, many Americans are feeling a little downgraded about now. For most of us, I can hope, that is a temporary feeling. But for others, especially in these economic times, it is a more constant aspect of their lives. These Americans, including increasingly those who show up for jury duty, are what the researchers call "status inconsistent." They may be higher in social prestige than in economic means (like teachers), or they may be highly educated, while holding a lower-level position. Indeed,...

... Read more »

Kraus, M. W., Piff, P. K., & Keltner, D. (2011). Social Class as Culture: The Convergence of Resources and Rank in the Social Realm. Current Directions in Psychological Science, 20(4). info:/

  • August 10, 2011
  • 02:34 PM

1st Installment of the Roman Bioarchaeology Carnival

by Kristina Killgrove in Powered By Osteons

Since I'm gearing up for a new semester (finishing up syllabi, packing for a move, etc.), I haven't had as much time as I'd like to blog about the interesting reports and publications that have come out recently on the topic of Roman-era skeletons.  So here's a carnival or round-up of links from the past few weeks, things I've wanted to talk about but haven't had the time to craft full posts about.


Roman Child Skeleton from Durnovaria
(credit: DorsetECHO)
August 10 - Today's news brought a brief story about the discovery of a skeleton of a Roman child from what used to be Durnovaria (modern-day Dorchester, England).  There's no osteological information in the report, but there is a nice little history of Durnovaria and this photo of the skeleton, which was found within the settlement (unclear if it was in a house).  It's not unusual to find children buried outside of cemeteries - within houses, near walls, etc.
August 8 - On Monday, the BBC gave a bit more coverage to the discovery of nearly 100 infant skeletons in a Roman-era villa in Britain.  Jill Eyers, who rediscovered the skeletons in a storeroom, put forth the idea last year that these infants were killed on purpose and that the villa was in use as a brothel.  [Original BBC report here, bit of video here.]  Dr. Eyers remains convinced of her theory, but scholars in both the classical and anthropological blogospheres are questioning that.  Most notable are the posts by archaeologist Rosemary Joyce, who wrote a critique of the theory last year and wrote an updated post yesterday continuing to cast doubt on the whole brothel idea.  Dr. Joyce's posts are well worth a read, as she delves into the historical and archaeological evidence of Roman brothels to bring a counter-point to the discussion of this interesting discovery.
August 8 - The American Journal of Physical Anthropology published an interesting paper on Monday by Becky Redfern and Sharon DeWitte (2011) on the effect of status on mortality risk in Roman-era Dorset, England.  The authors looked at nearly 300 individuals dating to the 1st to 5th centuries AD and assigned them a status level based on burial type.  Using models of mortality, they found that indeed higher-status individuals had lower mortality risk.  This was especially true for children and for people who were buried in (and presumably lived in) an urban environment.  Interestingly, male mortality risk was higher than female mortality risk (I presume owing to warfare and other job hazards).  Redfern and DeWitte conclude that, "...the cultural buffering afforded by being of high status enabled people to more effectively deal with urban environments and migration, with lower-status individuals having greater risk because of their forms of employment and living conditions."  We can, of course, assume that individuals with higher status had better diets and overall health, and therefore lower mortality risk. But it's great to see researchers actually test that hypothesis.  It's also interesting to see that urban denizens had lower risk of mortality; in many ancient societies, urbanism meant dramatic changes to health and wellbeing, but I've also been finding with my Romans that those who lived in or near the city were generally healthier than those from the suburbs and countryside.


Mummies arranged by age, sex, and occupation.
(credit: Panzer et al. 2010, Fig. 2)
Not Roman and not recent news, but still neat: last summer, S. Panzer and colleagues published a study of late 19th/early 20th century mummies from the Capuchin Catacombs in Palermo, Sicily.  The pictures in the article are astounding: the mummies are excellently preserved, and the radiographs show a variety of minor pathological conditions (e.g., healed fractures) in some of the mummies.  The authors were able to learn a lot about embalming techniques and about the health of the people who were given this treatment after death.

Interactive Teaching Tools

And finally, this link has been sitting in my bookmarks for a while.  I discovered the BBC's online video game Dig It Up: Romans through Katy Meyers' blog post (July 14, at Play the Past).  It's cute, fun, and educational.  Katy writes that, "not only does the game allow players to see the different stages of archaeology, but it is all done in a cultural resource management setting with the threat of construction setting time limits."  Unfortunately, I didn't find a skeleton when I played... just a lamp and an amphora.  But the game shows that archaeologists need sampling strategies, that we don't always find every piece of an artifact, and that we don't always find anything of interest (ah, memories of Spam cans from my days excavating at Monticello).  Go play it now!  You know you need a break from work.
Panzer S, Zink AR, & Piombino-Mascali D (2010). Scenes from the past: radiologic evidence of anthropogenic mummification in the Capuchin Catacombs of Palermo, Sicily. Radiographics 30 (4), 1123-32. PMID: 20631372.

Redfern RC, & Dewitte SN (2011). Status and health in Roman Dorset: The effect of status on risk of mortality in post-conquest populations. American Journal of Physical Anthropology PMID: 21826637.... Read more »

  • August 10, 2011
  • 02:27 PM

Yeast Show Humans Why It's Better to Be a Clump

by Elizabeth Preston in Inkfish

When the first single-celled organisms left behind their loner ways and began existing as blobs of cells, it was a big step for life on this planet. Organisms could now grow orders of magnitude larger than their single-celled neighbors. They could organize their cells into different types that performed different functions. Instead of drifting around with the other specks, multicellular organisms could grow, swim, crawl, fly, and evolve into everything else on Earth today.

It's nearly impossible to know how organisms first made the leap into multicellularity. But in an effort to look back in time, researchers bred collections of yeast cells that competed with each other for food. They found that the innovation of living as a group of cells would have given the first many-celled organisms a clear advantage--and that it may have been spurred by nothing more than sloppy wall building.

Yeast is a single-celled fungus, but in the wild it sometimes grows in clumps. This happens when a yeast cell clones itself, budding an identical "daughter cell," but fails to pinch the clone entirely free. These cells continue to reproduce, forming clumps. The researchers wondered if this incomplete separation of single-celled clones could have fostered the rise of multicellularity.

To find out, they first looked at individual yeast cells growing in an environment where their only food was sucrose, a sugar they have to break down before ingesting. To do this, the yeast secrete an enzyme called invertase that splits sucrose into its components, fructose and glucose, which the cell can then absorb. (One advantage of being a complex, multicellular animal is that we don't have to digest our food outside our bodies.) When the yeast cells were all alone, they struggled to survive in the all-sucrose environment. But when other yeast cells were nearby, they all benefited by absorbing the leftover sugars that escaped their neighbors.

Knowing that external digestion works better with friends around, the researchers hypothesized that clumps of yeast should be able to thrive where individual yeast cells can't. They genetically engineered yeast with a gene that encouraged clumping rather than separation. As predicted, these clumpy yeast were able to grow in environments with a low concentration of food, where single yeast cells couldn't survive.

It takes work for yeast cells to make and secrete invertase, though. Cells that are unable to make invertase are called "cheaters" because they can sit back and enjoy (or absorb, at least) the fruits of their neighbors' external digestion. If they're all alone, these cheaters won't find enough food. But when they're around enough other single cells, the cheaters will outcompete the hard-working invertase producers.

To find out whether multicellularity would help yeast defend themselves against cheaters, the researchers arranged competitions between cheater yeast cells and non-cheaters, either alone or in clumps. At low food concentrations, the clumpy cells easily outcompeted the individual cells, ending up with higher numbers in their population. When cheaters were thrown into the mix, clumpy cells retained their advantage. All around, clumping was a better strategy.

The authors may have modeled the beginning of multicellularity--or one beginning, since the trait evolved multiple times. They showed that the need to digest your food externally is enough to give groups of cells an edge over single cells, and that the strategy of keeping your clones all in one place is enough to seize that advantage. When the first mutant cells failed to fully separate from each other, and found that their little family was now growing faster than the individual cells around them, it could have been the start of something big. (That is, visible without a microscope.)

They may be mere microscopic fungi, but a community of yeast cells has similarities to a community of animals, or even a city. Organisms live in groups when doing so helps them to get food, fend off predators, and pass on their genes. Even when our communities resemble disorganized, unsightly blobs, they're what keep our species alive.

Koschwanez, J.H., Foster, K.R., & Murray, A.W. (2011). Sucrose Utilization in Budding Yeast as a Model for the Origin of Undifferentiated Multicellularity. PLoS Biology, 9(8). DOI: 10.1371/journal.pbio.1001122... Read more »

  • August 10, 2011
  • 01:02 PM

125 sq km of ice knocked off Antarctica by Tsunami

by Greg Laden in Greg Laden's Blog

The Honshu tsunami of March 11th (the one that caused the Fukushima disaster) caused the otherwise stable Sulzberger Ice Shelf to calve giant hunks of ice. Climate scientists call this "teleconnection." I call it a big whopping bunch of whack knocking off a gigunda chunka stuff. Either way, this is important and interesting. ... Read more »

Brunt, Kelly M., Okal, Emile A., & MacAyeal, Douglas. (2011) Antarctic ice-shelf calving triggered by the Honshu (Japan) earthquake and tsunami, March 2011. Journal of Glaciology, 57(205), 785-788.

  • August 10, 2011
  • 12:39 PM

Where did our smallpox vaccine come from?

by Connor Bamford in The Rule of 6ix

Edward Jenner's smallpox vaccination
Bring to mind the now famous 'first scientific exploration of vaccination', when, in the late 1700s, Edward Jenner - an English physician - first came up with the idea of using a non-pathogenic cowpox virus to vaccinate people against its deadly relative, smallpox (variola virus). 
Well, this virus and others like it, such as vaccinia virus (and its own viral derivatives, like the highly attenuated modified vaccinia virus Ankara), have been used worldwide to protect human populations from contracting smallpox (see Dryvax and its recombinant clone ACAM2000), an effort that resulted in the eradication of variola virus during the second half of the last century. These viruses are all thought to trace their ancestry back to cowpox isolates from around Jenner's time in the late 1700s, yet we don't know for certain where they originated.
Despite its renowned success in showing society the power of vaccination, the origin of cowpox has so far remained elusive. The story goes that Jenner's original cowpox isolate, through generation after generation, has somehow become what we know as vaccinia virus - but how is anyone's guess. Perhaps recombination with other poxviruses is responsible, or perhaps vaccinia is the last living representative of an otherwise extinct lineage.

What human cowpox looks like
Now, an international team of researchers (see paper here) has shed light on its origins by sequencing and studying whole genomes - in contrast to the single-gene studies of the past - of multiple currently circulating cowpox isolates from around the world, in order to uncover the secrets of poxvirus evolution more generally.
Viruses are a haven of genetic diversity - even DNA viruses, which have been largely ignored on that front in favor of their more mutation-prone RNA cousins - and nowhere is this more apparent than in the poxviruses. These viruses, including smallpox and the re-emerging human pathogen monkeypox, represent an immense amount of genotypic and phenotypic variation, which is in itself medically and evolutionarily important. Just think of the devastation that smallpox caused to the human population, and have a look at what monkeypox has been up to.

False-color electron micrograph of vaccinia virus particle

This group compared the DNA of the cowpox strains to that of other closely related poxviruses - smallpox itself, monkeypox, camelpox and taterapox (viruses which themselves have a difficult-to-trace past) - as well as to current vaccinia vaccine strains. They then generated large phylogenetic trees based on the compared sequences and mapped these onto a map of Europe to see if they could uncover a geographical pattern of cowpox evolution.

This analysis revealed an as-yet-unappreciated diversity hidden under what we have been calling 'cowpox viruses', identifying a number of well-defined monophyletic groups that should in their own right be designated separate species. Interestingly, it also appears that our vaccine strain has jumped species to horses and buffalo, as viruses isolated from these animals are close relatives of vaccinia-like strains.
Cowpox viruses were found to cluster in two major groups - cowpox-like and vaccinia-virus-like - suggesting that our smallpox 'vaccinia' vaccine potentially originated as a cowpox virus (as we thought), but one endemic to mainland Europe, which goes against the tale of Jenner's isolation of cowpox in the UK.
The authors suggest that further sampling of isolates from within the UK and across Europe may clear up the remaining taxonomic uncertainties. What this work does highlight is the oft-underappreciated diversity of large DNA viruses, especially the medically important poxviruses, and the difficulty of doing evolutionary analysis on viruses with such large genomes that readily recombine with one another.

Carroll, D., Emerson, G., Li, Y., Sammons, S., Olson, V., Frace, M., Nakazawa, Y., Czerny, C., Tryland, M., Kolodziejek, J., Nowotny, N., Olsen-Rasmussen, M., Khristova, M., Govil, D., Karem, K., Damon, I., & Meyer, H. (2011). Chasing Jenner's Vaccine: Revisiting Cowpox Virus Classification PLoS ONE, 6 (8) DOI: 10.1371/journal.pone.0023086... Read more »

Carroll, D., Emerson, G., Li, Y., Sammons, S., Olson, V., Frace, M., Nakazawa, Y., Czerny, C., Tryland, M., Kolodziejek, J.... (2011) Chasing Jenner's Vaccine: Revisiting Cowpox Virus Classification. PLoS ONE, 6(8). DOI: 10.1371/journal.pone.0023086  

  • August 10, 2011
  • 09:30 AM

Walking Linked to Cognitive Health in Women

by William Yates, M.D. in Brain Posts

Age-related cognitive decline is, to a certain extent, unavoidable.  Nevertheless, the rate of cognitive decline varies greatly between individuals, and this variance may reflect both environmental and genetic determinants.

Vascular disease is a risk factor for accelerated brain aging, Alzheimer's disease and other neurodegenerative disorders.  Vascular disease is therefore an appropriate target for exploring strategies for secondary prevention--preventing (or reducing) the risk of cognitive decline in those with a risk factor for this decline.

French scientist Marie-Noel Vercambre, along with colleagues from Harvard University, has recently examined the role of exercise in cognitive decline among women with vascular disease.  Women participating in this study were 40 or older with evidence of vascular disease, or risk for vascular disease, by meeting one of the following criteria:

  • History of stroke, transient ischemic attack, heart attack, angina, angioplasty, coronary artery bypass graft, or peripheral artery surgery
  • Three or more risk factors for vascular disease (diabetes mellitus, hypertension, hyperlipidemia, obesity and family history of early heart disease)

The women in the study were followed prospectively with monitoring of cognitive function.  Cognitive testing, including tests of memory, mental status and category fluency, was completed at baseline and over a follow-up period averaging about five years.  The primary outcome measure in this study was rate of cognitive decline.  Subjects were grouped into those with the lowest and highest levels of physical activity, including walking, and these groups were compared for rate of decline.

The women at the end of the study were in their early 70s.  Women with high levels of walking (approximately 30 minutes of brisk walking daily) had significantly lower rates of cognitive decline.  Women with daily walking habits performed at a cognitive level equivalent to non-walkers who were five to seven years younger.

This study looked for correlation and was not designed to prove that walking causes a reduced rate of cognitive decline.  Nevertheless, this research supports further clinical trials of aerobic exercise in women with vascular disease.  It will be important to include cognitive health as a key outcome measure in this type of research.

Photo of Juno Beach sunrise with filter from the author's collection.  Original unfiltered photo can be found here.

Vercambre, M., Grodstein, F., Manson, J., Stampfer, M., & Kang, J. (2011). Physical Activity and Cognition in Women With Vascular Conditions. Archives of Internal Medicine, 171(14), 1244-1250. DOI: 10.1001/archinternmed.2011.282... Read more »

Vercambre, M., Grodstein, F., Manson, J., Stampfer, M., & Kang, J. (2011) Physical Activity and Cognition in Women With Vascular Conditions. Archives of Internal Medicine, 171(14), 1244-1250. DOI: 10.1001/archinternmed.2011.282  

  • August 10, 2011
  • 09:26 AM

Tip of the week: CompaGB for comparing genome browser software

by Mary in OpenHelix

Here at OpenHelix we think a lot about the differences between nominally similar software that will accomplish some given task.  For example, in our workshops we are often asked about the differences between genome browsers.  Although UCSC sponsors our workshops and training materials on their browser, we know they aren’t the only genome browser out [...]... Read more »

Lacroix, T., Loux, V., Gendrault, A., Gibrat, J., & Chiapello, H. (2011) CompaGB: An open framework for genome browsers comparison. BMC Research Notes, 4(1), 133. DOI: 10.1186/1756-0500-4-133  

  • August 10, 2011
  • 08:19 AM

Special Editorial: Smoke Signals? How Second Hand Smoke Can Impact Your Child’s Mental Health

by Anita M. Schimizzi, Ph.D. in Child-Psych

We have known for a long time that secondhand smoke can have a serious impact on the physical health of children.  Asthma, sudden infant death syndrome, respiratory tract infections, dental decay, and middle ear infections are just a few of the illnesses that children exposed to secondhand smoke develop at significantly elevated rates.  In case parents [...]... Read more »
