Post List

Psychology posts

(Modify Search »)

  • October 30, 2014
  • 07:59 AM
  • 16 views

Fright Week: The Stranger in the Mirror

by The Neurocritic in The Neurocritic

In the mirror we see our physical selves as we truly are, even though the image might not live up to what we want, or what we once were. But we recognize the image as “self”. In rare instances, however, this reality breaks down.

In Black Swan, Natalie Portman plays Nina Sayers, a ballerina who auditions for the lead in Swan Lake. The role requires her to dance the part of the innocent White Swan (for which she is well-suited), as well as her evil twin the Black Swan, which is initially outside the scope of her personality and technical abilities. Another dancer is favored for the role of the Black Swan. Nina's drive to replace her rival, and her desire for perfection, lead to mental instability (and a breathtaking performance). In her hallucinations she has become the Black Swan.

The symbolic use of mirrors to depict doubling and fractured identity was very apparent in the film:

Perhaps Darren Aronofsky's [the director's] intentions for the mirror was its power to reveal hidden identities. If you noticed the scenes where Nina saw herself in the mirror, it reflected the illusion of an evil. The mirror presented to her the darkness within herself that metaphorically depicted the evolution into the black swan.
How can the recognition of self in a mirror break down?

Alterations in mirror self-recognition

There are at least seven main routes to dissolution or distortion of self-image:

  • psychotic disorders
  • dementia
  • right parietal-ish or otherwise right posterior cortical strokes and lesions
  • the ‘strange-face in the mirror' illusion
  • hypnosis
  • dissociative disorders (e.g., depersonalization, dissociative identity disorder)
  • body image issues (e.g., anorexia, body dysmorphic disorder)

Professor Max Coltheart and colleagues have published extensively on the phenomenon of mirrored-self misidentification, defined as “the delusional belief that one’s reflection in the mirror is a stranger.” They have induced this delusion experimentally by hypnotizing highly suggestible participants and planting the suggestion that they would see a stranger in the mirror (Barnier et al., 2011):

Following a hypnotic suggestion to see a stranger in the mirror, high hypnotizable subjects described seeing a stranger with physical characteristics different to their own. Whereas subjects' beliefs about seeing a stranger were clearly false, they had no difficulty generating sensible reasons to explain the stranger's presence. The authors tested the resilience of this belief with clinically inspired challenges. Although visual challenges (e.g., the hypnotist appearing in the mirror alongside the subject) were most likely to breach the delusion, some subjects maintained the delusion across all challenges.

Ad campaign for the Exelon Patch (rivastigmine, a cholinesterase inhibitor) used to treat Alzheimer's disease. Photographer Tom Hussey did a series of 10 award-winning portraits depicting Alzheimer's patients looking at their younger selves in a mirror (commissioned by Novartis).

Mendez et al. (1992) published a retrospective study of 217 patients with Alzheimer's disease. They searched the medical records for caregiver reports of disturbances in person identification of any kind.
The most common type was transient confusion of family members that resolved when the patient was reminded of the person's identity (found in 33 patients). The charts of five patients contained reports of mirror misidentification, which was always associated with paranoia and delusions. Although not exactly systematic, this fits with other studies reporting that 2–10% of Alzheimer's patients have problems recognizing themselves in a mirror.

A very thorough investigation of the topic was actually published 50 years ago, but largely neglected because it was in French. Connors and Coltheart (2011) translated the 1963 paper of Ajuriaguerra, Strejilevitch, & Tissot into English. The Introduction is quite eloquent:

The vision of our image in the mirror is a discovery that is perpetually renewed, one in which our being is isolated from the world, from the objects surrounding it, and assumes, despite the fixed quality of reflected images, the significance of multiple personal and potential expressions. The image reflected by the mirror furnishes us not only with that which is, but also how our real image might be changed. It therefore inextricably combines awareness, indulgence and critique.

They examined how 30 hospitalized dementia patients interacted with mirrors in terms of (1) recognition of their own reflection; (2) use of reflected space; and (3) identifying body parts. The patients sat in front of a mirror and answered the following questions:

  • What is this?
  • Who is that?
  • How old would you say that person is?
  • How do you think you look?

Then the experimenter stood behind them and asked questions about himself (e.g., “who is that man?”), and showed them objects in a mirror (e.g., an orange or a pipe – very funny).

Eight patients did not recognize themselves in the mirror. Three didn't understand the concept of a mirror. They didn't pay attention to any reflections until directed to do so, and then they became transfixed.
They also failed to recognize photos of themselves or their caretakers. Another three eventually admitted it might be themselves when prodded several times. These individuals had severe Alzheimer's disease. The final two recognized themselves the second time, and displayed considerably more anxiety. This sounds terribly frightening:

These patients were attentive to their own reflections and those of the researchers, whom they identified. The first patient seemed a bit anxious; she began by touching herself, then laughed, then proclaimed “that is not quite me, it sort of looks like me, but it's not me.” When she was shown her photo head-on and then from the side, she immediately identified herself when the photo was head-on but from the side said “that's not quite me.”

These individuals were in an earlier state of dissolution and likely had more awareness of what was happening to them.

Other patients with mirrored-self misidentification show greater sparing of cognitive abilities. Chandra and Issac (2014) presented brief case summaries of five mild to moderate dementia patients with “mirror image agnosia, a new observation involving failure to recognize reflected self-images.” This is obviously not a new observation, but the paper includes two videos, one of which is embedded below. Sixty-two-year-old female was brought to the hospital with features of forgetfulness and getting... Read more »

Barnier AJ, Cox RE, Connors M, Langdon R, & Coltheart M. (2011) A stranger in the looking glass: developing and challenging a hypnotic mirrored-self misidentification delusion. The International journal of clinical and experimental hypnosis, 59(1), 1-26. PMID: 21104482  

Chandra SR, & Issac TG. (2014) Mirror image agnosia. Indian journal of psychological medicine, 36(4), 400-3. PMID: 25336773  

Mendez MF, Martin RJ, Smyth KA, & Whitehouse PJ. (1992) Disturbances of person identification in Alzheimer's disease. A retrospective study. The Journal of nervous and mental disease, 180(2), 94-6. PMID: 1737981  

  • October 30, 2014
  • 04:44 AM
  • 18 views

Pain and adolescent Chronic Fatigue Syndrome

by Paul Whiteley in Questioning Answers

"We found a higher prevalence of severe pain among adolescents with CFS [Chronic Fatigue Syndrome] and lowered pain thresholds compared with HCs [healthy controls]".

That was the headline generated by the study from Anette Winger and colleagues [1] (open-access), looking to describe several parameters tied into the experience of pain in the context of CFS. Further: "The total sum of bodily symptoms represented a heavy burden with great functional consequences".

The Winger paper is open-access, and pretty self-explanatory in terms of the hows and whys of the study (including strengths and limitations), so there is no need for me to further complicate things. As part of the NorCAPITAL project (The Norwegian Study of Chronic Fatigue Syndrome in Adolescents: Pathophysiology and Intervention Trial) (ClinicalTrials.gov entry here), which has already reported on the use of clonidine for CFS [2], the latest publication is an important add-on.

There are a few details included in the results which do, however, merit some additional highlighting. So:

"In the present study, almost three-quarters of the adolescents with CFS suffered from weekly pain, and pain on a daily basis was a problem for half of the patients". This was "highly significant" when compared with reports from controls, particularly where two-thirds of CFS participants reported weekly headaches. Muscle and joint pain were also recorded by adolescents with CFS, alongside almost half reporting abdominal pain.
Indeed, joint pain showed the most disparity between the groups, with reports of such pain tipping 70% in the CFS group compared with only 10% of controls reporting this more frequently than once a month.

When looking at results examining the pressure pain threshold (PPT) - "the minimum intensity of a stimulus that is perceived as painful" - and examining scores based on completion of the Brief Pain Inventory (BPI), the authors concluded that: "At all measure points, PPTs were significantly lower (all p<0.001) among patients with CFS than HCs".

"In our study, the adolescents reported that pain interfered with school, general activity and mood; however, we cannot conclude from this study that pain has a causal effect, because it could be the other way around."

What's more to say about this research? Well, the very important message that the presentation of CFS might go well beyond just 'chronic fatigue' is paramount. This is not new news to science and practice, as per the various reviews on the topic of pain exemplified by Nijs and colleagues [3]. I dare say that some public perceptions of CFS/ME would also change if more people understood that pain is a seemingly important manifestation of the condition. Oh, and that CFS and pain sensation might not just be all in the mind...

I'm also inclined to introduce the condition fibromyalgia (FM) into proceedings, given the many and varied reports talking about key symptoms overlapping [4]. I'm not altogether sure of the hows and whys of FM and CFS connecting, but certainly the primary FM symptom of widespread pain and extreme sensitivity strikes me as being potentially important.
With no medical advice given or intended, and perhaps somewhat counter-intuitive to analgesia, the increasing body of work looking at the use of something like low-dose naltrexone (see here for some of my interest in this area) for pain in FM [5] may also very well be something in need of a little more study with pain in CFS in mind, alongside other possible pain relief options.

So then, The White Stripes with Ball and Biscuit.

----------

[1] Winger A. et al. Pain and pressure pain thresholds in adolescents with chronic fatigue syndrome and healthy controls: a cross-sectional study. BMJ Open. 2014; 4(10): e005920.

[2] Fagermoen E. et al. Clonidine in the treatment of adolescent chronic fatigue syndrome: a pilot study for the NorCAPITAL trial. BMC Research Notes. 2012; 5: 418.

[3] Nijs J. et al. Pain in patients with chronic fatigue syndrome: time for specific pain treatment? Pain Physician. 2012 Sep-Oct;15(5):E677-86.

[4] Aaron LA. et al. Overlapping conditions among patients with chronic fatigue syndrome, fibromyalgia, and temporomandibular disorder. Arch Intern Med. 2000;160(2):221-227.

[5] Younger J. et al. Low-dose naltrexone for the treatment of fibromyalgia: findings of a small, randomized, double-blind, placebo-controlled, counterbalanced, crossover trial assessing daily pain levels. Arthritis Rheum. 2013 Feb;65(2):529-38.

----------

Winger, A., Kvarstein, G., Wyller, V., Sulheim, D., Fagermoen, E., Smastuen, M., & Helseth, S. (2014). Pain and pressure pain thresholds in adolescents with chronic fatigue syndrome and healthy controls: a cross-sectional study. BMJ Open, 4(10). DOI: 10.1136/bmjopen-2014-005920... Read more »

  • October 29, 2014
  • 09:13 AM
  • 45 views

7 things you probably didn’t know about blind people

by Usman Paracha in SayPeople

1. Blind people can't see in dreams:

Blind people are unable to see even in their dreams, but they experience a rich combination of other senses while dreaming. Compared with sighted people, their dreams contain more sensations of taste, smell, touch, and hearing.
2. They experience fewer negative emotions:

People who are blind from birth have fewer negative emotions, such as anxiety and depression, than sighted people.
Blind people see more nightmares (Image courtesy of Bogenfreund's Flickr stream)

3. Vision in dreams reduces with time:

The longer people who became blind later in life have lived without sight, the lower their chances of seeing things in their dreams.
4. They have the same level of emotions and social content:

Blind people's dreams have almost the same emotional and social content as those of sighted people. They have the same level of social interactions, successes, and failures in their dreams, and almost the same intensity of emotions and bizarreness.
5. They have more nightmares:

People who are blind from birth are about four times more likely to have nightmares than sighted people. Interestingly, they don't realize that they have more nightmares than sighted people. However, people who became blind later in life have almost the same chance of having nightmares as sighted people.
6. Their nightmares are close to reality:

Nightmares of blind people are very close to reality; for example, they may dream of getting lost, falling into an embarrassing situation, being hit by a vehicle, and/or losing their guide dog.
7. They visualize numbers in the opposite direction:

People blind from birth visualize numbers in the opposite direction, i.e., their mental number line runs from right to left (6, 5, 4, 3, 2, 1) as opposed to left to right (1, 2, 3, 4, 5, 6) for sighted people.
Sources:

How the Blind Dream – National Geographic (http://goo.gl/nFPbZU) [Meaidi, A., Jennum, P., Ptito, M., & Kupers, R. (2014). The sensory construction of dreams and nightmare frequency in congenitally blind and late blind individuals. Sleep Medicine, 15(5), 586-595. DOI: 10.1016/j.sleep.2013.12.008]

People Born Blind Suffer 4 Times More Nightmares – PsychCentral (http://goo.gl/PWbPLm)

Congenitally Blind Visualize Numbers Opposite Way to Sighted – Neuroscience News (http://goo.gl/I3y6pu)... Read more »

  • October 29, 2014
  • 08:30 AM
  • 38 views

How Does a Dog's Brain Respond to the Smell of a Familiar Human?

by CAPB in Companion Animal Psychology Blog

And what does it tell us about the importance of people to their dogs?

Photo: hitmanphoto / Shutterstock

New fMRI research by Gregory Berns et al (in press) shows that dogs' brains respond differently to the smell of a familiar human compared to an unfamiliar human and other canines – suggesting that certain people are special to their dogs.

The research focussed on a part of the brain called the caudate, which has been much investigated in humans, monkeys and rats. The scientists explain that “caudate activity is correlated with salient, usually rewarding signals that cause the animal to change its behavioural orientation to approach or consume the stimulus.” Previous research by the team showed that this part of the brain lights up when the dog is given a hand signal that means it will be given a treat, confirming that caudate activation in dogs is connected with rewards.

The results showed that the caudate was activated significantly more in response to the smell of the familiar human than to any of the other smells – even the familiar dog. The scientists say, “Importantly, the scent of the familiar human was not the handler, meaning that the caudate response differentiated the scent in the absence of the person being present. The caudate activation suggested that not only did the dogs discriminate that scent from the others, they had a positive association with it. This speaks to the power of the dog’s sense of smell, and it provides clues to the importance of humans in dog’s lives.”

Does this mean we can say that dogs love us? It’s certainly the case that when people look at photographs of loved ones, the same part of the brain is activated. But it's hard to interpret the activation on the scan in terms of the dog's subjective experience.

The researchers caution there is another possible explanation in terms of conditioning. It may be that the familiar person had previously given the dog food and so the scent was simply eliciting a conditioned response.
The researchers say they think it unlikely to be a conditioned response, because it was typically the handler – not the familiar human – who was responsible for feeding the dog. The results also showed that the olfactory bulb in the brain was activated by all five smells. This is not surprising, but it is useful to know the result is as expected. The canine brain presents a bit of a challenge for fMRI studies – training needs aside – simply because of the great variety of head shapes in dogs.

Twelve dogs took part in the study. They had all previously taken part in fMRI research, in which they had to lie absolutely still during the scan. The smells came from swabs taken from the armpit of humans and from the perineal-genital area of dogs. The scents used in the study were of a familiar human, an unfamiliar human, a familiar dog, an unfamiliar dog, and the dog’s own scent. The familiar human was not the dog’s main caregiver – as that person was present during the scan – but someone else from the household, typically the husband or child of the main caregiver. The familiar dog lived in the same house.

The dogs were trained using positive reinforcement and models of the equipment. A clicker was used in the initial stages of the training, but since the equipment is noisy it would not be heard during the scan itself. The dogs were taught a hand signal that meant they would get a reward, and this was used to replace the clicker in later stages of training. The training specific to this study included preparing the dog for a different head-coil than in previous scans, and getting used to having scent-impregnated cotton wool swabs put under the nose while they remained still.

The number of dogs is small, and there are always trade-offs in the statistics used to make sense of fMRI scans. But the results are very intriguing, and we look forward to future research from this team. The full paper is available (open access) at the link below.
Photographs of the dogs who took part are on page 3. Do you have a special place in your dog’s heart?

Reference

Berns, G., Brooks, A., & Spivak, M. (2014). Scent of the familiar: An fMRI study of canine brain responses to familiar and unfamiliar human and dog odors. Behavioural Processes. DOI: 10.1016/j.beproc.2014.02.011

If you enjoyed this, you might also like: Dogs Can Haz BrainScanz and EEG? Canine Neuroscience... Read more »

  • October 29, 2014
  • 04:36 AM
  • 42 views

The stability of an Asperger syndrome diagnosis

by Paul Whiteley in Questioning Answers

"Asperger Syndrome, when considered as an ASD/PDD [autism spectrum disorder/pervasive developmental disorder] diagnosis, was fairly stable into adulthood, but there was a significant increase over time in cases no longer meeting criteria for an ASD diagnosis according to the DSM-IV, or AS according to the Gillberg criteria".

That was one of the primary conclusions made in the paper by Adam Helles and colleagues [1], who prospectively followed a group of males diagnosed with Asperger syndrome (AS) in childhood into adulthood, covering a period of some 20 years. I believe the starting point of this study has been seen before in the peer-reviewed literature in the paper by Cederlund & Gillberg [2] (open-access here) (a paper which takes me back to my own PhD days with its important influence on some of my work). Other follow-ups have also been reported [3].

Looking at the diagnostic stability of AS, Helles et al noted that compared with baseline, where all participants fulfilled diagnostic criteria, at follow-up (two follow-ups actually) there was a "significant decrease in the rate of cases fulfilling any PDD diagnosis according to the DSM-IV, from 91% at T1 [time 1] to 76% at T2 [time 2] in the 47 cases followed up twice". The decline in cases according to the Gillberg criteria was even more stark (82% at T1 and 44% at T2).

Researchers also noted a few other potentially important points in their findings, such that: "Severity of autism spectrum symptoms at T1 was the main predictor of diagnostic stability at T2", and a fifth of those who met DSM-IV criteria for a PDD diagnosis "did not meet DSM-5 ASD criteria although they had marked difficulties in everyday life". This last point has been mentioned by other authors (see here).

There are a few ways one could take the Helles findings.
One could see it as further evidence of the fluidity of presented symptoms when it comes to the autism spectrum, as per other discussions in this area (see here). You might even view it as an extension of all that chatter on something like differing developmental trajectories along the autism spectrum (see here) or 'optimal outcome' and autism (see here), albeit without the focus on early intervention as potentially being involved (see here), as far as we know. Indeed, one has to wonder whether, for those not meeting the diagnostic criteria as they age and mature, this may in part be because of the various strategies learned over a lifetime to overcome some of the barriers posed by the diagnosis.

But I can also see how for some people such research might be less well received, particularly when added to the 'disappearance' of the term Asperger syndrome from the latest revision of DSM (DSM-5). The paper by Spillers and colleagues [4] described concerns about "identity, community, the cure movement, and services" following the DSM-5 changes when talking to people on the autism spectrum. I wonder how the Helles findings on 'falling out of the spectrum', diagnostically speaking, for some might have similar tones if and when discussed.

Accepting that the Helles findings were eventually based on quite a small participant group, and their insinuation that not reaching the diagnostic thresholds for something like Asperger syndrome does not imply a life free of some of the more 'disabling' aspects on and around the diagnosis (yes, including various comorbidity), I do think there is more to see in this area. The realisation that we know so little about the autism spectrum in the long-term [5] and how behaviours ebb and flow, that our systems of diagnosis might not necessarily be as robust as we want them to be (see here), and the continued alliance between diagnosis and service receipt excluding many at the diagnostic periphery, all come into play.
With all the research data collected down the years, one suspects that with a little bit of organisation and willingness to plough some financial and other resources into this issue, further insight into exactly how stable an autism diagnosis might be, and for whom, should be fairly readily available...

Music to close, and having enjoyed the impressive tones of Sheryl Crow last evening, a song most parents will have heard a few times: Real Gone.

----------

[1] Helles A. et al. Asperger syndrome in males over two decades: stability and predictors of diagnosis. Journal of Child Psychology and Psychiatry. 2014. 3 October.

[2] Cederlund M. & Gillberg C. One hundred males with Asperger syndrome: a clinical study of background and associated factors. Dev Med Child Neurol. 2004 Oct;46(10):652-60.

[3] Cederlund M. et al. Asperger syndrome and autism: a comparative longitudinal follow-up study more than 5 years after original diagnosis. J Autism Dev Disord. 2008 Jan;38(1):72-85.

[4] Spillers JL. et al. Concerns about identity and services among people with autism and Asperger's regarding DSM-5 changes. J Soc Work Disabil Rehabil. 2014;13(3):247-60.

[5] Howlin P. et al. Cognitive and language skills in adults with autism: a 40-year follow-up. J Child Psychol Psychiatry. 2014 Jan;55(1):49-58.

----------

Adam Helles, Carina I. Gillberg, Christopher Gillberg, & Eva Billstedt (2014). Asperger syndrome in males over two decades: stability and predictors of diagnosis. Journal of Child Psychology and Psychiatry. DOI: 10.1111/jcpp.12334... Read more »

Adam Helles, Carina I. Gillberg, Christopher Gillberg, & Eva Billstedt. (2014) Asperger syndrome in males over two decades: stability and predictors of diagnosis. Journal of Child Psychology and Psychiatry. info:/doi: 10.1111/jcpp.12334

  • October 28, 2014
  • 11:49 PM
  • 42 views

When Should Online Dating Partners Meet Offline?

by Wiley Asia Blog in Wiley Asia Blog - Social Science

Does the amount of online communication affect face-to-face (FtF) relational outcomes among online daters? Researchers analysed the experiences of 433 online daters, recruited by a market research firm, across various online dating sites.... Read more »

  • October 28, 2014
  • 01:40 PM
  • 55 views

The Final Girl: The Psychology of the Slasher Film

by Melissa Chernick in Science Storiented

Halloween has put me in the mood to talk about slasher movies. Once I got to looking around, I found more papers on the topic than I thought I would. I gotta warn you, this is a long read, so grab some popcorn and settle in for some slasher movie fun.

If you are a fan of horror films then you know Randy Meeks' “Rules that one must abide by to successfully survive a horror movie”: (1) You can never have sex…big no-no, sex equals death, (2) you can never drink or do drugs…it’s the sin-factor, an extension of number 1, (3) never, ever, under any circumstances, say "I'll be right back" ‘cause you won’t be back. Scream got me to thinking about the psychology and tropes of the horror movie (don’t worry, this post is spoiler-free). Today I’m going to focus on journal articles and so won’t take the time and space going through the history of horror films (there is a list of good links below).

I used Scream (1996) as an example because it is one of those movies that both parodies the genre and, at the same time, becomes an entry within the genre. That’s tough, and when done well, really great. In Scream’s case, it also resurrected a dormant genre for a whole new generation, the Gen Y teens of the 90’s (including me). In 2005, Valerie Wee published a paper in the Journal of Film and Video that looks at the role of this movie and its sequels. She redefines and labels a more advanced form of postmodernism “hyperpostmodernism,” and in Scream, this is identified in two ways: (1) the loss of tongue-in-cheek sub-text in favor of actual text and (2) active referencing and borrowing of influential styles. The rules I quoted above are a great example of the first point, a type of discussion among characters that happens throughout the films. The dim lighting, camera angles, and character names are all good examples of the second point. It’s a slasher film about slasher films, if you will.
It worked so well because it acknowledged and played to the media hyperconsciousness of the American teenagers of that generation. As Wee puts it, a group that is “media literate, highly brand conscious, consumer oriented, and extremely self-aware and cynical.” Now doesn’t that make us sound like lovely people?

As this hyperconscious generation, we can look back at the conventions, ideologies, and representations of those past works. We can ask what the attributes and associated tropes are of a successful horror movie. To do that, let’s go back to a 1991 paper by Douglas Rathgeb. In it, he identifies one of the most effective attributes of a horror film to be the unsettling sense of intrusion it creates, that feeling of normal versus abnormal. It is true that we must separate movie reality from real life, but in the case of horror movies, the shock of the sudden, often freakish intrusion of the horror is the terrifying element, whether it is the perception of the reality or the acts of a bogeyman (or both). This unnerving feeling is evident in A Nightmare on Elm Street (1984), in which Wes Craven creates a nightmare state coexistent with reality. He removes the conventional signposts and distorts the physical parameters that we use to measure reality. Rathgeb spends a good amount of time on the id/superego model (specifically the “bogeyman id”) that I won’t go into, but he continually draws on the point of the victims’ moral blemish – the Original or Unpardonable sin – that permits some evil to terrorize the world. Randy’s first Rule plays out in Freddy Krueger’s increasingly disturbing, nocturnal, murderous visits upon the sexually active teenagers of Springwood.
It is also exemplified in Michael Myers’ first murderous act in John Carpenter’s Halloween (1978), the act that leads him down the path of transformation into the bogeyman that menaces the morally deficient residents of Haddonfield.

This segues nicely into a discussion of misogyny, the male monster, and the Final Girl. You don’t have to be a film expert to notice that slasher films contain a lot of violence primarily directed toward women, usually after they have broken Randy’s first Rule. In 2010, in the Journal of Popular Film and Television, Kelly Connelly studied this closely. She examines a marked change in the structure and action of horror films in the mid-1970’s, the birth of the slasher film subgenre. Within this subgenre a new role emerged: the Final Girl, the sole female survivor of a rampaging psychotic who has managed to rescue herself. You will know her when you see her in the beginning of the movie, as she is the Girl Scout, the bookworm, the mechanic, the tomboy. She probably has a male name, isn’t sexually active, is resourceful, and is watchful to the point of paranoia. Connelly spends most of the paper breaking down Halloween and Halloween H2O: Twenty Years Later. I don’t have the space to cover all of that here, but ultimately, the Final Girl character boils down to one word: empowerment. The female victim achieves active empowerment through the act of rescuing herself.

But are women really slasher film victims more often than men, or is it just more noticeable? A 1990 study by Cowan and O’Brien asked participants to analyze 56 slasher films to see how female and male victims fared as related to several traits, including markers of sexual activity (clothing, initiation, etc.). They found that women were neither more likely to be victims of slashers nor less likely to survive when attacked. In fact, they were more likely to survive than men.
The authors postulate that this perception is likely because of the female victims in memorable films/scenes, especially when sex is involved, and that the female status in society as someone to be protected makes their victimization all the more evident. They also found that the non-surviving females were more frequently sexual, physically attractive, and inane. Randy got it right on that one. Nonsurviving males tended to be assholes (my term, not theirs) in that they had bad attitudes, engaged in illegal behaviors, and were cynical, egotistical and dictatorial. Notice that whether or not they broke the first Rule is not included.

A 2011 paper by Richard Nowell argues that although early teen slasher (and/or stalker) films were made primarily for male youth, the marketing campaigns were also geared toward young women. Keep in mind that Nowell is asking you to focus on the nonviolent content and the films’ promotional campaigns. He asks you to consider movies like My Bloody Valentine (1981) and Prom Night (1980), which had posters picturing teens slow dancing beneath decorative hearts and tag lines like “There’s more than one way to lose your heart.” Even A Nightmare on Elm Street billed the principal protagonist as “she’s the only one who can stop it – if she fails, no one will survive.” Movies like Prom Night and Carrie (1976) spotlight female protagonists, female bonding, and various courtships. These tactics can be seen throughout the genre and even into their contemporary remakes.

On the topic of remakes: in 2010, a paper by Ryan Lizardi compared the original movies to their remakes to see how they relate and how they speak to current cultural issues. Slasher movie remakes tend to stem from a particular period, the 1970’s through the early 1980’s, a period known for its ideological issues with gender and political ambivalence.
There are quite a few anti-remakers who take the view that the originals are better, partly because they are best understood in relation to the periods in which they were produced. Remakes often have to redefine normal vs. abnormal to fit a contemporary time. Also, to fit the new time, the rules have changed, trending towards more gore and stylized production. Lizardi goes through the details, but Scream 4 boils down the Rules to successfully survive a horror movie remake: (1) death scenes are way more extreme, (2) unexpected is the new cliché, (3) virgins can die now, (4) new technology is now involved… cell phones, video cameras, etc., (5) you don’t need an opening sequence, (6) don’t f- with the original, and (7) if you want to survive, you pretty much have to be gay. Okay, so maybe Lizardi doesn’t say the last one. He does spend some time revisiting the remake of the Final Girl. Remakes often have this character learn and witness the full extent of the killer’s depravity (in a really gory way) and endure the most psychological damage. Lizardi concludes that these films speak to contemporary concerns and even have endings.

Let’s turn our heads to sequels, 'cause let's face it baby, these days, you gotta have a sequel. A 2004 paper by Martin Harris examines the horror franchise. Here he draws on the concepts of postmodernism and the unsettling feeling of normal vs. not-normal. Why do the killers – Freddy, Michael, Jason – keep coming back? He argues that their resurrection in sequels engenders its own frightening uncertainty. Where is the threshold? When is dead really gone? Harris spends a good deal of time arguing that it is more the economic realities of Hollywood that drive the sequel-making process than a demand from moviegoers to revisit the killers. I can see a lot of truth in that, as these are known prop... Read more »

Wee, Valerie. (2005) The Scream Trilogy, "Hyperpostmodernism," and the Late-Nineties Teen Slasher Film. Journal of Film and Video, 57(3), 44-61. info:/

  • October 28, 2014
  • 06:22 AM
  • 50 views

What I don’t hear can’t hurt me: insecure managers avoid input from employees

by BPS Research Digest in BPS Research Digest

Organisations do better when there are clear communication channels that allow staff to point out ways the company can improve. Similarly, teams who freely share ideas and concerns are more tight-knit and motivated. And their managers gain enhanced awareness, and get to share in the praise for any improvements that pay off. So encouraging employee voice should be a no-brainer, especially for any manager feeling unsure of their ability to deliver solo. Yet according to new research, these insecure managers are the ones least likely to listen to and act on staff input.

Nathanael Fast and colleagues began with a survey of 41 managers and their 148 staff within a multinational oil company. Managers who rated themselves lower on managerial self-confidence (e.g. they disagreed with statements like “I am confident that I can perform effectively on many different tasks”) tended to have staff who were less likely to speak out, stating that they perceived their manager did not encourage it. Why? A follow-up experiment aimed to find out.

One hundred and thirty-one employed participants (84 women) read an imaginary scenario in which they were the manager of an airline that was experiencing a rise in customer complaints. The scenario then described a meeting where the participant began announcing a solution. But before they had finished, an employee – a maintenance chief named Spencer – offered an alternative he argued was better for the airline in the long term.

The researchers found that whether participants heeded Spencer's advice depended on their confidence, which was manipulated at the start of the scenario. Some participants were told that they were performing impressively, others were told that people were questioning their competence.
Those in the latter condition expressed lower faith in the maintenance officer’s expertise and showed less willingness either to implement his proposal or to seek help in the future from him or his colleagues.

The underlying cause appears to be the existential threat posed to low-confidence managers by these employee ideas. As people are loath to admit to such insecurities, the researchers didn’t directly measure them. Instead, they showed they could cancel the effect of low confidence by asking participants to complete a positive affirmation: a short writing exercise reminding themselves of their other positive qualities. As this intervention worked, it suggests that the root cause of managers’ ignoring staff advice was related to their own defensiveness and desire to protect their managerial status.

Accepting unsolicited feedback can be challenging for anyone. But “The Manager” is by definition on top of things, so gaps in awareness can be particularly threatening for people in that role. Self-confidence makes it easier to take that medicine, and enjoy its benefits in the long term. But those anxious about their capability may be afraid of being unmasked, and turn away from sources of insight, at their own cost.

Here we see how the harms caused by self-doubt can spill over into a wider climate. Organisations could help new managers put aside unrealistic expectations of their need to be omniscient, and recognise the benefits of putting the entire team brain to work. After all, better to have the Spencers of this world on your side than against you.

_________________________________

Fast, N., Burris, E., & Bartel, C. (2014). Managing to Stay in the Dark: Managerial Self-Efficacy, Ego Defensiveness, and the Aversion to Employee Voice. Academy of Management Journal, 57 (4), 1013-1034. DOI: 10.5465/amj.2012.0393

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

... Read more »

  • October 28, 2014
  • 03:42 AM
  • 60 views

Zinc and depression

by Paul Whiteley in Questioning Answers

"Low dietary zinc intake is associated with a greater incidence of depression in both men and women, as shown in two prospective cohorts".

At the risk of overdoing the whole 'you are what you eat' sentiment, today I'm addressing a portion of the peer-reviewed research literature linking issues with zinc availability to depression. That opening quote, by the way, comes from the paper by Khanrin Phungamla Vashum and colleagues [1] who looked at self-reported dietary intake of zinc based on data derived from "the Australian Longitudinal Study on Women's Health (women aged 50-61 years) and Hunter Community Study (men and women aged 55-85 years)". I'll come back to that shortly...

Absinthe is the aphrodisiac of the self

As I've mentioned once or twice before on this blog, the description of 'depression' covers quite a lot of diagnostic ground, with all manner of correlations put forward to try and account for why depression is seemingly so prevalent in modern society. What does seem to be apparent from the voluminous literature is that various factors, at various times and under various situations, seem to contribute to depression as a clinical condition, with comorbidity also seemingly playing an important hand. In short, it's very, very complicated and very, very individual.

I've tended to talk more about the physiological correlates linked to various types of depression on this blog, as per discussions as diverse as gut bacteria (see here) and autoimmune conditions (see here), and upcoming conversations on something like depression and inflammation. This on top of my borderline obsession with all things vitamin D (see here). It's not that I'm not interested in the psychological or sociological side of things (as per some chatter about overlapping syndromes), but quantifying such factors is rather more difficult than, for example, taking a blood sample and looking at the performance of compound X or gene Y.
I'd also drop in the fact that quite a bit of the 'causative' research in the area of psychological and/or sociological factors seems to overlook important factors such as resilience too.

Anyhow, aside from the focus on 'self-report' noted in the Vashum study, which is notoriously problematic when it comes to recording eating habits and patterns, there is actually quite a bit of research already published on the topic of zinc and depression.

The splendidly named Walter Swardfager and colleagues [2] published a very comprehensive review and meta-analysis on the topic of zinc and depression a little while back (covering the peer-reviewed text up to June 2012). Based on data looking at blood zinc levels in over 1500 people with depression compared against 800 asymptomatic controls, they concluded that: "Depression is associated with a lower concentration of zinc in peripheral blood". Indeed, zinc levels were: "approximately -1.85 µmol/L lower in depressed subjects than control subjects".

Other studies have complemented the Swardfager findings, albeit with zinc deficiency present in a more general context in psychiatry. The findings from Grønli and colleagues [3] (open-access) are a good example, whereby researchers reported: "a significant difference in zinc deficiency prevalence between the control group (14.4%) and the patient group (41.0%)", where the patient group consisted of "psychogeriatric patients" who were "compared with the elderly controls". The findings from Maserejian and colleagues [4] (open-access) also suggested some gender effect mediating the zinc-depression link; specifically: "inadequate dietary zinc intake contributes to depressive symptoms in women".

Insofar as the reasons for zinc deficiency appearing in cases of depression, science is yet to settle on a definitive answer.
The paper by Marcin Siwek and colleagues [5] (open-access) suggested three possible reasons: (i) nutritional deficiencies as per the Vashum findings, (ii) "hyperstimulation of the hypothalamic-pituitary-adrenal (HPA) axis, and the associated hypercortisolism" and/or (iii) the result of an inflammatory response "associated with oxidative stress". That last variable on inflammation and oxidative stress might also tie into other data on zinc supplementation "decreasing oxidative stress and generation of inflammatory cytokines such as TNF-alpha and IL-1beta" in certain patient groups [6]. I might also refer you to an excellent post by Dr Emily Deans titled 'Zinc! An Antidepressant?' with a more detailed analysis of some of the possible hows and whys.

The supplementation of zinc in cases of depression has also been covered in the research literature. As an adjunctive therapy, Ranjbar and colleagues [7] (open-access) reported that: "zinc supplementation together with SSRIs antidepressant drug improves major depressive disorders more effectively in patients with placebo plus antidepressants (SSRIs)". A review of some of the controlled trials prior to the Ranjbar results by Lai and colleagues [8] concluded similar things with regards to zinc as an 'add-on' treatment, but that: "There is less clear evidence on the effectiveness of zinc supplementation alone on depressive symptoms of non-depressed healthy subjects". Animal studies have complemented this collected literature, as for example reported by Tassabehji and colleagues [9] looking at rats; the authors suggested that: "zinc deficiency leads to the development of depression-like behaviors that may be refractory to antidepressant treatment".

There is quite a compelling scientific case for far greater research inspection of zinc in relation to depression and, for example, further working out what it seems to be doing.
Important too is the issue of who might be the best responders to something like zinc supplementation in relation to depression/depressive symptoms.

That being said, I wouldn't want anyone to assume that I'm advocating zinc supplementation as some sort of cure-all for depression or anything else. To repeat myself: depression is a very complex set of conditions combining both biology and psychology. Science is still feeling its way around this area, despite the importance of nutrition to depression being increasingly recognised (see here).

So... Cosmic Girl by Jamiroquai.

----------

[1] Vashum KP. et al. Dietar... Read more »

  • October 27, 2014
  • 10:54 PM
  • 45 views

Nature is helpful for your mind even in artificial settings

by Usman Paracha in SayPeople

Main Point:

In one study, researchers have found that sounds of nature played in the background, even from a recording, can help in recovery from a negative experience. In another study, watching 3-D videos of trees was found to help in recovery from stress.
Published in:

Ecopsychology

Environment and Behavior
Study Further:

Everybody knows that spending time in nature, listening to different natural sounds, and looking at beautiful scenery can help improve overall quality of health and mood. Writers like to walk in nature to gain inspiration for their writing, tourists go out for the beauty, and scientists visit nature for relaxation from their daily routines. Recently, scientists have found that nature in artificial settings can also help in recovery from negative experiences of life.

In one study, published in the journal Ecopsychology, researchers reported that sounds of nature, even on a recording, can help people move out of negative experiences and into a positive mood. They also found that natural sounds are more helpful in recovery from bad experiences of life than hybrid sounds (i.e. sounds mixing natural and man-made sounds) or completely man-made sounds. However, not all types of natural sounds are helpful; for example, the sounds of predatory animals such as lions and of violent natural phenomena such as thunderstorms are not.

“Thus natural soundscapes can provide restorative benefits independent of those produced by visual stimuli,” the researchers concluded.

In another study, published in the journal Environment and Behavior, researchers reported that watching 3-D videos of trees can help people move out of stressful experiences. Interestingly, videos with no trees have less effect on reducing stress than videos with trees in them. Moreover, the more trees a person sees in the video, up to a certain limit (tree canopy of 24-34%), the better he or she will be able to recover from stress.

“These findings suggest that viewing tree canopy in communities can significantly aid stress recovery and that every tree matters,” the researchers stated in the paper.
References:

Benfield, J. A., Taff, B. D., Newman, P., & Smyth, J. (2014). Natural Sound Facilitates Mood Recovery. Ecopsychology.

Jiang, B., Li, D., Larsen, L., & Sullivan, W. (2014). A Dose-Response Curve Describing the Relationship Between Urban Tree Cover Density and Self-Reported Stress Recovery. Environment and Behavior. DOI: 10.1177/0013916514552321... Read more »


  • October 27, 2014
  • 03:40 PM
  • 65 views

Real Zombie-Making Parasites Among Us

by Miss Behavior in The Scorpion and the Frog

The mummified cat and the rat in the crypt of Christ Church in Dublin. Photo by Adrian Grycuk at Wikimedia Commons.

The Happening, M. Night Shyamalan’s most panned movie of all time, is a science fiction thriller about people going into a mysterious trance and committing suicide as a result of another mind-hacking species. One of the leading criticisms raised against this movie is the ridiculousness of the premise. One species can’t cause another to willingly commit suicide! …Or can it? Toxoplasma gondii (we’ll call it T. gondii) is a protozoan parasite that has developed just such mind-hacking abilities! As far as we can tell, T. gondii only reproduces in the digestive tract of cat species, where it lays fertile eggs that are pooped out into the environment. From there, T. gondii eggs can contaminate any number of things that are consumed by other animals, such as rodents, birds, or even humans. When cats eat prey animals that are infected with T. gondii, another generation of parasites is now positioned to reproduce and the cycle continues.

However, prey animals can be pretty good at avoiding cats, in part by avoiding the smell of cats. This is a problem for the reproductive plans of T. gondii. The tiny protozoan has responded to this problem with remarkable biological sophistication: it alters the behavior of its rodent hosts so that the infected rodents find the smell of cat urine so irresistible that they run straight towards their predators! Now, researchers have found that T. gondii-infected rats don’t only like the smell of cat urine; they even prefer the smell of wild cat urine over the smell of urine from weaker domesticated cats.

A rat checks out odor-soaked papers in a Y-shaped apparatus. Image from Kaushik, et al. (2014) in Integrative and Comparative Biology.

Maya Kaushik, Sarah Knowles and Joanne Webster at the School of Public Health at the Imperial College of London compared the responses of rats that were either infected with T. gondii or not to urine produced by domestic cats or wild cats. To do this, they put infected or uninfected rats into a Y-shaped apparatus. For each trial, tissue paper soaked in domestic cat urine or wild cat (cheetah or puma) urine was placed in two of the three arms and nothing was placed in the third arm. The researchers then measured how much time the rats spent in each of the three arms and how much they moved.

As expected, the T. gondii-infected rats avoided the cat-urine-soaked arms less than the uninfected rats did. Furthermore, when presented with a choice between arms with wild cat urine versus domestic cat urine, the infected rats (but not the uninfected rats) preferred the smell of the predatory wild cats over the domestic cats! Infected rats also moved more slowly around the wild cat urine compared to domestic cat urine, as if just begging any wild cats that may be around to eat them. It appears that T. gondii has developed a mechanism to turn rats into mindless zombies that practically run into the mouths of the nearest, most vicious cat they can find.

These mind-hacked rat-zombies may not be the only victims of T. gondii. People (particularly those who change their kitties’ litter boxes) can also become infected with the parasite. Some estimates suggest that nearly one-third of all people are already infected! Furthermore, people who test positive for T. gondii infection find the smell of cat urine more attractive than people who test negative! Although we are not likely to run to be eaten by our house-bound kitties, we may be more likely to change the litter box (or get more cats and become a crazy cat lady). So it looks like many of us are mind-hacked zombies too!

Want to know more? Check this out:

Kaushik, M., Knowles, S., & Webster, J. (2014). What Makes a Feline Fatal in Toxoplasma gondii's Fatal Feline Attraction? Infected Rats Choose Wild Cats. Integrative and Comparative Biology, 54 (2), 118-128. DOI: 10.1093/icb/icu060 ... Read more »

  • October 27, 2014
  • 07:02 AM
  • 63 views

So, potential juror, how much online porn do you watch?

by Doug Keene in The Jury Room

We can hear the snickers and gasps now–and likely the immediate objection from (probably) the opposing counsel or (unquestionably) the judge. But not always. So why might this be something you want to know? According to new research in the Journal of Sex and Marital Therapy, a distinguishing characteristic of narcissists is that they watch […]

Related posts:
An update on online research of potential jurors
Excuse me potential juror: Is your brain red or blue?
Excuse me, potential juror, but just how big is your amygdala?


... Read more »

Kasper TE, Short MB, & Milam AC. (2014) Narcissism and Internet Pornography Use. Journal of Sex & Marital Therapy, 1-6. PMID: 24918657

  • October 27, 2014
  • 05:40 AM
  • 51 views

Doing the "happy walk" made people's memories more positive

by BPS Research Digest in BPS Research Digest

Walking in a happier style could help counter the negative mental processes associated with depression. That's according to psychologists in Germany and Canada who used biofeedback to influence the walking style of 47 university students on a treadmill.

The students, who were kept in the dark about the true aims of the study, had their gait monitored with motion capture technology. For half of them, the more happily they walked (characterised by larger arm and body swings, and a more upright posture), the further a gauge on a video monitor shifted to the right; the sadder their gait, the more it shifted leftwards. The students weren't told what the gauge measured, but they were instructed to experiment with different walking styles to try to shift the bar rightwards. This feedback had the effect of encouraging them to walk with a gait characteristic of people who are happy.

For the other half of the students, the gauge direction was reversed, and the sadder their gait, the further the gauge shifted to the right. Again, these students weren't told what the gauge measured, but they were instructed to experiment with their walking style and to try to shift the gauge rightwards as far as possible. In other words, the feedback encouraged them to adopt a style of walking characteristic of people who are feeling low.

After four minutes of gait feedback on the treadmill, both groups of students were asked how well forty different positive and negative emotional words described their own personality. This quiz took about two minutes, after which the students continued for another eight minutes trying to keep the gait feedback gauge deflected to the right.
The students' final and crucial task on the treadmill was to recall as many of the earlier descriptive words as possible.The striking finding is that the students who were unknowingly guided by feedback to walk with a happier gait tended to remember more positive than negative self-referential words, as compared with the students who were guided to walk with a more negative style. That is, the happy walkers recalled an average of 6 positive words and 3.8 negative words, compared with the sad walkers who recalled an average of 5.47 positive words and 5.63 negative words. Focusing on the students who achieved the happiest style of gait, they recalled three times as many positive words as the students who achieved the saddest style of gait."Our results show that biased memory towards self-referent negative material [a feature of depression] can be changed by manipulating the style of walking," said the research team led by Johannes Michalak. The observed effects of gait on memory were not accompanied by any group differences in the students' self-reported mood at the end of the study, suggesting a direct effect of walking style on emotional memory processes.The results build on past research that suggests pulling a happy facial expression can lift people's mood. There could be exciting practical implications for helping people with depression, but the researchers acknowledged some issues need to be addressed. For example, the current study involved a small non-clinical sample, and the researcher who delivered the forty emotional words to the walking students was not blind to the gait condition they were in, raising the possibility that he or she inadvertently influenced the results in some way. It's also notable that there wasn't data from a baseline control group whose gait was not influenced; it would have been useful to see how they performed on the memory test. _________________________________  Michalak, J., Rohde, K., & Troje, N. (2015). 
How we walk affects what we remember: Gait modifications through biofeedback change negative affective memory bias Journal of Behavior Therapy and Experimental Psychiatry, 46, 121-125 DOI: 10.1016/j.jbtep.2014.09.004 Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

... Read more »

  • October 27, 2014
  • 04:37 AM
  • 58 views

Diagnosing autism late: after psychosis

by Paul Whiteley in Questioning Answers

The case report from Marly Simoncini and colleagues [1] (open-access) is the topic of today's post. Describing the case of Mr. A, a young man who attempted suicide during a psychotic episode, the paper tracks the developmental history and diagnostic evaluation of this person, culminating in a diagnosis of autism spectrum disorder (ASD) "that had been completely overlooked".

The best thing we can do is go on with our daily routine

The paper is open-access and I would encourage readers to take some time to look through the narrative. Not only are some of the more commonly cited features of autism in childhood described in the paper, as per his toy preferences and his wish to "play alone for hours with a few toys", but also other potentially important points: "He continued with selecting his food (white and squared foods only) and drinking milk only from his infant feeding bottle, until he was an adolescent". The outcomes of various psychometric assessments specific to autism are also discussed, including his scores on the ADOS and ADI (see here), eventually placing him on the autism spectrum.

The important story of how this case report illustrates how much further we need to go in terms of awareness of autism across the lifespan is also complemented by the discussions on how the autism spectrum seems (in some cases) to merge with other spectrums. The authors note: "signs and symptoms of both a psychotic disorder and an ASD might run isolated or in clusters during the entire lifespan, often not reaching the threshold for a categorical diagnosis until adulthood". I might add that the 'autism overlooked' part of this study is probably not something uncommon in modern-day autism (see here).

Treading quite carefully, I have, on a few occasions on this blog, talked about how there may be overlapping presentation of autism and psychosis in some cases (see here and more recently here).
Indeed, not so long ago, I read a very personal account of a mother caring for a child on the autism spectrum and her experiences of a meltdown: "... apparently it used to be called ‘childhood schizophrenia’ and as I watched Ethan totally lost to me at that moment, in what looked like a possessed fit, I could see how it could have been labelled as schizophrenia". I should point out that schizophrenia is not the same as a 'possessed fit' (see here) but can, and does, present as a range of psychological symptoms as part of the psychosis spectrum (see here).

Of course, one should not forget that a diagnosis of autism is seemingly protective of nothing in terms of other somatic or psychiatric conditions being present. It might also be nothing more than coincidence that autism and psychosis ran parallel in the case of Mr. A. That being said, and on the back of other texts such as the go-to paper by Tom Berney [2], I do wonder if greater thought needs to be put into looking at autism across the lifespan. Perhaps, in amongst the sometimes fluid changes in presentation according to factors such as maturation [3], further screening for issues such as psychosis should be more regularly implemented in order to mitigate any negative effects they may have, both for the person concerned and their loved ones.

----------

[1] Simoncini M. et al. Lifetime Autism Spectrum Features in a Patient with a Psychotic Mixed Episode Who Attempted Suicide. Case Reports in Psychiatry. 2014: 459524.

[2] Berney TP. Asperger syndrome from childhood into adulthood. Adv Psychiatr Treat. 2004; 10: 341-351.

[3] Helles A. et al. Asperger syndrome in males over two decades: stability and predictors of diagnosis. Journal of Child Psychology and Psychiatry. 2014. 3 October.

----------

Simoncini, M., Miniati, M., Vanelli, F., Callari, A., Vannucchi, G., Mauri, M., & Dell’Osso, L. (2014). Lifetime Autism Spectrum Features in a Patient with a Psychotic Mixed Episode Who Attempted Suicide. Case Reports in Psychiatry, 2014, 1-4. DOI: 10.1155/2014/459524... Read more »

Simoncini, M., Miniati, M., Vanelli, F., Callari, A., Vannucchi, G., Mauri, M., & Dell’Osso, L. (2014) Lifetime Autism Spectrum Features in a Patient with a Psychotic Mixed Episode Who Attempted Suicide. Case Reports in Psychiatry, 1-4. DOI: 10.1155/2014/459524  

  • October 26, 2014
  • 10:18 PM
  • 78 views

Using neuroimaging to expose the unconscious influences of priming

by neurosci in Neuroscientifically Challenged

In 1996, a group of researchers at NYU conducted an interesting experiment. First, they had NYU students work on unscrambling letters to form words. Unbeknownst to the students, they had been split up into three groups, and each group unscrambled letters that formed slightly different words. One group unscrambled words with a "rude" connotation like aggressively, bold, and interrupt. Another group unscrambled "polite" words like considerate, patiently, and respect. And the third group unscrambled neutral words like watches and normally.

The students were told they should come find the experimenter, who would be waiting in a different room, after they finished the unscrambling task. This, however, was just another part of the experiment. When the students walked up to the experimenter, he was engaged in a conversation with someone else (who was actually in on the experiment). The experimenter stood in such a way that it was clear he knew the student was waiting for him, but he nevertheless continued his conversation and didn't acknowledge the student.

In fact, the experimenter continued talking for 10 minutes unless the student interrupted to draw attention to the fact that he or she was done with the unscrambling task (and being somewhat rudely ignored). What the experiment really had set out to determine was whether the type of words the students unscrambled had an influence on whether or not they interrupted the experimenter. Interestingly, about 80% of students who unscrambled polite words waited a full 10 minutes without interrupting, while only 35% of the students who unscrambled rude words waited that long. On average the rude-word group waited only 5.4 minutes, compared to the polite-word group's 8.7 minutes. The students, of course, were not aware that the words they unscrambled had any effect on their patience, or lack thereof.

Now, think about the implications of this experiment in your daily life.
If its findings are valid--and it's worth noting that this particular area of research has been criticized for the publication of studies that others have been unable to replicate--it suggests that information we are not consciously aware of shapes our thoughts and behavior. Taken a step further, we could begin to question how much of our behavior is even under our own conscious control. For example, you might swear that fight you got into with your significant other was about doing the dishes and it never would have happened if he/she hadn't blatantly disregarded your strong opinions--yet again--about leaving dirty dishes in the sink. But maybe your inclination towards hostility had been influenced by that jerk who cut you off in traffic an hour prior, causing you to overreact negatively and call it quits on a relationship that was pretty good despite a lack of harmony on the relatively minor issue of timely dish washing.

The influence a previous experience has on our likelihood of responding in a particular way later on is known as priming. It was first discovered in the 1970s through a series of simple experiments exploring response time in tasks like determining if groups of letters represented English words. For example, in one such experiment researchers presented participants with pairs of words. Sometimes the words used were actual English words (e.g. butter), other times they were nonsense words (e.g. nart), and they were presented in different combinations of each. The researchers found that participants were able to identify something as an English word more rapidly if the word presented previous to it had a related meaning (e.g. the first word was nurse and the second was doctor).
Since then, a number of experiments have investigated this effect that a previous experience can have on a subsequent response, showing that it can influence everything from reaction time to subtleties of behavior like the speed at which someone walks.

Priming and memory

Priming is considered an example of implicit memory, a term that describes a type of memory that can influence behavior even though we aren't consciously aware of it. We use a form of implicit memory called procedural memory every day when we engage in tasks that we have performed countless times before, like tying our shoes. In these cases we don't consciously think through the process involved in doing the job (often we are thinking of something quite different), but clearly we retain a memory of how to perform the task, and that memory facilitates its execution.

The influence of priming extends much further than shoe-tying, however. Although it may be difficult for us to accept, our implicit memory seems to affect the beliefs we hold and the decisions we make. Because our brains are so good at forming connections between things we see around us and things we have seen or learned in the past, our implicit memory is being accessed on a continuous basis. For example, in another study researchers put participants in two groups: one group filled out a questionnaire in a room that smelled strongly of citrus all-purpose cleaner, while the other filled out a questionnaire in a room with no apparent odor. Then the researchers had both groups eat a crumbly biscuit. The group that had been exposed to the citrus smell was significantly more likely to clean up the crumbs from their biscuit.
Even though they weren't consciously thinking about it, the citrus scent (hypothetically) conjured up implicit associations with cleanliness, which prompted the participants to clean up after themselves.

Priming and the brain

Understanding the neuroscientific correlates of priming has not been simple, in part because it seems to involve a diverse selection of brain areas. One general finding has been that there is a reduction in brain activity during exposure to a primed stimulus (i.e. a stimulus that has been preceded by priming) vs. an unprimed stimulus. For example, if you prime someone by exposing them to words related to transportation, then ask them to unscramble letters that could readily form words like traffic or drive, you will see less activity in their brains than if you hadn't primed them. This should make intuitive sense, as a brain that has been primed does not have to work as hard. It can rely on cues from implicit memory to bring to mind potential words the letters might form.

One reason we tend to see many brain regions involved in priming is that different systems are used to process different types of stimuli--as well as different aspects of the same stimulus. For example, if the primed stimulus involved the meaning of a word, then we would see a decreased response to the primed stimulus in a number of brain areas associated with processing different aspects of a word, like meaning, spelling, phonology, and so on. If the primed stimulus involved an odor, we would see a reduction in brain activity in very different regions. There are also, however, some commonalities in the neural activity underlying priming across different types of stimuli. For example, regions of the inferior temporal cortex and inferior frontal gyrus have been found to respond to abstract qualities of stimuli, and thus they are activated even when the prime and the primed stimulus are presented in different ways.
For example, one study saw activity in these areas when the prime involved normally-oriented words and the primed stimulus involved mirror-reversed words. The inferior temporal cortex and inferior frontal gyrus are also activated in response to primed stimuli of different perceptual modalities (e.g. auditory and visual), and they are still activated when the prime and the primed stimulus are each presented in a different modality. Thus, it may be that areas like these mediate the priming of concepts, regardless of how the stimulus is introduced and initially processed.

Neuroimaging evidence also suggests that the prefrontal cortex may play an especially important role in priming, as it is another area where activity is reduced in response to a number of different types of primed stimuli. The prefrontal cortex is frequently associated with executive functions, and as such it is involved in managing the activity of a network of brain areas in retrieving memories and handling other cognitive duties. Having an implicit memory to draw upon, however, may make its job a little easier, allowing the prefrontal cortex to work more efficiently to complete the task at hand. Thus, reduced activity in the prefrontal cortex during exposure to a primed stimulus may generally represent a decreased reliance on the conscious processing of a stimulus due to the contributions of implicit memory.

There are some patterns of brain activity that we can associate with priming, but wh... Read more »

Schacter, D., Wig, G., & Stevens, W. (2007) Reductions in cortical activity during priming. Current Opinion in Neurobiology, 17(2), 171-176. DOI: 10.1016/j.conb.2007.02.001  

  • October 25, 2014
  • 03:47 AM
  • 93 views

Autism and intolerance of uncertainty

by Paul Whiteley in Questioning Answers

Good morning, gentlemen, the temperature is 110 degrees.

'Change' is often mentioned as something potentially problematic for many on the autism spectrum, and unexpected change can sometimes have profound effects in terms of those so-called 'challenging behaviours' or the presentation of important comorbidity such as anxiety. Like many others from the outside looking in, I was always taught that change as a more general concept was the important issue in autism, but recently the word 'uncertainty' has been creeping into various discussions that I've seen, and in particular the concept of an 'intolerance of uncertainty' noted in cases of autism.

As far as I can ascertain, intolerance of uncertainty with autism in mind was first described in the peer-reviewed literature by Christina Boulter and colleagues [1] and subsequently by Sarah Wigham and colleagues [2]; both papers originating from the University of Newcastle, here in the bracing North East of England. The Boulter paper initially looked at how intolerance of uncertainty (IU) tied into the expression of anxiety in paediatric autism, noting results "consistent with a causal model". The Wigham paper extended these findings, drawing on how the IU-anxiety relationship may also stretch to the presentation (interplay) of sensory issues among other things.

Focusing specifically on the Boulter paper, a few details might be in order (unfortunately the paper is not open-access):

  • IU - defined as "a broad dispositional risk factor for the development and maintenance of clinically significant anxiety" - was assessed as part of a larger research platform looking at anxiety and autism.
  • Derived from various sources (including the Daslne initiative), participants (N=224) including children/young adults diagnosed with an autism spectrum disorder (ASD) (n=114) and asymptomatic controls (n=110) were assessed for IU via the Intolerance of Uncertainty Scales (child and parent report versions).
"The scale assesses IU by asking respondents to rate the extent to which statements relating to emotional, cognitive and behavioural responses to uncertainty are like them, or... like their child". Various other measures including the SRS and the Spence Children's Anxiety Scales (SCAS) were also delivered to participants.Results: well as if we needed telling "children with ASD showed higher levels of anxiety than TD [typically developing] children". As per previous discussion on quality of life and autism, the question of who reports anxiety (first person vs. second person reports) featured in the Boulter findings, although "disagreement appears to have been more pronounced in the TD group than in the ASD group". Children with ASD were also reported to have "significantly higher levels of IU" and such elevations in IU "accounted for the increased levels of anxiety in the children with autism" hence the previous chatter about causal models et al. Perhaps also importantly, the relationship between IU and anxiety "was the same in both children with ASD and those without" so "similar processes may be at work within both populations".There are some obvious caveats to these results. The authors point out that their focus on ability "within the normal range" as a function of their questioning is a limitation, and the 'caution' that goes with "generalising conclusions to all children with ASD". I might add that the introduction of a non-ASD anxiety-only control group would probably not have gone amiss either. Drawing on the more general literature on IU, the findings from Yook and colleagues [3] might also suggest that additional measures of worry and rumination (another important concept [4]) might have been useful to investigate too. This may be particularly important given the reports of overlap in depressive-type symptoms/syndromes occurring alongside cases of autism. 
Me being me, I would also have liked to have seen some physiological measure(s) included too...

Still, I am rather intrigued by these initial findings on IU and how they may potentially fit into the often very disabling anxiety which can accompany a diagnosis of autism. If nothing else, they may present a further target for intervention - bearing in mind the need for further research on the use of something like CBT for anxiety in autism - with the aim of improving quality of life.

Music to close, and continuing a recent theme on this blog: The Smiths and Ask (yes, I have been listening to their greatest hits, and yes, they probably were one of the best bands ever).

----------

[1] Boulter C. et al. Intolerance of uncertainty as a framework for understanding anxiety in children and adolescents with autism spectrum disorders. J Autism Dev Disord. 2014 Jun;44(6):1391-402.
[2] Wigham S. et al. The Interplay Between Sensory Processing Abnormalities, Intolerance of Uncertainty, Anxiety and Restricted and Repetitive Behaviours in Autism Spectrum Disorder. J Autism Dev Disord. 2014 Sep 27.
[3] Yook K. et al. Intolerance of uncertainty, worry, and rumination in major depressive disorder and generalized anxiety disorder. J Anxiety Disord. 2010 Aug;24(6):623-8.
[4] Hare DJ. et al. Anxiety in Asperger's syndrome: Assessment in real time. Autism. 2014 May 8.

----------

Boulter C, Freeston M, South M, & Rodgers J (2014). Intolerance of uncertainty as a framework for understanding anxiety in children and adolescents with autism spectrum disorders. Journal of Autism and Developmental Disorders, 44 (6), 1391-402. PMID: 24272526... Read more »
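The "causal model" language above refers to statistical mediation: the group difference in anxiety should shrink once IU is statistically controlled. A minimal sketch of that logic on simulated data (the effect sizes and noise levels here are invented for illustration; only the group sizes are taken from the paper):

```python
# Hypothetical illustration (not the Boulter data): a simple mediation check,
# asking whether the group difference in anxiety shrinks once intolerance of
# uncertainty (IU) is controlled for.
import numpy as np

rng = np.random.default_rng(0)
n = 224
group = np.repeat([1.0, 0.0], [114, 110])   # 1 = ASD, 0 = TD (sizes from the paper)
iu = 2.0 * group + rng.normal(0, 1, n)      # assumed: group raises IU
anxiety = 1.5 * iu + rng.normal(0, 1, n)    # assumed: anxiety driven by IU, not group directly

def slope_on_group(y, covariates):
    """OLS coefficient for `group`, with optional covariate columns."""
    X = np.column_stack([np.ones(n), group] + covariates)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

total = slope_on_group(anxiety, [])      # total group -> anxiety effect
direct = slope_on_group(anxiety, [iu])   # direct effect, controlling for IU
print(f"total effect:  {total:.2f}")
print(f"direct effect: {direct:.2f}  (near zero => IU accounts for the difference)")
```

Under these made-up parameters the total effect is large while the direct effect hovers around zero, which is the pattern a mediation account predicts; real mediation analyses would also test the indirect path formally.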

  • October 24, 2014
  • 01:14 PM
  • 84 views

Fish Want to Play Too

by Elizabeth Preston in Inkfish

Yes, fish. These aquarium lap-swimmers and pursuers of flaked food aren’t known for their joie de vivre. Yet in one hobbyist’s tanks, scientists say they’ve captured a rare instance of fish playing around. James Murphy is a herpetologist at the Smithsonian National Zoological Park. Although he professionally studies reptiles and amphibians, he keeps fish as […]The post Fish Want to Play Too appeared first on Inkfish.... Read more »

  • October 24, 2014
  • 11:23 AM
  • 96 views

Publication bias afflicts the whole of psychology

by BPS Research Digest in BPS Research Digest

In the last few years the social sciences, including psychology, have been taking a good look at themselves. While incidents of fraud hit the headlines, pervasive issues are just as important to address, such as publication bias: the phenomenon whereby non-significant results never see the light of day, thanks to editors rejecting them or savvy researchers recasting their experiments around unexpected results and not reporting the disappointments. Statistical research has shown the extent of this misrepresentation in pockets of social science, such as specific journals, but a new meta-analysis suggests that the problem may infect the entire discipline of psychology.

A team of psychologists based in Salzburg looked at "effect sizes", which provide a measure of how much experimental variables actually change an outcome. The researchers randomly sampled the PsycINFO database to collect 1,000 psychology articles across the discipline published in 2007, then winnowed the list down to 395 by focusing only on those that used quantitative data to test hypotheses. For each main finding, the researchers extracted or calculated the effect size.

Studies with lots of participants (500 or more) had an average effect size in the moderate range, r = .25. But studies with smaller samples tended to have formidable effect sizes, as high as .48 for studies with under 50 participants. This resulted in a strong negative relationship between number of participants and size of effect, when statistically the two should be unrelated.
As studies with more participants make more precise measurements, .25 is the better estimate of a typical psychology effect size, so the higher estimates suggest some sort of inflation. The authors, led by Anton Kühberger, argue that the literature is thin on modest effect sizes thanks to the non-publication of non-significant findings (rejection by journals would be especially plausible for non-significant smaller studies), and the over-representation of spurious large effects, due to researchers retrospectively constructing their papers around surprising effects that were only stumbled across thanks to inventive statistical methods.

The analysts rejected one alternative explanation. To detect powerful effects a small sample is sufficient, so researchers who anticipate a big effect thanks to an initial "power analysis" might deliberately plan on small samples. But only 13 per cent of the papers in this report mentioned power, and the pattern of correlation in these specific papers appears no different from that found in the ones that never mentioned power. Moreover, the original 1,000 authors were surveyed as to what they expected the relationship between effect size and sample size to be. Many respondents expected no relationship, and even more expected that studies with more participants would have larger effects. This suggests that an up-front, principled power-analysis decision is unlikely to have been driving the main result.

Kühberger and his co-analysts recommend that in future we give more weight to how precise study findings are likely to be, by considering their sample size. One way of doing this is by reporting a statistic that takes sample size into account, the "confidence interval", which describes effect size not as a single value but as a range that we can be confident the true effect size falls within.
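The sample-size point can be made concrete with a short sketch. Assuming the standard Fisher z-transform approach to interval estimates for a correlation-type effect size r (a textbook method, not something specified in the paper), the same r = .25 is far less informative when it comes from a small study:

```python
# A minimal sketch: 95% confidence interval for a correlation r via the
# Fisher z-transform. The interval's width shrinks as sample size grows.
import math

def r_confidence_interval(r, n, z_crit=1.96):
    """Approximate 95% CI for Pearson r via Fisher z; requires n > 3."""
    z = math.atanh(r)               # Fisher transform of r
    se = 1.0 / math.sqrt(n - 3)     # approximate standard error of z
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)    # back-transform to the r scale

print(r_confidence_interval(0.25, 50))    # small study: wide interval
print(r_confidence_interval(0.25, 500))   # large study: narrow interval
```

With n = 50 the interval is so wide it cannot even exclude zero, whereas with n = 500 it stays comfortably positive; reporting the interval makes that difference in precision visible.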
As we all want to maintain confidence in psychological science, it's a recommendation worth considering (but see here for an alternative view).

_________________________________

Kühberger, A., Fritz, A., & Scherndl, T. (2014). Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size. PLoS ONE, 9 (9). DOI: 10.1371/journal.pone.0105825

--further reading--
Questionable research practices are rife in psychology, survey suggests
Serious power failure threatens the entire field of neuroscience
Made it! An uncanny number of psychology findings manage to scrape into statistical significance
Fake data or scientific mistake?

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.
... Read more »

  • October 24, 2014
  • 02:51 AM
  • 67 views

Autism, siblings and DSM-5 Social Communication Disorder

by Paul Whiteley in Questioning Answers

A quick post to bring to your attention the paper by Meghan Miller and colleagues [1] who concluded that: "Pragmatic language problems are present in some siblings of children with ASD [autism spectrum disorder] as early as 36 months of age". Further: "As the new DSM-5 diagnosis of Social (Pragmatic) Communication Disorder (SCD) is thought to occur more frequently in family members of individuals with ASD, it is possible that some of these siblings will meet criteria for SCD as they get older".

Isn't this a school day?

The DSM-5, as many in the autism community will already know, has been the source of quite a bit of discussion/argument as to how it has started to re-define what we label as autism or autism spectrum disorder. The initial signs have been that use of the DSM-5 criteria does indeed impact on the numbers of cases of autism (see here) and, in particular, that the category termed 'Social Communication Disorder' (SCD) is filling up with those who might present with social communication issues without the repetitive or restricted behaviours required to fulfil the ASD label. Whether the same levels of services and resources will be available to those with SCD as are supposed to be available to those with ASD remains to be seen.

I did wonder whether the Miller findings were an important indication (although not the first [2]) that science might also be putting a bit more flesh on the bones of the concept of a broader autism phenotype (BAP). Describing the subtle speech, language and social interactive issues found on the diagnostic borderlands of autism [3], it strikes me that there is more than a smidgen of overlap between SCD and the BAP (at least with more strength of data than the suggestion of a link between the BAP and postnatal depression).
With cautions down the years about assuming "all children with pragmatic difficulties have autism" [4], does the advent of the SCD diagnostic category offer a viable alternative?

Music to close, and the sheer brilliance of Morrissey (live). And for those who might want to know a little more about the man behind the music: The Importance Of Being Morrissey.

----------

[1] Miller M. et al. Early pragmatic language difficulties in siblings of children with autism: implications for DSM-5 social communication disorder? J Child Psychol Psychiatry. 2014 Oct 15.
[2] Botting N. & Conti-Ramsden G. Pragmatic Language Impairment without Autism. Autism. 1999; 3: 371-396.
[3] Dawson G. et al. Defining the broader phenotype of autism: genetic, brain, and behavioral perspectives. Dev Psychopathol. 2002 Summer;14(3):581-611.

----------

Miller M, Young GS, Hutman T, Johnson S, Schwichtenberg AJ, & Ozonoff S (2014). Early pragmatic language difficulties in siblings of children with autism: implications for DSM-5 social communication disorder? Journal of Child Psychology and Psychiatry, and Allied Disciplines. PMID: 25315782... Read more »

  • October 23, 2014
  • 04:42 PM
  • 31 views

Trick-or-Treating: What Do You Hand Out On Halloween?

by Melissa Chernick in Science Storiented

Halloween is almost here. And you know what that means: Candy! It's one of those Halloween traditions that I just never seem to have grown out of. Those little chocolate bars are seriously dangerous to my waistline. Remember how much Halloween candy you ate when you were a kid? Were you one of those kids who gorged on all that sugary goodness, or were you the type to parse it out and make it last? I was a Trader, that kid who made deals to trade all her bad candy for the good stuff. Anyway, the topic of Halloween candy got me searching through the scholarly journals for an article for today's post. I came across a paper that asks if children really need all of that candy on Halloween. My first instinct was "Of course they do! It's Halloween!" But, well, read on…

It seems that being a fat American isn't just limited to adults: childhood obesity has been on the rise over the last 3 to 4 decades. Access to unhealthy foods and the poor nutritional quality of children's diets are much to blame for this, and the promotion and glorification of high-sugar, high-fat foods on Halloween is simply a good example. It probably comes as no surprise that research has shown that when children are given free access to tasty food they eat it, especially if it's sweet, even when they are not hungry. But is there a good alternative that kids will like? A slightly older paper published in the Journal of Nutrition Education and Behavior investigated the option and value of nonfood treats as substitutes for candy on Halloween.

To do this, the researchers gave 284 trick-or-treating children a choice of a toy or candy. The toys included stretch pumpkin men, large glow-in-the-dark insects, Halloween theme stickers, and Halloween theme pencils. The candy choices were recognizable name-brand lollipops, fruit-flavored chewy candies, fruit-flavored crunchy wafers, and "sweet and tart" hard candies. All of these toy and candy options cost between 5 and 10 cents per item.
When a trick-or-treating child arrived at a door, they were asked for their age, gender and a description of their Halloween costume (pretty typical…except maybe the gender question). Then they were presented with 2 identical plates: 1 with 4 different types of toys and the other with 4 different types of candy, alternating by site/household which side the plates were on. Only children between 3 and 14 were included. And if a child asked for both toy and candy, they were allowed to take both but were excluded from the study - only 1 little girl did that.

The results of the study showed that children chose toys as often as they chose candy. This suggests that children may forego candy more readily than adults expect. The authors cite Social Cognitive Theory as providing a way for candy alternatives to become more commonplace. According to this theory, when parents see that children are accepting the candy alternatives, they are more likely to continue the new toy-giving behavior. Factor in the other fun Halloween activities - dressing up, walking around the neighborhood at night, etc. - and these toy treats become positively associated with the fun and the holiday.

Okay, so what about the Halloween-only-comes-once-a-year argument? I'll admit it is a good one, and it is something that the authors spend time addressing. They point out that Halloween isn't the only holiday where food and candy are advertised. Well, isn't that the truth. They even make a nice list of other food-laden holidays and events: weddings, new babies, graduation, back-to-school, birthdays, Christmas, Hanukkah, Valentine's Day, Easter, St. Patrick's Day, Cinco de Mayo, Earth Day (wait…you get food on Earth Day?), Mother's Day, Father's Day, and Independence Day. What they don't add in are all of the other food-filled events that come seasonally (picnics, cook-outs, etc.) and socially (happy hours, get-togethers, etc.).
Put together, that's a lot of bad food choices all year long.

Ultimately, what the authors are getting at is promoting healthy choices throughout the year. Part of this is giving children healthier options and traditions. It is almost more of a change for adults than it is for children. Breaking those food habits and associations isn't easy, y'all. Food isn't love, no matter how many Hershey's Kisses you give someone. Wow, did that ever sound shrinky!

But I'll throw in a last little note that I think almost every child on the planet would agree with: Don't be that house that hands out toothbrushes.

Schwartz, M., Chen, E., & Brownell, K. (2003). Trick, Treat, or Toy: Children Are Just as Likely to Choose Toys as Candy on Halloween. Journal of Nutrition Education and Behavior, 35 (4), 207-209. DOI: 10.1016/S1499-4046(06)60335-7

(image and product via epicurious)... Read more »
