Reports on the latest psychology research plus psych gossip and comment. Brought to you by the British Psychological Society.
People hold strong feelings about the meanings of irony and sarcasm. Just look at the reaction to Alanis Morissette's global hit 'Ironic': despite its commercial success, the apparent misunderstanding of irony conveyed by the song provoked a chorus of derision (at least everyone agreed that this state of affairs was ironic). So I'd say it's with some courage that Melanie Glenwright and Penny Pexman have chosen to investigate the tricky issue of when exactly children learn the distinction between sarcasm and irony. Their finding is that nine- to ten-year-olds can tell the difference, although they can't yet explicitly explain it. Five- to six-year-olds, by contrast, understand that sarcasm and irony are non-literal forms of language, but they can't tell the difference between the two.

So that we're all on the same page, here's what Glenwright and Pexman recognise as the distinction between sarcasm and irony: in both cases the speaker says the opposite of what they mean, but whereas an ironic statement is aimed at a situation, a sarcastic remark is aimed at a person and is therefore more cutting.

Glenwright and Pexman presented five- to six-year-olds and nine- to ten-year-olds with puppet-show scenarios that ended with one of the characters making a critical remark. This remark could be literal, aimed at a person or situation, or it could be non-literal, again aimed either at a person (i.e. sarcastic) or a situation (i.e. ironic). To illustrate: two puppets are playing on a trampoline and one falls on his face. 'Great trampoline tricks,' the other character says, sarcastically. Contrast this with two puppets playing on a saggy trampoline with little bounce. One of them says 'great trampoline', an ironic remark.

To gauge the children's depth of understanding, the researchers asked them to rate how mean the utterances were (using a sliding scale of smiley to miserable faces) and asked them which character they most identified with, the idea being that in instances of sarcasm they would, out of sympathy, identify more with the target of that sarcasm.

The children's responses showed that both age groups recognised the non-literal utterances as intending to mean the opposite of what was said. However, only the older age group showed a sensitivity to the difference between irony and sarcasm. They, but not the younger children, rated sarcastic utterances as meaner and were more likely to identify with the target of sarcasm, presumably out of sympathy. The older children's comprehension was not complete, though: in open-ended questioning they were unable to explain their differential response to sarcasm and irony.

'By nine to ten years of age, children's sensitivity to the distinction between sarcasm and verbal irony highlights their impressive understanding of how people's feelings are affected by others' speech ...' the researchers said. 'We investigated one distinction here, but there are other non-literal forms that should be examined, such as understatement and hyperbole.'

_________________________________
Glenwright, M., & Pexman, P.M. (2010). Development of children's ability to distinguish sarcasm and verbal irony. Journal of Child Language, 37(2), 429-451. PMID: 19523264
Earlier this year a piece of emotion research provoked a rather heated reaction in some quarters after it claimed to show that, contrary to the writings of Charles Darwin, Paul Ekman and others, facial emotional expressions are not universal after all. "Seriously, is this all that it takes to be published in Current Biology? Sheesh," was the verdict of one incredulous online commenter on Reddit (a more considered critical reaction is here). Now, with a diplomat's tact, David Matsumoto and colleagues have presented new findings showing that facial emotional expressions start out universal, but then become culturally differentiated. We're all correct, everyone wins, big smiles all round, or maybe little ones, depending on where you were brought up.

Matsumoto's team studied thousands of photographs taken of jūdōka at the Athens Olympics in 2004, just after matches had ended. The researchers were particularly interested in whether, and how quickly, competitors altered their initial facial emotional expressions after winning or losing.

The key finding was that cultural differences emerged, with athletes from collectivist cultures, such as China, tending to mask their emotional expressions more than athletes from individualistic cultures like the UK. The research also showed that jūdōka from wealthier, more densely populated countries tended to be less concerned to mask their emotional expressions than competitors from rural, less populated countries.

"These findings demonstrate that, across time, a given individual's emotional expressions in a single context can be both universal and culture-specific," the researchers said. Further analysis showed that cultural influences on emotional expressions tended to kick in within one to two seconds of the initial appearance of a facial emotional display.

Matsumoto and his colleagues believe that the initial facial reaction is triggered automatically by subcortical brain structures, before more culturally specific modification is applied by the motor cortex. Dampening down an emotional expression appeared to take less time than completely masking an initial emotional display with another expression, consistent with the idea that masking requires more neurocognitive resources. "These findings open the door to future research and theory on the temporal dynamics of culturally moderated facial expressions," the researchers said.

_________________________________
Matsumoto, D., Willingham, B., & Olide, A. (2009). Sequential dynamics of culturally moderated facial expressions of emotion. Psychological Science, 20(10), 1269-1275. PMID: 19754526
People don't need to be treated as a stereotype for harm to occur; their mere belief that they could be viewed in a stereotyped fashion is enough - a phenomenon known as 'stereotype threat'. For example, women reminded of the stereotype that men are better at maths tend to perform more poorly in a subsequent maths task, even if they are actually treated fairly. Now Julie Henry and colleagues have extended this line of research to the domain of mental health. They've found that patients with a schizophrenia diagnosis function less well socially when they think that the person they're chatting with knows their diagnosis.

Thirty people diagnosed with schizophrenia or schizoaffective disorder spent a few minutes chatting on their own to one research assistant and then did the same with another assistant an hour later. There were a few points of deception. First, the participants were led to believe that the assistants were participants from another study. Most importantly, before one of the conversations began, they were told that the assistant knew about their diagnosis of schizophrenia; before the other, they were told the assistant did not know. They were also told, truthfully, that neither of the people they were to chat with themselves had a diagnosis of schizophrenia. In reality, the research assistants didn't know whether each participant had a diagnosis of schizophrenia or not. This was achieved by having them chat to the participants diagnosed with schizophrenia plus a number of control participants; crucially, they weren't told in advance who was who. After each conversation, the research assistants rated the social behaviour of the person they'd just chatted with. The participants in turn rated the behaviour of the assistant they'd just chatted with and said how they felt the conversation had gone.

The key finding is that the social functioning of the participants with schizophrenia seemed to deteriorate when they thought their conversational partner knew their diagnosis (even though they didn't). Specifically, when they thought their diagnosis had been disclosed, the participants were rated by the research assistants as being more impaired at initiating conversations and at switching topics appropriately, and the assistants also found these conversations less comfortable.

Henry's team can't be sure, but they think these apparent deficits emerged because the participants' concern about how they would be judged, in light of their diagnosis having been disclosed, interfered with their ability to converse effectively.

A further twist was that the participants with schizophrenia seemed unaware of these effects - they reported finding the conversations in which their diagnosis was known just as comfortable and successful as those in which they thought their diagnosis had been kept hidden. This contrasts with non-clinical research on stereotype threat, in which people seem to be aware of the effects on their performance.

The results provide food for thought regarding when and how mental health diagnoses should be disclosed. The researchers said their findings suggest 'that one of the defining qualities of [schizophrenia] - social skill impairment - is not caused solely by the disorder per se, but rather, also derives from feelings of being stereotyped.'

_________________________________
Henry, J., Hippel, C., & Shapiro, L. (2010). Stereotype threat contributes to social difficulties in people with schizophrenia. British Journal of Clinical Psychology, 49(1), 31-41. DOI: 10.1348/014466509X421963
Mirror neurons are one of the most hyped concepts in psychology and neuroscience. V.S. Ramachandran famously wrote that they will 'do for psychology what DNA did for biology'. Although recordings from single cells in the brains of monkeys have identified 'mirror' neurons that respond both to the execution of a movement and the observation of another agent performing that same movement, the existence of such cells in humans has, up until now, been inferred only from indirect evidence, particularly brain imaging. Now, for the first time, Roy Mukamel and colleagues have provided direct evidence, using implanted electrode recordings of single cells, for the existence of mirror neurons in humans.

Mukamel's team seized the opportunity for single-cell recording provided by the clinical investigations being carried out on patients with intractable epilepsy. These patients had electrodes implanted into their brains to identify the loci of their seizures. Mukamel and his colleagues recruited 21 of these patients and had them look at videos of hand gestures or facial expressions on a laptop in one condition, and perform those same gestures and expressions in another condition.

Most of the 1,177 cells recorded showed a response either to the execution of an action or the sight of that action, not both. However, there was a significant subset of 'mirror' neurons in the front of the brain, including the supplementary motor area, and in the temporal lobe, including the hippocampus, that responded to the sight and execution of the very same actions.

Critics could argue that rather than having mirror properties, these cells were responding to a concept. According to this argument, a cell that responded to the sight of a smile and the execution of a smile was actually being activated by the smile concept. Mukamel's group reject that argument: they had a control condition in which the words for actions appeared on a screen, rather than those actions being seen or performed. The postulated mirror neurons responded to the sight and execution of an action, but not the word.

Another potential criticism is that the execution-related activity of a postulated mirror neuron is triggered by the sight of one's own action, rather than by motor output per se. However, this can't explain the mirror neurons that responded both to the sight of a given facial expression and to one's own execution of that facial expression (although proprioceptive feedback could still be a potential confound).

Mirror neurons make functional sense in relation to empathy and imitative learning, but a drawback could be unwanted imitation and confusion regarding ownership of actions. The researchers uncovered another subset of cells that could help reduce these risks - these cells were activated by the execution of a given movement but inhibited by the sight of someone else performing that same movement (or vice versa). 'Taken together,' the researchers concluded, 'these findings suggest the existence of multiple systems in the brain endowed with neural mirroring mechanisms for flexible integration and differentiation of the perceptual and motor aspects of actions performed by self and others.'

_________________________________
Mukamel, R., Ekstrom, A.D., Kaplan, J., Iacoboni, M., & Fried, I. (2010). Single-Neuron Responses in Humans during Execution and Observation of Actions. Current Biology [in press].
The Alien vs. Predator series of films provides a rare exception to the usual rule that fictional worlds are separate, with pretend entities in one not existing in any other. In 2006, Deena Skolnick and Paul Bloom showed that young children aged between three and six years already understand this idea well. For example, the children said that the comic hero Batman could touch his sidekick Robin, but couldn't touch the sea sponge cartoon character SpongeBob. Now Weisberg (née Skolnick) and Bloom have built on these findings, showing that young children also keep fictional game worlds separate when they are playing.

An initial study involved 50 three- and four-year-olds. Each child sat with two experimenters, a toy bear, a toy doll and a central pile of toy blocks. The first experimenter, located to the right, introduced the child to the doll Mary; together they pretended it was her bath-time and the child used one or more blocks as bath objects, such as soap. Then the second experimenter, located to the left, introduced the child to Bruno the bear. They pretended it was his bedtime and the child used one or more blocks in the game, for example as a pillow.

The crucial part came next, as the first experimenter told the child that Mary had grown tired and needed to sleep, whilst Bruno had woken and wanted to wash. Rather than using the toy block already established to be a pillow in Bruno's world, the children, regardless of age, nearly always reached for a new block from the pile to use as a pillow for Mary. Similarly, rather than using Mary's soap, most children reached for a new block to use as soap for Bruno. This remained the case in a follow-up study in which the researchers took great effort to ensure the children understood that the objects in one game world were available, and no longer being used by another toy character. "Just because something was a pillow in Bruno's world did not necessarily mean that it was a pillow in Maggie's world," the researchers said.

Concerned that the parallel play arrangements of the first two studies were unnatural, the researchers also performed a third and final study in which two games were played in sequence. This time, if the researcher announced between game sessions "I'm bored, let's play something else", the children were far less likely to transfer pretend objects from one game to another compared with an alternative situation in which the researcher merely said they should take a break between play sessions. In other words, the children seemed to understand when the researcher intended that they create a new fictional world.

"The results from these three studies suggest that children keep different pretend play games separate from each other, imposing subtle structure on their make-believe worlds," the researchers said.

_________________________________
Skolnick Weisberg, D., & Bloom, P. (2009). Young children separate multiple pretend worlds. Developmental Science, 12(5), 699-705. DOI: 10.1111/j.1467-7687.2009.00819.x
When we’re socialising and trying to make a certain impression – to appear confident, say, or smart – the attempt affects our perception of the person we’re talking to, leading us to think they have less of the very trait we’re trying to demonstrate in ourselves. Bryan Gibson and Elizabeth Poposki showed this in five experiments involving hundreds of undergrads.
In each experiment participants watched a short film before discussing it with another student (actually a stooge working for the researchers) in two brief (15- and 8-second) exchanges over a webcam. Crucially, half the participants were given a specific ‘impression management’ goal: to appear introverted, extraverted, smart, confident or happy, depending on the experiment. Afterwards the participants rated themselves and the student they’d conversed with.
The central finding was that, compared with the control participants, students given an impression management goal tended to rate their conversation partner lower on whichever trait they’d tried to demonstrate in themselves, but not on other traits.
Gibson and Poposki’s theory is that this effect occurs via two mechanisms. Striving to make a particular impression, they say, causes us to adopt a comparison mindset and shifts our own self-construct on a given trait; as a consequence, our conversation partner appears to have less of that trait in comparison with ourselves.
This explanation was borne out by the various experiments. For example, the effect still occurred even when participants were given an impression management goal, but no chance to act on it – they were tricked into thinking their webcam was broken, so they could see and hear their partner but their partner couldn't see or hear them. This suggests the mere formation of an impression management goal is enough to shift the self-concept and affect our perception of others. On the other hand, this study's central effect didn’t occur when the researchers recruited participants who reported having a particularly fixed self-construct with regards to the relevant trait. In other words, when a person’s self-construct wasn’t shifted by an impression management attempt, their perception of their conversation partner wasn’t altered.
Gibson and Poposki said their findings raise many interesting questions for future research. One of these concerns narcissists, who have an ongoing desire to come across as highly intelligent. This could cause them to chronically underestimate other people’s intelligence, which might well contribute to their social difficulties.
‘Our research highlights the notion that the impressions we form of others are not made in a social vacuum,’ the researchers concluded. ‘By selecting particular impression management goals to guide our social interactions, we may unwittingly influence how we come to view others as much as we influence how they come to view us.’
Gibson, B., & Poposki, E.M. (2010). How the adoption of impression management goals alters impression formation. Personality and Social Psychology Bulletin, 36(11), 1543-1554. PMID: 20921279
'I don't tip because society says I have to. All right, if someone deserves a tip, if they really put forth an effort, I'll give them a little something extra. But this tipping automatically, it's for the birds. As far as I'm concerned, they're just doing their job.' - Mr Pink, Reservoir Dogs.

Stats from the USA suggest that $40 billion is spent on tips every year. Yet from the traditional economic perspective, which sees us as rational agents operating in our own interest, tipping waiters, barbers, taxi drivers and other service workers is crazy. You don't have to, so why do you? That's if you do - not everyone does.

In an effort to explore our motivations for tipping, Stephen Saunders and Michael Lynn sent out 29 fieldworkers to survey 530 South African citizens after they'd had an encounter with a car guard. These unpaid workers are a common sight in South Africa at shopping centres, hospitals and schools. They help with parking, protect the car from vandalism and assist drivers with loading shopping and luggage.

One explanation for why we tip is that we're trying to encourage good service in the future. However, Saunders and Lynn found no evidence that people who used a car guard more often were more likely to tip, as you'd expect if this were their true motive. By contrast, perceived service quality was associated with both the likelihood of giving a tip and the amount tipped, suggesting that participants were using tipping as a form of reward. Similarly, those who said they thought it was important to help others in need tended to tip more (although they weren't any more likely to tip), suggesting altruism was another motive. Finally, social norms were a key factor: participants who said their friends and relatives thought it was important to tip were more likely to tip themselves, especially if there were more people with them at the time of questioning. Size of tip was not associated with this factor, perhaps because it's only the act of tipping that's visible to others, rather than the amount tipped.

'Hopefully this paper will encourage more economists to look beyond the apparent irrationality of tipping and to study it from both a behavioural economics and psychological perspective,' the researchers said.

In a separate study, based in Utah, John Seiter and Harry Weger tested the effects of ingratiation on food servers' tips. They had two waiters and two waitresses go about their usual duties but with a twist: for half the parties they served, they were instructed to compliment the customers, telling them that they'd made an excellent choice in what they'd ordered. Counting the tips received from 348 dinner parties showed that complimenting customers on making a shrewd order led to tips that were three per cent greater on average than when no compliment was made - a statistically significant boost. 'A roughly 3 per cent increase may seem a small amount,' the researchers said, '[but] an additional $1 to $5 per shift could translate into hundreds of dollars per year for each food server.'

More in-depth analysis showed that complimenting customers on their order only led to bigger tips for parties of two to three people. It made no difference with a party of four and actually led to smaller tips for groups larger than this (the research involved parties of up to seven). It also turned out that one of the waiting staff had received smaller tips after complimenting customers (even though the group average was for larger tips in this condition). Seiter and Weger surmised this could be because she didn't come across as sincere. This study builds on earlier research showing that use of mimicry, light touches on customers' shoulders, happy faces on the bill and squatting to customers' eye level can all help provoke larger tips.

_________________________________
Saunders, S., & Lynn, M. (2010). Why tip? An empirical test of motivations for tipping car guards. Journal of Economic Psychology, 31(1), 106-113. DOI: 10.1016/j.joep.2009.11.007
Seiter, J., & Weger, H., Jr. (2010). The Effect of Generalized Compliments, Sex of Server, and Size of Dining Party on Tipping Behavior in Restaurants. Journal of Applied Social Psychology, 40(1), 1-12. DOI: 10.1111/j.1559-1816.2009.00560.x
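The researchers' "$1 to $5 per shift" claim is simple arithmetic to check. A minimal sketch, using hypothetical figures that are not from the study (a $100 base tip income per shift and 250 shifts per year), with only the 3 per cent boost taken from the paper:

```python
# Rough arithmetic behind the "hundreds of dollars per year" claim.
# Assumed figures (NOT from the study): base tips per shift, shifts per year.
base_tips_per_shift = 100.0   # dollars per shift, hypothetical
shifts_per_year = 250         # hypothetical full-time schedule
boost = 0.03                  # the ~3 per cent average increase reported

extra_per_shift = base_tips_per_shift * boost       # about $3 per shift
extra_per_year = extra_per_shift * shifts_per_year  # about $750 per year

print(extra_per_shift, extra_per_year)
```

On these assumptions the compliment is worth roughly $750 a year, squarely in the "hundreds of dollars" range the researchers describe.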
Pounding the treadmill in 1993, John Basinger, aged 58, decided to complement his physical exercise by memorising the 12 books, 10,565 lines and 60,000 words that comprise the Second Edition of John Milton's epic poem Paradise Lost. Nine years later he achieved his goal, performing the poem from memory over a three-day period, and since then he has recited the poem publicly on numerous occasions. When the psychologist John Seamon of Wesleyan University witnessed one of those performances in December 2008, he saw an irresistible research opportunity.
Seamon and his colleagues tested Basinger's memory systematically in the lab. They provided two lines as a cue and then 'JB' (as they refer to him in their report) had to reproduce the next ten. With the exception of books VII, his least favourite, and XI, JB's performance was uniformly exceptional. Regardless of whether the researchers revealed which book and book section the cue lines were from, and regardless of whether they tested portions of the poem in sequence or picked them at random, JB correctly recalled around 88 per cent of words. When mistakes were made, they tended to be omissions rather than altered or added words. The researchers also tested JB's everyday memory and found that in all non-Milton respects it was age-typical.
Seamon and his co-workers claim JB's feat shows that 'cognitive expertise in memorisation remains possible even in later adulthood, a time period in which cognitive researchers have typically focused on decline.'
Just how did JB manage to pull off this incredible feat? He studied for about one hour per day, reciting verses in seven-line chunks, consistent with Miller's magic number seven - the capacity of short-term, working memory. Added together, JB estimates that he devoted between 3,000 and 4,000 hours to learning the poem. Seamon's team interpret this commitment in terms of Ericsson's 'deliberate practice' theory, in which thousands of hours of perfectionist, self-critical practice are required to achieve true expertise.
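JB's estimate is easy to sanity-check against the figures above. A back-of-envelope sketch, assuming the article's approximate rate of one hour a day sustained over the nine years of study:

```python
# Back-of-envelope check of JB's practice estimate:
# roughly one hour per day over the nine years (1993-2002) of learning.
years = 9
hours_per_day = 1  # approximate daily study time reported in the article

total_hours = years * 365 * hours_per_day
print(total_hours)  # → 3285, within JB's own 3,000-4,000 hour estimate
```

The one-hour-a-day figure is an approximation, but it lands the total squarely inside the range JB reports.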
JB didn't use the mnemonic techniques favoured by memory champions, but neither, the researchers say, should we see his achievement as a 'demonstration of brute force, rote memorisation'. Rather it was clear that JB was 'deeply cognitively involved' in learning Milton's poem. JB explained:
'During the incessant repetition of Milton's words, I really began to listen to them, and every now and then as the whole poem began to take shape in my mind, an insight would come, an understanding, a delicious possibility. ... I think of the poem in various ways. As a cathedral I carry around in my mind, a place that I can enter and walk around at will. ... Whenever I finish a "Paradise Lost" performance I raise the poem and have it take a bow.'_________________________________
Seamon, J., Punjabi, P., & Busch, E. (2010). Memorising Milton's Paradise Lost: A study of a septuagenarian exceptional memoriser. Memory, 18(5), 498-503. DOI: 10.1080/09658211003781522
Faced with a challenging task, try folding your arms - new research shows people persevere for longer when their arms are crossed.

Ron Friedman and Andrew Elliot gave dozens of students an impossible anagram to solve. Half the students were instructed to attempt the puzzle with their hands on their thighs, while the other students were told to sit with their arms folded. The thigh group only persevered for about 30 seconds on average, while the students with their arms folded struggled on for nearly 55 seconds.

A second experiment involved testing more students with anagr...
Friedman, R., & Elliot, A. (2008). The effect of arm crossing on persistence and performance. European Journal of Social Psychology, 38(3), 449-461. DOI: 10.1002/ejsp.444
Show one image exclusively to one eye and a different image exclusively to the other and, rather than experiencing a merging of the images, an observer's percept will flit backwards and forwards randomly and endlessly between the two. This "binocular rivalry", as it's known, has been of particular interest to psychologists because it shows how the same incoming sensory information can give rise to two very different conscious experiences. Now, in a research first, psychologists have shown that a similar process occurs with our sense of smell. If one odour is presented to one nostril and another odour to the other nostril, a person will experience "binaral rivalry" - sensing one smell and then the other, backwards and forwards, rather than a blending of the two.

Wen Zhou and Denise Chen presented twelve participants with the smell of rose to one nostril and the smell of a marker pen to the other. The odours were presented intermittently, every twenty to thirty seconds, to prevent "adaptation" - the tendency for brain cells to gradually reduce their response to a continuous stimulus. After each break in the smells, the participants indicated on a visual scale whether they had detected the scent of rose or of marker pen. Just as with binocular rivalry, the participants' perceptual experience fluctuated back and forth randomly between the two scents.

The researchers believe this nostril rivalry is related in some way to the process of adaptation, both in the receptor cells in the nose and in the part of the brain that processes smells. For example, when repeatedly presented with a balanced mix of both smells, the participants' sensory experience fluctuated between rose and marker pen, presumably because of adaptation in the brain: as central neurons tired of one odour, their response to the other became more dominant, and back again. The researchers also showed that adaptation occurs in the nose: swapping the bottles of odour around from one nostril to the other reinstated participants' experience of a given smell after it had previously faded through continuous sniffing. "Our work sets the stage for future studies of this phenomenon," the researchers said.

_________________________________
Zhou, W., & Chen, D. (2009). Binaral Rivalry between the Nostrils and in the Cortex. Current Biology, 19(18), 1561-1565. DOI: 10.1016/j.cub.2009.07.052
Do bilinguals have an internal switch that stops their two languages from interfering with each other, or are both languages always "on"? The fact that bilinguals aren't forever spurting out words from the wrong language implies there's some kind of switch. Moreover, in 2007, brain surgeons reported evidence for a language switch when their cortical prodding with an electrode caused two bilingual patients to switch languages suddenly and involuntarily. On the other hand, there's good evidence that languages are integrated in the bilingual mind. For example, bilinguals are faster at naming an object when the word for that object is similar or the same in the two languages they speak (e.g. ship/schip in English and Dutch).

Now Eva Van Assche and colleagues have provided further evidence for the idea of bilingual language integration by showing that a person's second language affects the way they read in their native language. The researchers recorded the eye movements of 45 bilingual Belgian students as they read sentences in their native Dutch. The key finding was that they read Dutch words faster when the equivalent word in their second language, English, was similar or the same. Specifically, they spent less time fixating on words like "piloot" ("pilot" in English) than on control words like "eend" ("duck" in English).

Van Assche and her colleagues said this shows that even when bilinguals read sentence after sentence in their native tongue, access to words in their second language remains open, rather than switched off, thus affecting the way the native language is processed. "Becoming a bilingual means one will never read the newspaper again in the same way," they concluded. "It changes one of people's seemingly most automatic skills, namely, reading in one's native language."

_________________________________
Van Assche, E., Duyck, W., Hartsuiker, R.J., & Diependaele, K. (2009). Does Bilingualism Change Native-Language Reading? Cognate Effects in a Sentence Context. Psychological Science. PMID: 19549082

If you found this post interesting, you might also enjoy:
- Babies raised in a bilingual home show enhanced cognitive control.
- Change your personality, learn a new language.
I've always envied early risers, those who spring out of bed at the crack of dawn, ready, it seems, to take on the world. Of course their early vitality could be short-lived. Morning friskiness gives the impression of a positive nature, but are 'larks' really more proactive people than 'owls'?

Yes, according to Christoph Randler, who surveyed 367 student participants and found a correlation between their self-reported 'morningness' (as revealed by their answers to questions about how easy they find it to get up in the morning and how alert they feel) and their self-reported proactivity (measured by their agreement with statements like 'I spend time identifying long-range goals for myself' and 'I feel responsible for my own life'). The correlation was relatively weak (.44, where 1 would be a perfect match) but was statistically significant. Randler also found proactivity to be (inversely) correlated, though to a lesser extent than morningness, with so-called 'social jetlag'. This is caused by the mismatch between one's biological time-keeping and the demands of social time, as betrayed by the difference in students' chosen rise times between weekdays and weekends.

These findings suggest that morning people really are more proactive. What's not clear is why - whether it's because they really do have an inherent energy and drive, or if instead it's simply easier for morning people to be proactive in a world that is generally tailored towards rising early rather than working late. '... [W]hether evening people could be more proactive in their lifestyles if they had less restrictive schedules (e.g. they could start work later in the day)' is a question for future research, Randler said.

This is far from being the first study to look for associations between people's sleep habits and other personality factors. Prior research suggests that evening people are more extraverted, pessimistic and creative, whilst morning people are more conscientious. Twin studies suggest that genetic differences explain a lot of the variation in people's morningness and eveningness.

_________________________________
Randler, C. (2009). Proactive People Are Morning People. Journal of Applied Social Psychology, 39(12), 2787-2797. DOI: 10.1111/j.1559-1816.2009.00549.x
We hear a lot about the harmful consequences to children of seeing their parents argue or watching violence on TV, but very little about the potential harm of witnessing school bullying. Now Ian Rivers and colleagues have published findings suggesting that being a bystander to bullying can often be just as psychologically harmful as being directly involved.

The researchers asked just over 2,000 predominantly white children, aged 12 to 16, at 14 state schools in the north of England how much they'd been bullied, been a bully, or witnessed bullying over the last school term. Bullying appeared to be part of the daily lives of most of the children: 63 per cent said they'd seen bullying going on, 20 per cent admitted they'd bullied someone else, and 34 per cent reported they'd been bullied. The pupils were also asked questions about their mental health and their use of cigarettes, alcohol and other drugs.

The findings showed that being a witness to bullying was associated with increased mental health problems and substance abuse, above and beyond the effects of being directly involved in bullying. In other words, witnessing bullying was still significantly associated with psychological measures like anxiety and depression, even after the potential influence of being a bullying victim or perpetrator was factored out. Pupils who'd witnessed bullying (but not been a victim or bully) also tended to report drinking more alcohol than victims or those not at all involved in bullying.

The researchers acknowledged that their study was not longitudinal, so it only offered a snapshot of the relations between the various bullying roles and mental health measures. There's also a need to treat pupils' self-report data with caution. Nonetheless, Rivers' team said their study suggests school psychologists should consider the effects of bullying on bystanders, not just on those directly involved.

Possible reasons why witnessing bullying could be psychologically harmful include being reminded of one's own past experiences of being bullied; being made to feel that one is at risk of being bullied; and feeling guilty for not intervening to help the victim.

'It's well documented that children and adolescents who are exposed to violence within their families or outside of school are at a greater risk for mental health problems ...' said Rivers. 'It should not be a surprise that violence at school will pose the same kind of risk.'

_________________________________
Rivers, I., Poteat, V., Noret, N., & Ashurst, N. (2009). Observing bullying at school: The mental health implications of witness status. School Psychology Quarterly, 24(4), 211-223. DOI: 10.1037/a0018164
The Research Digest blog was five years old in February. As part of an ongoing celebratory series, I've asked Dr Gavin Nobes of the University of East Anglia to look back on his research on children's naive models of the Earth, which I covered in March 2005, and to reflect on that study and the field more generally. Here's what he had to say:

"Almost 15 years ago the late George Butterworth visited UEL and inspired a group of us to follow up some work he and Michael Siegal had started in Australia. Using a novel, forced-choice question task, they were testing the claim, based on children's drawings, that children have theory-like 'naive mental models' of the Earth; that is, children believe it to be (for example) flat, or a hollow sphere in which we live. This area of research has important implications for our understanding of the acquisition of knowledge, and for science education. For example, if children are influenced primarily by their intuitions and observations (as proponents of the naive mental model approach claim), they would be expected to think the Earth is flat; but if cultural communication is the principal source of information, children's first concept of the Earth should be a rudimentary version of the scientific, spherical model.

In the study featured in the Digest five years ago, Georgia Panagiotaki, Alan Martin and I asked children not to draw but to choose, from a set of pictures, those that they thought best represented the Earth. As in the Australian study, we found that children knew much more about the Earth than previous researchers had claimed, and found no evidence of naive mental models.

Despite this apparently strong evidence from two different methods, the debate continued. Our recognition methods (forced-choice questions and picture selection) were criticised on the grounds that, unlike the earlier studies based on children's drawings, they failed properly to elicit children's understanding. We responded to these criticisms by giving the same open-ended, drawing-based questions (used in the earlier studies) to university students. We were amazed to find that many of them drew exactly the same pictures, and gave identical non-scientific answers, as had children who were supposed to have naive mental models. Subsequent interviews revealed that the students had drawn and answered in these ways because they didn't understand the questions - despite them being designed for 5-year-olds! Further experiments with a new version of the task, in which we rephrased the original open questions to reduce their ambiguity, led both adults and children to give substantially fewer non-scientific answers. We concluded that naive mental models are methodological artifacts: children and adults give these responses to the original instrument because the questions are poorly worded.

One recommendation that arises from this work is that, wherever possible, different methods should be used to test the same hypotheses. Another is that, however simple your children's task might be, try it out first on adults: this is quick, easy, and can be remarkably revealing. And third, don't be too dispirited by negative reviews: especially early on, editors sent our submissions to proponents of the naive mental model view, whose disparaging reviews resulted in rejections. Had it not been for Michael's and George's generous support and encouragement, we would probably have given up and turned to less controversial areas of research."

_______________________________
Nobes, G., Martin, A., & Panagiotaki, G. (2005). The development of scientific knowledge of the Earth. British Journal of Developmental Psychology, 23(1), 47-64. DOI: 10.1348/026151004x20649

Look out for more of these 'looking back' guest posts in the coming months.
Women hoping to appeal to speed-dating partners should try subtly mimicking the words and body language of their dates. That's according to Nicolas Gueguen, whose new study shows that women who mimic are rated by men as more sexually attractive.

Gueguen recruited three female participants who were taking part in real-life, heterosexual speed-dating sessions and coached them to mimic some of their 66 male dates but not others. In the mimicking condition, the female assistants were instructed to mimic their partner's utterances approximately five times during a five-minute date and to mimic his body language five times. For example, if a date said "You really do this?", the woman would respond "Yes, I really do this" in the mimicry condition, but say only "Yes" in the non-mimicry condition. Similarly, if a man scratched his face, a mimicking assistant was instructed to wait two to three seconds and then scratch her own face.

The standard procedure at the end of these speed-dating sessions was for everyone to provide a list of those dates they would most like to give their contact details to. Gueguen found that, on average, when a woman mimicked a dating partner, he was more likely, compared with non-mimicked dates, to want to give her his contact information, to say that the speed-date had gone well, and to rate her as more sexually attractive.

Further analysis showed that in the mimicry condition only, a woman's perceived sexual attractiveness was linked to how much a partner subsequently wanted to share his contact information with her, even after factoring out the general influence of how well the man felt the date had gone. In other words, mimicry seemed to increase the influence of a woman's sexual attractiveness.

"By using an experimental approach in a real context it was found that mimicry is associated with greater preference and liking for a female in a courtship situation," Gueguen said. "This aspect [of mimicry] has never been examined previously."

For extra appeal, the Digest blog archive suggests speed-daters could try combining mimicry with a light touch of their partner's arm. Gueguen's previous research has shown that this can improve the success of romantic requests for a dance or phone number. Oh, and for good measure, this earlier research suggests you should ask your opposite-sex friends to smile at you!

_________________________________
Gueguen, N. (2009). Mimicry and seduction: An evaluation in a courtship context. Social Influence, 4(4), 249-255. DOI: 10.1080/15534510802628173

Link to related Digest item: mimicry the best form of flattery for computers too.
Back in the '70s, psychologist Paul Bakan published a short research report in which he noted that among 47 inpatients on an alcoholism ward, seven were left-handed - more than you'd expect based on the approximate 10 per cent prevalence of left-handedness in the general population. Bakan described his observation as 'incidental', but according to Kevin Denny, the idea of an alcoholism-handedness link has proven sticky, with some commentators suggesting the stress of being left-handed in a right-handed world is to blame.
Several studies through the years have attempted to replicate the link between left-handedness and alcoholism, but most have relied on small samples and, in any case, the results have been inconsistent. Denny's contribution is an examination of data from the SHARE survey, involving over 25,000 people from 12 countries. Left-handers aren't more prone to risky drinking, Denny finds, but they do drink more often.

Denny made his finding after categorising survey participants, based on their self-reports, as either heavy drinkers (those who drink 'almost every day' or '5 or 6 days a week') or light drinkers (less than once a month, or not at all for the last six months). There was no evidence that handedness was related to excessive drinking, but left-handers were significantly less likely than right-handers to be in the light-drinker category, meaning that as a group they probably do drink more, in moderation, overall.
'There is no evidence that handedness predicts risky drinking,' Denny wrote. 'Hence, the results do not support the idea that excess drinking may be a consequence either of atypical lateralisation of the brain or due to the social stresses that arise from left-handers being a minority group.'
Denny acknowledges his study has limitations - all participants in the SHARE survey are over 50, so it's possible his findings don't generalise to younger people. Related to this, it's possible that some heavy-drinking left-handers died before the age of 50, although their numbers are likely to be small. Another potential shortcoming is that some participants categorised as non-drinkers may have been problem-drinkers in recovery.
Denny, K. (2010). Handedness and drinking behaviour. British Journal of Health Psychology DOI: 10.1348/135910710X515705
It's not so pleasing when it glues your shoe to the pavement, but a new study suggests chewing gum could be a great stress-reliever, with consequent health benefits. Perhaps the finding could help explain why Manchester United manager Sir Alex Ferguson - an incessant gum chewer - has coped for so long with the stress of top-flight football?

Andrew Smith at Cardiff University surveyed over 2,000 workers and found that the 39 per cent of respondents who reported never chewing gum were twice as likely to say they were extremely stressed at work, compared with gum chewers, and one and a half times as likely to say they were very or extremely stressed with life in general.

Of course, rather than chewing gum having a stress-relieving effect, it's perfectly possible that some other factor reduces stress and encourages gum chewing. Indeed, Smith looked at a range of potential confounds and found that women, lower earners, younger and less educated respondents, smokers, people with demanding jobs, and neurotic extraverts were all more likely to chew gum. Crucially, however, the link between chewing gum and lower stress held even after taking all these extraneous factors into account.

What's more, chewing gum was also associated with better mental and physical health. Again, this remained true even after controlling for extraneous factors: gum chewers were less likely to have symptoms of depression and half as likely to have self-reported high blood pressure or high cholesterol.

Smith concluded that chewing gum may be a "readily available and relatively cheap method of addressing" stress and stress-related ill health. Possible mechanisms that might explain the associations reported here include an effect of chewing gum on autonomic nervous system activity and/or on the neurotransmitter serotonin. Smith noted that he has an intervention study underway that will provide a more robust test of the possible stress-related benefits of chewing gum.

_________________________________
Smith, A. (2009). Chewing gum, stress and health. Stress and Health, 25(5), 445-451. DOI: 10.1002/smi.1272
Excitable tabloids, technophile lawyers and gullible entrepreneurs have all spent the last few years salivating over the prospect of functional brain imaging delivering us the first form of truly scientific, objective lie detection. Not so fast. Most research that's tested the potential of functional brain scanning for lie detection has compared brain activity between lying and honest conditions by averaging signals across whole groups of participants - no use for real life. Now George Monteleone and colleagues have taken a representative paper from this literature and thoroughly examined its potential for spotting individual liars.

The paper they examined was by Luan Phan and colleagues in 2005 and involved fourteen participants having their brains scanned whilst they either told the truth or lied about playing cards in their possession. Consistent with several other similar papers, Phan's study showed differential activity in a raft of brain areas when people lied versus told the truth, especially frontal regions involved in working memory and deliberate effort.

Monteleone's team took the brain activity of each individual in Phan's study and compared it with the averaged activity of the other 13 participants to see if the "lying areas" identified at the group level were also extra active when that specific participant was lying. At the group level, 16 brain regions showed differential activity when lying compared with telling the truth. The brain area that most resembled a true "neural signature" for lying was the medial prefrontal cortex (mPFC). Seventy-one per cent of participants showed heightened activity in this region when they were lying compared with telling the truth. This is better than chance, but far from perfect - really no different from the classic polygraph.

Also, just like the polygraph, brain imaging suffers from the problem of balancing specificity with sensitivity. For example, if the threshold for significant mPFC activity is lowered, then the number of participants showing notable lying-related activity in this region increases, but so too does the number of false alarms - that is, participants who show activity in this region when they're telling the truth. In real-life legal settings, these "false positives" could mean innocent people going to jail, or worse.

What's more, Monteleone's team warn that it's highly unlikely mPFC activity is a true neural signature for lying. Just as there are many reasons why our pulse might race and our palms get sweaty (thus triggering a polygraph), there are many potential excitors of mPFC activity, including self-consciousness and thinking about other people's mental states. This also raises the problem of cunning criminals devising simple ways to foil the brain scanner. A participant who performed complex mental arithmetic during truth and lying conditions, or who concentrated on the examiner's mental state throughout a scan, would likely spoil any neat comparison of truth and lying conditions.

The problems don't end there. Monteleone's group further showed that for some lying participants, specific brain regions that appeared to be activated by lying were in fact part of a far larger spread of brain activation that probably had nothing to do with lying at all. There's also the fact that the playing-card lying paradigm is so simple and insipid compared with real-life lying. The researchers also observed that a minority of participants showed idiosyncratic brain responses to lying, out of keeping with the general group-level patterns. And finally, there are socio-cultural issues: problems with language and the cultural appropriateness of deception could both massively distort a person's brain response to lying versus truth-telling.

"...[A]lthough fMRI may permit investigation of the neural correlates of lying," the researchers said, "at the moment it does not appear to provide a very accurate marker of lying that can be generalised across individuals or even perhaps across types of lies by the same individuals."

_________________________________
Monteleone, G., Phan, K., Nusbaum, H., Fitzgerald, D., Irick, J., Fienberg, S., & Cacioppo, J. (2009). Detection of deception using fMRI: Better than chance, but well below perfection. Social Neuroscience, 4(6), 528-538. DOI: 10.1080/17470910801903530

Link to related Wired news story: Evidence from fMRI lie-detection was used in a courtroom for the first time earlier this year.
Human immodesty knows no bounds. Most people think they're better looking than average, more intelligent, better at driving and less likely to get ill. Psychologists seeking to explain this common delusion have suggested it serves a protective role: a shield against the depressing realities of fate, fallibility and social spite. However, a surprising new study by Sander Thomaes and colleagues directly contradicts this account. Their investigation with older children suggests that a realistic self-view is more protective.

Two hundred and six children aged between nine and twelve years rated how much they liked each of their classmates and how much they thought each of their classmates liked them. This gave the researchers a measure of how realistic each child's self-view was. Two weeks later, the children were invited to play a "Survivor Game" - a kind of internet popularity contest in which the least popular of four players would be voted out of the group. The game was fixed: half the children were told that they were the least popular, while the other children received neutral feedback that another child had been voted out.

Using a measure of mood before and after the game, the researchers found that children with a more realistic view of their popularity at school were the least badly affected by rejection in the Survivor Game. By contrast, children with an inflated view of their popularity, or a deflated view, experienced a far greater drop in their mood after being told they'd been voted out.

"Our results suggest that vulnerable children holding positively or negatively distorted self-views may benefit from interventions that target their biased social-reasoning processes," Thomaes and his colleagues concluded.

_________________________________
Thomaes, S., Reijntjes, A., Orobio de Castro, B., & Bushman, B. (2009). Reality Bites-or Does It? Realistic Self-Views Buffer Negative Mood Following Social Threat. Psychological Science. DOI: 10.1111/j.1467-9280.2009.02395.x
There are so many things you'd rather be doing than what you ought to be doing, and so you delay doing what you ought. All the evidence shows that this procrastination is bad for you: for your productivity, your school grades, your health. But still we keep putting things off. Until. Tomorrow.

Now Michael Wohl and colleagues have proposed a rather surprising cure - self-forgiveness. That's right: forgive yourself for you have procrastinated, move on, get over it, and you'll be more likely to get stuck in next time around.

Wohl's team followed 134 first-year undergrads from their first mid-term exams to just after their second lot of mid-terms. Before the initial exams, the students reported how much they'd procrastinated with their revision and how much they'd forgiven themselves. Next, midway between these exams and the second lot, the students reported how positive or negative they were feeling. Finally, just before the second round of mid-terms, the students once more reported how much they had procrastinated in their exam preparations.

The key finding was that students who'd forgiven themselves for their initial bout of procrastination subsequently showed less negative affect in the intermediate period between exams and were less likely to procrastinate before the second round of exams. Crucially, self-forgiveness wasn't related to performance in the first set of exams, but it did predict better performance in the second set.

'Forgiveness allows the individual to move past maladaptive behaviour and focus on the upcoming examination without the burden of past acts to hinder studying,' the researchers said. 'By realising that procrastination was a transgression against the self and letting go of negative affect associated with the transgression via self-forgiveness, the student is able to constructively approach studying for the next exam.'

_________________________________
Wohl, M., Pychyl, T., & Bennett, S. (2010). I forgive myself, now I can study: How self-forgiveness for procrastinating can reduce future procrastination. Personality and Individual Differences, 48(7), 803-808. DOI: 10.1016/j.paid.2010.01.029