Showing posts with label Developmental. Show all posts

Tuesday, 25 August 2015

How do lying skill and frequency change through life, from childhood to old age?

Young adults – defined here as people aged 18 to 29 – are the most skilled liars, while teens are the most prolific. That's according to a new study published in Acta Psychologica that claims to be the first ever to investigate lying behaviour across the entire lifespan.

The research involved members of the public who were visitors at the Science Centre NEMO in Amsterdam. In all, 1005 people took part, aged from 6 to 77. To test lying ability, Evelyne Debey and her colleagues presented the participants with simple general knowledge questions (e.g. "Can pigs fly?"). These questions had to be answered as quickly as possible, either truthfully or dishonestly, depending on the colour of the YES/NO response options. Taking the pig question as an example, the idea is that a skilled liar ought to be able to answer "yes" very quickly, whereas a poor liar will be delayed when answering dishonestly, or they might even give a reflex honest answer.

Young adults showed the fastest reaction times and lowest error rates on this test of lying. Overall, lying proficiency showed an inverted U-shaped curve through the lifespan, improving through childhood, peaking in young adulthood and then gradually declining into old age. This meant that lying proficiency was the same in the youngest children (aged 6 to 8) as it was in the oldest participants (aged 60 plus).

To measure lying frequency, the researchers asked their participants to report the number of lies they had told during the preceding 24 hours to different people in different situations (e.g. to a stranger or relative; face to face or online), and to describe those lies. Overall, the participants reported telling an average of two lies during that time, which is consistent with past research. Teens admitted to telling more lies (an average of 2.8) than any of the other age groups. Again there was an inverted U-shaped relationship between age and lying such that lying frequency increased during childhood, peaked in adolescence, then decreased through life, so that the oldest group lied with the same frequency as the youngest participants.

The researchers' theory is that lying skill and frequency show this pattern through the lifespan because of age-related changes in inhibitory control – the idea being that to lie successfully you need to be able to suppress the truth, and that young adults have the most inhibitory control.

To test this, the researchers had the same participants complete what's known as a "stop-signal task" – this involved pressing a button to indicate as fast as possible whether an X or an O had appeared on-screen. Crucially, on 25 per cent of these trials a tone sounded that told them to cancel their response. The later this tone sounded, the harder it was to withhold a response. Participants with greater inhibitory control can usually cancel their response even when the stop signal is given very late.
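The post doesn't spell out how stop-signal performance is usually modelled; the standard account in this literature (the "horse-race" model) treats each stop trial as a race between a go process and a stop process, with the response withheld only if the stop process finishes first. Here is a toy sketch under that assumption – the reaction-time parameters are made up for illustration, not taken from the study:

```python
import random

random.seed(0)

def stop_signal_trial(go_rt, ssd, ssrt):
    """Horse-race model of one stop trial: the response is successfully
    cancelled only if the stop process (which starts at the stop-signal
    delay, SSD, and takes SSRT ms) finishes before the go process."""
    return ssd + ssrt < go_rt  # True = response withheld

# A late stop signal (long SSD) makes cancelling much harder
early = sum(stop_signal_trial(random.gauss(500, 50), 100, 250)
            for _ in range(1000)) / 1000
late = sum(stop_signal_trial(random.gauss(500, 50), 300, 250)
           for _ in range(1000)) / 1000
print(f"stop success rate: early signal {early:.2f}, late signal {late:.2f}")
```

With these parameters the simulated stop-success rate drops sharply as the signal arrives later, which is exactly the difficulty manipulation described above.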

The participants' ability on this stop-signal task increased through childhood and peaked in young adulthood. However, performance after this age remained relatively stable. Moreover, performance on the stop-signal task did not correlate strongly with lying proficiency. This appears to undermine the researchers' theory about inhibitory control, but they argued that there are different types of inhibitory control and that perhaps a different measure of a different kind of inhibition (such as the Stroop Test, which measures the ability to deal with interfering mental demands) would have correlated more strongly with lying ability.

In a field that so often relies on student research participants, this study stands out for its involvement of the general public across a wide range of ages. However, as the researchers acknowledge, the findings come with many caveats. Among these is the fact that the study was cross-sectional – it doesn't tell us anything about how the same people's propensity for lying, and their skill at it, change as they age. Also, the test of lying ability was artificial and involved none of the emotional consequences of real-life lies. Note too that, as shown in past research, lying frequency was highly skewed: half the participants reported telling no lies in the previous 24 hours, and over 50 per cent of the lies were told by prolific liars who made up just 9 per cent of the sample.

_________________________________ ResearchBlogging.org

Debey, E., De Schryver, M., Logan, G., Suchotzki, K., & Verschuere, B. (2015). From junior to senior Pinocchio: A cross-sectional lifespan investigation of deception. Acta Psychologica, 160, 58-68. DOI: 10.1016/j.actpsy.2015.06.007

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free fortnightly email will keep you up-to-date with all the psychology research we digest: Sign up!

Friday, 14 August 2015

Study uncovers dramatic cross-cultural differences in babies' sitting ability

Paediatricians' offices are often adorned with a developmental milestone chart for infants, and they always show the same "normal" age-typical progression, from sitting to crawling to walking. But these expectations (e.g. 25 per cent of infants achieve independent sitting by 5.5 months) are rather misleading because they're derived solely from research on Western babies conducted back in the 1930s and 1940s. A new study, published recently in the Journal of Cross-Cultural Psychology, aimed to broaden our understanding of what constitutes typical sitting ability, by observing five-month-old infants from six different cultures: Argentina, Cameroon, Italy, Kenya, South Korea, and the United States.

Lana Karasik and her colleagues also departed from previous research by observing babies in their home environment rather than in a psychology lab. Specifically, a researcher local to each of the six cultures visited 12 mother and baby pairs in their homes for one hour. These sessions were taped and coded later based on where the babies were (i.e. in their mothers' arms, on the ground, or on baby or adult furniture), their body position (sitting, lying, etc.) and how close their mother was to them. The mothers didn't know that the study was about infant sitting ability.

Overall, one third of the infants were able to sit independently, defined as sitting without support for at least one second. But there was significant cross-cultural variation. For example, just two of the US infants displayed independent sitting and none of the Italian infants, compared with 8 of the Kenyan infants (67 per cent) and 11 of the Cameroonian infants (92 per cent). There was also a wide range of sitting proficiency, in terms of how long infants sat independently in a single bout. For example, the shortest bout was 2.4 seconds, while the longest was 28 minutes (achieved by a Cameroonian baby).

Figure from Karasik et al., 2015.
These cultural differences were mirrored by differences in the opportunities the infants were given to sit independently. For example, infants from the US, Argentina, South Korea and Italy spent most of their time in places that provided support, such as strapped into children's furniture or held in their mother's arms. By contrast, infants in Kenya and Cameroon spent most of their sitting time on the floor, or on adult furniture where they had to learn to balance themselves. Mothers in Kenya and Cameroon also tended to spend more time further away from their babies. One Kenyan mother spent 13 minutes out of reach of her baby as he sat independently on adult furniture (by the way, he didn't fall off the furniture, and neither did any other babies in this research).

It's tempting to infer that the cultural parenting practices in Kenya and Cameroon may have encouraged some of the infants in those cultures to acquire more precocious sitting abilities (on average). But of course this was a purely observational study with small samples, and we can't know whether the infants' abilities influenced their parents' behaviour or vice versa (in fact, it's probably a bit of both). It's also important to note, as the researchers do, that there was a huge amount of overlap in sitting ability across the cultures (e.g. some US infants sat independently longer than some Kenyan and Cameroonian infants), and there is also a large amount of variation within the cultures. Because of this, Karasik and her team say it is inappropriate to talk of babies in some cultures being uniformly more precocious than babies in others.

Infant sitting is a very important skill – it frees their hands to explore objects and interact more easily with adults. Given this, it seems amazing that most of what we know about the development of sitting ability is based on dated, lab-based research conducted almost exclusively in Western countries. "Had we not looked beyond onset ages [the simplistic idea that a child is either a sitter or not], ventured outside the laboratory, and studied samples of infants from six cultures across the globe," the researchers said, "we would never have known that at five months, some infants can safely sit on high benches for extended periods without the support of adults nearby."

_________________________________ ResearchBlogging.org

Karasik, L., Tamis-LeMonda, C., Adolph, K., & Bornstein, M. (2015). Places and Postures: A Cross-Cultural Comparison of Sitting in 5-Month-Olds. Journal of Cross-Cultural Psychology, 46 (8), 1023-1038. DOI: 10.1177/0022022115593803

--further reading--
Why do toddlers bother learning to walk?
For infants, walking is more than just another step in motor development
10 surprising things babies can do
How babies go sole searching
Toddlers don't take the risk of entrapment seriously

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Monday, 20 July 2015

Older people are more willing to trust someone who has cheated others

There’s a stereotype that older people are more friendly and trusting, possibly leaving them vulnerable to con-artists. A new study using an economic trading game provides clear evidence that older people really are more trusting, at least in the sense that they are more likely to give the benefit of the doubt to people with a dodgy track record.

Phoebe Bailey and her colleagues invited 72 Australian participants to complete a series of 30 trading game trials alone via a computer, in the knowledge that one random trial would pay off with real cash. In each trial, participants and their trading partner – an absent "trustee" – each began with an initial stake of AU$5. First, the participant chose whether to entrust $0-5 to the trustee, and second, the trustee could return any amount back to the participant. Anything entrusted or returned was doubled in value by the computer, meaning that both parties could in theory come out on top by trading, but participants could also be easily betrayed by a trustee who returned them nothing.
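The payoff arithmetic of a single trial, as described above, can be sketched as follows (a toy illustration; the function and variable names are mine, not the authors'):

```python
def payoffs(entrusted, returned, stake=5):
    """Final holdings (participant, trustee) after one trial.
    Anything entrusted or returned is doubled in transit."""
    assert 0 <= entrusted <= stake
    assert 0 <= returned <= stake + 2 * entrusted
    participant = stake - entrusted + 2 * returned
    trustee = stake + 2 * entrusted - returned
    return participant, trustee

# Full trust met with a fair return leaves both ahead of their AU$5 stake
print(payoffs(5, 5))  # (10, 10)
# Betrayal: the trustee keeps the lot
print(payoffs(5, 0))  # (0, 15)
```

This captures why trading can leave both parties better off, while a participant who entrusts everything can still walk away with nothing.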

Each trial involved a different trustee, distinguished by an onscreen grid of information that summarised whether the trustee had a generous (for half of the trustees) or mean track record in their past trials of the game with other participants. The researchers found that, compared with the younger participants (in their early twenties), older adults (mostly aged between 70 and 80) made bigger hand-overs to trustees whose grid depicted a mean track record: that is, they were more likely to entrust money knowing that history suggested they might not see it again.

This tendency to trust was predicted by the researchers, as there are good theoretical reasons for it to emerge. There is some evidence of differential processing of low-trust cues by older adults in emotional brain areas. On a psychological level, age-related positivity may be due to people more actively pursuing desired experiences as the sands of time run out; while this can certainly include ambitious Bucket Lists, the simple experience of emotional connection to other human beings is desirable and may become more sought after when material goods become less salient (“you can’t take it with you”). Under this view, older people aren’t being irrational when they take greater trust-related risks, they are simply making a different kind of gamble because the trusting payoff means more to them. In this study, older people may feel no need to “sweat the small stuff”, and prefer to make 30 people feel there was someone nice at the other end of one trial, payoffs be damned.

It’s important to note that the researchers had expected another age-related effect – for older people to be swayed also by trustworthy facial features in mugshots presented alongside some of the grids. However, this effect didn’t emerge, possibly because it was swamped by the objectively more reliable grid information.

It strikes me as fairly healthy to forego small financial gains to feel altruistic and give someone a second chance. However, if the tendency to take trust-related risks generalises beyond this context, it could be a factor that makes older adults attractive targets for charlatans and fraudsters, especially when they offer the chance of meaningful connections.

_________________________________ ResearchBlogging.org

Bailey, P., Szczap, P., McLennan, S., Slessor, G., Ruffman, T., & Rendell, P. (2015). Age-related similarities and differences in first impressions of trustworthiness. Cognition and Emotion, 1-10. DOI: 10.1080/02699931.2015.1039493

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Wednesday, 15 July 2015

Older people frequently underestimate their own memory skills

By guest blogger David Robson

Aristotle once compared the human mind to a wax tablet. When we are young, the wax is warm and soft; it is easy to make an impression and record our thoughts and feelings. With age, the wax hardens – the older impressions fade, and it is harder to carve out new images in their place.

This view of memory, at least among the general public, has changed little in the 2300 years since. Many of us still believe that the brain’s “plasticity” – its ability to adapt, change, and pick up new skills – decreases as we get older, much like Aristotle’s stiffening wax. Combined with a general cognitive decline, this is the reason why it’s assumed “you can’t teach an old dog new tricks”.

Yet modern science tells a more positive story: memory is more pliable than we imagine, even in old age. In fact, according to work by Dayna Touron at the University of North Carolina at Greensboro, a large problem may simply be confidence – older adults don’t trust their memories, and so don’t realise their full capacity. She uses the example of learning a new route with GPS. Long after they have committed the route to memory, older adults are more reluctant than younger people to give up SatNav Sally. Indeed, over the last few years, Touron has amassed some compelling evidence for the importance of older people’s lack of confidence, which she presents in a recent review for Current Directions in Psychological Science.

For instance, Touron once asked participants to perform a tedious verbal task: they were given a table of random word pairs (e.g. dog-potato) and they then had to judge whether another list of word pairings contained any of the same word pairs that had appeared in the original table. Importantly, all the participants were told they could refer back to the original table if they wanted to, but they didn’t have to if they could remember the pairings. The older participants (aged between 60 and 75 years) were more reluctant to rely on their memories, and so continued looking up each entry – despite the fact that further tests revealed they had memorised just as many of the original word pairs as the younger participants.

In other experiments, volunteers were given a series of algebraic equations to solve. They could solve them initially using mental arithmetic, but each equation appeared multiple times, allowing the participants to learn the answers off by heart. Even so, the older volunteers reported going through the same calculations again and again, rather than relying on their memories of the solutions. Crucially, Touron found that they were perfectly capable of retrieving those memories if they were encouraged by the offer of a small cash prize in return for a quick answer. Nor is their reluctance a form of “behavioural inertia” – the habit of sticking to the first strategy you use, however inefficient in the long run. In fact, the older participants are happy to change tack in other kinds of tasks, as long as they don’t involve actively recalling newly learnt information.

In other words, avoiding their memory seems to be a choice – a decision that may come from a poor understanding of the way their minds are working. Touron has found that her older participants underestimate the time and effort required to take the long route around the problems, rather than using their memories as a short cut; they also believe their memories are less accurate than they actually are. Indeed, as you might expect, the less confident they are in their own learning abilities, the less likely they are to make use of their memories during these trials.

Asking volunteers to complete diaries about their everyday activities, Touron has found that older people are just as reluctant to take advantage of their memories in their private lives as in the psych lab – whether they are cooking, driving or learning to use a computer. And that has big implications, she thinks. For one thing, it could be a case of use it or lose it – contributing to a more general mental decline as people age. A fear of misremembering could also lead to reduced self-esteem, perhaps making older folk less adventurous: they might avoid parties if they are scared of forgetting people’s names, for instance, contributing to further isolation and loneliness. For these reasons, Touron is now looking for measures that could encourage older people to have a more youthful confidence in their abilities.

The science makes intuitive sense. If the brain really were a wax tablet, as Aristotle imagined, the answer wouldn’t be to set our tools aside to gather dust and cobwebs; it would be to continue to work the wax with even greater fervour, melting it and moulding it until it was pliable once more.

_________________________________ ResearchBlogging.org

Touron, D. (2015). Memory Avoidance by Older Adults: When "Old Dogs" Won't Perform Their "New Tricks". Current Directions in Psychological Science, 24 (3), 170-176. DOI: 10.1177/0963721414563730

--further reading--
Exploring people's beliefs about their memory problems
Different mental abilities peak at different times of life, from 18 to 70+
Introducing the SuperAgers - the elderly people whose brains have stayed young
Very old and very cool - recognising a distinct mental strength of the elderly
Companies are more successful when their employees feel young for their age

Post written by David Robson (@d_a_robson) for the BPS Research Digest. David is BBC Future’s feature writer.


Tuesday, 14 July 2015

What is the correct way to talk about autism? There isn't one

Image: National Autistic Society
The language we use reflects our attitudes but, perhaps more importantly, it can also shape those attitudes. A new study considers this power in the context of autism. Lorcan Kenny and his colleagues have conducted a UK survey of hundreds of autistic people; parents, relatives and carers of autistic adults and children; and professionals in the field, about their preferences for the language used to discuss autism. The research was conducted online with the help of the National Autistic Society.

The main finding is that there is no consensus about the preferred terms to use when talking about autism and people with the diagnosis. A key disagreement between and within the surveyed groups is whether the language we use should put the "person first", as in "people with autism", or put the diagnosis first, as in "autistic person". Overall, researchers and other professionals expressed a strong preference for the former. One professional said:
"I don’t like phrases which describe a person as their condition, so would always go for 'person' first, because that’s what we all are regardless of what conditions we have. I would never describe myself as a thyroidy, for example."
In contrast, autistic people showed a clear preference for autism-first terminology. One of the autistic adults in the survey said:
"Separating the person from their autism is damaging, as it reinforces opinions about autism being a ‘thing’ that can be removed, something that may be unpleasant and unwanted, and something that is not just another aspect of a whole, complete and perfect individual human being. Describing oneself as autistic is an extremely important and positive assertion about oneself, it means that one feels complete and whole as one is."
Related to this disagreement is the issue of whether autism is viewed as a "disorder" or a "difference", and whether any disability associated with autism is seen as located purely in the individual or as arising from society's failure to adapt to the needs of people with autism. Another adult with autism said:
"Autism is just another way of thinking, not some sort of disease that one can catch."
Yet some parents and carers were wary of downplaying the impact of autism, often because they are the ones championing their children's needs. One of them said:
"I prefer 'disorder' to 'condition' because I think it conveys better the seriousness and the need for support and intervention."
There was also disagreement about the appropriateness and value of the term Asperger's Syndrome (a diagnosis dropped recently by US psychiatry) or "Aspie". Some people felt it was an important part of their identity. Yet others believed continued use of the term undermined efforts to build a united autism community.

Another contentious issue is the idea of autism being a spectrum upon which everyone is located to some degree. This terminology was more popular among professionals and family members than among autistic people, some of whom felt that it trivialises the difficulties faced by those who are "truly autistic".

A notable point of agreement across the different groups who completed the survey was the dislike for the terms "high-functioning autism" (it downplays the everyday difficulties experienced even by autistic people who have good verbal and intellectual skills) and "low-functioning autism" (it undermines people's potential).

The researchers said the "fundamental finding" of their research was that "there are reasonable and rational disagreements between members of the autism community as to which terms should be used to describe autism." They said this "plurality" of views was likely to persist and evolve with time and that for anyone involved in autism, choosing the right language will be difficult and require care, reflection and "practical wisdom". They added: "The overriding principle for those who are unclear about appropriate terminology should therefore be to inquire of the people with whom they are working or describing for clarification."

_________________________________ ResearchBlogging.org

Kenny, L., Hattersley, C., Molins, B., Buckley, C., Povey, C., & Pellicano, E. (2015). Which terms should be used to describe autism? Perspectives from the UK autism community. Autism. DOI: 10.1177/1362361315588200

--further reading--
Advice from the National Autistic Society on how to talk about autism.
Autism journal podcast about the new survey findings.
"Watch your language when talking about autism" co-author Liz Pellicano reflects on the new findings at The Conversation.
Autism – Myth and Reality

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Friday, 26 June 2015

Is dyslexia associated with exceptional visual-spatial abilities?

Image: Jose.Stuefer / Flickr
Children and adults with dyslexia have a specific learning difficulty that mainly affects the development of their literacy and language-related skills. But what if such a profile also tended to be associated with exceptional strengths in other areas, such as visual skills? That's certainly what some experts have proposed, for example based on the observation that people with dyslexia are over-represented in fields that involve visual-spatial abilities, such as art and architecture.

Now a team led by Mirela Duranovic has tested 40 children (19 boys), aged 9-11 and diagnosed with dyslexia, on a range of tests of imagery and visual memory. The children with dyslexia performed similarly to 40 age-matched, non-dyslexic controls (19 boys) on most tests, including the mental rotation of shapes; copying a complex, abstract figure (the so-called Rey-Osterrieth Figure); and following the beginning of a line to the end, through a tangle of other lines from the left to right of a page.

On memory for simple geometric shapes there was a tendency for the dyslexic children to underperform. And on one test, the children with dyslexia clearly performed worse than the controls: this was drawing the Rey-Osterrieth Figure from memory.

However, on yet another test, the dyslexic children excelled, outperforming the controls. This was the Paper Folding Test, which requires looking at a depiction of how a piece of paper is folded and where a hole is punched through it, and then judging which one of several illustrations correctly depicts how the paper will look once unfolded again (see below; the correct answer is C).

The superior performance of the dyslexic children on the Paper Folding Test is intriguing – this test is arguably more challenging and complex than simple mental rotation tasks, and involves a larger sequence of mental steps to complete.

This new study adds to a complicated, contradictory literature on visual-spatial skills in dyslexia, filled with studies that have reported no differences between dyslexic people and controls, deficits in dyslexic groups, and advantages in dyslexia.

More research is now needed to explore why the currently reported dyslexia advantage was observed: what is it about the mental processes involved in the Paper Folding Task that meant the dyslexic children performed better than controls? Also, will the finding replicate, and will it generalise to other tasks that require the same mental processes?

"Connecting dyslexia to talent leads us in a more optimistic direction than only associating dyslexia with a deficit," the researchers concluded. "The revelation of talent in individuals with dyslexia opens a door to more effective educational strategies and for choosing professions in which individuals with dyslexia can be successful."

_________________________________ ResearchBlogging.org

Duranovic, M., Dedeic, M., & Gavrić, M. (2014). Dyslexia and Visual-Spatial Talents. Current Psychology, 34 (2), 207-222. DOI: 10.1007/s12144-014-9252-3

--further reading--
The enigma of dyslexic musicians
Most genes that influence maths ability also affect reading

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Wednesday, 10 June 2015

Toddlers learn better when you make them giggle

There is probably nothing more fun than making a baby or toddler laugh. And now there's news that it could even help with learning – the toddler's, that is, not the adult's.

In the first study to look at the effects of humour on learning at such a young age, Rana Esseily and her colleagues began by showing 53 18-month-olds how to reach a toy duck with a cardboard rake (other toddlers who'd spontaneously used the rake as a reaching tool were excluded). Crucially, 16 of the participating toddlers were given several non-humorous demonstrations of how to use the rake to reach and pull the duck nearer. In these straight demonstrations, the experimenter was smiley, but just played with the duck for a bit after getting hold of it. The other 37 toddlers were given several humorous demonstrations. In this case, after getting hold of the duck, the experimenter suddenly threw it on the floor and smiled. Sixteen of these 37 toddlers laughed at least once when shown the funny demonstrations.

Next, the researchers placed the rake near each toddler's hand, to see if they would imitate the action and use the rake to reach the duck for themselves. Among the laughing toddlers, all but one (93.7 per cent) used the rake to reach the duck. In comparison, just 19 per cent of the non-laughing toddlers in the jokey condition used the rake, and just 25 per cent of the 16 toddlers who'd been given the straight (non-jokey) demonstrations.

"Our results suggest that laughing might be a stimulant of learning even during the second year of life," the researchers concluded. However, they conceded that there are other possible interpretations of their findings. For example, perhaps infants who laugh at jokes are just more cognitively advanced and that's why they showed superior learning (although if that were true, you'd also expect a similar range of ability in the control group, which wasn't found). Or maybe it's not laughter per se that aids toddlers' learning, but any kind of positive emotion. "Further work is clearly now required to elucidate the question of the mechanisms underlying this effect of laughter on infants' learning," the researchers said.

_________________________________ ResearchBlogging.org

Esseily, R., Rat-Fischer, L., Somogyi, E., O'Regan, K., & Fagard, J. (2015). Humour production may enhance observational learning of a new tool-use action in 18-month-old infants. Cognition and Emotion, 1-9. DOI: 10.1080/02699931.2015.1036840

--further reading--
Little comedians
The jokes that toddlers make
Girls' and boys' brains respond differently to funny videos
How many psychologists does it take to explain a joke?

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Thursday, 4 June 2015

The scaremongers were wrong: Metalheads from the 80s are thriving

If you sell your soul to heavy metal do you pay for it later in life? During the 1980s, waves of adolescents found solace in this most notorious of extreme music subcultures, alarming their parents as well as authority figures including the US surgeon general and the campaigner and future Second Lady Tipper Gore. But a new survey suggests that in 2015, the teenage metalheads from the 80s are doing alright.

This matters because early research seemed to back the prevailing panic: metalheads were fatalistic, cynical, manipulative, and struggled at school. It would become clear that this account failed to consider that many fans were misfits with complicated home-lives before metal entered the picture, and ignored clusters of very high functioning metalheads drawn to the music by its complexity. But even later researchers were reluctant to endorse heavy metal, optimistic only that fans will eventually "outgrow the subculture".

The current research, spearheaded by Humboldt State University’s Tasha Howe, used Facebook to recruit metalheads who had been active in the 1980s: 99 fans, together with about 20 musicians and a similar number of groupies. Compared to a control group of a similar age (into pop, new wave, or soft rock), the heavy metal groupies and fans (but not musicians) reported more adverse childhood experiences, fitting with the idea that people are often drawn to the difficult themes and tone of metal because of real-life discord; the groupies were particularly prone to suicidal tendencies.

Considering their early difficult circumstances, how did the heavy metal groups fare psychologically over time? Much as their non-metal peers did. On measures taken now, no differences were found compared with controls in adult attachment, the Big Five personality traits, or hypomania. A statistical technique called the Bayes factor quantifies the evidence for the null hypothesis – that is, whether a lack of effect reflects a genuine absence of group differences rather than an underpowered sample. Here, the Bayes factors confirmed that for most of these individual differences, the case for "no difference" was solid.
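For readers curious about the arithmetic, the flavour of a Bayes factor analysis can be illustrated with the BIC approximation (Wagenmakers, 2007) for a simple two-group comparison. This is a sketch only – the function name is ours, and it is not necessarily the exact analysis Howe's team ran:

```python
import numpy as np

def bic_bayes_factor_01(group_a, group_b):
    """Evidence for 'no difference' (BF01) between two groups, via the
    BIC approximation BF01 ~= exp((BIC_alt - BIC_null) / 2).
    Larger values favour the null hypothesis."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    pooled = np.concatenate([a, b])
    n = len(pooled)

    # Null model: one shared mean (1 parameter).
    # Alternative model: a separate mean per group (2 parameters).
    rss_null = np.sum((pooled - pooled.mean()) ** 2)
    rss_alt = np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2)

    # Gaussian BIC, up to a constant shared by both models:
    # n * log(RSS / n) + k * log(n)
    bic_null = n * np.log(rss_null / n) + 1 * np.log(n)
    bic_alt = n * np.log(rss_alt / n) + 2 * np.log(n)
    return float(np.exp((bic_alt - bic_null) / 2))
```

By convention, a BF01 above 3 counts as substantial evidence for "no difference" – the pattern reported for most of the measures here – while values near or below 1 favour a genuine group difference.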

And how do they feel? Presently, the metalheads feel as content in life as their "norm" peers. Furthermore, they recalled being significantly happier in youth, with only one third expressing regrets, versus half of the control group. The controls were also the group with the highest incidence of having sought counselling for emotional problems. This gives credence to what many metal fans believe: that the music offers catharsis and the scene an outlet for the emotional challenges of adolescence.

On the whole, the heavy metal musicians did better than the other groups in the study (the heavy metal fans and the non-metal controls), suggesting they were a high-functioning group – able to master complex musicianship and make a career out of the thing they most loved. The big risk factor for them was unprotected sex – one third had contracted an STD, unsurprising given that they reported averaging over 300 partners each across their lifetimes.

By sampling only current Facebook users we can’t get a complete picture: whether metal increased the risk of premature death, for instance. But the research suggests that the typical fan wasn’t harmed by big hair, blast beats and guitar solos; on the contrary, for many young people, the moshpit was exactly where they needed to be.

_________________________________ ResearchBlogging.org

Howe, T., Aberson, C., Friedman, H., Murphy, S., Alcazar, E., Vazquez, E., & Becker, R. (2015). Three decades later: The life experiences and mid-life functioning of 1980s heavy metal groupies, musicians, and fans. Self and Identity, 1-25. DOI: 10.1080/15298868.2015.1036918

--further reading--
Psychologists and neuroscientists who rock

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Tuesday, 2 June 2015

Why do children stick their tongues out when they're concentrating?

Have you ever watched a young child perform a delicate task with their hands and noticed how they stick out their tongue at the same time? A new study is the first to systematically investigate this behaviour in four-year-olds. This isn't just a cute quirk of childhood, the findings suggest; rather, the behaviour fits the theory that spoken language originally evolved from gestures.

Gillian Forrester and Alina Rodriguez videoed fourteen 4-year-olds (8 boys), all right-handed, as they completed a number of tasks in their own homes. The tasks were designed to involve either very fine hand control (e.g. playing with miniature dolls or opening padlocks with keys), less fine control (e.g. a game of knock and tap, in which the child does the opposite to the researcher, be that knocking or tapping the table with their right hand), or no hand control (remembering a story).

The researchers studied the videos looking for how often the children stuck out their tongues during these different games, and whether they stuck them out towards the left or right side of their mouths.

All the children stuck out their tongues during the games and tasks, which supports past research with 5- to 8-year-olds suggesting this is a common behaviour. But crucially, the children stuck out their tongues more during some tasks than others, and most of all in the knock and tap game. This goes against expectations (the researchers thought the fine motor control games would provoke the most tongue protrusions), but Forrester and Rodriguez argue their surprise finding makes sense in terms of the evolutionary history of language. They explain that the knock and tap game involves rapid turn-taking, hand gesturing and structured rules – what you could think of as "the foundational components of a communication system" or the rudiments of language.

This fits with another result: the kids' tongue protrusions tended to be biased to the right, suggestive of control by the left brain hemisphere. The left side of the brain is the side that's more dominant for language in nearly all right-handers, so again we have a suggestion that children's gestural activities are accompanied by tongue protrusions because the tongue and hands share a link with language and communication. The researchers think that adults (presumably excluding Miley Cyrus) suppress their own tongue protrusions because of the cultural connotations of sticking out your tongue.

Taken together with past research that's shown an overlap in the brain areas involved in speech and hand control, the researchers propose their new findings support the idea that the same communication system involves both the hand and the mouth, and that "hand and tongue actions possess a reciprocal relationship such that when structured sequences of hand actions are performed they are accompanied by spontaneous and synchronous tongue action".

_________________________________ ResearchBlogging.org

Forrester, G., & Rodriguez, A. (2015). Slip of the tongue: Implications for evolution and language development. Cognition, 141, 103-111. DOI: 10.1016/j.cognition.2015.04.012

--further reading--
Why do children hide by covering their eyes?

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Wednesday, 20 May 2015

Poverty shapes how children think about themselves

"The Culture of Poverty", published in 1966, was hugely influential, persuading many policy makers that children from low-income families are destined for lives of "criminality, joblessness, and poverty" because they exist in enclaves characterised by dysfunctional beliefs and practices. Thankfully, this fatalistic view has since been largely refuted and attention has turned to ways to help poor children, including giving them access to books, good teachers and stable environments.

Now a review from the University of Massachusetts has highlighted a different way that poverty can leave a lasting impression on children: by altering their psychological states in ways that shape their future. This sounds like a bleak picture, but the review stresses that the situation is one we can combat.

Authors Amy Heberle and Alice Carter point out that adults belonging to a disadvantaged group are vulnerable to a pair of effects: higher levels of stress when their low status is made clear to them, termed status anxiety; and underperformance on a task when reminded that their social group is stereotypically poor at the task, termed stereotype threat. If these phenomena apply to young children, as Heberle and Carter propose, then even if they have a stable, stress-free family life, poor kids are likely to generate their own stress and underperform simply through awareness of their own in-poverty status.

For this to be the case, young children would need to possess social categories and understand stereotypical beliefs. It appears they do. The review explains how, from the age of five, children in Western countries have a handle on the category of "poor people", and are able to describe it coherently. Interestingly, middle income children are likely to use "dirty", "mean", and other stereotypes in their descriptions, whereas poorer children are more likely to describe how poor people feel, suggesting greater empathy and an awareness that they lie within or close to that group.

In terms of stereotypical beliefs, children from first grade (aged six to seven) onwards endorse the belief that poor kids do worse in school. Moreover, there is evidence from a single study that children believe that although poorer children have a similarly broad range of ambitions to other children, fewer than a quarter of those dreams will be achieved, whereas non-poor children will achieve the majority of theirs.

For poor children to be burdened by stereotype threat, they would also need to be conscious that others might assign them to a stereotypical category. Here the evidence is thinner, but we know that poor kids who say they would prefer poor friends give reasons including “they wouldn’t judge you on how you look, you talk, and the way you were.”

Heberle and Carter emphasise that more research is needed to establish exactly when children begin to experience status anxiety and stereotype threat. They urge far more work on the under-5s (children begin drawing social categories and stereotypes by the age of two), which would require the use of non-verbal techniques (e.g. preferential looking) in place of questions and conversation. They predict such research will show that awareness of social class and its stereotypes falls into place by age three, very early in a child's sense-making of the world.

If these mechanisms do have an impact, it would explain why researchers have struggled to establish a causal link between inequality and health outcomes at a personal level, even though we know more equal nations have better health. At least in developed nations, it may be that the harm comes not so much from lack of absolute material wealth but from the psychological mechanisms triggered by comparative poverty. These mechanisms might even be a contributing factor in the recent finding that 12-13 year olds from low-income families have thinner cortices in brain regions associated with academic performance.

If Heberle and Carter are right, then growing up poor does throw up psychological obstacles to healthy functioning. But these are issues that teachers and families can challenge by discussing and countering negative beliefs about poverty with their children, and that policy-makers can tackle too. Even innocuous, discretionary costs, such as a museum trip fee, can be too much for a stretched family budget, creating separation between poorer children and their peers. Recognising this, societies can try harder to lessen these burdens.

_________________________________ ResearchBlogging.org

Heberle, A., & Carter, A. (2015). Cognitive aspects of young children's experience of economic disadvantage. Psychological Bulletin. DOI: 10.1037/bul0000010

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Friday, 15 May 2015

Companies are more successful when their employees feel young for their age

If you want a dynamic workforce, seek not the young, but the young at heart. That’s the message of a new study that surveyed over 15,000 employees from 107 companies to determine how subjective age influences workplace performance.

Past research has made the case that employee age is important to workplace performance, with younger workers more likely to make breakthrough contributions – but the evidence is patchy, suggesting there is more to the story. The proposed cause for the youth advantage is that their mindset is focused on getting ahead and furthering their skills, networks and status, whereas older people are more concerned with maintaining their positions. Now a research team led by Florian Kunze has posed the question: if mindset is critical, then isn’t how old you feel really what matters?

In their survey, employees who felt substantially younger than their chronological age were more successful in meeting the goals they'd promised their managers they would achieve. Companies with more of these "young at heart" employees also tended to perform better overall, in terms of financial performance, efficiency and a longer-tenured workforce. The survey also showed that organisations tended to have more young at heart workers when they offered age-inclusive policies and when, on average, their employees felt that their work was important and meaningful.

This cross-sectional study can't prove causality, but it's possible that the optimism and possibilities afforded by meaningful work make us feel more vibrant, and that active policies that challenge stereotypes and extend opportunities to older workers help remove the sense of age being an issue.

The Western workforce is steadily greying, so if chronological age were the be-all and end-all, organisational leaders ought to be concerned. But this research suggests that climates where all workers can feel young, energised by their work and not judged and stereotyped, facilitate the kind of dynamic performance associated with young bucks.

_________________________________ ResearchBlogging.org

Kunze, F., Raes, A., & Bruch, H. (2015). It matters how old you feel: Antecedents and performance consequences of average relative subjective age in organizations. Journal of Applied Psychology. DOI: 10.1037/a0038909

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.

Friday, 1 May 2015

Children use time words like "seconds" and "hours" long before they know what they mean

For adults, let alone children, time is a tricky concept to comprehend. In our culture, we carve it up into somewhat arbitrary chunks and assign words to those durations: 60 seconds in a minute, 60 minutes in an hour, and so on. We also have a sense of what these durations feel like. Children start using these time-related words at around the age of two or three years, even though they won't master clocks until eight or nine. This raises the question – what do young children really understand about duration words?

Katharine Tillman and David Barner began by asking dozens of three- to six-year-olds to compare several pairs of durations (e.g. Farmer Brown jumped for a minute. Captain Blue jumped for an hour. Who jumped more?). As well as minutes and hours, other durations used were seconds, days, weeks, months and years. This test showed that by age four, the children were tending to get more of these questions right than would be expected if they were just guessing. With increasing age, the children got better at the task. In other words, from age four and up, children have a sense of the rank order of different duration terms.

What young children don't have, according to the findings from further experiments, is a sense of the actual lengths of time that these terms refer to. When the comparison test was repeated, but with different amounts of each duration, the children were flummoxed. Take, for example, the question "Farmer Brown jumped for three minutes. Captain Blue jumped for two hours. Who jumped more?" As adults, we aren't thrown by the minutes outnumbering the hours by three to two, because we know that an hour feels much longer, and is by definition 60 times longer. However, even five-year-olds, who know well the principle that an hour is longer than a minute, were thrown by these kinds of comparisons. This suggests they don't yet have a very good understanding of the formal definitions of duration words, nor of what the different durations feel like.

In another experiment, five- to seven-year-old children were asked to place different duration words along a horizontal line after the far-left end had been described to them as the location for "something very short, like blinking" and the far-right end as "something very long, like the time from waking up in the morning to going to bed at night". Again, before age six or seven, the children really struggled with this – even with the order correct, they tended to space the words out inappropriately, compared with how an adult would do it. Six- and seven-year-olds who knew the formal definitions of the duration words tended to perform better.

These findings mirror what's been found for the way children use words for other concepts like numbers and colours. Before they map the words onto actual perceptual experiences, they understand that words in a given domain are related, and (in the case of numbers and time), they have a sense of the relative magnitude of the concepts. But it's only after using such words for some years, and learning their formal definitions, that they fully connect the experience of the concept (such as the length of an hour, or the physical magnitude of a number) with its corresponding word.

"Our results indicate that proficiency in estimating the absolute time encoded by duration words emerges relatively late," the researchers said, "and may even rely on formal instruction in [primary] school."

_________________________________ ResearchBlogging.org

Tillman, K., & Barner, D. (2015). Learning the language of time: Children's acquisition of duration words. Cognitive Psychology, 78, 57-77. DOI: 10.1016/j.cogpsych.2015.03.001

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

