
Friday, 6 May 2016

Smartphone study reveals the world's sleeping habits

Middle aged men get the least sleep, the research found
Researchers in the USA have used a smartphone app to see how people's sleep habits vary around the world. More specifically, they've investigated how much the timing of sunrise and sunset affects people's sleep times, or whether social and cultural factors are more important. "Quantifying these social effects is the next frontier in sleep research," they write in their paper in Science Advances.

The study involved the ENTRAIN smartphone app which helps people recover from jet lag by recommending ideal levels of light exposure based on a user's typical sleep routine. Users have the option to make their information available for research. Olivia Walch and her colleagues at the University of Michigan began collecting data from the app in 2014 and the new analysis is based on information sent in by 8070 users around the world during the first year.

Overall the data showed that a later sunrise goes hand in hand with later waking-up times, and that a later sunset is associated with people going to bed later, just as predicted based on how light affects the suprachiasmatic nucleus – the bundle of neurons behind the eyes that controls our sleep cycle, also known as the circadian rhythm. But crucially, the link between sunset and bedtime was weaker than biological explanations would predict.

Put differently, the time we get up is strongly influenced by the timing of sunrise, but the time we go to bed is not as strongly influenced by sunset, suggesting other social and cultural factors are involved. Consistent with this account, most of the cross-cultural differences in sleep – for example, the Dutch reported the most sleep and Singaporeans the least – were explained by later bed times in the countries getting less sleep.

The difference between the countries with the most and least sleep wasn't huge: just under 7.5 hours for Singapore and just over 8.1 hours for The Netherlands. But the researchers emphasised that even a 30-minute difference is meaningful, especially when you consider that sleep debt can have a cumulative effect over time.

Users of the app from the UK averaged about 8 hours' sleep (a healthy amount), with an average wake time just after 7 am and an average bed time just before 11.15 pm.

The researchers were also able to use the smartphone data to compare sleep habits by age, gender, and time spent exposed to natural light. Age was the most important factor, with older people tending to go to sleep earlier. There was also much less variability in the sleep times of older users, which could be because of biological mechanisms that narrow the window of opportunity during which it's easy for older people to fall asleep.

This age-related finding could have everyday relevance – "being careful about how much light affects your circadian clock could be more and more important to sleep as you get older," the researchers said. If your body's only willing to sleep between fairly limited hours, you're best off listening to it and switching off that TV.

Meanwhile, women were found to get more sleep than men – 30 minutes more, on average – thanks both to going to bed earlier and waking up later. The gender difference was greatest in mid-life, meaning that middle-aged men are the demographic group getting the least sleep, on average.

In terms of exposure to outdoor, natural light, app users who had more of this tended to report going to sleep earlier and sleeping more, which is as you'd expect based on the effect of daylight hours on the brain's circadian clock.

The researchers concluded that their results "point to the suppression of circadian signaling at bedtime as an important target for clinical sleep intervention; and suggest that age-related differences in the window during which sleep can occur are evidenced on a global scale". Aside from these specific insights into sleep, the group also said their findings show the power of modern smartphone technologies as a research tool. "This is a cool triumph of citizen science," said co-author Daniel Forger in a press release.

--A global quantification of "normal" sleep schedules using smartphone data

_________________________________
   
Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free weekly email will keep you up-to-date with all the psychology research we digest: Sign up!

Thursday, 5 May 2016

Literally "knowing by heart": Signals from heart to brain prompt feelings of familiarity

The idea that the body affects the mind is not new. The eminent American psychologist William James famously proposed that it is actually the physical sensation of fear that causes us to feel afraid. In more recent years, researchers have extended this principle, exploring the possibility that physical sensations play an important role in moral decisions and other processes usually seen as more purely cognitive or cerebral, such as memory.

It's already known that physical markers of arousal such as dilated pupils correlate with feelings of familiarity, but this could be because of the mental effort of remembering, rather than because physical arousal triggers the feeling of familiarity. Now a pioneering study in Journal of Experimental Psychology: General gets around this problem by testing people's memory for faces presented at specific phases of the heart beat. The results provide compelling evidence that our feelings and judgments about familiarity are influenced by signals arising from the heart.

Chris Fiacconi at Western University and his colleagues began by asking 37 undergrads to look at 68 fearful faces, presented for 1.5 seconds each. Next, the participants looked at 136 more faces – half had appeared previously and half were new – and their task was to say whether they had seen them before. Crucially, the participants were wired up to a heart monitor, and during the memory test some of the faces were presented at the precise moment that the heart had just pumped a burst of blood into the arteries – the so-called systole phase – while the other faces were presented while the heart was relaxing, known as the diastole phase.

This is important because the systole phase increases blood pressure, which is detected by baroreceptors in the heart's arteries, and in turn the baroreceptors signal this change in pressure to various regions in the brain, including the brain stem but also higher brain areas involved in cognition. The amazing revelation from this first study was that participants were significantly more likely to say that a face was familiar if it was presented during the systole phase. This was true for faces that were old and also for those that were actually new. A follow-up study using neutral faces produced the same findings.

A final study made things a little more elaborate. During the memory test for the faces, whenever participants thought they'd seen a face before, they were asked to clarify whether they actually recollected seeing it, or if it just felt familiar but they did not actually remember seeing it. Intriguingly, presenting a face during the heart's systole phase increased participants' tendency to report a sense of familiarity, but did not affect their claims to actually recollect having seen it.

These are dramatic results because they suggest that when making memory judgments, we don't necessarily rely only on our actual memory traces in the brain, but that we also interpret our physiological sensations. By presenting faces at a specific moment in the heart beat cycle, the current research effectively hacks into this system to trick participants into thinking they've seen new faces before. As the researchers state – in such cases "participants may interpret the transient increase in arousal that results from baroreceptor mediated feedback as owing to the familiarity of the stimulus probe."

Where next for this line of research? Fiacconi and his colleagues said that an important future goal is "to determine whether other epistemic feelings, such as feelings of knowing, tip-of-the-tongue-states, and deja-vu experiences are also shaped by this type of visceral feedback."

_________________________________

Fiacconi, C., Peter, E., Owais, S., & Köhler, S. (2016). Knowing by heart: Visceral feedback shapes recognition memory judgments. Journal of Experimental Psychology: General, 145(5), 559-572. DOI: 10.1037/xge0000164

--further reading--
People make more moral decisions when they think their heart is racing
Neuroscience lessons in body awareness from the man with two hearts
Does your heart rate hold the secret to getting in “the zone”?

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Monday, 7 March 2016

This one physiological measure has a surprisingly strong link with men's and women's propensity for violence

By guest blogger Richard Stephens

I have a professional interest in the naughty. In my recent book Black Sheep: The Hidden Benefits of Being Bad I explored in a light-hearted fashion the psychology around the upsides of various antisocial behaviours – things like swearing, drinking, affairs and untidiness, to name a few. However, this post is about physical violence, a much more serious form of bad behaviour for which I see no upside at all.

Thankfully there is some fascinating psychology into the factors that may lead people to violence, and that may yet help society to curb such negative behaviour. While it’s true that many of the factors associated with violence are situational – things like poverty, unemployment and low educational attainment – there are also personal characteristics that are strongly associated with the chances that a person will behave violently. One of these is a person’s resting heart rate.

A recent study published in The International Journal of Epidemiology is just the latest to find a link between lower resting heart rate and more violent behaviour. Joseph Murray at the University of Cambridge and his colleagues measured resting heart rate in over 3,000 male and female children growing up in Pelotas, a relatively poor city in a relatively rich southern state of Brazil.

This was a longitudinal study in the style of the British “Seven Up!” documentary series, with children born in 1993 periodically called back for interviews and testing as they grew up. The researchers were specifically interested in their participants’ resting heart rate, which is the heart’s beats per minute after 10 minutes of sitting quietly. They measured this three times – when the children were aged 11, 15 and 18.

A novel aspect of this study compared with earlier research (including in the UK in the 1950s and 1970s and more recent US and Swedish studies), is the sheer frequency of extreme violence in Pelotas: in 2011 the city had a murder rate of 18.9 per 100,000 population, almost 20 times higher than in England and Wales and Sweden. The new study also included women whereas the earlier research focused only on men.

The researchers identified criminal behaviour through a combination of asking the young people at the age of 18 if they had committed any crimes during the past year, and by checking with legal agencies to see if they had a criminal record. Crimes were flagged as violent if they involved assault, robbery, weapons, murder, kidnapping, non-consensual sex, serious personal threats and other rare violent acts.

For males there were clear links between resting heart rate at ages 11, 15 and 18 and participation in violent crime. Males with a lower resting heart rate, averaging around 59-65 beats per minute, were between one and a half and two times as likely to have committed violent crimes as males with a higher resting heart rate averaging around 90-92 beats per minute. Women with a lower resting heart rate were twice as likely to have committed violent crimes as women with a higher resting heart rate.

I should add that the researchers did some additional work checking whether several situational contexts known to be associated with violent behaviour – things like unplanned pregnancy, the mother’s years in education and the family income – had any bearing on the results. But the findings stood even after taking these situational factors into account.

Why might resting heart rate be linked with violence? One theory is that having a low resting heart rate is very unpleasant to the extent that it drives individuals to seek stimulation, which may manifest as antisocial behaviour. A similar explanation was put forward by Hans Eysenck in the 1960s to explain the extravert personality trait.

Another theory is that low resting heart rate is a sign of fearlessness. Children lacking fear may be more likely to commit antisocial acts because they are unconcerned about the possible adverse consequences such as admonishment by a parent or teacher. The current study did not have any means of testing these competing theories but earlier US research found no effect of fearlessness when looking at resting heart rate and aggressive antisocial behaviour. On balance then, fearlessness seems less likely to be the underlying cause.

As the authors of the new research point out, it is surprising that a personal, physical characteristic like resting heart rate can have such a clear cut link with violent behaviour for both men and women, above and beyond societal influences like poverty, inequality, gangs, drug trafficking and corrupt justice systems.

I asked study author Joseph Murray what the direct impacts of these findings might be – could we use what we know about resting heart rate to prevent violent outbreaks before they happen? Professor Murray said “While these were fascinating findings, I do not think that there are any direct implications for practice”, adding, “the level of current understanding about the mechanisms involved does not permit more than speculation.”

Still, this study provides a clear illustration that if we want to understand societal problems like crime and antisocial behaviour, we should look closely at the psychological and biological factors that are involved, as well as the social and societal contexts in which these behaviours are played out.

_________________________________

Murray, J., Hallal, P., Mielke, G., Raine, A., Wehrmeister, F., Anselmi, L., & Barros, F. (2016). Low resting heart rate is associated with violence in late adolescence: A prospective birth cohort study in Brazil. International Journal of Epidemiology. DOI: 10.1093/ije/dyv340

Post written by Richard Stephens for the BPS Research Digest. You can read more of Richard’s work in his critically acclaimed popular science book, Black Sheep: The Hidden Benefits of Being Bad.

Our free fortnightly email will keep you up-to-date with all the psychology research we digest: Sign up!

Thursday, 25 February 2016

Twin study raises doubts about the relevance of "grit" to children's school performance

Grit is in vogue. US psychologist Angela Duckworth's TED talk on grit is one of the most popular ever recorded, and her forthcoming book on the subject, subtitled "the power of passion and perseverance", is anticipated to be a bestseller. On both sides of the pond, governments have made the training of grit in schools a priority.

To psychologists, "grit" describes how much perseverance someone shows towards their long-term goals, and how much consistent passion they have for them. It's seen as a "sub-trait" that's very strongly related to, and largely subsumed by, conscientiousness, which is known as one of the well-established "Big Five" main personality traits that make up who we are.

The reason for all the interest in grit, simply, is that there's some evidence that people who have more grit do better in life. Moreover, it's thought that grit is something you can develop, and probably more easily than you can increase your intelligence or other attributes.

But to a team of psychologists based in London and led by behavioural genetics expert Robert Plomin, the hype around grit is getting a little out of hand. There just isn't that much convincing evidence yet that it tells you much about a person beyond the Big Five personality traits, nor that it can be increased through training or education.

Supporting their view, the researchers have published an analysis in the Journal of Personality and Social Psychology of the personalities, including grit, and exam performance at age 16 of thousands of pairs of twins. Some of the twins were identical meaning they share the same genes, while others were non-identical meaning they share roughly half their genes just like non-twin siblings do. By comparing similarities in personality and exam performance between these two types of twin, the researchers were able to disentangle the relative influence of genes and the environment on these measures.

The main finding is that the participants' overall personality scores were related to about 6 per cent of the variation seen in their exam performance. Grit specifically was related to just 0.5 per cent of the differences seen in exam performance. Given the small size of this relationship, the researchers said "we believe that these results should warrant concern with the educational policy directives in the United States and the United Kingdom."

Also relevant to the hype around grit, the researchers found that how much grit the participants had was to a large extent inherited: about a third of the variation in grit scores was explained by genetic influences. Meanwhile, none of the variation in grit was explained by environmental factors that twin pairs shared, such as the way they were raised by their parents and the type of schooling they had (this leaves the remaining variance in grit either influenced by so-called "non-shared environmental factors" – those experiences in life that are unique to a person and not shared even by the twin they live with – or unexplained). This is a disappointing result for grit enthusiasts because it suggests that the experiences in life that shape how much grit someone has are not found in the school or the home (at least not for the current sample). Bear in mind, though, that this doesn't discount the possibility that a new, effective home- or school-based intervention could be developed.
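For readers curious how researchers get variance estimates like these out of twin data, the classic back-of-the-envelope method is Falconer's approximation, which decomposes a trait's variance from the correlations between identical (MZ) and non-identical (DZ) twins. Here is a minimal sketch; the twin correlations below are illustrative values chosen to mirror the pattern described (roughly a third genetic, no shared-environment effect), not figures from the study itself:

```python
def falconer(r_mz, r_dz):
    """Falconer's approximation: decompose trait variance from twin correlations.

    h2 = additive genetic variance (heritability)
    c2 = shared (family/school) environment
    e2 = non-shared environment plus measurement error
    """
    h2 = 2 * (r_mz - r_dz)  # MZ twins share twice the genetic overlap of DZ twins
    c2 = r_mz - h2          # MZ similarity that genes can't account for
    e2 = 1 - r_mz           # whatever makes even MZ twins differ
    return h2, c2, e2

# Illustrative (hypothetical) correlations, not the paper's values:
h2, c2, e2 = falconer(r_mz=0.33, r_dz=0.165)
print(h2, c2, e2)  # roughly one third genetic, zero shared environment
```

Modern studies like this one use more sophisticated structural equation models, but the logic is the same: the gap between MZ and DZ similarity indexes genes, and the shortfall of MZ similarity from 1 indexes non-shared experience.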

The researchers concluded that once you know a child's main personality scores, knowing their amount of grit doesn't seem to tell you much more about how well they'll do at school. This study doesn't rule out the idea that increasing children's grit, if possible, could be beneficial, but the researchers warned that "more research is warranted into intervention and training programs before concluding that such training increases educational achievement and life outcomes."

_________________________________

Rimfeld, K., Kovas, Y., Dale, P., & Plomin, R. (2016). True grit and genetics: Predicting academic achievement from personality. Journal of Personality and Social Psychology. DOI: 10.1037/pspp0000089

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Tuesday, 26 January 2016

New review prompts a re-think on what low sugar levels do to our thinking

Glucose. Fuel for our cells, vital for life. But how fundamental is it to how we think?

According to dual-systems theory (best known from Nobel laureate Daniel Kahneman’s work), low blood glucose favours the use of fast and dirty System One thinking over the deliberative, effortful System Two. Similarly, the ego depletion theory of Roy Baumeister sees glucose as a resource that gets used up whenever we resist a temptation.

But the authors of a new meta-analysis published in Psychological Bulletin find these claims hard to swallow. Their review suggests that glucose levels may change our decisions about food, but little else.

Jacob Orquin at Aarhus University and Robert Kurzban at the University of Pennsylvania searched the decision-making literature, finding 36 articles that directly investigated glucose by measuring blood concentration, providing participants with sugar solution, or via interventions such as wafting food smells, which triggers some amount of glucose production.

The authors pored over the articles and tabulated every effect, its direction as well as its size. They found the effects were highly variable, often operating in different directions from study to study. But when the data were organised according to a key factor, a consistent pattern began to emerge. That factor? Food.

In payment tasks – involving hypothetical purchases ("how much would you pay for ...") and actual purchases while shopping – low blood glucose did increase people’s willingness to overspend … on food. But it actually made them less willing to spend money on non-food products. When it came to persistence on tasks (such as time spent trying to complete a puzzle), low glucose decreased willingness to work for non-food rewards, but led to more tenacious work towards food-related goals. And when people were given the choice between receiving a small amount now or a large one later, low glucose led to a large bias towards immediate gratification when food was the payoff, compared to a much smaller bias for non-food.

This pattern of results doesn’t fit the notion of glucose as willpower-fuel. It suggests instead that low glucose is a signal that, to ensure future wellbeing, food should be prioritised – by paying more for it, working harder for it, and grabbing a little now rather than taking the promise of more in the future. This signaling account also explains the recent discovery that you don’t need to consume glucose to produce some cognitive effects – simply tasting it (by swishing it around the mouth) is enough; no fuel has been received, but presumably the signaling system is temporarily fooled by the taste receptors.

Kahneman can sleep easy – the findings from this meta-analysis aren’t a blow to his dual process theory as a whole, merely to the specific claim that glucose has a role in switching between thinking fast and slow. The meta-analysis is a more substantial problem for the claims of ego depletion, which are intimately related to the idea that willpower is a finite resource that depends on glucose.

Based on the prior glucose research and theory, some publications have recommended strategies like eating chocolate before tense marital discussions or stashing emergency Jelly Bellys in the office desk drawer. But according to this meta-analysis, these strategies will yield little benefit; the main implication of being low on glucose is a greater preoccupation with finding something to eat. There’s a lot of strong psychological science out there to help with building everyday habits and making better decisions, so if you’re looking for a dose of something, we recommend you check those out instead.

_________________________________

Orquin, J., & Kurzban, R. (2015). A meta-analysis of blood glucose effects on human decision making. Psychological Bulletin. DOI: 10.1037/bul0000035

--further reading--
Labs worldwide report converging evidence that undermines the low-sugar theory of depleted willpower
New research challenges the idea that willpower is a "limited resource"

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.


Thursday, 14 January 2016

What does fear do to our vision?

By guest blogger Melissa Hogenboom

Consider the following scenario. A policeman is on patrol, maybe he's quite new to working in the field. He sees a suspicious young man and decides to follow him.

He turns the corner and sees that the man has drawn a gun from his pocket. In a snap second – almost too fast to think twice – he takes out his own gun and shoots the man dead.

Only the man didn't have a gun at all, it was a mobile phone.

Sadly, it's a familiar story. An incident exactly like it occurred only last week (January 2016) and a quick trawl through more newspaper reports shows how commonly it occurs.

When people make snap decisions in situations like this, they are often under intense momentary stress. This can provoke a host of automatic mental and physical effects that some psychologists refer to as "freezing behaviour". We usually think of this kind of reaction as occurring in animals – a mouse paralysed with fear or a deer trapped motionless in the headlights (resulting in much road kill).

In other words, it's the moment before an animal decides what to do, whether to "fight or flee". This is believed to be an innate response to a predator, to avoid being seen or heard. Research has shown that an animal's heart rate actually decreases when in this state.

Although we hear about it less in humans, our physiological response can be similar. For instance one 2005 study found that in response to pictures of mutilated bodies, participants' physical movements reduced and their heart rates slowed. The same effect was found in a 2010 study in response to pictures of threatening faces.

However, there’s still much more we need to learn about the effects of the human freezing response – for example, what effect does it have on visual perception, and could any effects help explain some of the tragic instances when police have mistaken phones and other harmless objects for guns? A new study published in the Journal of Experimental Psychology: General aimed to find out.

Maria Lojowska of Radboud University and colleagues in the Netherlands tested 34 participants between the ages of 18 and 30. To create a situation that elicited freezing behaviour, the researchers occasionally gave their participants a mild electric shock, which was always preceded by a red dot. Participants were told they were taking part in a visual perception task and were fully informed about the nature of the shocks before the experiment started.

It was not the shock itself that made the participants show "freezing behaviour" (as measured by their heart rate), rather it was the anticipation of the shock. When participants saw a green dot (which did not presage a shock), they relaxed, but when they saw a red dot they felt more scared, regardless of whether a shock was actually given or not.

The participants’ task was to judge as accurately as possible the orientation of the lines inside small squares, which appeared on a computer screen on the left or right of their visual fields. The squares either had several lines (high detail) or few lines (indicating low detail), as you can see below. Crucially, the researchers found that the participants’ visual performance was affected by whether or not they were stressed and showing physiological signs of freezing. When they were afraid and stressed, their performance at judging the squares with high detail was impaired but their ability to judge the squares with coarse visual details actually improved.

The square on the left features high detail and the one on the right low detail. Stimuli from Lojowska et al 2015. When scared, participants were better at perceiving low detail. 
The researchers said that previous research in animals had suggested that the freezing response leads to an overall improvement in vision, but their new findings suggest a more nuanced situation – it seems that when we’re afraid, we perceive some aspects of the world more clearly, but at the cost of ignoring much of the detail.

Intuitively, it makes sense that an animal or human only sees the most basic detail of a potentially threatening object. It would take too much time to take in all the detail of a scene. Our brain has a clever way of quickly reconstructing what every object is likely to be using its memory of similar events and situations, rather than analysing each new thing afresh, in depth. It is these shortcuts that can result in errors and visual illusions.

Despite these potential flaws in our visual perception, it's important for us to be able to perceive things quickly. If you are walking in a desert and glimpse a shape that could be a snake (but is more likely a stick), it's better to show caution and stop than assume it's a stick and walk right into danger.

Now that we better understand how our visual perception changes when we feel fear, Maria Lojowska and her team plan to discover exactly what's going on in the brain when this happens. Meanwhile, the researchers hope their findings might help inform training programmes to improve a person's performance when they are in a stressful environment. Many police forces in the US already train their officers to overcome their implicit bias towards race and sex. It would be helpful to add the limits of our visual perception to the list.
_________________________________


Lojowska, M., Gladwin, T., Hermans, E., & Roelofs, K. (2015). Freezing promotes perception of coarse visual features. Journal of Experimental Psychology: General, 144(6), 1080-1088. DOI: 10.1037/xge0000117

--further reading--
When humans play dead
Why do we get goose-bumps?
How science is helping America tackle police racism
The smell of fear more powerful than previously realised

Post written by Melissa Hogenboom (@melissasuzanneh) for the BPS Research Digest. Melissa is BBC Earth's feature writer.


Tuesday, 12 January 2016

Children born via IVF might be developmentally advantaged compared with their peers

Recent years have seen a huge increase in the number of children born via IVF and other fertility treatments (in 2011 in the UK, 17,041 babies were born via IVF). While this has undoubtedly brought immeasurable happiness to many families, medical experts have raised concerns that the steps involved in IVF – such as the direct implantation of embryos into the mother's uterus, and in some cases the injection of an individual sperm cell into the egg – may bypass some biological filtering processes thereby increasing the odds of inherited illnesses and disorders.

The existing research literature looking into this issue is incredibly contradictory. For example, some studies have found evidence of a higher prevalence of cognitive and social problems in children born through fertility treatments, yet other studies have found no differences between children born by IVF and their peers, while still other studies have actually documented advantages in IVF children. Part of the reason for the mixed results may be to do with the different cognitive and social measures used and changes to fertility treatment procedures over the years.

To this field comes a new paper in the European Journal of Developmental Psychology. The researchers led by Edwa Friedlander at the Hebrew University of Jerusalem hoped to improve upon the existing evidence base by using three gold standard measures of cognitive and social development, and by focusing on children born with the help of IVF (and the ICSI variant involving individual sperm injection).

The researchers compared 67 children born with the help of IVF and 67 children born without any medical intervention. At the time of assessment, the children in each group had an average age of four, with a range between 1 year 11 months and 7 years 6 months. Using the researcher-administered "Mullen Scales of Early Learning", which taps motor control and language, and the researcher-administered "Autism Diagnostic Observation Schedule", which measures autistic-like behaviours and symptoms, the researchers found no differences between the two groups. Meanwhile, using the Vineland – a structured interview that involves asking parents or caregivers questions about the child's social, communication and motor skills – the researchers actually found the IVF group's scores on communication and motor skills were superior compared with their peers.

Friedlander and her colleagues concluded that infertile couples and the medical professionals who work in the field of fertility treatments "can be encouraged by the current findings". However, they added several notes of caution: the assessments used in the current study are not sensitive enough to detect some specific signs and symptoms associated with developmental conditions, and some potential deficits may not appear until later in childhood.

Another caveat: the apparent advantage for the IVF group was only seen in the parental interviews, which of course involve a large degree of subjectivity. It's worth bearing in mind that other research has similarly documented advantages for IVF children as measured by parental report, but not through researcher-administered tests. That said, another recent study using a clinician-administered test did report motor development advantages for IVF-conceived children. There's clearly a need for more research to iron out these contradictory findings. But if the developmental advantages for IVF-conceived children do turn out to be real, this would raise some very interesting questions about why they exist – for example, might the parenting styles of IVF parents play a part?

_________________________________ ResearchBlogging.org

Friedlander, E., Mankuta, D., Yaari, M., Harel, A., Ebstein, R., & Yirmiya, N. (2015). Cognitive and social-communication abilities among young children conceived by assisted reproductive technologies. European Journal of Developmental Psychology, 1-14. DOI: 10.1080/17405629.2015.1115343

--further reading--
How women become "super-mothers" after giving birth through IVF

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.

Our free fortnightly email will keep you up-to-date with all the psychology research we digest: Sign up!

Thursday, 20 August 2015

Why do more intelligent people live longer?

By guest blogger Stuart Ritchie

It’s always gratifying, as a psychologist, to feel like you’re studying something important. So you can imagine the excitement when it was discovered that intelligence predicts life expectancy. This finding is now supported by a large literature including systematic reviews, the most recent of which estimated that a difference of one standard deviation in childhood or youth intelligence (that’s 15 IQ points on a standardised scale) is linked to a 24 per cent lower mortality risk in the subsequent decades of life. That’s a pretty impressive link, but it immediately raises a critical question: why do brighter people live longer?

A new study (pdf) published in the International Journal of Epidemiology attempts to provide new, biological evidence to answer this question. But first, let’s think through the possibilities. We know that people with higher IQ scores tend to be healthier, possibly because they eat better, exercise more, are better able to understand health advice, are less likely to be injured in accidents and deliberate violence, and also because they tend to have better jobs. Here, the causal arrow is pointing from IQ to longevity – the effects of being smarter cause you to die later. But there are other explanations: what if having a lower IQ is just an indicator of an underlying health condition that’s the real cause of earlier death? Or what if the genes for having a healthier body are also the genes for having a healthier brain, and the causal pathway is from this third variable (i.e. genetics) to both IQ and longevity?

The authors of the new study, Rosalind Arden and colleagues, tested this last hypothesis, known as "genetic pleiotropy" (the idea that the same genes influence multiple different traits). They took three twin datasets, selecting in total 1,312 twin pairs where one or both of the twins had died. Then they correlated the twins’ IQ scores with the lengths of their lives (or their life expectancies, for those still living).

As they expected, the researchers found an overall lifespan-IQ correlation, albeit a small one (r = 0.12, where 1.00 would be a perfect match). Importantly, by comparing the correlations in identical twins (who share all their genes) versus fraternal twins (who share approximately half), they were also able to estimate the "genetic correlation" – the overlap in the two traits that’s caused by genetic differences. They found that, overall, 95 per cent of the correlation in IQ and longevity was due to genetics.
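The logic of that comparison can be sketched with Falconer-style formulas: because identical twins share roughly all their segregating genes and fraternal twins roughly half, doubling the difference between the two groups' correlations estimates the genetic contribution. The two helper functions and all the numbers below are hypothetical illustrations of that logic, not the authors' actual (more sophisticated) biometric model.

```python
# Toy Falconer-style sketch of separating genetic from environmental
# overlap by comparing identical (MZ) and fraternal (DZ) twins.
# Hypothetical numbers throughout - not the study's real model.

def falconer_h2(r_mz, r_dz):
    """Classic single-trait estimate: MZ twins share ~100% of genes,
    DZ twins ~50%, so twice the gap in their within-pair correlations
    approximates the heritability of the trait."""
    return 2 * (r_mz - r_dz)

def genetic_share_of_overlap(ctct_mz, ctct_dz, phenotypic_r):
    """Bivariate extension: cross-twin cross-trait correlations
    (twin 1's IQ with twin 2's lifespan) play the same role.
    Returns the genetic contribution to the trait overlap as a
    fraction of the observed phenotypic correlation."""
    genetic_cov = 2 * (ctct_mz - ctct_dz)
    return genetic_cov / phenotypic_r

# Hypothetical inputs: an observed IQ-lifespan correlation of 0.12,
# with cross-twin cross-trait correlations of 0.10 (MZ) and 0.043 (DZ).
share = genetic_share_of_overlap(0.10, 0.043, 0.12)
print(f"genetic share of the IQ-lifespan overlap: {share:.2f}")
```

With these made-up inputs the genetic share comes out at about 0.95, mirroring the kind of figure the study reports; the point is only to show how MZ/DZ comparison does the work.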

So, is this a final answer to the debate over the IQ-mortality connection? Does this show that, perhaps depressingly, the link isn’t due to changeable lifestyle factors, but actually some kind of genetic "system integrity" that underlies brightness and longer lives?

Ritchie's critically acclaimed new book is out now.

Not so fast. The important part is in the phrase "due to genetics". In a 2013 Nature Reviews Genetics article, geneticist Nadia Solovieff and colleagues outlined all the potential causal mechanisms that might make two traits genetically correlated. They drew a critical distinction between "biological" and "mediated" pleiotropy. The former is the "obvious" inference, which is that the same genes cause both intelligence and longevity. But the latter possibility is that the variables only appear to be genetically correlated, because genes cause one factor, which then goes on to cause the other. That is, if genes cause intelligence, and intelligence (via lifestyle choices etc.) causes a longer lifespan, we’d still see the same genetic correlation, even if those genes have no direct effect on lifespan itself. If true, this would still be pleiotropy of a sort: the genes linked to intelligence are having an indirect effect on lifespan. But as the authors acknowledge in their paper, this "pleiotropy-lite" interpretation of the new findings would mean we don’t yet have knockdown evidence for the genetic "system integrity" idea.

So how do we tease apart the two possible explanations for the genetic correlation? In the paper, the authors suggest we study non-human animals (for which the literature on cognitive ability is growing fast) where we can more readily control the "lifestyle" factors, thereby isolating any potential direct effects of the same genes on both intelligence and longevity. Really, though, we might have to wait until we have a long list of genes that are reliably linked to human intelligence. If we knew a good number of those, we could test whether they also influence health and lifespan – if they did, this would be evidence for true "biological" pleiotropy. We’d know then that the link between IQ and lifespan is down to some people simply winning the genetic lottery, rather than to lifestyle factors that any of us could change.

Conflict of interest: Stuart Ritchie is a postdoc in the lab of Ian Deary, one of the co-authors of the paper discussed here.

_________________________________ ResearchBlogging.org

Arden, R., Luciano, M., Deary, I., Reynolds, C., Pedersen, N., Plassman, B., McGue, M., Christensen, K., & Visscher, P. (2015). The association between intelligence and lifespan is mostly genetic. International Journal of Epidemiology. DOI: 10.1093/ije/dyv112

--further reading--
How do you prove that reading boosts IQ?

Post written by Stuart J. Ritchie, a Research Fellow in the Centre for Cognitive Ageing and Cognitive Epidemiology at the University of Edinburgh. His new book, Intelligence: All That Matters, is available now. Follow him on Twitter: @StuartJRitchie


Wednesday, 19 August 2015

The powerful motivating effect of a near win

If you while away time in a games arcade – playing some coin pushers here, a few fruit machines there – you will soon be familiar with that frustrating yet enlivening sensation of the near win that follows getting four cherries out of five. New research from INSEAD suggests that these tantalising near wins produce high levels of motivational arousal that encourage us to chase whatever alternative rewards are then available.

In one fascinating experiment, Monica Wadhwa and JeeHye Christine Kim gave lottery scratch-cards to 164 US shoppers about to enter a fashion store. A row of six winning symbols earned $20, and the cards were rigged so that a third of participants won, a third lost abjectly, and a third nearly made it, with five in a row. The shoppers then went about their shopping and, on exiting the store, were asked to share their till receipts. The near-winners had made significantly more purchases than the other groups.

Why? Goal-gradient theory, proposed in the 1930s, suggests that when a reward is a hundred steps away, your first step moves you only one per cent of the way towards your payoff. Once the reward is almost within reach, however, each effortful act pays off much more, so we become more physiologically aroused and ready to act. But if the reward is snatched away, that readiness to act doesn't simply disappear; it tends to be transferred to other sources. In the experiment above, the shoppers who just missed out on the scratch-card prize were primed to seek out other rewards – and a store full of merchandise was ready to supply them.

Other experiments showed that a near win encouraged participants to put more effort into a card-sorting task when money was on the line, to hurry more towards a chocolate bar, and even to salivate more heavily at images of high-value currency.

There is a caveat. Tightly-focused lab experiments demonstrated that almost winning really only matters when the heightened state of anticipation is prolonged. In a diamond-seeking video game, players who were just one diamond short of victory, but discovered very early that they’d lost (they turned over a tile showing a fatal rock rather than the winning diamond), did not show the near-win effect. By contrast, players who were just one diamond short, and who stayed alive in the game for a sustained amount of time (they avoided any rocks until the very end), did show the near-win effect – after the game had finished, they raced to get a chocolate bar, as if channelling the heightened motivation built up by their near win.

Wadhwa and Kim point out that we already know that repeatedly nearly winning a game can facilitate addiction to it, via heightened production of dopamine, the neurotransmitter associated with motivation and the anticipation of pleasure, among other functions. The new study shows that this state generalises beyond the game itself. If you almost vanquish the end-of-level boss, you might be more motivated to pound out that tricky article … or to reward yourself with a bag of pretzels. So be aware of what you surround yourself with!

_________________________________ ResearchBlogging.org

Wadhwa, M., & Kim, J. (2015). Can a Near Win Kindle Motivation? The Impact of Nearly Winning on Motivation for Unrelated Rewards. Psychological Science, 26(6), 701-708. DOI: 10.1177/0956797614568681

--further reading--
How losing can increase your chances of winning

Post written by Alex Fradera (@alexfradera) for the BPS Research Digest.


Thursday, 23 July 2015

Why fathers might want to thank their handsome sons

Women rated men's faces as more attractive when they were shown alongside a good-looking son
If you're the father to a good-looking boy, you might want to give him your thanks – his handsome looks apparently mean women will tend to find you more attractive. That's according to a new study by Pavol Prokop at Trnava University in Slovakia, who says the result is consistent with the established idea from evolutionary psychology that women instinctively pick up on cues to the quality of a man's genes.

Just as past research has shown that women, on average, find taller men with symmetrical faces more attractive because such features are indicators of good genes, the new finding suggests a man's offspring also influence women's judgments about his attractiveness. If a man can sire a handsome boy, the instinctual logic goes, then he must be in possession of valuable genes.

In the first experiment, Prokop presented dozens of young women with several triads of photographs, each showing an attractive or unattractive young man alongside pictures of two boys, one attractive, the other less so. For each triad, the women's task was to say which boy the man was father to. The women were more likely to assume that attractive men were fathers to attractive boys (and unattractive men the fathers of less attractive boys). This simple test laid the groundwork for the remainder of the study, confirming that women generally assume that attractive fathers have attractive sons.

Next, nearly three hundred more young women rated the attractiveness of a series of attractive and unattractive men's faces, each of which was presented alongside a boy (also attractive or unattractive), who was supposedly the man's son. In truth, but unbeknown to the participants, none of the pictured men and boys were actually related. A further detail was that each man was described either as the biological father or step-father to the boy shown alongside him.

When a man's face was presented alongside what participants believed to be his handsome son, he (the putative father) tended to receive higher attractiveness ratings from the participants than if he was depicted with an unattractive son. There was some evidence that this effect was greater for unattractive men, and the effect was more apparent when men were described as biological fathers than as stepfathers. A weakness in the methodology (there were no sons of neutral attractiveness) means we can't know how much attractive sons made their fathers appear more handsome to the women, versus how much unattractive sons had the opposite effect.

If handsome men are more likely to sire handsome children, and those handsome children exaggerate their fathers' attractiveness still further, a self-perpetuating cycle could be set in motion that might help explain a previous finding: attractive men tend to have more children (within the same marriage) than less attractive men. Of course there's also the possibility that the attractiveness boost gained by having a handsome son could leave a man more open to advances from his partner's female rivals (known as mate-poaching in evolutionary psychology), a possibility that awaits further research.

The main finding of this research – that fathers are rated more attractive when their sons are good-looking – is open to some counter-interpretations. For example, perhaps there was a simple priming effect at play and seeing any attractive image alongside a man's face would lead that man's face to receive higher attractiveness ratings.

Prokop tested that possibility in a further experiment in which men's faces were presented alongside attractive or unattractive non-human pictures, such as nature scenes and buildings (e.g. a beautiful beach versus a dirty beach). This time, women's judgments about the attractiveness of handsome men were unaffected by whether a beautiful or ugly scene or object appeared alongside them, suggesting the effect of a handsome son on a father's attractiveness is unique.

However, unattractive men did benefit from higher attractiveness ratings when their faces were shown alongside a beautiful scene or object. This is good news for men who don't have film-star looks – after all, while the influence of genetic inheritance means they are less likely to have the chance to bask in the reflected beauty of a handsome son, this result says they can easily turn to other means of boosting their attractiveness instead. For example, Prokop said they could try "wearing fashionable clothes."

_________________________________ ResearchBlogging.org

Prokop, P. (2015). The Putative Son’s Attractiveness Alters the Perceived Attractiveness of the Putative Father. Archives of Sexual Behavior, 44(6), 1713-1721. DOI: 10.1007/s10508-015-0496-2

--further reading--
You hunky smile magnet
The downside of being good-looking AND wealthy
Shiny, swanky car boosts men's appeal to women, but not women's appeal to men
Men feel more physically attractive after becoming a father
Freud was right: we are attracted to our relatives

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.


Wednesday, 24 June 2015

New research challenges the idea that willpower is a "limited resource"

A popular psychological theory says that your willpower is a "limited resource" like the fuel in your car, but is it wrong?
When we use willpower to concentrate or to resist temptation, does it leave us depleted so that we have less self-control left over to tackle new challenges? This is a question fundamental to our understanding of human nature and yet a newly published investigation reveals that psychologists are in open disagreement as to the answer.

The idea that willpower is a limited resource, much like the fuel in your car, is popular in academic psychology and supported by many studies. In their recent report What You Need To Know About Willpower: The Psychological Science of Self-control, the American Psychological Association states "A growing body of research shows that resisting repeated temptations takes a mental toll. Some experts liken willpower to a muscle that can get fatigued from overuse."

This view was backed by an influential meta-analysis published in 2010 [pdf] that looked at the results from nearly 200 published experiments. But now a team led by Evan Carter at the University of Miami has argued that the 2010 study was seriously flawed and they've published their own series of meta-analyses, the findings of which undermine the limited resource theory (also known as the theory of ego depletion).

Many psychology studies on willpower follow a similar format: one group of participants is first asked to perform an initial challenging task designed to drain their willpower, before completing a second "outcome" task that also requires willpower. For comparison, a control group of participants performs the outcome task without the first challenge. Superior performance by the control participants (on the outcome task) is taken as evidence that the willpower of the first group was left depleted by the initial challenge, thus supporting the theory that willpower is a limited resource.

The new meta-analyses and the 2010 effort both pool the results from many studies following this format, but the new analyses are far stricter. They only include studies that challenged willpower with tasks well-established in the literature – such as suppressing emotional reactions to videos or resisting tempting food – and that measured outcomes with established tasks, such as persistence on impossible anagrams, food consumption, or standardised academic tests (like the Graduate Record Examination). The 2010 analysis, by contrast, included a far wider range of studies, some of which stretch the definition of a willpower challenge to its limits, including darts playing and purely hypothetical temptations.
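Pooling results across studies in a meta-analysis typically means inverse-variance weighting: precise studies count for more than noisy ones. Here is a minimal fixed-effect sketch of that idea with hypothetical effect sizes – a deliberately simplified illustration, not the more elaborate, bias-correcting models Carter's team actually fitted.

```python
import math

# Minimal fixed-effect meta-analysis sketch: pool per-study effect
# sizes (e.g. standardised mean differences between "depleted" and
# control groups) by weighting each study by the inverse of its
# variance. The numbers below are hypothetical.

def fixed_effect_pool(effects, variances):
    """Return the inverse-variance-weighted pooled effect and its
    standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical depletion studies: small, mixed effects
effects = [0.30, 0.05, -0.10]
variances = [0.04, 0.02, 0.03]

pooled, se = fixed_effect_pool(effects, variances)
print(f"pooled d = {pooled:.3f}, 95% CI = "
      f"[{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")
```

With these toy inputs the pooled effect is small and its confidence interval straddles zero – the shape of result that, across the real studies, led Carter's team to question the depletion effect. Publication-bias corrections and random-effects models add further machinery on top of this basic weighting.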

Another key difference between the 2010 study and the new analyses is that Carter and his team trawled conference reports to find unpublished studies on willpower. This is important because in this scientific field, as with most others, it's likely there has been a bias in the literature towards publishing positive results (in this case, those consistent with the popular idea that willpower becomes depleted with repeated use).

When Carter's team analysed the evidence from the 68 relevant published and 48 relevant unpublished studies that they identified, they found very little overall support for the idea that willpower is a limited resource. The one exception was when the outcome measure involved a standardised test – here performance did appear to be diminished by a prior self-control challenge.

But for other outcome tasks such as resisting food, the combined data from published and unpublished experiments either pointed to no effect of a prior self-control challenge, or there was worrying evidence of a publication bias for positive results, as was the case, for example, when the outcome challenge involved impossible anagrams or tests of working memory. The new meta-analyses even found some support for the idea that self-control improves through successive challenges, a result that's consistent with rival theories such as "learned industriousness".

This new series of meta-analyses should not be taken as the end of the theory of willpower as a limited resource. Proponents of that theory will likely respond with their own counter-arguments, including questioning the use of unpublished work by the new study. However, the results certainly give pause. "We encourage scientists and non-scientists alike to seriously consider other theories of when and why self-control might fail," Carter and his team conclude. It's worth noting too that this message comes after the recent doubts raised about a related idea in willpower research – specifically, the notion that depleted self-control is caused by a lack of sugar in the body.
_________________________________ ResearchBlogging.org

Carter, E., Kofler, L., Forster, D., & McCullough, M. (2015). A Series of Meta-Analytic Tests of the Depletion Effect: Self-Control Does Not Seem to Rely on a Limited Resource. Journal of Experimental Psychology: General. DOI: 10.1037/xge0000083

--further reading--
Self-control – the moral muscle. Roy F. Baumeister outlines intriguing and important research into willpower and ego depletion

Post written by Christian Jarrett (@psych_writer) for the BPS Research Digest.
