Facebook is No Friend to Mental Health

Roughly a half billion people interact on Facebook every day, and many more engage with other social networking sites. With all of these friends, it would seem that social network users have a great support system and a happy life. New research suggests, however, that Facebook may actually be undermining well-being and life satisfaction.
Researchers at the University of Michigan examined two weeks of Facebook use and concluded that the more people used Facebook, the more negatively they felt about their lives moment to moment, and the more dissatisfied they were with their lives in general over time. The results were not affected by gender, self-reported loneliness, baseline symptoms of depression, the size of the social network, the motivation for using Facebook, or the perceived level of support from Facebook friends.
Several other studies have reported negative associations with social networking, including tension between romantic partners. Social networking sites foster attachment issues, uncertainty, and partner surveillance, all of which lead to negative relationship outcomes. Another study reported that the use of social networking sites leads to decreased intimacy in relationships, mostly owing to perceptions about the quality and quantity of the romantic partner’s use of social networking. Divorce and cheating have also been attributed to Facebook, especially in relationships that are less than three years old.
Social networking sites appear to offer individuals with low self-esteem, or with difficulties establishing and maintaining interpersonal relationships, a means to grow relationships and share connections with other people. However, the individuals most in need of positive social relationships actually suffer from social networking sites: in several studies, an inability to communicate appropriately and negative reactions from other people led to lower self-esteem and negative effects on well-being.
All of this is not to say that there are no benefits to social networking, or that all people who use it end up depressed and dissatisfied with life. Patients with chronic illnesses or rare conditions are able to find information on their diseases and gain support from others around the globe, which has led to improved disease education and clinical outcomes. And, younger people who have grown up in the digital age are living proof of the transformation of intimate relationships. For many in this demographic, larger Internet-based social networks lead to higher levels of life satisfaction and social support. The internet may offer permanent relationships in a mobile world.

Most people spend their days constantly connected to things (phones, computers, tablets) and not to other people. Growing a digital network of people who “like” you, “share” things with you, and want to be your “friend” does not, at least according to recent research, create life-sustaining, life-fulfilling friendships. Social networks can certainly be used in a healthy, appropriate way and offer a modern way to communicate. But, they cannot replace true, meaningful relationships that lead to improved overall well-being.

Smell Your Age

You have likely been told to act your age or look your age, but can you smell your age? According to new research, you can at least smell another person’s age. Apparently, humans can correctly identify a person’s age simply by smelling their body odor.
The group of genes that makes up the olfactory (sense of smell) receptors is the largest gene family in mammals. And it’s no wonder humans evolved to smell millions of odorants in minute concentrations: body odor helps us distinguish family from non-family members, choose a mate, and differentiate between genders. Even though all of this happens subconsciously, the olfactory sense is critical to our behavioral and social cues and our evolutionary history.
For the current study, researchers asked three groups of people to sleep in shirts with under-arm pads for 5 nights. The pads collected sweat from the three groups: young (20-30 years old), middle-aged (45-55), and old-aged (75-95). The researchers then placed the pads in jars and asked 41 individuals to identify the age group of each jar’s donor. Almost every participant was easily able to correctly identify the age group of the person whose sweat was in the jar.
The participants reported that old-age odors were more pleasant than those of the younger age groups. Old-age odors were also less intense. In the young and middle-aged groups, participants were able to differentiate between genders. Middle-aged men were rated as having the most intense and most unpleasant odor.

Body odor results from a complex interaction among the skin, secretions from glands, and bacterial activity. Plus, diet and lifestyle habits influence body odor. As humans age, the composition of skin and of the body’s secretions change, along with a myriad of other factors, which likely results in smells identifiable with old age. Though these findings are interesting, the authors report little consequence for our daily lives. Olfaction is associated with memory, relational signals, food decisions, and overall health. But, is the ability to discriminate age or gender based solely on smell going to change the way we go about our daily olfactory activities? It conjures up visions of greeting people like my dog greets people, and that would stink.

Physician Sleep Deprivation – Potential Effects on Patient Care

It is well understood that resident, intern, and attending physicians do not get an adequate amount of sleep at night. Long on-call hours, 24-hour shifts within the hospital, and limited time off all contribute to poor quantity and quality of sleep. Research suggests that such sleep deprivation may affect the quality of patient care.
A 2011 study conducted in Korea scored a number of residents and interns on sleep deprivation. Nearly 71% of participants were sleep-deprived, with a mean of only 5 (+/- 1.2) hours of sleep per night while working an average of 14.9 (+/- 2.7) hours a day. Among the most sleep-deprived participants, attention-deficit scores were higher than average, suggesting potential difficulty in focusing on the diagnosis and treatment of patient conditions. Additionally, sleep deprivation made it more difficult for participants to learn new information, which may hamper these physicians’ continuing education.
Further research into sleep deprivation in general may also illustrate another potential barrier in professional performance. A literature analysis by Kamphuis illustrates that there is significant evidence to support a relationship between sleep deprivation and increased levels of aggression. In numerous studies, participants who reported a low quantity and/or quality of sleep also scored higher on indexes of anger, hostility and impulsivity compared to control groups. Interestingly, there appears to be a physiological component to such increased anger.
Prefrontal cortical functioning is impaired in sleep-deprived individuals. This functioning is responsible for a person’s ability to regulate emotional and behavioral responses to stimuli. In one study, subjects who had been deprived of sleep for an average of 30 hours were less able to recognize facial expressions of moderate happiness or anger, suggesting an inability to empathize arising from sheer lack of recognition alone.
This link between sleep deprivation and aggression could be helpful in explaining the reasons why some physicians are considered short-tempered by non-physician healthcare staff. The inability to empathize could certainly be to blame for this phenomenon; however, this author was unable to find current research attempting to test such a correlation, suggesting a need for further investigation into this specific area.

As the physician shortage in America continues to worsen, the effects of sleep deprivation on physicians could become more marked. To mitigate this potential barrier to quality patient care, healthcare institutions must find ways of filling gaps with more mid-level providers, interns, and residents to spread the workload among a larger group of individuals. Sleep deprivation is linked with a wide range of physical and emotional health problems and may be directly affecting patient care. There is much work still to be done to identify the specific risks to patients as well as strategies to reduce such risks.

Electronic Devices Are Unlikely To Cause Cancer

Technology has taken over our lives. From the moment we wake and begin checking emails to the moment we go to bed, computers, tablets, smart phones, and television screens consume our world. While in many ways these devices are enriching our lives, are they also taking a negative toll on our bodies and minds?
Some of the potentially negative short-term effects of use (and specifically overuse) of electronic devices are more apparent than others. The use of electronic devices while driving has been indisputably associated with an increased risk of car accidents. From a physical standpoint, turning up the volume on MP3 players, particularly when using ear buds, has been shown to contribute to hearing problems.
In addition, prolonged use of electronic devices has been linked to a more sedentary lifestyle and hence an increased risk for cardiac disease. Academically, children are suffering by relying on abbreviations and slang to get their messages across, even in cases of formal writing. In fact, a study by the Kaiser Family Foundation indicates that children aged six and under spend an average of two hours per day using screen-based media. These numbers only increase as a child ages.
The data on longer-term health effects, however, are less clearly defined. An extensive literature review of data published between 2000 and 2010 explored the effects of cell phones on human health. Interestingly, the data were inconclusive with regard to documented negative health effects, including cancer. Further review explored the potential link between cancer and other electronic devices, yielding similar results.
An additional report published in the International Journal of Epidemiology corroborated these studies and specifically indicated that there is no increase in risk for either glioma or meningioma as a result of using mobile devices.
From a molecular perspective, this inability to conclusively document a link between electronic devices and cancer may be explained by the low-frequency radiation these devices emit. These low-frequency waves are not strong enough to break the molecular bonds needed to cause mutations in DNA, and hence cancer. In fact, cell phones emit less than 0.001 kilojoule per mole of energy. This is far less than the more than 4800 kilojoules per mole of energy known to cause mutations in DNA and related carcinogenic effects.
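This threshold argument can be checked with a quick back-of-the-envelope calculation. The sketch below computes the molar photon energy for a mobile-phone signal; the 1.9 GHz carrier frequency is an illustrative assumption, not a figure from the studies above:

```python
# Back-of-the-envelope check: molar photon energy of cell-phone radiation.
# Planck relation: E_photon = h * f; per mole: E_molar = E_photon * N_A.
# The 1.9 GHz carrier frequency is an assumed typical value, not from the text.

PLANCK_H = 6.626e-34   # Planck constant, J*s
AVOGADRO = 6.022e23    # photons per mole
freq_hz = 1.9e9        # assumed mobile-phone carrier frequency, Hz

energy_per_photon_j = PLANCK_H * freq_hz
energy_per_mole_kj = energy_per_photon_j * AVOGADRO / 1000.0

print(f"{energy_per_mole_kj:.6f} kJ/mol")  # roughly 0.0008 kJ/mol, under 0.001
```

The result lands well under 0.001 kilojoule per mole, consistent with the figure above. And because a chemical bond must absorb sufficient energy from a single photon, no quantity of such low-energy photons can break bonds the way ionizing radiation does.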

While these early studies seem encouraging in lessening the fear of dying from overuse of electronic devices, additional longitudinal studies are still warranted to explore other potentially deleterious effects of such overuse. Either way, we should probably remember to “unplug ourselves” every once in a while and explore more social, emotional, and physical aspects of human life.

Current Treatments for Post-Amputation Pain

In the US, surgeons perform about 185,000 limb amputations each year, and the majority of these individuals are left with some sort of post-amputation pain. Unfortunately, post-amputation pain syndromes have proven very difficult to treat. A variety of treatments are available, but most have not been evaluated in high-quality clinical trials, and they appear beneficial only in some cases.
Post-amputation pain falls into two distinct categories: phantom limb pain and residual limb pain. Phantom limb pain refers to unpleasant and painful sensations that seem to the patient to originate from the lost limb, even though the individual consciously understands that the limb is no longer there. One study found that the incidence of phantom limb pain might be as high as 85%. The pain can vary from sharp and shooting sensations to dull, squeezing, or cramping sensations.
An amputee may also perceive residual limb pain (sometimes called “stump” pain) in the part of the body that remains after amputation. It is typically a sharp, burning pain, with a reported incidence of up to 74%. Often, an individual experiences both types of pain syndromes. These pain syndromes are separate from other types of pain an amputee might experience, such as those arising from an incorrectly fitted prosthetic device.
Multiple mechanisms appear to play a role in the development of post-amputation pain, though not all mechanisms may be present in every individual patient. For example, in the brain itself, reorganization in the somatosensory cortex is thought to play a role (this brain area is partly responsible for “mapping” sensations onto the relevant body part). Evidence also points to a role for the spinal cord in post-amputation pain, specifically in a region known as the dorsal horn, which normally receives the incoming sensory information. In some cases, the nerves which were severed in the amputation also appear to contribute to the condition.
Local injection therapy is one of the current mainstays of treatment for individuals with post-amputation pain; for example, injection of the local pain blocker lidocaine at the amputation site. This is more effective for residual limb pain than for phantom limb pain, and even then the effect is usually temporary.
Drug treatment is another option. NMDA antagonists like ketamine, opioid drugs like morphine, calcitonin, and anticonvulsant drugs have all shown mixed results in studies. Various types of surgical treatment are sometimes attempted, such as peripheral nerve stimulation, motor cortex stimulation, and deep brain stimulation. These treatments are still experimental, but motor cortex stimulation in particular looks like it may prove beneficial. One study found 53% of phantom limb patients were successfully treated with this method.
Mirror therapy provides another intriguing and cost-effective mode of treatment. In this approach, a mirror is placed adjacent to an intact limb, providing the illusion that the amputated limb is present. Some subjects report immediate lessening or elimination of pain, and multiple studies have demonstrated at least short-term pain reduction using this simple technique. It is believed that this feedback can help somatosensory pathways reorganize, thus reducing pain.

Future developments may involve better classifying patients into subgroups, which may show different response levels to different treatments. Many patients will continue to need multimodal, individualized treatment. Researchers hope to develop better and more proven treatments, partially through a more complete understanding of the multiple potential causes of the condition. New drugs, such as anti-nerve growth factor antibodies, are currently in early development and may one day provide a new class of medications for the physician’s toolkit.

Boosting Cognitive Performance by… Chewing?

Previous research has shown that the act of chewing gum can significantly reduce reaction times during certain tests, including numerical working memory, sustained alertness, auditory oddball, and semantic memory tasks. There are various theories about why this is the case, ranging from the activation of specific areas of the brain to improved mood and decreased stress while chewing. Many have hypothesized that increased blood flow to the pre-frontal cortex is the cause, as both alerting and executive functions are thought to be housed there.
Alerting, orienting, and conflict resolution (executive function) are all important processes for success on the attentional network test, which requires participants to quickly make judgments that may or may not include conflicting information. Both accuracy and reaction time are recorded, and the interactions between conditions provide insight into the alerting and conflict resolution effects.
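For readers unfamiliar with the test, the alerting and conflict effects are typically computed as simple reaction-time differences between cue and flanker conditions. A minimal sketch, using made-up reaction times rather than data from the study:

```python
# Illustrative computation of attentional network scores from mean
# reaction times (ms) per cue/flanker condition. All numbers are
# invented for demonstration, not taken from the study discussed here.

mean_rt = {
    ("no_cue", "congruent"): 560,
    ("no_cue", "incongruent"): 660,
    ("double_cue", "congruent"): 520,
    ("double_cue", "incongruent"): 630,
}

def condition_mean(cue=None, flanker=None):
    """Average RT across all cells matching the given cue/flanker filter."""
    cells = [rt for (c, f), rt in mean_rt.items()
             if (cue is None or c == cue) and (flanker is None or f == flanker)]
    return sum(cells) / len(cells)

# Alerting effect: benefit of a warning cue over no cue.
alerting = condition_mean(cue="no_cue") - condition_mean(cue="double_cue")
# Conflict (executive) effect: cost of conflicting flanker information.
conflict = condition_mean(flanker="incongruent") - condition_mean(flanker="congruent")

print(alerting, conflict)  # → 35.0 105.0
```

Larger conflict scores indicate poorer conflict resolution, which is why a speed gain from gum chewing with no change in these difference scores implies no speed-accuracy trade-off.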
A recent study combined the attentional network test with the action of chewing gum to see what additional information neuroimaging with fMRI could add to the understanding of this strange phenomenon. Replicating previous results, the authors found that participants had shorter reaction times to stimuli while they were chewing gum than while they were not chewing.
However, they found no evidence that behavioral performance in the alerting or conflict conditions was affected, meaning that there was no trade-off between increased speed and accuracy. This suggests that an overall increase in arousal in the alerting and executive functional areas of the brain is behind the reduced reaction times. fMRI showed increased activation in the motor areas of the brain, indicating that chewing likely activates these regions first, and that these areas then increase arousal and alertness levels during the test.
Many regions of the brain have been implicated in the effects of chewing on cognitive processing speed, but a few of them have been given more attention than the others, including the thalamus and the anterior cingulate gyrus, both of which are likely contributors to the overall alerting and arousal effects seen in these studies.

Of course, these results bring to mind many questions. Why is chewing, specifically, linked to an increase in processing speed? Do other repetitive activities have the same effect? Why does the activation of motor regions increase the level of arousal in areas that serve arousal and alerting purposes? Why is processing speed increased, but behavioral effects remain unaffected? This should be a very productive area of research in the years to come, and certainly a fun one to keep an eye on.

Math Anxiety – Dealing with Fear of Failure

Not everybody loves math. In fact, some people report tension, apprehension, and fear when faced with the need to perform mathematical tasks as a part of everyday life. Not surprisingly, these highly math-anxious individuals (HMAs) perform more poorly on math-related tasks than individuals with low math anxiety, and they tend to avoid math classes and math-related career paths. But understanding more about the neural underpinnings of high math anxiety may help educators develop better strategies for counteracting these tendencies, ultimately opening the door to more diverse career opportunities for HMAs.
Recently, scientists have begun to understand the differences in neural activity that may partially underlie math anxiety. A 2012 study found that when individuals with math anxiety anticipate a math task, they display increased activity bilaterally in the dorso-posterior insula — a region of the brain associated with threat detection and often with the experience of pain itself.
Interestingly, this area did not remain activated during the math task itself: it appears as if the anticipation of math is the painful part, not the actual doing of it. The higher the degree of anxiety, the more this area of the brain appeared to be active. This mechanism helps explain why individuals with high math anxiety avoid math — just thinking about doing it is painful to them!
It’s worth noting that not all individuals with high math anxiety perform poorly on math tasks relative to those with low math anxiety. A 2011 study showed that some individuals with high math anxiety showed increased activity in the inferior frontoparietal regions of the brain relative to individuals with low math anxiety; these same individuals were the ones most likely to perform relatively well on math tasks, even though they were anxious.
This area of the brain includes regions thought to be involved in cognitive control and in dealing with negative emotions in a logical way. In those with high math anxiety, both high performers and low performers showed similar activity in areas of the brain associated with a fear response. Both groups also showed similar activity in regions associated with mathematical calculations. Thus the researchers concluded that the individuals’ cognitive response to their own anxiety may be the most important factor in determining their ultimate performance.

These findings may be used to shape educational strategies for high math anxiety students. The most successful strategies may not be ones that seek to eliminate the anxiety outright. Instead, it may be more effective for educators to teach these students how to utilize their own inner cognitive controls to mitigate the math-anxiety response when it happens — before it has a chance to decrease actual math performance. These individuals might not like doing math any more than before, but they might find themselves able to do it more successfully.

Does Language Trigger Visual Memories? – Part 1

One of the fundamental questions in cognitive science is how information is stored in the brain and in the mind. There are innumerable different models, each with its own strengths and weaknesses, but the one that I will be addressing here is known as embodiment.
From a neurolinguistic perspective, embodiment is the idea that the semantic content belonging to words is linked to sensorimotor representations in the brain. So if you talk about doing something, your brain behaves (to some extent) as if you are actually doing it.
In a view known as “weak embodiment”, linguistic and conceptual representations overlap in certain sensorimotor areas. In the “strong embodiment” view, linguistic and conceptual representations are much more strongly linked, with some researchers suggesting that action words actually trigger the neuronal assemblies that are associated with those actions.
A recent study sought to discover whether either of these two views could be supported by neuroimaging. The researchers played sentences describing dynamic (“the mechanic is walking toward the airplane”) or static (“the mechanic is looking at the airplane”) situations and measured the brain response. They looked in particular at visual cortex area V5 — which is known to respond to visual motion perception — as well as at other temporal areas of the brain.
They suggested that, if the strong embodiment view holds, dynamic descriptions should trigger activity in areas that are very strongly linked with visual motion perception, like V5. If the weak embodiment holds, they said, it was more likely that these descriptions would not trigger V5, but would activate other areas related to motion perception (presumably ones that are less strongly linked to visual cues).
After using localizer stimuli to determine the location of each participant’s V5 area, the authors analyzed the effect of static and dynamic language stimuli on this area. They found activation in the left posterior middle and superior temporal gyri, near V5, but crucially, no overlap between language-activated areas and V5 proper, supporting the weak embodiment theory of language.
So what does this all mean? The authors point out that the sensorimotor representations triggered by language are not as specific as the strong embodiment view suggests. In this case, language triggered activation in areas that are more generally linked, amodally, to motion. These parts of the cortex seem to be activated by amodal information related to motion, such as animacy or intention. This has significant implications for theories of embodiment, as strong claims of embodiment must now explain the finding that “once-removed”, schematic areas are activated by language, instead of modality-specific representations (as far as I am aware, there hasn’t been a response to this article from proponents of the strong embodiment view, but I’m sure it’s coming).

Finally, it’s worth pointing out that this may have some relevance for theories on the relationship between language and non-linguistic cognition. Visual sensorimotor representations, being non-linguistic, seem to be triggered, though somewhat indirectly, by language. Exactly how this works could also be important in understanding the other ways in which language and memory interact.

Does Language Trigger Visual Memories? – Part 2

I recently wrote an article about the connection between language and visual memories in which the authors of the study concluded that the strong version of embodied cognition was not supported. Another recent article about embodiment came to a very different conclusion, which I thought I’d discuss further.
This study used a different kind of methodology in which subjects were asked to remember a series of words while performing an interference task. The words belonged to one of two groups: arm-related words, like grasp, braid, nip, wash, hack, and delve; or leg-related words, such as stride, plod, skate, inch, and dance.
There were four different interference conditions: a control condition, where there was no interference activity; an articulation condition, in which subjects had to repeat a nonsense syllable; an arm interference condition, in which subjects tapped a single paradiddle (e.g. LRLLRLRRLRLLRLRR) with their hands; and a foot interference condition in which they tapped the paradiddle with their feet. After the subjects were presented with four words, there was a six-second interval in which they were asked to perform the interference task, subsequent to which they were asked to repeat the four words they were shown.
Based on a weak embodiment view, like the one I detailed in my last post on this topic, verbal memory for both categories of words should have either been affected equally by both tasks or not affected at all, depending on exactly which theory you subscribe to. However, a strong embodiment view predicts that memory for arm-related words would be more affected by arm motions, and memory for leg-related words by leg motions.
Although the overall disruption did not reach statistical significance, there was a trend toward significance in the direction predicted by the strong embodiment view. More importantly, there was a significant interaction of word type with moving body part, providing further support for the idea that semantic memory and sensorimotor representations are closely linked in the brain and the mind.
This study does provide some compelling evidence for strong embodiment, but it also raises a lot of questions. Is it possible that there’s some other process at work here? The authors mention several times that this particular methodology has been used to draw causal conclusions in the past, and so they feel confident in doing the same, but it’s possible to draw other conclusions from this data. Another question that deserves answering is whether the choice of specific arm- and leg-related words modulates the effect. Many of the leg words, for example, actually refer to whole-body motions that are driven by leg motions.

And even if we take this study to conclusively prove that sensorimotor representations are required for working memory, what about long-term memory? And how might they modulate attention or other cognitive actions? This is an area that is going to receive a lot of attention in the near future, and I have high hopes for some very interesting results.

Is Thinking Bad For Your Brain?

Basic scientific research, old wives’ tales, and common sense all suggest that the best way to promote brain function is to keep your mind active. Intriguingly, however, a recent report from Elsa Suberbielle and colleagues, published in the journal Nature Neuroscience, seems to suggest just the opposite.
The DNA double helix that encodes the human genome comprises approximately 3 billion base pairs that dictate all of our characteristics, ranging from our eye color to our predisposition to heart disease. Disruption of proper base pairing, including mutations, insertions, and deletions, may lead to a variety of changes in our makeup. Although some of these base pair disruptions are more deleterious than others, double-stranded breaks (DSBs) in the DNA double helix remain among the most lethal.
Recent evidence indicates that normal exploration of a new environment causes significant increases in DNA DSBs in mice. In these studies, mice were moved from their home cage to a new, larger cage containing different litter, odors, stations, and toys. They were allowed to explore the new cage for two hours with familiar mice from their home cage. Interestingly, many of the documented breaks in DNA were found in the brain region referred to as the dentate gyrus, which is critical for learning and memory.
While at first glance these data seem to suggest that “normal” thinking is bad for us, commentary from Herrup and colleagues addresses this issue. They report that while the data are scientifically sound, they may need to be viewed from a different perspective. For example, they suggest that the assays used to measure DNA DSBs may in fact be causing the reported damage, an idea that should be further examined. In addition, it is possible that the DNA damage functions as a regulatory mechanism. Perhaps, by allowing some level of DNA damage, a higher degree of neuronal regulation can be achieved. Suberbielle hypothesizes that the formation of DSBs is a natural process that permits the remodeling of DNA and the changes in gene expression that are necessary for learning, memory, and the effective processing of information.

It may be enticing to conclude from this report that thinking is bad for your brain. However, these data should instead serve as a springboard for further studies in this area and into the genetic regulatory mechanisms at work during “normal” brain function.

What You Hear Affects What You See

There are a lot of different models of attention, and the differences between them can be complex and subtle. Most of them, however, treat attention as a limited and expendable resource — you can only pay attention to so many things for so long a time. Is attention really in short supply?
Attention is usually not modality specific: for example, if you’re making a lot of effort paying attention to something that you’re seeing, you’re not likely to be able to allocate attention to an acoustic cue as well. In short, there isn’t a store of visual attention, a separate store of aural attention, another one for tactile attention, and so on. There’s just one central store of attention.
Recent evidence has also led many researchers to believe that rhythms entrain the attentional system, so that it increases the amount of attention allocated at certain temporal locations. For example, if you see a blinking light, neural oscillations will synchronize with the rhythm of the blinking, so that you’re paying more attention at the points when the light is likely to be on.
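One way to picture this is as an internal oscillator that phase-locks to the stimulus rhythm, so that attentional “gain” peaks at the moments a stimulus is expected. The toy sketch below illustrates the idea; the cosine gain function and all parameter values are illustrative assumptions, not taken from any study:

```python
import math

# Toy model of rhythmic attentional entrainment: an internal oscillator
# phase-locked to a light blinking once per second; attentional gain is
# highest at moments in phase with the expected flashes.

period = 1.0  # seconds between flashes (assumed stimulus rhythm)

def attentional_gain(t):
    """Gain peaks when time t coincides with an expected flash."""
    phase = 2 * math.pi * (t % period) / period
    return 0.5 * (1 + math.cos(phase))  # 1.0 in phase, 0.0 in antiphase

on_beat = attentional_gain(3.0)   # target arrives exactly on the rhythm
off_beat = attentional_gain(3.5)  # target arrives between expected flashes

print(on_beat, off_beat)  # gain is maximal on the beat, minimal off it
```

In a model like this, a target that appears on the beat meets the system at high gain and is processed faster, which is exactly the pattern the study below set out to test across modalities.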
A study published earlier this year used a fascinating methodology to determine whether or not this entrainment is cross-modal. Participants heard a tone played at regular or irregular intervals for a specified amount of time. At the end, a dot would appear in one of the four corners of a screen (the dot appeared either in sync with the final tone in the series, earlier than the tone, or later than the tone), and the participants would look at it. The researchers measured how long it took the participants to fixate on the dot.
Interestingly, participants were significantly faster to fixate on the dot when it was synchronized with the final tone than when it was not, suggesting that the visual attentional system was entrained by the aural tone series. When the experimenters omitted the final tone, the results remained the same, showing that it wasn’t the final tone itself that sped up fixation, but the rhythm that preceded it.
Another important note is that participants weren’t directed to attend to the auditory tones. In fact, they weren’t told anything about them at all, suggesting that the entrainment of the attentional system is automatic and unconscious.
Although they may seem intuitively obvious, these findings lend additional insight into how attention works. They give major support to the idea that attention is a limited resource shared between different perceptual modalities, and they provide evidence that entrainment developed through one modality is accessed by other modalities.

Research on neural oscillation has been quite fruitful recently, and this is another example of how this is at the core of processes that we take for granted, like rhythmic attentional entrainment and many other temporal processes in the brain. Exactly how this low-level process is integrated into higher-level systems, like time-keeping and attention, is likely to see a lot more research in the near future.