Science

What happens in Vegas...

There’s nothing more romantic than spending Valentine’s Day in Las Vegas by yourself. Overdosed on glitz and relentlessly tempted by the ephemeral promises of pleasure, luxury and instant winnings, I seek solace in what is hopefully eternal: true love.

As a well-trained psychologist, I find comfort in defining a concept as slippery as true love in terms of behavior. And as a happily married woman, I gravitate towards monogamy as a hallmark of everlasting affection. Humans, as a model species, however, are annoyingly unpredictable and complicated. And while monogamy is the exception rather than the rule in the animal kingdom, there are a few choice species that have provided scientists with a wealth of data and bad jokes.

Take houseflies, for instance. Once the common housefly falls in love and gives up her virginity, she generally doesn’t stray. But this fidelity is not entirely a personal choice: there is something funny in the seminal fluid that males inject during mating that kills her libido. Furthermore, her mate does not share her level of commitment and has no problem seeking out other females. This one-sided romance is not particularly inspiring.

Birds, in contrast, are notoriously monogamous, with 90% of species demonstrating pair-bonding that is mutual and lasts anywhere from one breeding season to a lifetime. But the avian love nest is not without drama: baby birds are not always the direct genetic descendants of both parents. Females are known to ‘dump’ eggs into the nests of other birds, spreading the parenting costs around the neighborhood. And a few more recent genetic studies of bird families have unveiled the promiscuity of one or both parents.

But romantics need not lose heart, for inspiration can be found in the most famous loving mammal: the common prairie vole. Given that only about 3% of mammal species are monogamous, the prairie vole is in a class all by himself. The mating behavior of these cute little buggers has fascinated scientists for decades and has led to the discovery of hormones and genes involved in long-term attachment. Validating ‘love at first sight’, bonding prairie voles activate, on their very first date, the brain pathways served by the hormone oxytocin, which decreases stress and increases positive social behavior. Oxytocin helps prairie voles, and humans for that matter, remember positive social interactions. It’s the same hormone that bonds a baby to its mother. In females, oxytocin rules the day. Males, however, also rely on a second molecule called vasopressin, a close relative of oxytocin that is associated with territoriality and aggression. If things go well, the first date ends in a 24-hour marathon of love-making, after which the two voles get the same high from each other as addicts do from cocaine. Give a vole cocaine, and dopamine, a powerful and far-reaching neurotransmitter, floods the nucleus accumbens. Give a mated vole his girlfriend, and dopamine floods the nucleus accumbens.



Prairie voles are not perfect, however. Despite the fact that both members of the pair share in parental duties and show many affectionate behaviors, such as grooming and spooning, they are not always sexually faithful. What happens in Vegas, stays in Vegas, and in the grassy meadows surrounding the strip*. Like their mousy counterparts, human males carry different variants of the gene for a vasopressin receptor, and differences in this gene can account for some of the variability in pair-bonding behavior.

Of course, we are more than just the products of our genes, and Valentine’s Day is about romance and promise, not cheating and losing. If nothing else, Vegas has taught me the power of celebrating wins and glossing over losses. Thankfully, a recent fMRI study comes to the rescue. According to Bianca Acevedo, Lucy Brown, Helen Fisher and other colleagues at Rutgers, long-term love can be just as rewarding as the initial romantic hit of a successful first date. But in addition to activating the nucleus accumbens and other dopamine-rich regions, pictures of the object of one’s affection also stimulate regions involved in long-term attachment, such as the globus pallidus, thalamus, and anterior cingulate cortex. Just like the bright, noisy slot machines that keep the casinos in business, fMRI studies tempt us to draw conclusions that are not supported by the data. But today, surely the romantic notion that a pair bonded for life can keep the passion fresh and strong is worth considering. Eight years and counting: happy Valentine’s Day, Adam.


*poetic license: it’s a desert out there.

Here be Dragons

Despite being an opera singer, I’m not particularly superstitious. Unless, of course, you include the Chinese Zodiac in your definition of the superstitious. You see, I happen to belong to the most auspicious category, represented by the only legendary creature in the list: the almighty Dragon. Surely you don’t blame me, then, for holding onto the belief that my personality is best characterized by a giant, fire-breathing, winged beast*? I mean, really, can anyone argue that the following description doesn’t fit me to a tittle**?



Occupying the 5th position in the Chinese Zodiac, the Dragon is the mightiest of the signs. Dragons symbolize such character traits as dominance and ambition. Dragons prefer to live by their own rules and if left on their own, are usually successful. They’re driven, unafraid of challenges, and willing to take risks. They’re passionate in all they do and they do things in grand fashion. Unfortunately, this passion and enthusiasm can leave Dragons feeling exhausted and interestingly, unfulfilled. While Dragons frequently help others, rarely will they ask for help. Others are attracted to Dragons, especially their colorful personalities, but deep down, Dragons prefer to be alone. Perhaps that is because they’re most successful when working alone. Their preference to be alone can come across as arrogance or conceitedness, but these qualities aren’t applicable. Dragons have tempers that can flare fast!

(excerpted from http://www.chinesezodiac.com/dragon.php)


Just reading that description gives my self-esteem a (clearly superfluous) boost. Being a scientist, however, I can’t quite commit to the belief whole-heartedly. And given that we’re about to enter another year of the Dragon on Monday, it’s only fair to look at the evidence. Is there any compelling proof that the Chinese zodiac predictions are worth considering?

First, the caveat. Being a skeptical person by nature, and a psychologist by training, my working hypothesis is that the signs appeal to the vast majority of people because the traits associated with a given sign include virtues of personality that we all share, or revere, alongside their equally-universal vices. Take the dragon, for example: in general, most people aspire to success and are passionate about, well, something. Most people prefer to live by their own rules and take solace in the notion that because they have to follow someone else’s rules to some extent (either at work, home or play), they are not as successful as they could be. And sure, if you’re ambitious, driven, self-motivated and prefer to be alone, you are likely to be perceived as arrogant and you certainly will feel exhausted at times. Going through the list of personality traits on the Chinese Zodiac page of Wikipedia, I find myself represented to some extent by each one of the signs. That is, of course, if I indulge in my natural tendency towards searching for confirming rather than disconfirming evidence, which psychologists call the confirmation bias. If I assess the extent to which these descriptions fit my character, rather than the extent to which they miss key components of my personality, I can become quite convinced.

But enough speculation. What do the data show? Oddly enough, there aren’t that many studies of the effects of the Chinese Zodiac on PubMed. But those brave scientists who have published such studies have made some pretty fascinating discoveries. Given the importance and unpredictability of childbirth, it comes as no surprise that the Chinese Zodiac is often used to gauge whether a woman will become pregnant in a given year, and what the sex of the child will be. To test whether the zodiac does indeed correlate with its own predictions, Jungmin Lee and Myungho Park investigated sex preferences and fertility in South Korea in the Year of the Horse and published their results in 2006. The horse is associated with masculinity and, in South Korea at least, the year is considered inauspicious for girls, as they are thought to suffer unhappiness and misfortune. Certainly, these predictions have as much to do with the society in which these girls are born as they have to do with the moon: as the authors point out, ‘in patriarchal and Confucian societies, women are expected to be subservient to men’. (Pardon me while I expel some smoke via my nostrils). Is there any evidence that the Year of the Horse correlates with a higher birth rate of boys? Is it such a strong force that women might avoid getting pregnant and show a decrease in fertility? The authors seem to think so.
[Figure: fertility in Years of the Horse]
I must admit that those years don’t strike me as significant outliers. But what about the year of the Dragon? In many Asian cultures, the dragon is considered (ahem) the best sign (though not so much for the ladies). In Hong Kong, birth rates peaked in the 1980s and then started to decline, even though the numbers of married women of child-bearing age continued to increase. Every 12 years, however, a blip in births was observed, coinciding with the year of the Dragon. In 2002, Yip, Lee and Cheung published a study of birth rates in Hong Kong in the journal Social Science and Medicine. These data seem more convincing to me, especially because Taiwan and Singapore both saw large increases in birth rates in the two previous Dragon years.

[Figure: Dragon-year birth blips in Hong Kong]

But how can we assess whether these effects are mainly driven by human behavior or by the orbit of the moon? Luckily, another study was recently published, which assessed the accuracy of predictions by the Chinese Lunar Calendar on 2.8 million Swedish births between 1973 and 2006. Such a huge database is pretty compelling and I’ll let the authors speak for themselves: ‘We conclude that the CLC method is no better at predicting the sex of a baby than tossing a coin and advise against painting the nursery based on this method's result.’ There you have it. Once again, you’ll see it if you believe it. That is, belief in the zodiac will alter your behavior, such that it becomes a self-fulfilling prophecy. Off I go then, to follow my ambitions and take risks. Happy Chinese New Year and best of luck in 2012!
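For the statistically minded, here is roughly what ‘no better than tossing a coin’ cashes out to: with a sample that large, even a tiny edge over 50% accuracy would show up as a significant deviation from chance, and none did. A minimal sketch, with invented counts rather than the study’s actual data:

```python
# Sketch: is a sex-prediction method better than a coin toss?
# The counts below are invented; the Swedish study analyzed ~2.8 million births.
from math import sqrt

n_predictions = 1_000_000
n_correct = 500_400              # hypothetical: 50.04% accuracy

p_hat = n_correct / n_predictions
# Normal approximation to the binomial under the null hypothesis p = 0.5
z = (p_hat - 0.5) / sqrt(0.25 / n_predictions)

print(f"accuracy = {p_hat:.4f}, z = {z:.2f}")
# |z| < 1.96: we cannot reject 'it's just a coin toss' at the 5% level
```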


*Some of you might (accurately but pedantically) point out that only the European version has wings: his Asian counterpart is more snake-like. Well, it’s my superstition so I get to imagine it just the way I please, thankyouverymuch.

**not Snoop Dogg speak. Actual derivation of the term ‘to a T’.


The Best Invention of 2011

It’s been a while since my last blog post and I must confess that with each passing day, my standards for the quality of the next post have grown higher and higher. Worried that these standards were reaching insurmountable levels, I began to panic today. As the universe registered my anguish, my silent prayers were answered with a bang by the unlikeliest of sources: Time magazine.

I was on a break from a job when I picked up a copy of the magazine, left open by a colleague. Touting the best inventions of 2011, the magazine brought to my attention what must, indeed, be among the best inventions ever. Let me present to you, The Necomimi.



Admittedly, I’m way behind the curve, since this video has already had over 2 million views. But let me highlight some of the great features of this extraordinary device. It is produced by the Japanese company Neurowear, whose website alone provides a significant amount of entertainment. They have developed a business model that aims to make wearing one’s heart on one’s sleeve literal. The term “Necomimi” was constructed by joining the Japanese words for cat and ear. Via a sensor placed on the forehead, the ears react to electrical signals that purportedly come from the brain. Simply by concentrating or relaxing, the wearer can manipulate the ears, mimicking gestures that cat lovers recognize as demonstrating alertness or comfort, respectively. But don’t listen to me, let the company speak for itself:

We created new human's organs that use brain wave sensor.「necomimi」is the new communication tool that augments human's body and ability.This cat's ear shaped machine utilizes brain waves and express your condition before you start talking.

2011 was, clearly, a banner year for technology. But lest you think that 2012 will never be able to top the Necomimi, allow me to reveal the Best Invention of 2012 So Far: The Baby Formula Banana Smoothie. This one was invented by my ever-resourceful husband. Have some leftover baby formula from visiting infants over the holidays? Just completed a 10K run across the Golden Gate Bridge? In a blender, start with some ripe bananas, add the leftover orange juice from the New Year’s Day Mimosa brunch, mix in yogurt and honey and top it all off with a healthy serving of baby formula powder. It’s almost as good as the Necomimi.

Now that I’ve broken the dry period, I promise to return to my regular blogging style, commenting on topics at the intersection of art and science, next week. Until then, I hope that your creative juices have gotten a boost, and that you will find yourself thinking about other fashionable ways in which bodily functions might be harnessed.



Shifting Gears: When your Brain needs to Pump the Clutch

December is the busiest month of the year for classical singers. It’s the peak of audition season so traveling home for the holidays is squeezed in between trips to New York or other cities for auditions. Holiday parties put friends and supporters in a festive mood, while singers juggle gigs and rehearsals and fundraising drives. Like everyone else, we pump up our productivity in this last month to squeeze a few more ounces of achievement out of the dying year. To top it all off, we actively work to avoid the illnesses that result from travel, spending more time indoors with incubating viruses and the stress of the most wonderful time of the year. If we get sick, we can’t sing, and if we can’t sing, we don’t get paid. So forgive us if we seem a little reluctant to hug and kiss each and every person at the office party.

During this month, singers aren’t the only ones mastering the juggling act. Almost everyone finds themselves multi-tasking at work and at home. By New Year’s Eve, many of us feel a sense of accomplishment that comes with putting the year to bed: we’ve navigated the holiday parties and dinners, we’ve completed our gift-giving and philanthropy, we’ve finished the projects that had to be done before the turn of the year. The busyness of December pays off with one last burst of productivity.


A cautionary tale.


But there’s ample evidence, now, that multi-tasking leads to less, not more, productivity. In fact, the very idea that we can do two things at once is a myth. What we’re actually doing is switching between tasks, and every time we make a switch we pay a price. What makes multi-tasking hard? I feel the challenge most acutely when I’m trying to write something creative and/or novel and my husband interrupts me with a question or comment. My first reaction is emotional: I feel irritated. Then, I quickly realize that the idea that I was just about to make concrete has returned to its amorphous, mushy state. I refocus my attention on the question he posed, respond and shift my focus back to the last thing that I wrote. Cognitive psychologists have studied this process of ‘task-switching’ for several decades now and the ‘cost’ of switching is real in terms of response times and accuracy: we are often slower and less accurate on switch trials than on repeated trials of the same task. One would think, though, in this age of multi-tasking, that, like any other cognitive skill, practice leads to increases in efficiency.

Not so, demonstrates a study from Anthony Wagner’s lab, published in 2009. In this experiment, the Stanford scientists categorized their participants by how much multi-tasking across various media devices they habitually engaged in. Heavy media multi-taskers performed worse on a task-switching test because they were more easily distracted by irrelevant information. But as the study authors point out, there remains the question of what comes first: are heavy multi-taskers simply more distractible by nature, and thus less able to focus on one thing at a time? Or does heavy multi-tasking lead to deficits in the ability to filter out irrelevant information? Regardless of the direction of causality, one thing remains clear: multi-tasking is a hard habit to break, but one that is going to become increasingly prevalent unless we learn to manage the addiction. And it’s worth the effort required to remain focused.
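To make that ‘cost of switching’ concrete, here is a minimal sketch of how a switch cost is typically computed from trial data. The trial structure and response times below are invented for illustration; they are not taken from the Stanford study.

```python
# Minimal sketch of how a task-switching 'cost' is computed.
# Trial structure and response times are invented for illustration.

trials = [
    # (current_task, previous_task, response_time_ms, correct)
    ("color", "color", 520, True),   # repeat trial
    ("shape", "color", 710, True),   # switch trial
    ("shape", "shape", 540, True),   # repeat trial
    ("color", "shape", 690, True),   # switch trial
    ("color", "color", 530, True),   # repeat trial
]

def mean(values):
    return sum(values) / len(values)

repeat_rts = [rt for task, prev, rt, ok in trials if ok and task == prev]
switch_rts = [rt for task, prev, rt, ok in trials if ok and task != prev]

# The switch cost is the extra time needed when the task changes.
print(f"switch cost: {mean(switch_rts) - mean(repeat_rts):.0f} ms")
```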

My dear friend Karin Foerde, who is now a post-doctoral fellow at Columbia University, ran a brilliant study while we were grad students together at UCLA, demonstrating that multi-tasking interferes with our ability to remember the specifics of what we were doing: instead, multi-tasking favors habit learning, which is less flexible and harder to unlearn. When you focus on one task at a time, your declarative memory system, serviced by the medial temporal lobe, is running the show. When your attention is diverted to a secondary task, the habit-learning system, driven by the basal ganglia, takes center stage. Multi-tasking creates short blackouts as you switch from one task to another, inhibiting your ability to consciously remember what it was that you were doing. This year, I will be leaving my iPhone at home when I’m off to the holiday parties. Maybe that way I’ll remember just how good those Christmas cookies tasted, instead of simply wondering how those extra holiday pounds appeared on my bathroom scale.


Money can't buy you Creativity, or can it?

We’ve just passed the half-way point of National Novel Writing Month (NaNoWriMo in hipster-speak) and I can definitely relate to this doggie:

[Image: Face To Face With The 2nd Step by Richard Stine]

I’ve got what I think is a genius idea for a plot, some genuine, interesting characters and just over 12,000 words. Sometimes my writing voice still feels as though it’s coated with phlegm but I am making progress. I’m still way behind on the word count, however: I should be somewhere in the 25,000-30,000 range right about now. NaNoWriMo is designed in part to help writers (professionals and avocationals alike) develop the discipline and find the motivation to churn out a first draft. After all, the myth that creativity only happens during fleeting and involuntary moments of inspiration is among the first to be debunked by prolific creative writers.

I only write when inspiration strikes. Fortunately, it strikes every morning at nine o’clock sharp. ~ W. Somerset Maugham


If it’s such a struggle, you might ask, then why on earth are you doing it? Well, the paradoxical truth is that I want to. The exhilaration that I feel when I’ve written something that might be good is intoxicating. It’s similar to how I feel onstage, firing on all cylinders. My motivation is intrinsic: that is, it comes from within rather than for some external goal like the pursuit of a degree or an award or financial gain. Don’t get me wrong, I certainly wouldn’t turn down an advance from a publisher and of course there is a part of me that hopes the work will be published and bring accolades and royalties. But I don’t expect to regret the time and energy spent on the project even if it never generates any income. And so I’ve been wondering, in those moments when the temptation to tweet, or check email, or book a weekend in Mexico shatters my concentration, how strong or pure my intrinsic motivation will prove to be and how the NaNoWriMo artificial deadline affects creativity.

The goal of NaNoWriMo participants is to write 50,000 words in 30 days. Some writers simply see it as an opportunity to check off an item on their Bucket List, others feel that the community atmosphere will help them stay motivated and complete a project that has been on the back burner for too long. Some hope to publish the work as is, others will refuse to allow anyone else to read a single word. My (only) writing buddy Gord McLeod sees it as an opportunity to build the block of marble from which he will carve his David over the course of the following months. This smorgasbord of motivations and goals reminds me of the work of Teresa Amabile, who studies the effects of motivation on creative output and is on faculty at Harvard Business School. Having studied and thought about the interplay between intrinsic and extrinsic motivation and creativity for decades, she will be the first to admit that the relationships are complex.

In a recent review simply titled Creativity, published in the Annual Review of Psychology, Teresa, along with Beth Hennessey at Wellesley College, sums up the latest research findings by suggesting that when people feel controlled by their situation, as is the case in many workplace environments, rewards for creativity undermine intrinsic motivation and paradoxically, suppress creative output. For example, if employees are asked to create posters and are told that the person who comes up with the best one will get a monetary bonus, research suggests that the final products will be less creative than if the bonus comes as a surprise, rather than an expectation. When intrinsic motivation is already strong, however, rewards can further enhance it and lead to more creative output. Specifically, when rewards confirm competence or provide support in the form of a manager’s kind words or extra resources, creativity flourishes.

So the bottom line seems to be that intrinsic motivation is necessary, but not sufficient, for creative output. And extrinsic rewards can be helpful, so long as they don’t destroy the sense that we are being creative just because we want to. Those NaNoWriMo guys are on to something, but as Beverly Sills has said, there are no shortcuts to any place worth going. And that’s enough procrastination for Day 18.

Effortless Mastery

Watching a great athlete, performer or surgeon at work is mesmerizing. When a highly complex skill looks effortless, we tend to think of the performer as otherworldly, rather than seeing the performance as the result of years of dedicated, mundane training. Effortless mastery, the hallmark of the world-class athlete, surgeon or other performer, appears only after countless hours of what’s now widely called deliberate practice. A large-scale study by Ericsson and colleagues published in 1993 transformed the way that scientists think about the relationship between talent, effort and mastery. It turns out that across domains, expert performance is directly correlated with the amount of deliberate practice that a person has engaged in, not simply the number of hours spent doing activities related to the domain. Some studies have even tried to quantify the number of hours or years required, and, in general, the magic number seems to be 10,000 hours or 10 years.



According to the Nielsen media rating company, the average American watches 4.5 hours of TV per day. If that average American practiced, deliberately, as often as he/she watched TV, effortless mastery would be within reach in a little over six years. The idea that most Americans simply do not have time to master a domain must be false: what’s missing is effort, motivation and an understanding of how learning works.
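The arithmetic behind that claim is easy to check; here is a back-of-the-envelope sketch (the 4.5-hour figure is Nielsen’s, and the 10,000-hour benchmark is the one cited above):

```python
# Back-of-the-envelope: how long until 10,000 hours of practice,
# if the average American's TV time were spent practicing instead?
tv_hours_per_day = 4.5      # Nielsen's figure for the average American
target_hours = 10_000       # the oft-cited deliberate-practice benchmark

years = target_hours / (tv_hours_per_day * 365)
print(f"{years:.1f} years")  # ~6.1 years
```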

Unlike watching television, deliberate practice is hard: it requires sustained attention and constant adaptability. Rote repetition of an activity simply ingrains habits, not all of which are good. Mastery requires thoughtful practice with feedback, and change is incremental. Deliberate practice exhausts the muscles and the brain, and for most people, a four-hour practice session feels like a marathon. The way in which a person approaches training is inherently linked to personality and individual differences, making many generalizations uninformative. But are there some general principles that can cut across individual differences?

The first published evidence that called into question the popular notion that more practice of any kind inevitably leads to mastery came from studies of Morse Code operators at the turn of the 20th century (Bryan and Harter, 1897, 1899). The operators in these studies would show improvements in their skills with repeated practice but eventually, their progress would plateau and further practice would yield no more gains. By changing their practice techniques, however, the operators were able to jumpstart their learning and continue to improve. Are these plateaus an unavoidable consequence of learning? Using the same paradigm, that is, learning Morse Code, Keller (1958) showed that the training method itself can be designed to avoid plateaus and show steady learning.

What are the characteristics of the training method, and are these characteristics applicable to domains other than learning Morse Code? Perhaps the most important, and certainly the most commonly discussed, attribute of deliberate practice is captured by the name itself: deliberate. The subject must want to improve and must be focused upon doing so; attention must be paid and effort exerted. The other two factors outlined in the seminal paper on deliberate practice by Ericsson and colleagues are that the instructions for how to perform the task at hand be understandable and take into account the subject’s previous knowledge, and that the subject receive feedback during practice that helps him/her adjust performance in the right direction. Then, the subject should repeatedly perform the task, adjusting when necessary and always focusing on the process.

Since mastery of a skill in a field requires on average 10,000 hours of deliberate practice, those individuals who take pleasure in practicing or who can enjoy the process are much more likely to put in the requisite hours, and pay more attention to what they are doing. Why do some people like practicing while others loathe it? The exercises that one chooses when practicing can vary in terms of the enjoyment they provide, but even rote repetition can be more or less interesting to different individuals. Ticking off repetitions can be experienced as serial micro-triumphs, or as the epitome of monotony. The mind can be engaged to different extents: the practicer can concentrate on each repetition, comparing it to previous instances, monitoring performance and observing the evoked sensations, or he/she can simply daydream and ‘check out’. Despite our innate tendency to resort to daydreaming in response to boredom, scientists have recently discovered that daydreaming or zoning out can actually lower your mood, rather than lift it. If you allow yourself to daydream during a practice session, it might, paradoxically, be less enjoyable than if you make the effort to concentrate, and battle the temptation of zoning out.

I’ve been thinking about motivation and concentration this past week because I’m participating in National Novel Writing Month - with the goal of cranking out a 50,000 word novel in 30 days. For the first time in my life, I’m focusing on quantity rather than quality in an artistic pursuit. So far, I’ve written about 8,700 words, and I have no idea what use this exercise will prove to be. But the first step towards mastery is deliberate practice, and if nothing else, I’ll have resisted the temptation to procrastinate for at least several hours every day for 30 days in a row. Now that’s one habit worth developing.



Testing the Water in Lake Wobegon

We all believe that we are special. At least here in the US, we are trained to think that we can do anything we set our minds to; that we have a supreme talent at something and the trick is simply to find out what that might be. A short survey of the world’s population and a basic understanding of statistics lead to a very different conclusion. Nevertheless, folk wisdom and even some recent psychological studies contend that believing in your potential and doggedly pursuing your passion will eventually lead to something good.



These days, even corporations have a hard time admitting that they are probably just average in terms of their global standing. Evidence of this reluctance to face the truth has hit the headlines and provoked the ire of a nation as CEO pay has continued to rise despite dismal company performance. The excuse? No company wants to admit that its CEO is simply average, and so boards vote to pay their leaders above the market rate. Even the recent economic bubbles and the resultant crises are in part a function of overconfidence.

Psychologists, among others, have known for decades that we often over-estimate our own capabilities. The ‘Lake Wobegon Effect’ permeates all sorts of skills and domains. In 1999, Justin Kruger and David Dunning from Cornell University published a study in the Journal of Personality and Social Psychology showing that in tasks ranging from humor to logical reasoning, the worst performers over-estimated their own abilities the most. Subjects in their study who actually performed above average were more accurate in their assessments of their own performance. The authors suggested that the poor performers simply didn’t understand how hard the tasks were: once they were trained such that their skill level improved, they also became more accurate in evaluating it. Justin Kruger then went on to show that when a task is really hard, skilled performers underestimate their own performance, because they do not take the comparison group into account. He concludes that people are just bad at comparing themselves with others.

In 2007, another study published in the Journal of Personality and Social Psychology showed that whether poor or skilled performers are the worst offenders in terms of comparing themselves with others depends largely on how easy or hard a task seems to be. When a task feels easy, poor performers overestimate their competence, and good performers accurately assess themselves as performing above average. When a task feels difficult, good performers underestimate their performance and poor performers accurately admit that they are below average.
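The measure behind these claims is straightforward: each subject estimates the percentile of his or her own performance, and the researchers compare that guess to the subject’s actual percentile. A toy sketch of that comparison, with entirely invented numbers:

```python
# Toy illustration of the Lake Wobegon / Dunning-Kruger pattern.
# Percentiles below are invented for illustration only.
people = [
    # (actual_percentile, self_estimated_percentile)
    (10, 60),   # poor performer, large overestimate
    (30, 55),
    (55, 60),
    (75, 70),
    (90, 75),   # strong performer, slight underestimate
]

for actual, estimate in people:
    gap = estimate - actual  # positive = overestimation
    print(f"actual {actual:>2}th pct, self-estimate {estimate}th pct, gap {gap:+d}")
```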

So what’s the deal? Now that we know this effect exists, why don’t we simply re-calibrate our estimates? Before you plunge into an ‘I’m just an insignificant speck in the universe’ depression, consider this: for some reason, our brains have evolved to make us more likely to remember the good times than the bad times, to pay attention to good news more than bad news, to see ourselves in a more positive light rather than to face the fact that there are many people who are more skilled or more beautiful or more powerful. Eighty percent of us behave like optimists; and we’re the ones whose genes have survived.

When Conviction Becomes Confabulation

Ever since I started working on Miracle Detectives, the TV show that I co-hosted on the OWN network, I’ve been fascinated by circumstances in which a person maintains a particular belief in the face of contradictory evidence. My goal on the show was never to shake someone’s faith but rather to explore mysterious phenomena with the tools that science has to offer. Most people could incorporate whatever new information I was able to glean into their existing worldview, but once in a while, I met someone who stuck to a certain belief regardless of the evidence for or against it. That sort of conviction is considered noble in many situations: we value loyalty in friends and employees, we admire religious fervor, we encourage determination in the pursuit of wildly ambitious dreams and we set instincts on a higher pedestal than data in many business decisions.

[Image: Once Upon a Time, (c) Dragoncrafted.com]


But when does conviction turn into confabulation? Confabulation is a term that neurologists coined to describe a disorder of memory in which a patient gives false or contrived answers to questions about the past, but believes that these answers reflect the truth. It was first described in patients with Korsakoff’s syndrome, whose memories have been obliterated by thiamine (vitamin B1) deficiency, usually following years of alcohol abuse and/or severe malnutrition. Oliver Sacks poignantly described such cases in ‘The Lost Mariner’ and ‘A Matter of Identity’ in his book The Man who Mistook his Wife for a Hat.*

But while the Korsakoff’s patient represents one extreme, we have all confabulated at one time or another. Young children reporting their own memories often confabulate, as they learn how to distinguish remembering from fantasizing. We often indulge in impulse buying, justifying our purchases after the fact with false memories and constructing a narrative that makes sense and leaves our ego intact. We deny our shortcomings; the vast majority of us consider ourselves to be ‘above average’ in most instances. In the extreme cases of anosognosia, or denial of illness often following stroke, some patients will deny that part of their anatomy has been paralyzed, insisting that if they really wanted to, they could move the affected region.

Doubt, particularly in Western society, is interpreted as a sign of weakness. For many people, the term ‘skeptic’ is synonymous with Doubting Thomas, someone who refuses to believe unless shown direct evidence, and being called a skeptic is seen as pejorative amongst many social groups. Confabulation involves a lack of doubt, about something that is inherently doubtful: one’s memory for the past. But confabulation also involves a skill that humans have perfected: story-telling.

Some evolutionary psychologists argue that our propensity for creating, telling and remembering stories emerged about the same time that the proliferation of our neocortex began to differentiate us from other primates, and that this metabolically-expensive leap in brain size was driven by the need to communicate with each other and navigate social relationships. As our ancestors found strength in numbers, understanding and predicting the behavior of fellow co-habitants afforded a certain advantage, genetically-speaking. Stories might have served as tools for accomplishing this delicate task. But when that storyteller is let loose, and the Doubting Thomas in the brain is silenced, either unintentionally by brain damage or deliberately by conviction, confabulation is the result. Perhaps what fascinates me, then, is the interplay between the interpreter and the doubter in different people: how the relationship between these two components of the mind can result in belief or doubt.

*This book has also been made into an opera with music by Michael Nyman, which I’ve been dying to perform.


A World Without Meaning

The winner of the 2011 Society for Neuroscience Brain Awareness Video Contest is a heart-wrenching animated poem told in the voice of a 10-year old boy whose grandfather has suffered a stroke and developed aphasia, a language impairment. It’s called The Treasure Hunt and it does a wonderful job of humanizing and explaining the condition. Language is such a central part of our minds that it’s very difficult for most of us to imagine a world in which our capacity to talk and express ourselves using speech is stripped away.

[Image: still from The Treasure Hunt]


As the fall settles in, and the holiday season has officially kicked off with Canadian Thanksgiving, opera singers everywhere are performing superstitious rituals with the aim of fending off illnesses. Some rinse their sinuses with saline daily, others add honey to their tea in generous portions; everyone’s hands begin to dry up from the liberal application of sanitizing gels and the first sensation of an itchy throat sends us burrowing under our covers to nap the germs away. For a singer, losing one’s voice means losing work and opportunities to generate future work. Because our productivity and happiness depend on a healthy voice, and getting sick is relatively unpredictable and mysterious, we latch onto home remedies and folk wisdom more readily than the average Joe.

When we do get sick, however, we refrain from talking as much as possible and, for a short while, we glimpse a world in which speech is an inaccessible form of communication. Being theatrical people by nature, we rely more heavily on facial expressions and gestures. Individuals with aphasia are also encouraged to use other forms of communication such as writing, or gesturing or drawing to get their ideas and desires across.

But what if the very meaning of the words is what begins to deteriorate, rather than the ability to form them? What does it feel like to lose concepts? If you no longer know that an eagle is an eagle and a mouse is a mouse, does the world seem full of wonder or mystery? The patients that I’ve been studying at UCSF are suffering from semantic dementia, a progressive degenerative brain disease that slowly erases their conceptual knowledge. A baby learns first that a bird is a type of animal, and then that an eagle is a type of bird: patients with semantic dementia first forget that eagles and hawks are different types of birds, and eventually they can’t distinguish a bird from another animal. Their loss follows the development of language in reverse.

Often, these patients choose to engage in activities that involve complex visual images as their disease progresses: they love working on jigsaw puzzles, playing solitaire on the computer, gardening, and some even begin to paint or sculpt works of art. My goal has been to try to understand the changes in the mind that lead to this paradoxical emergence of visual creativity. I’ve approached this question using the rigorous methods of neuroscience: tracking where patients look when they are viewing pictures or art work or searching for a specific target in a large array, comparing the brain volumes of patients with healthy counterparts and patients with other diseases and correlating these volumes with specific behavior, timing how long it takes them to find a target and how accurately they can perform a difficult visual search task. It turns out that they are faster and more accurate than healthy controls in tasks like ‘Where’s Waldo?’, and the brain regions that correlate with performance on those tasks are the same regions involved in grapheme-color synaesthesia, a condition in which people ‘see’ letters and numbers in color.

When a video like The Treasure Hunt puts the experience of aphasia into a simple and elegant poem, I can’t help but wonder what it must be like to experience the world through the lens of semantic dementia: when things lose their meaning, are they less distracting? Does the world become more vivid and alive? Many of their paintings seem to suggest that it does. And the fact that these patients find new ways of communicating underscores the central role that relationships and social interactions play in our lives. The holidays are designed to strengthen the ties that bind us to friends and family, and as the days get shorter and the nights grow colder, it’s as good a time as any to return those personal calls.


When Time Stands Still

This week, I boldly ventured out of the house, knowing I wouldn’t return until after dark, without a scarf and wearing a mini-skirt with, wait for it, NO LEGGINGS. You might chuckle to yourself, thinking that now I look just like all the tourists who visit San Francisco in the summer only to keep the sweatshirt business flourishing in Fisherman’s Wharf. But I will have the last laugh, as the sun sets and I walk home without a single goosebump: San Francisco’s secret September summer is finally here.

As you can probably tell, I feel a certain pride in my fledgling understanding of San Francisco’s seasons. Dressing for the weather is an art in this city, and I’m convinced that the eclectic fashion of our native hipsters is influenced in large part by the extreme fluctuations in temperature in the different neighborhoods of the city and at different times of the day.

Just as my brain has finally sorted out the regularities in our highly irregular climate, I must admit that I have become a bit of a San Francisco cliché. I no longer fear simultaneously donning two different patterns. I prefer to walk or ride a bike rather than drive. I’m really picky when it comes to coffee. I get a box of vegetables delivered from a local farm every week. I do what I love, relying on the kindness of strangers to pay me for doing it. I have an entire trunk dedicated to my scarf collection. When I drive, I drive a smart car so that I can park in between meters. I run a lot. I blog. I tweet. And I listen to NPR.

And so it was that I found myself moved to tears (while driving my smart car to a random gig) by Maurice Sendak on Fresh Air. He had just lost two dear friends, one a few months ago and another just the other day, and he talked about how he doesn’t fear death itself, but he does fear isolation. He misses his friends and all the other loved ones that he’s lost. So he wrote another book, to help him negotiate these feelings and to explore them further.

[Image: cover of Bumble-Ardy]

Bumble-ardy, the latest from author and illustrator Maurice Sendak, is dark and deeply imaginative, much like his classic works Where the Wild Things Are and In The Night Kitchen. Bumble-ardy is an orphaned pig, who has reached the age of 9 without ever having a birthday party. He tells his Aunt Adeline that he would like to have a party for his ninth birthday, so Aunt Adeline plans a quiet birthday dinner for two. But Bumble-ardy instead decides to throw a large costume party for himself after his aunt leaves for work — and mayhem ensues. When his aunt returns she says, "Okay smarty, you've had your party but never again." Bumble-ardy replies, "I promise, I swear, I won't ever turn 10."



That last line has rattled around my mind all week: a simple rhyme that can be interpreted in so many ways, including as a strong desire to elongate the present moment into eternity and stop the unrelenting march of Time. I have long been fascinated by (and fearful of) the passage of time and the way that our minds track it. Our internal chronometer works in a myriad of ways, each tuned to a different time scale, from milliseconds to years. And just as memory is not an accurate record of the past, our sense of time is distorted by our goals, emotions, past experiences and current environment.

Even if we understand objectively that our memories are reconstructed versions of our experiences, it’s still difficult to reconcile the fact that our experiencing selves are very different from our remembering selves (see Kahneman’s brilliant TED talk here). That separation, however, is obvious to David Eagleman, a 39-year-old neuroscientist in Texas, who launched his human subjects off of a 50-meter high platform to test the notion that time slows down when we fear for our lives (a great New Yorker profile of him can be found here).

As expected, the disheveled subjects reported that their fall seemed to last a long time, and estimated that their own fall was 36% longer than the falls of the other subjects. A portable ‘perceptual chronometer’ was strapped onto each subject’s wrist, to test whether our experiencing selves have better temporal resolution during the fall, when the slowing of time is reportedly experienced. A supersense should have its usefulness, after all, and the military, which funded part of this work, would benefit from harnessing such a skill. Alas, Eagleman and his colleagues found no evidence that our experiencing selves actually perceive time differently during frightening events. It all comes down to how and what we remember.

Happily, however, this finding points Bumble-ardy towards a way to accomplish his desire to stop Time. While we can’t manipulate the passage of time outside of our minds, our memory and imagination are the tools we need to leap forward, jump back and, most importantly, do-over what Time has taken away from us. Crafting rich, varied, and meaningful experiences fills our memory repositories with branches and footholds that trigger and support our reconstruction of Time gone by.


Numb3rs

My lucky number is 27. I don’t know exactly why, but I do distinctly remember deciding, or discovering, that it is an auspicious figure. 25 is not bad either. I’m a grapheme-color synaesthete, which is a fancy way of saying that I see letters and numbers in color. That’s not strictly true, however. I can see that the letters I’m typing now are black. But just as the number 2 is a symbolic representation of two things or has a specific value, it’s also red in my mind: the symbol represents two-ness as much as it represents redness. So you might think that I chose 27 as a favorite number because I like that combination of colors, or because that was the day on which I was born, or some other obvious reason. But I didn’t. I don’t know why I chose it: but it hasn’t let me down yet.

[Image: Lucky Numbers abstract wall art, courtesy of Kristin’s Lucky Tarot]

The topic of synaesthesia will likely come up again in this blog, since it affects the way that I see the world, but today I’ve been thinking about signs and symbols, and how readily we assign meaning to abstract things. I’m preparing to fly to New York to sing at a marathon concert commemorating the 9/11 terrorist attacks, and I’ve been remembering what I consider the strongest episode of the show that I co-hosted on the Oprah Winfrey Network called Miracle Detectives. While we were shooting the episode, I was struck by how many people who had lost loved ones on that tragic day found comfort in the notion that their loved ones were communicating with them from beyond the grave by sending messages. These messages took a variety of forms: some were caused by weather forces (such as a breeze), others took the shape of animals (butterflies and birds in particular), some in the appearance of previously-lost jewelry or coins and many included the seemingly inexplicable appearance of abstract symbols such as a loved one’s lucky number.

I remember one story in particular, shared with us by a mother who lost her 30-something son when the World Trade Center towers fell. Mrs. C. told us that she’d always been a spiritual person, and by that she meant that she had had ‘out-of-body’ experiences in the past and visions or premonitions of things that seemed to happen just as she had imagined them.

Because of the devastation, it took months to find and confirm the remains of the victims of 9/11. Many, of course, were never found, even with available DNA testing. The absence of remains and the uncertainty that followed the attacks left many victims’ families hoping that their loved ones were only temporarily lost. Closure was impossible to find.

Mrs. C. begged for a sign from her son in the days, weeks and months that followed the tragedy. He was a volunteer firefighter, and from eyewitness reports, she learned that he had run into the burning building three times to save the lives of others. How did she come across these reports? Her son used to carry around a red bandana, to keep the sweat out of his eyes, and the eyewitnesses remembered seeing a young man with that red bandana. When Mrs. C. was attending the birth of her first grandchild, years after 9/11, the first nurse to walk into the birthing room was also wearing a red bandana. She interpreted this as a sign that her son was present at the birth.

Mrs. C.’s son also had a lucky number: 19*. It was his lacrosse and ice hockey number, as well as the address of his apartment and the number of his favorite Chanel perfume. When Mrs. C. asked for a sign from her son, she often interpreted the appearance of the number 19 as his response from the other side: she might see it on a receipt, or a road sign, or in any number of ways. Once she saw it, she was comforted by what she perceived as his continued presence in her life and it helped her cope with her immense grief.

This search for signs that Mrs. C. described led me to think about how our brains are predisposed to find connections and create meaning out of the abstract or the unpredictable. This need is especially great when we are in a highly emotional state, or dealing with a loss as devastating, unexpected and seemingly random as the loss of an innocent loved one in a terrorist attack. Our memories are selective: we don’t remember everything that happened to us, because it’s much more useful to have a repository of relevant information than an unmanageable database of minutiae. Often, an event that elicited a strong emotional reaction such as fear, anger or grief is an important one to remember, so that it might be avoided in the future. As a result, our amygdala, the almond-shaped structure that sends messages to our frontal cortex (the part of our brain that makes decisions and evaluates evidence), modulates what we remember depending on our emotional state.

The old adage that seeing is believing has been turned on its head by psychologists in recent years: yes, experiencing something yourself does make you more likely to believe in it, but your beliefs not only affect what you remember but also what you see and how you see it. Two weeks ago, a friend and colleague of mine, David Amodio, published a study in which his lab demonstrated that when people are worried about appearing racially prejudiced, they perceive more racial differences between faces of people of their own race and those of a different race. This perceptual difference occurs very quickly: we can see a trace of it in the brain before the person is conscious of it. A person’s motivation affects his/her perception. If Mrs. C. is motivated to see signs from her beloved son, her brain will be tuned differently than if she did not have that motivation. As a case in point, her husband, who describes himself as ‘not very spiritual’, did not see any signs from his son and felt hurt that he was seemingly ignored. Then one day, in his garage, he sat in his car and begged for a sign: lo and behold, he noticed that a can of paint in front of him had the number 19 on its label. ‘It had been there the whole time,’ he says. ‘I just hadn’t noticed it.’ He’s absolutely right: with the right motivation, he finally perceived the sign that helped him deal with the loss of his son.


*in case you think that I see numbers as mainly red, here is the full palette: 1234567890. Note that zero is white, so it’s there, but you probably can’t see it. And these are web-safe colors, so they are not entirely accurate. The title of this blog post should read: Numb3rs


Interpretation: the Performer's Art

I spent the majority of this past week in Las Vegas, working as a model for Levi’s at Magic, a large fashion industry trade show. I did it mainly because it paid very well, but there were two other reasons that justified this particular use of my time: 1) I wanted to experience the glamour of modeling in a big fashion event and 2) I’d never been to Vegas, and seeing the city for the first time through the eyes of a working model was too poetic to resist. It might surprise you that I’ve lived in California for over a decade now and had never been to Sin City. It’s not that I haven’t had the opportunity: I had this romantic notion that my first trip to Vegas should be as an entertainer rather than a spectator, much like my first trip to Paris had to be with a lover rather than by myself. Vegas bills itself as the ‘Entertainment Capital of the World’ and Paris was built for lovers.

[Image: Las Vegas, Nevada]

 
Playing the role of a model in Vegas certainly has its perks: I was always greeted with smiles and courtesy, I never had to wait for a table at the fancy restaurants or stand in line to get into the clubs, and the trade show was within stumbling distance of my hotel room, which was equipped with every imaginable amenity. But already on the first day, I became keenly aware that despite the sheen of fool’s gold, Vegas can quickly turn dreams into acid. I was simply a vehicle for the jeans that I was demonstrating, easily replaced by any number of other women. Certainly, there are ways in which models can improve the look of a garment, but in the end, the garment is the focus and the designer is the star.
 
This shift away from the interpreter and towards the creative team behind the scenes is also occurring in opera and ballet, and many other performance arts. Composers are popping up like mushrooms after a rainstorm, as the proliferation of self-publishing and recording tools has changed the game. Much as blogging and citizen reporting are enabling individuals with no training in journalism to reach the masses, YouTube, iTunes and composition software are equipping creatives with the ability to create complex music without ever learning to play a single instrument.
 
As in the blogosphere, I suspect that, in time, quality will cut through quantity in music as well as writing, and the true test of a work will be its longevity. But in the meantime, why should a person devote years of his/her life to mastering the art of interpretation rather than focusing on composing, or designing? Why work on skills when the sheer amount and frequency of output is what seems to dictate success?
 
Sometime during my third day in Vegas, I began to notice the acidity in the air: the meanness created by an empire built on losses. In Vegas, you can see, do, taste anything that you can imagine but everything has a cost: the better the quality, the higher the price tag. And the proximity to unaffordable luxury leads to bitterness. I have seen the same cruel disappointment envelop young creatives when the seemingly straight and narrow road to success proves to be deceptively curvy and bumpy. Upon my return from Vegas, I was desperate for a long and focused singing practice session and a workout.
 
I also re-read Uta Hagen’s Respect for Acting to remind myself of how complex and elegant the craft can be and how conscientious training can help the actor generate a more powerful experience for the audience. The same is undoubtedly true for my chosen craft: classically-trained singing. A great performance of a play or opera or other composition gives the audience a therapeutic emotional cleanse in addition to intellectual and sensory stimulation. Aristotle observed that a well-executed play allows the audience to expend pent-up emotions and that catharsis is a rewarding experience.
 
Several studies of the effect of music on the mind have shown that the brain regions involved in rewarding experiences such as eating, sex and taking pleasure-enhancing drugs are active when we are moved by a piece of music. And recently, in a PLoS ONE paper, Heather Chapin and co-authors from a university in Florida demonstrated that when Chopin’s Étude in E major was performed by an undergraduate piano major on a digital piano, the emotion and reward centers in listeners’ brains responded. When the same piece was played on the digital piano using a computerized version that was technically accurate but lacked the expressive quality of the human performance, these areas were much less involved. One might argue that this is the first scientific study to demonstrate what audiences have known since the first cave man beat on a drum: the way in which a piece is performed matters just as much as, if not more than, the piece itself. The performance matters, and the only way to give a great performance is to rehearse and train. The performance itself might not last forever, but as long as composers compose and designers design, the interpreter will have an important role to play.


The Power of the Puppy

This week, some dear friends of mine have entrusted their dog to my care, as they frolic at various county fairs in Iowa. I love dogs, but my career keeps me too busy to have one of my own. And although we share our little plot of land with a cantankerous old cat, my husband and I both yearn for our own furry BFF. So we jumped at the chance to foster Paprika* while our friends savor fried butter and play cornhole.

*To protect the privacy of our canine and its owners, names have been changed.


During the course of the week, I’ve watched Paprika carefully to ensure that she’s happy and comfortable, and have devoured a healthy portion of the seemingly inexhaustible supply of information about puppy training and care, both on the internet and from the local SPCA (I take my fostering responsibilities seriously). But Paprika remains a mystery. I understand that she learns by association, that she’s mainly motivated by food, that she will be destructive only if she’s bored or anxious, and that she likes to know where her pack is even if we’re not paying attention to her. But when she stares at me as though she can see the very depths of my naked soul and read my innermost thoughts, I am unnerved.

Dog-lovers that we are, my husband and I watched a fascinating documentary on NOVA about how humans have bred dogs to pick up on our social signals. Scientist that I am, I looked up the studies that were referenced in Dogs Decoded and learned that whereas non-human primates and domesticated dogs are very good at following a person’s gaze to get information about where a desirable object such as a treat might be, only dogs seem to recognize certain social signals, such as pointing to or tapping on a container, on the first try. Non-human primates will get the point eventually, but it often takes them many trials. Domesticated dogs, the authors argue, have been specially selected to read human social cues, and these skills are in their genes. Even young puppies, who had experienced little human interaction, outperformed their closest ancestors: a pack of wolves that had been reared by humans.

Not only do dogs outperform wolves and great apes, which are much more closely related to us genetically, but they make the same mistakes that human infants do: mistakes that more rational wolves easily avoid, a group of scientists in Germany has found. As most parents have discovered, infants up to ten months old will look for a hidden object in its original hiding place even after they have seen it being moved to a new location. Psychologists call this the perseverative search error and believe that the children are following the cues of the person hiding the object, rather than tracking the information that would lead to the correct answer. Dogs make the same mistake when the person doing the hiding visibly communicates with the dog. When the communicator stands passively next to the dog, and the dog sees the object moved from the first location to the second via invisible string, the dog accurately searches for the object in the new location. Wolves don’t care what you do: they will look for the object where it actually is. Dogs believe that pleasing their humans will lead to more rewards in the long run; wolves have much less faith in the good nature of our species.

Questioning just how ingrained these social skills are in dogs, a group of scientists in Florida published evidence that, under the right rearing conditions, wolves can outperform dogs that had limited human contact, or were found in animal shelters, on tests of picking up on human social cues. As in humans, nature and nurture ultimately interact in the development of complex behavior, which is yet another reason why training puppies properly early in life is so critical.

Once trained, do dogs simply follow pointing and commands without regard to context? Paprika seems to have a mind of her own: all the gesturing in the world won’t get her off the wingback chair I just had re-upholstered unless there’s an awfully good chance that she will be compensated with duck fat treats. Supporting this observation, the German group just published another paper this July in which they demonstrate that dogs will only respond to pointing and other cues if they have experienced those gestures in the context of food. If pointing did not lead to food the last time they followed the gesture, they won’t bother the next time. The human’s tone of voice, which accompanies the pointing, matters too.

Admittedly, I feel like a fool when I address Paprika in my high-pitched Motherese, particularly when I’m rewarding her for pooping promptly and precisely in front of an audience of rough-looking bikers who play trivia at our neighborhood pub. But as she licks duck fat from my fingers and looks up at me with those wise old brown eyes, I can see that she understands what I’m trying to say, and I feel a little less alone in this vast universe.

Using your brain with the future in mind

I read an interesting post yesterday from the Harvard Business Review blog about how feeling as though you are making progress at work, in a video game, or during any kind of training is a great motivator. I can definitely relate: one of the hardest things about being either a scientist or an artist (or an entrepreneur, for that matter) is that you have to spend many, many hours breaking new ground and very often you feel as though you are running in place. That’s the problem with innovation: you don’t know what’s going to work until you’ve tried a lot of different things. Sometimes you get lucky and hit on a solution fairly quickly; but most days, you have to run down many corridors and backtrack half of the time to reach the end of the maze. I hate treadmills: to stay motivated on a run, I need to see the scenery change.

Then I watched Jane McGonigal’s TED talk.



She’s certainly on to something. There’s an addictive quality to internet-based video games, which psychologists have been trying to understand since the tech boom. There’s even a journal dedicated to the psychology of interactive technologies and social networking. These days, instead of thinking about the brain as a machine made up of solid, unchanging parts, neuroscientists have shifted towards a view of the brain as dynamic and plastic: dynamic in the sense that brain functions are served by chemical and electrical signals, and plastic because cell structures, neural circuits and cell-signalling patterns change with use. So it’s not surprising that spending a lot of time on any single activity will lead to long-lasting structural and functional changes in the brain. By spending more time playing video games than engaging in other activities, gamers change the circuitry and functioning of their brains to match the demands of their passion.

But as Jane points out, that might not be a bad thing. Sure, internet games tap into the very same circuitry that goes awry in substance abusers. And some people are more susceptible to becoming addicts than others, with genetics and social interaction playing large roles. And the type of game does matter: violent or aggressive games can lead to violent or aggressive behavior, while prosocial games can improve social interactions, just as playing tennis every day leads to an improvement in tennis playing and a remapping of the sensory and motor cortices involved in manipulating the racket and predicting the trajectory of the ball. London cab drivers, who have to memorize the intricate roads of London, develop larger and more efficient hippocampi, the regions of the brain involved in spatial navigation. The activities we choose to spend our time doing, regardless of what they might be, will affect the way that our brains function in the future. The time has come to be just as mindful of what we do with our brains as we are about what we ingest or how we stay fit. Jane might be right: if we invest more time in searching for solutions to the world’s problems, or in developing the social and cognitive skills that we need to find those solutions, we just might be capable of doing great things.

The Creativity Instinct

In an interview earlier this week, I was asked to define creativity, a fairly common and entirely reasonable request given that I have made the expression and study of creativity my profession. Lately, I have found that I’m less and less comfortable answering that question with the traditional response: some combination of novelty and utility. That is, creative output must be both novel (a new combination of existing ideas or objects, or an entirely new thing) and useful (that is, have some purpose). I won’t go into detail defending this particular definition, or knocking it down for that matter, except to say that as we learn more and more about creative behaviors and the motivation behind creativity, defining it has become as easy as herding cats. And likely just as useful.


(c) www.francartoons.co.uk

I have come to believe that defining creativity in one comprehensive sound bite is at best unnecessary and at worst potentially harmful, if we ever want to truly understand it. What if we applied the same strict criteria to an equally complicated cognitive construct, such as memory? When asked to provide a definition of memory, I often use the following, which is both unsatisfactory and strictly true: the change in behavior that comes with experience. I have developed muscle memories for singing after 15+ years of training, and my behavior has changed as a result of that experience. I have a long autobiography of vivid events that I can summon at will to affect my current actions. My experience of the visual realm affects the way that I interpret what I see today. Although accurate, the definition glosses over the very aspect of memory that piques the curiosity of most people who want to discuss the topic: the fact that we can search through a vast repository of information gathered over a lifetime and use it to our advantage. How it works is much more interesting than what it is.

The very same is true for creativity. Creativity, like memory, is a collection of many processes and behaviors, with many different motivators and mechanisms. Like the study of memory, the study of creativity would benefit greatly from a shift away from definition and towards understanding how it works under different circumstances. We talk about the differences between memory for events and memory for motor skills or habits. In the same way, we should talk about the differences between creativity in writing and creativity in dance.

Often, when I’m not doing a particularly good job of sharing my enthusiasm for science and instead am caught up in academic jargon or the minutiae of some esoteric argument, I see the eyes of my conversation partner glaze over with boredom. It always surprises me, because I’m clearly interested in what I’m talking about. But then I remember that what fascinates me is not what I know but, rather, what I don’t know. Delving more deeply into a topic as wide-reaching and humanistic as creativity only raises more questions. And that is what drives interest. We flock towards the mysterious, towards things that we can’t seem to explain, because curiosity leads to knowledge, which leads to better decisions and, dare I say, the reproduction of those genes that underlie curiosity (for a paper on a potential curiosity gene, you can read about great tits, though not the kind you’re thinking of, unfortunately).

We hate boredom: it’s a highly uncomfortable state, as evidenced by our knee-jerk reaction to pick up a smartphone or even look out the window. And we love information: Twitter thrives because it promises a never-ending stream of information about our world and its co-inhabitants. The bigger the mystery, the more surprising the cliffhanger, the more complex the visual scene, the more interested we remain. Provided, of course, that there is some pattern, or some hint of a pattern, that suggests it’s not completely random, because randomness is unpredictable and therefore useless as information. Creativity, that new interpretation of ourselves and our world, is arguably our most powerful instinct. Only by understanding how it works can we ever hope to understand what it is.


A Mind Split

When I was 20, I worked at what was then called the Clarke Institute of Psychiatry in Toronto, volunteering as a research assistant on a study of the relationship between puberty and schizophrenia. I was lucky to be supervised by one of the smartest and kindest women I had ever met, Dr. Mary Seeman. Because I was pre-med and interested in becoming a psychiatrist myself, she organized opportunities for me to spend some time on the various wards of the hospital, the most memorable of which was my week on the acute psychotic episode ward, where patients who were experiencing psychotic symptoms for the first time were being diagnosed and treated. Psychosis in patients with schizophrenia generally appears in their early twenties, and I was struck by how similar many of the patients were to me: they were experiencing delusions, hallucinations and disorganized thinking for the first time, and they were my age. The trouble with delusions and other symptoms of schizophrenia is that they feel just as real as any other cognitive, perceptual or emotional experience. These patients described their delusions in the same language that I would use to describe the feelings I experience in stressful situations.


A Fish’s Delusion from sonpham32 (www.photobucket.com)

Let’s say, for example, that a patient falsely believes that he is being monitored by the state: that someone has implanted a microchip into his brain that transmits his thoughts to a computer located in a branch of the Canadian equivalent of the CIA (CSIS). If that were true, one can imagine how frightening it would be. Hearing the patient describe his emotional reaction to this belief, I couldn’t help but sympathize. His emotional reaction was entirely appropriate, even though its cause was not real. I couldn’t sleep at night because I kept imagining how frightening it must be to have those irrational thoughts or hallucinatory experiences. Many patients, after responding well to pharmacological treatments, know that their delusions and hallucinations are caused by disease rather than by the outside world, and they can describe them with the insight of an actor describing what the character she is playing experiences. But most actors can readily turn off the feelings induced by their craft; patients with schizophrenia live with the fear that they cannot control their thoughts and emotions so easily. I have to admit that throughout my twenties, I lived with a small, nagging fear that at any time my own psyche could betray me, and that symptoms of schizophrenia could tear apart my life just as easily as they had the lives of many of the young people I encountered on that ward.

Like virtually all psychiatric disorders, the symptoms of schizophrenia arise from the building blocks of healthy mental processes. It is the fact that hallucinations and delusions use the same brain regions and mechanisms as normal perceptions and beliefs that makes the disease so devastating. Given this problem, it’s amazing that any drugs at all can target disease symptoms without destroying healthy thoughts and perceptions.

The different symptoms of schizophrenia are likely caused by different pathologies: some resulting from changes in dopamine receptors in the prefrontal cortex, others from changes in the way that brain cells respond to acetylcholine, serotonin, GABA and/or other neurotransmitters. The pharmacological treatment of schizophrenia these days revolves around a cocktail of drugs targeting specific symptoms, which is why psychiatrists have such a hard time finding the right doses and combinations to maximize benefits and minimize side effects.

Perhaps because the disease is so heterogeneous, I found that every patient with whom I interacted was first and foremost a unique individual, rather than a textbook case. Each person’s experience was different, and the problem of diagnosis dominated the conversation in the clinic. Yet I found myself relating to the experiences of these patients much more quickly than I would have expected, given how strange their symptoms sound when listed in a textbook.

My primary interest in neuroscience has been to understand the narrative and constructive nature of memory. In the course of my recent work on the topic, I came across a computational model of some of the cognitive symptoms of schizophrenia, aptly named DISCERN. In the journal Biological Psychiatry, Hoffman and colleagues described the model and then tested both it and real patients on a test of memory for stories (the delayed story recall task). Patients with schizophrenia often have trouble remembering stories, and some neuroscientists think that this breakdown in episodic memory might lead to delusional thinking (click here for a paper reviewing the relationship between memory biases and delusions in schizophrenia).

What’s fascinating about the computational model is that the best predictor of the errors made by patients with schizophrenia was the version of the model that used hyperlearning as the mechanism of disruption. That is, delusions in patients with schizophrenia might be the result of an inability to forget, or to suppress irrelevant information from memory. Which reminds me, once again, of why the way our memory works is so fascinating: somehow, when functioning optimally, our minds ‘know’ or ‘learn’ to discriminate between the details of our experiences, those that should be remembered and those that should be forgotten, so that we can make sense of the world and, with some accuracy, make predictions about the future. The vast majority of our brain’s operations seem to happen outside of our consciousness. We might know relatively little about our brains, but they sure do know a lot about us.
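
For readers who like to tinker, here is a minimal sketch in Python of that hyperlearning intuition. To be clear, this is not DISCERN, which is a far richer neural network model of language processing; it is only a toy illustration, and the learning rate, decay rate and rehearsal schedule below are all assumptions invented for the demonstration. The idea: when the learning rate is cranked too high, a single exposure to an irrelevant detail leaves a memory trace as strong as a well-rehearsed one, so it can no longer be filtered out at recall.

    # Toy sketch only: NOT the DISCERN model. Two memory traces compete:
    # a story 'gist' that is rehearsed repeatedly, and an irrelevant
    # 'noise' detail encountered exactly once, late in training.
    def simulate(eta, decay=0.02, steps=200):
        w_gist, w_noise = 0.0, 0.0
        for t in range(steps):
            w_gist *= (1 - decay)               # gradual forgetting
            w_noise *= (1 - decay)
            if t % 10 == 0:
                w_gist += eta * (1 - w_gist)    # saturating, Hebbian-style update
            if t == steps - 5:
                w_noise += eta * (1 - w_noise)  # single late exposure
        return w_gist, w_noise

    for label, eta in [("normal", 0.10), ("hyperlearning", 0.95)]:
        gist, noise = simulate(eta)
        print(f"{label:14s} gist={gist:.2f}  noise={noise:.2f}")

With the normal learning rate, the rehearsed gist ends up roughly three times stronger than the one-off detail (about 0.31 versus 0.09), so a simple recall threshold can suppress the noise. Under hyperlearning, the one-off detail is just as strong as the gist (about 0.88 versus 0.83) and intrudes at recall, which is at least qualitatively in the spirit of a memory that cannot forget.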


Why You'll Always Make the Right Decision

As I’ve alluded to in previous posts, I have a couple of big decisions that I need to make in the next few weeks. The outcomes of these decisions are hard to predict, but that’s not what makes them particularly difficult. They are difficult decisions because committing to one thing means closing the door to another and I’ve never been good at closing doors: I’m much better at knocking them down.

When I complain of the angst of decision-making to my friends, at some point in the conversation they comfort me with the notion that no matter what I decide, it will have been the right decision. Now, if that were strictly true, then the decision would not be that difficult, and I would find real comfort in being reminded of that fact. But the truth is that choosing one alternative over another will lead to a different set of outcomes and, depending on what I value at the time, some outcomes will most definitely be better than others. So the real task is to predict what outcomes I will value in the future, and to make decisions that will lead to those outcomes and, hopefully, to some form of personal fulfillment or contentment.

Although my well-meaning friends might not be right in that direct interpretation of the adage, they are all sage neuroscientists, in spirit, if not by way of education and career. Because the truth is that no matter what I decide, it’s very likely that my brain will work hard to convince me that it was the right decision.

Making the wrong decision leads to a very uncomfortable state that psychologists call ‘cognitive dissonance’: it’s the terrible feeling you get when your view of the world is at odds with how the world actually works. For example, if you believe that LA traffic is predictable and easily navigated so long as you avoid the rush hour(s) and never hit the beach, then the third time you sit in standstill traffic in the middle of a random Wednesday, you’ll feel the need to revise your belief system as it pertains to traffic in the City of Angels. Voila! You’ve just experienced the soothing way our minds deal with cognitive dissonance: we simply change our attitudes, our beliefs and their resulting actions.

American social psychologist Leon Festinger is credited with developing the first comprehensive theory of cognitive dissonance but the observation that we change our beliefs in order to justify our decisions can be traced all the way back to Aesop, and his fable The Fox and the Grapes: Fox wants grapes but can’t reach them, so he decides the grapes must be sour. It is easy to despise what you cannot get.

Sour Grapes poster from www.firedupmissouri.com


Fifty years after Festinger published his seminal book, social psychologists are still working out just how much our minds justify our choices. In a recent study published in Psychological Science, Tali Sharot, Cristina Velazquez and Ray Dolan demonstrated that the very act of making a choice affects our preferences. They asked study participants to rate 80 vacation destinations by imagining themselves taking a holiday there and predicting how happy they might be. Then they told the participants that they were taking part in a test of subliminal decision-making and asked them to pick one of two alternative destinations that they would only perceive ‘subliminally’. In fact, the alternatives weren’t presented at all during the decision-making; they only appeared on the screen after a blind choice had been made. Even though the participants hadn’t actually chosen the destination, they still showed a preference for it when asked to rate it at a later time. When a computer made the choice for them, however, they didn’t show that same preference. The act of choosing makes us like our choice.

No matter which decision I make, you can bet that my anterior cingulate cortex will be working in overdrive, monitoring all the conflicting feelings and thoughts that will eventually be melded into a new worldview. In the meantime, I will follow the advice of a dear friend, whose excellent decision-making has created a flourishing international career, a loving and rich family life and lots of laughs: 1) first, gather all the information you can get about the possible outcomes, 2) hold off making the decision until the last possible moment, 3) once you’ve made the decision, tell everyone about it so that you can’t back out of it and finally 4) stop thinking about what would have happened if you had picked the other alternative. Oh, and he also said: ‘whatever you decide, I’m sure it will be the right decision’. How true.


Dreams: Setting the Stage for Creativity

What better way to celebrate America’s Independence Day than to visit its (arguably) first National Park? I had been looking forward to disconnecting from technology and camping in Yosemite for months. I find it exceedingly difficult to take time away from work and have resorted to planning vacations that are sufficiently distracting to clear my head. The breathtaking beauty of the Yosemite Valley, combined with the unique challenges of sleeping, eating and living outdoors, certainly fit the bill. The change of pace, scenery and tasks allowed my mind to wander and wonder.



Creativity is a slippery process: first, you have to gather all the necessary information and skills; second, you try to combine what you know or can do in a new way; then you generally need to step away from the problem or task and let it simmer for a bit; and finally, the new idea or way of expressing yourself seems to ‘pop’ into your mind. That third stage is called the incubation period, and understanding exactly what goes on during it is arguably the Holy Grail in the study of creativity.

This week I came across two interesting studies of incubation that were published within a few months of each other in 2009. Sio and Ormerod reviewed a number of empirical studies of incubation in the journal Psychological Bulletin and found that when someone needs to consider a large amount of information to come up with a creative solution, the incubation period is particularly important. When the problem is visual rather than language-based, incubation is only effective if the person has undergone a long preparation period and has hit a creative block.

Denise Cai, in Sara Mednick’s lab at UCSD, wondered whether dreaming, or rapid-eye-movement (REM) sleep, when our brain is busy consolidating what we’ve learned while awake, might be the critical component of incubation. She had her subjects take Mednick’s Remote Associates Test (RAT), a commonly used test of creativity in which the goal is to figure out how three items are related (e.g. cookies, sixteen, heart; once you’ve had a chance to think about it, scroll down to see the answer below), and then she randomly assigned them to one of two conditions: full-on napping (measured by polysomnography) or resting quietly while listening to instrumental music. It turns out that napping did, in fact, improve performance significantly more than rest when the subjects were tested again on the RAT in the afternoon.

How important REM sleep is for memory consolidation remains fairly controversial, but there’s no question that sleep affects memory, especially the deepest sleep, called slow-wave sleep. Many professional classical musicians take a nap in the afternoon: napping helps their bodies recover from a long morning practice session and prepare for an evening concert, but it’s also likely that their brains are consolidating the motor sequences they have been learning while their conscious minds are at rest.

Whatever the relationship might be between sleep, memory consolidation and creativity, one thing is clear: there is still something magical about incubation. This weekend, my dreams were filled with waterfalls and butterflies, and new ideas are bubbling in my brain. It will be a while before I underestimate the importance of taking time off again. Oh, and the answer to the RAT item above is sweet. Literally.


The Tao of Cat: the Future as a Function of the Past

The cat that lives in my backyard is his own man. He comes and goes as he pleases. He politely asks for food but when it doesn’t meet his expectations, he complains to the chef. He doesn’t like to be tied down but he’s not above showing his affection when his belly is full and it’s raining outside. He tolerates petting as a courtesy to those who serve him, but his eyes belie his disdain for the cooing and snuggling that are sometimes inflicted upon him by his co-tenants.

Chicago Street Art

His is a pretty sweet life. Recently, I’ve found myself envying his advanced Buddhist practice of living purely in the moment. He does not plan for the future or regret past decisions. He is focused entirely on the present, and his behavior reflects only his most immediate needs. But he does have one major character flaw: when a choice is forced upon him, even if it is a desirable alternative or would have been his preferred action in any case, he cannot tolerate it. For example, it has become his habit, after dinner, to enjoy stretching out on a warm blanket near his chef. But if his exit from the room is blocked by a closed door, he will not settle down for a snooze; he will do everything he can to get out of the room. Once the door is opened, he ignores it and proceeds to sidle up to the chef with his motor in full purr mode.

On one level, I understand his angst: when I am told that I must do something, that action loses a part of its attraction. And I don’t like to feel boxed in or powerless to escape if circumstances change. But, after five years of having my needs met, and never once being hurt or trapped, I have learned to trust the person (and cat) with whom I share my living space. Part of this trust comes from my ability to predict the actions of my co-tenants based on thousands of observations over the years. These observations have been stored, some as vivid episodes, others as extractions of the regularities in my co-tenants’ behaviors, in my malleable brain.

Memory for episodes, in particular, fascinates me for two reasons: first, our autobiography, and to a large extent our identity, is made up of our memories of the past, which feel to us like a searchable database of our experience (more here); and second, the extraordinary observation that patients who lose the ability to retain event memories are also unable to imagine the future (the case of K.C. is described here). I’ve studied autobiographical memory for over a decade, from my very first published paper (found here) to the one that’s currently in the STUFF TO DO NOW folder on my desktop.

As neuroscientists have come to navigate the ever-shifting landscape of our personal memories, it has become increasingly clear that our representations of the past and the future overlap to a very large extent. A former lab-mate of mine, who finished her PhD with the supervisors of my very first project (Dr. Morris Moscovitch and Dr. MaryPat McAndrews at the University of Toronto, co-authors on my first published paper), went on to complete a highly successful post-doctoral fellowship at Harvard with (another U of T alum) Dr. Daniel Schacter. Donna Addis is a sweet, intelligent and dynamic woman (now a professor in her own right at the University of Auckland) who conducted a seminal neuroimaging study demonstrating that the brain regions that support episodic memory (the medial temporal lobe, the core of which is the hippocampus, as well as areas of dorsolateral prefrontal and parietal cortices) are also involved in imagining the future (paper available here).

Most married couples, among other people, know that memory is a constructive process rather than an accurate recording of what actually happened. For a long time, this constructive aspect of event memory was seen as a shortcoming rather than a desirable feature. But Dan Schacter and Donna Addis suggested that we rethink this negative connotation and consider the benefits of a constructive memory system. Specifically, they suggested that this feature enables us to pull together bits of our past, recombine them and imagine the consequences of our actions in the future. Our constructive memory system gives us the tools we need to become effective soothsayers.

This ability gives me an edge over Mr. Cat in many different ways. I can dream big, and imagine every step that I need to take in order to make that dream come true. I can delay immediate gratification for a bigger reward down the line. I can adapt to changing circumstances because I can alter my imagined future to account for new information. But this ability can also draw me out of the present and prevent me from enjoying the moment because I’m too busy planning for the future. I’ve got a lot of decisions to make in the coming weeks and I need Mr. Cat to remind me that when the door is open, my belly is full and it’s raining outside, it’s perfectly ok to snooze with the chef.


There's still hope for sopranos: thanks to evolution

This week I’ve been having a lot of fun with the music that I’m working on because my voice teacher is out of town, and I took her absence as an opportunity to remind myself why I continue to work on improving my vocal chops every day rather than simply maintaining my current level of performance. With sincere apologies to my neighbors, I have to admit that I’ve been enjoying practicing the stratospherically high runs in the famous Queen of the Night aria (Der Hölle Rache; you can find it here, starting 40 seconds into the video) from Mozart’s The Magic Flute. Sometimes I just need to let it rip.


I’ve also been dusting off the Chansons madécasses, or Madagascar Songs, by Maurice Ravel, in preparation for a concert that I’m giving on July 15th (details and a sample of one of the songs can be found here). These are fabulous, sensual and politically charged pieces for voice, piano, cello and flute, and I love them dearly. These chansons couldn’t be more different from Der Hölle Rache in terms of words, dramatic context, feel, style, tonality and texture, and yet, when I’m singing them right, I get goosebumps. And not just when I’m singing them, but also when I hear someone else performing them.

Having that experience made me wonder, as I often do, why we get the ‘chills’ from specific musical passages. Jaak Panksepp, an Estonian-born (we’re practically neighbors! as my mother would say) neuroscientist in Washington state has studied and written about this phenomenon for decades, with an influential paper published in 1995 showing that, contrary to our intuition, we get the chills when we listen to ‘sad’ music, rather than music that makes us feel happy. A solo line, often in the soprano register (lucky for me), emerging from a denser musical texture most often caused his subjects to experience chills. He also found that women are more likely than men to get goosebumps when listening to music.

He has since gone on to suggest that the experience of chills evoked by music is related to the distress that we feel when we are separated from someone we love and that this response has perhaps evolved to encourage mothers to respond to their crying babies. It’s easy to imagine many of the most memorable musical passages as separation calls: Whitney Houston’s version of Dolly Parton’s I will always love you, the guitar solo in The Eagles’ Hotel California, the vocalise by Rachmaninoff, to name just a few. The solo instrument, on a simple melodic line, emerging from a thicket of other sounds.

Blood and Zatorre, neuroscientists at McGill University, used neuroimaging to explore the parts of the brain that are activated during the experience of musical chills (you can find a copy here). They report that the same brain regions involved in other pleasurable activities such as eating or having sex, including the orbitofrontal and ventromedial prefrontal cortex, the striatum and the midbrain, are also involved in this experience. But what’s most interesting to me about their studies is the fact that exactly which musical passage evokes the experience is very much tied to the individual: just because I like it, or find it moving, doesn’t mean that you will. Of course, that observation is self-evident to most of us, and the staggering diversity of music available to us demonstrates that musical taste is deeply personal. By the same token, I’ve watched mothers pick out their own baby’s cry from a cacophony of sounds with remarkable ease.

As I return to my vocal practice this week, I’m going to keep both Panksepp’s and Blood and Zatorre’s findings in mind. And at my next audition, I’m not going to worry about the fact that at least a hundred other sopranos are vying for the part. I’ll remember that, just like a baby’s cry, each of our voices is unique and there’s no telling which of our voices will wake the latent maternal instinct deep in the heart of the men and women on the audition panel. There might be a lot of sopranos out there, but we also might be favored by evolution to give our audiences the chills. And that’s a goal worthy of all the practice hours it demands.

Being a Scientist is like being a Yankees Fan

The Yankees will eventually win another World Series. Science will also eventually get things right. Read More...

Not with a bang, but with a whimper

The purpose of this blog is to generate and nurture ideas on topics that represent the wide variety of subjects that irk, interest and impassion me: neuroscience, opera, critical thinking, empathy and all the places where those topics intersect. Read More...