But there is one thing that is indisputable: you won’t find out whether you love opera if you are unable to access it. And going to the opera is in many ways more similar to going to the movies than to any other form of entertainment. Both art forms are all-encompassing: the lights are low, the drama looms large, the music shuttles your emotions from one extreme to the next, and the goal is to sweep you away into a different world. It just so happens that period and foreign pieces are much more common in opera than in Hollywood.
Like many North Americans, I often crave going to the movies precisely because I will be thoroughly entertained. It’s easy to forget work or personal troubles when your senses are bombarded with the powerful images and sounds of a movie. Sure, some movies are more engrossing than others: some are better acted, others better directed, some stories resonate with me more closely, others are too predictable for my taste. But if I go to the movies and a particular film is a dud, I don’t simply conclude that movies aren’t for me. Not so with opera. Many people only give it one shot. Or, worse, decide they don’t like the genre, having never been to a single performance.
The most devoted opera fans are a different breed altogether. They will go to every production, multiple times, even if they hate it. They will relish their distaste for a particular singer, director or concept. And they will elevate the singer, director, composer or conductor of their choice to the status of legendary hero. We see a similar passion in devotees of many other things: cars, movies, comic books, TV shows, indie rock bands, wine, baseball or pickled foods. Of course, as with so many other objects of affection, there is a rampant nostalgia for the past: a lost golden age that makes the present seem thin and lifeless.
I believe that the vast majority of people want the same thing: to live a long, meaningful life surrounded by the people they love. But I’ve often wondered how ‘meaningful’ translates to different people. Recently, I came across an interesting notion: some psychologists believe that our search for meaning stems from our uniquely human awareness of our own mortality (though the jury is still out on just how unique this knowledge is to our fair species). A paper published this week in the highly regarded Journal of Personality and Social Psychology suggests that nostalgia helps give our lives a sense of meaning. The authors themselves say it best:
Believing, then, that one is part of something larger and more meaningful than one’s own physical self provides a psychological defense against the threat of inevitable, and absolute, physical annihilation (Becker, 1973).
It makes me tired just to think about that absolute, physical annihilation, but I do think they have a point. In a clever series of studies and surveys, the psychologists found that nostalgia adds a sense of meaning to life by making one feel connected to others. Perhaps a feeling of leaving a legacy dampens our fear of death by the same token. They also found that music evoked a sense of nostalgia, which in turn led to the feeling of being loved and to the sense that life is worth living. And when participants who reported a low sense of meaning in life were encouraged to engage in nostalgic reflection, they showed an increase in vitality and an attenuated response to stress. In one study, the researchers threatened participants’ sense of meaning by having them read an essay containing the following comments, and the participants engaged in nostalgic thinking as a defense mechanism:
There are approximately 7 billion people living on this planet. So take a moment to ponder the following question: In the grand scheme of things, how significant are you? The Earth is 5 billion years old and the average human life span across the globe is 68 years. These statistics serve to emphasize how our contribution to the world is paltry, pathetic and pointless. What is 68 years of one person’s rat-race compared to 5 billion years of history? We are no more significant than any other form of life in the universe.
So perhaps this focus on a golden era is simply the opera aficionado’s attempt to cope with his or her mortality. Certainly old recordings of operas trigger a strong sense of nostalgia. And attending a live opera performance can foster a sense of connectedness not only with the other members of the audience and the musicians, but also with the great works of literature upon which the stories are based, and with the historical eras represented onstage. Perhaps this is one more reason why the great operas, like Le Nozze di Figaro and La Traviata, endure for centuries.
Let’s say, for example, that a patient falsely believes that he is being monitored by the state: that someone has implanted a microchip into his brain that transmits his thoughts to a computer located in a branch of CSIS, the Canadian equivalent of the CIA. If that were true, one can imagine how frightening it would be. Hearing the patient describe his emotional reaction to this belief, I couldn’t help but sympathize: his reaction was entirely appropriate, even though its cause was not real. I couldn’t sleep at night because I kept imagining how frightening it must be to have those irrational thoughts or hallucinatory experiences. Many patients, after responding well to pharmacological treatment of their symptoms, know that their delusions and hallucinations are caused by disease rather than by the outside world, and they can describe them with the insight that an actor has when describing what a character she is playing experiences. But most actors can readily turn off feelings induced by their craft; patients with schizophrenia live with the fear that they cannot control their thoughts and emotions so easily.
I have to admit that throughout my twenties, I lived with a small, nagging fear that at any time, my own psyche could betray me and symptoms of schizophrenia could just as easily tear apart my life as they had the lives of many of the young people that I encountered on that ward.
As with virtually all psychiatric disorders, the symptoms of schizophrenia arise from the building blocks of healthy mental processes. The fact that hallucinations and delusions engage the same brain regions and mechanisms as normal perceptions and beliefs is part of what makes the disease so devastating. Given this problem, it’s remarkable that any drugs at all can target disease symptoms without destroying healthy thoughts and perceptions.
The different symptoms of schizophrenia are likely caused by different pathologies: some resulting from changes in dopamine receptors in the prefrontal cortex, others from changes in the way that brain cells respond to acetylcholine, serotonin, GABA and/or other neurotransmitters. Pharmacological treatment for schizophrenia these days revolves around a cocktail of drugs targeting specific symptoms, which is why psychiatrists have such a hard time finding the right doses and combinations to maximize benefits and minimize side effects.
Perhaps because the disease is so heterogeneous, I found that every patient with whom I interacted was first and foremost a unique individual, rather than a textbook case. Each person’s experience was different, and the problem of diagnosis dominated the conversation in the clinic. Yet I found myself relating to the experiences of these patients much more quickly than I would have expected, given how strange their symptoms sound when listed in a textbook.
My primary interest in neuroscience has been to understand the narrative and constructive nature of memory. In the course of my recent work on the topic, I came across a computational model of some of the cognitive symptoms of schizophrenia, aptly named DISCERN. In the journal Biological Psychiatry, Hoffman and colleagues developed a computer model of the cognitive symptoms of schizophrenia, and then tested both the model and real patients on a test of memory for stories (a delayed story recall task). Patients with schizophrenia often have trouble remembering stories, and some neuroscientists think that this breakdown of episodic memory might lead to delusional thinking, a relationship reviewed in the literature on memory biases and delusions in schizophrenia.
What’s fascinating about the computational model is that the best predictor of the errors made by patients with schizophrenia was a version of the model that used hyperlearning as the mechanism of disruption. That is, delusions in patients with schizophrenia might result from an inability to forget, or to suppress irrelevant information in memory. Which reminds me, once again, of why the way our memory works is so fascinating: somehow, when functioning optimally, our minds ‘learn’ to discriminate between the details of our experiences that should be remembered and those that should be forgotten, so that we can make sense of the world and, with some accuracy, make predictions about the future. The vast majority of our brain’s operations seem to happen outside of our consciousness. We might know relatively little about our brains, but they sure do know a lot about us.
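The hyperlearning idea can be caricatured in a few lines of code. This is a minimal sketch of the intuition only, not the DISCERN model itself; the story elements, occurrence counts, learning rates and recall threshold below are all invented for illustration:

```python
# A toy sketch (not DISCERN) of how "hyperlearning" -- an exaggerated
# learning rate -- can produce intrusion errors in story recall.
# All names and numbers here are illustrative assumptions.

def learn(elements, rate):
    """Strengthen each story element in proportion to how often it occurred."""
    return {item: count * rate for item, count in elements.items()}

def recall(memory, threshold=1.0):
    """Report every element whose memory trace exceeds the recall threshold."""
    return {item for item, strength in memory.items() if strength >= threshold}

# A story's core events occur often; incidental details occur only once.
story = {"doctor": 3, "visit": 3, "waiting room": 1, "microchip": 1}

normal = recall(learn(story, rate=0.5))  # weak traces fall below threshold
hyper = recall(learn(story, rate=2.0))   # every trace is burned in

print(normal)  # core events only: the incidental details are forgotten
print(hyper)   # incidental details intrude into recall alongside core events
```

With a normal learning rate, the rarely occurring details fail to reach the recall threshold and are effectively suppressed; with hyperlearning, everything survives, and recall is cluttered with intrusions that never distinguished signal from noise.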
When I complain to my friends about the angst of decision-making, at some point in the conversation they comfort me with the notion that no matter what I decide, it will have been the right decision. Now, if that were strictly true, then the decision would not be that difficult, and I would find real comfort in being reminded of that fact. But the truth is that choosing one alternative over another will lead to a different set of outcomes and, depending on what I value at the time, some outcomes will most definitely be better than others. So the real task is to predict what outcomes I will value in the future, and to make decisions that will lead to those outcomes and, hopefully, to some form of personal fulfillment or contentment.
Although my well-meaning friends might not be right in that direct interpretation of the adage, they are all sage neuroscientists, in spirit, if not by way of education and career. Because the truth is that no matter what I decide, it’s very likely that my brain will work hard to convince me that it was the right decision.
Making the wrong decision leads to a very uncomfortable state that psychologists call ‘cognitive dissonance’: the terrible feeling you get when your view of the world is at odds with how the world actually works. For example, if you believe that LA traffic is predictable and easily navigated so long as you avoid rush hour and never hit the beach, then by the third time you find yourself in standstill traffic in the middle of the day on a random Wednesday, you’ll feel the need to revise your belief system as it pertains to traffic in the City of Angels. Voilà! You’ve just experienced the soothing way our minds deal with cognitive dissonance: we simply change our attitudes, our beliefs and the actions that follow from them.
American social psychologist Leon Festinger is credited with developing the first comprehensive theory of cognitive dissonance, but the observation that we change our beliefs in order to justify our decisions can be traced all the way back to Aesop and his fable The Fox and the Grapes: the fox wants grapes but can’t reach them, so he decides the grapes must be sour. It is easy to despise what you cannot get.
Fifty years after Festinger published his seminal book, social psychologists are still working out just how much our minds justify our choices. In a recent study published in Psychological Science, Tali Sharot, Cristina Velazquez and Ray Dolan demonstrated that the very act of making a choice affects our preferences. They asked study participants to rate 80 vacation destinations by imagining themselves taking a holiday in each place and predicting how happy they might be there. Then they told the participants that they were taking part in a test of subliminal decision-making and asked them to pick one of two alternative destinations that they would perceive only ‘subliminally’. In fact, the alternatives weren’t presented at all during the decision-making; they only appeared on the screen after a blind choice had been made. Even though the participants hadn’t actually chosen the destination, they still showed a preference for it when asked to rate it later. When a computer made the choice for them, however, they didn’t show that same preference. The act of choosing itself makes us like our choice.
No matter which decision I make, you can bet that my anterior cingulate cortex will be working in overdrive, monitoring all the conflicting feelings and thoughts that will eventually be melded into a new worldview. In the meantime, I will follow the advice of a dear friend, whose excellent decision-making has created a flourishing international career, a loving and rich family life and lots of laughs: 1) gather all the information you can about the possible outcomes, 2) hold off making the decision until the last possible moment, 3) once you’ve made the decision, tell everyone about it so that you can’t back out of it, and finally 4) stop thinking about what would have happened if you had picked the other alternative. Oh, and he also said: ‘whatever you decide, I’m sure it will be the right decision’. How true.
Creativity is a slippery process: first, you gather all the necessary information and skills; second, you try to combine what you know or can do in a new way; then you generally need to step away from the problem or task and let it simmer for a bit; and finally, the new idea or way of expressing yourself seems to ‘pop’ into your mind. That third stage is called the incubation period. Understanding exactly what’s going on during incubation is arguably the Holy Grail in the study of creativity.
This week I came across two interesting studies of incubation that were published within a few months of each other in 2009. Sio and Ormerod reviewed a number of empirical studies of incubation in the journal Psychological Bulletin and found that when someone needs to consider a large amount of information to come up with a creative solution, the incubation period is particularly important. When the problem is visual rather than language-based, incubation is only effective if the person has undergone a long preparation period and has hit a creative block.
Denise Cai, in Sarah Mednick’s lab at UCSD, wondered whether dreaming, or rapid-eye-movement (REM) sleep, when our brain is busy consolidating what we’ve learned while awake, might be the critical component of incubation. She had her subjects take Mednick’s Remote Associates Test (RAT), a commonly used test of creativity in which the goal is to figure out how three items are related (e.g. cookies, sixteen, heart; once you’ve had a chance to think about it, scroll down to see the answer below), and then randomly assigned them to one of two conditions: full-on napping (measured by a polysomnograph) or resting quietly while listening to instrumental music. It turns out that napping did, in fact, improve performance significantly more than rest when the subjects were tested again on the RAT in the afternoon.
How important REM sleep is for memory consolidation remains fairly controversial, but there’s no question that sleep affects memory, especially the deepest sleep, called slow-wave sleep. Many professional classical musicians take a nap in the afternoon: napping helps their bodies recover from a long morning practice session and prepare for an evening concert but it’s also likely that their brains are consolidating the motor sequences that they have been learning while their conscious minds are at rest.
Whatever the relationship might be between sleep, memory consolidation and creativity, one thing is clear: there is still something magical about incubation. This weekend, my dreams were filled with waterfalls and butterflies, and new ideas are bubbling in my brain. It will be a while before I again underestimate the importance of taking time off. Oh, and the answer to the RAT item above is sweet. Literally.