Posts Tagged ‘psychology’

Supplementing the previous post on the illusion of time, here are 10 examples of warped time provided by PsyBlog.

How time perception is warped by life-threatening situations, eye movements, tiredness, hypnosis, age, the emotions and more…

The mind does funny things to our experience of time. Just ask French cave expert Michel Siffre.

In 1962 Siffre went to live in a cave that was completely isolated from mechanical clocks and natural light. He soon began to notice huge changes in his sense of time.

When he tried to measure out two minutes by counting up to 120 at one-second intervals, it took him 5 minutes. After emerging from the cave he guessed the trip had lasted 34 days. He’d actually been down there for 59 days. His experience of time was rapidly changing. From an outside perspective he was slowing down, but the psychological experience for Siffre was that time was speeding up.
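
As a back-of-the-envelope check, the snippet below simply reworks the figures already quoted, nothing more:

```python
# Reworking the Siffre figures quoted above.

# Counting to 120 at what felt like one-second intervals took 5 real minutes:
felt_seconds = 120
real_seconds = 5 * 60
print(real_seconds / felt_seconds)  # 2.5 -> each felt second spanned ~2.5 real seconds

# He guessed the trip at 34 days when 59 real days had passed:
print(34 / 59)  # ~0.58 -> roughly 4 subjective days registered for every 7 real ones
```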

But you don’t have to hide out in a cave for a couple of months to warp time; it happens to us all the time. Our experience of time is flexible; it depends on attention, motivation, the emotions and more.

1. Life-threatening situations

People often report that time seems to slow down in life-threatening situations, like skydiving.

But are we really processing more information in these seconds when time seems to stretch? Is it like the slow-motion cameras in sports, which really do capture more detail of high-speed action?

To test this, Stetson et al. (2007) had people stare at a special chronometer while free-falling 50 metres into a net. What they found was that time resolution doesn’t increase: we’re not able to distinguish shorter periods of time when in danger. Instead, we remember the time as longer because we record more of the experience. Life-threatening experiences make us really pay attention, but we don’t gain superhuman powers of perception.

2. Time doesn’t fly when you’re having fun

We’ve all experienced the fact that time seems to fly when we’re having fun. Or does it? What about when you’re listening to a fantastic uplifting piece of music? Does time seem to fly by, or conversely, does it seem to slow down?

When Kellaris (1992) put this to the test, the finding was that the more listeners enjoyed the music, the slower time seemed to pass. This may be because when we enjoy music we listen more carefully, getting lost in it. Greater attention leads to the perception of a longer interval of time.

The same thing happens when you have a really good, exciting day out. At the end of the day it can feel like you ate breakfast a lifetime ago. You enjoyed yourself enormously and yet time has stretched out.

The fact that we intuitively believe time flies when we’re having fun may have more to do with how time seems to slow when we’re not having fun. Boredom draws our attention to the passage of time, which gives us the feeling that it’s slowing down.

Or—prepare yourself for a 180-degree about-face—it could all be the other way around. Perhaps you’re having fun when time flies. In other words, we assume we’ve been enjoying ourselves when we notice that time has passed quickly.

There’s evidence for this in a recent experiment by Sackett et al. (2010). Participants doing a boring task were tricked into thinking it had lasted half as long as it really had. They rated it as more enjoyable than did participants who had done exactly the same task but hadn’t been misled about how much time had passed.

Ultimately it may come down to how much you believe that time flies when you’re having fun. Sackett and colleagues tested this idea as well and found it was true. In their experiments, people who believed more strongly in the idea that time flies when you’re having fun were more likely to believe they were having fun when time flew. So, the whole thing could partly be a self-fulfilling prophecy.

3. The stopped clock illusion

The stopped clock illusion is a weird effect that you may have experienced. It happens when you look at an analogue watch and the second-hand seems to freeze for longer than a second before moving on.

I always thought this was because I just happened to look at it right at the start of the second, but this is actually an illusion.

What is happening is that when your eyes move from one point to another (a saccade), your perception of time stretches slightly (Yarrow et al., 2001). Weirdly, it stretches backwards. So your brain tells you that you’ve been looking at the watch for slightly longer than you really have. Hence the illusion that the second-hand is frozen for more than a second.

This happens every time our eyes move from one fixation point to the next; it’s just that we only notice it when looking at a watch. One explanation is that our brains are filling in the gap while our eyes move from one thing to the next.

4. Too tired to tell the time

When things happen very close together in time, our brains fuse them into a single snapshot of the present. For vision, the shortest interval we can perceive is about 80 milliseconds. If two things happen closer together than that, we experience them as simultaneous.

The shortest possible gap in time we can distinguish across modalities (say visual and auditory) is between 20 and 60 milliseconds (Fink et al., 2006). That’s as little as a fiftieth of a second.
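
A toy restatement of those two thresholds, assuming a single fixed cut-off per case (real thresholds vary between people, and the cross-modal one spans the 20–60 millisecond range quoted above):

```python
# Toy model of the fusion thresholds described above. Real thresholds vary
# between people; the cross-modal cut-off is placed mid-range for illustration.

def perceived_as_simultaneous(gap_ms, same_modality=True):
    threshold_ms = 80 if same_modality else 40  # vision ~80 ms; cross-modal ~20-60 ms
    return gap_ms < threshold_ms

print(perceived_as_simultaneous(50))                       # True: two flashes fuse
print(perceived_as_simultaneous(50, same_modality=False))  # False: a flash and a beep can be told apart
```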

When we’re tired, though, our perception of time goes awry and we find it harder to distinguish between short spaces of time. This fact can be used to measure whether people are too tired to fly a plane, drive a truck or be a doctor. Indeed, simple hand-held devices that quickly assess tiredness in this way are already being developed (Eagleman, 2009).

5. Self-regulation stretches time

The effort of trying to either suppress or enhance our emotional reactions seems to change our perception of time. Psychologists have found that when people are trying to regulate their emotions, time seems to drag on.

Vohs and Schmeichel (2003) had participants watch an 11-minute clip from the film Terms of Endearment. Some participants were asked to remain emotionally neutral while watching the clip, and others were told to act naturally. Those who tried to suppress their emotions estimated the clip had lasted longer than it really had.

6. Altered states of consciousness

People report all sorts of weird experiences with time when taking drugs like psilocybin, peyote or LSD. Time can seem to speed up, slow down, go backwards, or even stop.

But you don’t need drugs to enter an altered state of consciousness; hypnosis will do the trick. People generally seem to underestimate the time they’ve been under hypnosis. One study put the underestimate at around 40% (Bowers & Brenneman, 1979).

7. Does time speed up with age?

People often say the years pass more quickly as they get older. While youthful summers seemed to stretch on into infinity, the summers of your later years zip by in the blink of an eye.

A common explanation for this is that everything is new when we are young so we pay more attention; consequently it feels like time expands. With age, though, new experiences diminish and it tends to be more of the same, so time seems to pass more quickly.

Whether or not this is true, there is some psychological evidence that time passes more quickly for older people. One study found that people in their 20s are pretty accurate at guessing an interval of 3 minutes, but people in their 60s systematically overestimate it, suggesting time is passing about 20% more quickly for them (Mangan & Bolinsky, 1997).

8. The emotional experience of time

The emotions we feel in the moment directly affect our perception of time. Negative emotions in particular seem to bring time to people’s attention and so make it seem longer.

Research on anxious cancer patients, those with depression and boredom-prone individuals suggests time stretches out for them (reported in Wittmann, 2009). Just like life-threatening situations, negative emotions can concentrate our attention on the passage of time and so make it seem longer than it really is.

This effect may be made worse by our efforts to regulate these negative emotions (see number 5), which also has the effect of stretching time.

9. It’s getting hot in here

If you’ve ever had a fever then you’ll know that body temperature can have strange effects on time perception.

Experiments have found that when body temperature is raised, our perception of time speeds up (Wearden & Penton-Voak, 1995). Conversely, when we are cooled down, our sense of time slows down.

10. What’s your tempo?

Setting aside emotions, age, drugs and all the rest, our experience of time is also affected by who we are. People seem to operate to different beats; we’ve all met people who work at a much slower or faster pace than we do. Psychologists have found that people who are impulsive and oriented towards the present tend to feel time moving faster than other people do (O’Brien et al., 2011).

There’s little research on this, but it’s likely that each of us has our own personal tempo. Research has found that when different people listen to metronomes, the number of beats per minute they describe as comfortable ranges from as slow as 40 bpm up to a high of 200 bpm (Kir-Stimon, 1977). This is a large range and may help to explain why some people seem to operate at such a different pace from our own.

Time is relative

The last words on time come from two great thinkers; first Albert Einstein:

“Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. That’s relativity.”

And finally, Douglas Adams:

“Time is an illusion. Lunchtime doubly so.”

Image credit: Alice Lucchin

The question is: are we one self, one personality with different desires, or are we a community of different selves, each with its own desires, strategies and resources? Picture us dieting, feeling good about ourselves and our progress, convinced we are the success we wanted to be. Then comes the chocolate cake; we persuade ourselves that one slice now and then is fine, that it’s healthy anyway not to be too rigid, and all of a sudden it’s like living in a different world, inhabited by a different mind. Or we’re two-thirds through a planned 10 km jog, already exhausted, longing for a hot shower and a rest and cursing the original plan to get fit through jogging. Yet once we’ve finished the run and we’re under that hot shower, we feel happy about having succeeded and are already thinking of tomorrow’s jog.

The author of the article below, psychologist Paul Bloom, proposes that we’re not dealing here with mere desires battling it out within one self or personality, but that in different situations we take on different personas already living in us. This concept throws up all kinds of questions. For example: how do we define happiness if our different selves have different ideas about what constitutes it? Can something that seems right for one self be judged wrong by another? Which self has priority over another? Can we trust instinctive responses which say that pain is always bad, or that love for children ranks higher than love for sex?

The post could be more succinct, but it uses plenty of well-chosen examples to back up the author’s hypothesis of a living community of different selves in our heads, and it also offers some thoughts on how to manage those selves.

(One point that I find intriguing, and that Bloom does not raise, is a link between our minds and the branch of physics which speculates that we create an infinite number of universes by being forced to make decisions. What role would those different selves play in such hypothetical processes?)

First Person Plural

An evolving approach to the science of pleasure suggests that each of us contains multiple selves—all with different desires, and all fighting for control. If this is right, the pursuit of happiness becomes even trickier. Can one self “bind” another self if the two want different things? Are you always better off when a Good Self wins? And should outsiders, such as employers and policy makers, get into the fray?

By Paul Bloom | The Atlantic

Imagine a long, terrible dental procedure. You are rigid in the chair, hands clenched, soaked with sweat—and then the dentist leans over and says, “We’re done now. You can go home. But if you want, I’d be happy to top you off with a few minutes of mild pain.”

There is a good argument for saying “Yes. Please do.”

The psychologist and recent Nobel laureate Daniel Kahneman conducted a series of studies on the memory of painful events, such as colonoscopies. He discovered that when we think back on these events, we are influenced by the intensity of the endings, and so we have a more positive memory of an experience that ends with mild pain than of one that ends with extreme pain, even if the mild pain is added to the same amount of extreme pain. At the moment the dentist makes his offer, you would, of course, want to say no—but later on, you would be better off if you had said yes, because your overall memory of the event wouldn’t be as unpleasant.
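
A toy sketch of that logic, assuming the common “peak-end” reading of Kahneman’s results (remembered unpleasantness tracks the average of the worst moment and the final moment); the pain traces below are invented for illustration:

```python
# Peak-end approximation: remembered pain ~ (worst moment + final moment) / 2.
# The minute-by-minute pain ratings are invented for illustration.

def remembered_pain(trace):
    return (max(trace) + trace[-1]) / 2

short_procedure = [2, 6, 8, 8]     # stops at the worst moment
extended = [2, 6, 8, 8, 4, 2]      # same pain, plus a milder tail added on

print(remembered_pain(short_procedure))  # 8.0
print(remembered_pain(extended))         # 5.0 -> remembered as less unpleasant
```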

Such contradictions arise all the time. If you ask people which makes them happier, work or vacation, they will remind you that they work for money and spend the money on vacations. But if you give them a beeper that goes off at random times, and ask them to record their activity and mood each time they hear a beep, you’ll likely find that they are happier at work. Work is often engaging and social; vacations are often boring and stressful. Similarly, if you ask people about their greatest happiness in life, more than a third mention their children or grandchildren, but when they use a diary to record their happiness, it turns out that taking care of the kids is a downer—parenting ranks just a bit higher than housework, and falls below sex, socializing with friends, watching TV, praying, eating, and cooking.

The question “What makes people happy?” has been around forever, but there is a new approach to the science of pleasure, one that draws on recent work in psychology, philosophy, economics, neuroscience, and emerging fields such as neuroeconomics. This work has led to new ways—everything from beepers and diaries to brain scans—to explore the emotional value of different experiences, and has given us some surprising insights about the conditions that result in satisfaction.

But what’s more exciting, I think, is the emergence of a different perspective on happiness itself. We used to think that the hard part of the question “How can I be happy?” had to do with nailing down the definition of happy. But it may have more to do with the definition of I. Many researchers now believe, to varying degrees, that each of us is a community of competing selves, with the happiness of one often causing the misery of another. This theory might explain certain puzzles of everyday life, such as why addictions and compulsions are so hard to shake off, and why we insist on spending so much of our lives in worlds—like TV shows and novels and virtual-reality experiences—that don’t actually exist. And it provides a useful framework for thinking about the increasingly popular position that people would be better off if governments and businesses helped them inhibit certain gut feelings and emotional reactions.

Like any organ, the brain consists of large parts (such as the hippocampus and the cortex) that are made up of small parts (such as “maps” in the visual cortex), which themselves are made up of smaller parts, until you get to neurons, billions of them, whose orchestrated firing is the stuff of thought. The neurons are made up of parts like axons and dendrites, which are made up of smaller parts like terminal buttons and receptor sites, which are made up of molecules, and so on.

This hierarchical structure makes possible the research programs of psychology and neuroscience. The idea is that interesting properties of the whole (intelligence, decision-making, emotions, moral sensibility) can be understood in terms of the interaction of components that themselves lack these properties. This is how computers work; there is every reason to believe that this is how we work, too.

But there is no consensus about the broader implications of this scientific approach. Some scholars argue that although the brain might contain neural subsystems, or modules, specialized for tasks like recognizing faces and understanding language, it also contains a part that constitutes a person, a self: the chief executive of all the subsystems. As the philosopher Jerry Fodor once put it, “If, in short, there is a community of computers living in my head, there had also better be somebody who is in charge; and, by God, it had better be me.”

More-radical scholars insist that an inherent clash exists between science and our long-held conceptions about consciousness and moral agency: if you accept that our brains are a myriad of smaller components, you must reject such notions as character, praise, blame, and free will. Perhaps the very notion that there are such things as selves—individuals who persist over time—needs to be rejected as well.

The view I’m interested in falls between these extremes. It is conservative in that it accepts that brains give rise to selves that last over time, plan for the future, and so on. But it is radical in that it gives up the idea that there is just one self per head. The idea is that instead, within each brain, different selves are continually popping in and out of existence. They have different desires, and they fight for control—bargaining with, deceiving, and plotting against one another.

The notion of different selves within a single person is not new. It can be found in Plato, and it was nicely articulated by the 18th-century Scottish philosopher David Hume, who wrote, “I cannot compare the soul more properly to any thing than to a republic or commonwealth, in which the several members are united by the reciprocal ties of government and subordination.” Walt Whitman gave us a pithier version: “I am large, I contain multitudes.”

The economist Thomas Schelling, another Nobel laureate, illustrates the concept with a simple story:

As a boy I saw a movie about Admiral Byrd’s Antarctic expedition and was impressed that as a boy he had gone outdoors in shirtsleeves to toughen himself against the cold. I resolved to go to bed at night with one blanket too few. That decision to go to bed minus one blanket was made by a warm boy; another boy awoke cold in the night, too cold to retrieve the blanket … and resolving to restore it tomorrow. The next bedtime it was the warm boy again, dreaming of Antarctica, who got to make the decision, and he always did it again.

Examples abound in our own lives. Late at night, when deciding not to bother setting up the coffee machine for the next morning, I sometimes think of the man who will wake up as a different person, and wonder, What did he ever do for me? When I get up and there’s no coffee ready, I curse the lazy bastard who shirked his duties the night before.

But anyone tempted by this theory has to admit just how wrong it feels, how poorly it fits with most of our experience. In the main, we do think of ourselves as singular individuals who persist over time. If I were to learn that I was going to be tortured tomorrow morning, my reaction would be terror, not sympathy for the poor guy who will be living in my body then. If I do something terrible now, I will later feel guilt and shame, not anger at some other person.

It could hardly be otherwise. Our brains have evolved to protect our bodies and guide them to reproduce, hence our minds must be sensitive to maintaining the needs of the continuing body—my children today will be my children tomorrow; if you wronged me yesterday, I should be wary of you today. Society and human relationships would be impossible without this form of continuity. Anyone who could convince himself that the person who will wake up in his bed tomorrow is really someone different would lack the capacity for sustained self-interest; he would feel no long-term guilt, love, shame, or pride.

The multiplicity of selves becomes more intuitive as the time span increases. Social psychologists have found certain differences in how we think of ourselves versus how we think of other people—for instance, we tend to attribute our own bad behavior to unfortunate circumstances, and the bad behavior of others to their nature. But these biases diminish when we think of distant past selves or distant future selves; we see such selves the way we see other people. Although it might be hard to think about the person who will occupy your body tomorrow morning as someone other than you, it is not hard at all to think that way about the person who will occupy your body 20 years from now. This may be one reason why many young people are indifferent about saving for retirement; they feel as if they would be giving up their money to an elderly stranger.

One can see a version of clashing multiple selves in the mental illness known as dissociative-identity disorder, which used to be called multiple-personality disorder. This is familiar to everyone from the dramatic scenes in movies in which an actor is one person, and then he or she contorts or coughs or shakes the head, and—boom!—another person comes into existence. (My own favorite is Edward Norton in Primal Fear, although—spoiler alert—he turns out in the end to be faking.)

Dissociative-identity disorder is controversial. It used to be rarely diagnosed; then the number of reported cases spiked dramatically in the 1980s, particularly in North America. The spike has many possible explanations: the disorder was first included as a specific category in the 1980 version of the Diagnostic and Statistical Manual of Mental Disorders, just as an influential set of case studies of multiple personalities was published. And increased popular interest was fueled by the 1973 book Sybil and its 1976 movie adaptation, which starred Sally Field as a woman with 16 different personalities.

Some psychologists believe that this spike was not the result of better diagnosis. Rather, they say it stemmed in part from therapists who inadvertently persuaded their patients to create these distinct selves, often through role-playing and hypnosis. Recent years have seen a backlash, and some people diagnosed with the disorder have sued their therapists. One woman got a settlement of more than $2 million after alleging that her psychotherapist had used suggestive memory “recovery” techniques to convince her that she had more than 120 personalities, including children, angels, and a duck.

Regardless of the cause of the spike, considerable evidence, including recent brain-imaging studies, suggests that some people really do shift from one self to another, and that the selves have different memories and personalities. In one study, women who had been diagnosed with dissociative-identity disorder and claimed to be capable of shifting at will from one self to another listened to recordings while in a PET scanner. When the recordings told of a woman’s own traumatic experience, the parts of the brain corresponding to autobiographic memory became active—but only when she had shifted to the self who had endured that traumatic experience. If she was in another self, different parts of the brain became active and showed a pattern of neural activity corresponding to hearing about the experience of a stranger.

Many psychologists and philosophers have argued that the disorder should be understood as an extreme version of normal multiplicity. Take memory. One characteristic of dissociative-identity disorder is interpersonality amnesia—one self doesn’t have access to the memories of the other selves. But memory is notoriously situation-dependent even for normal people—remembering something is easiest while you are in the same state in which you originally experienced it. Students do better when they are tested in the room in which they learned the material; someone who learned something while he was angry is better at remembering that information when he is angry again; the experience of one’s drunken self is more accessible to the drunk self than to the sober self. What happens in Vegas stays in Vegas.

Personality also changes according to situation; even the most thuggish teenager is not the same around his buddies as he is when having tea with Grandma. Our normal situation dependence is most evident when it comes to bad behavior. In the 1920s, Yale psychologists tested more than 10,000 children, giving them a battery of aptitude tests and putting them in morally dicey situations, such as having an opportunity to cheat on a test. They found a striking lack of consistency. A child’s propensity to cheat at sports, for instance, had little to do with whether he or she would lie to a teacher.

More-recent experiments with adults find that subtle cues can have a surprising effect on our actions. Good smells, such as fresh bread, make people kinder and more likely to help a stranger; bad smells, like farts (the experimenters used fart spray from a novelty store), make people more judgmental. If you ask people to unscramble sentences, they tend to be more polite, minutes later, if the sentences contain positive words like honor rather than negative words like bluntly. These findings are in line with a set of classic experiments conducted by Stanley Milgram in the 1960s—too unethical to do now—showing that normal people could be induced to give electric shocks to a stranger if they were told to do so by someone they believed was an authoritative scientist. All of these studies support the view that each of us contains many selves—some violent, some submissive, some thoughtful—and that different selves can be brought to the fore by different situations.

The population of a single head is not fixed; we can add more selves. In fact, the capacity to spawn multiple selves is central to pleasure. After all, the most common leisure activity is not sex, eating, drinking, drug use, socializing, sports, or being with the ones we love. It is, by a long shot, participating in experiences we know are not real—reading novels, watching movies and TV, daydreaming, and so forth.

Enjoying fiction requires a shift in selfhood. You give up your own identity and try on the identities of other people, adopting their perspectives so as to share their experiences. This allows us to enjoy fictional events that would shock and sadden us in real life. When Tony Soprano kills someone, you respond differently than you would to a real murder; you accept and adopt some of the moral premises of the Soprano universe. You become, if just for a moment, Tony Soprano.

Some imaginative pleasures involve the creation of alternative selves. Sometimes we interact with these selves as if they were other people. This might sound terrible, and it can be, as when schizophrenics hear voices that seem to come from outside themselves. But the usual version is harmless. In children, we describe these alternative selves as imaginary friends. The psychologist Marjorie Taylor, who has studied this phenomenon more than anyone, points out three things. First, contrary to some stereotypes, children who have imaginary friends are not losers, loners, or borderline psychotics. If anything, they are slightly more socially adept than other children. Second, the children are in no way deluded: Taylor has rarely met a child who wasn’t fully aware that the character lived only in his or her own imagination. And third, the imaginary friends are genuinely different selves. They often have different desires, interests, and needs from the child’s; they can be unruly, and can frustrate the child. The writer Adam Gopnik wrote about his young daughter’s imaginary companion, Charlie Ravioli, a hip New Yorker whose defining quality was that he was always too busy to play with her.

Long-term imaginary companions are unusual in adults, but they do exist—Taylor finds that many authors who write books with recurring characters claim, fairly convincingly, that these characters have wills of their own and have some say in their fate. But it is not unusual to purposefully create another person in your head to interact with on a short-term basis. Much of daydreaming involves conjuring up people, sometimes as mere physical props (as when daydreaming about sports or sex), but usually as social beings. All of us from time to time hold conversations with people who are not actually there.

Sometimes we get pleasure from sampling alternative selves. Again, you can see the phenomenon in young children, who get a kick out of temporarily adopting the identity of a soldier or a lion. Adults get the same sort of kick; exploring alternative identities seems to be what the Internet was invented for. The sociologist Sherry Turkle has found that people commonly create avatars so as to explore their options in a relatively safe environment. She describes how one 16-year-old girl with an abusive father tried out multiple characters online—a 16-year-old boy, a stronger, more assertive girl—to try to work out what to do in the real world. But often the shift in identity is purely for pleasure. A man can have an alternate identity as a woman; a heterosexual can explore homosexuality; a shy person can try being the life of the party.

Online alternative worlds such as World of Warcraft and Second Life are growing in popularity, and some people now spend more time online than in the real world. One psychologist I know asked a research assistant to try out one of these worlds and report on what it is like and how people behave there. The young woman never came back—she preferred the virtual life to the real one.

Life would be swell if all the selves inhabiting a single mind worked as a team, pulling together for a common goal. But they clash, and sometimes this gives rise to what we call addictions and compulsions.

This is not the traditional view of human frailty. The human condition has long been seen as a battle of good versus evil, reason versus emotion, will versus appetite, superego versus id. The iconic image, from a million movies and cartoons, is of a person with an angel over one shoulder and the devil over the other.

The alternative view keeps the angel and the devil, but casts aside the person in between. The competing selves are not over your shoulder, but inside your head: the angel and the devil, the self who wants to be slim and the one who wants to eat the cake, all exist within one person. Drawing on the research of the psychiatrist George Ainslie, we can make sense of the interaction of these selves by plotting their relative strengths over time, starting with one (the cake eater) being weaker than the other (the dieter). For most of the day, the dieter hums along at his regular power (a 5 on a scale of 1 to 10, say), motivated by the long-term goal of weight loss, and is stronger than the cake eater (a 2). Your consciousness tracks whichever self is winning, so you are deciding not to eat the cake. But as you get closer and closer to the cake, the power of the cake eater rises (3 … 4 …), the lines cross, the cake eater takes over (6), and that becomes the conscious you; at this point, you decide to eat the cake. It’s as if a baton is passed from one self to another.
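
A minimal numerical sketch of that picture, using Ainslie-style hyperbolic discounting (value = reward / (1 + k × delay)); the reward sizes, delays and rate k are invented, and the point is only that the curves cross as the nearer reward approaches:

```python
# Hyperbolic discounting in the Ainslie style: a small, soon reward (the cake)
# overtakes a large, later one (the diet's payoff) as its delay shrinks.
# Reward sizes, delays and k are invented for illustration.

def value(reward, delay_hours, k=1.0):
    return reward / (1 + k * delay_hours)

DIET, CAKE = 10.0, 4.0  # relative reward sizes
GAP = 48.0              # the diet pays off two days after the cake moment

for cake_delay in [100, 40, 30, 10, 1]:
    dieter = value(DIET, cake_delay + GAP)
    eater = value(CAKE, cake_delay)
    winner = "dieter" if dieter > eater else "cake eater"
    print(f"cake {cake_delay:>3}h away: dieter={dieter:.3f}, eater={eater:.3f} -> {winner}")
```

Run it and the dieter wins at 100 and 40 hours out, but by 30 hours out the cake eater has taken over: the baton-passing Bloom describes, produced by nothing more than two crossing curves.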

Sometimes one self can predict that it will later be dominated by another self, and it can act to block the crossing—an act known as self-binding, which Thomas Schelling and the philosopher Jon Elster have explored in detail. Self-binding means that the dominant self schemes against the person it might potentially become—the 5 acts to keep the 2 from becoming a 6. Ulysses wanted to hear the song of the sirens, but he knew it would compel him to walk off the boat and into the sea. So he had his sailors tie him to the mast. Dieters buy food in small portions so they won’t overeat later on; smokers trying to quit tell their friends never to give them cigarettes, no matter how much they may later beg. In her book on gluttony, Francine Prose tells of women who phone hotels where they are going to stay to demand a room with an empty minibar. An alarm clock now for sale rolls away as it sounds the alarm; to shut it off, you have to get up out of bed and find the damn thing.

You might also triumph over your future self by feeding it incomplete or incorrect information. If you’re afraid of panicking in a certain situation, you might deny yourself relevant knowledge—you don’t look down when you’re on the tightrope; you don’t check your stocks if you’re afraid you’ll sell at the first sign of a downturn. Chronically late? Set your watch ahead. Prone to jealousy? Avoid conversations with your spouse about which of your friends is the sexiest.

Working with the psychologists Frank Keil, of Yale University, and Katherine Choe, now at Goucher College, I recently studied young children’s understanding of self-binding, by showing them short movies of people engaged in self-binding and other behaviors and asking them to explain what was going on. The children, aged 4 to 7, easily grasped that someone might put a video game on a high shelf so that another person couldn’t get it. But self-binding confused them: they were mystified when people put away the game so that they themselves couldn’t get hold of it.

But even though young children don’t understand self-binding, they are capable of doing it. In a classic study from the 1970s, psychologists offered children a marshmallow and told them they could either have it right away, or get more if they waited for a few minutes. As you would expect, waiting proved difficult (and performance on this task is a good predictor, much later on, of such things as SAT scores and drug problems), but some children managed it by self-binding—averting their eyes or covering the marshmallow so as to subvert their temptation-prone self for the greater pleasure of the long-term self.

Even pigeons can self-bind. Ainslie conducted an experiment in which he placed pigeons in front of a glowing red key. If they pecked it immediately, they got a small reward right away, but if they waited until the key went dark, they got a larger one. They almost always went for the quick reward—really, it’s hard for a pigeon to restrain itself. But there was a wrinkle: the key glowed green for several seconds before turning red. Pecking the key while it was green would prevent it from turning red and providing the option of the small, quick reward. Some of the pigeons learned to use the green key to help themselves hold out for the big reward, just as a person might put temptation out of reach.

For adult humans, though, the problem is that the self you are trying to bind has resources of its own. Fighting your Bad Self is serious business; whole sections of bookstores are devoted to it. We bribe and threaten and cajole, just as if we were dealing with an addicted friend. Vague commitments like “I promise to drink only on special occasions” often fail, because the Bad Self can weasel out of them, rationalizing that it’s always a special occasion. Bright-line rules like “I will never play video games again” are also vulnerable, because the Bad Self can argue that these are unreasonable—and, worse, once you slip, it can argue that the plan is unworkable. For every argument made by the dieting self—“This diet is really working” or “I really need to lose weight”—the cake eater can respond with another—“This will never work” or “I’m too vain” or “You only live once.” Your long-term self reads voraciously about the benefits of regular exercise and healthy eating; the cake eater prefers articles showing that obesity isn’t really such a problem. It’s not that the flesh is weak; sometimes the flesh is pretty damn smart.

It used to be simpler. According to the traditional view, a single, long-term-planning self—a you—battles against passions, compulsions, impulses, and addictions. We have no problem choosing, as individuals or as a society, who should win, because only one interest is at stake—one person is at war with his or her desires. And while knowing the right thing to do can be terribly difficult, the decision is still based on the rational thoughts of a rational being.

Seeing things this way means we are often mistaken about what makes us happy. Consider again what happens when we have children. Pretty much no matter how you test it, children make us less happy. The evidence isn’t just from diary studies; surveys of marital satisfaction show that couples tend to start off happy, get less happy when they have kids, and become happy again only once the kids leave the house. As the psychologist Daniel Gilbert puts it, “Despite what we read in the popular press, the only known symptom of ‘empty-nest syndrome’ is increased smiling.” So why do people believe that children give them so much pleasure? Gilbert sees it as an illusion, a failure of affective forecasting. Society’s needs are served when people believe that having children is a good thing, so we are deluged with images and stories about how wonderful kids are. We think they make us happy, though they actually don’t.

The theory of multiple selves offers a different perspective. If struggles over happiness involve clashes between distinct internal selves, we can no longer be so sure that our conflicting judgments over time reflect irrationality or error. There is no inconsistency between someone’s anxiously hiking through the Amazon wishing she were home in a warm bath and, weeks later, feeling good about being the sort of adventurous soul who goes into the rain forest. In an important sense, the person in the Amazon is not the same person as the one back home safely recalling the experience, just as the person who honestly believes that his children are the great joy in his life might not be the same person who finds them terribly annoying when he’s actually with them.

Even if each of us is a community, all the members shouldn’t get equal say. Some members are best thought of as small-minded children—and we don’t give 6-year-olds the right to vote. Just as in society, the adults within us have the right—indeed, the obligation—to rein in the children. In fact, talk of “children” versus “adults” within an individual isn’t only a metaphor; one reason to favor the longer-term self is that it really is older and more experienced. We typically spend more of our lives not wanting to snort coke, smoke, or overeat than we spend wanting to do these things; this means that the long-term self has more time to reflect. It is less selfish; it talks to other people, reads books, and so on. And it tries to control the short-term selves. It joins Alcoholics Anonymous, buys the runaway clock, and sees the therapist. As Jon Elster observes, the long-term, sober self is a truer self, because it tries to bind the short-term, drunk self. The long-term, sober self is the adult.

Governments and businesses, recognizing these tendencies, have started offering self-binding schemes. Thousands of compulsive gamblers in Missouri have chosen to sign contracts stating that if they ever enter a casino, anything they win will be confiscated by the state, and they could be arrested. Some of my colleagues at Yale have developed an online service whereby you set a goal and agree to put up a certain amount of money to try to ensure that you meet it. If you succeed, you pay nothing; if you fail, the money is given to charity—or, in a clever twist, to an organization you oppose. A liberal trying to lose a pound a week, for instance, can punish herself for missing her goal by having $100 donated to the George W. Bush Presidential Library.

The natural extension of this type of self-binding is what the economist Richard Thaler and the legal scholar Cass Sunstein describe as “libertarian paternalism”—a movement to engineer situations so that people retain their choices (the libertarian part), but in such a way that these choices are biased to favor people’s better selves (the paternalism part). For instance, many people fail to save enough money for the future; they find it too confusing or onerous to choose a retirement plan. Thaler and Sunstein suggest that the default be switched so that employees would automatically be enrolled in a savings plan, and would have to take action to opt out. A second example concerns the process of organ donation. When asked, most Americans say that they would wish to donate their organs if they were to become brain-dead from an accident—but only about half actually have their driver’s license marked for donation, or carry an organ-donor card. Thaler and Sunstein have discussed a different idea: people could easily opt out of being a donor, but if they do nothing, they are assumed to consent. Such proposals are not merely academic musings; they are starting to influence law and policy, and might do so increasingly in the future. Both Thaler and Sunstein act as advisers to politicians and policy makers, most notably Barack Obama.

So what’s not to like? There is a real appeal to anything that makes self-binding easier. As I write this article, I’m using a program that disables my network connections for a selected amount of time and does not allow me to switch them back on, thereby forcing me to actually write instead of checking my e-mail or reading blogs. A harsher (and more expensive) method, advised by the author of a self-help book, is to remove your Internet cable and FedEx it to yourself—guaranteeing a day without online distractions. One can also chemically boost the long-term self through drugs such as Adderall, which improves concentration and focus. The journalist Joshua Foer describes how it enabled him to write for hour-long chunks, far longer than he was usually capable of: “The part of my brain that makes me curious about whether I have new e-mails in my inbox apparently shut down.”

It’s more controversial, of course, when someone else does the binding. I wouldn’t be very happy if my department chair forced me to take Adderall, or if the government fined me for being overweight and not trying to slim down (as Alabama is planning to do to some state employees). But some “other-binding” already exists—think of the mandatory waiting periods for getting a divorce or buying a gun. You are not prevented from eventually taking these actions, but you are forced to think them over, giving the contemplative self the chance to override the impulsive self. And since governments and businesses are constantly asking people to make choices (about precisely such things as whether to be an organ donor), they inevitably have to provide a default option. If decisions have to be made, why not structure them to be in individuals’ and society’s best interests?

The main problem with all of this is that the long-term self is not always right. Sometimes the short-term self should not be bound. Of course, most addictions are well worth getting rid of. When a mother becomes addicted to cocaine, the pleasure from the drug seems to hijack the neural system that would otherwise be devoted to bonding with her baby. It obviously makes sense here to bind the drug user, the short-term self. On the other hand, from a neural and psychological standpoint, a mother’s love for her baby can also be seen as an addiction. But here binding would be strange and immoral; this addiction is a good one. Someone who becomes morbidly obese needs to do more self-binding, but an obsessive dieter might need to do less. We think one way about someone who gives up Internet porn to spend time building houses for the poor, and another way entirely about someone who successfully thwarts his short-term desire to play with his children so that he can devote more energy to making his second million. The long-term, contemplative self should not always win.

This is particularly true when it comes to morality. Many cruel acts are perpetrated by people who can’t or don’t control their short-term impulses or who act in certain ways—such as getting drunk—that lead to a dampening of the contemplative self. But evil acts are also committed by smart people who adopt carefully thought-out belief systems that allow them to ignore their more morally astute gut feelings. Many slave owners were rational men who used their intelligence to defend slavery, arguing that the institution was in the best interests of those who were enslaved, and that it was grounded in scripture: Africans were the descendants of Ham, condemned by God to be “servants unto servants.” Terrorist acts such as suicide bombings are not typically carried out in an emotional frenzy; they are the consequences of deeply held belief systems and long-term deliberative planning. One of the grimmest examples of rationality gone bad can be found in the psychiatrist Robert Jay Lifton’s discussion of Nazi doctors. These men acted purposefully for years to distance themselves from their emotions, creating what Lifton describes as an “Auschwitz self” that enabled them to prevent any normal, unschooled human kindness from interfering with their jobs.

I wouldn’t want to live next door to someone whose behavior was dominated by his short-term selves, and I wouldn’t want to be such a person, either. But there is also something wrong with people who go too far in the other direction. We benefit, intellectually and personally, from the interplay between different selves, from the balance between long-term contemplation and short-term impulse. We should be wary about tipping the scales too far. The community of selves shouldn’t be a democracy, but it shouldn’t be a dictatorship, either.

Paul Bloom is a professor of psychology at Yale University and the author of Descartes’ Baby: How the Science of Child Development Explains What Makes Us Human.

Also see: Interview: “Song of My Selves”. Psychologist Paul Bloom reflects on happiness, desire, memory, and the chaotic community that lives inside every human mind.

There are endless ways to stimulate creativity, and some of these 10 Lifehacker ideas are pretty neat!

It doesn’t matter whether you’re an artist or a businessperson: we all need a little creative thinking in our work. If you find you’re getting stuck, here are some of the best ways to get those creative juices flowing again.

Photo by Drew Coffman.

10. Plan Ahead

Just because you’re being creative doesn’t mean you can skip the organisation part of being productive. Making plans ahead of time can help you avoid creative plateaus, and deferring judgement until you’ve finished frees you to explore less conventional ideas. Creativity won’t strike you on cue, but a simple mind map and a bit of creative focus can go a long way.

9. Set Some Weird Rules

While we’ve been hammered with certain guidelines for running businesses and doing good work, to encourage creativity you sometimes need to set some weirder rules. Reward failure, but punish inaction. Create some conflict. Think contrary to what you usually hear, and mix things up to get your mind thinking in new ways.

Photo by Hararca.

8. Think Inside The Box

All your life you’ve probably heard “think outside the box”. It’s a bit more complicated than that, though—instead of thinking completely differently (which is not only hard, but ignores the principles we’ve found to work), think inside the box and build on those already-useful ideas in new ways. Christopher Peterson said it best: “If you never venture outside the box, you will probably not be creative. But if you never get inside the box, you will certainly be stupid.”

Photo by Ronit Slyper.

7. Don’t Stress About Being Truly Original

If you reject anything out of a desire for true originality, you’ll never get anywhere. It’s all been done before, and the key isn’t coming up with a truly original idea, it’s knowing what to steal from other artists and how to make it new and interesting.

6. Stay Motivated With Side Projects

If you focus too hard on one project at a time, you’re bound to get stuck in a creative block, or at least a spell of low motivation. “Distracting” yourself with other, smaller projects gets you away from your big project while keeping you productive and creative. When you’re done with one of those, you’ll come back to your big project with a new mindset and renewed enthusiasm.

Photo by Marcin Wichary.

5. Change Up Your Morning Routine

There’s a reason some of the most creative people are known to be smelly and unkempt. While we aren’t about to tell you to ditch hygiene altogether, sometimes switching up your morning routine can give you a creative head start you wouldn’t get otherwise. Try getting up in the morning and jumping right into your work—you may hit creative moments you wouldn’t have had after showering, getting dressed, and so on.

Photo by Chaos Manor Reviews.

4. Get Some Exercise

A change of scenery is always a good way to spark a burst of creativity, but a good 30 minutes of exercise will actually boost it. In fact, exercise boosts nearly every dimension of cognition, so exercise regularly to get your blood (and creative juices) flowing.

Photo by eduardomineo.

3. Stop Working Mid-Thought

If you find that you start some days with no idea where your project is going next, consider when you stop working the day before. Instead of looking for logical breaking points, always know what’s coming next—that way, when you start up the next day, you can build up a bit of creative momentum before moving on to the new stuff.

2. Get Some Sleep

We all know how great sleep can be for your health, but it’s good for your creative brain too. A Harvard researcher found that if you sleep on new ideas, you’re a good deal more likely to make connections between distantly related points. If you’re on a streak, there’s nothing wrong with burning the midnight oil once in a while, but don’t neglect regular, quality sleep if you want to keep that streak going.

Photo by Deeleea.

1. Know When To Take Time Off

We can’t all be creative 100% of the time, so don’t burn yourself out by working 24/7/365. Designer Stefan Sagmeister actually takes a year-long creative sabbatical every seven years to rejuvenate his creativity. That’s obviously not in the cards for everyone, but do as much as you can—even a little afternoon daydreaming can go a long way.

Photo by Kr. B.

Why We Buy: How to Avoid 10 Costly Cognitive Biases
The psychology of money: post-purchase rationalisation, the relativity trap, rosy retrospection, the restraint bias and more…

We all make mistakes with money, some more than others. And in this economy, who can afford them?

But many of these mistakes are avoidable if we can understand how we think about money. Here are 10 biases that psychological research has shown affect our judgement…and how to avoid them.

1. Status quo bias

One of the biggest reasons people lose out financially is that they stick with what they know. We tend to choose the same things we chose before, and we keep doing so even when better options are available, whether in goods or services.

Research on investment decisions shows this bias (e.g. Samuelson & Zeckhauser, 1988). People stick to the same old pension plans, stocks and shares, even though there are better options available.

It’s hard to change because it involves more effort and we want to avoid regretting our decision. But there is better value out there if you’re prepared to look.

2. Post-purchase rationalisation

After we buy something that’s not right, we convince ourselves it is right.

Most people refuse to accept they’ve made a mistake, especially with a big purchase. Marketers know this, so they try to encourage part-ownership first, using things like money-back guarantees. Once you’ve made a decision, you convince yourself it was the right one (see: cognitive dissonance), and also start to value it more because you own it (e.g. Cohen et al., 1970).

Fight it! If the goods or services aren’t right, return them. Most countries’ legal systems incorporate a cooling-off period, so don’t rationalise; return it!

3. Relativity trap

We think about prices relatively and businesses know this. That’s why recommended retail prices are set high, then discounted. Some expensive options on restaurant menus are there only to make the regular meals look reasonable in comparison.

The relativity trap is also called the anchoring effect. One price acts like an anchor on our thinking. It’s easy to fall for, but also easy to surmount by making comparisons they don’t want you to make (read more about the relativity trap).

Use price comparison websites. And try comparing across categories of goods. Is an iPad really worth a month’s groceries or three years of cinema trips or a new set of clothes?

4. Ownership effect

We value things more when we own them. So when it comes to selling our stuff, we tend to set the price too high.

It’s why you sometimes see second-hand goods advertised at ridiculous prices. Unlike professionals, amateur sellers develop an emotional attachment to their possessions (read the research on 6 quirks of ownership).

It also works the other way. When bidding on eBay, it’s possible to feel you already partly own something before you actually buy it. So you end up paying above the market value.

When buying or selling you have to try and be dispassionate. Be aware that unless you set limits, your unconscious may take over.

5. Present bias

In general, humans prefer to get the pleasure right now and leave the pain for later. Economists model this as hyperbolic discounting.

In a study by Read and van Leeuwen (1998), when making food choices for next week, 74% of participants chose fruit. But when deciding for today, 70% chose chocolate. That’s humans for you: chocolate today, fruit next week.

The same is true of money. Marketers know we are suckers for getting discounts right now, so they hide the pain for later on (think mobile phone deals). Unfortunately buy now, pay later offers are often very bad deals.

One way to get around this is to think about your future self when making a purchasing decision. Imagine how ‘future you’ will see the decisions of ‘present you’. If ‘future you’ wouldn’t like it, don’t do it.
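
In symbols, and as a standard textbook form of the idea rather than anything from the study itself, hyperbolic discounting values a reward of size \(A\) arriving after delay \(D\) as

\[ V = \frac{A}{1 + kD} \]

Plugging in made-up numbers with \(k = 1\) per day, chocolate worth 4 units now versus fruit’s health payoff worth 6 units a day later: judged from a week away the fruit wins (\(6/9 > 4/8\)), but judged on the day itself the chocolate wins (\(4/1 > 6/2\)). Nothing about the rewards changed; only the delays shrank.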

6. Fear of losses

People tend to sell things when they go up in price, but hold on to them when they go down. It’s one demonstration of our natural desire to avoid losses. This effect has been seen in a number of studies of stock-market trading (e.g. Weber & Camerer, 1998).

The fact that prices are falling, though, is a big clue. If you can fight the fear of losing, in the end it could leave you better off.

7. Familiarity bias

Advertising works partly because we like what we know, even if we only vaguely know it. We even choose familiar things when there are clear signals that they’re not the best option (Richter & Spath, 2006).

Always check whether you’re buying something for the right reasons. Mere familiarity means the advertisers are winning. Smaller companies that can’t afford (or won’t pay for) pricey TV commercials often provide better products and services.

8. Rosy retrospection

We tend to remember our decisions as better than they really were.

This is a problem when we come to make similar decisions again. We have a bias towards thinking our previous decision was a good one; it could be the holiday, house or car you chose (e.g. Mitchell & Thompson, 1994). That’s partly why we end up making the same financial mistakes again: we forget we made the same mistake before.

Before making an important financial decision, try to dredge up the real outcomes of previous decisions. Only without the rose-tinted spectacles can we avoid repeating our mistakes.

9. Free!

The word ‘free’ has a magical hold on us, and marketers know it. Behavioural economics research shows we sometimes take a worse deal overall just to get something for free. Watch out when you’re offered something for ‘free’; the overall deal is sometimes not that good.

10. Restraint bias

Many mistakes with money result from a lack of self-control. We think we’ll control ourselves, but, when faced with temptation, we can’t. Studies like Nordgren et al. (2009) show people are woefully optimistic in predicting their self-control.

So, don’t put yourself in the situation of being tempted. This is why cutting up credit cards is often recommended. We’re mostly weaker than we think, so we shouldn’t give ourselves the opportunity.

Image credit: Jason Rogers

Via PsyBlog

The interesting website “You Are Not So Smart” published a post a couple of days ago on the hivemind: the kind that leads to deindividuation in groups. Deindividuation makes people shout at someone standing on top of a building to jump, then film the fall or tweet the action. Three ingredients lead to deindividuation: anonymity, group size and arousal (being aroused by the environment and feeling aroused oneself).

The Misconception: People who riot and loot are scum who were just looking for an excuse to steal and be violent.

The Truth: You are prone to losing your individuality and becoming absorbed into a hivemind under the right conditions.

Source: Improv Everywhere

When a crowd gathers near a suicidal jumper something terrible is unleashed.

In Seattle in 2001, a 26-year-old woman who had recently ended a relationship held up traffic for a little too long as she considered the implications of leaping to her death. As motorists began to back up on the bridge and become irate, they started yelling “Jump, bitch, jump!” until she did.

Cases like this aren’t unusual. …

Psychologists call this phenomenon deindividuation… In certain situations, you can expect to be de-individualized. Unlike conformity, in which you adopt the ideas and behaviors of others for acceptance and inclusion, deindividuation is mostly unconscious and more likely to lead to mischief. As psychologist David G. Myers said, it is “doing together what you would not do alone.”

Read the whole post

Learning styles

It reminds me of the days of superlearning and learning to learn: Understanding how you process information to help you get organized (parts 1+2) on the Unclutterer blog. It’s all about working out what your information-processing modality is (step one) and then applying some common-sense helpers accordingly to use information more effectively (step two).

Unclutterer covers three modalities: visual, auditory and kinesthetic information processing (as usual, no olfactory 😉 ). The site asks you a number of questions that let you work out which modality, or modalities, you mainly use to process information. Based on the outcome, you are then offered a number of strategies that support how you retain and otherwise use information more effectively.
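
A minimal sketch of the kind of scoring such a questionnaire implies; the answers and categories here are hypothetical, not Unclutterer’s actual quiz:

```python
# Toy scorer for a modality questionnaire of the kind described above.
# The answer data is hypothetical, not Unclutterer's own quiz.
from collections import Counter

# Each chosen answer, pre-mapped to the modality it indicates:
answers = ["visual", "auditory", "visual", "kinesthetic", "visual", "auditory"]

tally = Counter(answers)
top = max(tally.values())
primary = [m for m, n in tally.items() if n == top]  # ties give several primaries

print(tally)    # Counter({'visual': 3, 'auditory': 2, 'kinesthetic': 1})
print(primary)  # ['visual'] -> pick the strategies aimed at visual processors
```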

The whole approach is a bit basic, but once you get the gist of it you’ll pretty quickly develop your own ways of tuning into how you best learn new things.

Unclutterer Via Lifehacker

[Images: six procrastinator types: The Overdoer, The Defier, The Dreamer, The Self-Critical, The Perfectionist, The Worrier]