by Carmen Nobel
The children’s classic The Polar Express tells the fanciful story of a young boy’s journey to the North Pole on a train filled with chocolate and candy. But when Warner Brothers released a $165 million computer-animated version of the tale, many critics described the film not as a happy Christmas fantasy but as a horror movie. “This season’s biggest holiday extravaganza, ‘The Polar Express,’ should be subtitled ‘The Night of the Living Dead,’ ” groused CNN reviewer Paul Clinton. “If I were a kid, I’d have nightmares,” wrote Geoff Pevere in the Toronto Star.
The problem was that while the film’s characters appeared astonishingly human in many ways, their eyes looked lifeless. Viewers were creeped out.
Humans are often delighted by objects with vaguely humanoid characteristics—think Pet Rocks, toy robots, or sock puppets. But there is a point at which an object looks almost human, yet not quite human enough, and the result is disturbing. It’s called the uncanny valley. And for Christine Looser, it’s the starting point for a line of research aimed at discovering how our brains detect life, and how we distinguish the cognizant from the mindless.
“What I’m interested in is how and why the brain evolved to pay attention to other people,” says Looser, a fellow at Harvard Business School who holds a PhD in cognitive neuroscience.
At first glance, a neuroscientist and a business school might seem like an odd fit. But the fact is that the business world has been paying increasing attention to how the brain works. The field of neuroeconomics has gained ground in the past 10 years, with work exploring the brain processes that underlie decision-making. There is the nascent but fast-growing field of neuromarketing, which uses brain-tracking tools to determine why consumers prefer some products over others. And there is neuroleadership, which applies neuroscience to management research. Looser is looking to integrate insights from social psychology, neuroscience, and business. “The big-picture question for me is in how we interact with other people,” she says. “It’s hard to come up with any business transaction you can do alone.”
The Tipping Point of Animacy
Specifically, Looser is interested in what happens to our minds when we perceive that something is alive and socially relevant. The importance of that process goes far beyond box-office results for CGI (computer-generated imagery) films. In fact, the ability to distinguish the animate from the inanimate may be the brain’s most important survival mechanism, argues Looser, noting that there is good evidence that our minds work entirely differently when dealing with people as opposed to objects.
“You have to monitor people more than other things,” she says. “As an extreme example, I don’t have to worry about that coffee cup unless you pick it up and throw it at me. People have to be both remembered and understood so that we can predict what they’re going to do in the future. How your brain does that is what’s interesting to me—when it works well, when it fails, and how it can influence the decisions we make.”
To determine the point at which humans recognize the impression of life, or animacy, in a face, Looser conducted a series of experiments with her PhD advisor at Dartmouth College, psychology professor Thalia Wheatley. (They detail their findings in “The Tipping Point of Animacy: How, When, and Where We Perceive Life in a Face,” published in the December 2010 issue of Psychological Science.)
The researchers used morphing software to create a visual continuum of animacy, with images of doll faces at one end of the spectrum and images of similar human faces at the other. The images in between were morphed combinations of real and fake, with each successive iteration containing a higher percentage of human face. In total, they created 220 morphed faces.
Sixty college students were asked to evaluate each face on the continuum for two attributes: animacy (on a scale of 1, “definitely alive,” to 7, “definitely not alive”) and pleasantness (from 1, “very unpleasant,” to 7, “very pleasant”). In a related experiment, 29 of the participants returned two months later to repeat the task, but this time they were asked to gauge whether a face could formulate a plan, could feel pain, and had a mind.
The results showed that participants’ perceptions of aliveness didn’t track linearly with the morphing continuum. Rather, they tended to deem faces “definitely not alive” for the majority of the continuum, perceiving life only when the morphed face was almost entirely human. “People are really hypersensitive to what makes something look alive,” Looser says.
Not surprisingly, the ratings of whether a face could think or feel correlated with the animacy ratings: only the most human faces at the end of the spectrum were deemed able to perceive anything. The pleasantness ratings, on the other hand, did track linearly. The higher the percentage of human face in a morphed picture, the higher the pleasantness rating it received. That said, the researchers did see evidence of the uncanny valley in follow-up conversations with the student subjects. “Though pleasantness did not decrease around the animacy category boundary, a number of participants anecdotally reported that they found some of the morphed images creepy or unsettling,” they write in “The Tipping Point of Animacy.”
The Eyes Have It
A follow-up experiment determined which part of a face best reveals its animacy. The researchers used the same morphed images, but cropped each face to reveal only one of four facial features: eyes, mouth, nose, or skin. Participants were then asked to gauge how alive the image looked based solely on that one visible feature.
The results showed that all the nose and skin images received relatively low aliveness ratings from the participants, regardless of whether the picture actually depicted human or non-human body parts. In short, noses and skin were not good indicators of life. Mouths scored slightly higher as the pictures grew progressively more human.
But in looking at images of an eye, participants easily gauged the difference between human and non-human images. This finding helps explain why viewers may have been especially disturbed by the dead-eyed characters in The Polar Express, and why film producers need to be mindful of the uncanny valley. “Eyes convey a wealth of information, from attention to emotion and intent,” the researchers write. “Therefore, it is no wonder that eyes have been the Achilles heel of CGI.”
In another study, Looser, Wheatley, and Swaroop Guntupalli, also of Dartmouth, conducted fMRI scans on 30 participants for an in-depth look at how brains evaluate faces. From previous studies the researchers knew that certain brain areas react more strongly to faces than they do to faceless objects. But, they wondered, is the brain more interested in whether someone (or something) has a face, or whether someone (or something) is actually alive? In other words, do our brains first look for faces and then look for life, or vice versa? “We were looking at how the brain prioritizes visual information and social information,” Looser says.
While participants viewed a variety of images including human faces, doll faces, dog faces, toy dog faces, and clocks, the research team recorded the activity of their brains. The results showed that some brain areas prioritized the shape of the face, grouping humans with dolls, while other brain areas prioritized whether the face was alive, grouping humans with dogs. “Even though they look nothing alike, there are areas that respond to these living things as though they’re very similar,” says Looser, who explains that another study—using scalp electrodes—indicated that brains respond to all human-like faces at first, and then subsequently respond most strongly to the faces that are alive.
“The prioritization of form may maximize survival: better to false alarm to a spurious face-like pattern in a rock than miss a predator,” the researchers write in their paper “Multivoxel Patterns in Face-Sensitive Temporal Regions Reveal an Encoding Schema Based on Detecting Life in a Face,” forthcoming in the journal Social Cognitive and Affective Neuroscience.
An ongoing study is using eye-tracking software to monitor participants’ viewing patterns as they look at pictures and short movie clips. Almost without fail, they look at faces before their eyes go anywhere else—even if the faces are inanimate, Looser says.
There are existential aspects to the findings. “There seems to be an intrinsic desire to see minds even where there are no minds,” she explains. “People see faces in clouds, in burnt toast, and in all these things where it just doesn’t make sense. It reveals a sensitivity to human form that might underlie an intrinsic motivation to pay attention to and connect with others.”
But Looser sees practical applications as well. At Harvard, she is conducting research to determine whether we’re more likely to retain and remember information—data and objects—if the information is paired with a picture of a human face. Such information could help marketers to create more memorable product advertisements and executives to design presentations that really stick with their customers, clients, and employees. “One possibility could be that the human face is distracting, so you remember the face but not the thing, but the other possibility, which might be more likely, is that seeing a human face is giving you access to advanced cognitive mechanisms, and so you will encode things more deeply because they’re paired with a face,” she says.
Looser also plans to study whether our brains get overloaded in the presence of too many faces, which may yield insight into the effects of a crowded office—not to mention the morning commute. “Taking the subway can be so draining,” she says. “Is there a way to test how much energy you lose by being around large crowds? And is there a way to be around a crowd of people in such a way that it doesn’t feel draining? There are a lot of open questions that psychology and neuroscience can help answer. I think they can have a real impact on the way we plan our days and forge relationships.”
ABOUT THE AUTHOR
Carmen Nobel is the senior editor of Harvard Business School Working Knowledge.
Article retrieved from: http://hbswk.hbs.edu/item/neuroeconomics-eyes-brain-business