
The Man Who Lied to His Laptop: What We Can Learn About Ourselves from Our Machines - Softcover

 
9781617230042: The Man Who Lied to His Laptop: What We Can Learn About Ourselves from Our Machines
Counterintuitive insights about building successful relationships, based on research into human-computer interaction.

Books like Predictably Irrational and Sway have revolutionized how we view human behavior. Now, Stanford professor Clifford Nass has discovered a set of rules for effective human relationships, drawn from an unlikely source: his study of our interactions with computers.

Based on his decades of research, Nass demonstrates that, although we might deny it, we treat computers and other devices like people: we empathize with them, argue with them, form bonds with them. We even lie to them to protect their feelings.

This fundamental revelation has led to groundbreaking research on how people should behave with one another. Nass's research shows that:
  • Mixing criticism and praise is a wildly ineffective method of evaluation
  • Flattery works, even when the recipient knows it's fake
  • Introverts and extroverts are each best at selling to one of their own
Nass's discoveries provide nothing less than a new blueprint for successful human relationships.

"synopsis" may belong to another edition of this title.

About the Author:
Clifford Nass is the Thomas M. Storke Professor at Stanford University and director of the Communication between Humans and Interactive Media (CHIMe) Lab. He is a popular designer, consultant, and keynote speaker, and is widely quoted by the media on issues such as the impact of multitasking on young minds. He lives in Silicon Valley.
Excerpt. © Reprinted by permission. All rights reserved.
Introduction

Why I Study Computers to Uncover Social Strategies

When you work with people, you can usually tell whether things are going smoothly or are falling apart. It’s much harder to figure out why things are going wrong and how to improve them. People seem too complex for you to consistently make them happier or more cooperative, or to make them see you as more intelligent and persuasive.

Over the past twenty years, I have discovered that the social world is much less complicated than it appears. In fact, interactions between people are governed by simple rules and patterns. These truths aren’t vague generalities, such as advice from our grandparents (“nothing ventured, nothing gained”), pop psychologists (“follow your dreams”), or celebrities (“don’t take no for an answer”). Instead, in this book I present scientifically grounded findings on how to praise and criticize, how to work with different types of people, how to form teams, how to manage emotions, and how to persuade others.

I didn’t set out to discover ways to guide successful human relationships. As a professor in many departments—communication; computer science; education; science, technology, and society; sociology; and symbolic systems—and an industry consultant, I work at the intersection of social science and technology. My research at Stanford University and my collaborations with corporate teams had originally been focused on making computers and other technologies easier, more effective, and more pleasant for people to use. I didn’t know that I would be thrust into the world of successful human relationships until I encountered three peculiar problems: an obnoxious paper clip, a suspicious auditor, and an untrustworthy navigator.

In 1998, Microsoft asked me to provide evidence that it was possible to improve one of the worst software designs in computer history: Clippy, the animated paper clip in Microsoft Office. While I have often been asked by companies to make their interfaces easier to use, I had a real challenge on my hands with Clippy. The mere mention of his name to computer users brought on levels of hatred usually reserved for jilted lovers and mortal enemies. There were “I hate Clippy” Web sites, videos, and T-shirts in numerous languages. One of the first viral videos on the Internet—well before YouTube made posting videos common—depicted a person mangling a live version of Clippy, screaming, “I hate you, you lousy paper clip!”

One might think that the hostility toward Clippy emerged because grown-ups don’t like animated characters. But popular culture demonstrates that adults can indeed have rich relationships with cartoons. For many years, licensing for the animated California Raisins (originally developed as an advertising gimmick by the California Raisin Advisory Board) yielded higher revenues than the actual raisin industry. The campaign’s success in fact helped motivate Microsoft to deploy Clippy in the first place. (Bill Gates envisioned a future of Clippy mugs, T-shirts, and other merchandise.) Similarly, Homer Simpson, Fred Flintstone, and Bugs Bunny all have name recognition and star power equivalent to the most famous human celebrities. What about Clippy, then, aroused such animosity in people?

Around this same time, my second mystery appeared. A market-analysis firm asked me to explain why employees at some companies had started reporting dramatic increases in the approval ratings of all the software applications they were using.

I started my investigation by comparing the newly satisfied users with those who had experienced no change in satisfaction. Strangely, I found that the satisfied and dissatisfied companies were relatively uniform with regard to their industries (banking versus retail), the types of computers being used (PCs versus Macs), the categories of software their employees worked with (programming versus word processing), and those employees' technical skill levels (novice versus expert).

I then looked at how the researchers surveyed the companies (how often, by whom, how many times). The only difference I found was that the companies that had started reporting higher approval ratings had changed their procedure for obtaining the evaluation. Formerly, all of the companies had people evaluate software on a separate “evaluation” computer. Some companies later changed that procedure and had their employees evaluate the software on the same computer they normally worked with. Those companies subsequently reported higher approval ratings. Why would people give software higher ratings on one computer as compared to another, identical computer?

My third problem concerned the navigation system BMW used in its Five Series car in Germany. BMW represents the pinnacle of German engineering excellence, and at the time its navigation system was arguably well ahead of other companies in terms of accuracy and functionality. Despite that fact, BMW was forced to recall the product. What was the problem? It turns out that the system had a female voice, and male German drivers refused to take directions from a woman! The service desk received numerous calls from agitated German men that went something like this:

Customer: I can’t use my navigation system.
Operator: I’m very sorry about that, sir. What seems to be the problem?
Customer: A woman should not be giving directions.
Operator: Sir, it is not really a woman. It is only a recorded voice.
Customer: I don’t trust directions from a woman.
Operator: Sir, if it makes you feel better, I am certain that the engineers that built the system and the cartographers who figured out the directions were all men.
Customer: It doesn’t matter. It simply doesn’t work.

Something wasn’t right, but the logic seemed impregnable.



How a Sock Rescued My Research

While these three dilemmas existed in vastly different products, industries, and domains, one critical insight allowed me to address all of them. My epiphany occurred while I was sitting in a hotel room, flipping through television channels. Suddenly, I saw Shari Lewis, the great puppeteer. She caught my attention for three reasons. First, instead of entertaining children, she was on C-SPAN testifying before Congress. Second, she had brought along her sock puppet Lamb Chop (not the first “puppet” to have appeared before Congress). Third, Lamb Chop was testifying in response to a congressman’s question.

In her childlike “Lamb Choppy” voice (very distinct from Lewis’s Bronx accent), Lamb Chop said, “Violence on television is very bad for children. It should be regulated.” The representative then asked, “Do you agree with Lamb Chop, Ms. Lewis?” It took the gallery 1.6 seconds to laugh, the other congressmen 3.5 seconds to laugh, and the congressman who asked the question an excruciating 7.4 seconds to realize the foolishness of his question.

The exchange, while leaving me concerned for the fate of democracy, also struck me as very natural: here was someone with a face and a voice, and here was someone else—albeit a sock—with its own face and voice. Why shouldn’t they be asked for their opinions individually? Perhaps the seemingly absolute line between how we perceive and treat other people and how we perceive and treat things such as puppets was fuzzier than commonly believed.

I had seen that, given the slightest encouragement, people will treat a sock like a person—in socially appropriate ways. I decided to apply this understanding to unraveling the seemingly illogical behaviors toward technology that I had previously observed. I started with the despised Clippy. If you think about people’s interaction with Clippy as a social relationship, how would you assess Clippy’s behavior? Abysmal, that’s how. He is utterly clueless and oblivious to the appropriate ways to treat people. Every time a user typed “Dear . . . ,” Clippy would dutifully propose, “I see you are writing a letter. Would you like some help?”—no matter how many times the user had rejected this offer in the past. Clippy would give unhelpful answers to questions, and when the user rephrased the question, Clippy would give the same unhelpful answers again. No matter how long users worked with Clippy, he never learned their names or preferences. Indeed, Clippy made it clear that he was not at all interested in getting to know them. If you think of Clippy as a person, of course he would evoke hatred and scorn.

To stop Clippy’s annoying habits or to have him learn about his users would have required advanced artificial-intelligence technology, resulting in a great deal of design and development time. To show Microsoft how a small change could make him popular, I needed an easier solution. I searched through the social science literature to find simple tactics that unpopular people use to make friends.

The most powerful strategy I found was to create a scapegoat. I therefore designed a new version of Clippy. After Clippy made a suggestion or answered a question, he would ask, “Was that helpful?” and then present buttons for “yes” and “no.” If the user clicked “no,” Clippy would say, “That gets me really angry! Let’s tell Microsoft how bad their help system is.” He would then pop up an e-mail to be sent to “Manager, Microsoft Support,” with the subject, “Your help system needs work!” After giving the user a couple of minutes to type a complaint, Clippy would say, “C’mon! You can be tougher than that. Let ’em have it!”

We showed this system to twenty-five computer users, and the results were unanimous: people fell in love with the new Clippy! A long-standing business user of Microsoft Office exclaimed, “Clippy is awesome!” An avowed Clippy hater said, “He’s so supportive!” And a user who despised “eye candy” in software said, “I wish all software was like this!” Virtually all of the users lauded Clippy 2.0 as a marvelous innovation.

Without any fundamental change in the software, the right social strategy rescued Clippy from the list of Most Hated Software of All Time; creating a scapegoat bonded Clippy and the user against a common enemy. Unfortunately, that enemy was Microsoft, and while impressed with our ability to make Clippy lovable, the company did not pursue our approach. When Microsoft retired Clippy in 2007, it invited people to shoot staples at him before his final burial.

Did the social approach also help explain users’ puzzling enthusiasm for their software when they gave feedback to the computer they had just worked with? Think about this as a social situation with a person rather than with a computer being evaluated. If you had just worked with someone and the person asked, “How did I do?” the polite thing to do would be to exaggerate the positive and downplay the negative. Meanwhile, if someone else asked you how that person did, you would be more honest. Similarly, the higher ratings of the software when it was evaluated on the same computer could have been due to users’ desire to be polite to the computer and their perception of the second computer as a neutral party. Did users feel a social pull when evaluating the computer they had worked with, hiding their true feelings and saying nicer things in order to avoid “hurting the computer’s feelings”?

To answer this question, I designed a study to re-create the typical scenarios in companies that evaluate their software. I had people work with a piece of software for thirty minutes and then asked them a series of questions concerning their feelings about the software, such as, “How likely would you be to buy this software?” and “How much did you enjoy using this software?” One group of users answered the questions on the computer they worked with; another group answered the questions on a separate but identical computer across the room.

In a result that still surprises me fifteen years later, users entered more positive responses on the computer that asked about itself than they did on the separate, “objective” computer. People gave different answers because they unconsciously felt that they had to be polite to the computer they were evaluating! When we questioned them after the experiment, every one of the participants insisted that she or he would never bother being polite to a computer.

What about BMW’s problem with its “female” navigation system? Could stereotypes be so powerful that people would apply them to technology even though notions of “male” and “female” are clearly irrelevant? I performed an experiment where we invited forty people to come to my laboratory to work with a computer to learn about two topics: love and relationships, a stereotypically female subject, and physics, a stereotypically male subject. Half of the participants heard a recorded female voice; the other half heard a recorded male voice.

After the participants had been tutored by the computer for about twenty minutes, we gave them a computer-based questionnaire (on a different computer, of course!) that asked how they felt about the tutoring with respect to the two topics.

Although every aspect of the interaction was identical except for the voice, participants who heard the female voice reported that the computer taught “love and relationships” more effectively, while participants with the male-voiced computer reported that it more effectively taught “technical subjects.” Male and female participants alike stereotyped the “gendered” computers. When we asked participants afterward whether the apparent gender of the voice made a difference, they uniformly said that it would be ludicrous to assign a gender to a computer. Furthermore, every participant denied harboring any gender stereotypes at all!

People’s tendencies with regard to scapegoating, politeness, and gender stereotypes are just a few of the social behaviors that appear in full force when people interact with technology. Hundreds of results from my laboratory, as summarized in two books (The Media Equation and Wired for Speech) and more than a hundred papers, show that people treat computers as if they were real people. These discoveries are not simply entries for “kids say the darndest things” or “stupid human tricks.” Although it might seem ludicrous, humans expect computers to act as though they were people and get annoyed when technology fails to respond in socially appropriate ways. In consulting with companies such as Microsoft, Sony, Toyota, Charles Schwab, Time Warner, Dell, Volkswagen, Nissan, Fidelity, and Philips, I have helped improve a range of interactive technologies, including computer software, Web sites, cars, and automated phone systems. Technologies have become more likable, persuasive, and compelling by ensuring that they behave the way people are supposed to behave. The language of human behaviors has entered the design vocabulary of software and hardware companies around the world.

Of course, this “Computers Are Social Actors” approach can only work if the engineers and designers know the appropriate rules. In many cases, this is not a problem: there are certain behaviors that virtually everyone knows are socially acceptable. On a banking Web site, for example, we all would agree that it is important that the site use polite and formal language, just as a bank teller would. For a humanoid robot, it doesn’t take an exp...

"About this title" may belong to another edition of this title.

Other Popular Editions of the Same Title

9781617230011: The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships

Featured Edition

ISBN 13: 9781617230011
Publisher: Current, 2010
Hardcover

  • 9781591843399: Man Who Lied to His Laptop: The New Research on Human Relationships

    Portfolio, 2010
    Softcover

Top Search Results from the AbeBooks Marketplace

Nass, Clifford; Yen, Corina
Published by Penguin Publishing Group (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: 1
Seller: Books Unplugged (Amherst, NY, U.S.A.)
Book Description: Condition: New. Buy with confidence! Book is in new, never-used condition. Seller Inventory # bk1617230049xvz189zvxnew
Price: US$ 20.53
Shipping: FREE within U.S.A.

Nass, Clifford; Yen, Corina
Published by Penguin Publishing Group (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: 1
Seller: Book Deals (Tucson, AZ, U.S.A.)
Book Description: Condition: New. New! This book is in the same immaculate condition as when it was published. Seller Inventory # 353-1617230049-new
Price: US$ 20.53
Shipping: FREE within U.S.A.

Nass, Clifford; Yen, Corina
Published by Penguin Publishing Group (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: 5
Seller: GreatBookPrices (Columbia, MD, U.S.A.)
Book Description: Condition: New. Seller Inventory # 16480397-n
Price: US$ 17.90
Shipping: US$ 2.64 within U.S.A.

Corina Yen; Clifford Nass
Published by Penguin Books (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: 1
Seller: Books Puddle (New York, NY, U.S.A.)
Book Description: Condition: New. pp. 240. 1st Edition. Seller Inventory # 2642172894
Price: US$ 16.65
Shipping: US$ 3.99 within U.S.A.

Nass, Clifford
Published by Current, 6/26/2012 (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Paperback or Softback, Quantity: 5
Seller: BargainBookStores (Grand Rapids, MI, U.S.A.)
Book Description: Paperback or Softback. Condition: New. The Man Who Lied to His Laptop: What We Can Learn about Ourselves from Our Machines 0.5. Book. Seller Inventory # BBS-9781617230042
Price: US$ 21.78
Shipping: FREE within U.S.A.

Nass, Clifford
Published by Penguin Publishing Group (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Paperback, Quantity: 1
Seller: Big Bill's Books (Wimberley, TX, U.S.A.)
Book Description: Paperback. Condition: new. Brand New Copy. Seller Inventory # BBB_new1617230049
Price: US$ 19.82
Shipping: US$ 3.00 within U.S.A.

Yen, Corina; Nass, Clifford
Published by Penguin Books (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: 1
Seller: Majestic Books (Hounslow, United Kingdom)
Book Description: Condition: New. pp. 240. Seller Inventory # 49561089
Price: US$ 15.30
Shipping: US$ 8.20 from United Kingdom to U.S.A.

Nass, Clifford
Published by PLUME (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Softcover, Quantity: > 20
Seller: Russell Books (Victoria, BC, Canada)
Book Description: Softcover. Condition: New. Special order direct from the distributor. Seller Inventory # ING9781617230042
Price: US$ 16.00
Shipping: US$ 9.99 from Canada to U.S.A.

Clifford Nass
Published by Penguin Putnam Inc, New York (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Paperback, Quantity: 1
Seller: Grand Eagle Retail (Wilmington, DE, U.S.A.)
Book Description: Paperback. Condition: new. Shipping may be from multiple locations in the US or from the UK, depending on stock availability. Seller Inventory # 9781617230042
Price: US$ 26.64
Shipping: FREE within U.S.A.

Nass, Clifford
Published by Penguin Publishing Group (2012)
ISBN 10: 1617230049 ISBN 13: 9781617230042
New Paperback, Quantity: 1
Seller: GoldenWavesOfBooks (Fayetteville, TX, U.S.A.)
Book Description: Paperback. Condition: new. New. Fast Shipping and good customer service. Seller Inventory # Holz_New_1617230049
Price: US$ 22.67
Shipping: US$ 4.00 within U.S.A.

There are more copies of this book

View all search results for this book