excerpted from the book
Trust Us, We're Experts!
by Sheldon Rampton and John Stauber
Jeremy P. Tarcher / Putnam, 2001, paperback
I know of no safe depository of the ultimate power of the society
but the people themselves; and if we think them not enlightened
enough to exercise their control with a wholesome discretion,
the remedy is not to take it from them, but to inform their discretion.
-Thomas Jefferson
When psychologists have explored the relationship between individuals
and authority figures, they have found that it can be disturbingly
easy for false experts to manipulate the thinking and behavior
of others. One of the classic experiments in this regard was conducted
in the early 1960s by Stanley Milgram, who tried to see how far people would
go in following orders given by a seemingly authoritative scientist.
The subjects of Milgram's research were taken into a modern laboratory
and told that they would be helping conduct an experiment that
involved administering electric shocks to see how punishment affected
the learning process. The subjects were seated at a machine called
a "shock generator," marked with a series of switches
ranging from "slight shock" to "severe shock."
Another person was designated as a "learner" and was
hooked up to receive a jolt each time he gave the wrong answer
on a test. A third individual, the "scientist," stood
over the experiment giving instructions and supervision. Unbeknownst
to the real subjects of the experiment, both the "learner"
and the "scientist" were actors, and no actual electricity
was used. As each fake shock was administered, the "learner"
would cry out in pain. If the subject administering the shocks
hesitated, the "scientist" would say something like,
"Although the shocks may be painful, there is no permanent
tissue damage, so please go on," or "It is absolutely
essential that you continue." The result was that many subjects
continued to administer shocks, even when the "learner"
claimed heart trouble, cried out, or pleaded to be set free. "With
numbing regularity," Milgram observed, "good people
were seen to knuckle under the demands of authority and perform
actions that were callous and severe. Men who are in everyday
life responsible and decent were seduced by the trappings of authority,
by the control of their perceptions, and by the uncritical acceptance
of the experimenter's definition of the situation, into performing harsh acts."
In another famous experiment, known as the "Doctor Fox
Lecture," a distinguished-looking actor was hired to give
a meaningless lecture, titled "Mathematical Game Theory as
Applied to Physical Education." The talk, deliberately filled
with "double talk, neologisms, non sequiturs, and contradictory
statements," was delivered before three audiences composed
of psychiatrists, social workers, psychologists, educators, and
educational administrators, many of whom held advanced degrees.
After each session, audiences received a questionnaire asking
them to evaluate the speaker. None of the audience members recognized
the lecture as a hoax, and most reported that they were
favorably impressed with the speaker's expertise.
Between World Wars I and II, the rise of the public relations
industry in the United States and the growing use of propaganda
by fascist and communist governments prompted a group of social
scientists and journalists to found a remarkable organization
called the Institute for Propaganda Analysis. The IPA published
a periodic newsletter that examined and exposed manipulative practices
by advertisers, businesses, governments, and other organizations.
Fearlessly eclectic, it hewed to no party lines and focused its
energies on studying the ways that propaganda could be used to
manipulate emotions. It is best known for identifying several
basic types of rhetorical tricks used by propagandists:
1. Name-calling. This technique, in its crudest form, involves
the use of insult words. Newt Gingrich, the former Speaker of
the U.S. House of Representatives, is reported to have used this
technique very deliberately, circulating a list of negative words
and phrases that Republicans were instructed to use when speaking
about their political opponents-words such as "betray,"
"corruption," "decay," "failure,"
"hypocrisy," "radical," "permissive,"
and "waste." The term "junk science" is an
obvious use of this same strategy. When name-calling is used,
the IPA recommended that people should ask themselves the following
questions: What does the name mean? Does the idea in question
have a legitimate connection with the real meaning of the name?
Is an idea that serves my best interests being dismissed through
giving it a name I don't like?
2. Glittering generalities. This technique is a reverse form
of name-calling. Instead of insults, it uses words that generate
strong positive emotions-words like "democracy," "patriotism,"
"motherhood," "science," "progress,"
"prosperity." Politicians love to speak in these terms.
Newt Gingrich advised Republicans to use words such as "caring,"
"children," "choice," "commitment,"
"common sense," "dream," "duty,"
"empowerment," "freedom," and "hard work"
when talking about themselves and their own programs. Democrats,
of course, use the same strategy. Think, for example, of President
Clinton's talk of "the future," "growing the economy,"
or his campaign slogan: "I still believe in a place called Hope."
3. Euphemisms are another type of word game. Rather than attempt
to associate positive or negative connotations, euphemisms merely
try to obscure the meaning of what is being talked about by replacing
plain English with deliberately vague jargon. Rutgers University
professor William Lutz has written several books about this strategy,
most recently Doublespeak Defined. Examples include the use of
the term "strategic misrepresentations" as a euphemism
for "lies," or the term "employee transition"
as a substitute for "getting fired." Euphemisms have
also transformed ordinary sewage sludge into "regulated organic
nutrients" that don't stink but merely "exceed the odor threshold."
4. Transfer is described by the IPA as "a device by which
the propagandist carries over the authority, sanction, and prestige
of something we respect and revere to something he would have
us accept. For example, most of us respect and revere our church
and our nation. If the propagandist succeeds in getting church
or nation to approve a campaign in behalf of some program, he
thereby transfers its authority, sanction, and prestige to that
program. Thus, we may accept something which otherwise we might
reject." In 1998, the American Council on Science and Health
convened what it called a "blue-ribbon committee" of
scientists to issue a report on health risks associated with phthalates,
a class of chemical additives used in soft vinyl children's toys.
People familiar with ACSH's record on other issues were not at
all surprised when the blue-ribbon committee concluded that phthalates
were safe. The committee's real purpose, after all, was to transfer
the prestige of science onto the chemicals that ACSH was defending.
5. Testimonial is a specific type of transfer device in which
admired individuals give their endorsement to an idea, product,
or cause. Cereal companies put the pictures of famous athletes
on their cereal boxes, politicians seek out the support of popular
actors, and activist groups invite celebrities to speak at their
rallies. Sometimes testimonials are transparently obvious. Whenever
they are used, however, the IPA recommends asking questions such
as the following: Why should we regard this person (or organization
or publication) as a source of trustworthy information on the
subject in question? What does the idea amount to on its own merits,
without the benefit of the testimonial?
6. Plain folks. This device attempts to prove that the speaker
is "of the people." Even a geeky multibillionaire like
Bill Gates tries to convey the impression that he's just a regular
guy who enjoys fast food and popular movies. Politicians also
use the "plain folks" device to excess: George Bush
insisting he eats pork rinds; Hillary Clinton slipping into a
southern accent. Virtually every member of the U.S. Senate is
a millionaire, but you wouldn't know it from the way they present themselves.
7. Bandwagon. This device attempts to persuade you that everyone
else supports an idea, so you should support it too. Sometimes
opinion polls are contrived for this very purpose, such as the
so-called "Pepsi Challenge," which claimed that most
people preferred the taste of Pepsi over Coca-Cola. "The
propagandist hires a hall, rents radio stations, fills a great
stadium, marches a million or at least a lot of men in a parade,"
the IPA observed. "He employs symbols, colors, music, movement,
all the dramatic arts. He gets us to write letters, to send telegrams,
to contribute to his cause. He appeals to the desire, common to
most of us, to follow the crowd."
8. Fear. This device attempts to reach you at the level of
one of your most primitive and compelling emotions. Politicians
use it when they talk about crime and claim to be advocates for
law and order. Environmentalists use it when they talk about pollution-related
cancer, and their opponents use fear when they claim that effective
environmental regulations will destroy the economy and eliminate
jobs. Fear can lead people to do things they would never otherwise
consider. Few people believe that war is a good thing, for example,
but most people can be convinced to support a specific war if
they believe that they are fighting an enemy who is cruel, inhuman,
and bent on destroying all that they hold dear.
The IPA disbanded at the beginning of World War II, and its
analysis does not include some of the propaganda devices that
came to light in later years, such as the "big lie,"
based on Nazi propaganda minister Joseph Goebbels's observation
that "the bigger the lie, the more people will believe it."
Another device, which the IPA did not mention but which is increasingly
common today, is the tactic of "information glut"-jamming
the public with so many statistics and other information that
people simply give up in despair at the idea of trying to sort
it all out.
The Precautionary Principle
... By training and enculturation, most experts in the employ
of government and industry are technophiles, skilled and enthusiastic
about the deployment of technologies that possess increasingly
awesome power. Like the Sorcerer's Apprentice, they are enchanted
with the possibilities of this power, but often lack the wisdom
necessary to perceive its dangers. It was a government expert,
Atomic Energy Commission chairman Lewis L. Strauss, who promised
the National Association of Science Writers in 1954 that atomic
energy would bring "electrical energy too cheap to meter"
within the space of a single generation. Turn to the back issues
of Popular Science magazine, and you will find other prophecies
so bold, so optimistic, and so wrong that you would be better
off turning for insight to the Psychic Friends Network. If these
prophecies had been correct, we should by now be jet-packing to
work, living in bubble-domed cities beneath the ocean, colonizing
the moon and Mars. The cure for cancer, like prosperity, is always
said to be just around the corner, yet somehow we never actually
turn that corner. Predictions regarding computers are notorious
for their rhetorical excess. "In from three to five years,
we will have a machine with the general intelligence of an average
human being," MIT computer scientist Marvin Minsky predicted
in 1970. "I mean a machine that will be able to read Shakespeare,
grease a car, play office politics, tell a joke, have a fight.
At that point, the machine will begin to educate itself with fantastic
speed. In a few months, it will be at a genius level, and a few
months after that, its power will be incalculable." Expert
predictions of this sort have been appearing regularly ever since,
although the day when computers will be able to grease your car
(let alone read Shakespeare) keeps getting pushed back.
The views of these techno-optimists deserve to be part of
the decision-making process, but they should not be allowed to
crowd out the views and concerns of the skeptics-the people who
are likely to experience the harmful effects of new technologies
and who deserve to play a role in deciding when and how they should
be introduced. Just as war is too important to leave to the generals,
science and technology are too important to leave in the hands
of the experts.
Opponents of the precautionary principle have caricatured
it as a rule that "demands precautionary action even in the
absence of evidence that a health or environmental hazard exists"
and says "if we don't know something we mustn't wait for
studies to give answers." This is not at all its intent.
It is a guide for policy decisions in cases where knowledge is
incomplete regarding risks that are serious or irreversible and
that are unproven but plausible in the light of existing scientific
knowledge. No one is suggesting that the precautionary principle
should be invoked regarding purely fanciful risks. There are legitimate
debates over whether a risk is plausible enough to warrant the
precautionary principle. There are also reasonable debates over
how to implement the precautionary principle. However, groups
that seek to discredit the principle itself as "unscientific"
are engaged in propaganda, not science.
Follow the Money
When you hire a contractor or an attorney, they work for you
because you are the one who pays for their services. The PR experts
who work behind the scenes and the visible experts who appear
on the public stage to "educate" you about various issues
are not working for you. They answer to a client whose interests
and values may even run contrary to your own. Experts don't appear
out of nowhere. They work for someone, and if they are trying
to influence the outcome of issues that affect you, then you deserve
to know who is paying their bills.
Not everyone agrees with this position. Jeff Stier, the associate
director of the American Council on Science and Health (ACSH),
goes so far as to claim that "today's conventional
wisdom in favor of disclosing corporate funding of research is
a 'new McCarthyism.' " Standards of public disclosure, he
says, should mirror the standards followed in a court of law,
where "evidence is admissible only if the probative value
of that evidence exceeds its prejudicial effect." To disclose
funding, he says, can have a "prejudicial effect" if
it "unfairly taints studies that are scientifically solid."
Rather than judging a study by its funding source, he says, you
should simply ask whether its "hypothesis, methodology and
conclusion" measure up to "rigorous scientific standards."
When we asked him for a list of ACSH's corporate and foundation
donors, he used these arguments to justify his refusal. With all
due respect, we think Stier's argument is an excuse to avoid scrutiny.
Even in a court of law, expert witnesses are required to disclose
what they are being paid for their testimony.
Some people, including the editors of leading scientific journals,
raise more subtle questions about funding disclosure. The problem,
they say, is knowing where to draw the line. If someone received
a small grant 20 years ago from a pharmaceutical company to study
a specific drug, should they have to disclose that fact whenever
they comment about an entirely different drug manufactured by
the same company? And what about nonfinancial factors that create
bias? Nonprofit organizations also gain something by publishing
their concerns. They may have an ideological ax to grind, and
publicity may even bring indirect financial benefits by helping
attract new members and contributions. Elizabeth Whelan of ACSH
made these points during a letter exchange with Ned Groth of the
Consumers Union. "You seem to believe that while commercial
agendas are suspect, ideological agendas are not," Whelan
complained. "This is a purely specious distinction.... A
foundation's pursuit of an ideological agenda-perhaps one characterized
by a desire for social change, redistribution of income, expanded
regulatory control over the private sector, and general promotion
of a coercive utopia-must be viewed with at least as much skepticism
and suspicion as a corporation's pursuit of legitimate commercial interests."
In understanding the hold that experts have on our lives,
we should consider the role that we ourselves play as consumers
of information. Most propaganda is designed to influence people
who are not very active or informed about the topic at hand. There
is a reason for this strategy. Propagandists know that active,
informed people are likely to already hold strong opinions
that cannot be easily swayed. The people who are most easily manipulated
are those who have not studied a subject much and are therefore
susceptible to any argument that sounds plausible.
Of course, there is no way that anyone can be active and informed
about every issue under the sun. The world is too complex for
that, and our lives are too busy. However, each of us can choose
those issues that move us most deeply and devote some time to
them. Activism enriches our lives in multiple ways. It brings
us into personal contact with other people who are informed, passionate,
and altruistic in their commitment to help make the world a better
place. These are good friends to have, and often they are better
sources of information than the experts whose names appear in
the newspapers or on television. Activism, in our opinion, is
not just a civic duty. It is a path to enlightenment.