Monday, 9 March 2009

The Obama Mass Dissonance Effect.

“Nothing human is foreign to me.” – Terence.

Now here’s a funny thing. Barry Hussein Soetoro Barack Obama has NOTHING to do with hope and change by any definition. Everything he is, has said, has done and believes in is as old as people and hey, [Left] Liberal Fascism. It’s as timeless as Logical Fallacy and Cognitive Dissonance. The same applies to his global mass of pop fan drones.

Except for his crony beneficiaries, all Obama lovers have two things in common: they know NOTHING about Barry Obama and they don't WANT to know anything.

The terrible truth is that nothing positive about Obama is true. Even the lies are unoriginal.

Here’s a great list of classic mental patterns, with this caveat: the chap who put this logic reference list together appears to deal from a narrow Pavlovian range of references, and thus suffers from Logical Fallacy and Cognitive Dissonance himself! He may just be, er, a Leftard?

"Oh, BushHitler! Here boy!"

No really. As far as I’ve seen, his memes range from bog-standard liberal conceits to Marxist-sodden and atheistic ones. I thought hey, using Occam's Razor, he’s gotta compare “mass deception”, “wishful thinking” or the “Barnum Effect” at least to the uber-scam and super gravy train fraud of global warming. [GW merely equals global socialism.] But sadly no, he doesn’t. Go figure.

It appears that many similar academics, like technicians working in infectious disease but without gloves and a mask, are almost by default infected by their own environment and subject. Thus content still trumps form.

Funny, innit? And they pay these kinds of fatally flawed fellows?

But still it’s a splendid list. THESE are some of the defining mental delusions of Obamessianism today. Terrifyingly, there’s plenty more where these come from. Pathetic, treasonous, incompetent, ridiculous, sad, disastrous, stupid, ugly, fake, phony, boring, empty, dangerous, indecent, useless, creepy, fascistic and totally lacking in self-reflection or any sense of history whatsoever and entirely NOTHING NEW.

And that's the positive side.

Via Wishful thinking.

"Wishful thinking is interpreting facts, reports, events, perceptions, etc., according to what one would like to be the case rather than according to the actual evidence. If it is done intentionally and without regard for the truth, it is called misinterpretation, falsification, dissembling, disingenuous, or perversion of the truth.

Hidden persuaders.

A term used by Geoffrey Dean and Ivan Kelly (2003) to describe affective, perceptual, and cognitive biases or illusions that lead to erroneous beliefs. Examples of hidden persuaders abound.


Self-deception is the process or fact of misleading ourselves to accept as true or valid what is false or invalid. Self-deception, in short, is a way we justify false beliefs to ourselves.

When philosophers and psychologists discuss self-deception, they usually focus on unconscious motivations and intentions. They also usually consider self-deception a bad thing, something to guard against. To explain how self-deception works, they focus on self-interest, prejudice, desire, insecurity, and other psychological factors that unconsciously affect the will to believe in a negative way.

A common example would be that of a parent who believes his child is telling the truth even though the objective evidence strongly supports the claim that the child is lying.

The parent, it is said, deceives him or herself into believing the child because the parent desires that the child tell the truth. A belief so motivated is usually considered more flawed than one due to lack of ability to evaluate evidence properly. The former is considered to be a kind of moral flaw, a kind of dishonesty, and irrational. The latter is considered to be a matter of fate: some people are just not gifted enough to make proper inferences from the data of perception and experience.

"There's no success like failure.." Bob Dylan Love Minus Zero.

Colonel Neville: Ah, Bob. The King of non-sequitur and alleged poetic meaning. I really like Highway 61 Revisited, Blonde On Blonde etc though. Go figure.

Cognitive dissonance.

"Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognition's. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior. This theory was first explored in detail by social psychologist Leon Festinger, who described it this way:

Dissonance and consonance are relations among cognitions, that is, among opinions, beliefs, knowledge of the environment, and knowledge of one's own actions and feelings. Two opinions, or beliefs, or items of knowledge are dissonant with each other if they do not fit together; that is, if they are inconsistent, or if, considering only the particular two items, one does not follow from the other (Festinger 1956: 25).

He argued that there are three ways to deal with cognitive dissonance. He did not consider these mutually exclusive.

1. One may try to change one or more of the beliefs, opinions, or behaviors involved in the dissonance;

2. One may try to acquire new information or beliefs that will increase the existing consonance and thus cause the total dissonance to be reduced; or,

3. One may try to forget or reduce the importance of those cognitions that are in a dissonant relationship (Festinger 1956: 25-26).

For example, people who smoke know smoking is a bad habit. Some rationalize their behavior by looking on the bright side: They tell themselves that smoking helps keep the weight down and that there is a greater threat to health from being overweight than from smoking. Others quit smoking.

Most of us are clever enough to come up with ad hoc hypotheses or rationalizations to save cherished notions. Why we can't apply this cleverness more competently is not explained by noting that we are led to rationalize because we are trying to reduce or eliminate cognitive dissonance.

Barnum effect.

The Barnum effect is the name given to a type of subjective validation in which a person finds personal meaning in statements that could apply to many people.

For example:

You have a need for other people to like and admire you, and yet you tend to be critical of yourself. While you have some personality weaknesses you are generally able to compensate for them. You have considerable unused capacity that you have not turned to your advantage. At times you have serious doubts whether you have made the right decision or done the right thing.

If these statements sound like they came from a newsstand astrology book, that may be because they did. Such statements are sometimes called Barnum statements and they are an effective element in the repertoire of anyone doing readings: astrologers, palm readers, psychics, rumpologists and so on.

If the statements appear on a personality inventory that one believes has been especially prepared for oneself alone, one often validates the accuracy of such statements and thereby gives validity to the instrument used to arrive at them.

If Barnum statements are validated when they have originated during a psychic reading, the validation is taken as also validating the psychic powers of the medium.

"Barnum effect" is an expression that seems to have originated with psychologist Paul Meehl, in deference to circus man P. T. Barnum's reputation as a master psychological manipulator who is said to have claimed "we have something for everybody."

Collective hallucination.

"Where belief in miracles exists, evidence will always be forthcoming to confirm its existence. In the case of moving statues and paintings, the belief produces the hallucination and the hallucination confirms the belief." --D.H. Rawcliffe

A collective hallucination is a sensory hallucination induced by the power of suggestion to a group of people. It generally occurs in heightened emotional situations, especially among the religiously devoted. The expectancy and hope of bearing witness to a miracle, combined with long hours of staring at an object or place, makes certain religious persons susceptible to seeing such things as weeping statues, moving icons and holy portraits, or the Virgin Mary in the clouds.

Those witnessing a "miracle" agree in their hallucinatory accounts because they have the same preconceptions and expectations. Furthermore, dissimilar accounts converge towards harmony as time passes and the accounts get retold. Those who see nothing extraordinary and admit it are dismissed as not having faith. Some, no doubt, see nothing but "rather than admit they failed...would imitate the lead given by those who did, and subsequently believe that they had in fact observed what they had originally only pretended to observe..." (Rawcliffe, 114).

Not all collective hallucinations are religious, of course. In 1897, Edmund Parish reported the case of shipmates who had shared a ghostly vision of their cook, who had died a few days earlier. The sailors not only saw the ghost, but distinctly saw him walking on the water with his familiar and recognizable limp. Their ghost turned out to be a "piece of wreck, rocked up and down by the waves" (Parish, 311; cited in Rawcliffe, 115).

Communal reinforcement.

Communal reinforcement is the process by which a claim becomes a strong belief through repeated assertion by members of a community. The process is independent of whether the claim has been properly researched or is supported by empirical data significant enough to warrant belief by reasonable people. Often, the mass media contribute to the process by uncritically supporting the claims.

More often, however, the mass media provide tacit support for untested and unsupported claims by saying nothing skeptical about even the most outlandish of claims.

Communal reinforcement explains how entire nations can pass on ineffable gibberish from generation to generation. It also explains how testimonials reinforced by other testimonials within the community of therapists, sociologists, psychologists, theologians, politicians, talk show hosts, etc., can supplant and be more powerful than scientific studies or accurate gathering of data by disinterested parties.

Communal reinforcement explains, in part, why about half of all American adults deny evolution occurred and believe that God created the universe in six days, made the first man and woman out of clay, and a snake talked the woman into disobeying an order from God thereby causing all our problems. It also explains how otherwise rational and intelligent people can be persuaded to accept such stories as true when they are provided by a comforting community in a time of great emotional need.

Every cult leader knows the value of communal reinforcement combined with isolating cult members from contrary ideas.

See also confirmation bias, selective thinking, testimonials, and wishful thinking.

Moses syndrome.

(1) A delusion characterized by uncritical belief in the promises of others to lead one to the Promised Land, e.g., to beauty, youth, wealth, power, peace of mind, or happiness. (2) A delusion characterized by the belief that one has been chosen by God, destiny, or history to lead others to the Promised Land, e.g., some goal such as "putting the sciences on a firm foundation" (Descartes) or belief in such things as "the eternal law of nature that gives Germany as the stronger power the right before history to subjugate these peoples of inferior race, to dominate them and to coerce them into performing useful labors" (Hitler).

The Moses syndrome should not be confused with the baby Moses syndrome (the hope-in-a-basket fallacy), a kind of defense mechanism whereby one deceives oneself into inaction by the wishful thought that somebody else will eventually come along to solve one's problems and save one from disaster.


There is currently a controversial debate concerning whether unusual experiences are symptoms of a mental disorder, if mental disorders are a consequence of such experiences, or if people with mental disorders are especially susceptible to or even looking for these experiences. --Dr. Martina Belz-Merk.

Apophenia is the spontaneous perception of connections and meaningfulness of unrelated phenomena. The term was coined by K. Conrad in 1958 (Brugger).

Soon after his son committed suicide, Episcopalian Bishop James A. Pike (1913-1969) began seeing meaningful messages in such things as a stopped clock, the angle of an open safety pin, and the angle formed by two postcards lying on the floor. He thought they were conveying the time his son had shot himself (Christopher 1975: 139).

Peter Brugger of the Department of Neurology, University Hospital, Zurich, gives examples of apophenia from August Strindberg's Occult Diary, the playwright's own account of his psychotic break:

He saw "two insignia of witches, the goat's horn and the besom" in a rock and wondered "what demon it was who had put [them] ... just there and in my way on this particular morning." A building then looked like an oven and he thought of Dante's Inferno.
