
Reasoning of a Madman

Dr Romesh Senewiratne © 2015


romeshsenewiratne@gmail.com

The Australian-English word ‘yakka’ (as in hard yakka) means work. The
term comes from the Yuggera (or Jagera) people – the Aboriginal tribe
that lived where I do now, prior to the arrival of the White Man. The
Yuggera tribe was one of several that spoke the Yuggera languages (the
Yuggera Language Group) which is one of many language groups that
belong to the Pama-Nyungan language family. The Yuggera language is
one of more than twenty languages spoken by the Murri people of
Queensland (murri means kangaroo in Yuggera). The Pama-Nyungan
language family is one of 27 language families, and the one that
covers the large majority of the Australian landmass. It is shown in
yellow in the map below, based on modern linguistic scholarship and
published on Wikipedia:

I was brought to Yuggera land when I was fifteen, against my will, but
without any voiced objection. I liked where I was, in the hill town of
Kandy in Sri Lanka, and did not see the attraction of leaving my home
and moving to a distant land, about which I knew very little, other
than that it had lots of deserts and strange marsupial animals. I knew
that Australia had been settled by the British, which used it as a
prison colony, and that this land was forcibly acquired – stolen –
from the Aboriginal people. I didn’t know much more about Aboriginal
people and what they suffered at the hands of the colonists, but I did
have something of the Third World perspective in my view of
colonialism – I saw colonialism as an evil imposed on dark-skinned
people by people who considered themselves “white”. This was certainly
the case in Australia, where the discrimination based on race –
specifying the Aboriginal race – was enshrined in the original 1901
constitution that federated what had previously been separate British
colonies into the nation of Australia. In determining the population
of Australia by census, according to the constitution, people of the
“Aboriginal race” were not to be counted (denying their existence as
well as their vote).

I was told that we were coming to Australia “for my education” and that of my older sister. I later discovered that this was not really
the reason my father (who made the decisions) wanted to leave Sri
Lanka and come to Australia. In his own later writings he has said
that the reason he resigned from the university in Sri Lanka and
accepted a position at the University of Queensland was because he was
refused the position of professor of medicine in Sri Lanka. He was
apparently told that they didn’t want a “black Englishman”.

My father is not, in fact, a black Englishman, though he trained in medicine in England, and speaks English as his mother tongue (which he
learned from his parents and at school in Sri Lanka). I, on the other
hand, am a genuine Black Englishman, having been born in London, when
my father was a medical student. I do not advertise the fact that I am
a Black Englishman due to its negative connotations. My father’s
account illustrates how Black Englishmen were despised in Sri Lanka in
the 1970s.
In Australia, more than in Sri Lanka, I was forced to identify myself
as either black or white. The fact that I regarded myself as neither
did not stop me from being identified as being either “black” or
“Indian” by others. If I was “black” I could still be Australian, but
if I was Indian it made me an outsider – a foreigner. I gathered this
when I was asked by friendly Aboriginal people, where “my mob” came
from. I could choose between being a Sri-Lankan Australian or a Black
Australian – I have never regarded myself as a “British Australian” or
an English Australian, though this is, according to the circumstances of my birth, what I am. I am British by birth, and there is no escaping
it.

Speaking as a Reverberating Electrical Circuit
My working hypothesis is that my mind is simply the reverberating
electrical circuit in my brain. This is a reductionist, materialist
hypothesis and I am not given to reductionism or materialism. That’s
why it’s only a working hypothesis – one that I am trying to disprove.

I am conscious of these reverberating electrical circuits only because I have read about them, and I trust the source of the information, as
far as this particular fact is concerned. I accept, without trying to
count them myself (not that I could), that we have about 100 billion
neurons in our brains and that these communicate with each other
through trillions of nerve fibres (outgoing axons and incoming
dendrites) through a combination of electrical and chemical
signalling. Most of the signalling at the synapses is chemical
(neurotransmitters) but the propagation of the electrical impulse
along the axon (as an action potential) is due to waves of
depolarisation caused by movement of positively and negatively charged
ions across the cell membrane of the axon.
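
For readers who like to see such things made concrete, here is a minimal sketch in Python of a “leaky integrate-and-fire” neuron – a textbook simplification of the depolarisation and firing just described, not anything specific to my argument. All the parameter values are merely illustrative, not physiological measurements:

# Leaky integrate-and-fire neuron: a textbook simplification of the
# membrane depolarisation described above. All parameters are illustrative.
dt = 0.1          # time step (ms)
tau = 10.0        # membrane time constant (ms)
v_rest = -70.0    # resting potential (mV)
v_thresh = -55.0  # firing threshold (mV)
v_reset = -75.0   # potential after a spike (mV)
i_drive = 20.0    # steady depolarising input, expressed in mV

v = v_rest
spike_times = []

for step in range(1000):  # simulate 100 ms
    # The leak pulls the voltage back toward rest; the input depolarises it.
    v += (-(v - v_rest) + i_drive) / tau * dt
    if v >= v_thresh:                # threshold crossed: an action potential
        spike_times.append(step * dt)
        v = v_reset                  # the membrane resets after the spike

print(f"{len(spike_times)} action potentials in 100 ms")

The toy model shows only the bare sequence – depolarisation toward a threshold, a spike, and a reset – that real axons achieve through those movements of charged ions.
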
The electrical circuit in my brain became active, I presume, before I
was born. It will remain active until I die, when the activity will
cease, but not abruptly. Some cells will continue firing longer than
others, but eventually all the electrical activity will cease. I will
be truly brain dead. At this stage will my mind also cease to exist?
What about my immortal soul? What is the soul, and do I have one? What
about spirit and consciousness? Can consciousness exist without a
brain to produce it? These are some of the questions I must ponder in
order to disprove my hypothesis that my mind is simply the
reverberating electrical circuit in my brain.
Reverberate:

verb
(of a loud noise) be repeated several times as an echo.
"her deep booming laugh reverberated around the room"
have continuing and serious effects.
"the statements by the professor reverberated through the Capitol"

The term reverberating circuit is presently being used to describe mere feedback loops between neurones, but I mean reverberate as in the
second definition above – to have continuing and serious effects. The
electrical activity that begins in the developing network of neurones
in utero continues, feeding back on itself, with new input (from the
senses) and powered by energy derived from the aerobic metabolism of
glucose, which reaches the neuronal network through the blood stream.
The heart, like the brain, develops early in the embryo and starts
beating at 3 to 4 weeks gestation. The relationship between the heart
and the brain will be explored later, but first let me explain how my
use of “reverberating electrical circuit” differs from a neuronal
feedback loop. This is what is meant by a “reverberating circuit”, as illustrated in a Google image from the University of Arizona.

Reverberating circuits of this type are only one of several types of neuronal circuit. My idea of a reverberating electrical circuit is
more that the entire network of neurons, extending to the peripheral
nervous system, functions as a single integrated circuit that
reverberates, feeding back upon itself and modifying its own structure
and function through neuronal plasticity. The electrical signals are
caused by millions of simultaneous action potentials that are
synchronized at a very deep level, acting in concert (through a
delicate and complex combination of inhibitory and stimulatory
signals) to orchestrate the movements that we call actions – voluntary
and involuntary, conscious and unconscious (and subconscious). The
hypothesis I am trying to refute is that our thought and consciousness
themselves are the subjective experience of the electrical activity
within this whole integrated, reverberating circuit. The circuit
consists of numerous sub-circuits, including some that are localised
in specific brain structures and areas, and others that are widely
distributed, involving neuronal assemblies and complex connections
between various layers of the cortex and subcortical structures.
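
The idea of activity that feeds back on itself and persists after the original input has gone can be illustrated with a toy recurrent network – a standard construction in computational neuroscience, not a model of the brain or of my hypothesis. The network size, weights and gain below are arbitrary choices:

import numpy as np

rng = np.random.default_rng(0)
n = 50  # toy network of 50 units

# Random recurrent weights mixing excitatory (positive) and inhibitory
# (negative) connections; the 1/sqrt(n) scaling and gain of 1.5 keep
# activity from either dying out or exploding.
w = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n)) * 1.5

rate = np.zeros(n)  # firing rates of the units
activity = []

for t in range(200):
    external = rng.normal(0.0, 0.5, n) if t < 20 else 0.0  # brief "sensory" input
    rate = np.tanh(w @ rate + external)  # recurrent feedback plus input
    activity.append(np.abs(rate).mean())

# Activity persists ("reverberates") long after the input stops at t = 20.
print(f"mean activity at t=25: {activity[25]:.3f}, at t=199: {activity[199]:.3f}")

A brief burst of input sets the units firing, and the recurrent excitatory and inhibitory connections keep the activity echoing through the network after the input is withdrawn – a crude analogue of feedback sustaining activity in a much larger circuit.
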
When I speak of “my mind” somehow all these parts of the brain act in
concert to give me a unitary concept of self. But I am not just my
mind. I also have a body and maybe, just maybe, I also have a soul.
I’m not too keen on the idea of immortality, though; forever seems to
me like a formidably long time. Since I outgrew Christianity in my
teens I haven’t believed in heaven or hell, or life after death. I
wasn’t brought up to believe in reincarnation, but have sometimes
entertained the idea, usually when, in retrospect, I was crazy. I
hesitate to reject the idea of a soul, though – who wants to be
soulless?

The network of neurones and the passage of electricity through it have been elucidated in great detail over the past 500 years – the longer history of the modern neurosciences.
Andreas Vesalius (1514-1564) – neuroanatomy, corrected Galen, but
retained humourism
Rene Descartes (1596-1650) – mind-body dualism, pineal as ‘seat of
soul’, modified humourism
Michael Faraday (1791-1867) and Luigi Galvani (1737-1798) –
electricity and contraction of muscles under electrical stimulation of
nerve in frog’s leg
Paul Broca (1824-1880) – neuroanatomy (identification and naming of ‘limbic lobe’ and speech generation area – Broca’s area in the frontal lobe); craniometry; correlated brain size with intelligence (cited the weight of Cuvier’s brain, recorded when the famous biologist died in 1832).

Charles Darwin (1809-1882) (vs the French Lamarck and Cuvier); theories on evolution by natural selection and survival of the fittest (a term coined by Herbert Spencer, which Darwin himself used in the 1869 5th edition of The Origin of Species); theorised on the similarity between the emotions of humans and other mammals, such as dogs, and also referred to various social instincts in humans (such as sympathy).

Francis Galton – eugenics, IQ testing, statistical bell curves, scientific racism (inspired by Darwin). Galton coined the term ‘eugenics’ in 1883; the eugenics society he later founded, with Leonard Darwin (son of Charles) at its head, gained support from Winston Churchill. Eugenics was exported from Cambridge to the USA, Australia and elsewhere (including Germany, where it was merged with German nationalism and Aryan race supremacy theories).
Malpighi, Golgi, Purkinje, Ramón y Cajal – histology and microscopy
(illustrations of neurones and their cell processes – axons and
dendrites)

1890 – naming of neurons; 1898 – synapse named by Sherrington (1857-1952) as the point of contact between neurons and between neurons and muscle cells
Experiments on cats (Sherrington) and dogs (Walter Cannon) on function
of autonomic nervous system, arousal, sleep and brain physiology
(1890-1920). Cannon (at Harvard) coined “fight or flight” and “rest
and digest” as respective functions of the sympathetic and
parasympathetic branches of the autonomic nervous system. Sherrington
sectioned cats’ brains at various levels, observing the behaviour of
decorticate cats, and producing cats that remained constantly awake or
somnolent depending on the level at which their brain-stem was cut.
This led to identification of the Reticular Activating System (RAS)
which is targeted to cause loss of consciousness in anaesthesia.
Embryology and comparative anatomy (comparison of human brain with
other animals)
Reticularists versus neuronists (proponents of Neuron Doctrine)
Neurology (Latin terminology) – diseases and lesions in brain and
nervous system (discovery of function of various parts of brain on
autopsy following clinical examination)
Psychiatry and Psychology as developments of philosophy (Greek
terminology) – “moral treatments”
1890 - 1914: Classificatory psychiatry and chemical/electrical
treatments – ‘mental illnesses’ (Emil Kraepelin, Eugen Bleuler –
Germany, Switzerland)
Mesmerism and hypnosis – Mesmer (‘animal magnetism’), Charcot
Psychoanalysis – Freud, Jung, Adler (inspired by Darwin)
Experimental psychology and development of behaviourism – Wilhelm
Wundt, Ivan Pavlov
Anglo-American behaviourism – BF Skinner, John B. Watson
The Brain in the Internet Age

The Internet Age was enabled by the Digital Revolution of the 1960s, and has also been described as the Information Age and the Third Industrial Revolution (after the Agricultural and Industrial Revolutions). The network was established in the 1990s, with exponential growth since the early 1990s.
Change in way brain has been understood, due to advances in
neuroimaging and microscopy; and so-called “cognitive revolution” in
American academic psychology (departure from behaviourism) resulting
in CBT (reinventing talk therapies). Development of biopsychosocial
model from the purely ‘biological’ model, and the falling of Freud’s theories into disrepute (along with hypnosis, persuasion and debate/discussion)
as treatment strategies (talk therapies). Attempts to bridge wide gulf
between study of brain and study of mind; consequent separation of
psychology from philosophy; greater leaning to mathematical and
statistical models (population statistics/ individual rating scales –
self reported, subjective or ‘rated by observer’ (also subjective);
computer modelling and neural networks and neural assemblies;
plasticity; genetic studies; drug and chemical obsession (dopamine
theory of schizophrenia; serotonin theory of depression); mirror
neurones; focus on function of cortical areas not on surface of brain
(insula, precuneus, medial frontal lobe); breeding of mutant rodents
(mice and rats) as ‘models’ for schizophrenia, depression and
dementia.
Music
Emotions
New Age psychology and reintroduction of open talk of soul and spirit
– spiritual no longer a dirty word to doctors. Spirituality without
religion. Theosophy, the pineal and the Third Eye of Shiva (also in
Buddhist iconography and custom of bindu/pottu); pineal and
hallucinogenic experiences – DMT cult; pineal and marketing of
melatonin; serotonin, SSRIs and the pineal; New Age syntheses of Third
Eye opening, pineal gland and chakra system;
Vedic concepts introduced in 19th century by German philosophers,
based on idea that Sanskrit (from India) was the mother language of
all the Indo-European languages; explosion of interest with the
Beatles’ infatuation with the Maharishi Mahesh Yogi, founder of TM, in
the 1960s. Meditation and yoga; Brahmanism in physics; Deepak Chopra
started with TM and Ayurveda before coining the term ‘quantum healing’
for his particular synthesis of ‘east’ and ‘west’. Unnecessary
mysticism.

Blavatsky’s delusion about root races, Atlantis and the pineal was incorporated into New Age ideas, especially in the UK, in the 1970s, with a resurgence in the 1990s. Graham Hancock’s theories about civilizations submerged by a great flood after the last ice age, 10,000 years ago, were taken up by Tamil nationalists, who claimed his documentary as evidence of the
continent of Kumari Kandam that was written about in the ancient Tamil
Sangam literature. Kumari Kandam was said to be a vast extension of
the Pandyan Kingdom of south India, which was submerged by a massive
flood thousands of years before the Sangam Age (around 300 BC). Modern
discoveries about the movement of the tectonic plates and the history
of sea level changes during the ice ages from the geological record
show, conclusively, that there was no continent of Kumari Kandam,
although the sea level was certainly lower during the last ice age
(which ended about 10,000 years ago). At this time the coastline of
India did extend into what is now the Indian ocean but only by tens,
and not hundreds of kilometres (as the Kumari Kandam legend holds).

Beliefs and the Brain


No belief without memory
Brain substrate of memory
Neural mechanisms for logic
Beliefs lead to actions – belief and behaviour
The role of emotions in belief
Reasoning and the frontal lobes
Holding contradictory beliefs
Conviction, Certainty and Delusions
Being convinced of something
Being persuaded by someone or something
Degree of conviction
Theory, hypothesis and possibility
False beliefs and false, fixed beliefs (delusions)
Reasoning from first principles
Scientific reasoning
Inductive and deductive reasoning
Belief on the basis of authority (scriptural, expert, parental)
The God Delusion and Dawkins Delusion
Reasonable and unreasonable beliefs
Who decides what is reasonable?
Diagnosis of unreasonable beliefs
Diagnosis of unreasonable reasoning methods and styles
Argument, Persuasion and Suggestion
Changing beliefs by debate
The art of persuasion (religious, philosophical, psychiatric)
Suggestion and hypnosis
Reaction to the arguer
Reaction to the debater
Reaction to the persuader
Persuasion by oratory
Persuasion by logic
Persuasion by emotional appeal
Holism and Reductionism

Fritjof Capra
James Lovelock (Gaia theory)
Lynn Margulis (endosymbiosis)
Rupert Sheldrake (morphic fields)
Jan Smuts (holism – coining of term)
Satish Kumar (holism – New Age)
Vandana Shiva (ecology – dubious claim about zinc)
Deepak Chopra (quantum healing, ayurveda)
Stan Grof (parapsychology)
Dean Radin (parapsychology)
Robert Lanza (biocentrism)
Rudolph Tanzi (dementia – consciousness)
Spirituality without Religion – journey of the soul

Meditation and Deep Contemplation


Descartes and meditation
Contemplation and a contemplative life
Walking as a catalyst for contemplation
Meditation and mantras
Meditation and relaxation
Health benefits of relaxation
Contemplation leads to insights (which may be true or false)
Meditating on music and with music
Meditation and Positive Self-Affirmation
Repeating ‘om-shanthi’ by Satish Kumar
Levels of Consciousness
Waking consciousness
Degree of alertness
Degree of observation
Degree of awareness
Attention and concentration
Sleep – quiet and active (REM)
Role of the brainstem, midbrain and thalamus
Role of the limbic system

1937 – the Papez circuit, derived from injecting rabies virus into the brains of cats (injected into the hippocampus, which was considered central to emotions – now more associated with memory).
Role of the cortex
The role of neurotransmitters – dopamine, glutamate
Precuneus
Insula
Caudate nucleus
What about the pineal? The gland/organ has sympathetic innervation via
the superior cervical ganglion (SCG) and parasympathetic innervation
via the pterygopalatine and otic ganglia. What is the significance of
this? There is also innervation by the SCG of the choroid plexus
(which secretes CSF).

The Hard Problem of Consciousness

How does the brain produce the mind?
Does the brain produce the mind?
At what stage of development does the foetus develop a brain?
At what stage of development does a foetus develop a mind?
The mind as an emergent property of organized nervous systems
Is consciousness the movement of electricity in the brain?
Altered States of Consciousness
NDEs (near-death experiences)
Telepathy and its pathologisation
Spiritual experiences and the development of delusions
Ordinary and Non-ordinary states (Grof)
Aspects of the Conscious Mind
Attention – divided; directed; thalamus; RAS; frontal lobes; limbic
system
Concentration – focused attention
Perception
Memory
Emotions
‘Cognition’
Language – generation, comprehension (Broca and Wernicke)
Connectivism and its limitations
Neural Correlates of Consciousness

Thalamus
RAS
Midbrain amine-generating networks – dopamine, ACh, serotonin

The Mind-Brain Relationship


The mind influences the development of the brain (plasticity)
The mind affects activity in the brain
The brain influences development of the mind
The brain affects activity in the mind
The Mind-Body Relationship
The brain is part of the body
Brain-body connection also two-way flow of information and causality
Adding Soul to the Mind and Body

The moral dimension
Ethics and being good
Aristotle’s concept of soul
Crick’s explanation of soul
Is soul different to mind?
Is soul different to spirit?
Is spirit the same thing as consciousness? (Chopra)
Does the soul survive death?
Reincarnation, Hinduism and past lives
Heaven, hell and purgatory

Evolution of the Soul


Collecting truths
Increasing awareness – concept of enlightenment bringing wisdom
Developing virtues (metta, karuna, muditha, upeksha; shanthi)
Ridding oneself of vices (Catholic seven deadly sins; Buddhist greed,
hatred, delusion)
Dharma – is it truth or law?
Karma and its attendant evils – the truth of causality and the evil of
blaming misfortunes on sins in past lives; tendency to lead to
misattribution of cause (confirmation bias)
The framing of a holistic philosopher

From my distant vantage point in the antipodes I have observed the Indian philosopher Satish Kumar being professionally framed. He was
framed in 2007, for the British public, as an Enemy of Reason. The
framer was the famous Darwinist Richard Dawkins, who used
incriminating edits of an interview with the elderly philosopher and
activist for his two-part exposé of “enemies of reason” for British
television. The Oxford English dictionary says that ‘framing’ in the
way that I have used it is a slang term, but Australians use lots of
terms that were once regarded as ‘slang’. Even our politicians speak
in colloquial slang. It is the language of the people, of the public.
TV has been used to frame people as ‘good guys’ and ‘bad guys’ since
its invention. It has always been a tool of propagandists, but it has
also been a source of enlightenment. I regard the wonderful BBC nature
documentaries by David Attenborough as enlightening, and Attenborough
as a modern sage. He has my utmost respect and I respect him as an
enlightened being. I think Attenborough knows much more about nature
than the Buddha, Mohammed, Moses and Jesus Christ combined. From my
antipodean perspective he is more enlightened than these revered
religious figures, when it comes to the light of knowledge.
I also think that David Attenborough’s wisdom goes far beyond his
knowledge of nature – which is why I regard him as a great sage in
addition to a great scientist. He is a nice guy and he shows it in
everything he does. He has a highly developed moral sense. He knows
what is right and what is wrong – what is good and what is evil. He
doesn’t talk about these things, but he shows it clearly in the
narratives of his numerous, awe-inspiring documentaries about the
natural world.
David Attenborough was always presented as a good guy by British TV,
and by all accounts he was one. I have never heard anyone say a bad
word about Attenborough. Not so, Richard Dawkins, who is a
controversial figure even in Britain. Oxford University’s Professor for the Public Understanding of Science has, because of his
strident atheism, become a polarising influence, pushing some to
defence of their religious beliefs, and others to attack religion,
superstition, pseudoscience and ‘believers’ with added vigour. I
started with considerable support for Dawkins’ views but as the years
have gone by I have wondered about his methods, and his own system of
beliefs. He has called himself both a Darwinist and a ‘spiritual
Atheist’, though he also says that he is ‘technically agnostic’ but
places himself at 6 or 6.5 on a hypothetical atheism scale from 1 to
7. What does he mean by Darwinism, and how does this differ from the
belief in evolution by natural selection that my zoologist mother and
I hold, though we have never called it “Darwinism” (which sounds
uncomfortably like a religion)? What does Dawkins mean when he uses
the term ‘spiritual’, in calling himself a “spiritual atheist”? How
different is this to Satish Kumar and Deepak Chopra’s ideas of
spirituality?
The BBC has changed a lot, but it still has, like every other national
broadcaster, a national bias, or at least a national perspective.
David Attenborough’s documentaries reveal the most beautiful and
magnificent aspects of that perspective. Support for various wars, glorification of the British Royal Family, the building of national myths, and the exaggeration of national heroes’ virtues, while ignoring or minimising their faults and failings, are less positive aspects of the BBC.
The documentaries The Root of All Evil and The Enemies of Reason were
made by British Channel 4 (C4) in collaboration with Dawkins, rather
than the more conservative BBC. C4 began broadcasting in 1982, as the
fourth national broadcaster (after BBC1, BBC2 and ITV). The objective of any TV station is to attract viewers, and C4 had an obvious
disadvantage against the older established stations. Controversial
titles are an effective way of attracting viewers. ‘A root of some evil’ would be a more accurate title for The Root of All Evil, but who’s going to watch such a boring exposé? Likewise in The Enemies of
Reason, C4 insisted on using ‘the’ in the title, making Dawkins seem
more dogmatic than he actually is.
Dawkins speaks in plain English and his concern is with the truth.
When he made the documentary framing Satish Kumar and Deepak Chopra as
enemies of reason in 2007 he was the Professor for the Public
Understanding of Science, a position that made him the public face of
the British scientific establishment.
It was the British public, rather than his academic scientific colleagues, that was the target of Dawkins’ polemic documentary exposing the purveyors of superstition and ‘complementary and alternative medicine’ as The Enemies of Reason. The first part was subtitled ‘Slaves to
Superstition’, and exposed Satish Kumar as a “New Age guru” (as well
as English astrology and spiritualism), and the second, titled ‘The Irrational Health Service’, targeted homeopathy and the “quantum
healing” of the Indian-American physician Deepak Chopra. Chopra has
been described as a “diamond-encrusted guru” by another British
professor, Brian Cox. Deepak Chopra is also associated with the New
Age, and both he and Kumar (who are friends) describe their respective
approaches as “holistic”.
Six years after Satish Kumar was framed as an enemy of reason for
British viewers, more of the story has become evident from two YouTube postings, both from May 2013. “Richard Dawkins: Enemies of Reason – Part I – Slaves to Superstition” was posted by “Anti-Theist” on 12 May 2013, and had 108,649 views as of 14.6.2015; the more reasonably titled “Richard Dawkins interviews Satish Kumar (Enemies of Reason Uncut Interviews)” was posted a week earlier, on 4 May 2013, by Bernardo Segura of the University of Chile, and had 117,372 views. Viewers have overwhelmingly liked both postings – the former has 1018 likes versus 38 dislikes, the latter 913 likes and 42 dislikes.
The two versions are neck and neck on the Internet, but when it was
first aired on British television, all that was seen was Professor
Richard Dawkins’ version. This presented Satish Kumar as an enemy of
reason, a slave to superstition, and a threat to civilization itself.
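
For what the raw figures are worth, the approval ratios of the two postings can be worked out in a few lines of Python (the counts are those quoted above, as at June 2015):

# Approval ratios for the two YouTube postings, using the counts
# quoted above (as at 14.6.2015).
postings = [
    ("Edited documentary (posted by 'Anti-Theist')", 1018, 38),
    ("Uncut interview (posted by Bernardo Segura)", 913, 42),
]
for title, likes, dislikes in postings:
    approval = 100 * likes / (likes + dislikes)
    print(f"{title}: {approval:.1f}% approval")
# Both come out above 95% - the sense in which the two versions
# are "neck and neck".
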
It seems to me, after watching the two versions, that Satish Kumar behaved in a rather more civilized way than Dawkins and Channel 4.
Though most viewers have liked the Dawkins documentary about the evils of superstition, the most popular of the 1447 comments on the uncut version shows support for Satish Kumar – and it comes from a Dawkins fan. The comment, which has 60 thumbs up so far, is by a young Indian man named Aviram Vijh, who wrote, in March 2015:
“I almost revere Dr Dawkins. He is a brilliant scientist who has made
important contributions to our times. However, I don’t find anything
too ‘unreasonable’ about what Satish says. He makes some good points
and clearly states that he is NOT sure, but that is how he understands
the nature of things. You can’t talk to a philosopher and expect him
to talk in scientific terms. That is not what philosophers do.”
Perhaps owing to our convict heritage, we talk in Australia of people
being “framed” in the sense of what Dictionary.com lists as an
“informal” use of the word – “to incriminate (an innocent person)
through the use of false evidence, information, etc”. The
‘incrimination’ of Satish Kumar was not accusing him of being a
criminal, but an ‘enemy of reason’ and a ‘slave to superstition’.
Furthermore, there was no manufacture of false evidence – Satish Kumar
himself provided the evidence by agreeing to be interviewed. He was,
nevertheless, ‘framed’, as I understand the word, through the selective choice of the most unreasonable things he said during the 45 minutes
that Dawkins interviewed him. Dawkins and Channel 4 (C4) could have
selected the sanest, most rational, sensible things that the
philosopher said, of which there were plenty to choose from. This was
obviously not the plan. It was to frame him, as an “enemy of reason”.
I wonder if the philosopher was warned about the intended title of the
program? In my opinion, what Dawkins did to Satish Kumar was not
gentlemanly conduct – “it just wasn’t cricket” to use the old English
gentleman’s exclamation. But Dawkins belongs to a new breed of British
professor who doesn’t believe in such niceties.
I like the fact that Dawkins doesn’t beat around the bush and speaks
in plain language (though he has invented a few neologisms of his own,
such as memes). This is a relief from the obscure waffling of
philosophers and the incomprehensibility of much academic scientific
writing to anyone not familiar with the specific jargon of the
discipline. I like people who talk straight, and Dawkins has a well-
deserved reputation for speaking in plain, understandable English. He
also has a gift for rhetoric and invective in his writing, as well as
a sophisticated sense of humour. He’s a charming guy, and when he
smirks at stupidity (which C4 makes the most of) it is funny for those
who are on Dawkins’ side. This side, in his divided world of reason
and its enemies – is the side of reason. In other words, Dawkins puts
himself forward as the voice of reason. I have often found myself on his side, but not when it comes to his treatment of Satish Kumar – not because I agree more with Kumar than with Dawkins, but because I think it is wrong to frame people as “enemies of reason” without good reason.
Though I doubt that he has full insight into it, Richard Dawkins used standard hypnotic tricks of TV and film-making to frame Satish Kumar
as an enemy of reason. He used the power of suggestion with
manipulation of the audience to convince them of his credibility, get
them onside and demonise “the enemy”. The enemies in this case were
identified as “New Age gurus”, whoever they are, and however they may be
judged as such. It is clear that the Sanskrit word for teacher has
become a pejorative term in the west, since guru means nothing more
than teacher. Doctor also means teacher, but is derived from the Latin, rather than the Indian, tradition.
Let me take you through the stages of the hypnosis and carefully
edited manipulation, step by step. The documentary starts with Dawkins in a room where four or five people are sitting in comfortable chairs in a comfortable English house, with their eyes closed, chanting what sounds like a Tibetan Buddhist mantra. The Oxford professor is sitting rather awkwardly on a chair, looking bemused. His voice, dubbed as a voiceover, introduces the subject at hand by extolling the virtues of science (not mentioning that inoculation against smallpox came to Britain from the Turkish Ottoman Empire, rather than from “western” or British science):
“Science has sent orbiters to Neptune, eradicated smallpox and created
a supercomputer that can do sixty trillion calculations per second.
Science frees us from superstition and dogma and enables us to base
our knowledge on evidence. Well, most of us.”
The film cuts to scenes of Orthodox Jews and Catholics bowing and
praying, taken from his previous documentary The Root of All Evil,
with the continued narrative:
“Previously I’ve explored how organized faith and primitive
religious values blight our lives”.
There is then a sudden edit, intended to rouse the indignation of
women, with Dawkins as their knight in shining armour. We watch a few
seconds of a dark-skinned man with a beard and shaved head, looking
like the stereotypical young Muslim “terrorist”, though sporting an
American accent:
“You take the women and dress them like whores on the street.”
We can’t see Dawkins’ face but we hear him reply “I don’t dress women,
they dress themselves”.
Dawkins has obtained our sympathy and support for his valiant crusade
against chauvinism. He clearly is not a male chauvinist pig like the
other guy. We are even more on his side when his disembodied voice
explains:
“The fault lines lie even deeper than religion. There are two ways of
looking at the world – through faith and superstition, or through the
rigours of logic, observation and evidence through reason. Yet today,
reason has a battle on its hands. I want to confront the epidemic of
irrational superstitious thinking.”
Pleasant hypnotic music and a colourful rotating roulette wheel induce a deeper trance, as the convincing voice of Dawkins warns that, “It’s a multi-million pound industry that impoverishes our country” (though I would have thought his country’s economic growth would have increased as a result of the blossoming of superstition, if indeed this is occurring).
The narrative is interrupted by a bald middle-aged man, dressed in a dark coat and blue shirt that matches Dawkins’ own when he finally
makes an entrance in person. The man is preposterously telling
Dawkins:
“Astrology tends towards the divine and sacred, which I know
you don’t like much.”
We naturally come to Dawkins’ defence at this attack on our hero, who
previously defended the right of women to dress themselves. Our state
of suggestibility has increased, and our critical reasoning decreased.
The image cuts to some of Deepak Chopra’s books on a shelf, such that
you can read the title “Quantum Healing”, followed by a more lingering image of a book titled “Working with Angels, Fairies and Tree Spirits”. The
scene has been set for the framing of Satish Kumar, with the continued
Dawkins narrative:
“And throws up New Age gurus who encourage us to run away from
reality”
The image cuts from the books to an elderly Indian gentleman with fine
features and rather sparkly eyes explaining that “the tree-ness is a
spiritual quality”. “Or the rock-ness?” Dawkins asks. “Or the rock-
ness” Kumar confirms.
One does not see Satish Kumar again till the end of the documentary,
when a bit more of their discussion is shown. What was chosen to
portray him as an “enemy of reason” was the most unreasonable thing he
said, from the perspective of Dawkins and Channel 4. This was the most
unreasonable thing he said in their 45-minute exchange from my
perspective too, but it was not that unreasonable and most certainly
does not make this respected philosopher an “enemy of reason”. There
were Western and Eastern philosophers discussing the idea of
panpsychism long before the New Age.
At no stage does Satish Kumar suggest or imply that we should “run
away from reality”, as Dawkins alleges “new age gurus” are in the
habit of doing. The philosopher does suggest that we go for a walk if
we become stressed, which is great advice, and rather better than
medical advice to take a Valium. He has been teaching a philosophy of
what he describes as “holism” as the opposite of reductionism, and
philosophises that what is spiritual is everything that is not
material. Friendship, compassion and love, he regards as “spiritual”
and he has centred his philosophy of peace around concepts such as
these. He talks about the quality of something being the spiritual
aspect and the quantity of something being the material aspect, which
come together as the whole. This is a metaphysical idea that is not
new, original or particularly New Age. Like many of the ideas of the
“New Age gurus” Dawkins distrusts, Kumar draws Indian philosophical
concepts from the Hindu, Buddhist and Jain traditions, such as ahimsa,
karma, shanthi and moksha. This does not make him an enemy of reason,
though he could quite reasonably be regarded as a guru, using the
Sanskrit terminology of the same traditions.
Kumar has developed the slogan of “soil, soul and society” as an
alternative “trinity” to the Biblical trinity of “father, son and holy
ghost” or “mind, body and spirit”. These are radical ideas, but Kumar
is a radical philosopher in his gentle, friendly way. His ethos is to
promote mindfulness and engagement with life, consciously promoting a
philosophy of serving the planet and others, maintaining an awareness
of the interconnectedness and interdependence of all living beings. He
calls this “spirituality”, which is very different to superstition or
organized religion; Kumar is adamant that he does not believe in the
supernatural. Even if he did, that would not make him an “enemy of reason”, any more than it would other religious people.
The snippet about “tree-ness and rock-ness” being spiritual qualities can only be understood in the overall context of what Kumar teaches
about the difference between what is material and what is spiritual.
His view is that the “human-ness of humans” is their spirituality in
the same way that the “tree-ness” of a tree is its spirituality and
likewise the rock-ness of a rock. This may sound like “woo-woo”, as
sceptics call such ideas in the USA, or more kindly, mysticism. It is
a form of panpsychism – the belief in universal consciousness, which often assumes that “mind” or consciousness precedes matter.
I don’t personally believe in panpsychism; I think that consciousness is an emergent property of complex nervous systems. I think the brain produces the mind – but I also believe that the mind has profound effects on the development and activity of the brain. Over the past two decades I have devoted a lot of time to exploring this relationship between the mind and the brain, and the relationship between the brain and the body, attempting to develop a holistic
model of health. I have been calling this model “holistic” since 1995,
with the simple idea that holism means merely that in biology the
whole can be more than the sum of its parts, and that these parts need
to be looked at in the overall context of the whole.
Holism is necessary to gain an understanding of the big picture, and
is the remedy for both splintering and reductionism in science,
especially biology and medicine. Holism is necessary to understand the
brain, which functions as a whole, producing a single entity that we
call our mind – our self-identity. The mind (of which the conscious mind is only a part) emerges holistically from the activity of the brain – consciousness cannot be localised to a particular region of the organ, though there are some parts of the brain more associated with alertness and concentration than others. The known function of the brain indicates that many mental functions are localised, while others are more widely distributed. Putting it all together and understanding it as a whole is what I regard as holism.
When I adopted the concept of holism twenty years ago, in reference to
my medical approach (which is strictly scientific), I had not heard of the man who apparently coined the term in 1926 in a book called Holism
and Evolution. This was a racist South African general by the name of
Jan Smuts, who was more a politician than a scientist. I haven’t read
this book, and don’t know how Smuts connected holism and evolution.
Satish Kumar is known for his philosophy of holism; Dawkins is an
expert and enthusiast of evolution. I believe in both holism and
evolution, and have tried to gain a holistic understanding of
evolution and of biology generally.
The talking head and hands of Satish Kumar are shown in the introductory minutes of the documentary Enemies of Reason, framing him as a New Age guru and slave to superstition before the grand entrance of Dawkins himself, the star of his own show, and champion of Reason with a capital R. The professor’s narrative continues:
“As a scientist I don’t think that our indulgence of irrational superstition is harmless. I believe it profoundly
undermines civilization. Reason and a respect for evidence are the
source of our progress, our safeguards against fundamentalists and
those who profit from obscuring the truth”.
Then comes the moment the intro has been building up to – the appearance of the guru of atheism, who has been preparing us with a grim warning about the imminent downfall of society at the hands of sinister ‘new age’ gurus like Satish Kumar.
The hypnotic process continues through the program, maintaining a
state of suggestibility in the viewer. You get to like and trust Dawkins, through his very reasonable ridicule of some ridiculous English men and women doing ridiculous things.

It’s a pretty harsh thing to call someone an enemy of reason, especially an apparently kindly old man who has been a leader of the
environmental movement for many years and who made his name, and
gained a reputation and following, as a peace activist and editor,
rather than a man of science. Reason comes in many forms, and Satish
Kumar displayed both reason and wisdom in his responses to the traps
Dawkins set for him. Dawkins came prepared with the first trap – holism.
The scene was filmed with both men sitting on a bench and the
conversation is amicable and polite. Dawkins starts by saying that he
thinks they have quite a lot in common, like a shared admiration of
Bertrand Russell. This was said with the pre-knowledge that Satish
Kumar had an anecdote he liked to tell about how he was inspired to
become an anti-nuclear activist by the example of the English
philosopher being jailed at the age of 90 for his opposition to
nuclear weapons. Dawkins laughed and agreed that Russell was a great
man, then went straight for the kill. He got down to the business of
trying to get the Indian gentleman to prove to the viewing audience
that, like astrologers, dowsers, and conmen who claim to be in touch
with the dead relatives of gullible English audiences, Satish Kumar is
an “enemy of reason”.
“Now, you’re very keen on holism. Holism is a word that I associate with General Jan Smuts. What does holism mean to you?”
Satish Kumar’s idea of holism is centred on connection and context:
“Holism means to me that things are connected – nothing stands in
isolation, so when you look at a tree, you don’t just look at the
tree. You know the tree exists only because there is the soil,
there is the sunshine, there is the air…so many things make up the
tree. Looking at the tree in isolation without the context, without
the holistic view of the surroundings will be wrong, in my view.”
It’s not exactly how I would define holism, but hardly the answer of
an “enemy of reason”. Kumar qualified what he said with the phrase
“in my view” – hardly a dogmatic attitude. This was his attitude
throughout the interview. He didn’t seem arrogant or opinionated;
rather he cheerfully explained his philosophy and world-view in
English words that an English audience can understand. This is not
easy when the language of your philosophy is based on Sanskrit and the
Hindu, Jain and Buddhist traditions of India, rather than Latin and
Greek and the Protestant Christian tradition.
Rather than scientific or religious texts or teachers, Satish Kumar
credits his mother, who was illiterate but wise, for his core
philosophy of ‘being a pilgrim’. He argues that science is only one
way of gaining knowledge and that the ‘scientific way’ is only one way
of many. Insight, love, compassion and intuition are other ways of
acquiring wisdom. These, because they are not material, are what he
calls spiritual, and the development of love, compassion and other
virtues is what means by ‘spirituality’, from my understanding of the
interview. Though efforts have been made to explain love and
compassion in terms of ‘evolutionary psychology’, these are untestable
theories and I find them uninteresting compared to the broader
cultural understanding of love, and philosophies that promote love,
rather than hate. I also have a deep belief in pluralism, and the
right of people to believe whatever they like as long as it doesn’t
harm others. For this reason I regard Satish Kumar and Deepak Chopra
(and the astrologers, mystics and New Agers he confronts in these documentaries) as far less dangerous enemies of reason than the people (the vast majority of whom are men) who order bombs to be dropped on
other people. When the New Age became popular, back in the 1960s, the
children of the Age of Aquarius were resisting conscription to fight
in the Vietnam War and were part of the Peace and Civil Rights
Movement. They were celebrating new, revolutionary music created by
black and white Americans together, and racial integration was one of
the themes. Astrology bubbled away in the background, but the New Age
was a social and cultural movement that transcended astrology; it was
about optimism about a better, more peaceful and harmonious future,
when young people were no longer forced to ‘go to war’ which meant to
kill and maim innocent villagers in Vietnam. The slogan, which was
made into posters for teenagers such as myself, advertised for sale on
the back of comic books imported to Sri Lanka in the 1970s, asked –
“suppose they gave a war and nobody came?”
The five original members of The 5th Dimension – whose recording of “Aquarius/Let the Sunshine In” from Hair topped the American charts in 1969 – were Billy Davis, Jr., Florence LaRue, Marilyn McCoo, LaMonte McLemore, and Ron Townson. They have recorded
for several different labels over their long careers. Their first work
appeared on the Soul City label, which was started by Imperial
Records/United Artists Records recording artist Johnny Rivers. The
group would later record for Bell/Arista Records, ABC Records, and
Motown Records.
The “tribal rock musical” Hair, which debuted on stage in 1967, was written by James Rado and Gerome Ragni, with music by Galt MacDermot (a Canadian composer and musician). Rado and Ragni were not themselves members of the ‘hippie culture’, and actively researched the “hippies” (as Rado has since described them) to write the musical. The show’s main theme was opposition to the Vietnam War, which Rado knew would be controversial. The stage show was also controversial in including a nude scene, which was added for extra spice when the play was moved to Broadway.

By the seventies, when I entered my teens, the New Age had already
been commercialised.
Milos Forman, the Czech film director who made One Flew Over the Cuckoo’s Nest in 1975, made a film adaptation of Hair in 1979.

Renn Woods is an African-American film and television actress/singer, best known for her role as Fanta in Roots, and also as the girl with flowers in her hair who sang "Aquarius" in the film version of Hair.

According to Wikipedia (the best source for some information and downright awful for others):
“The New Age movement is a religious or spiritual movement that
developed in Western nations during the 1970s. Precise scholarly
definitions of the movement differ in their emphasis, largely as a
result of its highly eclectic structure. Nevertheless, the movement is
characterised by a holistic view of the cosmos, a belief in an
emergent Age of Aquarius – from which the movement gets its name – an
emphasis on self-spirituality and the authority of the self, a focus
on healing (particularly with alternative therapies), a belief in
channeling, and an adoption of a "New Age science" that makes use of
elements of the new physics.”

“The New Age movement includes elements of older spiritual and religious traditions ranging from monotheism through pantheism,
pandeism, panentheism, and polytheism combined with science and Gaia
philosophy; particularly archaeoastronomy, astrology, ecology,
environmentalism, the Gaia hypothesis, psychology, and physics. New
Age practices and philosophies sometimes draw inspiration from major
world religions: Buddhism, Taoism, Chinese folk religion,
Christianity, Hinduism, Sufism (Islam), Judaism (especially Kabbalah),
Sikhism; with strong influences from East Asian religions,
Esotericism, Gnosticism, Hermeticism, Idealism, Neopaganism, New
Thought, Spiritualism, Theosophy, Universalism, and Wisdom tradition.”

When I wrote Alpha State: A State of Mind for the New Age, I had not
even made the connection between astrology, the Age of Aquarius and
the New Age. The New Age, for me, began in the 1990s, because that’s
when I was introduced to the movement that Richard Dawkins sees as the
“enemies of reason”, including the physician Deepak Chopra.

Satish Kumar does not claim to treat or heal, and was featured in the first episode – ‘Slaves to Superstition’ – rather than the second – ‘The Irrational Health Service’ – which is devoted to exposing medical charlatans as “enemies of reason”. The Indian-American physician Deepak Chopra, who has made a name for himself by marketing a combination of Ayurvedic treatments, ritual meditation and various relaxation strategies and dietary regimes as “quantum healing”, was targeted with rather more justification than Satish Kumar, who makes it quite clear, in the unedited version of their interview, that when he says spiritual, he does not mean supernatural. Kumar does not believe in the supernatural; he believes that everything is natural, but that all life is interconnected and interdependent. His way of speaking is poetic and Indian, and the concepts he uses are the product of his culture, which is very different to that of Richard Dawkins, who is very English indeed (though born in Kenya, which was then called, by the British, British East Africa).
Dawkins has been described as “old school”, and this has an element of
truth in it – there is no older school in the English-speaking world
than Oxford University. From my perspective both Dawkins and Kumar
are old men, though Kumar is a few years older than Dawkins – which, in the English language as used in Australia, is commonly called being “senior to” him. Old people gain wisdom by virtue of the longer time they have
spent on the planet – the wisdom of age and experience that are
traditionally respected in cultures around the world. Wise old men
have been regarded as sages in the East and the West. The difference
is that Satish Kumar is a sage in the Indian tradition, where sages
and teachers are all called “gurus”, while Dawkins is a sage in the
British tradition, where sages are called “professor emeritus”. By
that token, in the British hierarchy, Dawkins is not yet a sage. In
the Indian tradition, he would be Guru Dawkins or Guru Richard Dawkins. That would mean “venerable teacher Dawkins” in a tradition that has great respect for teachers and teaching, and holds gurus in high regard.
Wiktionary provides this explanation of ‘guru’:
“From Hindi गुरु (guru) / Urdu گرو (guru), from Sanskrit गुरु (gurú, “venerable, respectable”). A traditional etymology based on the Advaya Taraka Upanishad (line 16) describes the syllables gu as 'darkness' and ru as 'destroyer', thus meaning "one who destroys/dispels darkness".
The Upanishad talks about awakening the kundalini and thus realizing
Brahman, the Absolute Reality. Its verses on the importance of the
guru (teacher) are often quoted.
The Upanishads (Sanskrit: उपनिषद्) are a collection of texts in the
Vedic Sanskrit language which contain the earliest emergence of some
of the central religious concepts of Hinduism, some of which are
shared with Buddhism and Jainism. The Upanishads are considered by
Hindus to contain revealed truths (sruti) concerning the nature of
ultimate reality (brahman) and describing the character and form of
human salvation (moksha).”
I have also seen old men deliberately playing the part of sages, by
adopting the stereotypical style of a wise old man, and telling
anecdotes of their youthful exploits and the important people they’ve
met. They play on the cult of the guru – the natural tendency to
worship and obey old men with beards, under the mistaken assumption
that they are wise because of their white hair and long white beards.
This is seen as the superstition of Merlin and Gandalf in the West and
the superstition of the omniscient guru in India. Professors sometimes
wear a beard for this precise reason, though the habit has rather gone out of fashion as long white beards have become associated with the Maharishi Mahesh Yogi and the “mystical gurus” with Mercedes Benz cars.
Modern Western professors are mostly clean-shaven; the fashionable
ones wear a bit of stubble to show how non-conformist they are, while
others sport long hair, a moustache, or shave their heads, according
to their idea of fashion and style. I have watched the evolution of
the dress and grooming of celebrity professors over the years with
some amusement.

The framing of Satish Kumar and Deepak Chopra as ‘enemies of reason’ was done in 2007 by the British television station Channel 4 (C4) with
the enthusiastic assistance of Richard Dawkins who wrote and presented
the ‘documentary’ (in broad modern parlance) titled The Enemies of
Reason. Rather than a documentary, The Enemies of Reason was a polemic
against what Dawkins regards as the irrationality of superstition and
religion, which he sees as threatening civilization itself. The
Enemies of Reason was a continuation of the battle he began with The
Root of All Evil? (he was apparently able to get a question mark added
to the provocative title C4 had planned). Irrationality and
superstition are concerns that I share, but I think what was done to
this elderly Indian philosopher was rather unfair. I reached this
opinion after watching the edited and unedited versions of the
interview between Dawkins and Kumar, which are old news, but I’ve been
belatedly catching up with the antics of the popular British
evolutionary biologist.
There are several edits of their interview on YouTube. The longest, almost 45 minutes long, contains both of the shorter segments I have watched: one shown in the documentary itself, and another posted by a supporter of Satish Kumar, in which the philosopher makes a great deal of sense (cheekily titled ‘Satish Kumar schools Dawkins’).
The long uncut version was posted in 2013 by someone at the University
of Chile, and has had a few more views on YouTube than the C4
documentary itself, which was also posted on YouTube in May 2013. The
two versions give very different impressions of the philosopher and
activist.
In the unedited version of the interview Satish Kumar happily admits
that he is not a scientist, and cannot prove what he says, when he
talks about spiritual and material aspects of the world. He has,
however, been a leader of the environmental movement in the UK –
Dawkins describes him as being on the “sandal-wearing end of the green movement” and says that he “counts among his many fans Prince Charles and the Dalai Lama”. He is also the editor of Resurgence magazine, which I am not familiar with. I have gathered that he is also the founder of an institution called Schumacher College in the UK, which boasted as its first lecturer James Lovelock, the founder of Gaia theory.
Dawkins, on the other hand, is a famous professor of science, from the
esteemed and ancient Oxford University, and Professor for the Public
Understanding of Science at Oxford when he made The Enemies of Reason.
Satish Kumar was not scientifically trained; rather he was trained as
a Jain monk, before being inspired by a book by Gandhi to leave the
order and live in a Gandhian ashram when he was 18. He became well-
known after he and a friend walked on what he called a “pilgrimage”
from the grave of Gandhi to that of JF Kennedy in the 1960s,
apparently inspired by the example of the philosopher Bertrand Russell
being jailed for protesting against nuclear weapons at the age of 90.
I gathered this from the uncut version of the interview, which I
watched before investigating the presentation of Satish Kumar’s
philosophy in the televised documentary, which aired in Britain in
2007.
The Enemies of Reason followed Dawkins’ successful polemic documentary
The Root of All Evil, which also came in two instalments made and
aired with the assistance of Channel 4. Dawkins wrote the scripts,
then narrated and presented the two episodes, which aired in 2006, the
year before Enemies of Reason. He released his controversial The God
Delusion around the same time, winning him many fans and admirers as
well as quite a few enemies. It was a divisive book, and intended to
be so – his mission since then has been to promote the causes of
atheism and Darwinism, which he regards as synonymous with reason and
science.

In a televised interview, Brian Cox was asked about his spat with Chopra: “You’ve agitated or angered one of the supposedly calmest people on the planet, Deepak Chopra.”
“I just tweeted some fact…” (pause, laughs, provoking laughter in the studio audience).
“I tweeted something joking about the origin of the universe, or something, and he tweeted back saying I’m going to shove my cosmic consciousness up your arse.”
Apparently Chopra corrected him when he called him a “diamond-encrusted guru”, saying that the stones on his glasses that Cox was laughing at were actually rhinestones. Cox goes on to talk about the
marvels of science and his own project to try and get the government
to take seriously the remote possibility that the earth will be hit by
a meteor. Welcome to modern science, and the science of the Internet.
This is a science that communicates in plain language and calls “a
spade a spade”. It is the science that Dawkins engages in with
remarkable vigour for a man of his age. Dawkins is a scientist of the
Internet Age, who tweets like the best of them, and engages in
televised debates where he argues that religion and science are
incompatible. Dawkins has been leading the charge against the
irrationality and superstitions of religion and mysticism, without
fear or favour. He’d denounced Christianity, Judaism and Islam – the
monotheistic religions the venerate Abraham – in The God Delusion, how
he turned his attention to the religions of the East, where he was up
against the New Age movement with its New Age gurus.
“I’m not picking on homeopaths, astrologers are mad as well”
These documentaries were made with the assistance of Britain’s Channel 4 (C4), which aired them in 2007. Dawkins and C4 had previously collaborated on The Root of All Evil, which went to air in Britain in 2006. Capitalising on the publicity, Dawkins released The God Delusion, which expanded on the argument presented in the documentary: that the ‘root of all evil’ is nothing more or less than religion. The solution to the superstition and evil of religion, in Dawkins’ argument, was science and a scientific understanding of the world. Since then, Dawkins has become a key protagonist in a battle between science, pseudoscience and anti-science, in which I have been mostly on his side.
The difference between pseudoscience and anti-science is that the former dresses itself up as legitimate science, while the latter is critical of the scientific way of looking at things, or of the visible results of science and technology: ever-more-lethal weapons, pollution and deforestation, and the development of new scientific methods of killing people, or of enslaving them as cogs in a great scientific machine. The anti-science movement also points to the manifold failings of the medical profession: its reliance on drugs and surgery as methods of treatment, and its neglect of natural healing mechanisms. The anti-science movement is also characterised by various conspiracy theories, of wildly varying credibility, including theories about vaccines, fluoride and, more recently, ‘chem-trails’ and ‘geo-engineering’. These conspiracy theories are all argued on the basis of what is taken to be ‘scientific reasoning’ and ‘evidence’, though what their proponents mean by science is not what Karl Popper declared to be the scientific method.
The Austrian philosopher Karl Popper developed the doctrine that the scientific method consists of developing falsifiable hypotheses, followed by efforts to falsify them. You can never be absolutely certain about anything, just more certain. This is the paradigm that all the so-called New Atheists, like Dawkins, accept as the ‘scientific method’, and it is why Dawkins maintains that he is not certain about the non-existence of God, just as he is not certain of the non-existence of the tooth fairy. Uncertainty is a core assumption of Popper’s philosophy, which he called critical rationalism.
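The logic at the heart of falsificationism can be set out in a few plain lines (a minimal sketch of my own, using H for a hypothesis and O for an observation it predicts; the notation is mine, not Popper’s):

    If H is true, then O will be observed.
    O is not observed.
    Therefore, H is false (modus tollens).

The asymmetry is the point: observing O does not prove H (to argue that it does is the fallacy of affirming the consequent), which is why, on Popper’s account, a theory can be decisively refuted but never finally confirmed.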
Popper famously argued that he could not even be absolutely certain that the sun would rise the next day, just because it had risen every day in the past, or because the Laws of Physics dictated that it would. He argued against coming to conclusions on the basis of inductive reasoning, and discussed, in great detail, the relative merits of inductive and deductive reasoning with other sages of his day. These included the Australian scientist John Eccles, who was famous for his role in discovering the electrical and chemical functions of synapses in the brain.
The men and women that Richard Dawkins identifies as ‘enemies of reason’ are, in my opinion, variously misinformed and deluded. So are the people he identified in The Root of All Evil, in which he confronted what he termed the “God Delusion”. In Dawkins’ own terminology, “God delusion” has become a ‘meme’ – a unit of thought that can be transmitted culturally and subjected to selective pressures. Is this a reasonable meme, though? Is it good for society to regard everyone who believes in God as deluded? More importantly, is it good for the British establishment to start calling people of other religions and persuasions deluded because they don’t adopt a British scientific way of looking at things?
The scientific paradigm promoted around the world by Oxford and Cambridge universities, the centres of Britain’s academia for hundreds of years, teaches that delusions are a sign of mental illness, and a diagnostic feature of schizophrenia, mania and delusional disorder.
Schizophrenia is the advocated diagnosis for “odd” or “bizarre”
delusions, while “non-bizarre delusions” are taught to be more
diagnostic of “mania” and “delusional disorder”. The treatment for
delusions, regardless of the label that is applied, is drugs that
block neurotransmitters in the brain, rather than reasoned debate in a
conducive environment. The environment in which those labelled with
delusions are treated is that of a locked ward, from which they are
physically prevented from leaving. This is the case in Britain, and it
is also the case in Australia and, to a greater or lesser degree,
around the world. Under the existing system, though, less than 5% of
the population is “deluded”. This is because religious beliefs are
specifically excluded, in psychiatric diagnostic systems, from being
diagnosable as delusions.
This book is the result of my digesting The God Delusion over a decade, in the back of my mind, while battling the Australian psychiatric system’s very real, and very cruel, treatment of people who are ‘diagnosed’ with delusions. This system was developed in Britain with inspiration from Germany, exported from Britain to Australia as a means of variably controlling natives, settlers, migrants and convicts, and modified by American psychiatry in recent years. A person with a delusion is ‘mentally ill’ according to this diagnostic system, and rather than delusions being caused by others (the act of deluding someone and thus causing delusions), it is taught, in universities around the world, that delusions are caused by chemical imbalances in the brain, and are signs of serious, incurable mental illness. Delusions may be variably diagnosed as evidence of schizophrenia, schizo-affective disorder or mania, all of which can be used as diagnostic justifications for forced treatment in “closed” (meaning locked) wards. Whether a delusion is diagnosed as evidence of schizophrenia or mania depends on how ‘odd’ or ‘bizarre’ the delusion is, and assessments are also made of how strongly it is held (degree of conviction or ‘intensity’). If the delusion is not deemed to be “bizarre”, people can still be diagnosed with “delusional disorder” and have drug treatment forced on them. In the Western medical system, delusions are indicative of psychosis – being out of touch with reality. The standard treatment for people with delusions is not to convince them out of their delusions by reasoned debate, but to force drugs into them by mouth or injection, and to lock them up until they agree that they have an incurable mental illness. This is termed “gaining insight into the disease”. This is the problem with saying that everyone who believes in God is deluded, and one of the problems that I will try to address in this book. Frankly, I enjoyed The God Delusion, but the behaviour of Dawkins since then has made me think he may be a bit of an enemy of reason too. I don’t think it is reasonable for a professor of the public understanding of science to declare people deluded without a knowledge of the current scientific and medical treatment of delusions in his own country.
What does it matter how a professor is dressed, you may ask? It matters a great deal, since it is how the professor creates the image that others judge him or her by. There is a difference between how female and male professors dress and present themselves to the public, to their peers and students, and in what they choose to wear when they are in front of a camera. I am more interested in the clothes men wear and how they shave their heads (if in fact they do) than in the dress-style of women, when it comes to a sociological and psychological analysis of professors, and of how it came to be that an esteemed Oxford professor chose to frame an Indian philosopher who is not a professor, an academic or a scientist as an “enemy of reason”, rather than having a healthy, interesting conversation with him. What makes Dawkins tick, in other words?
This book is the result of my pondering the words and actions of Richard Dawkins more than anyone else’s. I began by liking him, though I had never met him, and had no idea of what he looked like, how old he was or how he dressed. I had no idea whether he dressed at all or wrote stark naked – it was his words, and how he expressed his ideas about the evils of organized religion in The God Delusion, that made me like him. I had previously read The Selfish Gene, but other than the concept of memes, which I have found useful, I was not much influenced by his evolutionary ideas, having already read and very much enjoyed the popular science books of the great Stephen Jay Gould, whose opinions about evolution I had every reason to trust, and did. Gould was a Harvard professor who was also famous as an “evolutionary biologist” and an opponent of “creationism” being taught in American schools, but he argued that religion and science belonged to different domains and could co-exist peacefully. Gould also wrote passionately and wittily about the failings of science, where evolutionary theory had gone wrong, and where academia had made mistakes in his own field of palaeontology. He also wrote about the difficult questions in evolution, such as how the theory explained the marvel of the human eye or a bird’s wing (how, for instance, is a part-wing an evolutionary advantage?). One thing I learned from Gould that I hadn’t considered before was the “flaws in design” which disprove the idea of a perfectly designed creation – the theme of his book The Panda’s Thumb.
I was also mesmerised by Gould’s Wonderful Life, about the fossil discoveries in the Burgess Shale in Canada of a fascinating array of animals, most of which disappeared in mass extinctions that followed the “Cambrian explosion” I had read about when I was a child. I grew up reading about dinosaurs and evolution, and didn’t find Dawkins’ idea that the primary unit of selection is the gene particularly startling. This was the central theory put forward in The Selfish Gene. I knew that some people would misunderstand the title as meaning that people are naturally selfish – that they have a “gene” for selfishness. This would be a natural mistake, but it is not how I interpreted Dawkins’ theory. I understood it quite easily, since it is the logical, linear result of reductionism within a materialist paradigm. This is the paradigm in which I was educated.
I have not read Dawkins’ book The Blind Watchmaker (1986) but I have watched the documentary he made of the same name. In it, Dawkins argues for gene-centred evolutionary theory and against “intelligent design”, as well as against the bizarre claims, made by biblical fundamentalists in the USA at the time, that fossilized dinosaur tracks included the footprints of humans alongside those of the dinosaurs. This confuses two issues: crazy ideas, based on fundamentalist Christian creationism, that the earth is only 6000 years old, and reasonable criticism of a gene-centred view of evolution. I didn’t doubt evolution, and I was well aware that evolutionary theory proposes “randomness” only for the variation within a species and not for its selection, which is far from random. Selection is based on reproductive success, which requires survival to reproductive age, at least. Darwin’s and Wallace’s similar theories were that speciation occurred through the survival of some individuals rather than others. These surviving individuals passed on various inherited traits to their offspring, and the whole process of evolution was determined by the fact that some survived and others didn’t. The strong survived and the weak perished; the best adapted to an environment survived while those less well adapted perished or, rather, did not have as good reproductive success. Dawkins argues that selection occurs not at the level of the individual but at that of the gene. Others think that natural selection also occurs at the individual and group levels. In my opinion there is natural selection at all these levels, as well as between species (which doesn’t cause speciation, of course), and there is also natural selection of what Dawkins calls “memes”. Biology and zoology merge into anthropology and sociology, without clear borders.
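The distinction between random variation and non-random selection is easy to demonstrate with a toy simulation (a minimal sketch in Python; the population size, mutation size and fitness rule are illustrative assumptions of mine, not anyone’s published model):

    import random

    # Toy model: a population of individuals, each carrying one heritable trait value.
    # Variation is random (mutation); selection is not (higher trait means more offspring).
    random.seed(1)
    population = [random.uniform(0.1, 1.0) for _ in range(200)]

    for generation in range(50):
        # Selection: parents are drawn in proportion to fitness (here, the trait itself).
        parents = random.choices(population, weights=population, k=len(population))
        # Variation: each offspring inherits the parental trait plus a small random mutation.
        population = [max(0.01, p + random.gauss(0, 0.02)) for p in parents]

    # The mean trait climbs steadily, even though every mutation was random.
    print(sum(population) / len(population))

Remove the weighting in the selection step and the mean merely wanders: the direction in evolution comes entirely from differential reproduction, not from the mutations themselves.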
The theory of the interconnectedness of life, which I read about in the early 1990s, did not come to me from an ancient or mystical source. It was strictly biological and materialist, in the opening pages of a book on Australian fossils with a foreword by the great British zoologist and film-maker David Attenborough, who truly deserves the title of modern sage. The theory was described in the book Riversleigh as the “concept of the four-dimensional bioblob”, which is that all organisms are related to each other and physically connected through the fourth dimension (time), creating a “gigantic shape-changing organism moving through time and space”. This is an extension of the old ‘tree of life’ concept, bringing it into the modern age, and is an assumption of modern science, though the meme of the “four-dimensional bioblob” never caught on.
I read about the four-dimensional bioblob concept in 1992, and was struck by the implication that all humans are physically connected, and that all primates are physically connected, if one goes far enough back in time. Everyone on the planet has been evolving for the same amount of time; there is no race that is more or less evolved, though racial differences do exist, and are obvious to see. To recognise race and racial differences is not to be racist – to make negative judgements on the basis of race is. Making negative judgements on the basis of national stereotypes, by contrast, is regarded as much more politically correct around the world, and is the basis of much hilarious comedy. Somehow people don’t mind their nationality being made fun of by themselves and even by others, though nationalism has been the cause of many wars over the years.
Nobody makes fun of people’s race on British TV anymore, but it was commonplace in the 1960s and 1970s, with tasteless sitcoms like Love Thy Neighbour, whose central theme was the supposedly comical intolerance and traded insults of “honky” and “nig-nog” between a racist ‘black’ man and a racist ‘white’ man, both of them middle-aged and living in the British suburbs. These were ignorant “working class” stereotypes that the TV stations in the UK exported for us to watch in Australia, where no one was insulted as a nig-nog, but a whole people was denigrated as “abos”, “coons”, “boongs” and other terms of abuse for what the British homogenized as “blacks”. I’ll take this matter up again later.
The racial stereotypes of Love Thy Neighbour, which aired in the early 1970s, were a distinct improvement on those of the Black and White Minstrel Show of the 1960s.
Devotion to one’s guru is deeply engrained in Indian culture, more so than devotion to one’s professor, however esteemed they may be, and however popular they are. No one worships Dawkins, but he has many fans who hero-worship him. He has a fan base which he plays to and interacts cordially with. Dawkins is a professor of the Internet Age, and was computer-savvy long before the rest of us. He was part of the generation that developed computers and first used them in the biological sciences. His debate with Satish Kumar showed how different their perspectives are.
An enemy of reason is a person who won’t listen or let you finish your
sentence. A person who is rigid and dogmatic, maybe. Are they enemies
of reason or enemies of Dawkins? Not enemies he is conscious of, but
unconscious or subconscious enemies?
Kumar was the only self-identified philosopher featured in the first episode, subtitled ‘Slaves to Superstition’.
In the second episode, titled The Irrational Health Service, the enemies of reason were identified as the purveyors of various alternative health treatments, many of whom are indeed enemies of reason, and dangers to society. I’m right behind Dawkins in his exposé of alternative health quackery, including the pseudoscience of Chopra and that of the purveyors of Traditional Chinese Medicine. I strongly agree with him that ancient does not mean true, or the product of accumulated wisdom through the ages. The ancients in India and China (and those in the Middle Ages) did not know about the Periodic Table of Elements or the existence of cells, and knew very little about anatomy and physiology. I agree that these discoveries were products of the European Enlightenment that have transformed, fundamentally, medical and scientific understanding. Doctors can also be enemies of reason, though, and medical treatments can do more harm than most alternative treatments, because the drugs used are more powerful than the usual concoctions of herbs, vitamins and minerals that people waste their money on, and much more dangerous than the placebos sold as homeopathic treatments. There are many situations in which water is safer than a prescribed drug – though of course, one might as well drink water out of a tap at home.
How guilty is Satish Kumar of being an enemy of reason? What about the two doctors, Deepak Chopra and Manjir Samanta-Laughton, who were also featured in The Enemies of Reason, along with assorted con artists and obviously deluded New Age charlatans? What is the New Age movement, and to what extent has New Age philosophy become a religion? Was it a religion all along, or has it been made into one?
The first book I actually finished writing, back in 1996, was titled Alpha State – A State for the New Age. The first chapter was titled “Western Medicine and Holism”. I’ll begin with the first two paragraphs of this book, which I never sought a publisher for:
“In recent years there has been increasing criticism of and
disenchantment with western medicine and psychiatry from a range of
people. These include patients who have in their perception received
superior care from ‘alternative’ and natural health practitioners,
natural practitioners themselves who criticise western medicine for
its excessive use of potentially toxic and extremely expensive drugs
and its lack of a holistic approach to individuals they treat, which
leads to a lack of sustained cures for many of the conditions treated
by western doctors.
“A rare group of critics of western medicine come from within the
community of western trained doctors themselves. Some of these defect
to ‘alternative medicine’, others give up medicine altogether. A few,
like Deepak Chopra have made attempts to integrate eastern and western
medicine but have, as a result of it been denigrated and ostracised by
the establishment doctors and specialists who view him as a crackpot
at worst or a maverick at best.”
A few years later I wrote in red pen the note “I think, really, he is
a bit of a crackpot! Especially his explanation of Vedic science and
‘quantum healing’”.
Back in 1995 and 1996, when I wrote this book, I was excited about the New Age – the Age of Aquarius. The hippie anthem from the musical Hair, which proclaimed “the dawning of the Age of Aquarius”, was one of my favourite songs when I was a teenager. The Age of Aquarius topped the Billboard singles charts for six weeks in 1969, and even made it to the Sri Lankan airwaves, where it was played on the country’s only English music show on the radio at the time. It’s a brilliant song, carefully crafted to draw you in and hypnotise you with melody, harmony and groove. A strong Black female voice lures you in before the chorus that reverberates in your mind, reinforced as it is repeated: “The Age of Aquarius, the Age of Aquarius...Aquarius……Aquarius.” The last Aquarius resolves decisively in the melody.
I only noticed the words of the chorus when I fell in love with the
song. I fell in love with the Black woman who sang the song too,
though I had no idea who she was or what she looked like. I fell in
love with the beauty of her voice, like many other young men of the
time. The song remained in the back of my mind till the Age of
Aquarius was reinvented as a marketable commodity and profitable
religion in the 1990s.
I didn’t listen to the lyrics of The Age of Aquarius closely enough to realise that they were singing about an age based on astrological conjunctions, and that the Age of Aquarius followed the Age of Pisces. I didn’t believe in astrology, though I knew I was “on the cusp between Virgo and Libra”, having been born on the 22nd of September. Associating the idea of Virgo with a young woman – a virginal one at that – I preferred, in my youth, to regard myself as a well-balanced Libran rather than a feminine Virgo. In the 1990s, when I was in my thirties, I was told by experts on such matters that being born in 1960 meant I was ‘definitely’ a Virgo and that I was also a Rat according to Chinese astrology. By then, I couldn’t have cared less – I had long outgrown the superstition of astrology; even as a child I never really believed in it. I did know that my mother was a Piscean and my sister a Gemini, though I never really attributed any other characteristics to these labels.
The New Age I was writing about in the 1990s had less to do with belief in astrology than with a social and intellectual movement that looked forward with optimism to the millennium. This movement was, as Richard Dawkins rightly points out, extremely superstitious – it was a time when people started seriously discussing the healing powers of crystals and the predictive powers of tarot cards, along with “parallel dimensions” and “alternative realities”. Young people in Melbourne in the 1990s were enticed into shops that sold charms and assorted magical trinkets. Rupert Murdoch’s HarperCollins corporation led the action – a series of books published as their “Aquarian” series included the titles Understanding Astrology, Understanding Tarot, Understanding Runes, Understanding Reincarnation, Understanding Crystals, Understanding Chakras, Understanding Numerology, Understanding Dreams and Understanding Astral Projection. If you hadn’t been driven crazy enough by this superstitious nonsense you could obtain more delusions from Dowsing, How to Develop your ESP, Incense and Candle Burning, Invisibility and, believe it or not, Levitation. If you believed the rest you probably would not baulk at levitation.
This Aquarian series is the real culprit when it comes to enemies of reason, and it is Rupert Murdoch who should be interviewed by Professor Dawkins if he wants to get to the root of the evils of the New Age religion. Let me quote from Understanding Auras, first published in 1987 and authored by an otherwise unidentified man named Joseph Ostrom. The book cost me $12.95, and I bought it only because I was collecting evidence of a conspiracy to drive people mad and then diagnose them as having schizophrenia. This was an original hypothesis – I was looking for evidence, based on the observation that the same beliefs being promoted by HarperCollins in their Aquarian series were being presented as evidence of schizophrenia and psychosis in other publications by the same corporation, as well as by mainstream medical and psychiatric opinion since long before I studied medicine. What’s more, they really were delusional beliefs.
Looked at in religious terms, one might put forward the hypothesis that the New Age religion is one of many religions created by Rupert Murdoch and his media empire. He took a social movement that crystallised around opposition to conscription during the Vietnam War – the theme of Hair, which generated several hit songs – and made a religion out of the Age of Aquarius: the New Age religion. This is quite different to the New Age movement, which is spiritual rather than religious, and characterised by its emphasis on ecology, environmental values, international friendship and peace, and similar values that are sometimes regarded as “green” or “left”. I think the term “left” is best left alone, with its connotations of the pompous debates between the Left and Right wings of the British parliament, though the concept of left and right in politics has changed a lot since then. Left means reasonable, and right means extremist, greedy, materialist and capitalist. But left also means corrupt trade unions and clueless street marchers, marching under various “left wing” banners and slogans. Left meant Marxist, and I never was a fan of Karl Marx. If Marxists meant fans of Groucho or Harpo Marx I would have seen more sense in it. They were funny. I have to say I find economics boring, which is why I know nothing about it; maybe that’s why I’m poor.
The New Age was fresh and new
by and instructed them on how to read auras and chakras.
What was, in the 1970s, a new age is not new anymore. We are well into the Age of Aquarius; about forty years into it. This age is going to last for hundreds or thousands of years – it’s our long-term future.
The song peaked at number one on the U.S. Billboard Hot 100 pop singles chart for six weeks in the spring of 1969, and was eventually certified platinum in the U.S. by the RIAA.
Drugs, racial integration, nudity, anti-conscription, anti-Vietnam War
– music and optimism.
Counter-culture; counter-terrorism and Chomsky
Bertrand Russell
/
“I call him the diamond-encrusted guru,” says the British physicist Brian Cox of Deepak Chopra, referring to Chopra’s spectacles. It got a laugh from the studio audience. Cox knows how to work an audience, as indeed Chopra does. Born in 1968, Cox is a young professor at the University of Manchester and a science presenter on the BBC. He knows much more than Chopra about physics, and this gives him a good position from which to make fun of the New Age “guru” and his wacky ideas about quanta, non-locality and other quantum physics concepts.
I had never heard of Manjir Samanta-Laughton, but I noted the obvious fact that Satish, Chopra and Manjir are all of Indian origin. Chopra and Satish Kumar have many admirers, whom Dawkins is prone to describe as “followers”, in line with his description of Satish Kumar as a “New Age guru”. Chopra is also frequently called by the same Indian word for teacher. The endocrinologist himself prefers to call himself a sage and a scientist – the Chopra Center held a “Sages and Scientists” conference in the USA in 2014. The speakers at Sages and Scientists included many names I had never heard of, and just a couple that I had: the neuroscientist Christof Koch (who worked with Francis Crick, famous more for his role in the discovery of the structure of DNA than for his later career as a neuroscientist) and Chopra himself, whose book Quantum Healing I read twenty years ago, when I was seeking an integration between Eastern and Western scientific models myself.
In the USA, medical doctors are identified by the letters MD after their name (standing, in the direct American way, for Medical Doctor). This is a status symbol, earned through a medical degree, but it is more than a status symbol – it indicates that one has studied for and passed a basic medical degree. In the American system, every lecturer in a university is called a ‘professor’. The British, more formal and obsessed by hierarchies in their political, academic and medical systems, have a complex system of titles and letters that can be written before and after one’s name. These are granted on the authority of the Crown that the Queen’s ancestors have worn on their heads for several centuries. This crown is encrusted, I have heard, with jewels from India. The same Crown inflicted this rigid hierarchy and system of Royal honours and qualifications on its colonies, including Australia.
The British university system places professors above associate professors, who are above plain old lecturers, who are divided into senior lecturers and junior lecturers, who are above tutors. The progression between tutors, lecturers and professors is evident from their job descriptions – the tutors teach, the lecturers lecture and the professors profess to know. It’s all about knowledge, but taught according to a strict hierarchy. The main objective – the academic objective – is to climb the ladder, though when one gets to the top, one’s main job is to administer the behemoth of one’s “department”. Some professors love building empires, and these are the ones that succeed in doing so. Academia is a competitive world in the West. This means that professors have to beat their competitors to get the top job. This is done through a range of means, the most obvious of which is publications – the number of publications and, often to a lesser degree, the quality of those publications. Even when considering the “quality” of a publication, the yardstick is not the scientific merit of what is written, but the prestige and reputation of the journal or book publisher.
Richard Dawkins, Deepak Chopra and Satish Kumar have many ‘fans’, to use another one-time slang term that has become common vernacular. Talking of fans as enthusiasts rather than rotating contraptions emerged from the era of TV and mass media. Fan was short for fanatic – the Beatles had fans, with an ambition of attracting as many fans as their idol Elvis Presley, who was worshipped as ‘the King’. This is pop culture, and I am a child of the pop culture of my time; so, despite their age, were Dawkins, Chopra and Kumar. Pop culture, like all culture, changes and evolves. It evolves much faster than biological evolution – cultural evolution occurs over weeks, months, years, decades and centuries, rather than millennia and millions of years. Pop culture is a more recent development than popular culture, though the terms are used synonymously. In the world of music, though, there is a clear distinction between what is “popular” and what is “pop music”. Pop music is not considered to be “high art” – the fact that it is popular is seen as indicating its inferiority to more refined tastes. There is plenty of snobbery in the world of music, and it is directed towards what one declares one likes; when I grew up it was not cool to admit you liked pop music, even if you did. It was OK to be a music fan or a sports fan, because the slang word fan had evolved in its use such that it no longer meant fanaticism, but mere enthusiasm. Love might be a better word – I think it reasonable to love both the music and the musicians who bring one the gift of beautiful music. Love is very different to fanaticism, a point that I suspect would be better appreciated by the philosopher Satish Kumar than by the scientist Richard Dawkins.
This culture had outgrown fanaticism and the hysteria of teenage girls screaming and sobbing at the mere sight of four white English boys who were often on TV, where they played to the love-struck emotions of girls who imagined they were “in love” with them. The Beatles gave the word “love” a bad name. It became clichéd and silly, and when I grew up in the Orient and the Antipodes my family and friends were not fans of the Beatles, Elvis Presley or any of the “pop stars” who sang silly love songs. We were variably interested in classical, pop, rock, folk and jazz music, but not fanatical about the men and women who performed it. This is not to say that we didn’t “fall in love” with pop stars – but at least we didn’t
which is, again, short for popular – I am a child of Western popular culture, and, had I remained in England, where I was born, I would no doubt have been brought up to have an ambition to go to Oxford or Cambridge.
Deepak Chopra
Eric Topol, MD
Jack Andraka
Kyra Phillips
Rudolph Tanzi
Bruce Vaughn
Christof Koch, PhD
/
Deliberate meditation position of hands
/
“Peace with the Cosmos”
I hope this will be a contribution to science and philosophy, more than a literary work. I like writing, but I don’t enjoy typing, and am not good at it. I’ve seen the fluidity, speed and accuracy of writers who grew up typing and marvelled at it. For me, typing is a slow and laborious process, conducted with two fingers and two thumbs.
With pen and paper I do much better, since that’s how I learned to write – but doctors don’t learn to write well. In fact, ‘doctor’s writing’ was notorious for its illegibility before the Internet Age. I studied medicine and worked as a GP before the Internet Age transformed medicine, along with the rest of science. Now what matters is not how legible your handwriting is but how clearly you express yourself, and how well you reason based on evidence – the Internet Age brought the era of “evidence-based medicine”, a catch-phrase that emerged in the early 1990s and is still widely used. Who in their right mind wouldn’t want medicine to be evidence-based? Having been trained in evidence-based medicine, and having worked as a doctor who thought his treatment methods were based on sound scientific evidence, I have been pondering the evidence on which I and other doctors base our treatments. In doing so I have tried to maintain a holistic approach – not to miss the forest for the trees, as the saying goes.
One problem lies in what “evidence based medicine” analyses, and the
categories it uses for this analysis. Wikipedia provides this example:
“A 2007 analysis of 1,016 systematic reviews from all 50 Cochrane
Collaboration Review Groups found that 44% of the reviews concluded
that the intervention was likely to be beneficial, 7% concluded that
the intervention was likely to be harmful, and 49% concluded that
evidence did not support either benefit or harm. 96% recommended
further research.[52] A 2001 review of 160 Cochrane systematic reviews
(excluding complementary treatments) in the 1998 database revealed
that, according to two readers, 41.3% concluded positive or possibly
positive effect, 20% concluded evidence of no effect, 8.1% concluded
net harmful effects, and 21.3% of the reviews concluded insufficient
evidence.[53] A review of 145 alternative medicine Cochrane reviews
using the 2004 database revealed that 38.4% concluded positive effect
or possibly positive (12.4%) effect, 4.8% concluded no effect, 0.69%
concluded harmful effect, and 56.6% concluded insufficient evidence.
[54]”
Complementary and alternative – what do they mean?
Reasoning involves a manipulation of words and numbers
/
y in medicine; not like in geological time.
I wonder and ponder much more than I speak or write. I didn’t always. I used to speak and write much more than I do nowadays, back when I was working as a family physician. For many years most of my writing consisted of “notes” about my patients, and most of my speaking was giving medical advice. I also spoke to my family and friends, and rarely to my medical colleagues, but most of my talking was to patients, and most of my writing was also about patients. Then, twenty years ago, I became a patient myself. This had the effect of stopping me from talking, from which I have never fully recovered. I can talk, quite reasonably, about many things, but I don’t customarily do so. The physical effects of the haloperidol syrup that first stopped me talking wore off long ago, and I am quite able to talk normally again. But the total experience of having this drug forced on me has scarred me much more than its chemical effects alone.
This is a book about me – my wonderings, ponderings, musings and conclusions. Why should you care about any of these? Well, you may not. I am not a philosopher or a historian, but I have come to philosophical and historical conclusions based on my reasoning – not as a philosopher or historian, but as a scientifically trained doctor. Medical education is not big on philosophy and history, nor on geography, geology and the other non-medical sciences. Scientific training is strictly divorced from “the arts”, which means that we learned nothing about emotional responses to music, conversation or literature in our training. These are some of the things I have been pondering over the years, and occasionally writing about.
Most of my writing over the past twenty years has been about
psychiatry, though I am not a psychiatrist. The books I have written
have not been widely read, though they have been diagnosed as evidence
of mental disorder and paranoia by one of Melbourne’s most senior
professors of psychiatry, Professor Bruce Singh, who offered the
opinion, in a letter to the Victorian Medical Board, that “whilst
individual paragraphs and pages contain scientifically valid material
and thoughtful comments and criticisms, interspersed amongst these are
clearly paranoid outbursts and much is made of guilt by association”.
Professor Singh presented his perspective on our one and only meeting in a letter addressed to Dr Joanne Katsoris, registration manager at the Medical Practitioners Board of Victoria, with this introductory paragraph:
“I saw Dr Senewiratne on 1 May 2001 in my office at the Royal
Melbourne Hospital. Dr Senewiratne presented with a large briefcase
full of multiple manuscripts that he had written and a book that he
had published himself, he showed me many of these over the course of
the interview. I had also been sent a copy of his manuscript on
schizophrenia prior to my seeing him. My comment on all of the
documents shown to me is that whilst individual paragraphs and pages
contain scientifically valid material and thoughtful comments and
criticisms, interspersed amongst these are clearly paranoid outbursts
and much is made of guilt by association.”
Professor Singh did not mention by name the “manuscript on schizophrenia” I had laboured over for many years; it was a 400-page book I had recently finished, and printed a dozen copies of, which I had, after some deliberation, titled The Politics of Schizophrenia – dodging the Inquisitors and Curing Delusions. I knew that the title would be seen as provocative when I sent a copy to the Medical Board, after they accused me of being an “impaired practitioner”. I was defiant in my comparison between psychiatrists and Christian inquisitors, and in my suggestion that it was a good idea to dodge them. I did not really expect Professor Singh to agree with my argument that:
“The word psychiatry is derived from the Greek psyche (mind/soul) and
iatros (treatment). It refers, literally, to treatment of the mind.
This treatment can be cruel or kind. If it is cruel, it can be
expected to compound existing problems (and can, moreover, be defined
as torture). If it is kind, it can lead to relief of suffering and
distress. Despite the popular idiom that ‘one must sometimes be cruel
to be kind’, cruelty and kindness are opposites; it is not possible to
treat someone cruelly and kindly at the same time. Neither is it
necessary to be cruel to achieve therapeutic success, unless
therapeutic success is defined as ‘compliance with the orders of the
therapists’. If successful therapy is so defined, and, in the area of
psychiatry it often is, then cruelty becomes a powerful tool for
forcing upon the ‘patient’ the opinions and views of the therapist.”
(p 170)
Psychiatry in theory is rather different to psychiatry in practice. I had rather hoped that Professor Singh would give me a fair hearing, after reading how important he recognised cultural context to be in psychiatry. Transcultural psychiatry was a fashionable new trend that had been featured in the local newspapers, along with a photograph of Professor Singh, who is noticeably “transcultural” in that he looks Indian and has a well-known and common Indian name. As you can see from his Flickr posting (which mentions only that he is Chair of the Board), Professor Bruce Singh wears a tie and a suit, and this is how he was dressed when he spent an hour attempting to diagnose me with one or other “disorder”. He did have the option of declaring me to be mentally well, but that would have meant allowing me to have divergent views in medicine without pathologising me.
/
Professor Singh wrote, regarding my medical approach: “He states that he takes the holistic approach to medicine and says that he practices scientific medicine and does not prescribe herbs or vitamins. He listens to his patients and uses investigations when appropriate”. In his final “impression” the important man wrote: “I suspect that it is probably true that Dr Senewiratne has the capacity to insulate his more conventional medical treatment (with the notable exception of psychiatric disorder) from his broader range of ideas and views regarding the world for much of the time.” His final curse:
“This is not the situation of a man willing to accept that he has an
illness and to have treatment for it – this is an intelligent man who
is creative and intellectually highly active who is convinced that he
is not ill nor in need of treatment. This is a potentially dangerous
combination for any professional and particularly a doctor interacting
with the community.”
This paragraph was bound to stick like mud, since it was what was read last – often the only thing that is actually read of a five-page report. It was a written curse that doomed me, and that I have been fighting ever since. I have been pondering and wondering about this curse, and how to escape it. I have been thinking about labels of mental illness as curses, and about men like Bruce Singh as the Grand Wizards of the cult of psychiatry – or the high priests of the religion of psychiatry, if you prefer. They are also the masters of the mind in the Free World, who rule their global enterprise through manipulation and hypnosis, as well as what they call “biological treatments” – drugs, shock treatment and brain mutilation, better known in Australia by the euphemisms ‘medication’, ‘ECT’ and ‘psychosurgery’. This is the paradigm I had when I was diagnosed as having a “paranoid personality” by Professor Singh, though I minced my words when I spoke to him, and hid my disdain for the empire he and the other kings of psychiatry had built in Australia. Not that Australia was my home; in fact, Bruce Singh was born in Australia long before my parents, sister and I arrived here in 1976. But I had sympathy with the Aboriginal view of the White Invasion, and still do. I had discovered many things about this invasion that were brought to light by white and black academics in the 1970s, after the abandonment of the White Australia Policy.
In Australia there is an academic discourse that refers to “black” and “white” historians, who have rather different perspectives. There are also white historians who are accused of having a “black armband view” of history. Not being a historian, I have never ventured into this debate, but I couldn’t help being aware of it and influenced by it. By 2001, when I was assessed by Professor Singh, I had already researched and written Eugenics and Genocide in the Modern World, which touched on, but didn’t fully explore, the connection between the White Australia Policy, the atrocious treatment of Aboriginal people and the Anglo-American eugenics movement. This same racist eugenics movement shaped the development of academic psychiatry in Australia, being reinvented as “psychiatric genetics” after the term “eugenics” fell from favour, when the monstrous application of the science of “breeding better humans” by the Nazis was exposed. Of course, psychiatric genetics differs in many respects from eugenics: the mechanism of inheritance, in the form of the DNA molecule, had not yet been discovered when Francis Galton wrote Hereditary Genius, in which he claimed to have evidence that the Australian Black is a “further grade less intelligent” than the African Black, who was two grades, on average, below the White Man.
In 1998, when I was researching and writing Eugenics and Genocide in the Modern World, my partner Sara, who was studying education at the University of Melbourne, found several books on eugenics in the university library, including Hereditary Genius and other books by Galton, as well as works by his disciples, all of whom agreed there was an urgent need to implement policies of segregation between Blacks and Whites, as well as positive and negative eugenic programs to “preserve the purity of the White Race”. This racist science was inevitably embraced by a nation whose first Act of Parliament was to restrict immigration, in what has since been described as the “introduction of the White Australia Policy”.
Wikipedia presents the bare outline of the White Australia Policy, from official sources:
“The term White Australia Policy comprises various historical policies
that intentionally favoured immigration to Australia from certain
European countries, and especially from Britain. This was an attempt
of Australians to help shape their own identity after federation. It
came to fruition in 1901 soon after the Federation of Australia, and
the policies were progressively dismantled between 1949 and 1973.[2]
Australia's official First World War historian Charles Bean defined
the early intentions of the policy as "a vehement effort to maintain a
high Western standard of economy, society and culture (necessitating
at that stage, however it might be camouflaged, the rigid exclusion of
Oriental peoples)."[3]
Competition in the goldfields between British and Chinese miners, and labour union opposition to the importation of Pacific Islanders into
the sugar plantations of Queensland, reinforced the demand to
eliminate or minimize low wage immigration from Asia and the Pacific
Islands. Soon after Australia became a federation it passed the
Immigration Restriction Act of 1901. The passage of this bill is
considered the commencement of the White Australia Policy as
Australian government policy. Subsequent acts further strengthened the
policy up to the start of the Second World War.[4] These policies
effectively allowed for British migrants to be preferred over all
others through the first four decades of the 20th century. During the
Second World War, Prime Minister John Curtin reinforced the policy,
saying "This country shall remain forever the home of the descendants
of those people who came here in peace in order to establish in the
South Seas an outpost of the British race."[2]
The policy was dismantled in stages by successive governments after the conclusion of the Second World War, with the encouragement of
first non-British, non-white immigration, allowing for a large multi-
ethnic post-war program of immigration. The Menzies and Holt
Governments effectively dismantled the policies between 1949 and 1966
and the Whitlam Government passed laws to ensure that race would be
totally disregarded as a component for immigration to Australia in
1973. In 1975 the Whitlam Government passed the Racial Discrimination
Act, which made racially-based selection criteria unlawful. In the
decades since, Australia has maintained largescale multi-ethnic
immigration. Australia's current Migration Program allows people from
any country to apply to migrate to Australia, regardless of their
nationality, ethnicity, culture, religion, or language, provided that
they meet the criteria set out in law.[2]”
Two references in the Wikipedia introduction to the White Australia Policy are the National Museum (reference [2]) and the Australian government’s ‘official historian’, whose name is, with some irony, Mr Bean. Mr Bean is reference [3]. I had never heard of him, but he has apparently been a very influential man. Captain Charles Edwin Woodrow Bean (18 November 1879 – 30 August 1968), usually identified as C.E.W. Bean, was an Australian schoolmaster, judge’s associate, barrister, journalist, war correspondent and historian.[1]
Bean is renowned as the editor of the 12-volume Official History of Australia in the War of 1914–1918. Bean wrote Volumes I to VI himself,
dealing with the Australian Imperial Force (AIF) at Gallipoli, France
and Belgium. Bean was instrumental in the establishment of the Australian War Memorial, and in the creation and popularisation of the ANZAC legend.
An Oriental Orientation
The term "Orient" derives from the Latin word oriens meaning "east"
(lit. "rising" < orior " rise"). The use of the word for "rising" to
refer to the east (where the sun rises) has analogs from many
languages: compare the terms "Levant" (< French levant "rising"),
"Vostok" Russian: ?????? (< Russian voskhod Russian: ??????
"sunrise"), "Anatolia" (< Greek anatole), "mizrahi" in Hebrew ("zriha"
meaning sunrise), "sharq" Arabic: ???? (< Arabic yashriq ???? "rise",
shur?q Arabic: ????? "rising"), "shygys" Kazakh: ????? (< Kazakh shygu
Kazakh: ???? "come out"), Turkish: do?u (< Turkish do?mak to be born;
to rise), Chinese: ? (pinyin: d?ng, a pictograph of the sun rising
behind a tree[1]) and "The Land of the Rising Sun" to refer to Japan.
Also, many ancient temples, including pagan temples and the Jewish
Temple in Jerusalem, were built with their main entrances facing the
East. This tradition was carried on in Christian churches. To situate
them in such a manner was to "orient" them in the proper direction.
When something was facing the correct direction, it was said to be in
the proper orientation.[citation needed]
East in Sinhalese:
Professor Singh is an important academic psychiatrist in Australia. He is co-author of the main textbook used to train medical students in Melbourne (and the state of Victoria), Foundations of Clinical Psychiatry, the first edition of which was published in 1994. This was the same year that the American Psychiatric Association published the DSM-IV, updating the DSM-III-R (where the R stands for ‘revised’). Foundations of Clinical Psychiatry, the basis for the diagnostic labelling that everyone who graduated as a medical doctor in Melbourne and Victoria was obliged to learn (if not accept), presents students with a choice between “The DSM-III-R Classification” in Appendix B and the “ICD-10 Classification (abbreviated form)” in Appendix A. These are the rival American and international (WHO) classifications of mental “disorders”.
After an hour or two of grilling me, during which I became increasingly annoyed, though never rude, I showed Professor Singh some diagrams I had drawn after analysing the annual reports of the Mental Health Research Institute (MHRI) and its links with various organizations. I noted, for example, that the President of the MHRI Board of Directors was Dr Ben Lochtenberg, who was also the boss of Orica, previously the Australian branch of the British Imperial Chemical Industries (ICI), which was a major exporter of cyanide, explosives and detonators around the world. I also noted various drug companies and their involvement with the medical profession, mainly in the area of psychiatry, and how various institutions, internationally, were connected with the Australian medical research institutions. I did not explain the connections, I just drew them, to clarify my thinking on the serious matter of medical corruption.
Professor Singh noted these as evidence of what he called a “paranoid personality”, and was much more interested in them than in the “scientifically valid material and thoughtful comments and criticisms” that he mentions but does not specify, elaborate on or refute. He wrote:
“Dr Senewiratne showed me a number of his publications and papers that
he had written. Of particular interest to me were a series of charts
in which he drew lines connecting various organisations that had links
with each other. He was very much of the line of thinking that used
non-Aristotelian logic in the sense that he seemed to be suggesting
that if A had a connection with B and if B had a connection with C,
then A must have a connection with C. He obviously spent a lot of time
looking for these connections and drawing them as the intricate
diagram demonstrated. He claimed however that this line of reasoning
led him to hypotheses for which he then attempted to look for further
information to prove or disprove his theory. I found little evidence
of an open mind.”
I think my mind is, on the contrary, too open. I have been quite gullible at times: I have fallen for scams, been successfully deceived by news reports and other programs on TV, and been conned by drug companies. Gradually I have become more critical and discriminating about what I believe, although I have had lapses. I am guilty of many things, but not a closed mind, and no one but Professor Singh has ever accused me of having one. Others have accused me of having delusions, and I have had them, but the fact that these delusions have disappeared is proof that my mind is not closed. But what is a closed mind, anyway? Minds do not open and close like boxes, black or otherwise.
A closed mind is a metaphor, but it is also a judgement. No one likes to be accused of having a closed mind, and I was annoyed when I read that this is what Professor Singh thought of me. Impertinent, maybe; closed-minded, no. As for Aristotelian and non-Aristotelian logic, I didn’t know the difference; but didn’t Aristotle logically surmise that we think with our hearts rather than our brains, while his non-Aristotelian colleagues reasoned, correctly, that we think with our brains? Didn’t Aristotle also logically surmise that there were only four, or at most five, elements, corresponding to the four humours that Galen and his followers believed in for fifteen hundred years in the West? So what if I use “non-Aristotelian logic”?
Though he is one of the senior editors of Foundations of Clinical Psychiatry, only two of its chapters are actually credited to Professor Singh, both of which have co-authors. The other chapters of the textbook are written by various Melbourne academics from the University of Melbourne and Monash University. The two chapters by Singh are “Making Sense of the Psychiatric Patient” and “Failure to Cope and the Adjustment Disorders”. It is the first of these that has helped me make sense of Professor Singh, because it tells me how different what he teaches is from what he does. And what he teaches is bad enough.
According to the chapter on Making Sense of the Psychiatric Patient, careful attention should be paid to taking a history and recording “case notes”. “The length and emphasis of the personal history will vary widely – the aim is to build up a unique picture of the patient”, advises the professor. The following are “commonly recorded, although the list is not exhaustive”: early development – complications of pregnancy, feeding problems, achievement of ‘milestones’; childhood – hyperactivity, bed wetting, phobias, friendships and play, major childhood illnesses; school – including academic performance, disciplinary trouble, peer relationships and emotional problems; adolescence – adjustment difficulties, sexual behaviour, delinquency, relationships, drug use; occupation – job record and satisfaction, ambition, and military service; sexual history – attitudes, activities, past partners, sexual orientation, problems with impotence or loss of libido; marital history – courtship, relationship with spouse or de facto, current state of marriage, past marriages or divorces; children – names, ages, relationship with patient; habits – alcohol, tobacco, drug use or abuse; forensic history – past offences, convictions and sentences, anti-social behaviour; leisure – interests, hobbies; social network – family, friends, supports.
This is just the “personal history”. There is also the “psychiatric history”, the “medical history” (described as a ‘past medical history’, as if history can be otherwise) and the “family history”, which “gives a wealth of information about who a person is and his or her background”. The family tree, described as a genogram, is “a highly effective way of encapsulating a large body of information at a glance and is therefore an essential part of the write-up”.
What, you might ask, is all this personal information written down for? Well, it’s written as ‘case notes’ so that it can be presented at case conferences and discussed by the ‘treating team’. They discuss the “large amount of information”, come up with a probable diagnosis and a differential diagnosis, as they do in other areas of medicine, and use it to decide which treatment, out of various “biological treatments” and “psychological treatments”, is best suited to the individual needs of the unique patient, having taken a thorough history and conducted a Mental State Examination (MSE). This is the paradigm promoted by Foundations of Clinical Psychiatry as the “biopsychosocial approach”, which is credited to the American psychiatrist George Engel (1913–1999).
Professor Singh skipped most of the history taking, for obvious
reasons. His job was not to treat me or cure me, it was to judge me.
He had to make judgements about my judgements, especially my medical
judgements. That’s why I had been sent to see him. There had been no
adverse reports about my medical conduct or judgements, but I had been
declared mentally ill by his and my colleagues, and the Medical Board
wanted his expert opinion on the matter, as one of Melbourne’s most
senior psychiatrists.
This was his verdict:
“Dr Senewiratne presented as a neat well-dressed man and he was
obviously anxious about seeing me. He presented his story with great
clarity and exactness, he is obviously intelligent and has clear views
of the world and of multiple conspiracies he had identified about all
elements of his particular views. He was convinced that his
perspective was right. In my view he demonstrated the classical
thinking of the paranoid personality in that he is constantly looking
for links between various aspects of his behaviour and of the world
and then drawing those links together to reach conclusions about the
world, usually conspiratorial ones (see attachment of chapter from the
book Neurotic Style by David Shapiro, Basic Books, NY, 1965).”
I had never heard of David Shapiro, but I didn’t like the title of the
book. What the heck is a “neurotic style”? I was sent a copy of
Professor Singh’s 5-page report, but not the attachment he sent to the
Board. The closest to a “paranoid personality” in modern psychiatry is
a “paranoid personality disorder”, the label they have retrospectively
applied to tyrants like Stalin.
My book, which Bruce Singh pathologised after scan-reading bits of it,
was titled The Politics of Schizophrenia. If you read the following
profile from the University of Sydney Medical School you may get an
idea of the politics I was exploring in the years before I was
formally assessed by him:

SINGH, BRUCE S
MB BS 1968 PhD RACP FRANZCP
Bruce Singh became Foundation Chair of Psychological Medicine at
Monash University and the Royal Park and Alfred Hospitals in Melbourne
in 1984.
Born in Sydney in 1946, Bruce studied Medicine at the University of
Sydney. After graduating in 1968, he completed his Residency at the
Royal Prince Alfred Hospital, continuing as a Medical Registrar and
Psychiatric Registrar until 1975. He then became a National Health and
Medical Research Council (NHMRC) Travelling Fellow in the Clinical
Sciences and from 1975 to 1978, was based at the University of
Rochester, New York, the Institute of Psychiatry and Maudsley Hospital
in London, and at the University of Sydney.
In 1978, Bruce was appointed Senior Lecturer in Psychiatry at the
University of Newcastle where he remained until 1984, when he became
Foundation Chair of Psychological Medicine at Monash University and
the Royal Park and Alfred Hospitals in Melbourne. In subsequent years,
Bruce worked as Co-Director of the Master of Psychological Medicine
program at Monash University, also playing a role in the development
of the new medical curriculum at the University of Newcastle.
In 1991, he took up an appointment as the second Cato Professor of
Psychiatry at the University of Melbourne. In this role, Bruce is Head
of the Department of Psychiatry and has considerably expanded the
academic activities of the Department, which extend across three
Clinical Schools and ORYGEN Youth Health, with a strong emphasis on
links with the public psychiatric service system. Bruce has developed
Professorial positions at both the major private psychiatric hospitals
in Melbourne, the Melbourne Clinic and the Albert Road Clinic, and
additional Professorial appointments at Barwon Health and in the
Australian Centre for Posttraumatic Mental Health.
Bruce directs the Master of Medicine (Psychiatry) Program and the
Graduate Diploma of Mental Health Sciences, and is Chair of the Human
Mind and Behaviour stream of the new undergraduate curriculum at the
University of Melbourne. In 1996, he was appointed Associate Dean
(International), a position which he still holds, responsible for
international matters in the Faculties of Medicine, Dentistry and
Health Sciences, including the establishment of the Australian
International Health Institute of the University of Melbourne. As
Assistant Dean from 1997 to 2000, he was responsible for the North
Western Health Care Network.
The main thrust of Bruce’s research activities has been in the area of
schizophrenia, and his major achievement, together with D Copolov,
Director of the Mental Health Research Institute of Victoria, was the
establishment of the NHMRC Schizophrenia Research Unit, which he co-
directed from 1988 to 1996. His other areas of research include early
onset psychosis, psychiatric rehabilitation, psychiatric aspects of
disasters, rehabilitation in physical illness, and care-giving in the
community.
Throughout his career, Bruce has been a Consultant for both the state
and federal governments: He was Consultant to the Commonwealth
Department of Health for Evaluation of New Drugs from 1982 to 1990,
and Chief Policy Adviser for the Office of Psychiatric Services in the
Health Department of Victoria from 1988 to 1992, active in developing
the first National Mental Health Policy. He was Senior Medical Adviser
to the Mental Health Branch of the Department of Human Services in
Victoria from 2001 to 2004. Bruce was a Member of the Grants Committee
of the NHMRC from 1991 to 1996 and has been a Member of and Chairman
of Regional Grants Interviewing Committees on many occasions. He is an
active Member of the Royal Australian and New Zealand College of
Psychiatrists and was a Censor from 1980–88. He was Chairman of the
Fellowships Board and Committee for Examinations and Member of the
College Executive from 1988 to 1994. He served on the Community and
Research Committees of the Victorian Health Promotion Foundation from
its inception in 1990 until 2000.
Bruce was Director of Psychiatric Services at the Royal Melbourne
Hospital until 1995, when he was appointed Clinical Director of the
Royal Melbourne Hospital Psychiatry Clinical Business Unit, which
incorporated the Royal Park Psychiatric Hospital. In 1996, Bruce
became Clinical Director of North Western Mental Health, comprising
Adult, Aged and Adolescent Mental Health Programs, which service a
population of one million people in the northwest of Melbourne.
Bruce has written extensively on Psychiatry, having published more
than 120 papers and co-edited five books: The Foundations of Clinical
Psychiatry Textbook (published 1994 and revised in 2001),
Understanding Troubled Minds (1997), Family Caregiver (1998), and
Mental Health in Australia (2001).”

According to the impressive blurb, Professor Singh has been
researching schizophrenia, with his “major achievement”, together with
David Copolov, being the establishment of the “NHMRC Schizophrenia
Research Unit”, which he co-directed until 1996. I knew he was
involved in this, but I didn’t know the details and asked him about
it, after he
had indicated that the “interview” was over, and he had all the
information he needed. He didn’t tell me what his judgement was; for
that I had to wait for the Board to send me a copy of the report.
He mentioned my query in his report:
“As he left the room Dr Senewiratne enquired if I was still involved
with the NH&MRC. I replied that I was not, this was because I was
aware that the NH&MRC figures repeatedly in his theories of conspiracy
and corruption.”

Since then, Professor Singh has been awarded a Membership of the
Order of Australia (AM).
Publications:
Foundations of clinical psychiatry (Vietnamese language edition).
Melbourne University Press. 2001
Several chapters in Mental health in Australia published by Oxford
University Press. 2001

Citation for the Award of Honorary Doctor of Medical Science
Professor Emeritus Bruce S Singh AM
Professor Emeritus Bruce Singh arrived at the University of Melbourne
in 1991 to take up the Cato Chair of Psychiatry with an already
impressive reputation in the teaching, research and practice of
psychiatric medicine that included seven years as Professor of
Psychological Medicine at Monash University. He was involved in the
implementation of the first medical problem-based learning curriculum
while at the University of Newcastle in the late 1970s and early ‘80s,
which established a synthesis of teaching and learning and clinical
practice that has infused subsequent developments in medical education
across the country.
His expansion of the University of Melbourne Department of Psychiatry,
as Cato Chair, and leadership of the discipline in Australia, have
been signally important to the development of sub-specialty research
programs and improved services for psychiatric patients. These have
included developing three professorial positions at major private
Melbourne psychiatric hospitals and eight additional professorial
appointments to underpin important research in areas such as
post-traumatic health, neuropsychiatry, old age psychiatry and women’s
mental health. With an enviable record of securing grants and funding
for research, Bruce Singh’s research contributions, particularly into
schizophrenia, have earned him an eminent international reputation.
His role, with Professor David Copolov, in the establishment of the
NHMRC Schizophrenia Research Unit, laid the foundation for the
creation of the ORYGEN Research Centre and the Melbourne
Neuropsychiatry Centre.
Wide recognition from governments, professional associations and
community organisations for Bruce Singh’s outstanding contributions to
mental health policy, research and practice includes the Centenary
Medal of Federation and Membership of the Order of Australia, the 25th
Anniversary Medal from the Federation of Ethnic Communities, the Indo
Australasian Psychiatric Association Award and the Victorian Public
Healthcare Award, and international recognition from the Royal College
of Psychiatrists (UK) and the American Psychiatric Association.
Bruce Singh’s governance and policy expertise has long been sought by
governments, pharmaceutical companies and professional associations,
either as a consultant or through committee membership or
chairmanship. He has been a consultant to the Medical Board of
Victoria since 1984 and chaired many NHMRC regional grants interview
committees. The development of mental health policy for Victoria, and
the creation of 30 academic positions in multiple higher education
institutions across Melbourne, owes much to his many years as Chief
Policy Advisor on Mental Health to the Victorian Department of Human
Services and to the Minister for Health.
The co-editor of five books, he is currently working on a history of
psychiatry in Victoria with historian Ann Westmore…He also acts as a
reviewer for a wide range of international journals.”

Bruce Singh’s main collaborator in his research into schizophrenia
was Professor David Copolov, whom I have never met, but who is a big
name in schizophrenia in Melbourne. Copolov directed the Mental Health
Research Institute, originally located at the Royal Park Hospital in
Parkville, where I was first locked up, in 1995. I have never been to
the MHRI, but I obtained copies of the institute’s annual reports from
1997 to 2001, on which I based my analysis of Australian schizophrenia
research in The Politics of Schizophrenia. Copolov was, in addition to
his role as institute director, a ministerial advisor on mental health
and a supposed expert on cannabis, psychosis and schizophrenia. In
1991 the MHRI had published Schizophrenia Research in Australia: A
Present State Review, edited by David Copolov and Bruce Singh. In 1997
the MHRI was organized into several ‘divisions’ in which different
aspects of ‘mental health’ were researched – the Applied Schizophrenia
Division, the Molecular Schizophrenia Division, and the Alzheimer’s
Disease Division. Of course, these are not aspects of mental health.
They are specific mental illnesses – relatively rare ones when it
comes to the mental problems of Victorians; schizophrenia is said to
affect about 1% of the population, and though Alzheimer’s Disease is a
bigger problem, most people do not ever develop dementia (as opposed
to the normal forgetfulness of old age).
The fact that the MHRI focused on schizophrenia and Alzheimer’s
Disease was to do with funding and grants. In his Director’s Report,
Copolov boasted that “performance in competitive grant rounds was
strong, with success rates well above the national average”. Who keeps
tabs on the “national average” in “grant rounds”, other than an
administrator who is keen to attract more funds for his organization?
Copolov also boasts that “publication rates remained very good, with
many papers appearing in pre-eminent journals” and that the “results
of recent research attracted great interest not only in specialist
research circles but also in the media”. Wow! The media?!

Origins
The Mental Health Research Institute (MHRI) was founded in 1956 as
part of the Victorian Hygiene Department. The Institute then
specialised in common mental disorders in the community, with a focus
on risk factors, prevention and appropriate treatment.
The building was located close to the original Royal Park Hospital in
Parkville, one of Victoria’s busiest hospitals.
Change
A working party established in 1983 recommended that a major expansion
and re-organisation take place. This resulted in a new emphasis on
neuroscience and the creation of an independent organization housed in
a modern neuroscientific research centre completed in 1994 at a cost
of $5.5 million.
The Institute was incorporated in 1987, and has since grown rapidly
from an initial staff of three to over one hundred.

The Institute Today
Today, the Mental Health Research Institute (MHRI) is a premier
psychiatric research organisation. The Institute is fully accredited
as a medical research institute by the National Health and Medical
Research Council. It attracts competitive research funds and is
internationally recognised for its achievements.
The Institute is formally affiliated with the University of Melbourne,
Melbourne Health and Monash University.
Using laboratory and clinical research, scientists at MHRI study the
normal and abnormal brain to better understand cognition and
behaviour.

Professor Colin Masters
Position: Executive Director, Mental Health Research Institute, and
Laureate Professor, University of Melbourne

Research Interests:
Alzheimer's disease: Structure, function and processing of the amyloid
precursor protein of Alzheimer's disease. Identifying genetic and
environmental factors relevant to the metabolism of the amyloid
precursor protein and the biogenesis of amyloid. Therapeutic
strategies for Alzheimer's disease based on this knowledge.
Prion diseases: including Creutzfeldt-Jakob disease, Parkinson's
disease, Huntington's disease.

Techniques Used:
Current studies on Alzheimer's disease are now focused on identifying
the pathways through which environmental and genetic factors can
operate to cause this disease. In collaboration with the
pharmaceutical industry and biotechnology enterprises, a
multidisciplinary approach is now directed at identifying lead
compounds which can inhibit the production or aggregation of amyloid
in the Alzheimer's disease brain: a new class of protease inhibitors
has been identified which has very promising activity in vitro and
will be evaluated in human trials. Commercial development of compounds
which are directed at the toxicity of the A-beta amyloid has
commenced. A phase II clinical trial of a metal chelating compound
aimed at mobilizing the amyloid A-beta from the Alzheimer's disease
brain has also been completed following the demonstration of efficacy
of this compound in a transgenic mouse model of Alzheimer's disease.
Knowledge gained from the Alzheimer's disease arena is being
transferred to Creutzfeldt-Jakob disease/prion diseases, Parkinson's
disease, and other related degenerations.

Bibliography (1968-2011)

Relevant Publications:
(Top 10 cited publications, updated June 2010)

Kang J, Lemaire H-G, Unterbeck A, Salbaum MJ, Masters CL, Grzeschik
K-H, Multhaup G, Beyreuther K, Müller-Hill B. The precursor of
Alzheimer's disease amyloid A4 protein resembles a cell surface
receptor. Nature 1987; 325:733-736. (3121 citations)

Masters CL, Simms G, Weinman NA, McDonald BL, Multhaup G, Beyreuther
K. Amyloid plaque core protein in Alzheimer disease and Down syndrome.
Proc Nat Acad Sci USA 1985; 82:4245-4249. (2203 citations)

Weidemann A, König G, Bunke D, Fischer P, Salbaum JM, Masters CL,
Beyreuther K. Identification, biogenesis, and localization of
precursors of Alzheimer's disease A4 amyloid protein. Cell 1989;
57:115-126. (980 citations)

Masters CL, Multhaup G, Simms G, Pottgiesser J, Martins RN, Beyreuther
K. Neuronal origin of a cerebral amyloid: neurofibrillary tangles of
Alzheimer's disease contain the same protein as the amyloid of plaque
cores and blood vessels. EMBO J 1985; 4:2757-2763. (695 citations)

Bush AI, Pettingell WH, Multhaup G, d Paradis M, Vonsattel J-P,
Gusella JF, Beyreuther K, Masters CL, Tanzi RE. Rapid induction of
Alzheimer Aβ amyloid formation by zinc. Science 1994; 265:1464-1467.
(608 citations)

McLean CA, Cherny RA, Fraser FW, Fuller SJ, Smith MJ, Beyreuther K,
Bush AI, Masters CL. Soluble pool of Aβ as a determinant of severity
of neurodegeneration in Alzheimer's disease. Ann Neurol 1999; 46:860-
866. (580 citations)

Cherny RA, Atwood CS, Xilinas ME, Gray DN, Jones WD, McLean CA,
Barnham KJ, Volitakis I, Fraser FW, Kim YS, Huang X, Goldstein LE,
Moir RD, Lim JT, Beyreuther K, Zheng H, Tanzi RE, Masters CL, Bush AI.
Treatment with a copper-zinc chelator markedly and rapidly inhibits
β-amyloid accumulation in Alzheimer's disease transgenic mice. Neuron
2001; 30:665-676 [and see Previews: Gouras GK, Beal MF. Metal chelator
decreases Alzheimer β-amyloid plaques. Neuron 2001; 30:641-642]. (498
citations)

Koo EH, Sisodia SS, Archer DR, Martin LJ, Weidemann A, Beyreuther K,
Fischer P, Masters CL, Price DL. Precursor of amyloid protein in
Alzheimer disease undergoes fast anterograde axonal transport. Proc
Natl Acad Sci USA 1990; 87:1561-1565. (498 citations)

Hilbich C, Kisters-Woike B, Reed J, Masters CL, Beyreuther K.
Aggregation and secondary structure of synthetic amyloid βA4 peptides
of Alzheimer's disease. J Mol Biol 1991; 218:149-163. (466 citations)

Masters CL, Harris JO, Gajdusek DC, Gibbs CJ Jr, Bernoulli C, Asher
DM. Creutzfeldt-Jakob disease: patterns of worldwide occurrence and
the significance of familial and sporadic clustering. Ann Neurol 1979;
5:177-188. (447 citations)

Laureate Professor Colin L. Masters CV

The Melbourne Brain Centre consists of a Centre for Translational
Research at the Royal Melbourne Hospital and two new purpose-built
buildings – one at The University of Melbourne’s Parkville campus and
the other at the Austin Hospital in Heidelberg.

The Mental Health Research Institute has co-located in the new
buildings with researchers from the University of Melbourne and the
Florey Neuroscience Institutes.

Researchers working at the centre will investigate a broad range of
conditions affecting the brain, including multiple sclerosis, stroke,
Alzheimer’s disease, Parkinson’s disease, trauma, depression,
schizophrenia, anxiety, epilepsy and motor neuron disease.

Over 700 scientists will work side-by-side in state-of-the-art
laboratories next to world class clinical facilities. This power-house
of intellectual capacity and research strength will enable the
development of more effective diagnostic tools, treatments and
ultimately cures for brain and mind disorders.

The size and research strength of the group will position the
Melbourne Brain Centre as one of the top five centres for brain
research internationally, alongside the Institutes for Neurology and
Psychiatry (UK), the Janelia Farm at the Howard Hughes Medical
Institute (USA) and the Riken Brain Science Institute (Japan).

The Centre will play a key role in attracting leading scientists to
come to Australia or retaining talented Australian researchers who
might have otherwise moved or stayed overseas.

Funding for the building project was largely provided by the Federal
government, the Victorian State Government, through the Department of
Innovation, Industry & Regional Development and the University of
Melbourne.

David Copolov and MHRI
Copolov wrote the chapter on ‘biological therapies’ in Foundations of
Clinical Psychiatry.

As Pro Vice-Chancellor (Major Campuses and Student Engagement),
Professor Copolov is responsible for providing Senior Management
oversight of:
student experience activities, including issues at the interface
between the University and student associations;
central leadership programs, including the Vice-Chancellor’s Ancora
Imparo Leadership Program and Monash Minds;
Caulfield and Clayton campuses;
the University's Diversity Agenda.
He also provides strategic, policy and operational input in relation
to capital development, mental health policies and a variety of cross-
portfolio issues that arise within both the Office of the Vice
Chancellor and the Office of the Chief Operating Officer.

Professional background
Before joining the University full-time in 2004, he was the Executive
Director of the Mental Health Research Institute of Victoria – a
position he held for 19 years.

He is a Professor of Psychiatry and Honorary Professor of Physiology
at Monash and Professorial Fellow in the Department of Psychiatry at
the University of Melbourne. He is a Director on the Board of the
Royal Women’s Hospital and a Director of the Australian Nuclear
Science and Technology Organisation (ANSTO).

He is a passionate advocate for a reversal of the dehospitalisation
which has taken place in the public mental health sector.

He has held several advisory appointments to federal and state
governments, including 12 years as a member of the Victorian
Ministerial Advisory Committee on Mental Health and eight years as the
psychiatric expert on the Australian Drug Evaluation Committee.

Academic interests
Professor Copolov trained in psychiatry and internal medicine and
received his PhD in the neurobiology of psychiatric disorders.

His research interests lie in schizophrenia, particularly in
understanding why people suffering from the condition hear distressing
voices and how they can be better helped to cope with this symptom.

He has published more than 220 papers. In 2011 he was awarded a Medal
of the Order of Australia for contributions to higher education,
medical research and professional organisations.

A passionate advocate for a reversal of dehospitalisation??

From Yarra Bend to the Orygen Research Program

Conspiracy theories and the Dark Side of Psychiatry

Orygen program –
Orygen Youth Health (OYH) is a world-leading youth mental health
program based in Melbourne, Australia. OYH has two main components: a
specialised youth mental health clinical service; and an integrated
training and communications program.

Orygen Youth Health is part of the public mental health system in
Melbourne, Australia, and sees young people aged 15 to 25, with a
focus on early intervention and youth specific approaches. There is a
close connection with Orygen, the National Centre of Excellence in
Youth Mental Health.

Our innovative clinical program is comprised of three parts: Acute
Services, Continuing Care, and Psychosocial Recovery.
Multidisciplinary teams composed of psychiatrists and mental health
clinicians deliver individually tailored services such as mental
health assessment and care, crisis management, psychotherapy,
medication, family support, inpatient care, group work, and vocational
and educational assistance. Orygen Youth Health Clinical Program
treats around 450 clients per annum.

Our training and communications program provides training and
resources to improve the understanding of mental health issues in
young people and to promote the capacity of services and the general
public in supporting young people. We work with a variety of
organisations including health services, schools, drug and alcohol
services, and community groups within our catchment area.

The work of OYH will be important to you if:
You are a young person aged 15–25 with mental health issues and living
in the western or north western area of Melbourne;
You are a family member or carer for a young person aged 15–25 with
mental health issues and living in the western or north western area
of Melbourne;
You are a service provider working with and supporting young people
with mental health issues;
You want to have access to the latest information and training in
relation to youth mental health.
Norman James and his version of the history of psychiatry.
Conspiracy theories, theories about conspiracies and theories about
possible or probable conspiracies… Theories are not beliefs.
Conspiracy theories, paranoid delusions and paranoia. And their
resolution through talk therapies and bibliotherapy.

Black boxes, black men, black magic and black humour.

In the 1970s I was exposed to the American Black civil rights movement
through Muhammad Ali, the boxing icon who boasted that he could “float
like a butterfly and sting like a bee”. I admired his fancy footwork
and his cocky attitude, though I shared neither, and was not keen on
my father’s suggestion that I take up boxing as a sport. I didn’t want
to be hit repeatedly in the head by bigger boys; being the shortest
boy in the class was bad enough as it was. Muhammad Ali was not short,
like me, but he was “black” like me and all the other boys in my class
in “Trinity”. That’s what we called our school – Trinity. Sri Lankans
in Sri Lanka and around the world (in the so-called ‘diaspora’) know
that Trinity means Trinity College, a school founded by the British in
the medieval city that the British named Kandy. It was in Kandy that I
heard that the boxer my parents had admired as Cassius Clay had
converted to Islam and taken the Muslim name of Muhammad Ali.
My enthusiasm for Muhammad Ali, despite my relative indifference
towards boxing (which was not a sport at Trinity), was because of his
reputation, and because of the hero-worship of the boxing superstar by
my classmates, most of whom were Muslim. I was mystified as to why a
black American man would convert from Christianity to Islam, since I
saw Islam as the religion of my Muslim classmates, who were not any
more black or white than their Christian, Buddhist and Hindu peers. I
was taken with his pointing out, in a snippet I saw in a cinema, that
black is equated with evil and that this is a sign of white
oppression. I now think the argument is ridiculous; it is more likely
that black is associated with evil, in many cultures, because of the
absence of light and the darkness and dangers of the night, as well as
the contrast between darkness and illumination – physical and mental.
The black civil rights movement, which began in the USA for obvious
reasons, did have relevance to Sri Lanka, though this relevance has
not been explored, since the focus has been on issues of race and
religion, tied to Sinhalese and Tamil nationalism. But skin colour
prejudice and discrimination against “black” people is deeply embedded
in Sri Lankan and Indian culture. It is rarely mentioned or discussed,
and may be more of a problem in India than in Sri Lanka, but I was
quite aware of it when I was a child, not so much from my friends, who
were of all shades of brown, from the very fair to the very dark, but
from my family, and especially my maternal and paternal grandmothers.

The name Kandy is an Anglicisation of the Sinhalese word for mountains
– කන්ද (kanda). This word cannot be transliterated exactly into
English letters, but the British pronounced Kandy the same way that
they pronounced the American word for lollies – candy (which includes
what we in Australia call chockies and the British, more properly,
call chocolates). Since colonial days and the British renaming of
their capital city මහ නුවර (Maha Nuwara, pronounced [mahaˈnuʋərə]),
the Kandyans have developed a new identity as Kandyans, though this
identity was constructed on earlier foundations. I didn’t learn
anything about these foundations when I was at Trinity, though the
evidence of them was plain every time I walked past the Dalada
Maligawa – what the British called the Temple of the Tooth. This
medieval temple is the most recent home of the Tooth Relic, which is
believed, by some, to be the actual tooth of the Buddha. I walked past
the temple on the way home every day for many years, but never knew
how old it was, or whether the golden casket I once saw actually
contained the Buddha’s tooth. I still don’t know about the tooth, but
I do know much more about the history of Sri Lanka now – ironically,
much more than when I was actually living there.


Kunde

Twenty years ago, in 1995, I was thirty-four years old. I was working
as a GP in my own suburban general practice in Melbourne, and was the
proud father of a two-year-old daughter, who took up most of my time
when I wasn’t at work. I was also trying to establish a parallel
career in music, having moved to Melbourne from Brisbane in 1988 more
in search of musical opportunities than medical ones. The main reason
I moved to Melbourne was that I had fallen in love with Sue, the
mother of my two-year-old daughter, Ruby. Sue, whom I met when I did
an elective term as a medical student in the remote mining town of Mt
Isa back in 1981, had decided to move to Melbourne to study Japanese
at Monash University. She was already a nurse, which is how I happened
to meet her; she was studying nursing at the Mt Isa Base Hospital
when I, as a fifth-year medical student, was sent to learn about
“medicine” from a Sri Lankan doctor whom I knew as “Kanaks” and who
was known to his patients and colleagues at the Mt Isa Hospital as “Dr
K”, because they were quite unable to pronounce his full surname –
Kanagarajah. Kanaks was an old friend of my father, and had worked
with him in the Kandy Hospital in Sri Lanka in the 1970s, when they
were both medical lecturers at the University of Peradeniya.
I had married Sue in 1991 in a civil ceremony held at my parents’
house in Brisbane, which was interrupted by a violent thunderstorm.
The thunderstorm was welcome, though, since it broke a drought that
was affecting Queensland at the time. Or so I was told, by my mother.
If, having been told that the thunderstorm that came upon our wedding
broke the drought, I concluded that the drought broke because of our
wedding, this would be a delusion. It would not make me mad, but it
would be a crazy belief – irrational, as well as illogical. It would
be regarded as a sign of psychosis by modern psychiatry. I don’t have
such a delusion, and never have. I have, however, had other delusions,
which I have decided to share with you, in the hope that they will be
helpful and enlightening to others. Some will not seem to you like
delusions, because you hold them to be true yourself. One person’s
delusion is another’s ‘deep and ancient wisdom’ or ‘mystical truth’.

“Indra (/ˈɪndrə/), also known as Śakra in the Vedas, is the leader of
the Devas and the lord of Svargaloka or heaven in Hinduism. He is the
deva of rain and thunderstorms.[1] He wields a lightning thunderbolt
known as vajra and rides on a white elephant known as Airavata. Indra
is the most important deity worshiped by the Rigvedic tribes and is
the son of Dyaus and the goddess Savasi.”
The name of Indra (Indara) is also mentioned among the gods of the
Mitanni, a Hurrian-speaking people who ruled northern Syria from ca.
1500–1300 BC.[5]
In Hinduism, Svarga (or Swarga) (Sanskrit: स्वर्ग), also known as
Swarga Loka, is any of the seven loka or planes in Hindu cosmology,
which sequentially are Bhu loka (Prithvi Loka, Earth), Bhuvar loka,
Swarga loka, Mahar loka, Jana loka, Tapa loka, and the highest,
Satyaloka (Brahmaloka).[1] It is a set of heavenly worlds located on
and above Mt. Meru. It is a heaven where the righteous live in a
paradise before their next incarnation. During each pralaya, the great
dissolution, the first three realms, Bhu loka (Earth), Bhuvar loka and
Swarga loka, are destroyed. Below the seven upper realms lie seven
lower realms, of Patala, the underworld and netherworld.[1] B. K.
Chaturvedi (2004)
The entry on Indra in the online Encyclopaedia Britannica is written
by an acknowledged American expert on Indology and the Sanskrit
language, Professor Wendy Doniger, who is a controversial figure in
India. The controversy is due to her books On Hinduism (2014) and The
Hindus: An Alternative History (2009), which were withdrawn from sale
by Penguin amidst complaints that her books insulted Hindus and
Hinduism.
The Motto of the History of Hinduism: “Clearly the two—the animals of
the terrain and the animals of the mind—are intimately connected, and
both are essential to our understanding of Hinduism. If the motto of
Watergate was “Follow the money,” the motto of the history of Hinduism
could well be “Follow the monkey.” Or, more often, “Follow the horse.”
Three animals—horses, dogs, and cows—are particularly charismatic
players in the drama of Hinduism.” (Page 39)
“Indra, in Hindu mythology, the king of the gods. He is one of the
main gods of the Rigveda and is the Indo-European cousin of the German
Wotan, Norse Odin, Greek Zeus, and Roman Jupiter.
In early religious texts, Indra plays a variety of roles. As king, he
leads cattle raids against the dasas, or dasyus, native inhabitants of
the lands over which his people range. He brings rain as god of the
thunderbolt, and he is the great warrior who conquers the anti-gods
(asuras). He also defeats innumerable human and superhuman enemies,
most famously the dragon Vritra, a leader of the dasas and a demon of
drought. Vritra is accused as a dragon of hoarding the waters and the
rains, as a dasa of stealing cows, and as an anti-god of hiding the
Sun. Indra is strengthened for those feats by drinks of the elixir of
immortality, the soma, which priests offer to him in the sacrifice.
Among his allies are the Rudras (or Maruts), who ride the clouds and
direct storms. Indra is sometimes referred to as “the thousand-eyed.”
In later Hinduism, Indra is no longer worshipped but plays the
important mythological roles of god of rain, regent of the heavens,
and guardian of the east. Later texts note the break in the worship
of Indra. In the Mahabharata, Indra fathers the great hero Arjuna and
tries in vain to prevent the god of fire, Agni, from burning a great
forest. In the Puranas, ancient collections of Hindu myths and
legends, Krishna, an avatar of Vishnu, persuades the cowherds of
Gokula (or Vraja, modern Gokul) to stop their worship of Indra.
Enraged, Indra sends down torrents of rain, but Krishna lifts Mount
Govardhana on his fingertip and gives the people shelter under it for
seven days until Indra relents and pays him homage.
In painting and sculpture, Indra is often depicted riding his white
elephant, Airavata. Indra also plays a part in the Jain and Buddhist
mythology of India. When Mahavira, the Jain saviour and reformer, cuts
off his hair to signify his renunciation of the world, Indra, as king
of the gods, receives the hair into his hands. Buddhist mythology
sometimes mocks Indra and sometimes portrays him as a mere figurehead.
Wendy Doniger
Does Wendy Doniger know what she’s talking about? Is she in touch with
reality? Is Indra real? And if not, does that mean that hundreds of
millions of Indians are suffering from delusions? What about the other
Hindu gods? Is all belief in God and gods, angels and demons, devas
and asuras delusional?
I’ll check who Wendy Doniger is.

(I was born at the St Pancras Hospital in London on the 22nd of
September, 1960.)
I’m mad, and I think I’ve got good reason. I’m mad about lots of
things. Sometimes I show that I’m mad, mostly I don’t. Most of the
time I’m not mad, but that doesn’t mean I’m necessarily sane. It is I
who make the judgement, in my own mind, about what is mad and what is
sane, though science does come into it. Or so I hope.
Psychiatrists never call people mad. They prefer the term “mentally
ill”, which they somehow think is less stigmatising than calling
someone mad. It is not, and I detest being regarded as “mentally ill”.
The label of “mental illness” is far more stigmatising than the label
of “madness”; madness has been cool since the British ska band proudly
adopted the name “Madness” in the 1970s.
I listened to Madness when I was a teenager, growing up in Brisbane in
Australia, where the band’s catchy pop hits were frequently on TV.
Madness was not my favourite band of the time, but I was strongly
influenced, in my musical taste, by British ska music, long before I
discovered its Jamaican reggae roots and the music of Bob Marley and
Peter Tosh. For some months my favourite band was UB40, which took its
name from unemployment benefit forms used in Thatcherite Britain, and
I loved The Beat’s anthem “Stand Down Margaret” about the conservative
British Prime Minister. I was on the side of the musicians, though I
didn’t know much about British politics or Margaret Thatcher. I fell
in love with the beat and the groove, back in the day when the beat
was actually generated from vinyl grooves. Exactly how it did so was a
mystery to me, though not to those with more knowledge of science,
physics and technology. These were my cultural influences, much more
than European and American philosophy or literature.

I was surprised to read, on Wikipedia, that Madness are still around:
“Madness are an English ska band from Camden Town, London, that formed
in 1976. One of the most prominent bands of the late 1970s and early
1980s 2 Tone ska revival, they continue to perform with their most
recognised line-up of seven members. Madness achieved most of their
success in the early to mid-1980s. Both Madness and UB40 spent 214
weeks on the UK singles charts over the course of the decade, holding
the record for most weeks spent by a group in the 1980s UK singles
charts. However, Madness achieved this in a shorter time period (1980–
1986). Madness have had 15 singles reach the UK top ten, one UK number
one single ("House of Fun") and two number ones in Ireland, "House of
Fun" and "Wings of a Dove".”

Though I remain officially mad, a few months ago the Mormon
psychiatrist who had been appointed to “treat” me kindly agreed to
stop ordering injections of a drug that had the effect of sterilizing
me, and dulling my creativity. The official label for my madness, as
documented in the case notes and confirmed by the Mental Health Review
Tribunal, is “Psychotic Disorder –NOS”. NOS stands for Not Otherwise
Specified.
I had been aware of the label of Personality Disorder – Not Otherwise
Specified, but not Psychotic Disorder – Not Otherwise Specified, both
sharing the abbreviation PD-NOS. I assumed that the two labels were
similar in that “not otherwise specified” meant that it was not
otherwise specified in the DSM – the Diagnostic and Statistical Manual
of Mental Disorders, published by the American Psychiatric Association
(APA – an acronym it confusingly shares with the American
Psychological Association). Psychotic Disorder – NOS is not elsewhere
specified in the DSM, which does specify all the known and accepted
types of psychosis, such as schizophrenia, mania and drug-induced
psychosis.
According to the PsychoticDisorder.org website:
“Psychotic disorder NOS – Not Otherwise Specified- is a mental illness
which does not fall under any specific mental illness (because it
lacks specific illness traits) and does not have a specific way of
diagnosis. Its diagnosis is based on personal experiences reports or
what other people- around the patient- report as the behaviour of the
ill person. Psychosis is a mental illness which cause people to change
personality, behaviour and have mental malfunctions- especially loss
of sense of reality-. Psychotic disorders, hence, are those that
affect cognitive functions; altering personality and behaviour.”
Psychotic is the adjective from the noun psychosis, which is generally
glossed these days as the mental state of someone who is “out of touch
with reality”. This raises the obvious question as to how reality is
defined, and by whom. Is reality what scientists say it is, or what
psychiatrists say it is? Is reality defined by consensus opinion, or
by what the media tells us, or by religious texts? Is science the best
judge of reality, and the scientific method the best way of
ascertaining it? How well does psychiatry match up to the standards
of scientific inquiry and the scientific method? How well do
psychiatrists match up to the ideal standards of psychiatry? These are
some of the questions I have been pondering in the back of my mind for
many years, and they are crystallising while I write. The crystals are
disjointed, but I’m hoping to bring them together at some point.
With various mental illness labels, some far worse than Psychotic
Disorder-NOS, dangling, like the mythical sword of Damocles, over my
head, I have to be careful about what I write. That is, if I intend to
publish it. In the privacy of my own home, on pieces of paper, I can
write what I want, and I do. But publishing something is different. It
becomes public, and it is intended for people to read what I have
written. I am therefore writing to a reader, or, hopefully, readers.
Yet I am not, as you may have gathered by now, a writer. My training
was as a doctor – which I was told, many years ago, means teacher
rather than anything to do with healing or health. The word doctor was
adopted by Western doctors from their Greek academic roots, but I was
told little about these roots when I studied Western medicine in the
Antipodes.
The University of Queensland did not educate doctors about the roots
of Western medicine, but there are many sources that I could have
turned to in order to critically examine the foundations on which my
education was built. I didn’t do this for many years, because I simply
did not think to. The Age of the Internet has made such investigation
easier for isolated madmen such as myself who do not have the means to
travel to Greece or the great museums and libraries in England,
Europe and the USA that contain the primary sources for the study of
ancient Greek philosophy, medicine and science.
With the miracle of the Internet I have discovered that psychosis is
another medical term that assumed a new meaning when ancient Greek
philosophical roots were synthesised to create ‘New Latin’ medical
terms in the 19th century. The Greek psyche – meaning soul (and
different to phren, or mind) – was combined as a prefix with the
suffix -osis, meaning process. Osis was transferred into Latin from
Greek, and refers to any process, not necessarily a pathological one.
Hence we have, in biology, metamorphosis, symbiosis and osmosis, which
do not signify pathology, but are important factors in the health of
organisms. In psychiatry, a branch of medicine, psychosis is not
interpreted as a “process of the soul”, as the Greek roots translate,
but rather as the state of being “out of touch with reality”, which is
presumed to be a “disease state” caused by a “disease process”. The
problem is that, despite billions of dollars being spent and millions
of lab rats sacrificed, the scientific establishment has not been able
to discover exactly what these “disease states” and “disease
processes” actually are.
Schizophrenia is a “modern Latin” term derived from the Greek roots
skhizein, meaning split, and phren, meaning mind. It was coined in
1908 by a Swiss psychiatrist working at the Burghölzli clinic, the
psychiatric hospital associated with the University of Zurich, by the
name of Eugen Bleuler (1857-1939). Bleuler was an eminent psychiatry
professor and a contemporary of Sigmund Freud, Carl Jung and the
“father of psychiatric classification”, Professor Emil Kraepelin of
the University of Heidelberg.
Wikipedia introduces Kraepelin as rather more than the “father of
psychiatric classification”, quoting the British psychologist Hans
Eysenck (1916 -1997) who was German by birth but was Professor of
Psychology at the Institute of Psychiatry, King's College, London (a
constituent college of the federal University of London), from 1955 to
1983:
“Emil Kraepelin (15 February 1856 – 7 October 1926) was a German
psychiatrist. H.J. Eysenck's Encyclopedia of Psychology identifies him
as the founder of modern scientific psychiatry, as well as of
psychopharmacology and psychiatric genetics. Kraepelin believed the
chief origin of psychiatric disease to be biological and genetic
malfunction. His theories dominated psychiatry at the start of the
twentieth century and, despite the later psychodynamic influence of
Sigmund Freud and his disciples, enjoyed a revival at century's end.”

Bleuler was sympathetic to Freud, whose ideas he generally supported,
though he resigned from the International Psychoanalytic Association
in 1911, writing to Freud that "this 'all or nothing' is in my opinion
necessary for religious communities and useful for political
parties...but for science I consider it harmful".
The Swiss psychiatrist, Eugen Bleuler, coined the term
"schizophrenia" in 1911. He was also the first to describe the
symptoms as "positive" or "negative". Bleuler changed the name to
schizophrenia as it was obvious that Kraepelin's name was misleading:
the illness was not a dementia (it did not always lead to mental
deterioration) and could sometimes occur late as well as early in
life.

The word "schizophrenia" comes from the Greek roots schizo (split) and
phrene (mind) to describe the fragmented thinking of people with the
disorder. His term was not meant to convey the idea of split or
multiple personality, a common misunderstanding by the public at
large. Since Bleuler's time, the definition of schizophrenia has
continued to change, as scientists attempt to more accurately
delineate the different types of mental diseases. Without knowing the
exact causes of these diseases, scientists can only base their
classifications on the observation that some symptoms tend to occur
together.

Both Bleuler and Kraepelin subdivided schizophrenia into categories,
based on prominent symptoms and prognoses. Over the years, those
working in this field have continued to attempt to classify types of
schizophrenia. Five types were delineated in the DSM-III:
disorganized, catatonic, paranoid, residual, and undifferentiated. The
first three categories were originally proposed by Kraepelin.

These classifications, while still employed in DSM-IV, have not been
shown to be helpful in predicting the outcome of the disorder, and the
types are not reliably diagnosed. Many researchers are using other
systems to classify types of the disorder, based on the preponderance
of "positive" vs "negative" symptoms, the progression of the disorder
in terms of type and severity of symptoms over time, and the
co-occurrence of other mental disorders and syndromes. It is hoped
that differentiating types of schizophrenia based on clinical symptoms
will help to determine different etiologies or causes of the disorder.

The evidence that schizophrenia is a biologically-based disease of
the brain has accumulated rapidly during the past two decades.
Recently this evidence has also been supported by dynamic brain
imaging systems that show very precisely the wave of tissue
destruction that takes place in the brain that is suffering from
schizophrenia.
With the rapid advances in the genetics of human disease now taking
place, the future looks bright that greatly more effective therapies –
and eventually cures – will be identified.”
The Mormon psychiatrist is agreeable. He agreed that I had Psychotic
Disorder –NOS and he also agreed that the only way to find out if I
needed Invega injections was to stop them and “observe” me. Invega is
an expensive new drug that is classed as an “atypical” or “new
generation” antipsychotic, with the chemical name of paliperidone.
Despite being a medical doctor I had not heard of paliperidone until I
was told I was going to be injected with it. At the time I was a
prisoner at the PA Hospital, and knew that agreeing to the “depot”
injection was the only way I’d get out of the locked ward. This was in
mid-2013, and since then I had been visited at my home by a middle-
aged gentleman by the name of Nigel Lewin, who agreeably injected me
monthly with Invega Sustenna in my living room, sometimes in the
presence of an equally agreeable Indian gentleman by the name of Sagir
Parkar. Sagir is a psychiatry registrar and Nigel is a registered
nurse. Both are male, and both trained in England, where I was born.
I am even more agreeable than the agreeable Mormon psychiatrist. I
sometimes voice agreement with people when I don’t agree with them
mentally, or smile in apparent agreement. Usually it’s silent
disagreement. The better I know someone, the more likely I am to voice
my disagreement, though I rarely argue. I do enjoy a good debate,
though, and I have had some animated but never heated debates with
Sagir and Nigel. I have never debated with the Mormon psychiatrist,
because debating with him would have been both futile and fatal. I did
reason with him, though, and he was, I’m glad to say, amenable to
reason. Not that I didn’t have Psychotic Disorder – NOS, but that I
didn’t need to be injected for it. Not that he was convinced that I
didn’t need the drug; he just agreed to stop it and “watch me” over
the next year or so.
“Watching me” and “observing me” are turns of phrase used in
psychiatry and medicine with which I am very familiar. They mean
observing me for signs, not of health, but of illness. The
observations are to be made monthly by Sagir and Nigel, both of whom I
like and enjoy a good debate with. Moreover, they don’t think I am mad
just because I disagree with them, even if I laugh at what they say.
This is very unusual for mental health workers, and I have never
experienced it in qualified psychiatrists, though there are
psychiatrists one can disagree with who don’t regard all disagreement
as insanity.
All psychiatrists make judgements about whether patients are deluded
or not, and have no reference point for such judgements other than
their own beliefs. I have learnt this from experience. I was careful,
at first, when debating with Sagir and Nigel, and have become bolder
over the past year. They are not psychiatrists, but they are the ones
who are tasked with making periodic assessments of my “mental state”.
It’s called an MSE – which stands for Mental State Examination. MSEs
are at the heart of modern psychiatry. Sagir is training to do MSEs
and his competence in doing MSEs will be examined if he is to become a
psychiatrist. Becoming a member of the college of psychiatrists
requires him to learn and comply with the doctrines of what an MSE
constitutes, and how it is used according to Australian psychiatry. He
never tells me exactly what he writes when he gets back to the
hospital, but I’ve seen enough MSEs and done enough myself to have a
good idea.
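For those who have never seen one, an MSE is essentially a structured
set of headings under which the observer’s impressions are recorded.
The following is a minimal sketch in Python; the domain names are the
conventional MSE categories, but the code itself is purely my own
illustration, not anything prescribed by the college:

from dataclasses import dataclass

@dataclass
class MentalStateExamination:
    # The conventional domains of the MSE; each is filled with the
    # examiner's free-text impressions.
    appearance_and_behaviour: str
    speech: str
    mood_and_affect: str
    thought_form: str
    thought_content: str        # where 'delusions' are recorded
    perception: str             # where 'hallucinations' are recorded
    cognition: str
    insight_and_judgement: str  # where disagreement is most easily pathologised

def write_up(mse: MentalStateExamination) -> str:
    # Flatten the domains into the kind of paragraph that ends up in
    # the case notes.
    return " ".join(
        f"{name.replace('_', ' ').capitalize()}: {text}."
        for name, text in vars(mse).items()
    )

Everything in such a record is an observer’s judgement rendered as
free text, which is why two observers can produce two quite different
“mental states” for the same patient.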

I called the Mormon psychiatrist a Mormon psychiatrist because I
didn’t know how else to distinguish him from other psychiatrists I
have met. If others were Mormons they didn’t tell me, but then neither
did this one. I found this titbit out by Googling his name, and
reading what was in the public domain about him. He lists the Church
of Latter Day Saints as the only organization he belongs to, which I
have always known as the Mormons. His LinkedIn business entry also
lists Richard Branson, the founder of Virgin, as his only “influence”,
and notes that he has an MBA – a Master’s degree in business
administration – from the University of Melbourne, with his original
medical (and surgical) degree from Ireland. The Mormon psychiatrist
must have a bit of dough, since he also likes “sailing, snow skiing,
cycling, latin dance, heliskiing”. I found this outdoorsy image rather
surprising given his rather portly figure, and found the image of the
conservative businessman-cum-psychiatrist doing “Latin dance”
strangely liberating. The fact that he was publicly declaring that he
was a Mormon made him fair game, as far as I was concerned, not
because I particularly hate Mormons, which I don’t, but because
everyone knows that Mormonism is crazy. It was a curious irony, after
years of improbable irony, that I was being declared psychotic by
someone everyone knows to be psychotic.
When I say everyone, I mean everyone who is not a Mormon. I am not
going to start calling them “the LDS Church” or the LSD Church, when
even Time magazine called them Mormons. I have a vivid mental image of
the Time magazine cover featuring the ostentatious headquarters of
this “church of latter-day saints” that regards notorious American
nineteenth-century conmen as saints. Ah, the miracle of the internet;
here it is:
[Time magazine cover image]
Leo Tolstoy described Mormonism as the “quintessential American
religion”. Mormons regard themselves as Christians, and grew out of
the Protestant Church in the USA, so they should really be regarded as
one more silly Christian sect, but they have their own uniquely
ridiculous beliefs, added to the shared delusions of Christianity. The
1997 Time magazine article, the cover of which was imprinted on my
mind, presents this summary of Mormon theology:
“Mormon theology recognizes the Christian Bible but adds three holy
books of its own. It holds that shortly after his resurrection, Jesus
Christ came to America to teach the indigenous people, who were
actually a tribe of Israel, but that Christian churches in the Old
World fell into apostasy. Then, starting in 1820, God restored his
"latter-day" religion by dispatching the angel Moroni to reveal new
Scriptures to a simple farm boy named Joseph Smith near Palmyra, N.Y.
Although the original tablets, written in what is called Reformed
Egyptian, were taken up again to heaven, Smith, who received visits
from God the father, Jesus, John the Baptist and saints Peter, James
and John, translated and published the Book of Mormon in 1830. He
continued to receive divine Scripture and revelations. One of these
was that Christ will return to reign on earth and have the
headquarters of his kingdom in a Mormon temple in Jackson County, Mo.
(Over time, the church has purchased 14,465 acres of land there.)”

I quite like Sagir and Nigel, and I like the Mormon psychiatrist too.
I like the Mormon psychiatrist because he stopped the injections, and
I realise that his believing in the Book of Mormon (if indeed he does)
did not stop him thinking and acting rationally when it came to
stopping the injections that his colleagues, and not he himself, had
initiated. This made him a good doctor, in my eyes. It also makes me
wonder why I called him a Mormon psychiatrist, rather than a good
psychiatrist, or a Christian psychiatrist. The reason may be that,
after many years of being labelled with this and that, I have started
labelling people myself.
I have had my liberty constrained by many psychiatrists and
psychiatry registrars over the years, and didn’t like it. I didn’t
like them when they ordered that I be given drugs against my will, and
I didn’t like them when they ordered that I be “returned to the ward”
if I went home. I developed a strong dislike, indeed hatred at times,
of psychiatrists and psychiatry registrars who kept me prisoner in
various hospitals in Melbourne and Brisbane. Since I was last
released, a couple of years ago, my anger and hatred have dissipated
considerably, and I can reflect more rationally and reasonably on
psychiatry and the psychiatry profession. Just as I question Richard
Dawkins’s blanket condemnation of religion, I hesitate to make a
blanket condemnation of psychiatry. My experiences as an involuntary
patient over the past twenty years have given me an unusual
perspective on medicine, since I myself trained and worked as a doctor
of medicine until I was prevented from earning a living as a doctor by
the psychiatry profession, which declared me to be mentally ill. The
diseases they declared me to have were the most serious psychotic
illnesses in their textbooks – schizophrenia, schizoaffective disorder
and mania.
Paliperidone (trade name Invega), also known as 9-hydroxyrisperidone,
is a dopamine antagonist and 5-HT2A antagonist of the atypical
antipsychotic class of medications. It was developed by Janssen
Pharmaceutica. Invega is an extended-release formulation of
paliperidone that uses the OROS extended-release system to allow for
once-daily dosing.

Paliperidone palmitate (trade name Invega Sustenna, named Xeplion in
Europe and other countries) is a long-acting injectable formulation of
paliperidone palmitoyl ester indicated for once-monthly injection
after an initial titration period. Paliperidone is used to treat mania
and at lower doses as maintenance for bipolar disorder. It is also
used for schizophrenia and schizoaffective disorder.

I must admit that I was prejudiced against Mormons because of the
brainwashed young men with white shirts and black ties who come
knocking at the door with peculiar ideas from a new American religion,
but I dared not discuss religion with this psychiatrist. Neither did
he attempt to discuss religion with me, though I would have been quite
happy to do so. I enjoy discussing religion. I rarely do, though,
because most of the religious people I know do not like their religion
challenged.
Psychiatry is also a religion, with a hierarchy of increasingly high
priests and several holy books. It is also a cult, with icons and
rituals, and a mentality of “insiders” and “outsiders”. Within
psychiatry there are several rival sub-cults, such as Jungians,
Freudians, and others. The dominant cult in Australia is that of
“biopsychosocial psychiatry”. This is controlled by one, and only one,
organization – the Royal Australian and New Zealand College of
Psychiatrists.
The RANZCP even has a “coat of arms”, which was adopted from Britain
in 1969. The symbolism is interesting:
[Image: the RANZCP coat of arms]
The first thing that struck me was the metal helmet of a knight and
the battle shield, reminding me of the Christian Crusades to conquer
the Holy Land. The cross on the shield reminded me of how, back in
the 4th century, the Emperor Constantine was said to have been
converted to Christianity when he won a battle after painting crosses
on the shields. I didn’t notice the obviously Christian cross till I
looked more carefully, despite the fact that it is at the top of the
shield. It is the knight who caught my eye, but the overall symbolism
is clearly Roman Catholic, rather than overtly Masonic or Royalist.
This is because it is modern symbolism, adopted by a college trying to
shed its Royalist and Masonic roots despite still calling itself a
“Royal College”. Instead, it seems, they have embraced the mysticism
of Carl
Jung and even evoke the Aboriginal “Dreamtime” for hip modern
audiences.
The RANZCP explains the symbolism of its “coat of arms” as follows,
with no mention of the Crusading knight’s armour:
“The crossed bands and central square refers to an intellect that has
become disordered and has turned its strength against itself and the
body – symbolised by the outer body circle, the disorganised inner
components of the mind and the enclosed central spirit.
Above is the symbol for the chemical compound alum, to which has been
attributed special mental healing powers. The symbol incorporates a
Roman Cross. Its supraposition is designed to suggest that it is
exerting healing influence over the disordered intellect.
The snakes entwined about the staffs, which could be caducei, are
taken from a coin illustrated by Jung. The snakes can also be related
to Ungud, the serpent of the Aboriginal dreamtime.”
The Latin motto ex veritate salus translates, according to the RANZCP
website, as “out of truth (understanding) comes health (or well
being)”. Does this mean that truth is the same thing as understanding,
or that veritate means both truth and understanding?
A more usual translation of understanding is perceptio. According to
WordHippo there are many other Latin words for understanding:
adprehensio, animus, apprehensio, cerebrum, comprehensio, comprensio,
conprehensio, conprensio, humanus, intellectus, intellegens,
intellegentia, intelligens, intelligentia, mens, pectus, peritus,
prudens, rationabilis, rationalis, sapiens, sciens, sensus…but not
veritate, which seems to mean truth. Am I just splitting hairs here?
Don’t both truth and understanding bring health? Let’s consider for
the moment that they do. How successful is the Royal Australian and
New Zealand College of Psychiatrists at promoting the interests of
truth and understanding, and how much is it promoting health and
wellbeing in Australia and New Zealand?

The RANZCP controls all psychiatry in Australia through the system of
specialist registration. All psychiatrists must be members of the
College to call themselves psychiatrists and to claim higher rebates
from the government for talking to people than GPs do, while
psychologists and other people who engage in talk therapies, however
successfully, cannot claim rebates at all. This financial incentive is
at the heart of the
medical system. The fact is most doctors would not go to work if they
were not paid to do so. Neither would most nurses, psychologists or
social workers. It is an essentially reluctant workforce that looks
forward to weekends and holidays, as well as the end of the shift.
That does not mean that health workers do not want to and try to heal,
when they are at work. Doctors, nurses and psychologists do try to
heal while they are at work, and in many instances are successful in
doing so.
The College’s biography of its President Elect, Dr Kym Jenkins, reads
as follows:
Dr Kym Jenkins graduated from the University of Manchester (UK) in
1980 with a Bachelor of Medicine and Bachelor of Surgery (MBChB), and
initially specialised in general practice. After moving from the
United Kingdom to Australia in 1986 she worked in general practice
before commencing psychiatry training and completed her Fellowship
with the RANZCP in 1998.

In addition to gaining a Masters of Psychological Medicine, Dr Jenkins
pursued further studies in medical education, gaining a Masters of
Education degree in 2008. Dr Jenkins has held a range of roles as a
consultant psychiatrist in both the public and private sectors. She is
currently Medical Director/Senior Clinician of the Victorian Doctors'
Health Program, runs a small private practice and is an adjunct Senior
Lecturer at Monash University.

Dr Jenkins has had extensive involvement in psychiatry-related medical
education, both within the RANZCP and externally. She has had roles
within the College as an accredited examiner, trainer and assessor and
has served as the Chair of the Committee for Examinations, Chair of
the Fellowship Attainment Committee and Deputy Chair of the Board of
Education. She has also been involved in the planning and development
of the new (competency-based) Fellowship program since its inception
and has chaired several of its working parties. She was a member of
General Council from 2005 to 2010.

Dr Jenkins was elected to the College’s inaugural Board in 2013 and
became President Elect in 2013. She will be President of the College
from 2017 to 2019. She chairs the Membership Engagement Committee.

Dr Jenkins can be contacted via kym.jenkins@ranzcp.org.

I gathered that this psychiatrist is a Mormon from his LinkedIn
business posting, where he volunteers the information that he is a
member of the Church of Jesus Christ of Latter Day Saints, the
American church known as the Mormons. He also mentions that he is
interested in skiing, sailing and Latin dancing, in addition to his
psychiatry qualifications. I did not raise any of this, including his
religious beliefs, when I next saw him. It was not my place, and had I
done so he may well have regarded it as a sign that I was not mentally
well. If I had attempted to debate the tenets of the Mormon religion,
and the authenticity of his beliefs, he most certainly would have. It
would
not have helped that I did not know then that some Mormons regard
“Mormon” as a pejorative term and prefer to be called the “LDS church”
(not to be confused with the LSD church which is more generally
associated with psychiatry). Patients are not expected to debate
religion with psychiatrists. They are not even allowed to ask
“personal questions” though they have plenty of personal questions
asked of them. This is in the nature of what is called in medicine
“the doctor-patient relationship”.
The doctor-patient relationship has changed over the years, which is a
good thing. Patients have become better informed about diseases and
treatment alternatives as a result of the Internet, and are more
likely to question the doctor’s judgement. They don’t expect to be
ordered rather than advised. One seeks medical advice, not doctor’s
orders, these days. There remains, though, a power imbalance between
doctors and patients, especially in psychiatry, which, alone of
medical specialties, can order “involuntary treatment” under lock and
key. These days they use magnetic locks and keycards, but a prison is
a prison, with or without bars. In many parts of the world mental
wards and cells do indeed still have bars and old-fashioned locks and
keys. I can only imagine the horrors that must occur there, as the
rest of the world adopts the treatment methods and models of the USA
and the West. Hospitals and wards for the confinement and
treatment of those deemed (variously) as mentally ill exist all over
the world, but I have only seen the insides of public psychiatric
wards in Australia, where I was trained and later confined. What I saw
and experienced there shocked me and drove me from being a mild to a
strong critic of what passes for psychiatry in Australia.
Australian psychiatry has a dark history, which is largely untold.
Snippets can be gathered from the writings of the psychiatrist Eric
Cunningham Dax, who oversaw the reforms of the 1950s, when the old
asylums were modernized, and many of the long-term inmates were
“released into the community”, where they were “cared for” by
“community treatment teams”. This system has expanded enormously from
the public hospitals where it originated, to the present nation-wide
Mental Health Strategy, which has bipartisan political support at
state and federal levels. The psychiatric system is unchallenged in
Australia, and the many victims of abuse by the system are mostly too
afraid of, or too broken by, a system that demands compliance and has
powerful means of persuasion, to speak out about it. Many are deeply
convinced that they have whatever label they have been told they have
by the doctor, and repeat, like parrots, that they have a “chemical
imbalance”. If you ask them which chemicals are out of balance, they
will at best offer the names of serotonin or dopamine, depending on
whether they’ve been convinced they have depression or schizophrenia.
It so happens that the serotonin depletion theory of depression was a
convenient myth to sell Prozac and the other SSRI drugs, while the
dopamine excess theory of ‘schizophrenia’ has been used, since the
1960s, to sell the old antipsychotic drugs, like Thorazine and Haldol,
which are known to block dopamine receptors in the brain.
Thorazine and Largactil (as it is known in Australia) are product
names for a drug called chlorpromazine, which was discovered in France
and later exported to the world as the “first drug that worked” for
schizophrenia and for the drug treatment of psychosis. This is not to
say
that it was the first drug that was tried, and said to work. The
previous century was full of such drugs, all of them now regarded as
toxic or ineffective. What gave Thorazine an undeserved reputation as
a magic bullet? The answer, I suspect, is complex. The spectacular
growth of the pharmaceutical industry (and the allied chemical warfare
industry) during the Second World War was one factor. The
other was the globalization of American and British psychiatry, devoid
of psychoanalytical mumbo jumbo, and firmly focused on what was
euphemistically called “biological psychiatry”.
Biology – from the Greek bios, meaning life – means the study of life, and
biological psychiatry sounds very nice and scientific. Biological
psychiatry means something quite specific, and refers primarily to the
treatment methods, which are called ‘biological treatments’. These are
euphemisms for drugs and shock treatment. This came about through the
history of biology and a scientific focus on chemicals and electricity
as being at the heart of biochemistry and physiology. Biological
psychiatrists assumed that by manipulating brain electricity and
chemistry they could successfully treat minds. After all that’s what
psychiatry is – the medical discipline of treating minds. The Greek
iatros means treatment, and is quite distinct from logos – a word
usually translated as study of or knowledge of. This is the core
difference between psychology and psychiatry. One would hope and
expect that psychiatrists would have knowledge of the subject before
attempting to treat it. When it comes to the mind – psyche – there is
even the core problem of what we are talking about. Does psyche mean
soul, as the original Greek meaning intended, or does it mean mind? Is
the soul the same thing as the mind? What is the difference between
soul and a soul? Biological psychiatry doesn’t concern itself with
such philosophical wonderings, and gets on with the social problem of
diagnosing and treating, with a choice of drugs or shocks.
I have had 240-volt shocks a few times, but never to my brain, and
always by accident. One time I was deliberately shocked with 40,000
volts. The pain was excruciating and for
the only time in my life I literally saw multi-coloured stars. It was
like when one hits one’s head, and sees stars, but these were multi-
coloured and continued for as long as the shock was applied. I thought
I was being shot with a gun, and that this was the experience of
death. I had not heard of Tasers and the fact that they had been added
to the armoury of Australian police. This amounts to my personal
experience of being shocked, other than from the gentle shocks of
static electricity, and the childhood experiment of tasting the
sourness when you pass electricity through your tongue by licking the
electrodes of batteries. I have never ordered that someone else be
given electric shocks, though when I was younger I believed in the
efficacy and safety of what we called ECT and the Americans call
“electroshock”. Though I believed “it worked” for people with severe
depression (though not for other reasons, and certainly not in
children) I based my reasoning on less than scientific standards. The
fact that no-one even claims to know how ECT works should cause
doctors to hesitate in its use, given the fact that we know it
adversely affects memory, causes headaches, and frightens many people
who are threatened with it. The use of general anaesthetics and muscle
relaxants does prevent bones from being broken by violent limb
contractions, as in the bad old days, but memory loss is usual. Though
the memory usually returns, it can be permanently damaged, especially
when ECT is given in courses, as it usually is.
There are some people who swear that ECT has helped them, and there
are many who are happy with the drugs they are taking, and the
diagnostic label they have been given. My main concern is for people
who are being electrocuted, for whatever reason, against their will,
and those who are being drugged against their will. I am also
concerned about people being duped into ingesting various substances
(including drugs) on the basis of pseudoscience: pseudoscientific
arguments built on pseudoscientific “evidence” of imaginary or
manufactured “diseases” and “disorders”.
Imaginary diseases and manufactured diseases
Manufactured diseases are real diseases which are created by humans,
of which there are fortunately few. I have spent many years
researching the theory that HIV is one such disease but am now
doubtful about it, for reasons I will explain later. Imaginary
diseases develop whenever someone imagines that they have a disease or
illness. Sometimes they are correct in their imagination, often they
are not. Can imagining that one has an illness, cause one to become
ill? Does thinking you have cancer or are likely to get cancer,
increase your risk of getting cancer – or heart disease? I have
pondered this matter for many years, looking for evidence to confirm
or refute the assumptions of the New Age movement that the mind could
heal the body and that the answer lay in meditation and meditative
practices. This belief was absorbed into the New Age religion from
Tibetan Buddhism in particular, converging with concepts from other
Eastern traditions such as Hinduism and Taoism. I was immersed in the
New Age movement when I first pondered the mind-body relationship and
was open to the possibility, indeed I believed, that the mind could
heal the body. Now I am less sure, much less sure.
There are types of manufactured diseases that are frequently in the
news – chemical and biological weapons, though their use is more often
a threat than a reality (assuming that AIDS is not such a reality).
Manufactured disease also includes disease that is caused by “drugs
and alcohol”, which are themselves manufactured products, though
“alcoholism” is increasingly blamed on genetics. This is where
politics and economics come into the matter, with the influential
alcohol lobby predictably looking after its vested interests. Then
there’s the tobacco lobby, which used to pay for ads on TV with
“scientists” proving that cigarettes don’t cause lung cancer, until
the evidence made such a claim ridiculous, even to the gullible
public. The public was then much more gullible about the truth of what
they saw on TV than they are now.
Another group of manufactured diseases, acknowledged by the medical
profession as a problem but rarely in the news, is iatrogenic disease.

Iatrogenic Disease:

I had always thought that the Greek iatros meant treatment, but I
find, from Wikipedia, that it actually means “brought forth from the
healer”. That changes everything. It raises the difference between
treating and healing. Nevertheless, iatrogenesis is not the same as
iatrogenic disease, since what is generated by the healer may be
positive, negative or neutral. For healer’s to legitimately claim the
mantle of healer, they need to have a positive therapeutic effect,
rather than a negative one, or neutral one.
However, Wikipedia does not differentiate between iatrogenesis and
iatrogenic disease, providing as examples of iatrogenesis:
Risk associated with medical interventions
Adverse effects of prescription drugs
Over-use of drugs (causing, for example, antibiotic resistance in
bacteria)
Prescription drug interaction
Medical error
Wrong prescription, perhaps due to illegible handwriting or typos on
the computer
Negligence
Nosocomial (hospital acquired) infections
Faulty procedures, techniques, information, methods, or equipment.
The Encyclopedia of Death and Dying (which I have never consulted
before) informs us that
“A 2000 presidential report described iatrogenic error and illness as
"a national problem of epidemic proportions," causing tens of
thousands of annual deaths. The report estimated the cost of lost
income, disability, and health care costs to be $29 billion a year.
The report concluded that half of adverse medical events were
preventable.
“Issued by the most respected agency of American medicine, To Err is
Human generated considerable attention and surprise by concluding that
up to 98,000 Americans are killed annually by medical errors. This
number slightly exceeds the combined total of those killed in one year
by motor vehicle accidents (43,458), breast cancer (42,297), and AIDS
(acquired immunodeficiency syndrome, 16,516).”
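As an arithmetic aside (my addition, not the encyclopedia’s): the
three comparison figures quoted above sum to 102,271, which is more
than 98,000, so the medical-error estimate actually exceeds each of
those causes individually rather than their combined total. A quick
check, as a minimal Python sketch:

# Sum the three causes of death quoted above and compare the
# total with the estimated 98,000 annual deaths from medical error.
deaths = {
    "motor vehicle accidents": 43_458,
    "breast cancer": 42_297,
    "AIDS": 16_516,
}
combined = sum(deaths.values())
print(combined)           # 102271
print(98_000 > combined)  # False: 98,000 exceeds each cause
                          # individually, not the combined total

Either way, the point stands: iatrogenic death sits alongside the
biggest named killers.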

Imaginary disorders and manufactured disorders


Iatrogenesis
Confirmation bias

Commands and suggestions


“Don’t imagine a pink elephant!” When someone pointed out the paradox
that one can’t help but think of a pink elephant when told not to do
so, it got me thinking again about commands and suggestions. Even if
commanded not to imagine a pink elephant, the command contains a
suggestion that results in the opposite of the actual words of the
command. A pink elephant pops automatically into one’s mind. The same
is the case, though not as dramatically, for “don’t think of an
elephant!” You can’t help but think of an elephant, if you know what
an elephant is.
What about the questions that the World Health Organization recommends
for the assessment of “suicidality”:
“Have you felt that life wasn’t worth living? Have you thought about
harming or killing yourself? Have you felt tired of living or as
though you would be better off dead? Have you ever felt like ending it
all?”
If you hadn’t, you have now. The question puts the idea into your
head; it implants the suggestion. Likewise all the other questions that
doctors and other health workers ask their clients, patients, and
prisoners. Modern doctors don’t give orders much anymore, though the
old phrase “doctor’s orders” betrays a more authoritarian history.
Doctors used to give orders or commands, and the patient was expected
to obey – to comply. These orders were not always rude, though some
doctors are indeed rude, to their patients, as well as to their
colleagues, but rudeness does not win hearts, and the demands of
private practice are that one is polite to one’s customers, however
much one might dislike them. Medicine is a service profession.
Giving orders is also against the nature of most doctors, as of most
people – they mostly have to be trained to give orders. This training
occurs in their university and hospital years, before they venture
into private medicine.
Order and orders
Suggestions and advice
Persuasion
Enthusiasm and interest
Preconceptions – positive and negative
Time, timing and using time efficiently
Physical and mental evaluation and examination
Diagnosis, prognosis and self-fulfilling prophecies
Treatment, and iatrogenic illness
Is health merely the absence of disease or illness?
Disorder and disorders
Order and chaos; disorder and chaos
Random this and random that
Quantum this and quantum that
Mechanism and mechanistic
Materialism and materialistic
Spiritual, spiritualism
Spiritual health and mental health
Does a healthy body promote a healthy mind?
Does the mind rule the body or the body rule the mind?
Dualism and its alternatives
Religion and religiosity
Fact versus fiction
Truth versus untruth versus lies (deliberate and unintentional)
Neural substrates and neural correlates
How scientific is medicine?
How scientific is Science?
What does scientific mean in the first place?
Rival disease models and their problems
Rival treatment models and their relative merits

Most doctors and health workers do not have prisoners, but the most
dangerous ones do. They rarely order these prisoners about; the
ordering is left to other members of staff, who tell the prisoners
what is going to be done rather than ask what the prisoner would like.

Doctor’s Orders
“Can anyone read your mind?”
“Do you have a special relationship with God?”
“Is anything like electricity, X-rays or radio waves affecting you?”
“Are thoughts put into your head that are not your own?”
“Have you felt that you were under the control of another person or
force?”
BPRS (WHO)
I became consciously aware of the power of suggestions when I was
employed to teach anatomy and physiology to hypnotherapy students in
Melbourne. This was in 1998 and 1999 and the Millennium was imminent.
New Age ideas were common, including the idea that the mind could heal
the body through various means. The Clinical Hypnotherapy Centre was
run by a lady by the name of Kathleen Frith, who had studied hypnosis
under a man called Jim Golding. Golding died in 2001, but I did once
see him in action at Monash University, when he tried to hypnotise a
young audience into believing that the pineal organ in the brain was a
crystal that contained the soul, which miraculously disappears when
you die (resulting in a corresponding weight loss), and the equally
implausible story that a US Navy ship had time-travelled during the
Second World War (the legend of the Philadelphia Experiment). Some of
Golding’s
audience seemed to swallow the story – they were hypnotised, judging
by the lack of laughter and the expressions on their faces. He didn’t
like it when I challenged his claims about the pineal after the
lecture (which I had expected to be a lecture about hypnosis rather
than an exercise in casting a hypnotic spell).
Kathleen once tried convincing me that one could change a person’s
gender through hypnosis.

When I accepted the position as a visiting lecturer I realised that my
students, who numbered about 15, came from many walks of life and had
different levels of knowledge about science and biology. They were all
interested in hypnotherapy, and I was too. I was mindful of the
difference between hypnosis and hypnotherapy. I was wary of the former
but open-minded about the latter. I entertained the serious
possibility that hypnosis could be used to promote health, and
wondered what the theoretical basis of hypnosis was. Did it work, and
if so, how did it work?
Hypno- comes from the Greek hypnos, meaning sleep, and in classical hypnosis the patient is encouraged
to enter a deep state of relaxation, and suggestions are made by the
hypnotist or therapist. Many techniques have been developed over the
years, and hypnosis was widely used by the medical profession in the
19th century. In recent years it has become something of a fringe
science, with a few psychiatrists practising hypnosis for such things
as pain relief, but the science of clinical hypnosis has not
progressed much in recent times.
I was never interested enough to learn how to hypnotise people myself,
but I absorbed some ideas from my brief exposure to supposed
hypnotherapy. One was that suggestions are important, and that when people
are relaxed they accept suggestions more readily. Another was that
words, phrases, ideas and suggestions can affect our health by acting
on the subconscious. I had been interested in the subconscious mind
since I developed my theory of motivation in 1995. In this theory, I
hypothesised that the subconscious mind was generated by distinct but
interconnected areas of the brain, including the basal ganglia, limbic
system and midbrain. The subconscious and unconscious parts of the
mind were more parts of the puzzle of consciousness that had me
stumped.
I have never liked Sigmund Freud and have no interest in his
writings. Unfortunately Freud has fundamentally shaped what people
understand as “psychoanalysis” – which to me means analysis of the
psyche – the mind or soul. Psychoanalysis is vital for self-knowledge,
Freudian psychoanalysis is disastrous. I am not enthusiastic, unlike
many opponents of Freud, about Jungian analysis either, with its
mysticism and mythical archetypes. Analysis of the psyche can be done
scientifically and rationally. This includes analysis of the
subconscious and unconscious aspects of the mind, as well as the
conscious ones. In addition, it is possible to consider scientifically
and rationally the concept of “collective unconscious” that Jung
coined, in addition to collective consciousness and collective
intelligence, which he didn’t include in his model.
Though I have not read much of Freud, I am familiar enough with his
well-known model of id, ego and superego to refrain from using any of
these terms other than ego. The Australian band Skyhooks sang in the
1970s that “ego is not a dirty word”. Some regard ego as the sense
of self, others as self-esteem. They used to talk about “ego
boundaries being blurred in schizophrenia” and other such mystical
mumbo jumbo based on Freudian theories about the ego, superego and id.
Freud did not differentiate the subconscious, and like Jung, he
divided the mind between conscious and unconscious aspects. Somehow
the subconscious popped into the picture, and assumed centrality in
modern theories of how hypnosis and suggestions work.
I took from Freud the basic idea of the pleasure principle, and the
irrefutable fact that we have sexual instincts (which was obvious to
Darwin many years before, as were many other instincts, including
curiosity and play, as I was to discover later). I think it is
logically obvious that humans and other animals generally seek
pleasure and avoid pain, and that this is an important factor in
motivation. Freud’s ideas about infantile sexuality, anal, oral and
genital stages of development (and supposed ‘fixations’ at various
stages of mental development), the Oedipus and Electra complexes,
penis envy and the like, as well as his model for dream analysis in an
effort to “analyse” the mind are obvious reasons for Freudian analysis
to have fallen out of favour even in the USA. It never caught on in
Australia; Freud was regarded as a discredited crackpot even when I
studied medicine in the 1980s. He was mentioned only to show how far
psychiatry had advanced since then, with the development of what was
euphemistically called “biopsychosocial psychiatry”.
Words and phrases carry images, which obviously have a profound effect
on our health. The puzzle I have been pondering is what these effects
are, and why they occur in terms of neuroscience. This is a part of a
larger puzzle – how do what we see and hear affect us? And how can
directing our attention and concentration through our eyes and ears
promote our physical and mental health?
The puzzle has many missing pieces. Some are missing from my own
personal knowledge, some are missing from human knowledge.
Is thinking of a pink elephant good for one’s health? What about
thinking about a fluttering, green butterfly, or majestic crashing
waterfall? Does what one imagine affect one’s health or is that just
mystical New Age nonsense? Imagining something is very different to
the real thing, which is different again to realistic representations
of the real thing (such as photos and video). Is it healing to look at
beautiful, relaxing panoramas while listening to Mozart or Brahms? Or
does that depend entirely on whether you like classical music? Is
heavy metal more healing than an Indian raga to a heavy metal fan? Is
the healing power of music intrinsic to the music and the physical
properties of sound waves, or is it entirely subjective, depending on
taste in music? What determines taste in music? Why do some rhythms
make you feel like dancing and others not? Why is it that parrots can
dance in time to music but chimps can’t? Why is it that parrots can
imitate a human singing so much better than our much closer primate
relatives?
It is obvious that what we see and hear affects our emotions. It is
also obvious that our emotions have an effect on our physiology and
metabolism, which include anabolic and catabolic processes in cells
throughout the body, including in the brain itself. It is these
physiological and metabolic processes that I focused on first when
trying to understand mind-body healing mechanisms based on what we see
and hear.
I began my attempts at the puzzle in the brain, which has never seemed
like a “black box” to me. Instead I think of the brain as a soft grey-
pink organ with a delightfully organic shape. I think of the brain as
something quite distinct from the mind, though of course they are
closely connected, and interact with each other. My assumption has
always been that the mind, and consciousness are produced by the
brain, but that the mind, rather than the brain, controls behaviour.
All voluntary actions require the contraction of muscles and the
connections between the motor cortex and voluntary muscles, down to
microscopic and molecular detail, are well understood. What is less
well understood is the role of structures deeper in the brain than the
cortex, which can be easily studied through electrical stimulation of
the surface of the brain. These include the limbic system, basal
ganglia, hypothalamus and thalamus, which play key roles in
perception, attention, concentration, and emotion as well as memory
and impulses or urges to movement.
Connectionism can only be taken so far. Making connections between
different areas of the brain with known localised functions is a good
place to start, in understanding how the brain works. A lot is known
about the early neural processing of what we see and hear, but the
complex process of emotional reactions is less well understood. The
reasons for this lie in the methods used by neuroscientists to
determine the function of parts of the brain by studying animals.
Monkeys have
similar brains to our own in some respects, even rats do, but rats and
even monkeys do not have anything like the complexity of our minds or
brains. Most importantly they do not have symbolic language. Symbolic
language is intrinsic to human thought in sickness and health. Without
symbols we cannot understand words or concepts – we cannot be rational
or irrational, we cannot think in language. But that does not mean
that elephants are not rational. They are; very much so. So was
Aristotle, who said that only humans and not animals have a rational
soul, wrong?
Wikipedia to the rescue:
“Elephants are among the world's most intelligent species. With a mass
of just over 5 kg (11 lb), elephant brains have more mass than those
of any other land animal, and although the largest whales have body
masses twenty-fold those of a typical elephant, whale brains are
barely twice the mass of an elephant's brain. The relationship between
brain size and intelligence (if there is any such relationship at all)
remains unclear. In addition, elephants have a total of 257 billion
neurons. [1] The elephant's brain is similar to that of humans in
terms of structure and complexity—such as the elephant's cortex having
as many neurons as a human brain,[2] suggesting convergent evolution.
[3]
Elephants express a wide variety of behaviors, including those
associated with grief, learning, allomothering, mimicry, play,
altruism, use of tools, compassion, cooperation,[4][5] self-awareness,
memory, and language.[6] Further, evidence suggests elephants may
understand pointing: the ability to nonverbally communicate an object
by extending a finger, or equivalent.[7] All indicate that elephants
are highly intelligent; it is thought they are equal with cetaceans[8]
[9][10][11] and primates[9][12][13] in this regard. Due to the high
intelligence and strong family ties of elephants, some researchers
argue it is morally wrong for humans to cull them.[14] The Ancient
Greek philosopher, Aristotle, once said that elephants were "the
animal which surpasses all others in wit and mind."[15]”
American science has come a long way since Louis “Jolly” West injected
the elephant Tusko with a massive dose of LSD in the vain hope of
causing a psychotic rage reaction, killing the poor animal instead.
This was in 1962, and caused some controversy, mainly about the
possible dangers of LSD, but also about cruelty to animals. Since the
1960s and the development of the related fields of ethology and
ecology, there has been wonderful cross-disciplinary research into
animal behaviour in the wild, which has told us much more about humans
and other animals than torturing and mutilating animals and examining
bits of them under the microscope or observing them in cages for
“behavioural abnormalities”.
Despite the progress in understanding animal behaviour in the wild
through careful observation and analysis, there is a parallel industry
of animal research on wild and domesticated animals. This includes
testing of
various drugs and vaccines, for which small, cheap animals, like rats
are usually used. Thankfully international conventions have restricted
the more cruel forms of animal experimentation, based on the
reasonable assumption that animals that have a complex nervous system
also feel pain and can suffer. This means that anaesthetics are
mandated for vivisection of some but not other animals. At the same
time, the celebrated career of Professor Eric Kandel and his
schizophrenic mice make me wonder if the neuroscience establishment
has any idea about the difference between a mouse and a man, and
whether they are taking analogies between mice and men a bit far.
Eric Kandel won the Nobel Prize in 2000 for his work, over many
decades, on neuronal transmission and chemical synapses. His early
work involved studying the effects of the neurotransmitter serotonin
on individual neurones and small groups of neurons in a type of
mollusc (Aplysia) that has large neurons which can be individually
measured. He moved on to research on mice and rats, hoping to
understand memory and learning by focusing on the hippocampus, a
structure deep in the temporal lobes that is involved in certain
aspects of memory in rats and men. The easiest way of studying a
structure in the brain is to remove it or ablate it by deep
cauterization. This method was used for many decades, through the
thirties, forties and fifties, and extending into the sixties, with
the rise of behaviourism in the USA and the global enterprise that
followed its example.
I’m not sure about the details regarding Kandel’s experiments on the
hippocampus of mice and rats, but I have watched the gentleman present
an unintentionally amusing lecture at the University of Basel in
Switzerland, which awarded him another honorary doctorate. The lecture
was in 2012 and titled “Mice, Men and Mental Illness: A Transgenic
Mouse Model of the Cognitive and Negative Symptoms of Schizophrenia”.
This lecture did not amuse his audience, and was not intended to. Most
of the men and women who listened to the respected authority on
schizophrenia, dopamine and neurotransmitters probably swallowed his
presentation hook, line and sinker. Professor Kandel, moreover,
believes what he is saying, which is why his audience is so easily
convinced by his nonsense. They don’t realise it’s nonsense because
they don’t know any more than him about the history of schizophrenia,
and what constitutes the “cognitive” and “negative” symptoms (versus
the so-called “positive symptoms”) of schizophrenia. More
surprisingly, they have evidently not considered the difference
between the cognitive processes of mice and men, despite the so-called
“cognitive revolution”.
Professor Kandel asks the obvious question: “how can one develop a
mouse model for something as complicated as schizophrenia?” His
solution is reductionism: “the disorders as they are clinically
defined are too complex so you have to take a component and study it
in great detail.” He provides an analogy from cardiology, saying that
instead of studying atherosclerosis, they studied cholesterol
metabolism as a feature of atherosclerosis, leading to the study of
LDL receptors.
According to Kandel’s model, “schizophrenia is characterised by a
genetic predisposition to develop three symptom clusters”. These he
classes as “positive symptoms” – or “the craziness” (including
disordered thought, hallucinations and delusions), “negative symptoms”
(social withdrawal, blunted affect and decreased motivation) and
“cognitive deficits” (attentional deficits, deficit in working memory
and deficit in executive functioning).
You may wonder how one tells if a mouse is deluded or hallucinating,
or if its thoughts are disordered. The answer is, as Professor Kandel
admits, you can’t: “the positive symptoms are difficult to study in
the mouse” is something of an understatement. One can’t recognise
blunted affect (or blunted facial expression) in a mouse either. What
you can do is breed mice that are socially withdrawn and forget their
way around mazes by knocking out particular genes and imagine that you
have developed a “mouse model for schizophrenia”. Maybe you can breed
mice that can’t concentrate well, too, but this would not be
comparable to schizophrenia or ADD, even though ADD is theoretically a
deficit of attention.

My knowledge is incomplete and splintered, and I know it. I have long
forgotten most of the things I have done and learned in the past,
though scattered remnants remain accessible to my conscious mind and
active recollection. This is the case for everyone, because memory
fades, and human knowledge itself is splintered. We can only remember
a fraction of what we experience, and though my memory is reasonable
by human standards, it may not be so from the standard of an elephant.
Elephants are said to have excellent memories. Though I don’t doubt
this, I’m not sure how well the memory of an elephant compares with
that of a human. What I do know is that I once understood how to do
calculus with considerable ease but can’t remember how to do it any
more through lack of use, and my general mathematical competence took
a dive when I became a doctor and started practising medicine rather
than mathematics. Forgetting calculus was a slow process, but until I
checked recently what calculus actually is, I had forgotten everything
about it including its mathematical purpose, even the basic fact that
it is the mathematics of change. I also know that elephants, though
they may remember many things that we humans cannot, and may even be
able to count and add or subtract small numbers, are quite unable to
learn calculus at all, despite their famed memory. I know for a
fact that chimps have excellent memories, and so do dogs, though again
their memory is quite different to our own. It has to be, because
words are not at the core of their memories. Words are integral to
human memory, though words are not, of course, the only things that
are remembered.
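As a minimal reminder of the forgotten subject (a sketch of my own,
not anything recovered from my school notes): calculus deals with
rates of change, and a derivative is the limit of average rates of
change over ever-smaller intervals, which can be illustrated
numerically in a few lines of Python:

# The derivative of f(t) = t*t at t = 3 is exactly 2*3 = 6.
# Shrinking the step h makes the average rate of change over
# the interval [3, 3 + h] approach that instantaneous rate.
def f(t):
    return t * t

for h in (1.0, 0.1, 0.001, 0.00001):
    rate = (f(3 + h) - f(3)) / h
    print(h, rate)  # 7.0, 6.1, 6.001, ... tending towards 6

Nothing about the famed memory of elephants suggests they could follow
even that much.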

Negative preconceptions
Dementia – from the Latin for loss of mind – has been predicted to
increase around the world as the population ages. Of these cases of
dementia, most will be diagnosed as having “Alzheimer’s” or
Alzheimer’s Disease, for which there is no known cure, and few proven
treatments. For the past fifteen years I have been sporadically
investigating the intuitive hypothesis that an active mind prevents or
decreases dementia, and considering social and psychological factors
that might interfere with continued learning. For example, the
pervasive cliché that one is “too old to learn” or “too old to
change”, that one “can’t do science” or “has a poor memory for dates
and can’t do history”…and so on. It is common to hear people declare
that they “can’t sing”, “can’t draw” (though rarely that they can’t
write, even when they can’t) based usually on discouraging childhood
experiences. Could such negative preconceptions have a bearing on the
development of dementia?
Negative preconceptions can be expected to lead to negative appraisals
and avoidance of situations anticipated to be unpleasant, as well as
negative evaluations of other people. This may be frank prejudice
against people based on their skin colour or the shape of their nose,
or broad prejudice based on gender or age, accent and fluency in a
particular language, or class and caste prejudices. All can be viewed
as negative preconceptions, which can lead to negative experiences
that reinforce the prejudice through confirmation bias. Negative
preconceptions can also discourage from learning about new things and
continuing the “scaffolding” process the Jerome Bruner theorised about
– where new learning is built on an existing framework of knowledge by
parents and teachers. But new learning also comes from within – we
have natural curiosity. Though it got me into a lot of trouble when I
proposed the hypothesis that the curiosity instinct is a protection
against dementia and depression, I’ll propose it again a couple of
decades later, in the hope that no one will lock me up for it.
Negative preconceptions build up in our minds over the years about
ourselves, others and the world. These negative preconceptions include
obviously harmful prejudices against others, but many more are simply
limitations we place on ourselves and our enjoyment of the world. We
decide on the basis of one or two negative experiences that we don’t
like this or that type of music or art, or type of food. I’d often be
told by my older English patients, “I don’t like spicy food”, or by
younger Australians, “I don’t like vegetables”. It struck me that some
people must like very few things, and the more things you like, the
more pleasure you gain out of life. Gaining pleasure from learning new
things is the same as satisfaction of curiosity. It is pleasant to
have one’s curiosity satisfied. Conversely, having one’s curiosity
denied is stressful, causing irritation, frustration and, importantly,
boredom. Boredom is greatly underestimated as a cause of stress and
suffering. Chimpanzees in cages visibly suffer from boredom, children
in classrooms learn to hide their boredom, but don’t always succeed.
When they are detected as being bored and fidgeting, as bored children
do, there is a waiting diagnosis for them, depending on whether they
“act out” or “tune out”. If they act out the favoured label is ODD; if
they tune out it is fashionably termed ADD.
ODD and ADD are American psychiatric labels, promoted by the American
Psychiatric Association (APA) for disobedient, bored, argumentative or
distractible children (who have many and varied reasons for their
behaviour, including responding to authoritarianism and oppression by
adults as well as their siblings and peers – being a child is a
complex matter). ODD stands for Oppositional Defiant Disorder and ADD
stands for Attention Deficit Disorder, which is closely related to
ADHD (Attention Deficit Hyperactivity Disorder). ADD and ADHD are
sometimes used interchangeably, but ADHD was only synthesised in 1994
with the DSM IV, by merging “hyperactivity” and “inattention” into a
“disorder”. The American neuroscience establishment then started
claiming that ADD/ADHD was a “genetic disorder” and had the statistics
to “prove it”, which was excitedly published in the Murdoch
newspapers. No one explained how this supposedly genetic disorder
suddenly appeared in epidemic proportions coinciding so precisely with
the publication of the DSM IV and the associated marketing campaigns
to capitalise on any new labels or broadened diagnostic criteria for
the application of these labels. The campaigns to sell the drugs came
in waves, carefully researched and orchestrated marketing campaigns,
by rival, though often colluding drug companies. Drug companies
collude so much they often merge, with giant corporations becoming
even more powerful. This is what is known as Big Pharma – companies
like GSK, Pfizer, Merck, Roche and Eli Lilly. All of these companies
develop and market psychoactive drugs and have common strategies, when
it comes to convincing doctors to prescribe.
When the DSM IV came out I was in suburban general practice in
Melbourne, where most of my patients were elderly. Nobody tried to
convince us that children had “bipolar disorder”, “schizophrenia”, ADD
or ODD – not drug reps, not psychiatrists, not the medical literature.
When I was in medical school in the 1980s we were told that children
who were labelled “hyperactive” were just “very active” and that true
hyperactivity only affected about one in 200 children. These few
children, we were told, had a “paradoxical” response to stimulant
drugs, such as amphetamines, which instead of speeding them up, had
the effect of slowing them down. Other children were, in my training,
labelled only as “FLKs” – which stood for “funny looking kids”, who,
it was said, might have mental disabilities as well as “funny” faces.
As the years have gone by, such unscientific approaches have been
replaced with more scientific-sounding words, but ODD and ADD are as
unscientific as FLK is. Diagnosing childhood BD – bipolar disorder –
and schizophrenia is worse, since the drugs routinely used to treat
these “disorders” are extremely toxic, more so than the drugs used to
treat ADD/ADHD. The big problem with ADD is not the side effects; it
is the fact that stimulant drugs are generally addictive. It has been
known for more than a century that amphetamines stimulate
concentration but are also physically and psychologically addictive.
There are other side effects of stimulant drugs, depending on the
type, but I had a general concern about starting so early the habit of
taking a pill for one’s ills. I was also concerned about the
possibility, indeed the probability, that children would be
scapegoated and stigmatised. Whenever they did or said something out
of order the natural tendency of the parent or teacher (or sibling) is
to wonder, “has he been given his medication?”
I think my concerns were justified, given the problem of crystal
methamphetamine (“ice”) abuse. Like heroin and cocaine, amphetamine
manufacture is a product of Western science and technology that is now
wreaking havoc around the world, spreading like a cancer from the
cities to the towns and villages. It’s ODD that they can’t ADD this
all up.
I am a Wikipedia enthusiast, but this is not the place to find out
what ODD really means; on the contrary it significantly confuses the
issue, assuming that ODD is a real brain disease, characterised by
genetic factors and poorly functioning areas of the brain. Mention is
made of “amygdala, prefrontal cortex, anterior cingulate, and
insula, as well as interconnected regions” that have been implicated
by “neuroimaging studies” in delinquent American youths. Wikipedia
provides a reference for the statement that “ODD has an estimated
lifetime prevalence of 10.2% (11.2% for males, 9.2% for females)”,
which led me to Daniel Dickstein, who made the claim. Dickstein,
according to the Bradley Hospital, Rhode Island website, “leads
Bradley's Pedi-MIND research program, which uses brain imaging
techniques, including magnetic resonance imaging (MRI) and behavioral
measures to identify biological markers of psychiatric illness,
including bipolar disorder in children and adolescents. Such markers
could help physicians make more accurate diagnoses.”
Bipolar disorder in children??

In late 1994 and early 1995 I went through a period of relatively
sudden integration of ideas, which led to new ideas, hypotheses and
theories. I was quite conscious of the change in my functioning, and
enjoyed the different mental state I was in. At the time I was working
as a solo GP, and made a few changes in how I treated my patients,
adopting what I hoped was a more holistic approach to their care. I
also asked them more carefully about drug side effects, and was more
thorough in my clinical assessments and history-taking. I became more
interested and engaged and distinctly more sociable.
Though my mind seemed to be working more efficiently than usual, and I
found I needed less sleep, my mind was not “racing” (in my subjective
experience) and my observable behaviour, to my wife, secretary, family
and friends, was at that stage “normal”. Though I was certainly
“hypomanic” according to psychiatric criteria, I did not think there
was anything wrong with me – quite the reverse. I remember one of my
friends at the time commenting that he reckoned I was the happiest guy
on earth. I laughed. I did feel very happy, and felt like everything I
knew was falling into place and that I was, for some reason having
more and more “insights”. This is how I thought of my ideas – I
thought of them as ‘insights’. It was as if “a penny dropped” to use
the old English phrase that I grew up with. I have heard people with a
Catholic background using the term “epiphany” for what I take to be a
similar experience – a convincing idea.
Many of my insights were about philosophy, psychology and psychiatry.
Others were about politics. All were based on what I had previously
seen and heard, from one source or another, evaluated in the light of
what I was discovering in the dozen books I had impulsively bought and
started to read, all at the same time. One was about Chaos Theory, and
without reading the book carefully I decided to do some chaotic
experiments of my own. I started observing the effects of apparent
chaos in nature and its aesthetic appeal. I had the simple idea that
chaos theory suggests that systems of order grow out of chaotic
systems. I wondered how this might relate to what we humans find
beautiful in what we see and hear. I had known that the Fibonacci
series has a bearing on the proportions of natural growth in plants
and animals, such as the proportions of a snail-shell and the number
of petals on a flower. Could the Fibonacci series be a common
mathematical principle in human aesthetics, given its well-established
musical significance? I also decided to trust my instinct and to stop
wearing a watch, estimating the time instead. This did cure me of my
previous habit of constantly looking at my wrist.
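For what it is worth, the regularity I had in mind is easy to
demonstrate: the ratios of successive Fibonacci numbers converge on
the golden ratio, about 1.618, the proportion so often reported in
shells, flower heads and musical form. A minimal Python sketch:

# Each Fibonacci number is the sum of the previous two; the
# ratio of successive terms converges on the golden ratio,
# (1 + sqrt(5)) / 2, approximately 1.6180339887.
def fib_ratios(n):
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b
        ratios.append(b / a)
    return ratios

print(fib_ratios(12))  # 2.0, 1.5, 1.666..., converging on 1.618...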
The insight that had me most excited occurred after a couple of hours
painting and playing with my 2-year-old daughter. This was that
communication, curiosity and play are instincts that contribute to
motivation and can be harnessed to improve health. At the time I was
not well-equipped to challenge my assumptions, and had no one to do it
for me. I was surrounded by artists not scientists, though some had an
interest in alternative medicine. So I reasoned on the basis of the
little psychology and neurology I had learned in medical school, using
the terminology of the 1970s in my core assumption that motivation
depends on interplay between three forces – instincts, conditioning
and free will. The addition of free will as a third force (influencing
instincts and conditioning) came to me as an insight, which I regarded
as very important.
There was more to my theory of motivation, since I tried to integrate
my ideas about instincts with parts of the brain that are known to be
involved with subconscious urges to movement and emotions – the basal
ganglia and limbic system. I knew these areas to be characterised by
receptors for the neurotransmitter dopamine, which I associated,
therefore with emotions and movement. This was the basic model, to
which I added what I had recently discovered, in a Scientific American
publication, about the Reticular Activating System (RAS) a network of
neurones connecting the brainstem and higher cortical centres that is
involved in maintaining one’s state of consciousness. I read that the
RAS utilizes the neurotransmitter noradrenaline in its synapses, and
noradrenaline is, I found, manufactured from the catecholamine
dopamine.
At the time I accepted the dopamine theory of schizophrenia, but I
thought that negative preconceptions also played a role in what many
psychiatrists at the time were calling “the schizophrenias”. I
contemplated the idea of delusions and reasoned that people with, for
example, negative preconceptions (whether or not valid) of the CIA,
would be prone to noticing things and misinterpreting things in terms
of the CIA and believing they were being followed. I knew that the
problem of schizophrenia was much more complex than misinterpretation,
but I had not yet found out about the history of the term, the
systematic abuses in the name of its treatment, and the other reasons
to regard ‘schizophrenia’ as a harmful anachronism that should have
been discarded with the rest of pseudoscientific phrenology.
I also developed some theories about the pineal organ in the brain,
despite knowing very little about it, other than it was known to
secrete melatonin during the night, and that light shone into the eye
decreases secretion of melatonin in animals and humans. I also knew
that the pineal is connected with the visual system in a complex way,
and that it is an evolutionarily ancient structure found all the way
from fish to mammals. I also knew that the pineal was suspected to be
a magnetic sense organ in birds, and I wondered if it might also be
one in humans, involved in our sense of direction as it was thought to
be in birds. The little I knew about the pineal turned out to be much
more than the psychiatrists who were called in to diagnose me knew;
they still thought it was a useless vestige of primitive brains, as was
thought before the discovery of melatonin in 1958. The belief that the
pineal was vestigial was so prevalent in my own medical family, that
for many years every time I mentioned the pineal over the phone, they
would ring each other wondering “has Romesh gone off again? He’s going
on about the pineal again”. The psychiatrist who recommended my
admission to Prince Charles Hospital in Brisbane in 1995, with a
provisional diagnosis of mania cited, as evidence of madness my
“delusional theories on schizophrenia, autism and magnetism affecting
the pineal organ”.
In retrospect, what may have seemed like insights were actually
scientific theories of variable merit. Some were falsifiable and
others were not (which disqualifies them from being scientific). The
psychiatrists termed my state hypomania – a mental state on the way to
mania – meaning madness.
Such factors, according to my hypothesis, may contribute to both
dementia and depression, by discouraging social interaction and new
learning.
Though I have long forgotten the complex mathematics that I started
studying at school, in other areas my knowledge has increased
dramatically since my teenage years. Some of this has been rote
learning, some has been experiential, some has been obligatory for my
profession as a doctor, but most has been voluntary self-directed
learning. I have been trying to find things out, and it is something I
spend a fair amount of time doing, but in sporadic bursts on different
subjects that interest me. This has resulted in lots of bits of poorly
integrated information, some of which I know to be true, some of which
I am doubtful about. But what criteria should I use to determine if
something is true? I had assumed that the answer lies in logic, reason
and commonsense, but I have been led to believe that commonsense and
reason do not work when it comes to understanding matter and energy at
the quantum level, and that it can only be understood through the type
of mathematics I started studying at school but have long forgotten. That
means I have to trust the physicists and mathematicians of the West
whose physics and maths also gave us atom bombs. The fact is that atom
bombs, however heinous, do work - they explode as they are calculated
and designed to.

I have also forgotten many things that I learned much more recently.
Other things, learned years ago, remain fresh in my mind. Some of
these things were memories from long before I learned and then forgot
calculus. Why do people remember some things and not others? Is it
just a matter of reinforcing old memories whenever you recall them?
I’m sure if I regularly used calculus I would still remember it, and
would have progressed in my knowledge and understanding of mathematics
rather than regressing from the time I was a teenager at school. Does
that mean I’ve lost synapses in my brain that I could have kept active
by practice? I do believe so.
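A toy calculation makes this “use it or lose it” intuition concrete.
The sketch below is purely illustrative – a hypothetical model with
invented numbers, not established neuroscience – treating a memory as
a single strength value that decays a little each day and is boosted
each time it is recalled:

    # Toy model (illustrative only): memory strength decays each day
    # and is reinforced whenever the memory is recalled.
    def simulate_memory(days, recall_days, decay=0.99, boost=0.3):
        strength = 1.0
        for day in range(1, days + 1):
            strength *= decay  # passive forgetting
            if day in recall_days:
                strength = min(1.0, strength + boost)  # rehearsal
        return strength

    # Calculus revised weekly stays strong; never revised, it fades.
    print(simulate_memory(365, recall_days=set(range(7, 366, 7))))  # ~1.0
    print(simulate_memory(365, recall_days=set()))                  # ~0.03

On this crude model, a year of weekly revision keeps the memory near
full strength, while a year of neglect reduces it to a few percent –
which matches my experience with calculus.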
When I studied medicine in the 1980s, neural plasticity was not
recognised. It was assumed that once the brain had developed to adult
size, there were changes only in the function but not the structure of
the brain. Furthermore, it was assumed that personality, established
in childhood, remains constant for the rest of one’s life. In psychiatry
this was termed the “pre-morbid personality” – the personality before
the patient became morbid – meaning ill. The idea that neurons and
other brain cells can grow, forming new connections between cells
throughout life is now widely accepted, and termed neural or neuronal
‘plasticity’. It is also established that neurons, though they reduce
in number, increase in connectivity throughout adolescence, and that
use of neuronal circuits promotes their development and preservation.
It is also known that the growth of the brain is stimulated by
activities that exercise it. The area of motor cortex corresponding to
the little finger of professional violin players has been found to be
larger than normal, reflecting their increased use of this finger.
Calculus, and the fact that I have forgotten it, are of less concern
to me than memory and synapses. Doubtless if I were a mathematician I
would feel differently, but I am more interested in the brain and how
it works than in the mathematics of change. At the same time,
what interests me most about the brain is how it changes, and how
these changes can be put to good use – meaning to promote health.
Having knowledge is not necessarily good for the health. Not having
knowledge is worse. But a little bit of knowledge can be a dangerous
thing if you think it to be a lot of knowledge.
How can I recover my lost knowledge and integrate what I know? At last
the tool is available to me and billions of others, and the
formidable, but vital, task is at hand. The tool is the miracle of the
Internet, which has transformed my life and that of millions of
others. For millions more it has been a given for as long as they can
remember, in the same way the TV was a given when I was a child
growing up in England. The key difference is that TV is inherently
splintered and splintering, while the Internet is inherently
integrative and holistic. The TV is linear, while the Internet is
lateral, as well as being interactive. You can search for what you
want and ask questions in such a way as to fill in the gaps in your
knowledge, as well as revise and learn more, with the greatest of
ease. Having grown up with the TV and remembering what it was like
before Google, Wikipedia and YouTube I have been convinced of the
merits of the Internet, though at first I resisted the new technology
and saw it as a sinister extension of the “manufacture of consent”
that Noam Chomsky had warned us about. Gil Scott-Heron proclaimed,
back in 1970, that “the revolution will not be televised”, while the
Disposable Heroes of Hiphoprisy rapped in the nineties of “television,
drug of the nation, breeding ignorance and feeding radiation”. But the Internet
has liberated humanity in a way TV was never able to, or intended to
do. TV was always about selling advertising, as far as the commercial
channels went. The Internet is also about selling advertising, but it
has turned out to be much more than that. It has become an
unparalleled source of integrated information – though this
integration remains incomplete, and probably always will be. As more
things are discovered and thought of there are more things to
integrate, so the process of integration is necessarily ongoing, but
the Internet provides access to a vast body of knowledge, far more
than can be stored in the greatest library, let alone the greatest
mind.
I have always been aware that the Internet can spread untruths and
falsehoods the same way that the TV can. In some ways it is worse in
this respect, since any rubbish can be posted on the Internet, while
only rubbish that passes “journalistic standards” (whatever they are)
makes it onto the TV. A discerning reader or viewer can find out the
truth, and connect the dots using the Internet in a way that the TV
and old news media never made possible. The Internet not only makes
connecting the dots easy, it brings up the connections made by others,
all over the world. In its best manifestation this has resulted in the
collaborative multi-disciplinary effort of Wikipedia, which has become
my primary source of information on any subject on which I am
uncertain. I don’t trust Wikipedia any more than I trust the sources
various entries provide, but it is a good start for developing an
integrated, factual worldview. YouTube is a valuable secondary source,
but Wikipedia is unrivalled as a primary source, despite its many
flaws, errors and mistakes.
While it is a good place to start, Wikipedia is rarely a wise place to
end one’s investigation if one wants to find out the truth about
something. What it provides is the mainstream view, but this is an
important view to have, to start with. The mainstream is mainstream
for a reason, but the mainstream may be wrong about things, and you
don’t find that out from Wikipedia. You might find out about
“controversies” though, which can lead you in the direction of the
whole picture. In other areas there are no controversies, as far as
Wikipedia is concerned, but the information provided is incorrect.
This is not because, as its critics argue, “Wikipedia can be written
by anyone” but because the references used are themselves incorrect,
or present only one or two of many competing hypotheses, theories or
views. Wikipedia is also much more reliable in some areas than others,
partly because human knowledge is more certain in some areas than
others.
What is posted about a particular topic on Wikipedia is no doubt
influenced by vested interests and propaganda, though I suspect it is
less so than the other mass media and the Internet generally. At the
same time, when I have read the entries on topics in which I have
detailed knowledge, Wikipedia has rarely been as reliable or readable
as truly trustworthy sources. But
truly trustworthy sources are hard to come by. In many areas Wikipedia
is the best of several less reliable alternatives. I’m not certain
about this, though. I believe it right now, but I’m not certain.

I believe much more than I know for certain. Many things that I
thought I knew for certain have turned out to be untrue. They were
delusional in the Buddhist sense, but not delusional in the psychiatric
sense, where a delusion is a “false, fixed belief”. If beliefs change,
they were obviously not fixed in the first place. And if delusions are
defined merely as false beliefs, then nearly everyone is deluded,
since few would claim that none of their beliefs are false (that their
beliefs are true and correct in every respect). Finding out that one
is wrong is liberating, though it can
also be humbling and embarrassing. There is also a natural reluctance
to believe that you are wrong, which extends to reluctance to believe
that the institutions and individuals that taught you were wrong.
Still, judging by my track record, I have delusions that have not yet
revealed themselves to me, or been revealed to me by others, as such.
I suspect that I am still suffering from delusions, but I’m not sure
what they are – if I knew, I would reject the false belief and adopt
the true one.
False certainty
People can feel certain, but be wrong. This is false certainty. People
who think the earth was created 6000 years ago, or that the earth is
flat may be certain about it – meaning they feel certain about it –
but they are wrong. I have watched, with some amusement, the self-
proclaimed “Darwinist” Richard Dawkins debating the facts of evolution
with “young-earthers”, and been struck by how tenaciously the latter
stick to their delusion. Dawkins is certain about Darwin’s theory of natural
selection and the creationists are certain about the Biblical account
of creation, and their interpretation of the ancient book. It is
obvious to any rational viewer that Dawkins makes more sense – vastly
more sense, and that the creationists are, to be blunt, stark, raving
mad. This is where the amusement lies, for those of us who agree with
Dawkins about the ancient age of the earth and evolutionary theory.
Why are we amused by crazy people showing that they are crazy? I don’t
know, but I know that watching people make fools of themselves in
public is popular on YouTube, and religious debates featuring Dawkins
and his merry band of atheists have many views and “likes” on YouTube.
I’ve enjoyed them myself, and was spurred on to watch more of Dawkins’
media crusade against religion, discovering other icons of atheism
such as the physicist Lawrence Krauss and the outspoken journalist
Christopher Hitchens, author of a book, which I have not read, called
“God Is Not Great”. Hitchens has written many books, and was very
famous during his lifetime, but I had not heard of him till recently.
Likewise Lawrence Krauss and the other leaders of the atheist
movement, other than Dawkins and Dan Dennett, an American philosopher
I was familiar with (though I did not know about his ardent atheism).
Watching these debates has been interesting but not, for me,
confronting. I don’t have a horse in the race, so to speak. Others –
including, but not only, religious people – are affronted by
Dawkins, needless to say. People who deeply believe in God can’t be
expected to like being told that their cherished beliefs are
delusional. Their natural reaction is to argue that Dawkins is wrong,
but I am yet to find a situation where they won the debate. Dawkins
likes to debate and to win debates, and they are fun to watch, because
he, and the other atheist intellectuals are, owing to their privileged
educational backgrounds, extremely articulate and often witty. I did
not need to be convinced about the truth of evolution, or the evils of
some religion, though I hesitate to damn all religion, ritual and even
superstition. Maybe religion is instinctual and serves an important
purpose? Maybe religion has some redeeming features which I have
overlooked? The debates, and the longer documentaries about Dawkins’
crusade against religion since writing The God Delusion, have prompted
me to reflect on the role of religion in health and medicine. If
belief in God is a delusion, what does it mean for other religious
beliefs, and for the treatment of aberrant or unusual religious and
spiritual beliefs by psychiatrists in Australia and elsewhere? Are all
superstitions to be regarded as delusions, and to what degree are what
are currently diagnosed as delusions commonly held superstitions –
including superstitions disseminated through the globalisation of
religion?
I wondered before about why we are amused when people show themselves
to be crazy. Maybe we have a sadistic streak, or maybe just atheists
do, and they are the ones who watch these videos and attend the
various atheist gatherings. Do atheists have a more cruel sense of
humour than religious people? I really don’t think so, after watching
a recent video of the so-called Islamic State (or ISIS) in Iraq. There
are obviously some forms of craziness that no rational person finds
amusing.
The problem of Islamic extremism and, specifically the September 11,
2001 attacks on the World Trade Centre in New York are what
Christopher Hitchens says prompted him to embark on his campaign
against religion, which he continued until his death in 2011. Hitchens’
book “God Is Not Great” was a direct challenge to, and a word play on,
the Muslim exhortation Allahu Akbar – God is Great – which the Muslim
extremists have the grotesque habit of shouting when they chop off
heads, filming it with the Western technology that they profess to
hate, and posting it on the Internet, in the hope of attracting more
brainwashed young men to fight and kill people they variously label as
infidels. I am confident that this is not the Islam that most Muslims
follow, any more than Christians follow the example of the Crusades,
but Dawkins and Hitchens have a point when they say that we should
develop our systems of morality (not to mention science) on modern,
enlightened views rather than on those devised thousands of years ago,
before the discoveries of science – and, I would add, before the
invention of international laws and covenants on human and animal rights. These
were not covered by any of the ancient religious traditions, or by the
laws of individual nations, and have transformed what we, in the
modern world, think is acceptable. The fair and equitable enforcement
of these laws is a different matter.
I enjoy watching videos of Hitchens even when I don’t agree with what
he says. He has a likeable directness, and apparent nonchalance. He
can also be passionate in his arguments. He’s a powerful orator and
powerful oratory is pleasurable to watch. Hitchens was a well-known
writer for Vanity Fair and other publications, but I am not familiar
with the Anglo-American media and did not know about his positions on
such things as the Vietnam War, Kissinger, Iraq, Iran and North Korea
until after I had watched and enjoyed his arguments against religious
fundamentalism and the God of the Old Testament. When I watched his
contributions in the political sphere I was rather disappointed in his
apparent blindness to American militarism, which I was well aware of
from such trusted sources as Noam Chomsky. I was also disappointed by
how little he seemed to think of Iran and the achievements of its
people, and his rather dubious claim that having been to Iran, he
could say that most Iranians supported an invasion of their country to
“rid it of the mullahs”. I think this rather unlikely, and gather that
Iran is a relatively progressive, rapidly-developing nation with a
long and proud historical tradition. The literate and multilingually
articulate Iranian people that I have seen on the Internet seem quite
capable of shaking off any domination by the mullahs, if indeed they
are being dominated by the mullahs at all. I don’t know enough about
the subject, but suspect that Hitchens, despite visiting the country
and writing about it and boasting that he had been to all three “Axis
of Evil” countries, did not either. If the people Hitchens met wanted
their country invaded by Americans, it can only have been crazy, not
to say unpatriotic, Iranians who expressed such an opinion to him.
In his opinions about Saddam Hussein and the probable outcome of an
invasion of Iraq, Hitchens, always happy to venture his opinion and
predictions, was glaringly wrong, as events have turned out. Before
the 2003 invasion by the so-called “coalition of the willing”,
Hitchens predicted that it would all be over within five years of
overthrowing Saddam Hussein, and that the “relatively equal numbers”
of Kurds, Shi’ites and Sunnis would lead to a power balance rather
than sectarian division. He boldly predicted that there would not be
much in the way of revenge attacks and vengeance. He was obviously
wrong here, and I was not too confident in his opinions regarding
Syria either, since I have a suspicion that President Assad is being
demonised, along with President Putin and the other enemies of the US
military and body politic.
One of the words Hitchens has taught me is “numinous”. I had never
heard it before and I liked the sound of it. Sounded like luminous and
we all like to be illuminated. Hitchens used this word in the context
of wonderful and awe-inspiring experiences – “I do not deny the
numinous” he said. I have a liking for my old Oxford Dictionary,
though I could have checked the meaning of the word faster on the
Internet. The dictionary says numinous means “Of a numen; spiritual;
indicating presence of divinity; awe-inspiring”. A numen is,
according to the same dictionary a “local or presiding deity”. What
does Hitchens mean when he says he rejects religion but does not deny
the numinous?
Hitchens was a journalist rather than a scientist, but the surviving
leaders of the atheist movement have more to say about science than
politics, which is perhaps just as well. When people such as Sam
Harris have ventured into political debate with the likes of Noam
Chomsky in defence of American foreign policy, they are liable to
expose their ignorance of history and politics. Harris is a
neuroscientist (though I’m not sure what that means, in his case) and
has many fans because of his leadership of the atheist brigade rather
than because of any particular insights into the neurosciences. He
specialises in various
ways to wittily put down religion – with a particular focus on
Christianity, Judaism and Islam, and is a popular debater and writer.
Dawkins and Krauss have more to say about science than they do about
politics, with a shared perspective on religion. This is that religion
– all religion – is harmful, because it is based on blind faith rather
than evidence. Dawkins insists that his main concern is with truth and
what is true; what people do is of less concern to him. He says that
he is motivated by love of science – when you are in love you want to
talk about it and share it – but admits to also being angry: angry
about children being labelled as belonging to one or other religion
before they are old enough to know better. In the same way that we
don’t label children according to the political persuasion of their
parents, we shouldn’t label them as “Christian” or “Muslim” children.
This, Dawkins argues, is a form of
child abuse. I can see his point, and am aware that there is a lot of
pressure on children to conform to the religion of their parents and
family, even when logic and rationality dictate otherwise. They are
often afraid, or ashamed, to admit that they don’t believe in their
parental religion. They may feel guilty or embarrassed at their “loss
of faith”. I have a lot of sympathy for such people, and admire people
like Dawkins and Hitchens for drawing attention to the problem of
coercive religion, and Hitchens in particular for his stand against
both female and male genital mutilation in the name of religious
tradition.
When I trained in paediatrics in the 1980s, circumcision in Australia
was in rapid decline. In the 1960s and 70s most Australian men were
circumcised, though I’m not sure why this came to be. There was a
campaign in the 1970s, led by paediatricians, against the practice,
countering arguments that included the curious idea that boys would
want to look like their fathers (which patients of mine offered as a
reason to circumcise their sons even in the 1990s) and the idea that
it was good for hygiene. In the 1980s we were told that cutting off
the foreskin to keep the penis clean is like cutting off the ears to
keep the head clean. It stuck in my mind, and I used it on the increasingly
rare occasions that I was asked my opinion about whether it was a good
idea to circumcise boys. During our training we were also shown
photographs of circumcisions gone wrong, resulting in horrible
infections and penile deformities. It was enough for me to be
convinced, for good, that circumcision is a brutal tradition, however
widely it is practiced, and even if it was practiced in Australia long
before the White man came here, as I have been told was the case.
Ritual mutilation was common in many tribal societies, and entered
Judaism and later Islam, through the tradition ascribed to Abraham (or
Ibrahim in the Muslim tradition). This is not a good reason to cut off
the foreskin of infants, with or without anaesthetic, 3000 years
later.
Science can tell us what is, what was and what will be, but not what
should be. It can tell us only a bit about what was in the bigger
scheme of things, but it can tell us a lot, in terms of the total
amount of knowledge one person can digest in a lifetime. The rest of
my life could easily be spent discovering what science already knows –
not thinks or believes – about the past. It includes all that science
has discovered about history, anthropology and biology, and what
science has discovered about physics, chemistry and cosmology. The sum
of all this knowledge is still only a small part of scientific
knowledge – there is also linguistic knowledge and the vast amount of
knowledge that has been acquired, through scientific means, of the
human body, including the brain and nervous system.
What is known is naturally a much smaller amount of information than
what is theorised or hypothesised. People have many theories, about
all manner of things, and these theories are dynamic. Scientific
theories come and go. Some are disproved, some are just forgotten
about and go out of fashion, only to be rediscovered some years or
decades later. Some are disproved and yet resurface time and time
again in various guises. This is especially so when it comes to
theories about how various medical treatments work. Note that it is
rarely the theory that leads to the treatment, rather the treatment
that leads to the explanatory theory. This has been the tradition in
Western as well as Eastern therapeutic models. In the case of the
drugs I prescribed daily as a GP, I understood the mechanism of action
of only a few, satisfied that although no one knew how they worked,
work they did. I assumed that someone in the past had proven their efficacy, or
they wouldn’t be listed as pharmaceutical drugs for the treatment of
various conditions. In terms of the specifics I was guided by the
opinions of specialists in the field, though I found that each
specialty had a peculiar obsession with itself and tended to see
everyone through the prism of its own particular doctrines.
I also trusted the drug reps but not much further than I could throw
them. Drug reps are enthusiastic promoters of their drugs and tell
doctors about the upside and not much about the downside of various
drugs. They themselves do not know much about the downside, since they
are not told about it in their pre-sale indoctrination. It’s all done by
hypnosis, though it’s never acknowledged as such. The process of
getting people to suspend their critical judgement and implant
suggestions “directly into the subconscious” is how the advertising
experts are taught to think of the minds of their targets. The art and
science of hypnosis has been neglected by the medical profession, but
not by the advertising industry and media machine.
I like the media machine much more than I used to. After the initial
hypnosis wore off, I became wary of the TV, which I had trusted as a
source of information when I was a teenager and first came to
Australia, and which had a profound effect on my tastes. Back in the 1970s I
was well and truly hypnotised every time I turned the TV on, which was
frequently. There had been no TV in Sri Lanka, so when we arrived in
Brisbane I watched the box till my eyes were square. I fell in love
with pop stars and actresses, as well as, for some reason, Princess
Caroline of Monaco, whom I regarded as very beautiful. I had a crush on
Suzi Quatro and Deborah Harry from Blondie, and a picture, from TV
Week, of the country singer Linda Ronstadt. The TV directed my likes
and dislikes, and I fancied girls who looked like the pretty girls I
saw on TV. During this time of TV hypnosis I often found myself
humming jingles from the TV in my mind and out loud. Then there were
slogans from the TV: “Don’t sign anything till you see Zupps!” I
gathered from the ad that Zupps was a used-car outlet located in what
later became my home suburb – Moorooka. Until I moved here and gained
a much richer perspective, Moorooka – an Aboriginal word meaning “iron
bark” – was, to me, “Moorooka, the magic mile of motors”, where “you’re
sure to find a car to suit your style”. This may be how most
Brisbanites of my age think of Moorooka to this day – from a meme
generated by a used car dealer’s catchy jingle, but things have
changed dramatically in Moorooka since the 1970s when it was famous
for its mile of car retailers. Now it is most remarkable for its
multiculturalism.
Moorooka was last in the news in Brisbane when it was affected by
flood a few years ago, and there is not much about my hometown on the
Internet. Moorooka is one town of thousands that are covered by the
Queensland news media, and is mentioned only when there is a natural
disaster that affects it. In fact, though it has a distinct identity,
Moorooka is only one of many suburbs in Brisbane, and grew, like the
rest of suburbia, out of a British convict colony on the Brisbane
river, named after the British governor of the day, Sir Thomas
Brisbane.
Most of the suburb names in Brisbane are not Aboriginal. When I busk
in the local shopping centre it is rare indeed to see any of the
original Jaggera people coming to the Woolworths supermarket. What I
do see is a multitude of different races, of all colours, shapes, sizes
and presumed ages. I don’t spend much time guessing people’s ages, but
I do notice many things about new and old Australians and how they
interact with a stranger playing unfamiliar music on keyboard in the
arcade, and respond to the grooves and melodies he is producing. The
music is unfamiliar in that they have never heard the particular music
I am playing, since it is original and improvised, though certain
aspects of the music would be variably familiar to them. Previously I
have busked playing guitar and singing in Melbourne, which was also an
interesting experience, but my recent ventures into the commercial
heart of Moorooka have been different, because the same people return
to do their shopping, and get used to seeing me there. I generally
smile at everyone who walks by. Some people smile back, and some say
something too. Usually it’s nice. If they don’t like my music they’re
unlikely to say anything at all. Sometimes they smile because I’m
smiling at them, and not because they like my music, but sometimes
they show, from their body movements that they do – whether they smile
or not, and whether they say anything or not. This is because rhythmic
music makes them dance – what I call the dance instinct.
I am intrigued by the dance instinct, which makes people feel like
moving their head and limbs in time to music. Why doesn’t it happen
with our eyes or tongue? What parts of the brain are involved, and how
can the dance instinct be used to promote health? As soon as infants
are old enough to coordinate the movements of their limbs, many start
dancing with them, and I have seen toddlers dancing well before they
can talk. Why did this instinct evolve, if indeed it is instinctual?
I’m sure those, like Steven Pinker, who deny a musical instinct would
also deny a human dance instinct.
I have always regarded myself as a scientist rather than a healer; I
studied the science of medicine rather than the art of healing.
Medicine treats disease, though doctors also try to promote health,
and increasingly medicines are given to healthy people with the
promise of keeping them healthy or making them healthier still. I was
trained to identify myself as a doctor, but never a healer. If a
patient got better, it was because of the treatment I had prescribed,
in addition to their own natural healing mechanisms, but not because
of any healing effect that I myself had on the patient.
I never kept statistics on my patients, though I kept hand-written
case notes. I recorded my patients as cases, but the relationship
between a doctor and patient is rather different to other “cases” such
as legal cases. The therapeutic relationship is supposed to be healing
as well as the treatment prescribed. This is not evident through the
impersonal case notes that GPs write about their patients (or enter on
computer screens).
Doctors, around the world, are taught to “take a history” as the first
step in a clinical examination. This is often perfunctory for surgeons,
but is vital in medicine and psychiatry. Taking a “good history” is
regarded as a sign of good medicine, but in reality, the history is
too often bypassed due to time pressure, and lack of appreciation of
how important history-taking is, and how the time used in taking the
history can also be used to implant healthy suggestions by a skilful
therapist. Every utterance by a therapist is a suggestion, including
the ones they offer as mere observations, and to a greater or lesser
extent acts as a hypnotic suggestion. These suggestions can have a
positive, negative or neutral effect on the health of the client,
customer, patient or prisoner as the case may be.
The gems of truth are scattered between disciplines and nations, and
between different historical and philosophical traditions. Science is
a body of knowledge (from the Latin scientia, meaning knowledge)
derived from various means of reasoning and logic. There have been
confusing debates over the years about what the “scientific method”
is, but this debate was exclusively in the West, largely between
philosophers in Europe, Britain and the USA. I will discuss my take on
these debates later, but first I might consider the difference between
knowledge, belief, theory, hypothesis, suspicion and disbelief. These
may all be regarded as degrees of certainty. This is, of course, a
rather arbitrary classification, but it will do for now. People talk
about things being “scientifically proven”. This
claim is often made for various drugs and other treatments.
Scientifically proven implies that scientific experiments have been
done, using the scientific method – meaning that falsifiable
hypotheses are tested and either supported (made more likely) or
disproved (in which case the hypothesis is discarded). This is the
Western philosophical definition
of what the “scientific method” is, not the definition of science as
construed by the ancient Greek scientists and philosophers, or those
of the East, in China, India, Persia, Egypt and Iraq – which led the
world in technology after the fall of the Greek and then the Roman
Empire. So when did the West become the sole credible authority on
science, scientific knowledge and what the scientific method entails?
The dominance of Western Science came about through the Scientific
Revolution, which has been written about at length. Scientific
“revolutionaries” like Isaac Newton, Voltaire and Descartes were
indeed brilliant and original thinkers, but the greatest discovery of
the era, from my perspective, followed the seventeenth-century
invention of the microscope: the amazing discovery that all living
organisms are composed of tiny, amazingly varied cells. This was a
true revolution in and of itself, and it was made in the West.
The other great revolution, made around the same time as Darwin
published his theory of natural selection as an explanation for
evolution, was in the East, according to the perspective of the West –
in Russia, with the wonderful creation of the Periodic Table of
Elements by the Russian scientist Dmitri Mendeleev. The ancient and
medieval Greeks, Romans, Indians, Persians and Chinese physicians had
all assumed that the observable world was composed of only four or
five primary elements – Mendeleev showed that there were dozens (well
over a hundred are now known), and predicted from his model elements
that had not yet, but have since, been discovered. Physicians in the
West also believed in
the four elements and four related humours, the balance of which was
thought to determine health, until the Scientific Revolution – which
occurred over a couple of centuries. Descartes, for example, still believed in
the four humours, though he made some surprisingly accurate
observations about the nervous system, along with his philosophical
meditations. The four humours that determined health were named as
blood, phlegm, black and yellow bile, though the meaning of these
terms is not what we mean today (especially phlegm). Each of the four
humours was associated with a personality type, the names of which are
now part of the common (and less common) English vocabulary – sanguine
(dominated by blood), phlegmatic (dominated by phlegm), choleric
(dominated by yellow bile) and melancholic (dominated by black bile).
Melancholic and melancholia were used as medical terms for what was
renamed “depression”. The humoral theory was that each of the humours
had variable amounts of each of the four elements, and that blood had
all of them. This was the rationale behind the widespread intervention
of blood-letting, which was a mainstay of medical treatment by
physicians in the West as well as the Near East (from the Western
perspective – also called rather misleadingly the Muslim World).
Despite growing up in the South and South East of the European maps, I
know much more about the history of Western medicine than I do of the
medical traditions of the East, but unlike most doctors I have
approached them sympathetically, with the assumption that they had
something to offer to what I recognised as a limited perspective
inherent in my training in what I saw as Western Medicine. This was
about twenty years ago, when I first started seriously thinking about
Chinese and Indian health models, and the unfamiliar concepts of chi
(qi), yin and yang, and meridians from Traditional Chinese Medicine
and the concept of chakras from the Ayurvedic system of India and Sri
Lanka. I also looked towards Buddhist philosophy for a new
understanding of psychology, with a growing awareness of the many
flaws in the philosophies of Western psychiatry and psychology.
In late 1994 I met a young woman who challenged my assumptions. Her
name was Helen, and she was a masseuse and shiatsu therapist who was
studying Traditional Chinese Medicine. I met Helen and her friend Sara
at a nightclub, and became deeply infatuated with Sara (who later
became my partner and the mother of our daughter Zoe). My infatuation
with Sara led me to become friends with Helen, who directly challenged
me with what she had learned about shiatsu, and was learning about
Traditional Chinese Medicine. I had an open mind about meridians, yin
and yang, but found her explanation of chi unsatisfying. It was clear
that when she was talking about liver and kidneys being affected by
yin and yang, she wasn’t talking with any of the basic knowledge of
these organs that I had gathered at university. In retrospect, using
the modern sceptic terminology, I was hypnotised by woo woo.
Helen introduced me to another occult tradition, about which Sara was
enthusiastic – that of tarot cards. I had never heard of tarot cards
till Helen brought out a pack, carefully wrapped in her silk
underwear, for some mystical reason. She then brought out the Holy
Book for its interpretation – the Mythic Tarot Book. I was hooked when
Sara gave me a copy of the Mythic Tarot Book and a pack of tarot cards
for my 34th birthday. In retrospect, the tarot cards drove me mad, for
a while, mainly by convincing me that Sara was in love with me and we
were destined to be together. As it turned out, we did end up
together; maybe the tarot cards shaped this outcome. They certainly
didn’t predict it, and I now don’t think such prediction of the future
is possible. But
at the time I was consciously maintaining an open mind, under the
impression that it was unscientific to have a closed mind – you might
say my mind was so open my brain fell out.
I now realise that when I fell in love with Sara I suspended my
critical judgement regarding what she said, and to some degree what
her friend Helen did as well. I believed what she said and I trusted
her judgement, even when it conflicted with what I thought I knew.
This was very unusual for me, but had happened when I had fallen in
love in the past. Over the years, though, my critical faculties have
returned to me when it comes to what Sara thinks, though I still
respect what she says, and take her opinion seriously.
Sara now says she warned me, at the time, that tarot cards only show
you what you want to see, and can be interpreted however you want. If
she did so, I was so mesmerised by the occult ritual that I didn’t
hear it – I certainly didn’t heed it. The Mythic Tarot Book presents
various layouts, based on different traditions, though I didn’t bother
reading all the details – I was interested in what it said about the
future. I was excited about the New Age. The millennium was only five
years away.
I was caught up with New Age excitement which was fed by two books
that I read at the time – The Celestine Prophecy by James Redfield and
the Book of the Hopi by Frank Waters. The Book of the Hopi has been
around since the 1960s, but had a resurgence in popularity with the
New Age movement in the 1990s. The thing that captured my attention
the most in the Book of the Hopi was the author’s claim that the Hopi
Indians of Arizona had a health system that had the equivalent of the
Indian chakra model, and that the brow chakra corresponds with the
pineal organ in the brain. This led to my enduring interest in the
pineal organ, which I had thought, till then, was a primitive vestige
of the “reptilian brain” that no longer had a function in humans. All
we had learned in medical school was that the pineal, which has no
known function in humans, usually calcifies with age. This
calcification, we were told, can be used to determine whether a
patient had a brain tumour or haemorrhage which was compressing and
distorting the brain (called midline shift, when the calcified pineal
is no longer in the midline).
In 1995 I had neither access to a computer nor to a medical library,
so I had to depend on the few old medical books I had acquired from my
father and a few that I had bought myself over the years, along with
the many medical magazines that were routinely sent to me by sole
virtue of the fact that I was a GP, as well as the CME (Continued
Medical Education) literature that allowed me to call myself a
“Vocationally Registered” GP. These sources didn’t tell me much about
the pineal, other than the 1980 edition of Harrison’s Principles of
Internal Medicine, my father’s old physician’s bible, which had some
intriguing
information:
“Since the discovery of melatonin in 1958, compelling evidence has
accumulated that the mammalian pineal is not a vestige but an
important component of a neuroendocrine control system. The organ has
been shown to function as a neuroendocrine transducer.”
This respected textbook contained a lot of information I didn’t know
about, including the metabolic pathway by which the amino acid
tryptophan is converted to serotonin which is then metabolised to
melatonin in the pineal. Acknowledging, in passing, the mistake of
regarding the pineal as a vestige, Professor Richard Wurtman, the
author of the chapter on Diseases of the Pineal Gland, informs us that
the pineal is
now thought to influence secretion of hormones by the pituitary, and
to be influenced by light. I was particularly interested in the pineal
organ’s connection with the visual system, and the fact that it has a
sympathetic innervation:
“The ‘information’ about the light travels to the pineal by a
route which involves (1) the inferior accessory optic tract (2)
centers in the brain and spinal cord that regulate the sympathetic
nervous system and (3) the sympathetic nerves to the pineal which
originate in the superior cervical ganglia. The diurnal variation in
melatonin secretion from the human pineal, which causes melatonin
levels to peak during the hours of darkness, provides the body with a
circulating ‘clock’ which is under the direct control of the lighting
environment”
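The biochemistry behind this circulating clock can be summarised
briefly. The pathway Wurtman describes runs from the amino acid
tryptophan through serotonin to melatonin (standard biochemistry; each
arrow is an enzymatic step, and the final steps are driven mainly in
darkness):

    tryptophan --> 5-hydroxytryptophan --> serotonin
    serotonin --> N-acetylserotonin --> melatonin

A deliberately crude sketch of the light-gating follows, with invented
numbers and a fixed 12-hour photoperiod purely for illustration:

    # Crude sketch of the pineal 'clock' (illustrative only; the
    # light/dark times and levels are invented numbers).
    def melatonin_level(hour, lights_on=7, lights_off=19):
        in_darkness = hour < lights_on or hour >= lights_off
        # Light reaching the retina suppresses the sympathetic signal
        # to the pineal, so secretion is high in darkness, low in light.
        return 1.0 if in_darkness else 0.1

    print([melatonin_level(h) for h in range(24)])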
In the early 1990s there was a new sensation on the pharmaceutical
market, about which there was a flurry of hype and pop psychiatry
books – Prozac. Eli Lilly’s Prozac was the first of the SSRI
antidepressants, which rapidly usurped the position of No.1
antidepressant from the tricyclic antidepressants, which had been the
mainstay of psychiatric treatment of depression from the 1950s to the
1990s. I was trained to prescribe tricyclics liberally, in the belief
that these drugs were a good way of getting people off the more
addictive benzodiazepines (or benzos) which were regarded as a big
problem at the time. We were taught that the mechanism of action of
the tricyclics was uncertain, but that they were likely to work by
affecting the neurotransmitter noradrenaline (called norepinephrine in
the USA). This led to the “noradrenaline theory of depression”, which
the drug reps promoted before switching to the “serotonin theory of
depression” with the new class of drugs (which were said to work by
blocking the re-uptake of serotonin at synapses in the brain).
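The claimed mechanism is easy to caricature with a toy model. The
sketch below is illustrative only – invented numbers, not
pharmacology: serotonin is released into the synapse each timestep and
a fraction is taken back up; lowering the reuptake fraction (what an
SSRI is said to do) raises the steady-state level:

    # Toy model of synaptic serotonin (illustrative only).
    def steady_state(release=1.0, reuptake_fraction=0.5, steps=100):
        level = 0.0
        for _ in range(steps):
            level += release                     # release into synapse
            level *= (1.0 - reuptake_fraction)   # reuptake removes a share
        return level

    print(steady_state(reuptake_fraction=0.5))  # without SSRI: ~1.0
    print(steady_state(reuptake_fraction=0.2))  # with an SSRI: ~4.0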
This is where I was at in 1995, when I started trying to make sense of
these conflicting views, all claiming to be “scientific”, and wound up
getting myself diagnosed as insane and locked up for talking too much
about my developing theories.
From my vantage point on the far south-east of the old colonial maps,
it’s difficult to ignore the fact that Australia is not, by any
stretch of the imagination, part of the West. These maps showed
Australia down in the far, lower corner, with only New Zealand further
away from the centre of the Empire. People in England (and the USA)
still describe us as living “down under”. One doesn’t find many
Australians regarding England as being “down under”, although of
course it is. The idea of The West is pervasive though, and important
for very practical reasons – including the diagnosis and treatment of
disease. What I studied at the University of Queensland and Royal
Brisbane Hospital was Western Medicine.
My training in medicine was conventional for the 1980s in Australia,
and was splintered from the outset. We studied anatomy and physiology
as separate subjects, which were disconnected from our studies of
embryology, histology, biochemistry, sociology, psychology and various
specialties of medicine and surgery. These were supposed to somehow
come together when we were confronted with whole human beings with
real problems who sought our medical assistance. We were called the
healing profession, but we did not learn much about health. We did
learn an enormous amount, if splintered, about disease and how to
diagnose it. We also learned the rudiments of how to treat these
diseases with drugs, and when surgery was indicated rather than drugs,
but we learned very little about the natural healing mechanisms of the
body or what we could do to maximise these without resorting to drugs
or surgery.
Over the past twenty years I have been theorising on the healing
mechanisms that were ignored in my training, but that can be explained
in terms of established science. These have been centred on what might
be called the mind-body relationship and a holistic approach to
health, but here I hesitate, because both “mind-body medicine” and
“holistic medicine” are often taken to mean something very different
to my own interpretation of these terms.
When I studied medicine there was already widespread criticism of the
tendency of doctors to view patients in terms of diseased body parts
rather than whole people. This was reinforced by our training. When we
were first allowed on the medical wards we’d be given a list of
patients with “interesting signs” – examine Mr Smith’s liver, listen
to Mrs Brown’s heart (to detect a murmur) or palpate Mr Green’s
enlarged kidneys. It is important that students gain these clinical
skills, but it is easy to lose sight of the human beings who are
enduring the many indignities of being in hospital. This was the
opposite of what I, and many other doctors, regard as a holistic
approach – approaching patients (and I will use this dubious term for
the moment) as a whole rather than dismembered parts.
There was also criticism of the trend towards specialization and
superspecialization, where each speciality had no idea of what was
going on in other specialities. It was, theoretically, the General
Practitioner, or GP who looked after the “primary care” of patients in
the community (the public), coordinating the general healthcare of
each individual in society. This was the theory then, and it is still
the basic paradigm of healthcare in Australia today. The problem is
that GPs are trained first in a splintered view of the body and mind
in university followed by splintered post-graduate studies in the
public and private hospital systems, which are associated with a
splintered network of universities, rival drug companies and assorted
vested interests. Health is big money, and big profits are to be had.
Diagnosing and treating diseases around the world is a multi-billion
dollar industry, in which GPs play a key role. This is why, when I was
working as a GP in Melbourne, the drug companies inundated me with so
much drug-promotional literature, samples and prescribing incentives,
that I started refusing to see reps, except when I was particularly
interested in the drug they were selling. It was only after I had been
working as a GP for a decade that I started realising how splintered
and limited my training at university and the hospitals had been, and how
poorly it had equipped me for the essentially holistic task of healing
rather than indefinitely treating. What was needed was for me to
integrate the splinters of knowledge I had about disease, health and
healing with what I knew about anatomy and physiology. My knowledge in
all these areas was full of big holes, but
I had studied rudiments of all of them at university and gathered
other knowledge through my three years of hospital training (mostly in
paediatrics) and many years in general practice, mostly as family
doctor in Melbourne.
This tendency to splintering is inherent in the medical system,
reflecting a historical division in Western medicine between surgery
and medicine. My degree was termed a bachelor of medicine and surgery,
and from the time of my hospital internship I had the choice of
steering towards either medicine or surgery. If I chose paediatrics, I
could become either a paediatrician or a paediatric surgeon; the two
disciplines were necessarily separate. The knowledge and skills
required for good surgery and orthopaedics are very different to the
knowledge and skills required to be a good physician, though both
medicine and surgery benefit from what are called “bedside manners”.
When I worked in the hospitals in the 1980s surgeons and orthopaedic
surgeons in particular were notorious for having poor bedside manners,
while physicians prided themselves on having good bedside manners as
well as superior “clinical skills”. This was not necessarily the case.
Some of the medical professors had neither knowledge nor clinical
skills, but had climbed the ladder in order of seniority over the
years. Some of the surgeons were not only rude to their patients but
they were also dangerously incompetent by today’s standards. I have
been assured that the situation has changed; maybe it has. I think
doctors are generally more polite than they used to be in the past,
when doctors gave “orders” which patients were expected to “comply
with” if they were to be judged “good patients”. Old habits
die hard, though, and in some areas of medicine such as psychiatry,
authoritarianism is still the order of the day. This is the opposite
of what I mean by holistic medicine, which is focused on the whole
human being – mind, body and (dare I say it?) spirit.
Wikipedia has this to say about Holistic Health:
“The holistic concept in medical practice, which is distinct from the
concept in alternative medicine, upholds that all aspects of people's
needs including psychological, physical and social should be taken
into account and seen as a whole. A 2007 study said the concept was
alive and well in general medicine in Sweden.
Some practitioners of holistic medicine use alternative medicine
exclusively, though sometimes holistic treatment can mean simply that
a physician takes account of all a person's circumstances in giving
treatment. Sometimes when alternative medicine is mixed with
mainstream medicine the result is called "holistic" medicine, though
this is more commonly termed integrative medicine.
According to the American Holistic Medical Association it is believed
that the spiritual element should also be taken into account when
assessing a person's overall well-being.”
Wikipedia then makes the startling claim, sourced from the American
Cancer Society website, that, “Holistic health is a diverse field in which
many techniques and therapies are used. Practitioners of alternative
approaches may include many methods including colon therapy, metabolic
therapy and orthomolecular medicine.”
There is nothing holistic or scientific about colon therapy, metabolic
therapy or orthomolecular medicine, and most of what is passed off as
“holistic” by the plethora of “alternative medicine” practitioners is
not what I mean by holistic. I mean merely that the whole is greater
than the parts that constitute it, when it comes to living organisms
and ecosystems, and that reductionism is of value only when it is
integrated to gain a whole picture. Reductionism is vital for science,
but so is holism. This is the case for ecology, and also the case for
human biology and psychology. We gain much information by looking at
the detail down to the molecular and atomic level, but unless this
information is integrated into a whole we cannot hope to understand
biology, which is intrinsically holistic on many levels. Biology is
also based on individuals; individual bodies with individual minds
which are subjected to the forces of natural selection and artificial
selection, as well as the interventions of intended healers. Some of
these healers are medically trained, others are not. Some are deluded
about their ability to heal. Some are realistic about their
limitations. Some do actually heal. Some heal but only call it
treatment or cure rather than healing. When I trained as a doctor it
was not regarded as acceptable to call oneself a healer. This term was
reserved for quacks and religious cranks.
The definition of Holistic Medicine by the Canadian Holistic Medical
Association is closer to what I mean as holistic medicine and reads as
follows:
“Holistic medicine is a system of health care which fosters a
cooperative relationship among all those involved, leading towards
optimal attainment of the physical, mental emotional, social and
spiritual aspects of health.
It emphasizes the need to look at the whole person, including analysis
of physical, nutritional, environmental, emotional, social, spiritual
and lifestyle values. It encompasses all stated modalities of
diagnosis and treatment including drugs and surgery if no safe
alternative exists. Holistic medicine focuses on education and
responsibility for personal efforts to achieve balance and well being.”
This Canadian website touches on the confusion I have encountered over
the years.
“Other Terms Associated with Holistic Medicine
Alternative Medicine is often used by the general public and some
healthcare practitioners to refer to medical techniques which are not
known or accepted by the majority "conventional" or "allopathic"
medical practitioners (usually M.D.'s). Such techniques could include
non-invasive, non-pharmaceutical techniques such as Medical Herbalism,
Acupuncture, Homeopathy, Reiki, and many others. However, the term
Alternative Medicine can also refer to any experimental drug or non-
drug technique that is not currently accepted by "conventional"
medical practitioners. As non-invasive, non-pharmaceutical techniques
become popular and accepted by large number of "conventional"
practitioners, these techniques will no longer be considered
Alternative Medicine.
Alternative Medicine refers to techniques that are not currently
accepted by "conventional" practitioners, but what is currently
accepted is quickly changing. Even the definition of "conventional
practitioners" is quickly changing. Therefore, techniques that are now
considered part of Alternative Medicine will soon be considered part
of "conventional" medicine. The terms Holistic Healing and Holistic
Medicine are slightly more stable than Alternative Medicine and are
therefore preferable.
Complementary Medicine is often used by "conventional" medical
practitioners to refer to non-invasive, non-pharmaceutical techniques
used as a complement to "conventional" medical treatments such as
drugs and surgery. The term implies that "conventional" medicine is
used as a primary tool and the non-invasive, non-pharmaceutical
techniques are used as a supplement when needed.
In many cases, properly chosen non-invasive and non-pharmaceutical
healing techniques plus properly chosen lifestyle changes can
completely and safely heal both acute and chronic illnesses. In other
cases, "conventional" medicine is only needed in emergencies or when
the safer non-invasive, non-pharmaceutical methods fail. In some cases
"conventional" medicine will be a major part of a Holistic Healing
Plan, but in some cases it is not needed at all.”
According to Wikipedia:
“Natural Healing usually refers to the use of non-invasive and non-
pharmaceuticals techniques to help heal the patient. When most people
use the term Natural Healing, they are usually referring to physical
healing techniques only.”
What is physical healing?
When I checked on Google, the only thing that came up for ‘physical
healing’ was the power of prayer, where the contrast is between
physical healing and spiritual healing (whatever that means). There
are also assorted websites talking about Tibetan Buddhism and mind-
body techniques for self-healing. Again they are talking about healing
the physical body (as opposed to the mind) and not any particularly
physical treatment, such as physiotherapy or massage.
I am interested in understanding natural healing mechanisms in order
to promote natural healing. By this I mean non-interventional
treatments, avoiding the use of drugs or surgery (both of which carry
risks). This seems like sensible clinical medicine – to reassure
patients that they will get better by themselves (or that there is
nothing wrong with them – if indeed there is nothing wrong with them).
There are many things people can do to speed this process of recovery
up, and other things they may do which slow the recovery down, but
even untreated, many maladies are temporary and are healed by the body
in time. It is these healing processes that I am most interested in,
with the related question of what suggestions can be made, in the
course of a consultation, to promote, rather than hinder these natural
healing mechanisms.
In my experience there are several difficulties with promoting a non-
interventional approach in medicine. Patients come to their doctors
for many reasons. Sometimes it is to check whether they are ill in the
first place – what is sometimes called a “check-up”. Sometimes they
already know or think they know what is wrong with them, and have a
clear idea what drug or other treatment they are after (such as a skin
cancer removed, or a wound sutured). At other times they present with
symptoms of which they are uncertain, regarding the cause or
seriousness of their illness (whether it is, to use a common phrase,
“something to worry about”). This is where sound scientific knowledge
and clinical experience is essential. Experience is necessary to
distinguish what is normal from what is abnormal, and to recognise the
difference between normality and health. What is normal, meaning
common or usual, is not the same as what is healthy. Reassurance
without carefully checking for serious disease is irresponsible and
dangerous. On the other hand, instilling pessimism and hopelessness,
or causing unnecessary worry, is known to be detrimental to short-term
and long-term health. A fine balance is necessary, and this balance
should be based on a sound understanding of human biology and
psychology in sickness and health. The medical tradition has been, for
obvious reasons, focused on sickness, hoping that identifying and
treating sickness with drugs and surgery will lead to health (taken to
mean the absence of illness).
The big change in medicine over the three decades since I graduated is
the increased role of so-called preventive medicine, in which risk
factors for various diseases (especially heart disease) are identified
and treated. This is a very costly business for governments, since
cholesterol-lowering and blood-pressure lowering drugs are expensive.
There are also various “early detection” and “screening” programs,
especially for cancer. These programs are all supported by statistical
studies, though questions have been raised about conscious and
unconscious bias in these studies, and the ubiquitous role of the
pharmaceutical industry in driving medical research and publications.
I say so-called because preventive medicine often causes needless
fear, anxiety and expense, which are experienced as stress. Stress in
turn is known to cause or worsen a range of medical problems, as well
as often being a mental problem in itself.
Words are powerful tools. Words can heal, but words can also kill.
They don’t kill straight away except when orders are given to behead,
bomb or shoot someone, but words can be intentionally or
unintentionally harmful to health. In the mouths of trusted and
respected people the power of words and phrases increases. When many
people use particular words or phrases, particular memes, they grow
further in power and influence, for good or ill. Increasing
“awareness” of “ADD”, “schizophrenia” or “Alzheimer’s Disease” does
nothing to promote the health of society, though it may get many
people into experimental drug treatment programs. The same is true of
campaigns to increase awareness of depression, which have been a
bonanza for the drug companies and the sales of Prozac and other SSRI
drugs.
On the other hand early diagnosis of some diseases – what I regard as
real diseases – is important and saves many lives. Breast self-
examination and knowledge of the signs of early breast cancer; knowing
that cardiac pain is usually felt as a dull, heavy or crushing pain in
the chest rather than a sharp pain, and that it need not be severe to
be serious – this is essential knowledge that should be passed on to
patients by their doctors if they haven’t already been taught it at
school or by their parents (which is, in fact, very unlikely, despite
the millions of lives this simple knowledge could save). It is
essential that patients know that
abnormal bleeding from any orifice needs medical attention, though
most don’t need reminding of this, and worry about it automatically
(though they may be reluctant to seek medical advice). I have mixed
feelings about immunization, cholesterol screening and Pap smears,
though I am generally in support of Pap smears done by experienced
doctors who will not over-treat. It is the same with immunizations. I
do believe in immunization as a preventive measure but am concerned at
the increased number of antigen assaults we are subjecting our
children to, and worry about possible long-term chronic risks such as
autoimmune disease and chronic arthritis. I am not concerned about
mercury and other additives in vaccines, or the supposed association
between autism and vaccination; I think the documented increase in
autism diagnosis can be more reasonably attributed to broader criteria
for application of the label, and increased popularity of the
diagnosis.
My reservations about cholesterol screening are also centred on over-
treatment, and the vested interests of drug companies. The statin
drugs are effective in lowering LDL cholesterol and raising HDL
cholesterol and I accept that high LDL is associated with the
development of atherosclerosis and high HDL has a protective effect
when it comes to heart disease. I also remember how, as soon as the
meme that “high cholesterol causes heart disease” caught on in the
1980s, the big chemist stores were offering on-the-spot blood tests to
measure the cholesterol of the customers, with the hope of selling
them products that had been “scientifically proven” to lower
cholesterol (this was before the development and release of the
statins, which do actually do this, as I have found from my own
clinical experience). It was only later that the mainstream started
talking about “good” (HDL) and “bad” (LDL) cholesterol. What was not
promoted by the drug reps who visited me to encourage me to prescribe
their particular brand of statin drug were the non-drug methods of
lowering LDL and raising HDL cholesterol. These were, of course,
behavioural changes – changes in diet and activities.
Much has been written about the dietary changes that can be made to
raise HDL (such as more whole grains and fish) and lower LDL
(reduction of saturated fats and fats generally) with various fads
coming and going. I haven’t kept up with these fads, and have little
interest in diet, though I do recognise that it is very important. It
is one of the gaps in my knowledge that I hope to address in the
fullness of time. In the meantime I am drinking a cup of coffee and
pondering the broader physiology and metabolism of cholesterol and how
it relates to the healing mechanisms I have mentioned, but not
expanded on.
Cholesterol is an essential molecule for the structure of cell
membranes and for the synthesis of steroid hormones, bile acids and
vitamin D.
In researching cholesterol, and reading the Wikipedia entry on this
fascinating molecule, I discovered the existence of the International
Network of Cholesterol Skeptics. Before visiting their site I read
what their enemies had to say about them. I don’t always do this, but
I chose to do so on this occasion. It was on a website called Science-
Based Medicine, which promised to “explore issues and controversies in
science and medicine”. The author, Harriet Hall, doesn’t like the
International Network of Cholesterol Skeptics at all:
“The cholesterol skeptics’ website rejects the consensus of medical
science, saying that consensus is politics, not science. I beg to
differ. Consensus based on opinion is politics. Consensus based on the
evidence is an integral part of the scientific enterprise: when the
evidence is convincing, the majority will be convinced.
“Dissent, debate, and questioning the status quo are important in
keeping science honest, but science doesn’t advance through activist
groups like this. We didn’t need an “X-Ray Skeptics” group to realize
that routine annual chest x-rays were a bad idea; we saw the evidence
and stopped x-raying everybody.
We are still struggling to understand all the ins and outs of
preventing cardiovascular disease. Current guidelines have been
criticized because they are often based on extrapolations and on
insufficient data about actual outcome. A letter to the editor of the
American Family Physician compared guidelines in six different
countries, and found that the American guidelines would save a few
more lives but only at much greater expense: 198 patients treated at
an expense of over $198,000 to prevent one death. I’m sure we’ve
gotten some things wrong, and our present approach will surely be
revised as we continue to learn. But to reject the cholesterol
connection and statins entirely is to throw the baby out with the
bathwater. In my opinion, THINCS is spreading misinformation that
could lead patients to refuse treatment that might prolong their life
or at least prevent heart attacks and strokes.”
THINCS is the thoughtful acronym that The International Network of
Cholesterol Skeptics have chosen for themselves. It has provoked my
curiosity, given what Harriet Hall, a vigorous defender of statin use,
claims are the arguments of the sceptics – that one can lower
cholesterol too much, causing neuronal damage and other health
problems in the elderly, and that these drugs may also cause cancer
with long-term use, in addition to the common side-effect of muscle
pains. Maybe I have been too hasty to comply with the consensus
opinion of doctors rather than the hard scientific data – the evidence.
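Hall’s figure of 198 patients treated, at a cost of over $198,000, to
prevent one death is an instance of ‘number needed to treat’ (NNT)
arithmetic, and it is worth making the calculation explicit. The short
Python sketch below shows how such a figure is derived; the event
rates and the per-patient cost in it are invented placeholders, chosen
only so that the output matches the quoted numbers, and are not data
from the letter she cites.

# A minimal sketch (illustrative, assumed numbers only) of the
# number-needed-to-treat arithmetic behind figures like "198 patients
# treated at over $198,000 to prevent one death".

def number_needed_to_treat(control_event_rate, treated_event_rate):
    # NNT is the reciprocal of the absolute risk reduction (ARR):
    # how many patients must be treated for one of them to avoid
    # the outcome (here, death).
    arr = control_event_rate - treated_event_rate
    return 1.0 / arr

# Hypothetical death rates chosen so that the NNT comes out near 198.
nnt = number_needed_to_treat(0.02505, 0.02000)  # ARR = 0.00505
cost_per_patient = 1000  # assumed treatment cost per patient (dollars)

print(round(nnt))                     # about 198 patients treated
print(round(nnt) * cost_per_patient)  # about $198,000 per death prevented

On such numbers the expense rises steeply as the baseline risk falls,
which is why the same drug can look worthwhile for high-risk patients
and extravagant as primary prevention for the healthy.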
Cochrane Collaboration on cholesterol-lowering drugs:
Steven Novella, writing on the Science-Based Medicine website:
“This recent Cochrane review of statin use for primary prevention
supports the conclusion that statins are safe and effective in
reducing vascular events and overall mortality even in primary
prevention. The benefits are statistically small, which is expected
for a preventive measure in a low risk population. It is still unclear
where to draw the line in terms of which patients should receive
statins, but these data will help practitioners and patients make
individualized decisions about cholesterol management and vascular
prophylaxis.
Because this is ultimately a judgment call, the results of this study
can be spun to a variety of conclusions. The study authors chose to
present an overall negative conclusion – that the effect size is too
small to be worth it. While other experts, looking at the same data,
have come to the opposite conclusion – that statins are worth it. It
is important to emphasize that the debate is not about whether or not
statins have a real effect – they do, but about the cost-benefit of
statins as an intervention for primary prevention.
One could also argue that Cochrane reviewers, given that their purpose
is to provide objective and thorough reviews of existing evidence for
specific clinical questions, should take a more neutral approach to
interpreting the data. This is not the first Cochrane review discussed
on SBM that can be criticized for taking a decidedly biased approach
to the evidence in their conclusions. This should prompt some soul-
searching, in my opinion, on the pat [sic] of the Cochrane
collaboration.”
So who’s behind SBM – Science-Based Medicine?
“The benefits of natural medicine have been well documented. We can
complement or act as an alternative to conventional treatments. The
essential difference is that natural medicines aim to stimulate and
strengthen the body's immune system. So you fight what ails you, by
building your defences. When required we combine our treatment with
counselling, on all levels, helping you to cope with life and giving
you the mental strength to realise your health potential.”
Here is a brief description of the main forms of natural medicine we
offer:
Treatment modalities provided at Brisbane Holistic Health, three
centres of which have opened in the past 20 years:
ACUPUNCTURE: A 2000-year tradition of the Chinese. We stimulate neural
points to treat muscular pain, and various acute and chronic disorders
and diseases.
HOMEOPATHY: A 19th century German orthodox medicine that uses the
vaccination principle to trigger the body's immune system response.
MEDICAL HERBALISM: A treatment that combines traditional herbalism
with modern clinical training and diagnostic skills.
NATUROPATHY: A combination of these and other natural therapies for
more holistic treatment.
LIFESTYLE & DIETARY ADVICE: Treatment involving dietary changes to
boost immunity, together with healthy lifestyle planning to create
general well-being.
The main focus of Brisbane Holistic Health is on massage, spas and
saunas, catering for the rich, and seeking “corporate clients”.
Promoting disease and promoting health are opposites. What about
promoting “disease awareness”? When does promoting disease awareness
become, in practice, promoting disease? There have been many examples
of doctors telling a patient that they would be dead in six months,
but the patient has been alive years later.

One event I remember clearly was the death of our dog Smoky, when I
was 13.
[Photograph: a ‘blue’ beagle puppy]
Charles M. Schulz’s Peanuts cartoons made beagles famous around the
world, but I got to know a delightful beagle personally before I was
introduced to the world of Snoopy, Linus and Charlie Brown. When we
arrived in Kandy from England in 1968, when I was eight, our first pet
was a beagle, a special breed that was said to be “blue”. My uncle
Terence was breeding these hounds as attractive, friendly pets rather
than hunting dogs (they are smaller and have a grey and brown rather
than black and brown coat). We called him Smoky because he was not
what could reasonably be defined as blue. The blue beagle puppy
pictured above seems to have blue eyes, but I remember Smoky’s as
being brown.
I loved Smoky as much as I loved any human, when I was a child living
in Sri Lanka and attending high school in the hill town of Kandy. When
Smoky was run over by a bus, when I was 13, I cried for hours. I could
not stop sobbing; it was uncontrollable. I just lay on my bed and
sobbed for hours. I had never wept like this before, and I have never
wept like it in the four decades since then (despite the many times I
been moved to tears over the years). This was a different level of
grief – I was devastated. Then I got angry. I blamed my new school-
friend who had visited me at home to play for the first time, and whom
Smoky had followed down the hill to the main road, where he had been
run over by the bus. I never spoke to him again. Then I blamed the guy
in charge of it all – God.
Of course I could have blamed myself; maybe deep inside that was
indeed what I was doing, when I projected my anger against my school-
friend. I never thought to blame the bus driver or my dog Smoky;
instead I blamed the poor boy that Smoky followed down the hill. I
thought he should have brought him or led him back, rather than
letting him walk onto the main road (which was about 200 metres down a
winding road that led up to our house, along which Smoky did not
usually venture). I didn’t think about the tragedy rationally; I was
driven by overwhelming emotion.
As far as I can recall, after sobbing for a few hours I was able to
get control of the actual sobbing and recuperate from the trauma of
the sudden loss. In medical school, years later, I learned about
normal phases of grief, including anger and projection. The
experience, the most emotionally devastating event in my life till
then, taught me something about what it feels like to lose someone you
love; for to me Smoky was someone, not something.
I gathered some consolation from the idea that Smoky was in heaven,
since he was clearly an innocent and good-natured dog. It did not
cross my mind that dogs do not have souls; if humans went to heaven
when they died, life in heaven would be empty without dogs, cats,
birds and butterflies. My concept of heaven was of a place much like
earth, but without mosquitoes.
I knew about mosquitoes because I had been bitten by them. When we
first arrived in Sri Lanka from England where I was born and spent my
early years, every bite would erupt into an itchy blister. After a few
months this stopped, and I stopped reacting to the bites so
dramatically, but I developed an early dislike of these little
creatures and absolutely failed to see why God would create such
pests.
When I was eleven my mother showed me how to recognise malaria
parasites in red blood cells under the microscope and I learned
about the life-cycle of the malaria parasite and how mosquitoes spread
the disease. I was told that malaria causes more illness and death
than any other infectious disease, and that it was only with the use
of DDT after the Second World War that Sri Lanka was able to largely
eradicate malaria. This increased my belief that mosquitoes were evil
and they must therefore be the work of the Devil. I had a clear
polarity in my mind between what was good (God) and what was evil (the
Devil).
When I was ten I noted in my journal a problem that had been on my
mind. Maybe someone had asked me; maybe I thought of it myself. “Why
does God not get rid of the Devil?” I asked my mother, as well as the
school priest, Reverend Nallathamby. They gave rather different
answers, which led me in rather different directions. My mother gave
the same answer that she gave to explain the 6 days of creation in the
Book of Genesis – one day for God is many years for humans. This was a
curious answer. Did this mean that God hadn’t noticed yet and was
taking his time to get rid of evil? The school padre gave me the
standard Protestant apologist answer that “man is given free will”.
This did not answer the puzzling problem of mosquitoes – the evil of
mosquitoes is not caused by man applying his free will in an evil way.
What about earthquakes, floods and other natural disasters – what they
call “Acts of God”? Why does God do these horrible acts? Why does God
kill nice dogs like Smoky and spare all the nasty dogs that bite
people?
One of the big gaps in my knowledge is the deliberations and arguments
of Western philosophers. I have never read the writings of the men
(and few women) who are widely regarded as great or influential
philosophers. The Internet is helping me gain a bit more understanding
of what they’ve been arguing about over the centuries, and this book
is my attempt to reconcile what I believe about science, medicine and
health with my limited understanding of philosophical discourse and
jargon. Frankly, I can’t understand their verbal arguments at times
any more than I can understand quantum mathematics.
Evolution and the principles of biology and chemistry are complex, but
I find them much easier to understand. This may be because they are
essentially mechanistic and it is easy to think in a mechanistic way.
It is a natural way to think about things – effects have causes, and
one can logically ascertain at least some of these causes. I can’t
think of anything happening without a cause, and in this sense I am
both a mechanist and a rationalist. At least I am, at this point in
time, though that might change. It might change even while writing this
book.
Let me clarify what I mean by mechanist, but first I’d better check
how the philosophers define mechanism. There’s no better place to
start than Wikipedia:
“Mechanism is the belief that natural wholes (principally living
things) are like complicated machines or artifacts, composed of parts
lacking any intrinsic relationship to each other. Thus, the source of
an apparent thing's activities is not the whole itself, but its parts
or an external influence on the parts. The doctrine of mechanism in
philosophy comes in two different flavors. They are both doctrines of
metaphysics, but they are different in scope and ambitions: the first
is a global doctrine about nature; the second is a local doctrine
about humans and their minds, which is hotly contested. For clarity,
we might distinguish these two doctrines as universal mechanism and
anthropic mechanism.”
Wikipedia continues to say that:
“There is no constant meaning in the history of philosophy for the
word Mechanism. Originally, the term meant that cosmological theory
which ascribes the motion and changes of the world to some external
force. In this view material things are purely passive, while
according to the opposite theory (i. e., Dynamism), they possess
certain internal sources of energy which account for the activity of
each and for its influence on the course of events; These meanings,
however, soon underwent modification. The question as to whether
motion is an inherent property of bodies, or has been communicated to
them by some external agency, was very often ignored. With a large
number of cosmologists the essential feature of Mechanism is the
attempt to reduce all the qualities and activities of bodies to
quantitative realities, i. e. to mass and motion. But a further
modification soon followed. Living bodies, as is well known, present
at first sight certain characteristic properties which have no
counterpart in lifeless matter. Mechanism aims to go beyond these
appearances. It seeks to explain all "vital" phenomena as physical and
chemical facts; whether or not these facts are in turn reducible to
mass and motion becomes a secondary question, although Mechanists are
generally inclined to favour such reduction. The theory opposed to
this biological mechanism is no longer Dynamism, but Vitalism or Neo-
vitalism, which maintains that vital activities cannot be explained,
and never will be explained, by the laws which govern lifeless
matter.”
“One of the first and most famous expositions of universal mechanism
is found in the opening passages of Leviathan by Thomas Hobbes (1651).
What is less frequently appreciated is that René Descartes was a
staunch mechanist, though today, in Philosophy of Mind, he is
remembered for introducing the mind–body problem in terms of dualism
and physicalism.
“Descartes was a substance dualist, and argued that reality was
composed of two radically different types of substance: extended
matter, on the one hand, and immaterial mind, on the other. Descartes
argued that one cannot explain the conscious mind in terms of the
spatial dynamics of mechanistic bits of matter cannoning off each
other. Nevertheless, his understanding of biology was thoroughly
mechanistic in nature:
"I should like you to consider that these functions (including
passion, memory, and imagination) follow from the mere arrangement of
the machine’s organs every bit as naturally as the movements of a
clock or other automaton follow from the arrangement of its counter-
weights and wheels." (Descartes, Treatise on Man, p.108)
His scientific work was based on the traditional mechanistic
understanding that animals and humans are completely mechanistic
automata. Descartes' dualism was motivated by the seeming
impossibility that mechanical dynamics could yield mental experiences.
—"Mechanism" in Catholic Encyclopedia (1913)
Given this definition I already have doubts as to whether I want to be
a mechanist. But is the Catholic Encyclopedia misrepresenting
Descartes and the position of mechanism? What is the alternative to
mechanism – vitalism? Dynamism or neo-vitalism? What do all these isms
mean, and is the only alternative to agree with Descartes that living
organisms are “automatons”?
I have read experts on Descartes saying that he is frequently
misunderstood. One such misunderstanding concerns Descartes’ famous
claim
that the pineal organ in the brain was the seat of the soul – the
point at which the soul communicated with the body. I read some years
ago that though he has often been ridiculed for this apparently absurd
claim, Descartes was clear that the soul could not be localised to any
one part of the body and suffused the whole person. At the same time I
have read many times that Descartes is to blame for “mind-body
dualism” in the West, and that the Eastern traditions were not marred
by such dualism. I cannot read French and cannot be bothered reading
the English translations of Descartes’ works at this stage, but I have
a feeling that the Catholic Encyclopedia is not correct to say that
Descartes thought that humans are “completely mechanistic
automata”. If that were the case there would be no function for the
rational soul, which Descartes obviously believed in.
I haven’t read any of Aristotle’s work either (nor that of any of the
other great Greek philosophers, other than a smattering of Plato’s
Republic), but I gather, again from the Internet, that the famous
philosopher
theorised that there are three components to the soul – the nutritive,
sensitive and rational souls. Plants have only a nutritive soul,
allowing them to grow and reproduce, animals have a sensitive soul as
well, but only humans have a rational soul (as well as a nutritive
soul and sensitive soul). I can see merit in this view, so I might
take it as a starting point in my investigation into the mind, soul,
consciousness and healing.
What is a rational soul and are dogs rational? Are butterflies
rational, for that matter? Do butterflies have souls? What is a soul,
exactly?
Beagles were originally bred as hunting hounds, mainly by English
gentlemen to hunt rabbits and foxes. They have been engineered to have
a medium size, with long, soft ears, a cute face and a friendly, even
disposition towards humans (including children), making them popular
pets.
Knowledge of breeding animals to develop desired traits was acquired
by various cultures in ancient times, and was further developed in the
Middle Ages. There were Siamese cats, and different Burmese and
Tibetan breeds, and Pekingese dogs in China. In the West there was a
profusion of dog breeds, widely different in size, shape and
temperament. This was known long before Darwin developed his theory of
evolution of species by natural selection. Evolution of different
breeds (with different appearances) using deliberate, artificial
selection was assumed knowledge.
It is only recently that I have realised the extent of the controversy
about evolution, science and religion. In Australia it is not a matter
that
is discussed much. I had assumed that everyone sensible believed in
the basic tenets of evolution by natural selection. Having watched a
number of debates on the matter, and watched with some amusement the
antics of the British biologist Richard Dawkins, I gather that this is
not the case. A surprising number of people don’t believe in
evolution.
These debates have made me realise that evolution is something I am
deeply interested in, but had assumed was quite compatible with
religious beliefs. I myself rejected organized religion when I was in
my teens, but I remained interested in religions and religious
traditions. I regarded the holy books of various religions (none of
which I could understand in their native languages) as based on myths
rather than true history. But what is true history? How true are the
histories of various nations and the history of the English nation and
the English language? Who wrote – who writes – this history?
Of course there are many versions of history, but in my view there is
only one true history – what actually happened. No recollection or
recording made by the few literate men and women of history is exactly
what happened, but with improved technology we are getting closer and
closer to developing methods of recording reality for posterity. The
development of photography was one such achievement, and I am a great
fan of the art and science of photography. Photography has transformed
science, from studies of the cosmos to studies of the tiny organelles
in cells. Film and video recording transformed it again.
The Internet, though it was devised perhaps more with control and
capitalism in mind than with emancipation, is one tool that has truly
revolutionized knowledge. I am a true believer in the benefits of the
Internet,
because I remember how difficult self-directed research was before the
Age of the Internet. Though it is often criticised, I am also a big
admirer of Wikipedia. I think Wikipedia is a great place to start when
investigating a topic, but should never be where you stop if you want
to know the truth.
This is where things get difficult. How do you know if what Wikipedia
says is true? That, of course, depends on the quality of the
references, which can be followed easily enough. But how reliable are
the references? That depends on the subject, and I have found wild
disparities in how accurate the Wikipedia entries are in areas in
which I have some knowledge. Wikipedia reflects ‘mainstream’ thinking.
Fortunately, mainstream thinking in science is mostly reasonable,
though mainstream thinking in politics and religion is not. Mainstream
thinking in history depends on whether it is the mainstream in India,
Brazil, Russia, the USA, Canada, Nigeria, China, France, Germany,
Indonesia, Sri Lanka or Australia, to some degree, but there are
certain historical facts that people around the world agree on. There
are more facts that academics agree on than the general public does,
though there are key differences in historical perspective between
historians of different nations.
There is general consensus that Christopher Columbus sailed across the
Atlantic in 1492, and about the general accounts of Columbus’s voyages
as recorded at the time. Likewise there is general consensus about the
arrival of Captain James Cook in what he called Botany Bay in 1770, and
that he sailed on the good ship Endeavour. By 1770 the historical
record was quite dependable in some respects, so we can be confident
that the ship’s naturalist was Joseph Banks, who collected biological
specimens to take back to London, where they would be studied at the
Natural History Museum and by experts at Cambridge and Oxford. This was
the distinctly European way of studying nature. Shoot it or catch it
in a net, kill it as painlessly as possible, and collect its carcass
to study in a large room full of other such carcasses, before
dissecting the specimen and examining every part of it carefully under
the microscope. All the great naturalists were collectors who had
their own personal collections of butterflies, fossils, shells,
beetles and other treasures. The Victorian fascination with collecting
pretty things extended to pretty insects, especially butterflies, and
to
stuffed tropical birds. New Guinea’s gorgeous birds of paradise and
birdwing butterflies were favourites, resulting in subsequent laws to
protect these species. Other favourites were the brightly coloured
day-flying moths of Madagascar, which found their way into glass
display cabinets designed for the fashionable walls of Victorian
gentlemen’s homes.
These displays of butterflies fell out of favour with a changing
aesthetic and morality about displaying dead animals as decorations.
In Victorian times it was fashionable for naturalists to go big game
hunting along with the other gentlemen and display the heads of wild
animals as trophies. I took a dim view of this, even when I was a
child, but I made an exception when it came to my own trophy
collecting. I did not think of it as trophy collecting, and I’m not
sure that I do now either. Let me explain.
This is a famous passage from the pen of Alfred Russel Wallace, who
collected butterflies in the Malay Peninsula, Indonesia and New Guinea
following his voyage to the Amazon; these travels contributed to his
development, independently of Darwin, of the theory of evolution by
natural selection:
“I had taken about thirty species of butterflies, more than I had
ever captured in a day since leaving the prolific banks of the Amazon,
and among them were many most rare and beautiful insects, hitherto
only known by a few specimens from New Guinea…I had the good fortune
to capture one of the most magnificent insects the world contains, the
great bird-winged butterfly, Ornithoptera poseidon. I trembled in
excitement as I saw it coming majestically towards me, and could
hardly believe I had really succeeded in my stroke till I had taken it
out of the net and was gazing, lost in admiration, at the velvet black
and brilliant green of its wings, seven inches across, its golden
body, and crimson breast. It is true that I had seen similar insects
in cabinets at home, but it is quite another thing to capture such
one’s self – to feel it struggling between one’s fingers, and to gaze
upon its fresh and living beauty, a bright gem shining out amid the
silent gloom of a dark and tangled forest.”
I can imagine Wallace’s delight. I felt the same excitement when I
caught a rare, beautiful butterfly as a child. Unlike Wallace I
outgrew it, and can see the moral absurdity of waxing lyrical about
the beauty and wonder of a beautiful creature you are about to kill,
for no reason other than that it was unfortunate enough to appeal to
your aesthetic sense. Though I see this now as a moral absurdity, this
was indeed my mentality when I collected butterflies, shells and
birds, under the impression that what I was doing was scientific,
because it was based on scientific identification and classification
of the animals. At the same time, my collections, and the observations
I made when collecting butterflies, do have some scientific value.
I only saw a TV cartoon of Snoopy after we came to Australia in 1976,
when I was fifteen. During the seven years we lived in Sri Lanka there
was no TV, and a restriction on foreign “luxury goods” like records
and books. There were some English books for sale in the few English
language bookshops in Sri Lanka, of which there was only one in Kandy.
Kandy, once the capital of the Hill Kingdom, and the last to stand
before Britain conquered the whole of Sri Lanka (then Ceylon), was
surrounded by forest-covered mountains, and others that had been
cleared for cultivation. My favourite place in the world was
Udawattekele (which translates from Sinhalese as ‘upper garden
forest’), where I was allowed, from the age of 13, to venture alone,
armed only with my butterfly net and binoculars. Closer to
home, and during our trips around the island, I also took my air-gun.
My twin obsessions were hunting butterflies for my butterfly
collection (with my net) and hunting birds for my feather collection
(and attempts at taxidermy). I had started with butterflies when I was
eleven (when the whole family was involved in the hobby) but had
become increasingly interested in killing birds and shooting for the
sheer pleasure of hitting and killing the target. I even practiced by
shooting butterflies on the wing, provided I was sure that it was a
“common species”. I had no real comprehension that what was a common
species for me in Sri Lanka could be a very rare and treasured species
(or non-existent species) for butterfly collectors in other parts of
the world. I wasn’t much interested in other parts of the world. My
obsession was to catch as many different species as I could, identify
them and pin them out. This process had a ritual, which became almost
automatic. I would stalk the butterfly after seeking a likely
environment for butterflies, and use various attack strokes with my
net (which I made myself, with the help of my mother). I learned,
through trial and error, different strokes of the net depending on the
particular flight patterns of different families and species, and
whether the insect was on the wing, or settled on a leaf or the
ground. Different situations required different approaches, since, as
I discovered, butterflies have good eyesight.
Adding butterflies to my collection required me to kill them. My
father’s friend Joyce, who introduced the family to butterfly
collecting, assured us that “squeezing the thorax of the insect firmly
kills it without much pain”. I assumed this to be the case until I
started thinking about it. The more I hunted butterflies with my
modified hunter-gatherer instincts, the more evidence I observed that
the creatures I was hunting and killing were both sentient and likely
to feel pain, though with enormous differences from our own sentience
and experience of physical and mental pain.
The research my parents were conducting into the fluoride content of
drinking water enabled me to go butterfly hunting all over the island,
especially in the northern and eastern dry zones, where we usually
stayed at irrigation bungalows that had been established during
colonial times, when the British had started renovating the ancient
irrigation works that had been built under the patronage of Sri Lankan
kings in ancient times. I never learned much about the history of
these kings or the rice-centred civilizations they ruled over, but I
did discover, from the personal experience of hunting bird and
butterflies, that the fauna varies according to geography. Some
butterflies could only be found in the northern Jaffna peninsula,
where they were common, but not found in the south. Many more were
found in the south and east but not in the north. When we visited
Horton Plains, in the highest mountain range, I collected Red Admirals
(Vanessa atalanta), which are found in temperate Europe and North
America but whose range extends to the mountains (but only the
mountains and not the coast) of Sri Lanka.
My reference text for identifying butterflies and determining how rare
they were, which was my foolish preoccupation, was The Butterfly Fauna
of Ceylon by L.G.O. Woodhouse, the former Surveyor-General of Ceylon,
published by the Colombo Apothecaries Company. The second edition of
this unique reference text was published in 1949. The
Butterfly Fauna of Ceylon records information about Sri Lankan
butterflies caught before 1949, and I noticed, when I was collecting
the insects in the 1970s, that some butterflies that were recorded as
“common” were proving to be elusive. One whose name and appearance in
the book caught my imagination was the “Monkey Puzzle”. This is a
photograph of this beautiful butterfly by someone who has adopted a
more humane way of studying butterflies than catching them and
suffocating them by squeezing their thorax.
[Photograph of a Monkey Puzzle butterfly by Krishna Mohan]
Krishna Mohan is a photographer and educator in Moodabidri in the
Karnataka State of India who is leading the way in ethology, though he
does not claim such a title. Ethology is a discipline that had escaped
my attention until very recently. I hadn’t even heard the word, or
that there is a scientific discipline that studies “animal behaviour”.
This is his response to one of the interested readers on his
photography website:
Its method of alighting is interesting – as soon as it lands, it turns
around and waggles its tail filaments; it also sidesteps for a while –
all this is apparently to confuse a predator as to which side is the
head. This is a likely reason that the first naturalists may have
named the species the Monkey Puzzle.
Learnin’ the lingo
“Giddai, Howyagoin?” the boy asked me, a few days after I arrived in
Australia, and had started year 11 at ‘Churchie’, as I found my new
school was fondly called by its students.
I had no idea what he was saying. “What?” I asked.
“Giddai. You know, giddai, giddai…like Hawaii!”
I could see that he thought I was daft, or at best unable to
comprehend English. What I could not understand was Australian, and
the uniquely Australian pronunciation of the old-fashioned English
greeting, “Good Day”. If you say “good day” to an Australian they will
understand it, but they will also recognise you as a foreigner. The
Australian greeting is “giddai”, not “gidday” or “gooday”.
The appropriate response, if one wishes to be recognised as Australian
and not English, is “I’m good”. The English response is, “I’m well”.
There is a difference between being well and being good, which I have
wondered about for many years. Did the Australian language betray its
criminal heritage, where the early colonists were eager to assure
their neighbours that they were good, rather than assure them about
the state of their health?
Pissed and Fell Over (PFO) – pissed: English, drunk; American, angry.
Deadly – The concert was deadly
Mob – where are your mob from?
Gins, lubras and picaninnys
Racist placenames in Queensland
In Queensland's Alton Downs, near Rockhampton, people are debating
whether they should rename Black Gin Creek Road. White Australians
used to call an Aboriginal woman a 'gin', often implying that she was
used for sexual services by white men. Another 'Black Gin Creek Road'
is near Bambaroo, 60kms north-west of Townsville, Queensland. Nigger
Creek near Wondecla, QLD, is yet another example.
The reason that people so ferociously advocate for keeping these
racist beacons of Queensland's past has a lot to do with the fact that
many non-Indigenous Australians do not know what it feels like to be
called a 'gin' or a 'nigger' or a 'coon'.
—Amy McQuire, Aboriginal journalist [22]
http://espace.library.uq.edu.au/view/UQ:179979/Amy_Humphreys_Honours_Thesis_2008.pdf
Amy Humphreys wrote in her 2008 social science honours thesis
Representation of Aboriginal Women and their Sexuality that:
“colonists who engaged in prolonged personal and sexual relationships
with Aboriginal women were called a ‘gin-jockey’ or ‘combo’ and were
rejected by white society as morally degraded individuals”.
“In the Queensland Figaro in 1883 one contributor remarked that he:
could, if necessary, tell of, and produce people who know of, hundreds
of instances where white savages have raped young maiden [sic] and
older gins — ay, and boasted of their deeds vain-gloriously (14 April
1883).
“Though asserted in these accounts, rape is not witnessed. Nonetheless
it is still used to highlight sexual exploitation, and more
significantly the low status of Aboriginal women in colonist society,
with some according to Evans describing the rape of Aboriginal women
as ‘gin busting’ (1982:15). Aboriginal women’s sexuality was defined
by their sexual interactions with colonists, whose access to the wider
colonial discourse enabled them to construct and enforce their
representation of sexual liaisons with Aboriginal women. Aboriginal
women however, had no such chance to represent themselves. For
Aboriginal women, their sexuality was degraded and debated within a
colonial Christian discourse that claimed perversion whilst enjoying
the benefits of their dispossession and resultant poverty (Evans
1982:7-12).”
Evans, the source referenced in Amy Humphreys’ thesis, is Queensland
historian Professor Raymond Evans, author of the 1975 essay ‘“And the
Nigger shall disappear”…: Aborigines and Europeans in Colonial
Queensland’, in Exclusion, Exploitation and Extermination: Race
Relations in Colonial Queensland, and of the 1982 article ‘Don’t you
remember Black Alice, Sam Holt? Aboriginal Women in Queensland
History’, in Hecate, 8(2.1):6-21.
Wiktionary: boong
Etymology
First used by soldiers in New Guinea. Suggested sources are:
Malay boong (“brother”),[1]
Indonesian dialectal bung (“brother”)
A New Guinea native language
An Aboriginal Australian language.[2]
Previously the word binghi was used widely in similar fashion to the
present-day use of the term Negro for peoples of African ancestry, see
titles from this booklist and also writings of Xavier Herbert (e.g. in
Capricornia), for example.
1930-35; < Dyangadi (Australian Aboriginal language of Macleay River
valley, E New South Wales) biŋay ‘elder brother’.
Collins dictionary:
binghi (ˈbɪŋiː)
(Australian, offensive, slang) an Aboriginal person
Noun
boong (plural boongs)

(Australia, slang, dated) A native of New Guinea.
(Australia, slang, very pejorative, ethnic slur) An Australian
aboriginal.
Synonyms:
(Asian or dark-skinned person): Fuzzy Wuzzy Angel
(aboriginal): abo, Jacky
1959, Xavier Herbert, Seven Emus, 2003, page 5 — The term boong is
originally Malayan, meaning “brother”, but it doesn't mean anything
like that in Australian usage.
From the Urban Dictionary:
http://www.urbandictionary.com/define.php?term=Boong

Boong (114 up, 93 down)
Boong is a racist and derogatory word used to describe Aboriginal
Australian people.

Contrary to what is already written, 'Boong' is derived from the word
'Boonga-boonga' which is actually an Eastern Australian Aboriginal
word (Eora and Cadigal People for example) that means arse or bum and
was originally used by the Aboriginal people as an insult to the
colonisers.

Many Aborignal languages do not have swear words and, in Aboriginal
Culture, the arse is the most disgusting part of the body. The
Aborignal People would taunt the europeans and call them Boonga's
(whilst flashing their black arses at them 'Braveheart ' movie style).
Once the europeans realised what they were being called they turned it
around and began using the word to describe Aboriginal People.

This is accurate information which was sourced from my personal
lecture notes as well as papers studied at the Bankstown campus of the
University of Western Sydney. Diploma in Indigenous Community Studies,
Issues in Aboirignal Education, Native Title in Australia.
Examples:
"Hahaha look at these silly white boys, their muskets can't shoot as
far as we can throw our spears, they can't get us!" (Then while baring
their arses) "Boonga boonga boonga hahaha!"
"These Gubba's are spreading disease all over the country, nothing but
boonga-boonga's!"
by The ONLY educated one July 16, 2009

Boong (1171 up, 599 down – top definition)
1.Native Australian 2.Person with flat, upturned nose. 3.Person used
as target practice or human shield. 4.Dole bludger. 5.Ingestor of
methylated spirits, bagged wine and insects. 6.Cave dweller. 7.Person
who lives like a bum or is dirty. 8.Semi-evolved being.
Examples:
1.If it weren't for boongs, this country would be richer. 2.Lets throw
another boong on the barbie.
by John Barry November 20, 2004

boong (660 up, 296 down)
A defamitory word used against Australian Aboriginals, refering to
their race.
Origin:
During the 1950's & 60's people would actualy chase Abiriginals off
their propery with 4wd's, & it's reported that 'boong' is the sound
they make when they hit the bull bar.
Example:
Is that all those boongs do? sit in the city & scab cigarettes?
by cheffie0987 September 28, 2005

boong (431 up, 238 down)
derogatory term for a native australian, also known as aboriginal, abo
and coon.
reputation of raiding petrol stations as a cheap way of getting high.
Example:
hey john man, don't sell the metho to that boong, he's just gonna
drink it.
by random February 12, 2005

Boong (377 up, 267 down)
a dark colored australian primate with a flat nose and low forehead,
normally found in the desert or redfern.Diet is known to include 2
stroke oil as well as varoius items found in the trash.The word boong
originates from the sound they make when they bounce off your bumper
or bonnet
Hell wazza that boong has gone and left a dent in my bonnet
by gook hunter October 29, 2006

1988, The Bulletin, Issues 5617-5625, page 121 — They would doubtless
have been amused to learn that in New Guinea, where the term "boong"
originated, it means "brother" and has a kinship with the Indonesian
"bung" and Thursday Island's "binghi".

Let me start with the school I was sent to by my parents, at


considerable expense, in the belief that it was the “best boy’s school
in Brisbane”. This was debatable, even at the time, though the school
had a better academic record than it does nowadays. At the time I knew
no better and assumed, as the school culture insisted, that it was not
just the best school in Brisbane but the best school in Queensland. No
one was grandiose enough to suggest that it was the best school in the
world, but there wasn’t much talk about the rest of the world. There
was little talk of the rest of Queensland, either. Most of the
Australian language that I learned at school was about sport and
science. I didn’t talk much about girls when I was at school, in fact
I don’t remember talking about them at all, though they were certainly
on my mind.
The main things that were on my mind when I first arrived in
Australia, when I was 15, were butterflies. I had been an obsessive
collector of butterflies since I was 10, and had caught about 150
different species from around Sri Lanka, when my parents had taken me
on research trips, during which I added to my growing collection. I
was obsessed with catching new species and identifying them in The
Butterfly Fauna of Ceylon and ascertaining, on the basis of what
Woodhouse had written, how rare or common each was. I wanted to catch
the rare ones.
According to the school’s current (28.5.2015) website:
“Churchie’s rich history and longstanding traditions date back to 1912
when William Perry French Morris founded the School at Toowong, before
establishing it on the present site in East Brisbane in 1918. Canon
Morris based the School’s ethos on the patron saint, St Magnus, a
Viking Earl known for his Viking strength of character and his
qualities as an educated man with a Christian nature.
The School crest reflects the character of the Viking tradition – the
shield and battle axes stand for Viking courage and the axes are
crossed to signify self-sacrifice. Churchie’s core values of
scholastic attainment, personal development, spiritual awareness and
community service build on the characteristics and attributes
displayed by St Magnus.
The School’s Viking tradition is reflected in many aspects of school
life – rowing boats are named after Vikings; architecture represents
Viking icons; and the School’s mascot, Eric, a Viking effigy makes
regular appearances at sporting events. In early days Canon Morris
called on the boys to ‘finish hard’ in all their pursuits and this cry
is often called on today. In Canon Morris’s first address to parents
he stated his aim was to ‘train characters as well as minds’. He
encouraged boys to take part in physical activity as well as their
studies.”
I would have thought that the Vikings are more famous for
slaughtering, raping and pillaging others than for self-sacrifice; and
it’s hard to see how crossed axes signify self-sacrifice either. I’m
not sure that self-sacrifice is such a good thing, anyway. Isn’t it
what all these jihadis think they’re doing when they strap on suicide
bombs and what the Tamil Tigers thought they were doing when they blew
themselves up in the hope of becoming martyrs?
THE AUSTRALIAN OCTOBER 31, 2007 12:00AM:
“Robert Sharwood will be released on November 8 after serving one year
of a 33-month sentence for sexually assaulting and sodomising a boy of
13 in Brisbane 30 years ago.
The Brisbane District Court was told last November that Sharwood, then
a 30-year-old priest, groomed and seduced the boy, picking him up at
the bus stop most school days and having oral and masturbatory sex
with him on 300 occasions. Anglican church officials knew of
Sharwood's pedophilia, but despite that knowledge appointed him
chaplain at the prestigious Brisbane Anglican private boys' school
Churchie from 1985 until 2002.”
The Anglican church had “counselled” the priest, but allowed him to
carry on as a priest and teacher at Churchie – short for Church of
England Grammar School.
Who Invented the Scientific Method?
Martyn Shuttleworth (Freelance writer, British living in Greece)
The question of who invented the scientific method is extremely
difficult to answer, simply because it is difficult to pin down
exactly where it started.
The scientific method evolved over time, with some of history's
greatest and most influential minds adding to and refining the
process.
Whilst many point to Aristotle and the Greek philosophers as the prime
movers behind the development of the scientific method, this is too
much of a leap.
Whilst the Greeks were the first Western civilization to adopt
observation and measurement as part of learning about the world, there
was not enough structure to call it the scientific method.
It is fair to say that Aristotle was the founder of empirical science,
but the development of a scientific process resembling the modern
method was developed by Muslim scholars, during the Golden age of
Islam, and refined by the Enlightenment scientist-philosophers.
The Muslims and the Scientific Method
Muslim scholars, between the 10th and 14th centuries, were the prime
movers behind the development of the scientific method.

They were the first to use experiment and observation as the basis of
science, and many historians regard science as starting during this
period.
Amongst the array of great scholars, al-Haytham (Alhazen, 965-1040 AD)
is regarded as the architect of the scientific method. His scientific
method involved the following stages:
Observation of the natural world
Stating a definite problem
Formulating a robust hypothesis
Testing the hypothesis through experimentation
Assessing and analyzing the results
Interpreting the data and drawing conclusions
Publishing the findings
These steps are very similar to the modern scientific method and they
became the basis of Western science during the Renaissance.
Al-Haytham even insisted upon repeatability and the replication of
results, and other scholars added ideas such as peer review and made
great leaps in understanding the natural world.
Europe and the Renaissance
The question of who invented the scientific method shifts to Europe as
the Renaissance began and the wisdom of the Greeks and Arabs helped
Europe out of the Dark Ages.
Roger Bacon (1214 - 1284) is credited as the first scholar to promote
inductive reasoning as part of the scientific method.
Here, findings from an experiment are generalized to the wider world,
a process used by almost all modern scientists. His version of the
Islamic scientific method involved four major steps, which lie at the
root of our modern method.
Observation
Hypothesis
Experiment
Verification
This process continued with the Enlightenment, with Francis Bacon
(1561 - 1626) and Descartes (1596 - 1650). Francis Bacon continued the
work of his Renaissance namesake, strengthening the inductive process.
His method became:
Empirical Observations
Systematic Experiments
Analyzing Experimental Evidence
Inductive Reasoning
Bacon's inductive method was a way of relating observations to the
universe and natural phenomena through establishing cause and effect.
Descartes broke away from the model of induction and reasoning and
again proposed that deduction was the only way to learn and
understand, harking back to Plato. His method was almost the reverse
of induction:
Establish First Principles
Deductive Reasoning
Interpretation
Mathematical Analysis
Descartes believed that the entire universe was a perfect machine and
that, if you knew the first principles, everything else could be
derived from mathematical proofs.
As an example, he deduced that planets revolved around the sun because
they were floating in a liquid 'ether' filling space!
Newton and the Modern Scientific Method
Any discussion about who invented the scientific method must include
Isaac Newton, as the scientist who refined the process into one that
we use today.
He was the first to realise that scientific discovery needed both
induction and deduction, a revolution in the scientific method that
took science into the modern age.
After Newton
There were many other great thinkers who refined the scientific
method, including Einstein, Russell, Popper and Feyerabend, amongst a
whole host of others.
However, it may no longer be correct to talk of the 'scientific
method,' rather the 'Physics Method' or the 'Psychology Method,'
because each scientific discipline has started to use its own
methodology and terminology.
It is an old quote, but Newton's statement that, 'If I see
further, it is only because I stand upon the shoulder of giants', is
very apt when looking at who invented the scientific method.
All of these great thinkers, and many others beside, had a great
influence upon determining the course of modern science as we know it.
So, when you ask 'Who invented the Scientific Method?', the answer is
no-one, as the scientific method is in a state of constant evolution
and modification.

Sadly, if you were looking for a simple answer that will fit into the
short answer section of a test, you will not find it here!
Wikipedia:
The Islamic Golden Age refers to the period in Islam's history during
the Middle Ages when much of the Muslim world was ruled by various
caliphates, experiencing a scientific, economic, and cultural
flourishing.[1][2][3] This period is traditionally understood to have
begun during the reign of the Abbasid caliph Harun al-Rashid (786 to
809) with the inauguration of the House of Wisdom in Baghdad, where
scholars from various parts of the world sought to translate and
gather all the known world's knowledge into Arabic.[4][5] It is said
to have ended with the collapse of the Abbasid Caliphate with the
Mongol invasions and the Sack of Baghdad in 1258.[6] Several
contemporary scholars, however, place the end of the Islamic Golden
Age to be around the 15th to 16th centuries.[1][2][3]

Starting in the 16th century, the opening of new sea trade routes by
Western European powers to South Asia and the Americas bypassed the
Islamic economies, and led to colonial empires, greatly reducing the
Muslim world's prosperity.
http://en.wikipedia.org/wiki/Scientific_method
The scientific method is a body of techniques for investigating
phenomena, acquiring new knowledge, or correcting and integrating
previous knowledge.[2] To be termed scientific, a method of inquiry is
commonly based on empirical or measurable evidence subject to specific
principles of reasoning.[3] The Oxford English Dictionary defines the
scientific method as "a method or procedure that has characterized
natural science since the 17th century, consisting in systematic
observation, measurement, and experiment, and the formulation,
testing, and modification of hypotheses."[4]

The scientific method is an ongoing process, which usually begins with
observations about the natural world. Human beings are naturally
inquisitive, so they often come up with questions about things they
see or hear and often develop ideas (hypotheses) about why things are
the way they are. The best hypotheses lead to predictions that can be
tested in various ways, including making further observations about
nature. In general, the strongest tests of hypotheses come from
carefully controlled and replicated experiments that gather empirical
data. Depending on how well the tests match the predictions, the
original hypothesis may require refinement, alteration, expansion or
even rejection. If a particular hypothesis becomes very well supported
a general theory may be developed.[1]

Scientific Sages and Certainty

I believe that the sun will rise tomorrow morning. I have faith in
this. I am certain about it. I also realise that the rising of the sun
is something of an illusion. The sun appears to ‘rise’ because of the
rotation of the earth. It rises only relative to my position. While it
is rising to me, here in the Antipodes, it’s setting on the opposite
side of the world.
The famous philosopher of science Karl Popper argued that just
because the sun has risen every day in the past does not mean you can
be certain that the sun will rise again tomorrow. His argument was
that one cannot be 100% certain about anything, just closer to the
truth, as various falsifiable alternative hypotheses are disproved by
observable, measurable evidence. This is the paradigm of science I was
trained to believe in – that science is what scientists do when they
use the “scientific method”, which constitutes developing and then
testing falsifiable hypotheses with “empirical evidence”. But what do
all these terms mean? I must admit I have never really considered them
deeply. I just accepted Popper’s definition of science without even
knowing who Karl Popper was, or why his definition of science was
taught to me at medical school.
Karl Popper (1902-1994) was an Austrian-born philosopher of science,
who had a long and distinguished career at the London School of
Economics and the University of London, where he played a key role in
shaping what scientists think science is in the West.
According to Popper’s definition, scientific theories must be
falsifiable to regard them as ‘scientific’. Thus things, including the
rising of the sun tomorrow, become very likely but never certain.
Whether or not we should trust our belief that the sun will rise
tomorrow, and regard it as certain, on the basis of inductive
reasoning, was questioned by Popper.
In Conjectures and Refutations (1963) the philosopher argued that:
“The sun may have risen again after every past day of which we have
knowledge, but this does not entail that it will rise tomorrow. If
someone says: 'Ah yes, but we can in fact predict the precise time at
which the sun will rise tomorrow from the established laws of physics,
as applied to conditions as we have them at this moment', we can
answer him twice over. First the fact that the laws of physics have
been found to hold good in the past does not logically entail that
they will continue to hold good in the future. Second, the laws of
physics are themselves general statements which are not logically
entailed by the observed instances, however numerous, which are
adduced in their support. So this attempt to justify induction begs
the question by taking the validity of induction for granted. The
whole of our science assumes the regularity of nature – assumes that
the future will be like the past in all those respects in which the
natural laws are taken to operate – yet there is no way in which this
assumption can be secured. It cannot be established by observation,
since we cannot observe future events. And it cannot be established by
logical argument, since from the fact that all past futures have
resembled past pasts it does not follow that all future futures will
resemble future pasts."
Good philosophers raise questions that one hasn’t thought of. I had
assumed that everyone accepted that the sun will rise tomorrow, and
that this is accepted fact. It is a true, correct belief, and it is
reasonable and wise to have faith that the sun will rise. If one did
not believe the sun would rise it would likely lead to irrational,
unreasonable thoughts and actions. It came as a surprise that Karl
Popper, whose sage advice I was trained to respect, argued otherwise.
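
Popper's asymmetry between confirmation and falsification can be put
into a few lines of Python (my own toy illustration; the numbers are
invented):

# Ten thousand confirmations do not prove a universal claim...
sunrises = ["rose"] * 10000               # every recorded morning so far
always_rises = lambda day: day == "rose"  # "the sun always rises"
print(sum(always_rises(d) for d in sunrises), "confirmations - still not proof")

# ...but a single contrary observation settles the matter.
sunrises.append("did not rise")
print("Falsified:", not all(always_rises(d) for d in sunrises))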

Liz Williams published a series of articles in the Guardian in 2012
titled "Karl Popper, the enemy of certainty" in which she explained:
“The search for truth was, Popper considered, the strongest motivation
for scientific discovery. His role was to determine how we can ascribe
truth to the claims made by science, religion and politics…Following
on from Hume and the latter’s rejection of induction, Popper took a
stand against an empiricist view of science, endeavouring to show via
his rejection of verificationism, and consequent espousal of
falsificationism, how scientific theories progress”.
The certainty of physicists in the mathematics of Newton had
apparently been shaken by empirical evidence supporting the relativity
theories of Einstein. Popper saw a clear difference between the
physical sciences and the pseudo-science of psychology, in how
physicists dealt with this new evidence. Williams opines on Popper’s
new perspective, after he became disillusioned with Marxism and
psychoanalysis, both of which he had once considered scientific:
“The scientist should reject theories when they are falsified. For
instance, Einstein’s theories generate hypothetical consequences
which, if shown to be false, would falsify the entire theoretical
structure on which they rest. Psychological theories, however, in
their attempt to explain all forms of human behaviour, can continually
be shored up by subsidiary hypotheses. Exceptions can always be found.
On a Popperian model, psychology resembles magical thinking: if an
expected result does not manifest, explanations can be found which
explain that failure away, and thus the core theory remains intact.
This, Popper considered, is a weak point – the theory cannot be
properly tested if it is inherently unfalsifiable.”
Popper migrated from Austria to New Zealand in 1937, where he was
lecturer in philosophy at Canterbury University College (then part of
the University of New Zealand) in Christchurch. It was here that he
wrote his influential work The Open Society and Its Enemies. His
denunciation of Marxism as "pseudoscientific", along with his similar
denunciation of Freudian (and Adlerian) psychoanalysis, doubtless
contributed to his popularity in Britain and its Western capitalist
allies, such as Australia, where Popper was the only philosopher who
was mentioned during my medical studies. In 1946, after the Second
World War, he moved to England to become reader in logic and
scientific method at the London School of Economics. Three years
later, in 1949, he was appointed professor of logic and scientific
method at the University of London.
While in New Zealand, Popper met the famous Australian
neurophysiologist John Eccles, with whom he speculated on the problem
of free will. In 1977 they co-authored The Self and Its Brain: An
Argument for Interactionism. Free will has been one of the enduring
problems of neuroscience, psychology and philosophy, with some arguing
that free will is an illusion, and others insisting on its importance.
I'm not sure what "interactionism" is, other than that it postulates
mind-body dualism, where the mind and body are separate, interacting
entities. Or something along those lines.
Sir John Eccles was another great scientific mind and acknowledged
modern sage, who won the Nobel Prize in Medicine in 1963 for his role
in the discovery of synaptic transmission in the brain. Eccles had
previously thought that all neuronal conduction was electrical, but
his research with Bernard Katz established the fact that acetylcholine
(ACh) acts as a neurotransmitter, carrying signals across
synapses, the junctions between outgoing axons and incoming dendrites
on nerve cells (neurons). This discovery was fundamental to the modern
development of ‘biological’ psychiatry and neurology.
Rather curiously, Eccles regarded himself and all of us as “spiritual
beings”. In a lecture on the Unity of Conscious Experience in 1965 he
ended with the words:
“The primary data upon which I build everything even as a scientist is
myself as a conscious being and ultimately as a spiritual being. By
virtue of our brains and the tremendous wealth of input from sense
organs and the muscles that our brains control, we receive from this
world and we give to this world as individuals, but each of us in a
sense has a conscious existence as a spiritual being apart from this
world. Let me conclude by stating again that I believe there is a
great unknown mystery in each one of us, in every living person.”
Popper revealed his views on religion only on the condition that his
1969 interview on the subject be released only after his death. He
criticised both organized religion and some forms of atheism which
are "arrogant and ignorant and should be rejected". He sensibly argued
that “the whole thing goes back to myths which, though they may have a
kernel of truth, are untrue”. “Why then” he asks, “should the Jewish
myth be true and the Indian and Egyptian myths not be true?”
Though I don’t know a lot about him, in my view Popper was a sage and
so was Eccles, but even sages are fallible. According to the ancient
Greeks, a sage was a philosopher who attained the wisdom they sought.
In the Greek tradition being a sage was a masculine pursuit, and all
the people I think of as sages are men. It helps if they have long
white beards, but beards are not necessary since I also think of
Confucius, Gautama Siddhartha (the Buddha) and Mahavira (the founder
of the Jain religion) as "sages". I have no idea whether they
wore beards, but I think of them as sages, nevertheless. I think of
the famed Islamic physician Ibn Sina (Avicenna) and the Belgian
anatomist Andreas Vesalius as sages, and Charles Darwin too (who had
the added advantage of a long white beard). I think of sages as wise
old men, in other words. Wise, but fallible, old men.
I recently asked my 82-year-old mother what she thinks "sages" are –
“wise old men” she answered. This may be where I gained the idea. I
asked her why she thought sages were all men, and what wise old women
were called. “Witches, I suppose” was her surprising answer.
Surprising because she doesn’t believe in witches, and surprising
because of the insight she showed into how wise women have been
regarded historically. She added that the reason there had been so
many more prominent male than female intellectuals in history was a
reflection of the relative educational opportunities for women, and
the fact that women have looked after the wise men and their children,
while the men made their names as sages.
I do not think of Jesus of Nazareth or the prophet Mohammed as sages.
Jesus because he did not live long enough to gain the wisdom of years
and Mohammed because the “revelations” he had do not strike me as
particularly wise. Another key difference between Jesus and Mohammed
is that Mohammed founded the Muslim religion as a reinterpretation of
Christianity, which was already the official religion of the Roman
Empire; this was very different to the essentially pacifist,
communalist teachings of Jesus. Mohammed expanded his empire by
personal conquest, while the Christian empires expanded through the
military conquests of the Roman Empire. I do not think of the Emperor
Constantine as a sage, though some might. I’m not too sure about King
Solomon, and am loath to think of Moses as a sage (though both were
said by their followers to be wise and are usually depicted with
beards). Deciding who is and who is not a sage is a subjective
decision – a judgement, through my subjective mental retrospectoscope,
about who I personally consider to have been notably wise in human
history. But I confine myself here to wise old men, though most of
the wise people I have got to know well over the years have not been
men. They have been women.
As my mother wisely observed, wise women weren’t called sages – they
were called “witches”. The wisdom of the wise women was called “folk
wisdom” rather than Scientific Knowledge, which was argued about by
men in male-dominated universities. Witches were treated very
differently to sages. The wise women were wise because they were less
inclined to argue than the cantankerous old men, who wanted to “win
debates” like they had when they were younger men. Old men sometimes
grow bitter and hardened, including old men in universities over the
centuries. Some of these old men built empires over which they could
rule, pushing their cherished theories with all their argumentative
might. Max Planck famously observed that “science advances one funeral
at a time”.
Max Planck was another acknowledged scientific sage, who won the Nobel
Prize in Physics in 1918, for his development of quantum theory.
Quantum theory revolutionized human understanding of atomic and
subatomic processes, just as Albert Einstein’s theory of relativity
revolutionized the understanding of space and time. Most people would
accept that Einstein too was a sage, as well as a “genius”. Not all
sages are regarded as geniuses and many widely acknowledged geniuses
did not live long enough to be regarded as sages. Many geniuses are,
in fact, children – who are given terms like 'prodigy' or 'gifted'.
Geniuses are clever, but they are not necessarily wise. But what is
wisdom, and what is its relationship to knowledge?
The word science is derived from the Latin scientia, meaning
knowledge. Considering all of human knowledge, scientific knowledge
can be regarded only as a part, a small part. There is also historical
knowledge and philosophical knowledge, cultural knowledge and legal
knowledge in addition to religious knowledge and personal (and
interpersonal) knowledge. All these types of knowledge have a bearing
on the acquisition of wisdom, which is as elusive for scientists, I
suspect, as for anyone else.
I’ve been trying to get my head around philosophical jargon, in order
to gain some insights into and understanding of wisdom. According to
Wikipedia, "the Austrian-British philosopher Karl Popper (1902-1994)
is known for his rejection of the classical inductivist views on the
scientific method, in favour of empirical falsification: A theory in
the empirical sciences can never be proven, but it can be falsified,
meaning that it can and should be scrutinised by decisive experiments.
If the outcome of an experiment contradicts the theory, one should
refrain from ad hoc manoeuvres that evade the contradiction merely by
making it less falsifiable. Popper is also known for his opposition to
the classical justificationist account of knowledge which he replaced
with critical rationalism, "the first non-justificational philosophy
of criticism in the history of philosophy."
What does this mean? What do “inductivist” and “justificationist”
really mean? I assume that inductivist means someone who uses
inductive as opposed to deductive reasoning, but surely there is room,
indeed need, for both types of reasoning in scientific thinking? Both
‘inductivist’ and ‘justificationist’ register with red lines on
Microsoft Word spellcheck. I’d never heard of “justificationism” until
I read the Wikipedia entry on Karl Popper. Maybe I, too, am a
justificationist?
According to the Live Science website:
Inductive reasoning is the opposite of deductive reasoning. Inductive
reasoning makes broad generalizations from specific observations. "In
inductive inference, we go from the specific to the general. We make
many observations, discern a pattern, make a generalization, and infer
an explanation or a theory," Wassertheil-Smoller told Live Science.
"In science there is a constant interplay between inductive inference
(based on observations) and deductive inference (based on theory),
until we get closer and closer to the 'truth,' which we can only
approach but not ascertain with complete certainty." Dr. Sylvia
Wassertheil-Smoller is a researcher and professor emerita at Albert
Einstein College of Medicine.
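
The interplay Wassertheil-Smoller describes can be mimicked in a few
lines of Python (an illustrative sketch of mine, not hers; the data
are made up):

# Induction: from specific observations to a general rule.
observations = [(1, 2), (2, 4), (3, 6)]  # (input, output) pairs
rule = lambda x: 2 * x                   # pattern discerned: output is twice input

# Deduction: from the general rule to a specific, testable prediction.
prediction = rule(4)                     # the rule entails that rule(4) == 8
print("Predicted:", prediction)          # a new observation can now test the rule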

Does this mean that complete certainty is never justified?


According to Wikipedia: “the theory of justification is a part of
epistemology that attempts to understand the justification of
propositions and beliefs. Epistemologists are concerned with various
epistemic features of belief, which include the ideas of
justification, warrant, rationality, and probability. Of these four
terms, the term that has been most widely used and discussed by the
early 21st century is "warrant". Loosely speaking, justification is
the reason that someone (properly) holds a belief. When a claim is in
doubt, justification can be used to support the claim and reduce or
remove the doubt. Justification can use empiricism (the evidence of
the senses), authoritative testimony (the appeal to criteria and
authority), or logical deduction.”
Is belief that the sun will rise tomorrow warranted? Is such a belief
rational? Is it just probable or is it certain? From my
unphilosophical perspective, belief that the sun will rise tomorrow
(and the day after that) is very much justified. What’s more, I have
faith that it will be so, and believe that this faith is also
justified.
I am also aware that some of the more questionable beliefs I, and many
other people, hold are held on the basis of “authoritative testimony”.
Over the years I have questioned this authoritative testimony and the
holy books of medicine, of which there are only two that spring to
mind. These are Harrison’s Principles of Internal Medicine, my
father’s professional bible and the Diagnostic and Statistical Manual
of Mental Disorders (DSM), commonly referred to as the “psychiatrist’s
bible”. Maybe one could regard Gray’s Anatomy as the anatomist’s
bible, though I have never heard it referred to as such.
Harrison’s Principles of Internal Medicine is not treated as a bible
by all physicians, though it is widely respected as an authoritative
textbook. To my father, though, it is the absolute authority on any
matter that it covers. He purchases each new edition, so he can “keep
up” with the “progress of medicine”. It is the reference text he
trusts on matters of medicine and health. My father also regards
himself as a Christian, but he has much more faith in “Harrison’s”, as
he calls it, than he does the Christian Bible. I think this relative
faith in the collective opinion of American medical professors is very
much warranted. The contents of Harrison’s Internal Medicine may be
slanted towards the interests of the pharmaceutical industry, but they
are much more reliable than the writings of any religion, and more
likely to lead to healing than prayer to any god of the present or
past. Though fallible, unlike ancient religious texts Harrison’s
Principles of Medicine is updated every couple of years, and there is
clear progress in the knowledge it contains.
The DSM is also updated with new editions, though not as frequently as
Harrison’s Principles of Internal Medicine. The DSM is published by
the American Psychiatric Association (APA) and has been the standard
reference text for the application of labels of “mental disorder” in
the USA, Canada and Australia since the 1950s, when it was first
published. Again, it is not intended to be regarded as a Bible, but it
often is. The rival ICD (International Classification of Diseases),
used in Europe and the UK, is not worshipped quite the way the DSM is,
since it is produced by the World Health Organization and not the APA.
The APA
has many features of a religion, and of a cult. The DSM also updates
its information, but not in the way that Harrison’s Principles of
Internal Medicine does, which modifies its contents on the basis of
new scientific evidence. The notable characteristic of the DSM is that
new editions have more and more disease labels and broader criteria
for their application.
By Popper’s definition the DSM and ICD are not scientific. They do not
contain falsifiable hypotheses. In fact, they do not provide
hypotheses at all. They present, instead, an open-ended system of
classification based on consensus opinion of psychiatrists, influenced
by various lobby groups that campaign for “changes in classification”.
Hence homosexuality stopped being classified as a mental disorder in
1973, while the 'disorder' of 'attention deficit hyperactivity
disorder' (ADHD) entered the manual with the DSM-III-R in 1987 and was
broadened with the publication of the DSM-IV in 1994. Though it has
"statistical" in its title, the DSM does not
provide psychiatrists with any numbers, statistical or otherwise. What
are presented are stereotypes of various “disorders”, with additional
categories for people who do not satisfy the “diagnostic criteria”.
There is a label for everyone, though psychiatrists use their
discretion about who to apply a label to, or whether to apply a label
at all. They are, predictably, less likely to use the same diagnostic
criteria on themselves.
The American physician who edited the first five editions of
Harrison's Principles of Internal Medicine, Tinsley R Harrison (1900-
1978), was another modern sage. The first edition was published in 1950
and it remains, in its current edition, one of the most widely read
and regarded textbooks in medicine. Harrison was a cardiologist, but
his textbook covered all aspects of internal medicine, and was a
monumental work of integrative science. He integrated medical
knowledge from various specialties in one textbook, with a focus on
the clinical diagnosis and clinical course of various diseases, their
pathogenesis and associated pathophysiology and treated and untreated
prognosis. He did this by organizing for experts in the various
specialties to write chapters, which he collected together and edited.
The textbook did not cover certain diseases, though. Notably it did
not say much about diseases and illnesses of the mind and what can be
done about them. It said nothing about how the mind influences the body;
even the well-known placebo effect is not discussed. Mind-body dualism
is deeply embedded in the medical sciences. Psychiatry deals with the
mind, neurology with the brain. This, it has been said, has resulted
in a mindless neurology and a brainless psychiatry.
The great sages of the East had very different perspectives on the
mind to the modern scientific sages of the West. This is seen most
famously in the philosophy of the sage Gautama Siddhartha, who came to
be known as the Buddha or “enlightened one”. The Buddha did not talk
about the brain, but he had a lot to say about the mind. He was
focused on virtuous thinking and conduct, analysing these in
excruciating detail, in developing a method to alleviate suffering.
Suffering, he theorised, was an inevitable part of life, and could
only be permanently eliminated when the long cycle of reincarnation
(samsara) reaches an end (nirvana). The cycle of rebirth, which each
individual has endured since time immemorial, is caused by good and
bad karma (merit) which accrues from previous lives. This was the
previous Hindu belief, which the Buddha reinterpreted, removing the
inequities of caste (which was justified by karma in the Hindu
tradition) and introducing the concept of anatman – meaning no-self or
no-soul (absence of atman or soul in Sanskrit). The concept of anatman
has been debated ever since, since it seems inconsistent with belief
in reincarnation, and is sometimes interpreted as meaning no
unchanging soul or self. The Buddha taught that one can achieve
liberation (moksha) from samsara by ridding oneself of attachment or
craving for this or future lives. It was liberation by extinction. At
the same time, the Buddha’s teaching about how to reduce suffering by
“the Noble Eightfold path” (right view, intention, speech, action,
livelihood, effort, mindfulness and concentration), his promotion of
the virtues of metta (loving kindness), karuna (compassion), muditha
(rejoicing in the joy of others) and upeksa (equanimity) and
identification of greed, hatred and delusion as vices seems like
sound, profound philosophy to me. According to the Buddha:
Metta embraces all beings
Karuna embraces all those who suffer
Muditha embraces the prosperous
Upekkha embraces the good, bad, loved and unloved, pleasant and
unpleasant.
The Buddha didn’t sound as if he was uncertain about things. But then
he wasn’t a scientist, he was a philosopher, and a wise one. He urged,
though, that people maintain scepticism about even his enlightened
words, and judge for themselves on the basis of reason and experience.
His teachings, though, have become religious doctrines – the word
dharma, meaning both Law and Truth – is used to refer to what the sage
is said to have said. He, as far as I’m aware, did ask his followers
to discuss and debate the dharma, not follow it blindly. There is deep
wisdom in Buddhism, part of the wisdom being that the founder of the
religion did not regard himself as infallible, a god or channelling
the word of God.
I have only mentioned deceased sages so far, but there are many sages
and other wise people alive today. There are probably many more wise
women, young men and children than sages (wise old men) on the
planet, but they are not the ones whose talking heads and hands are
most apparent on the miracle of the Internet. Historically, it is
supposedly wise old men who have shaped the dominant religions, as
well as the field of philosophy. They have also dominated science.
One modern sage of the Internet Age who looks and sounds very sage
indeed is the philosopher Daniel Dennett, who is blessed with a long
white beard and a deep voice, as well as an incisive intellect. After
watching some debates featuring Dennett and watching him in friendly
discussion with like-minded atheists, I took some time off to read a
book by the wise man that I bought some years ago, but hadn’t read. It
is titled "Kinds of Minds", and I read it immediately after I had
finished "The Astonishing Hypothesis" by Francis Crick, which again I
bought many years ago but had read only bits of. Both books are about
the "problem" of consciousness, which until now I had not thought of
as much of a problem. I didn't really know what the problem was, let
alone why it is called "the hard problem of consciousness". Now I do –
the hard problem is the question of “how do brains produce
consciousness?”
Crick and Dennett both agree that brains produce minds, but they have
very different approaches to the subject matter, Crick taking a more
reductionist, ‘scientific’ approach and Dennett taking a more
philosophical approach. This is to be expected, since Crick calls
himself a scientist, while Dennett calls himself a philosopher. My
impression is that Dennett’s arguments are just as scientific as
Crick’s, and considerably more modest (as reflected in the respective
titles of their books). Dennett argues that free will is an illusion,
and that consciousness cannot be localised to any part of the brain;
he argues cogently that consciousness exists as a continuum from
simple sentience to the complexity of human consciousness according to
the complexity of brains, which function primarily as "information
processors". Crick argues, in the postscript of The Astonishing
Hypothesis, that Free Will (he uses capitals) does indeed exist, and
“is located in or near the anterior cingulate sulcus” though “in
practice, things are likely to be more complicated”, and “other areas
in the front of the brain may also be involved”. Crick also argues
that “it is better to avoid a precise definition of consciousness
because of the dangers of premature definition”. “Until the problem is
understood much better, any attempt at formal definition is likely to
be either misleading or overly restrictive, or both”, he explains. The
model Crick promotes is based on computer analogies, in which the
individual neurons (nerve cells) provide the circuits through the
network of interconnected nerve fibres (axons and dendrites), forming
neural networks. He presents his "astonishing hypothesis" that the
brain produces the mind, as an alternative hypothesis to dualism:
"It is difficult for many people to accept that what they see is a
symbolic interpretation of the world – it all seems so like 'the real
thing’. But in fact we have no direct knowledge of objects in the
world. Instead, people often prefer to believe that there is a
disembodied soul that, in some utterly mysterious way, does the actual
seeing, helped by the elaborate apparatus of the brain. Such people
are called ‘dualists’ – they believe that matter is one thing and mind
is something completely different. Our Astonishing Hypothesis says, on
the contrary, that this is not the case, that it’s all done by nerve
cells.”

In The Understanding of the Brain (1973), Eccles summarized his
philosophical position on the so-called 'brain-mind problem'. He
announced that he "fully accepted the recent philosophical
achievements of Sir Karl Popper with his concept of three worlds. I
was a dualist, now I am a trialist! Cartesian dualism has become
unfashionable with many people. They embrace monism in order to escape
the enigma of brain-mind interaction with its perplexing problems. But
Sir Karl Popper and I are interactionists, and what is more, trialist
interactionists!" According to Eccles, the three worlds are very
easily defined, and in this classification there is nothing left out:
it encompasses everything that is in existence and in our experience.
All can be classified in one or other of the categories enumerated
under Worlds 1, 2 and 3.
World 1, according to Popper comprises “physical objects and states”
(inorganic matter and energy, biology and artefacts such as tools,
books, works of art and music); World 2 referred to “states of
consciousness” (perception, thinking, emotions, dispositional
intentions, memories, dreams, creative imagination) and World 3 was
“knowledge in the objective sense” (records of intellectual efforts,
philosophical, theological, scientific, historical, literary,
artistic, technological and theoretical systems such as scientific
problems and critical arguments).
The idea of trialism hasn’t caught on. Interestingly, trialism is the
name given to a political movement in the Austro-Hungarian Empire in
the 19th century, which may be where Popper got the word (though not
the concept, which was quite different). In 1986 the British
philosopher John Cottingham, an expert on Rene Descartes, tried
introducing the idea of trialism again, as an extension of Descartes'
dualism (between mind and body), adding the third category of
'sensation'. This hasn't been a successful meme either.
Popper and Eccles' three worlds paralleled a more widely accepted
division – a political one – between First, Second and Third Worlds.
This made their model even more confusing, since 1st, 2nd and 3rd
worlds mean something very different to their World 1, 2 and 3. Popper
and Eccles were proud sons of the 1st World, which was the Cold War
enemy of the 2nd, or Communist World. The 3rd World produced the raw
materials, while the 1st and 2nd Worlds manufactured them into weapons
and sold them back to the 3rd World. While the scientific sages were
pontificating on consciousness and free will, the 1st and 2nd Worlds
were fighting proxy wars in 3rd World nations in which the civilians
of the 3rd World were slaughtered with bombs designed with 1st world
scientific knowhow and built in 1st world factories.
The sage who has most consistently and powerfully opposed 1st world
and American militarism is undoubtedly Noam Chomsky, who was also a
pioneer in the cognitive revolution, when American psychologists
became interested in the mind again, after abandoning it in favour of
measuring ‘behaviour’. The neuroscientist Antonio Damasio credits
Chomsky with beginning the cognitive revolution with a 1957 paper;
from what I have read elsewhere Chomsky’s 1959 review of Verbal
Behavior by B F Skinner was a key publication in this “revolution”.
Damasio says that the cognitive revolution led to a resurgence in
psychological interest in emotions, which had been neglected since the
19th century work of Darwin and William James, and the early 20th
century work of Freud. Apparently the Society for Neuroscience,
founded in 1971, only had its first symposium on emotions in 1995, and
emotion only “caught on” in the 1990s.
While it is true that Skinner and the behaviourists had ruled that
subjective aspects of emotions could not be studied scientifically, it
is not true that there was no focus on emotions during the earlier
parts of the 20th century, before the supposed revolution when
behaviourism was abandoned in favour of more enlightened “cognitive
science”. The anatomist James Papez proposed a neuronal circuit for
emotions (later termed the Papez Circuit) in the 1930s, after
observing the effects of injecting rabies into the brains of cats.
There were studies at the time to identify what parts of the brain
were necessary for consciousness and emotional reactions by surgically
excising bigger and bigger chunks of brain from living cats, and
sectioning their spinal cords at various levels to observe their
behaviour. Prior to this, the physiologist Walter Cannon developed,
after experiments on dogs, the influential doctrines that the
sympathetic branch of the autonomic nervous system is involved in
“fight or flight” while the parasympathetic branch is concerned with
“rest and digest”. Obviously fight and flight are emotional responses
to anger and fear. Cannon coined the term ‘fight or flight’ in 1915;
over the past century it has become one of the most successful memes
in physiology.

From what I understand of the scientific method as espoused by Popper,
the discipline can only disprove, or falsify, theories, and not prove
them. A falsifiable hypothesis that has not been falsified yet stands
as “the best fit” for the observable facts until it is superseded by a
better hypothesis. The classic example is the “progress” from the
mechanistic theories of Newton to Quantum Theory. Einstein’s theories
superseded those of Newton, in the same way that the germ theory of
infectious disease superseded the idea of spontaneous generation (or
punishment by God or the work of the Devil). This is the blurb.
Philosophers seem to love isms. They talk about materialism and
rationalism, nihilism, humanitarianism and functionalism. Popper, who
has been described as the “enemy of certainty” termed his new approach
“critical rationalism”. I don’t know what most of the isms mean – this
is something that philosophers presumably debate about. Then there are
the religions that are termed isms – Hinduism, Judaism, Sikhism,
Jainism and others. Buddhism and Confucianism are regarded by some as
religions and others as philosophies, since they do not concern
themselves with gods or the supernatural. Some of the critics of
science also speak of scientism – science as a religion. Often
scientific opinions are accepted on faith – faith in the
pronouncements and opinions of recognised experts. Most people base
their scientific opinions on what they have been told by others, not
on the basis of their own experimentally tested hypotheses. They have
faith in the system to come up with reasonable hypotheses and test
them. How reasonable is this faith?
I am certain about many things. I am certain, for example of the
existence of the sun, and of the earth and the people who inhabit it.
It seems curious to me that solipsism ever existed as a school of
philosophy. The idea that I am only certain of my own existence and my
own mind is preposterous, as far as I am concerned. I am just as
certain of many other things. I am certain that there is a blue plate
in my kitchen, and that there is a gum tree outside my window (though
you may not be, since from your reasonable perspective I may be
mistaken or lying). I can’t see the plate, but I can see the tree if I
look to the right. Even if I’m not looking at the tree I am certain of
its existence, and of the existence of other trees. In percentage
terms, I’d rate my certainty of these things as 100% or close to 100%.
I am also close to 100% certain that my skin is dark because of the
secretion of melanin by melanocytes and that Vitamin D is synthesised
from cholesterol (though I base this belief on written sources that I
trust, rather than on any experiments I have done, or could do,
myself). Does that make me a “certaintist”?
Though I am reasonably certain, if not 100% confident, that my skin,
eyes and brain contain the pigment melanin, I am less certain about
what melanin is, what its functions are, how and why it evolved, or
its chemical composition. I do have theories about all these things,
but I regard them as hypotheses, rather than beliefs or “knowledge”.
The difference between belief, knowledge and delusional conviction
was brought home to me in a recent phone conversation with an elderly
gentleman who rang me to seek my advice about his 41-year-old
daughter, who had been in and out of mental hospitals since her
twenties, having initially become psychotic (I was told) after a
week-long fast, when she drank only water. She had been diagnosed with
schizophrenia and was on a particularly toxic antipsychotic drug,
clozapine. The old man was convinced his daughter became “psychotic”
whenever she smoked even tobacco cigarettes, and wanted to know what I
thought about this. When I questioned him about it he admitted that he
used to go on fasts “for religious reasons” before his daughter did
the same. I recognised him to be suffering from religious delusions
himself when he asked me if I thought she might be ingesting a
“cigarette demon” which was “possessing” her.
I tried reasoning with him, after clarifying that he was talking about
tailor-made tobacco cigarettes from a packet, rather than cannabis. He
said that this is indeed what he was talking about and that no one
believed him, and that he knew there was no scientific evidence to
back what he was saying. I said, “I realise you believe that the
cigarettes are sending her mad, but..” He interrupted me, “I don’t
believe, I know. I’ve seen it with my own eyes.” Convincing this
gentleman, over the phone, that what he thought he saw with his own
eyes was not true was not easy. I’m not sure how successful I was,
though I suggested some alternative explanations, such as that it was
only when she got angry enough to defy his rule that she not smoke
that she went and got some cigarettes, which he was interpreting as
insanity. He doubted this, and asked me if I believed in demonic
possession. It was his daughter who was in the mental hospital and not
him – in fact he had “guardianship” of her, under mental health laws.
I gained the impression that the most important thing this woman
needed was to be able to take control of her own life, and gain some
independence from the domination of her father. It may be that she,
too, was deluded; if so, I had identified a likely source of some of
her delusions.

The Not-so-astonishing Hypothesis of Dr Crick

The Astonishing Hypothesis, subtitled The Scientific Search for the
Soul, was published in 1994, and was the culmination of Francis Crick's
investigations into the visual system (mainly of monkeys, and mainly
of the neurons of the cortex) in support of what he regarded as an
“astonishing hypothesis” – that “you, your joys and your sorrows, your
memories and your ambitions, your sense of personal identity and free
will, are in fact no more than the behaviour of a vast assembly of
nerve cells and their associated molecules". Crick then progresses in
his reductionist model to the ridiculously reductionist assertion
that, "As Lewis Carroll's Alice might have phrased it, 'You're nothing
but a pack of neurons'".
Crick says that “this hypothesis is so alien to the ideas of most
people alive today that it can truly be called astonishing”, hence the
title of his book. The title of his book and Crick’s “astonishing
hypothesis” amused many who found nothing surprising, let alone
astonishing, about his hypothesis. It has always been a core
assumption of neuroscientists that the mind is produced by the brain
and the behaviour of the cells in it. It is plain wrong to assume that
all the brain's activity is confined to the neurones, which form a
minority of the cells in the nervous system; most of the rest are
glial cells, whose function is less well understood. It is also not true
that you are only your brain or only your mind. You are, obviously,
your body as well.
Crick’s hypothesis is not, in other words, original. He is restating
an old assumption, that the mind is produced by the brain. I have
assumed this to be the case for as long as I can remember. When I
mentioned Dr Crick's "astonishing hypothesis" to my 80-year-old
mother, she said “isn’t the mind the same thing as the brain?” I was
quick to assure her that they were quite different things, revealing
my own dualism. “The brain is a soft organ inside the skull, while the
mind is your thoughts and thinking” I explained. My mother agreed with
this separation between mind and brain, and this has allowed us to
continue clearer discussions about both. It is obvious that the brain
affects the mind and the mind affects the brain, but they are not the
same. I know I have a mind due to my subjective experience of it, but
I know I have a brain only because science tells me I have one, and I
believe, for many reasons, that this is true. I have seen what I was
told was a CT scan of my brain, but it is possible (though very
unlikely) that it was someone else’s scan and not my own. I would
venture to say, on reflection, that I am 100% certain I have a brain,
and so does everyone I have met who has talked to me or that I have
seen talking. They also have minds, a fact of which I am certain,
though I’m not certain that they all have souls. I’m not sure that I
have a soul either, though it seems like a nice thing to have, in
addition to a mind.
What, I asked my mother, after establishing the difference between
mind and brain, is the soul? She said she hadn’t thought about it. Dr
Crick, who died in 2004 aged 88, never discovered what the soul was
either. My mother thinks, though, that when she dies, her soul
(whatever that is) will live on, and be united with God (whatever that
is) in heaven (wherever that is). Maybe this belief gives her comfort,
but others believe that they may go to hell and face eternal
damnation. Maybe this possibility is in the back of my mother’s mind
but she hasn’t told me about it. It is certainly deeply rooted in
Christian theology.
At the beginning of his introduction to his “astonishing hypothesis”
Crick poses a question and answer, apparently a “Roman Catholic
catechism” which reads as follows:
Q: What is the soul?
A: The soul is a living being without a body, having reason and free
will.
How can there be a living being without a body, with or without free
will or reason? The science I studied assumed that all living things
have bodies, and it is by their bodies that they are classed as
different species. Some of these species have brains, others do not. I
think of consciousness as an emergent property from organized networks
of neurones, and both reasoning and free will to be a product of
consciousness (and therefore activity of the brain, but also activity
in the rest of the body). The brain is part of the body, but there are
important unanswered questions about how the brain affects the rest of
the body (and vice versa). These questions are not explored in The
Astonishing Hypothesis, which has little to say about reason or free
will, either. It is mainly about scientific experiments investigating
visual perception, using the brains of macaque monkeys.
Apart from its misleading title, I found Crick’s book an interesting
read, and I discovered some things I didn’t know about the primate
visual system. I was interested in his theory that consciousness was
the result of reverberating circuits between the cortex and thalamus;
this (like the less-than-astonishing hypothesis itself) is not his own
but the theory of the neuroscientist Donald Hebb. This was Crick’s
best explanation for the ‘mind-body problem’, as it’s called, the
‘problem’ of dualism – the apparent duality of the brain and
consciousness.

Crick doesn't talk about the soul at all, despite claiming to be
writing about the scientific search for it. He doesn't mention the
Greek word for soul – psyche – or its relationship with the
disciplines of psychology and psychiatry, much less the big differences of
opinion and contradictions in both disciplines (both of which claim
the mantle of being scientific). He doesn’t even discuss language – a
characteristic that is regarded as integral to the Aristotelian idea
that humans have a “rational soul”, while animals only have a
“sensitive soul”. Dr Crick doesn’t mention Aristotle at all, or the
fact that the classical Greek philosopher was the first to explicitly
define the soul of plants, animals and humans in the Western
tradition, on which Dr Crick’s own science is based. Less surprisingly
he doesn’t mention soul music, or the rapture we experience with many
types of music. He doesn’t mention rapture or ecstasy at all (or any
emotional reactions, even those we feel in response to what we see,
though he claims to be starting with the visual system in order to
solve the puzzle of consciousness). Neither does he mention near death
experiences, telepathy or any scientific observations that challenge
the assumption that minds can exist only with brains. His model is
reductionist, though he does refer to neural networks. This model is,
at best, connectionist. There is nothing wrong with connectionism, as
long as it is done with awareness that the whole may be greater than
the sum of its parts, as far as function is concerned. A holistic
perspective is essential in the neurosciences as it is in science
generally.
Crick partly addresses criticisms such as my own at the end of the
book:
“Many of my readers might justifiably complain that what has been
discussed in this book has very little to do with the human soul as
they understand it. Nothing has been said about that most human of
capabilities – language – nor about how we do mathematics, or problem
solving in general. Even for the visual system I have hardly mentioned
visual imagination or our aesthetic responses to pictures, sculpture,
architecture and so on. There is not a word about the real pleasure we
get from interacting with Nature. Topics such as self-awareness,
religious experiences (which can be real enough, even if the customary
explanations of them are false), to say nothing of falling in love,
have been completely ignored. A religious person might aver that what
is most important to him is his relationship with God. What can
science possibly say about that?”
Crick’s justification for these omissions in his model – and it is a
model – is that “such criticisms are perfectly valid at the moment,
but making them in this context would show a lack of appreciation of
the methods of science. Koch and I chose to consider the visual system
because we felt that, of all possible choices, it would yield most
easily to an experimental attack. The book shows clearly that while
such an attack will not be easy, it does appear to have some chance of
success. Our other assumption was that, once the visual system is
fully understood, the more fascinating aspects of the “soul” will be
much easier to study.”
After he published The Astonishing Hypothesis, Dr Crick gave an
interview in which he explained his astonishing hypothesis. This
interesting interview is available on YouTube:
https://www.youtube.com/watch?v=qzs2aAcfOTQ
It is evident that Crick’s model is entirely mechanistic, based on the
belief that the brain can best be likened to a computer, but with
vastly greater capacity for parallel processing: “your brain is a
machine that is processing information”. Crick himself went into the
neurosciences later in life, after achieving fame and the Nobel Prize
in Medicine for his co-discovery of the structure of the DNA molecule.
(An example of the scientific establishment ignoring the contributions
of women can be seen in the work of Rosalind Franklin that led to
Watson and Crick's discoveries.)
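
Crick's computer analogy can be made concrete with the oldest toy
model in the field – a unit with weighted inputs, a threshold and a
binary output, in the style of McCulloch and Pitts (a standard
textbook sketch, not code from Crick's book):

# A McCulloch-Pitts style unit: fires (1) if the weighted input sum
# reaches the threshold, otherwise stays silent (0).
def neuron(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

# Example: a unit that fires only when both inputs are active (logical AND).
print(neuron([1, 1], [0.6, 0.6], 1.0))  # 1: fires
print(neuron([1, 0], [0.6, 0.6], 1.0))  # 0: silent
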
In the 1990s the neurosciences were booming, according to Crick,
though he doesn't say what all these neuroscientists were doing. He
said that there had been an increase from 1000 to 23,000
neuroscientists; it’s hard to see a corresponding increase in factual
knowledge about the brain and mind. Dr Crick’s research methods give
an indication why. The primary method for studying the mind, in Dr
Crick’s paradigm, is to study the effects of electrically stimulating
individual neurones and arrays of neurones in anaesthetised animals.
He hoped the neuroscientists of the future would continue this
line of research using animals with "smooth cortices" like rats and
monkeys, rigged up with arrays of hundreds of electrodes, each
measuring the activity of an individual nerve cell, together with the
development of chemicals that could block various aspects of nerve
transmission. The firing patterns could be watched, he said, on a TV
screen.
Crick believed that consciousness was the result of “reverberating
neural circuits” based on the theories of the Canadian
neuropsychologist Donald Hebb (1904-1985), who is famous for
developing Hebb’s Law that "Neurons that fire together wire together."
Actually what Hebb said was more nuanced: “When an axon of cell A is
near enough to excite cell B and repeatedly or persistently takes part
in firing it, some growth process or metabolic change takes place in
one or both cells such that A's efficiency, as one of the cells firing
B, is increased.” This is now termed “Hebbian learning” and is
considered one factor in the learning process when viewed on a
cellular level.
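
Hebb's rule is simple enough to state as arithmetic: the strength of a
connection grows in proportion to the correlated activity of the two
cells it joins. A minimal sketch (my own illustration, with made-up
numbers):

# Hebbian learning: delta_w = learning_rate * pre_activity * post_activity
learning_rate = 0.1
weight = 0.5  # initial strength of the synapse from cell A to cell B

for pre, post in [(1, 1), (1, 1), (0, 1), (1, 0)]:  # four firing episodes
    weight += learning_rate * pre * post  # grows only when both cells fire
print(round(weight, 2))  # 0.7: the cells that fired together wired together
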
Hebb also studied rats, chimpanzees and humans, during his long
career. He began by instituting behaviour change programs in schools,
moving to studying the effects of raising rats in darkness and
measuring the weights of their brains, before studying the effects of
the removal of parts of the brain in humans, and the psychological
effects of sensory deprivation and brainwashing on university
students. The latter became controversial when it was revealed that
the sensory deprivation experiments were secretly funded by the CIA as
part of the MK Programs, which involved neuroscientists, psychiatrists
and psychologists in Canada, the United Kingdom and the USA, in some of
the most respected academic institutions in the world. The MK programs
also involved such things as “psychic driving” under the effects of
LSD, as described by Dr Stan Grof, who was himself injected with LSD
and placed under bright strobe lights, while wired to an EEG machine
to measure the electrical patterns on his scalp. Grof had an “out of
body experience” which led to his subsequent career researching and
theorising on what he calls “non-ordinary states of consciousness”.
Grof continued studying the effects of LSD after becoming Chief of
Psychiatric Research at the Maryland Psychiatric Research Center at
the University of Maryland in the USA, until LSD research and use were
banned.
Hebb said later that, “The work that we have done at McGill University
began, actually, with the problem of brainwashing. We were not
permitted to say so in the first publishing.... The chief impetus, of
course, was the dismay at the kind of "confessions" being produced at
the Russian Communist trials. "Brainwashing" was a term that came a
little later, applied to Chinese procedures. We did not know what the
Russian procedures were, but it seemed that they were producing some
peculiar changes of attitude. How? One possible factor was perceptual
isolation and we concentrated on that.”
Hebb postulated that consciousness was the result of reverberating
circuits in the brain, and that these circuits are active even when
the subject is asleep or unconscious. Crick looked for these
reverberations, but didn’t find them. He explains that “most of the
experiments have been done on animals under anaesthetic, so it’s not
surprising we haven’t seen the reverberation”.
Crick cheerfully adopts the attitude of Popper that we can’t be
certain about anything. This was picked up by the interviewer who
observes, in response to Crick's inability to define consciousness,
that "Every answer needs to be qualified…there seems to be no simple
way to state a simple and obvious theory about anything”. Crick,
famous for his co-discovery of the structure of DNA, answers, "If you
ask anybody in the
field to define a gene, they have great difficulty finding a
definition. And that’s because we believe it’s all evolved by natural
selection - a lot of molecular gadgetry and so on - you don’t expect
there to be a crisp answer as in Newtonian Mechanics. Biology is very
different from physics in that respect.”
Crick argues that “you have to understand conscious and unconscious
processes in the brain before you get too deeply into free will”,
though he does suggest a specific area in the brain that has been
identified to be important in decision-making (the cingulate gyrus) as
a possible “seat of the Will” that fed into the higher, planning
levels of the motor system. It seems to me that free will and will (or
volition) are not quite the same. The will may seem free, but be the
result of previous learning or instinct. One can decide something
without an accompanying physical action, and one can act without
consciously deciding to move. These decisions and actions are
dependent on consciousness, but Crick doesn’t define consciousness.
Here Antonio Damasio comes to the rescue. Damasio is mentioned by
Crick in his postscript on Free Will, as having “also arrived at the
same idea” that the “seat of the Will had been discovered at or near
the anterior cingulate” after studying a woman with brain damage who
had apparently “lost her Will”. Unlike Crick and others who are
evasive about what consciousness is, Damasio is clear about his
definitions, and therefore easier to understand.
In a 2011 TED talk titled “The Quest to Understand Consciousness”,
Damasio defined consciousness as “that which we lose when we fall into
deep sleep without dreams or go under anaesthesia”. He explained that
“we all woke up this morning with the return of our conscious mind”.
This is easier to handle. It is clear that our consciousness is lost
when we are asleep, but that our brains remain active, especially
during dreaming sleep. Our memory is also active when we are asleep,
and while sound asleep our senses still remain subliminally active, in
that we can be woken from sleep by loud sounds, bright lights or being
touched on our skin. While we are asleep there is also amazing healing
that occurs routinely. Our fatigue from hours of wakefulness (which
deepens the longer we stay awake) disappears during sleep, in a way
more profound than conscious relaxation can achieve. Sleep is vital
for physical and mental health. But what is sleep, and how does sleep
heal?
The TV rarely brings good news. Tonight was different. There is a
conference in Sydney encouraging the reinvigoration of Indigenous
languages in Australia and the Pacific Region. The good news is that
at last, there are moves to teach indigenous languages throughout
Australia, and that the promotion of Aboriginal languages has resulted
in a dramatic reduction in youth suicide, as well as unexpected
lucidity in people who had been diagnosed with mental illness.
Researchers have found that even languages that had supposedly “died
out” a hundred years ago are being spoken again in everyday
conversation.
The bad news is that of 250 known Aboriginal languages, 110 are
endangered.
I didn’t learn much about the sun at university, other than that it
was by far the commonest cause of skin cancer, of which melanoma – a
cancer of pigmented melanocytes - is the most dangerous. Melanocytes
are so named because they contain the dark pigment melanin, which
protects against damage from the rays of the sun. Queensland, where I
studied medicine, is said to be one of the “melanoma capitals” of the
world. The promotion of skin creams to protect against the “harmful
rays of the sun”, which we are told are called “ultraviolet” (not to
be confused with ultra-violent), is standard public health policy in
Australia, despite the fact that the original inhabitants of Australia
have no need for such creams (and neither do I, my children, or the
other dark-skinned inhabitants of multicultural Australia). We also
learned that sunlight is important for the healthy development of
bones, due to its effect in stimulating vitamin D production in the
skin, and that lack of sunlight causes rickets, characterised by weak
bones that bend in children, resulting in bowed legs and other
deformities. Rickets is known to develop more easily in dark-skinned
children brought up in temperate regions where there is less sunlight.
How can I be certain that there is such a thing as melanin, or that it
is produced by melanocytes, or that there is such a thing as vitamin D
and that it is also produced in the skin under the influence of
radiation from the sun? The answer is that I have faith in science and
the scientists who made the relevant discoveries. But how justifiable
is this trust in the scientific endeavour and the scientific
establishment as a whole? Is there a difference between the science of
the East and the science of the West, or is it only the science of the
West that constitutes Real Science? Is there such a thing as
“mainstream science” versus “alternative science”, or is alternative
science an oxymoron? I will consider these questions later; first I
will consider the question “what is belief?” When can belief
reasonably be regarded as certainty, and when can belief reasonably be
regarded as knowledge? Where does probability come into the equation?
How true are texts, and how much can various authorities be trusted,
regardless of whether they wear the mantle of scientist or holy man?
Is scientific thinking incompatible with religion, and to what degree
has science become a religion?
The Four Horsemen of the Non-Apocalypse
The “Four Horsemen of the Non-Apocalypse”, as they have been called –
the biologist Richard Dawkins, the philosopher Daniel Dennett, the
neuroscientist Sam Harris and the journalist Christopher Hitchens,
together with allies such as the physicist Lawrence Krauss – have been
leading the charge on the side of “rational” science against
“irrational” religion. They maintain that science is self-critical,
and that scientists are generally prepared to discard cherished
hypotheses when confronted by evidence that disproves them, while
religions preach that blind faith is a virtue in itself. The Four
Horsemen of the Non-Apocalypse are the prophets of the New Atheism,
and have many fans. They focus, though, on the Abrahamic religions:
Judaism, Christianity and Islam. I have never heard them mention
Buddhism, Confucianism, Zoroastrianism or Hinduism in their attacks on
religious belief and promotion of atheism (though Hitchens and Dawkins
imply that no one sensible believes in polytheism any more). The name
itself was not adopted by these prophets but by their legion of fans,
and is an ironic moniker based on the Apocalypse and Day of Judgement
prophesied in the last book of the Christian Bible – Revelation.
Hitchens, who died in 2011, used to point out that there is nothing
new about atheism, but the arguments of these intellectuals are, to
some extent, new (including their responses to new theological
arguments that have arisen from quantum and cosmological discoveries).
Some critics have accused the New Atheism of being a religion itself,
and its proponents of attacking other religions with a religious
zealotry, in proselytising their particular brand of atheism and
“scientism”. Others have pointed out that the leaders of the New
Atheism are all male and all white, and provide a white, male
perspective.
The British philosopher John Gray, himself an atheist, wrote in the
Guardian in 2008 that:
“Zealous atheism renews some of the worst features of Christianity and
Islam. Just as much as these religions, it is a project of universal
conversion. Evangelical atheists never doubt that human life can be
transformed if everyone accepts their view of things, and they are
certain that one way of living – their own, suitably embellished – is
right for everybody.”
In Gray’s view, “Dawkins, Hitchens and the rest may still believe
that, in the long run, the advance of science will drive religion to
the margins of human life, but this is now an article of faith rather
than a theory based on evidence”.
The famous linguist and political analyst Noam Chomsky has criticised
the New Atheists as being “religious fanatics” who believe in the
State Religion, rather than the other religions, and there is some
truth in this when it comes to Christopher Hitchens, who declared
himself to be an “anti-Theist” in addition to an atheist and “adeist”.
This terminology, used by Hitchens, differentiated between deism and
theism (deism from the Latin deus and theism from the Greek theos,
both words for god; deus, though not theos, shares a root with the
name of the Greek god Zeus). The New Atheists are not all as
vehemently anti-religion as Hitchens was. Richard Dawkins has
described himself as a “spiritual atheist”, making the distinction
between “spirituality” and “religiosity”. One can be spiritual, in
this view, without being religious, and presumably religious without
being spiritual, but what do the words religious and spiritual
actually mean? What constitutes a religion? What is a spirit, and
what is the difference, if any, between spirit and soul? Does atheism
necessarily imply disbelief in spirit and soul? What can science
contribute to an understanding of spirit, soul and mind (if indeed
these are different things)? Does thinking scientifically and
rationally lead inevitably to atheism?
The Four Horsemen of the Non-Apocalypse, all true believers in the
steady progress of science and the falseness of religion, argue that
religion is not only false, but that it is harmful to society. Dawkins
argues that the falsity of religion is itself harmful, and that his
aim is to promote rational thinking, and the scientific method. He
expresses the hope, which he admits may not be realistic, that
humankind will abandon religion in favour of Science.
Dawkins says that, "We are all atheists about most of the gods that
societies have ever believed in. Some of us just go one god further."
Hitchens uses the same argument – since the dawn of civilization
people have worshipped thousands of gods, which are mutually
exclusive. By the laws of probability, Hitchens argues, the likelihood
of one’s chosen god being the true god is less than one in a thousand.
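Put as a worked calculation: on the charitable assumptions that the
candidate gods are mutually exclusive, that at most one of them is
real, and that there is no evidence favouring any particular
candidate, the chance of having picked the right god out of N
candidates is at best

\[
P(\text{chosen god is the true god}) = \frac{1}{N} \le \frac{1}{1000}
\quad \text{for } N \ge 1000.
\]

The weak link, of course, is the equal-likelihood assumption, which is
precisely what most believers would dispute.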
Though they win applause from their many fans, these arguments are
somewhat flawed, and the many religious debaters who have challenged
the Four Horsemen have had difficulty identifying some of these flaws
because of their own beliefs in the Biblical Word. It is
easy to argue against the morality (and factuality) of the holy books
of Judaism, Christianity and Islam, pointing to their support of
slavery and the subjugation of women and children alone. It is not so
easy to argue that all religion is harmful, as the New Atheists
maintain.
Examples of behaviour and teachings that, with a modern understanding
of morality, we can clearly recognise as evil abound in the Old
Testament. King Solomon, famed for his supposed wisdom, advised that
to “spare the rod” was to “spoil the child”. There is the story of
Abraham being prepared to make a human sacrifice of his son, Isaac,
because the “voice of God” or an “angel of God” commanded him to do so
– a story told as a demonstration of his “faith” or “belief”. That
this is both bad and mad is easy to establish on logical and
humanitarian grounds, without recourse to science. Philosophy has more
to say about morality than science does. The book of Numbers has Moses
getting angry that his soldiers had spared the women and children in
their genocide of the Midianites, ordering that the women and all the
male children be killed, but that the virgin girls be spared death, to
be used as slaves. This is obviously evil. Science doesn’t enlighten
us on these matters; philosophy may. But the reason more of us know
not to beat our children is not because of science. It is not because
of religion either. Maybe it is because of natural human morality.
Dawkins sums up the God of the Old Testament as “the most evil
character in fiction”. In The God Delusion he writes:
“The God of the Old Testament is a petty, unjust, unforgiving control
freak, a vindictive, bloodthirsty ethnic cleanser, a misogynistic,
homophobic, racist, infanticidal, genocidal, filicidal, pestilential,
megalomaniacal, sado-masochistic, capricious, malevolent bully.”
Hitchens uses similar terms – cruel, vindictive, murderous,
judgemental and genocidal. I have long agreed with Hitchens and
Dawkins on this assessment of the God of the Old Testament, but I am
not so convinced that religion is at the root of as much evil as they
claim. I am also cognisant of what religion, ritual and even theology
have contributed to human civilization. The world would be a poorer
place without the magnificent architecture of cathedrals, temples and
mosques and the various religious festivals that add colour, music and
even mystery to human culture. Religious sculptures can be extremely
beautiful, even sublime, and date back to the dawn of human
civilization and long into prehistory. At the same time, there has
been much evil done in the name of religion. The inquisitions of the
Catholic Church, the Crusades, and the monstrous behaviour of the
Spanish conquistadores in the Americas are obvious examples. When the
Portuguese, enthused with Catholicism and hatred of idolatry, arrived
in India and Sri Lanka in the 1500s, they wasted no time in reducing
to rubble as many Hindu and Buddhist temples as they could. Monstrous
brutality was committed on “heathens” by the Spanish and Portuguese
soldiers, justified in the name of spreading the Word of Jesus.
Having been baptised and confirmed as an Anglican Christian, and even
winning the “Christianity Prize” for many years running at a church
school in Sri Lanka, I am familiar enough with the contents of the Old
and New Testaments to wonder whether I should choose, as an object of
worship, Yahweh, the jealous and vengeful god of the Jews, or
Sarasvati, the Hindu goddess of knowledge and wisdom. Both are mere
metaphysical concepts to me, but I believe in many metaphysical
concepts – why not gods and goddesses, as long as I know they are
metaphysical concepts and not supernatural beings? Maybe one can
worship Goodness – or God – without any thought of reward. Can one
worship Wisdom, or Truth, or Beauty as metaphysical concepts with or
without agency? Is it reasonable to define what one worships as one’s
god (or gods)?
The “four horsemen of atheism”, as they have been called, are the
British biologist Richard Dawkins, the British-American journalist
Christopher Hitchens, the American philosopher Daniel Dennett and the
American neuroscientist Sam Harris. Their discussion (not debate) on
the evils of religion, hosted by Hitchens at his home, has become a
classic among their many secular fans:
https://www.youtube.com/watch?v=rRLYL1Q9x9g
Hitchens argues that there is nothing new about atheism, and this is
true. Atheism has a long philosophical tradition in both the East and
the West. The New Atheists focus only on the atheism of the West and
never mention, for example, the Cārvāka school, a materialistic and
atheistic school of Indian philosophy that had developed a systematic
philosophy by the 6th century CE. Cārvākas rejected metaphysical
concepts like reincarnation, the afterlife, an extracorporeal soul,
the efficacy of religious rites, other worlds (heaven and hell), fate,
and the accumulation of merit or demerit through the performance of
certain actions. Cārvākas also refused to invoke supernatural causes
to explain natural phenomena.
The ‘four horsemen of atheism’ pit science against religion, which
they argue is harmful and ignorant of the facts. Dawkins is rightly
concerned about fundamentalist Biblical Creationism being taught as an
‘alternative theory’ to evolution by natural selection, but extends
his criticism to religion more generally and decries faith as a reason
for believing anything. It could be said, though, that he has more
faith in science than an objective assessment of this all-too-human
endeavour warrants:
https://www.youtube.com/watch?v=50pq71Bmils
Dawkins courted acrimonious debate when he titled his polemic against
religion “The God Delusion”. Needless to say, believers in God were
incensed. I was not. I was amused and entertained by his arguments.
They confirmed what I already thought – that men created many gods in
their own image, rather than God creating man in His image. Many years
ago I had rejected the idea of a supreme deity that was masculine in
any way. I also agreed with Dawkins that humankind had progressed –
meaning improved – in morality in spite of rather than because of
religion.
Hitchens, a British journalist who migrated to the USA, was famous for
his arguments in favour of atheism and against all religion. He was
particularly damning of the Judaeo-Christian tradition and Islam, but
he regarded all gods as man-made and false. The Christian and Jewish
god of the Old Testament (Yahweh) he saw as tyrannical, cruel and even
genocidal, judging by the accounts in the Bible. He observed that the
story of Jesus of Nazareth was an endorsement of vicarious human
sacrifice, something to be condemned. He made the powerful argument
that it is better to be good for humanistic reasons than to avoid hell
or be rewarded in a life after death. After publishing God is Not
Great – How Religion Poisons Everything he engaged in a series of
debates on the subject:
http://en.wikipedia.org/wiki/God_Is_Not_Great
Hitchens debating the Christian theologian William Lane Craig:
https://www.youtube.com/watch?v=SRWUAQm2MS4
Craig presents three arguments for the existence of God, which he
claims is necessary for any objective system of morality. These are:
The cosmological argument
The teleological argument
The moral argument
In his 2011 debate with Sam Harris, Craig expanded on these arguments
and Harris struggled to counter them. Few scientists have the combined
oratorical and debating skills of William Lane Craig:
https://www.youtube.com/watch?v=IwhgPjPCpL8
According to Wikipedia’s entry on dialectics, debates are won through
a combination of persuading the opponent, proving one’s argument
correct, or proving the opponent’s argument incorrect. By such a
definition there was no clear winner (to me, at least) of the debate
between Sam Harris and William Lane Craig. Neither convinced his
opponent. Though
I agreed with much of what Harris said, I did not think he addressed
the specific arguments for the existence of God that were expounded by
Craig. Harris is perhaps limited by his training as a cognitive
neuroscientist, rather than a physicist, when it comes to cosmological
arguments.
The cosmological expert among the New Atheists is Lawrence Krauss, who
also frequents the debate circuit in support of atheist arguments –
which are generally treated as synonymous with “materialist” ones.
This results in a sharp division between materialism and spiritualism,
though Krauss makes a distinction between spirituality and religion.
In this debate Krauss argues in favour of the proposition that
“science refutes God”:
https://www.youtube.com/watch?v=x0qmr5AYFTg
Krauss and his debating partner, the science writer Michael Shermer,
won the debate, according to the voting of the audience. The vote for
the motion increased from 37% to 50%; the vote against also increased,
but only from 34% to 38% (with the converts coming from the 27% who
were undecided before the debate).
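The arithmetic of the swing, using the rounded percentages just
quoted, is worth setting out:

\[
\Delta_{\text{for}} = 50\% - 37\% = +13, \qquad
\Delta_{\text{against}} = 38\% - 34\% = +4,
\]

so the motion gained roughly three converts for every one gained by
the opposition, and the undecided share fell from 27% to about
100 − 50 − 38 = 12% (the slight mismatch between the 15 points the
undecided lost and the 17 points the two sides gained is presumably
rounding).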
Hitchens argues against all religion, but celebrates what he calls
“the numinous”.
https://www.youtube.com/watch?v=KkxgrrHbKG0
In the following panel discussion on science, faith and religion at a
science festival in the USA, Krauss is accompanied by the British
philosopher Colin McGinn, who professes atheism but adds that certain
things are beyond human understanding; the Vatican astronomer Brother
Guy Consolmagno; and Professor Kenneth Miller, a cell biologist who
believes in both science and Catholicism.
https://www.youtube.com/watch?v=KkxgrrHbKG0
Dawkins and Krauss:
https://www.youtube.com/watch?v=R_TGGBduF_g
Chomsky and Krauss:
https://www.youtube.com/watch?v=Ml1G919Bts0
One of the Anglican doctrines that had me most confused, as a child,
was that of the Trinity. How, I wondered, could the Father and the
Son,
Son, as well as the Holy Spirit all be the same being? I was taught
not to believe in ghosts, but here was the school chaplain talking
about the “Holy Ghost”. At this age I was familiar with the comic
books of Casper the Friendly Ghost. I knew that ghosts were imaginary.
I learned to disregard them as superstitions, though the existence of
ghosts and spirit entities does not necessarily imply the existence of
the supernatural. By the time I was 15, rationality prevailed, and I
grew to regard the miracles of Christianity and all other religions as
superstitions. I don’t believe in miracles, according to Hume’s
definition of miracles as requiring the suspension of the Laws of
Nature. Hence I no longer believe in the resurrection of Jesus of
Nazareth or anyone else, and I regard myself as having been
misinformed when I was taught that the only true miracles ever to have
occurred were those mentioned in the Bible. I once believed these
things, but when I
was a child I thought childish thoughts. These were not childish
thoughts that emerged spontaneously from my childish mind. They were
childish thoughts that were implanted into my mind by adults that I
loved and respected. They themselves had the childish thoughts
introduced to their impressionable minds by their parents when they
were children. It was called a “good, Christian upbringing”.
Science, too, has become a religion to some. This religion is called
the “New Atheism”, and it has its prophets, such as Richard Dawkins,
Daniel Dennett, Sam Harris, Lawrence Krauss and Christopher Hitchens.
I have sympathy for this new religion, the adherents of which are
adamant that they are not religious at all, and that religion is the
enemy of reason.
Noam Chomsky is one of the intellectuals on record as calling the “New
Atheists” (as they have been dubbed) “religious fanatics”. To be fair,
Chomsky also accuses economists of “religious fanaticism” regarding
particular economic doctrines. Chomsky alleges that Hitchens is a
follower of the “State Religion”, akin to the “religion” that the
“market knows best”:
https://www.youtube.com/watch?v=zt9QCAUPPeY
I also believe that the sun consists mostly of hydrogen and helium,
and that the heat of the sun is generated by nuclear fusion that has
been going on for billions of years; that the sun is one of hundreds
of billions of stars in the Milky Way galaxy; and that this is one of
billions of galaxies, which I can see only as faint star-like points
in the night sky. This is
something of a miracle to me, in that I am mystified as to how this
came to be, and how scientists know this, but I have good reasons for
believing it. The reason I believe that what appear to me as stars are
actually galaxies, is that I have basic faith in the integrity of
Western science and the Western scientists of the past, and not
because I have looked at stars through a telescope, though I have seen
photographs of distant galaxies taken through telescopes.
Though I heard the theory many years ago, until recently I was not so
convinced about the Big Bang Theory. To my sceptical mind it sounded
too much like creationism in disguise, and though I accepted that the
distant galaxies showed red shift, indicating that the universe was
expanding, I was not convinced that extrapolating back to a
‘singularity’ 13.8 billion years ago was logical and reasonable. When
scientists were talking about the first seconds of the formation of
the universe, I wasn’t sure that they weren’t deluding themselves. How
on Earth could we mere humans presume to understand the first seconds
of the universe?
My scepticism about the Big Bang has largely evaporated after hearing
the debates posted on YouTube of the physicist Lawrence Krauss, who
argues that the Big Bang is evidence against the biblical account of
creation, though the Catholic pope did, apparently, claim the theory
as evidence of biblical truth when the Big Bang theory was first
proposed. The current Pope Francis has declared that the Big Bang is
fact, but that is not a good reason to believe it to be so. The
alternative Steady State Theory, developed in the 1940s by the famous
British astronomer Fred Hoyle and colleagues, was favoured by many
physicists, some of whom were suspicious that the originator of the
Big Bang theory, Monsignor Georges Lemaître, was a Roman Catholic
priest; it was thought that he was trying to sneak creationism into
science. It was, in fact, Hoyle who is credited with coining the term
Big Bang, reportedly intending it pejoratively and contrasting it with
the Steady State Theory, which he favoured. In my ignorance of the
evidence in favour of the Big Bang,
and without trying to find out the truth about it, I clung to the
discredited Steady State Theory until I watched Professor Krauss
recently. Since reading the Wikipedia entry on the Big Bang theory my
remaining doubts have been resolved.
Krauss is a great educator, and the author of several popular science
books including The Physics of Star Trek and A Universe from Nothing.
Though I have not read these I have watched several debates, lectures
and discussions on YouTube in which he appears. The debates have
centred on the battle between science and religion, raising the issues
of belief and faith. Krauss proudly wears the label of atheist and
scientist, and argues against holding beliefs on the basis of religion
or faith, rather than evidence. He says there is no evidence of a God
or gods and uses cosmological arguments, including the Big Bang (on
which he is an expert) to argue against the Biblical account of
creation, as told in the Book of Genesis, the first book of the
Christian Old Testament.
I was brought up to identify myself as a Christian, though my
ancestors a few generations back were Hindu and Buddhist, and I
rejected all religious belief when I was about 15. I attended Sunday
school when I was a child in England, where we memorised the names of
books of the Old Testament by heart, with the aid of a Haydn melody.
My mother taught me the Lord’s Prayer and read Bible stories to me,
but also took me to the magnificent Natural History Museum in London,
where I gazed in awe at the dinosaur skeletons. She told me that the
dinosaurs lived millions of years before humans evolved, and that they
suddenly became extinct millions of years ago. Though a Christian, my
mother fully accepted Darwin’s theory of evolution, having studied
zoology at University in Ceylon (now Sri Lanka) in the 1950s.
My mother explained away the six-day creation story at the beginning
of Genesis by saying that one day for God might mean millions of years
in human time. I didn’t know enough about Genesis to counter this at
the time, and I accepted her explanation. I wondered how Noah fitted
all the different animals in his ark, but didn’t know enough about
zoology to realise how ridiculous the proposition was, and how clearly
the current distribution of animals in the world refutes the idea that
they all originated from an ark washed up on Mt Ararat in Turkey only
a few thousand years ago. Obviously he didn’t have any kangaroos
hopping on his ark.
It’s funny how otherwise sensible people hang on to such silly ideas
in the back of their minds, without really thinking about them. My now
82-year-old mother, who qualified as a zoologist and has worked as a
medical researcher, only thought seriously about the implausibility of
the story of Noah when I discussed it with her recently. It’s not that
she believed it. She neither believed it nor disbelieved it, though
she knew the myth well enough. She admitted that she’d “never really
thought about it”. When I pointed out how the distribution of mammals
in the modern world conclusively disproves the story of Noah’s Ark,
she had no problem accepting this. There are others who are not so
easily convinced by evidence and logic. They believe in the story of
Noah in every detail as recorded in the Holy Bible.
I recently watched a YouTube video showing some American atheists
visiting the “Creation Museum” in Kentucky, an American state about
which I know little other than that it produced KFC and Colonel
Sanders, who it is rumoured was connected with the KKK (though this
may well be an urban myth). Kentucky has also brought to the world
state-of-the-art dinosaur models, which are depicted alongside models
of Adam and Eve in the Garden of Eden at the Creation Museum. Children
are predictably impressed by the displays, imprinting in their minds
images that reinforce the delusion that the earth is only 6,000 years
old, and the ludicrous idea that humans lived at the same time as
dinosaurs. Hopefully this madness will not spread much further than
Kentucky and, at most, the southern states of the USA. The museum cost
more than $20 million to construct, an indication of the big money
behind the “young earth” lunacy but most of the people in the world –
the vast majority – do not have religious views that conflict so
glaringly with the findings of science.
Though I was brought up to believe in Christianity I was also brought
up to have faith in Western science and Western medicine, studying
science subjects throughout my school years, before studying more
science at the University of Queensland, where I learned also to call
myself a doctor of medicine. I lost my belief in Christianity when I
went through my adolescence, but retained my faith in Western science,
which I called science, but thought of as Science, with a capital S.
Likewise I regarded Western medicine as Medicine with a capital M. I
believed in Science and Medicine, with little insight into what the
alternatives were, or much about the history of Science and Medicine.
Science, with a capital S, includes Western science as a subset.
Science is global, not Western, Eastern, Northern or Southern. Western
science is what I studied, and what is commonly understood as being
“science”, but is only the western component of Science. Likewise
Western medicine is a subset of Medicine (which also includes Eastern
and Indigenous medical knowledge). Nobody talks of southern or
northern medicine or science, but they do speak of Western science and
medicine, as opposed to the scientific and medical models of the East,
of which there are several distinct traditions – notably the Islamic,
Chinese and Indian systems, each with written traditions hundreds or
even thousands of years old, and each of which has developed doctrines
regarded as true, correct, and scientific. The Islamic, Chinese and
Indian scientific models have also yielded different medical models,
with the common aim of improving people’s heath – healing. I learned
nothing of these models when I studied at school and university; in
recent years I have spent more time trying to understand them, and see
how they contribute to our collective knowledge about nature (Science)
and health (with an awareness that Medicine is not quite the same as
Health, and in fact both Western and Eastern medical practices often
have adverse effects on health – individually and collectively).
In Russia, these days, there is much talk of “the West”, which is
regarded as definitely not including Russia. Russians regard
themselves as belonging to the East in the East-West political divide,
which is also reflected in a scientific divide, which was clearly
evident during the Cold War, despite the fact that the Russian
scientist Dmitri Mendeleev gave Science the Periodic Table of
Elements, the basis of all modern chemistry. During the Cold War,
“Eastern Bloc” science was regarded with more scepticism in the West
than the “reputable science” that emanated from the English-speaking
universities of the USA, Canada and the UK. This was true in Australia
too, despite Australia being geographically in the far south-east.
There are obvious historical
reasons for this. The result was that though we learned the Periodic
Table, we weren’t told that Mendeleev was Russian, just as we weren’t
told that the algebra and the Arabic numerals we use in “the West”
came to Europe from the East – and specifically from India via the
great Islamic civilizations of the Middle Ages. When I studied
medicine we learned that William Harvey, an Englishman, had discovered
the heart-lung circulation in the 1600s and Jenner, also an
Englishman, invented vaccination. We weren’t told that these
discoveries were made by Muslim physicians in Persia and Turkey, long
before. We learned how the Englishman James Parkinson had discovered
“Parkinson’s Disease” in the 19th century but never heard mention the
name of the great Persian physician Ibn Sina (Avicenna) whose texts
were at the foundation of all of Western medicine, when he and other
Muslim scholars challenged the doctrines of Galen, the Greek-Roman
physician to the gladiators, whose mistaken ideas about anatomy and
physiology had almost Biblical authority for more than a thousand
years in the West.
Ivan Pavlov was one Russian scientist who was mentioned as an
innovator in psychology; he would have been difficult to ignore
completely due to his contribution to the favoured school of
psychology at the University of Queensland at the time – behaviourism.
The American and British behaviourists based their work on “operant
conditioning” on the “classical studies” of Pavlov in Russia. It is
household knowledge that Pavlov “conditioned” dogs to salivate when a
bell was rung – what was called “classical conditioning”. The Harvard
psychologist Skinner continued Pavlov’s tradition in training pigeons
and children, developing educational and child control programs based
on “positive and negative reinforcement” to encourage “desirable
behaviour” and discourage “undesirable behaviours”. It was assumed
that it was adults who decided what was desirable or undesirable and
administered the rewards or punishments. Behaviour-modification
programs became the rage, and governments in the East and the West
tried various ways to program their citizens to be heterosexual (but
not too much so), patriotic and law-abiding. What this meant in
practice varied between the different states that adopted behaviour
modification programs. In Western schools, there was a general move
away from the physical assault of children that was still common when
I was a schoolboy in Sri Lanka in the 1970s. Instead of ritualised
assault, the behaviourists favoured “time out” as a punishment. This
was a significant improvement on the “cuts” that were given by cane or
ruler in boys’ schools established by the Western Churches throughout
the colonized world.
I remember one teacher, when I was eight, who announced to the class
that he had thought of a way to improve our appalling spelling in
English. I had recently started school at Trinity College in Kandy,
the hill capital of Sri Lanka, after migrating from England. English
was my first language, so it wasn’t that much of a problem, but his
idea was that for every word we got wrong we would get a “cut”. This
meant holding out our hand, while he hit it with the sharp side of a
ruler. This is painful, but it doesn’t actually cut the skin; the
teacher controls the assault such that no laceration occurs. It is an
example of behaviourism without sound humanity, morality or ethics,
though his intent was pure, in that he was just trying to encourage
his students to learn correct spelling by punishing mistakes.
I can’t remember if this teacher actually carried out his threat, but
there were several teachers in Trinity College who just hit students
because they were angry; there were also ritual (and occasionally
public) canings. I know this to have been the case in schools
throughout the British Empire, those run by both the state and the
church, though the problem of child sexual abuse seems to be more a
feature of Church schools (especially boarding schools) than state
schools. Overall, this has been one of the most striking and welcome
changes in the school system in the West – the fact that children are
not beaten into submission. Unfortunately, there are newer, equally
cruel ways of controlling children that have spread from the unholy
alliance between drug companies and the medical profession – labels of
“conduct disorder” and “oppositional defiant disorder” for children
who were previously called “juvenile delinquents” and labels like
“attention deficit disorder” for children who were previously labelled
as having “minimal brain damage” or “feeble-mindedness”.
I wasn’t assaulted by teachers much when I was at school (I was caned
a couple of times) but that was because I was extremely obedient. I
listened attentively in class, even when the teacher was boring (which
was often enough). I raised my hand to ask a question and obeyed the
rules the teacher declared, however unjust they were. I memorised what
I was instructed to memorise and memorised it well enough to win
subject prizes in what I took to be Science as well as religion (which
was exclusively Christianity as proclaimed by the Anglican Church – or
at least its branch in Sri Lanka). In science this meant memorising
mathematical laws and applying them, though I did not understand the
derivation of most of these laws, and memorising various Laws of
Physics, though again I didn’t understand the derivation of these
laws. In religion we studied selected bits of the New Testament,
notably the four Gospels and St Paul’s voyages, described in the Acts
of the Apostles. We were expected to commit to memory the order in
which St Paul visited various towns, beginning with “Antioch in
Syria”, but were not expected to know where, exactly, Syria is, or its
history other than the fact that St Paul started his voyage there.
This is what would be tested in the exam, and I learned to study in
such a way as to maximise my success in exams. This is not a good way
to gain a holistic perspective of the world or a truly scientific one
(which is, I would argue, necessarily holistic).
Dating the Dinosaurs and Deep Time
When I was seven I was given a copy of Burian and Augusta’s classic
Prehistoric Animals, a book that I treasured. This was the antidote to
the story of Noah’s Ark, and it made more sense to me, despite the
fact that I struggled to understand the text.
The reason I loved this book so much was the pictures, which are
exquisite. In 1963, when it was published, it was revolutionary. Never
before had the landscapes and animals of millions of years ago been
brought to life so vividly. This is the Czech artist Zdenek Burian’s
rendition of the Middle Devonian landscape, about 400 million years
ago:
[image: Burian’s Middle Devonian landscape]
In my opinion, Zdenek Burian’s art stands among the greatest human art
of all time. It is not just because of his excellent technique and
aesthetic. It is his creative ability, his ability to bring fossils to
life, and to create realistic landscapes according to the scientific
facts as they were understood at the time. This required intimate
knowledge of the plants as well as the animals, and the technical
ability to draw and paint them. It required close collaboration with
the palaeontologist Josef Augusta and others. He was a pioneer in art
as well as science.
[image: Burian’s Diplodocus]
This painting of Diplodocus, from Prehistoric Animals, is one of close
to 500 prehistoric images painted by Burian between the early 1930s
and 1981. He was a prolific artist, credited with over 15,000 works.
Though I accepted, because my books and my mother told me so, that the
dinosaurs became extinct about 65 million years ago, I did not know
how palaeontologists came to this conclusion. I didn’t realise that
there was a gradual evolution of understanding of the age of the
earth, based on scientific evidence as it unfolded over two centuries.
When the famous British geologist Charles Lyell published Principles
of Geology in 1830 he argued that the earth was at least 300 million
years old. Scientists now estimate the age of the earth as about 4.5
billion years – more than 10 times older – based on radioisotope
dating of the oldest rocks and meteorites on earth. Lyell’s date of
300 million years suggested the age of the earth was much older than
most of his contemporaries thought in the West (though not as old as
was taught by the doctrines of Hinduism and Buddhism in the East).
Lyell proposed that the geological features of the earth (including
the various layers of fossils) had developed gradually by natural
(physical) forces that were still at work, over hundreds of millions
of years (gradualism) while the dominant scientific view in the West
was that of the French naturalist Georges Cuvier (1769-1832) who
thought the earth was only a few million years old at most. Cuvier,
who was also famous for his studies of comparative anatomy, was the
first naturalist to establish, from his studies of the fossil record,
the now widely accepted fact of mass extinctions, which he attributed
to catastrophes, such as floods, volcanoes and earthquakes. These
extinctions, according to Cuvier, were followed by creations of new
forms of life by God. The flood of the Old Testament was the most
recent of these catastrophic events. There developed a debate between
‘catastrophists’ such as Cuvier and the proponents of
‘uniformitarianism’ also known as ‘gradualism’, first proposed by
James Hutton in the late 18th century and popularised by Lyell in
Principles of Geology. Lyell was a close friend and colleague of
Charles Darwin, providing him with a geological framework for the
theory of natural selection, which was also gradualist – evolution,
according to Darwin, occurred slowly and gradually, over millions of
years.
The fact that deeper layers of fossils indicate earlier animals – the
basis of stratigraphy – was established by the Danish scientist (and
later Catholic bishop) Nicholas Steno (1638-1686) who introduced the
law of superposition, the principle of original horizontality and the
principle of lateral continuity in a 1669 work on the fossilization of
organic remains in layers of sediment. Steno logically surmised that
“at the time when any given stratum was being formed, all the matter
resting upon it was fluid, and, therefore, at the time when the lower
stratum was being formed, none of the upper strata existed”. His
principle of original horizontality stated that “strata either
perpendicular to the horizon or inclined to the horizon were at one
time parallel to the horizon”. Steno’s principle of lateral continuity
stated that sediments initially extend laterally in all directions;
his principle of cross-cutting relationships held that “If a body or
discontinuity cuts across a stratum, it must have formed after that
stratum.” Steno’s ideas still form the basis of stratigraphy,
contributing to the development of James Hutton’s theory of infinitely
repeating cycles of seabed deposition, uplifting, erosion, and
submersion.
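Steno’s law of superposition lends itself to a very simple formal
statement: in an undisturbed sequence, depth order is age order. The
toy sketch below (a Python illustration of my own; the layer names are
hypothetical) turns a column of strata, listed from the surface down,
into a relative chronology without any knowledge of absolute dates:

    # A hypothetical stratigraphic column, listed from the surface downward.
    # By Steno's law of superposition, each layer is older than every layer
    # above it (assuming the sequence has not been overturned or disturbed).
    strata_top_down = [
        "alluvium with mammal fossils",
        "sandstone with dinosaur fossils",
        "shale with early fish fossils",
        "limestone with trilobite fossils",
    ]

    # Rank 1 = youngest; deeper layers receive higher (older) ranks.
    for rank, layer in enumerate(strata_top_down, start=1):
        print(f"relative age rank {rank}: {layer}")

This yields only relative ages – exactly the position naturalists were
in before radioisotope dating, described below, supplied absolute
dates.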
I did not grasp, until I was much older, that the geological eras –
the Palaeozoic, Mesozoic and Cenozoic – are so named because of the
types of animals fossilized in various geological strata, rather than
the absolute ages of the rocks, which were unknown at the time.
Palaeozoic, Mesozoic and Cenozoic eras comprise the Phanerozoic eon
(from 542 million years ago to the present) which is named as such
because of the fossils in various rock layers (strata). The geology is
named according to the zoology (hence Palaeozoic – ancient life,
Mesozoic – middle life and Cenozoic – new life). This is why different
sources have given different dates as to when each era began and
ended. Scientists (or rather naturalists) knew that the trilobites,
which were plentiful and diverse before the age of the dinosaurs,
became extinct long before the dinosaurs did because the trilobite
fossils were confined to deeper layers of sedimentary rock. Likewise,
they knew that the mastodons, mammoths and sabre-toothed cats had
become extinct more recently, since the dinosaurs were found in deeper
(and thus older) layers of rock. The first fish fossils are found in
deeper layers still, and these were more ancient than the dinosaurs
(and other land vertebrates). On this basis they created a
chronological sequence of the zoology of deep time of which they were
confident, dividing, say, the Mesozoic era into Triassic, Jurassic and
Cretaceous periods, and the Cenozoic era into older and younger
periods based on the types of fauna and flora preserved in the
fossils, but they were not so confident about the precise lengths of
these periods or of the larger eras and eons. This required the
development of radioisotope dating in the 20th century.
There has been refinement and standardization of terminology over the
years since then, and many changes since I was a child and read for
the first time about the Age of Dinosaurs and the Age of Mammals. By
international convention, the older term Tertiary, referring to the
period from 65 million years to 2.5 million years ago, was abandoned
in 2004. Instead, the Cenozoic has been standardized into 3 periods
and 7 epochs. The present
Holocene epoch (the most recent epoch of the Quaternary period of the
Cenozoic era) began 12,000 years ago and is also known as the Age of
Man. The fact that it is not called the Age of Woman reflects the
history of both science and religion, as well as the history of
philosophy. But I’ll get to that later.
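For reference, that standard subdivision can be laid out as a small
data structure (a Python sketch; the period and epoch names are the
internationally agreed ones):

    # The Cenozoic era: 3 periods containing 7 epochs between them,
    # listed oldest first within each period.
    cenozoic = {
        "Paleogene": ["Paleocene", "Eocene", "Oligocene"],
        "Neogene": ["Miocene", "Pliocene"],
        "Quaternary": ["Pleistocene", "Holocene"],
    }

    assert len(cenozoic) == 3
    assert sum(len(epochs) for epochs in cenozoic.values()) == 7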
Nowadays the Proterozoic–Phanerozoic boundary is precisely dated at
541 (±1) million years ago. The Phanerozoic eon encompasses the
enormous period of time from 541 million years ago to the present, but
the preceding Proterozoic lasted even longer, from 2,500 to 541
million years ago. It used to be thought that complex life began only
during the Cambrian period, the first period of the Phanerozoic eon,
but in the last century there have been amazing discoveries of
pre-Cambrian fossils, such as those of the Ediacara fossil beds in
South Australia, as well as of early Cambrian faunas such as that of
the Burgess Shale in Canada. These indicate that there was an
explosion of multicellular life at the end of the Proterozoic eon,
followed by a catastrophic mass extinction of the Ediacaran fauna
about 541 million years ago, in which only a few phyla survived. These
surviving phyla have given rise to all the species of animal –
vertebrate and invertebrate – on the planet today.
There was another mass extinction (the P-Tr extinction) about 250
million years ago, at the end of the Permian period (the last period
of the Palaeozoic era), in which it is estimated that 95% of all
marine species (including the trilobites) and 70% of terrestrial
vertebrate species became extinct. It is the only known mass
extinction of insects, and is thought to be the worst extinction event
that has occurred in the history of life. The fossil record suggests
that more than 50% of all families and 80% of all genera became
extinct.
cause of this catastrophic extinction is unknown; theories include
meteor impact, environmental change, massive volcanic eruptions in
Siberia and sudden release of methane, by methane-producing microbes,
into the ocean. The fact that the extinction occurred is undisputed.
The P-Tr extinction marks the beginning of the Triassic period of the
Mesozoic era, and the beginning of the age of the dinosaurs, which
fascinated me as a child, as it does children around the world today.
When I was a child, the cause of the extinction of the dinosaurs,
which had, by then, been fairly precisely dated to 65 million years
ago, was one of the great unanswered scientific mysteries.
The K-T boundary refers to the geological layer above which no
dinosaurs are found. K-T stands for Cretaceous-Tertiary, the
Cretaceous period being the third period of the Mesozoic era (the Age
of the Dinosaurs) and the Tertiary period being the old (and now
discarded) name for the first period of the Cenozoic era, heralding
the rise of mammals as the dominant vertebrate species on earth. The
K-T boundary has been radioisotope dated to 66 million years ago. It
is estimated that 75% of the plant and animal species, including the
dinosaurs and plesiosaurs, became extinct over a period of a few
million years (though the time frame of extinction is subject to
debate).
The discovery of iridium deposits at the K-T boundary in the late
1970s by the Nobel Prize-winning physicist Luis Alvarez and his son,
the geologist Walter Alvarez, may have solved this mystery. In 1980
the Alvarez team published evidence of a fine layer of iridium, which
is abundant in asteroids but rare on earth, at a number of sites,
hypothesising that a large meteorite or comet had impacted with the
earth, causing the mass extinction that killed off the dinosaurs. The
finding was challenged, as it should be, but subsequent findings have
supported the Alvarez hypothesis. In the
1990s evidence of a large impact crater called Chicxulub was found off
the coast of Mexico, providing support for the theory. Other
researchers later found that the extinction of the dinosaurs at the
end of the Cretaceous may have occurred rapidly in geological terms,
over thousands of years, rather than millions of years as had
previously been supposed. Others have suggested alternative extinction
causes such as increased volcanism, particularly the massive Deccan
Traps eruptions in India that occurred around the same time, and
climate change. However, on March 4, 2010, a panel of 41 scientists
agreed that the Chicxulub asteroid impact triggered the mass
extinction.
More recently still, evidence has emerged of another catastrophic
extinction only 75,000 years ago, when most of humanity may have been
killed as a consequence of the massive explosion of the Toba
supervolcano on the Indonesian island of Sumatra, following which
there may have been a period of rapid cooling that lasted thousands of
years. It was proposed in the 1990s that the Toba explosion produced
a “genetic bottleneck” in human evolution. This hypothesis is
supported by mitochondrial DNA studies that suggest that today's
humans are all descended from a very small population of between 1,000
and 10,000 breeding pairs that existed about 70,000 years ago. There
is recent evidence that a small human population may have survived the
Toba volcano in Jwalapuram, Southern India. A 2007 paper by Michael
Petraglia and others revealed stone tools found above and below the
layer of 74,000-year-old Toba volcanic dust. The relative precision
with which dates of events such as the Toba explosion can be
determined is due to the modern technology of radioisotope dating.
Radioisotope dating was discovered in the early 1900s. This is a
dating technique that uses the rate of natural radioactive decay of
various elements, such as carbon, potassium or uranium in rocks or
organic material. Radiocarbon dating has transformed archaeology,
while potassium-argon and uranium-lead dating have transformed
palaeontology and geology. Measuring the rate of decay of
radioisotopes enabled “absolute” dating rather than relative dating
based on stratigraphy, for the first time. There are several isotopes
that can be used, suitable for different time frames. Carbon dating
becomes inaccurate for organic material more than 50,000 years old,
but in this case potassium-argon, uranium-lead, or uranium-thorium
dating can be used. Of course these are not really absolute, in that
they have a margin of error, but the degree of confidence with which
we can date the dinosaurs, the oldest rocks on earth and our own human
ancestors has dramatically improved over the past decades. We can be
confident that the earth is about 4.5 billion years old, based on
radioisotope dating of the oldest meteorites and rocks found so far.
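The principle behind all of these techniques is the same exponential
decay law. A minimal sketch in Python (the function name is my own;
the half-life is the standard textbook value for carbon-14) shows how
an age follows from the measured fraction of the parent isotope
remaining:

    import math

    def radiometric_age(fraction_remaining: float, half_life_years: float) -> float:
        # Solve N(t) = N0 * (1/2) ** (t / half_life) for t,
        # given the fraction N(t)/N0 still present in the sample.
        return half_life_years * math.log2(1.0 / fraction_remaining)

    # Example: a sample retaining 25% of its original carbon-14
    # (half-life about 5,730 years) is roughly two half-lives old:
    print(radiometric_age(0.25, 5730))  # ~11,460 years

After eight or nine half-lives so little of the parent isotope remains
that measurement error swamps the signal, which is why radiocarbon
dating fails beyond about 50,000 years and longer-lived isotopes such
as potassium-40 or uranium-238 take over.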
Believing that the dinosaurs existed millions of years ago doesn’t
prove the theory of evolution, of course. It does mean that the world
was not created on Sunday 23 October 4004 BC, the date calculated by
the Irish bishop James Ussher from a careful study of the Bible in the
17th century. Ussher calculated the dates of other biblical events,
concluding that Adam and Eve were driven from the Garden of Eden on
Monday 10 November 4004 BC. The bishop also calculated what he claimed
was the exact date that Noah’s Ark touched down on Mt Ararat.
Though there are some variations in the proposed date, there is a
Christian movement in the southern “Bible belt” of the USA that
insists, like Ussher, that the Earth is only 6,000 years old and that
the dinosaurs died out during the flood that killed all the animals
except those that Noah saved in the ark that God commanded him to
build. God’s plan was to destroy all of sinful mankind, of which Noah
was the only just man; he and his immediate family alone were spared
from a devastating flood that killed all the other people and animals
on the planet. Noah had led these animals, according to the tale I
learned as a child, “two by two” (one male and one female) into a
wooden ark (the dimensions of which he had been given by God). All of
humanity is therefore descended from Noah via
his three sons, Ham, Shem and Japheth. These fundamentalists, who
adopt a literal interpretation of Genesis, are known as “young Earth
creationists”, and comprise a segment of the Christian evangelical
movement. They run schools where the science curriculum includes the
story of Noah’s ark, not as a curious and obviously incorrect tale,
but as a real event in natural history. The Christian evangelical
movement has considerable political clout in the USA and is actively
promoting this nonsense in missions throughout the world, as well as
at prayer events more like rock concerts than the traditional Church
services of the Anglican and Catholic churches.
There have been polls suggesting that as many as 40% of Americans
think the earth is less than 10,000 years old, though these polls have
been questioned, since it seems to depend on how the questions are
worded. Surveys have asked about people’s belief in the age of the
earth, and also about evolution of both humans and other animals.
Americans are more prepared to accept evolution of animals than of
humans. The National Center for Science Education in the USA, which
promotes the teaching of evolution, assessed in 2013, in the light of
earlier claims that 40% of Americans believed in a ‘young earth’ that
was less than 10,000 years old, that “the hard core of young-earth
creationists represents at most one in ten Americans—maybe about 31
million people—with another quarter favouring creationism but not
necessarily committed to a young earth. One or two in ten seem firmly
committed to evolution, and another third leans heavily toward
evolution. About a third of the public in the middle are open to
evolution, but feel strongly that a god or gods must have been
involved somehow, and wind up in different camps depending how a given
poll is worded.”
It seems that people are more prepared to accept evolution if it
doesn’t conflict with their religious beliefs. If the questions are
worded such that they bring into question their belief in God,
Americans are less likely to say they believe in evolution. They show
some reluctance in accepting that we are descended from apes, and are
closely related to chimpanzees, as Charles Darwin proposed in The
Descent of Man, published in 1871. They show even more reluctance to
agree that we are apes, regarding humans as being related to, but
fundamentally different from, chimpanzees, bonobos, gorillas,
orangutans and gibbons (the other living apes). The
leaders of the Anglican and Catholic churches began by opposing
Darwin’s theory when it was first proposed but they have accepted it
for many years, developing theological arguments that reconciled
evolution with God, to the satisfaction of most of their respective
congregations. Even the Jehovah’s Witnesses, who used to go house to
house arguing against evolution, have largely given up on insisting
that the earth is only 6,000 years old and that evolution is a wicked
lie dreamed up by atheists like Charles Darwin. The young-Earth
creationists have a more entrenched delusion, based on their rigid
adherence to the literal word of the Bible.
Most Christians accept the scientific view that the earth is about 4.5
billion years old, and that humans evolved in Africa from ancestors we
share with chimpanzees. This common ancestor lived about 6 million
years ago in equatorial Africa, where our closest primate relatives,
chimpanzees, bonobos and gorillas are confined to this day. Our early
ancestors then migrated to the plains of Africa, evolving through
various species to our present modern Homo sapiens. The fossil
evidence of this evolution, including all the “missing links” one
needs to believe in human evolution from hominid ancestors in Africa,
proves beyond doubt that, as Charles Darwin argued 150 years ago, we
all have common ancestors in Africa.
The discovery of various species of Australopithecus in Ethiopia,
Kenya and Tanzania, beginning with the discoveries of Louis and Mary
Leakey in the 1950s, has indicated some of the early diversity of
hominin evolution, with several species of Australopithecus coexisting
in Africa millions of years ago.
Louis Leakey, though the son of missionaries and a devout Christian,
was convinced that Darwin was right and that humans evolved in Africa.
His discoveries and those of his wife Mary and son Richard proved
beyond reasonable doubt that this is the case. The fossil record
indicates that some hominins became extinct, including some species of
Australopithecus, as well as, probably, the Homo erectus populations
in China and Indonesia. There is currently scientific consensus, based
on DNA evidence, that all of humanity, including the populations of
Asia, Australia and the Americas, are descended from a population of
modern Homo sapiens that left Africa as recently as 100,000 years ago,
and that the earlier Homo erectus populations that had left Africa a
million years earlier (and others, like Homo floresiensis, fossils of
which were found in the Indonesian island of Flores in 2003, and dated
to 38,000 to 13,000 years ago by carbon dating) became extinct. There
is debate about the fate of Neanderthal Man (Homo neanderthalensis) in
Europe, some believing that these people were killed by invading
groups of modern Homo sapiens (Cro-Magnon Man), though there is
evidence, from DNA which has been recovered from Neanderthal remains,
that there was some interbreeding between Neanderthals and modern Homo
sapiens (in Eurasia, where all the Neanderthal remains have been
discovered so far). There is also disagreement about whether
Neanderthals should be regarded as a different subspecies, rather than
species, from our own (in which case they are termed Homo sapiens
neanderthalensis rather than Homo neanderthalensis).
We can be confident that one or other species of Australopithecus,
confined to the African continent, evolved into Homo sapiens. This
occurred over several million years via Homo erectus and maybe Homo
habilis, though the details are still uncertain, partly because the
definition of what constitutes a species is problematic when it comes
to extinct animals (including hominids and hominins). Living creatures
can be defined as separate species if they cannot interbreed and
produce fertile offspring. This is how various butterflies and birds
that look very similar are known to be distinct species. When it comes
to extinct animals, other criteria need to be
used to determine if two fossils are of the same or different genus or
species. There is no clear cut-off point between one species and the
species it evolves into. A corollary of this is that there was no
“first man” or “first woman”. Our ancestors gradually became human
over many, many generations.
When I was a boy, I learned that the famous Peking Man and Java Man
fossils, discovered in the late 19th and early 20th centuries, belonged to the genus
“Pithecanthropus”, rather than our own Homo genus. In anthropology
lectures at university, when I was 17, I learned that these are now
classified as Homo erectus and have been dated to about one million
years ago. The details of how these early hominin fossil discoveries
fit together with subsequent discoveries are becoming clearer as dating
techniques improve. This is one of the most dynamic and exciting areas
of science, and has been for the past half century, since the
discoveries by the Leakey family in east Africa and the discovery of
the structure of DNA in the 1950s. The exact lines of descent are still being worked
out, aided by modern “DNA archaeology” and new fossil discoveries.
In The Origin of Species Darwin provided evidence of evolution as well
as a theory of how different species evolve – the theory of natural
selection. The evidence, more so than the fossil records, came from
his studies of comparative anatomy and behaviour between various
animal species, and variation within a species. He argued that there
is a struggle for life (or struggle for existence), between
“individuals and varieties of the same species” in which the best
adapted individuals survive, passing their particular variations
(inherited traits) to the next generation. The key factor was
reproductive success, rather than mere survival. Those individuals
that had the most surviving, reproducing offspring passed on the
variations (such as a longer or stouter beak in birds) to subsequent
generations. Speciation occurs when populations become isolated, and
different environmental factors (including predators and plants)
determine which variations are selected for by natural selection.
Darwin did not coin the phrase “survival of the fittest” (it was
coined by Herbert Spencer after reading The Origin of Species), but he
agreed with it, and used it himself in the 5th edition of The Origin
of Species (published in 1869).
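The logic of this mechanism can be made concrete with a toy simulation. The sketch below is illustrative only, with every number invented for the purpose; it assumes nothing more than the three ingredients Darwin identified: heritable variation, an environment that favours some variants, and reproduction in proportion to fitness:

    # A toy illustration of selection on a single inherited trait (beak
    # length), not a biological model: individuals whose trait lies closer
    # to what the environment favours leave more offspring, and the
    # population mean shifts over the generations.
    import random

    OPTIMUM = 12.0   # beak length the food supply currently favours (invented)
    POP_SIZE = 200

    def fitness(beak):
        # reproductive success falls away with distance from the optimum
        return max(0.0, 1.0 - abs(beak - OPTIMUM) / 10.0)

    # start the population with a mean beak length of 8, well below the optimum
    population = [random.gauss(8.0, 1.0) for _ in range(POP_SIZE)]
    for generation in range(50):
        weights = [fitness(b) for b in population]
        # each offspring inherits a parent's beak length, chosen in
        # proportion to fitness, plus a small random variation
        population = [random.gauss(random.choices(population, weights)[0], 0.2)
                      for _ in range(POP_SIZE)]
    print(sum(population) / POP_SIZE)  # mean has drifted from 8 towards 12

Run repeatedly, the population mean reliably climbs towards the favoured value, without any individual “trying” to change.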
In formulating his theory, Darwin drew on the older theories of the
geologist James Hutton and his own grandfather Erasmus Darwin, a
physician and naturalist, and author of Zoonomia (published in 1794),
which Darwin read and commented on. Erasmus Darwin (1731-1802) wrote
in Zoonomia that “the strongest and most active animal should
propagate the species, which should thence become improved” and that
“the earth began to exist, perhaps millions of ages before the
commencement of the history of mankind”. James Hutton (1726-1797),
who developed the theory of uniformitarianism that was promoted by
Lyell in Principles of Geology, had written, in 1794 in Investigation
of the Principles of Knowledge that, “if an organised body is not in
the situation and circumstances best adapted to its sustenance and
propagation, then, in conceiving an indefinite variety among the
individuals of that species, we must be assured, that, on the one
hand, those which depart most from the best adapted constitution, will
be the most liable to perish, while, on the other hand, those
organised bodies, which most approach to the best constitution for the
present circumstances, will be best adapted to continue, in preserving
themselves and multiplying the individuals of their race.”
Other compelling evidence of evolution comes from the fields of
embryology and comparative biology (botany and zoology). The
similarity between various species was the basis of Linnaeus’s
hierarchical classification into kingdoms, classes, orders, genera and
species in Systema Naturae (published in 1735, more than a century
before Darwin’s Origin of Species). It is obvious that the various
cats belong in the same group, being more closely related to one
another than to the various types of monkey. Linnaeus believed in Biblical creationism as
an explanation for the marvellous creatures from all over the world
that he collected and classified, creating the Latin nomenclature of
genus and species names and general framework of taxonomy that we use
to this day (and Darwin assumed in developing his theory of how
speciation came about in the 19th century). The definition of species
as being unable to breed together and produce fertile offspring came
later, from Alfred Russel Wallace, based on his studies of Malaysian
butterflies. Wallace’s studies of butterflies led him to conclusions
similar to those Charles Darwin had arrived at by studying variations
in the beaks of birds off the South American coast decades earlier (in
the Galapagos Islands, where Darwin shot fifteen species of what he
called “finches” and studied differences in the length and shape of
their bills, concluding that they must have had a common ancestor from
the mainland sometime in the past).
Darwin and Wallace independently developed the theory of evolution by
natural selection, almost a century before the discovery of the structure of DNA and the chemical nature of genes,
which provided a chemical explanation for evolution, and a mechanism
that explains the process by which traits are inherited. DNA could be
used to explain how simple life has given rise to the complexity of
multicellular organisms such as ourselves, since all living organisms
(as we understand them) require DNA to code for the proteins that
provide the basic structure of a cell. Before Darwin it had been
established, following the invention of the microscope, that all life
is composed of cells. Cells are the building blocks of all plants and
animals, including ourselves. This is one of the great discoveries of
Western science, and indeed of all science. DNA is another, and has
become a household word (or acronym, though fewer people know what it
stands for), leading to it being misused and misunderstood. The
Australian Prime Minister, Tony Abbott, recently declared that fiscal
prudence is “in the DNA” of the Liberal Party.
The biological importance of DNA is frequently exaggerated. It is
true, of course, that the DNA in our chromosomes codes for the
synthesis of proteins, and proteins are biologically vital molecules.
Genes are also responsible for the inheritance of various traits and
the physical forms (structure) of living organisms. The physical form
and function of cells and collections of cells (in organs and
organisms) are based on the DNA code in a very fundamental way, but
there are many other factors that influence structure and function.
The idea that there is “a gene” for this or that disease has generated
many millions of dollars in research funding, most of it wasted, but
there are bigger problems than the wastage of time and money. Genetic
studies have a long history of pseudo-science with catastrophic social
and political results, which we will get to soon.
This process of evolution depends on gradual change over enormous
periods of time – deep time. Deep time is measured in millions of
years rather than hundreds and thousands. This makes what seems
somewhat miraculous – the gradual development of the wonders of life
on the planet – explainable. According to the British biologist
Richard Dawkins the seeming miracle of evolution can be best explained
by the theory of natural selection, as described by Charles Darwin and
Alfred Russel Wallace in the 1850s. I agree with him, and see no
reason to call evolution a theory, rather than a fact. He says this
himself, and that it is called a “theory” only for “technical
reasons”. He points out that this sometimes leads to people thinking
that this means it is “only a theory” and therefore can be taught in
schools alongside the ‘alternative’ theory of ‘intelligent design’ –
the old Christian theological argument that the perfection and wonders
of nature imply an Intelligent Designer or God. Darwin’s theory,
according to Dawkins and others, removes the need for any sort of
creator God, whether Yahweh, Allah or Brahma. This has led to a stand-
off between “creationists” and “evolutionists” and public debates as
to whether “religion” (creationism) is inconsistent with the
discoveries of “science” (evolution). There are, of course, many
religious people who believe in evolution, but it is true that
religious belief can get in the way of accepting and understanding
evolutionary biology, if this religious belief is that the earth is
only a few thousand years old, or that humanity originated from Adam
and Eve in the Garden of Eden or Noah and his sons.
According to Salman Hameed, who teaches astronomy and religious
studies at Hampshire College in Amherst, Massachusetts, disbelief in
evolution is also a problem in Muslim society, though evolution is
taught in the textbooks of most Islamic nations. He has lectured in
Pakistan on reconciling evolution with Islam and is concerned about
the rise of creationism in the Muslim world. Hameed is trying to
promote the teaching of evolution in Islamic countries, and is
concerned that Richard Dawkins and other atheists will push Muslims
away from evolution. In an interview with New Scientist he said that
“a large portion of people, vast majorities, reject evolution.
Compared to the US, where 40% are comfortable with evolution, in the
Muslim countries that would go down to 10, 15, or 20%. In Turkey, one
of the more secular Muslim countries, the level is between 22 and
25%. These low acceptance rates are because evolution has not been in
the public discourse, so it depends on what people believe evolution
is. Right now, there is a misperception that evolution equals
atheism.”
Hameed says that the Koran itself does not provide a single clear-cut
verse that contradicts evolution and that in the Muslim countries,
young Earth creationism is non-existent, but evolution has become a
symbol for Western dominance and a sign of modernity. There is a
concern that believing in evolution leads to atheism. Hameed worries
that Richard Dawkins is feeding this fear and pushing Muslims away
from evolution, but notes that Muslims have a deep faith in science,
with a long scientific tradition of which they are proud. He
thinks that convincing Muslims about the facts of evolution is just a
matter of time and good presentation of the evidence, and that
existing textbooks make a good start.
The ancient texts of Hinduism and Buddhism contrast dramatically with
the Book of Genesis in their claims about the age of the cosmos and
the Earth. The myths of Hinduism and Buddhism about yugas and kalpas
also contradict science, when it comes to the age of the earth, though
not quite as dramatically as the Book of Genesis and the story of
Noah’s ark. These religions speak of epochs of time, known as yugas,
which have occurred in cycles for billions or even trillions of years.
The yugas occur in continuous cycles, descending from a golden age
known as the satya yuga, when men were taller and lived longer due to
their greater virtue, to the kali yuga, a dark age, when men are
shorter in height and lifespan, corresponding to their greater evil.
According to the Manu smriti (Laws of Manu), one of the earliest known
texts describing the yugas, the length of the satya yuga is 4800
years, the next treta yuga lasts 3600 years, followed by the dvapara
yuga for 2400 years and the final kali yuga, in which the authors
believed they were living, lasting 1200 years. The beginning of the kali yuga
was dated to the battle of Kurukshetra, the main story of the
Mahabharata.
The supposed life spans and heights of men during the progression of
the yugas are ridiculous – men of the satya yuga lived 100,000 years,
in the treta yuga they lived 10,000 years, in the dvapara yuga they
lived 1,000 years and in our present kali yuga they live 100 years,
with a prediction that, as the burden of sin shortens their height and
lifespan, they will shrink to little dwarfs with a lifespan of only 30
years, before ascending back through the sequence of yugas to another
golden age (the satya yuga, when people will again live for 100,000
years).
According to Hindu tradition, the Manu smriti records the words of
Brahma and holds that the first man, Manu, was the creator god’s son. He
was created in India, needless to say. Religion is not static though,
and there have been many developments in Hinduism since the Laws of
Manu were composed. Hinduism is a very diverse religion, if indeed it
is one. The Indian Prime Minister Narendra Modi has claimed that
Hinduism is a way of life rather than a religion. It includes, for
example, philosophers who deny the existence of gods but still call
themselves Hindus.
The Manu Smriti was one of the first Sanskrit texts studied by the
European philologists, first translated into English by Sir William
Jones, whose version was published in 1794. Jones considered Manu’s
laws to be older than those of Lycurgus, the legendary lawgiver of
Sparta, and Solon, the lawgiver of Athens, whose laws he thought may
have been adopted from the Manu smriti, transmitted to them orally
rather than in writing. In his book Bible in India
(1869), Louis Jacolliot, a judge and not a scientist, wrote that Manu
smriti was the foundation upon which the Egyptian, the Persian, the
Grecian and the Roman codes of law were built and that the influence
of Manu is still felt in Europe. Jacolliot was interested in possible
Indian origins of western occultism, and drew comparisons between the
legends of Krishna and Christ, concluding that the account in the
Gospels is a myth based on the mythology of ancient India.
The German philosopher Friedrich Nietzsche (1844-1900), who famously
declared the death of God, was also an admirer of the Manu smriti,
based on his reading of a translation by Jacolliot, deeming it "an
incomparably spiritual and superior work" to the Christian Bible.
Nietzsche also considered the Hindu caste system promoted in the Laws
of Manu to be a good idea. Jacolliot also favoured the theory of a now
submerged continent, in either the Indian or the Pacific Ocean, called
Lemuria. Helena Blavatsky and the Theosophists drew on Jacolliot’s
writings to argue for the existence of a continent called Lemuria that
was submerged in a great flood. Blavatsky based her opinions not on
science but on “divine revelation” and selective readings of occult
literature to conclude that Lemuria was home to a “root race” that was
7 feet tall, sexually hermaphroditic, egg-laying, mentally undeveloped
and spiritually more pure than the following "root races”. This is, to
say the least, bizarre, and modern geological understanding of
tectonic plates conclusively disproves the theory of Lemuria. In
ignorance of the geological facts, some Tamil writers such as Devaneya
Pavanar have tried to associate Lemuria with Kumari Kandam, a
legendary sunken landmass mentioned in the Tamil literature, claiming
that it was the cradle of civilization.
Kalpa is a Sanskrit word meaning an aeon in Hindu and Buddhist
cosmology. The duration of a kalpa varies between the various Buddhist
and Hindu schools of thought. The sage Gautama Siddhartha (the Buddha)
is not known to have given a precise definition of how long a kalpa
lasts, but is claimed to have said that if you count the total number
of sand particles at the depths of the Ganges river, from where it
begins to where it ends at the sea, even that number will be less than
the number of passed kalpas.
The Hindu Bhagavad Gita, which is part of the Mahabharata epic, states
that a kalpa is a day of Brahma, and one day of Brahma consists of a
thousand cycles of four yugas, or ages. The duration of each of these
yugas has hugely increased since the composition of the Manu smriti,
with the interpretation that one year of the demigods was 360 human
years. This makes the figures astronomical, and pushes the creation of
the universe much further back in time than the Big Bang theory does.
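For concreteness, the arithmetic implied by this later reading can be set out as a short calculation. This is a sketch only, assuming the conventional interpretation that the Manu smriti’s four yuga lengths are counted in divine years, with one divine year equal to 360 human years:

    # Yuga lengths from the Manu smriti, reinterpreted as divine years
    yugas = {"satya": 4800, "treta": 3600, "dvapara": 2400, "kali": 1200}
    cycle_divine = sum(yugas.values())         # 12,000 divine years per cycle
    DIVINE_YEAR = 360                          # human years in one divine year
    cycle_human = cycle_divine * DIVINE_YEAR   # 4,320,000 human years per cycle
    kalpa = 1000 * cycle_human                 # a day of Brahma: 4,320,000,000 years
    print(cycle_human, kalpa)

On this reading a single day of Brahma spans 4.32 billion years, close to the actual age of the earth, while the traditional lifespan of Brahma, a hundred of his years, runs to hundreds of trillions of human years, far longer than the roughly 13.8 billion years since the Big Bang.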
In the 2006 Channel 4 documentary The Root of All Evil? Dawkins
confronted one of the leaders of the Christian evangelical movement in
the USA, Pastor Ted Haggard, about the Biblical creation myth and his
disbelief in evolution. This was after one of Haggard’s services,
which featured a musical group armed with electric guitars and
drumsticks rather than guns or swords. Though the documentary doesn’t
show much of what was said at the service it’s clear that there were
children running about, jumping around and praising Jesus with their
parents. It was rather like a rock concert, but without the sex or
drugs - a benign rock concert for the whole family to praise Jesus.
Rock concerts are attended by fans of rock stars, who treat the stars
as demigods. These were Jesus fans, rather than Haggard
fans, as far as I could tell. There is no sign that he was trying to
build a cult, though he may well have been trying to make money by
‘Jesus talk’. The whole event looked silly rather than sinister.
Dawkins begins by sarcastically complimenting Haggard on his “show”
which he says must have cost a lot of money. The pastor smiles and
agrees, saying he wanted it to be fun. Dawkins then says that it
reminds him of a Nuremberg rally and that Dr Goebbels would have been proud.
Haggard ignores this provocative comparison with the Nazis and says
that he doesn’t know anything about Nuremberg rallies, but others have
compared it to a rock concert. This was my first impression too - a
rather nerdy rock concert.
It seems to me that comparing evangelical Christianity with Nazism is
rather unfair. Evangelical Christians may be deluded about the age of
the earth, but theirs is a movement obviously more religious than
political, and doesn’t promote hatred or violence (though it may
promote the idea that non-believers are wicked and sinful, hence
feeding hatred and violence indirectly). The Nazis were a political
party, not a religious organization, and were extremely scientific in
their methods. In their policies of mass-murder of psychiatric
patients, homosexuals, Jews and Gypsies, the Nazis were motivated not
by religious fervour but by German nationalist fanaticism and the
scientific dogmas of eugenics – they were trying to breed a “superior
Aryan race” by eliminating the “undesirables”, beginning with the
inmates of their asylums, and continuing on to kill millions using
very scientific, but cruelly inhumane methods. The Nazi rallies, which
Goebbels orchestrated as the propaganda minister, did have a certain
religious flavour, perhaps, in which Hitler was idol-worshipped,
but the National Socialists justified their evils in the name of
science rather than religion.
Eugenics, from the Greek for “good breeding”, is a scientific term
coined by Francis Galton in the 1880s at Cambridge University in
England, and actively propagated around the world. The eugenics
movement was spearheaded by Galton and Charles Darwin’s son, Major
Leonard Darwin, who together founded the Eugenics Education Society
in 1907. In 1926 the Eugenics Education Society was renamed the
Eugenics Society, and included several literary, political and
scientific intellectuals of the time. The biggest rival to the British
Eugenics Society was the American Eugenics Society; Galton’s ideas of
segregation and sterilization of “defectives” led to thousands of
“eugenic” sterilizations in the USA between 1890 and 1910, mainly of
boys who had been labelled as “feeble-minded”. It comes as no surprise
that Galton had the support of British academia as a whole, for his
basic theory that the cream of upper-class British society, the most
brilliant of whom were mathematicians in Cambridge, were more
naturally endowed with “genius” (or intelligence) than the lower classes of
society, and that the lower classes were breeding too fast. They
agreed that something needed to be done to control the disparity in
breeding rate between the classes, for the preservation of the finest
aspects of the British race and British civilization. There were
intellectuals who denounced eugenics at the time, for lack of
scientific credibility and dubious ethics, but they were not able to
stop the doctrine from taking root in academic and political circles,
in Europe (especially Germany and Scandinavia), the USA, Australia and
Canada.
Darwin wrote an appreciative letter to Galton after reading Hereditary
Genius, which is reproduced below.
Letter from Darwin to Galton.
DOWN, BECKENHAM, KENT, S.E.
December 23rd
"MY DEAR GALTON,--I have only read about 50 pages of your book (to the
Judges), but I must exhale myself, else something will go wrong in my
inside. I do not think I ever in all my life read anything more
interesting and original--and how well and clearly you put every
point! George, who has finished the book, and who expressed himself in
just the same terms, tells me that the earlier chapters are nothing in
interest to the later ones! It will take me some time to get to these
latter chapters, as it is read aloud to me by my wife, who is also
much interested. You have made a convert of an opponent in one sense,
for I have always maintained that, excepting fools, men did not differ
much in intellect, only in zeal and hard work; and I still think this
is an eminently important difference. I congratulate you on producing
what I am convinced will prove a memorable work. I look forward with
intense interest to each reading, but it sets me thinking so much that
I find it very hard work; but that is wholly the fault of my brain and
not of your beautifully clear style.--Yours most sincerely,
(Signed) "CH. DARWIN"
This appears in The Letters of Charles Darwin as LETTER 410.
The idea that some races and classes are genetically superior to
others was the keystone of eugenics, which was developed by Charles
Darwin’s cousin Francis Galton and promoted by Darwin’s son, Leonard
Darwin, from the 1880s onwards. Basing their ideas on what they took to
be the latest scientific and statistical evidence (based on IQ
testing), the eugenicists recommended social programs of segregation,
sterilization and prevention of marriage of those deemed to be
“feeble-minded” or “insane”. This was in order to “improve the stock”
in Britain, which they feared was racially degenerating due to the
proliferation of the inferior lower classes. The same thinking was
applied globally in terms of the relative merits of different races,
resulting in different evils in the various nations that embraced
eugenics doctrines.
There were two sides to eugenics – “positive eugenics” and “negative
eugenics”. In positive eugenics the “good” or “desirable” individuals
were given various incentives to have large families. Under the Nazi
regime this was the “ideal Nordic type” with blonde hair and blue eyes
(and also tall, healthy and intelligent, according to Nazi standards)
who were regarded as the most superior type of the Aryan race. The
concept of an “Aryan race” was based on a confusion between language
and race, which is still sometimes made today, but was a feature of
mainstream scientific thought in the 19th century. German scholars had
confirmed the earlier observation by Dutch and Italian missionaries
that the Hindu sacred language of Sanskrit had remarkable similarities
to European languages – what are now called Indo-European languages.
In the 19th century, German philologists spoke of Indo-Germanic
languages, placing primacy on German over the other European
languages, some of which are now known to be older. The ‘Aryan race’ of the Nazis was a
mythical race that spoke the Aryan languages, the older term for the
Indo-European languages. In recent times debate about the Indo-
European languages has continued, including academic debate about the
locality in which these languages first developed, as “proto-Indo-
European” or PIE. The most favoured PIE homeland is presently southern
Russia, based on the work of Marija Gimbutas, though others have
proposed Anatolia, while a few scholars such as Koenraad Elst argue that
the PIE homeland, and the origin of the Indo-European languages was in
India (the Out of India theory).
In 18th and 19th century India, the British colonials and Indologists
divided the many languages of India into Aryan (north Indian) and
Dravidian (south Indian), corresponding to supposed Aryan and
Dravidian races. The darker-skinned Dravidian race comprised those who spoke the
Dravidian languages, such as Tamil, Telugu, Kannada and Malayalam in
the southern half of India. The northern Indians were thought to have
a mixture of Aryan and indigenous (Dravidian) blood, the uppermost
caste – the Brahmins – being more Aryan and therefore lighter-skinned;
it has been argued that the skin colour prejudice so rife in India
stems from this history. This race distinction has been challenged
since then, as well as the colonial claim that the Sanskrit language
was introduced to India by invading, white (or lighter skinned) Aryans
who, armed with the superior technology of horses and chariots,
conquered and dominated the less vigorous and less technologically advanced
Dravidian inhabitants of the subcontinent.
The eugenics policies promoted by Galton and later taken up by Winston
Churchill and others, were focused on “negative” more so than
“positive” eugenics. They advocated two primary means of preventing
“defectives” from breeding – imprisonment in labour camps and
sterilization. Churchill favoured irradiation of the gonads with x-
rays as an effective means of sterilization, but accepted that a
“small operation” may also be necessary for girls, inquiring into the
necessary changes in law to permit them. He promoted a policy of
imprisonment of “mental defectives” for life – perhaps using the
incentive of freedom to encourage the best behaved “feeble-minded”
individuals to seek “voluntary” sterilization. According to Martin
Gilbert, writing for The Churchill Centre, in 1911 Churchill spoke to
the British House of Commons about the need to introduce compulsory
labour camps for “mental defectives”. The year before, Churchill had
argued that there were “at least 120,000 feeble minded people at large in
our midst” who deserved “all that could be done for them by a
Christian and scientific civilization now that they are in the world”,
but who should be “segregated under proper conditions so that their
curse died with them and was not transmitted to future generations”.
Churchill’s official biographer, Sir Martin Gilbert, wrote in 2009 (in
a webpage titled Churchill and Eugenics) that:
“Such detention, as well as sterilization, were at that time the two
main ‘cures’ to ‘feeble-mindedness’. They were put forward by the
eugenicists, those who believed in ‘the possibility of improving the
qualities of the human species or a human population by such means as
discouraging reproduction by persons having genetic defects or
presumed to have inheritable undesirable traits (negative eugenics) or
encouraging reproduction by persons presumed to have inheritable
desirable traits (positive eugenics).”
“Mental defectives” was a collective term for people who were given
specific labels of defectiveness according to the “science” of
eugenics developed by Galton and merged with contemporary psychiatry
and neurology in a somewhat uncomfortable alliance. The terms they
used to classify various degrees of intellectual impairment – idiot,
imbecile, moron and feeble-minded – have become common terms of abuse
around the world. These terms were initially classes of patients based
on “Intelligence Quotients” or IQs, a concept pioneered by Galton,
though he did not invent the term. Idiots had an IQ of 0-25, imbeciles
an IQ of 26-50, morons an IQ of 51-70, and the feeble-minded were
judged as more intelligent than idiots, imbeciles or morons, but also
included people who were judged to be mentally defective on the basis
of criminality or insanity.
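Purely to make that banding concrete, the scheme can be written out as a few lines of code; this is a sketch of the historical cut-offs quoted above, not an endorsement of a long-discredited pseudoscience:

    # The historical eugenic IQ bands described above (a discredited
    # classification, reproduced here only to make the cut-offs concrete)
    def historical_label(iq):
        if iq <= 25:
            return "idiot"
        elif iq <= 50:
            return "imbecile"
        elif iq <= 70:
            return "moron"
        # "feeble-minded" was applied above this range, and also on
        # non-intellectual grounds such as alleged criminality or insanity
        return "feeble-minded or unclassified"

    print(historical_label(60))  # prints: moron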
Some scientists dispute IQ entirely. In The Mismeasure of Man (1996),
the Harvard palaeontologist Stephen Jay Gould criticized IQ tests and
argued that they were used for scientific racism. He criticised “the
abstraction of intelligence as a single entity, its location within
the brain, its quantification as one number for each individual, and
the use of these numbers to rank people in a single series of
worthiness, invariably to find that oppressed and disadvantaged groups
—races, classes, or sexes—are innately inferior and deserve their
status” (pp. 24–25).
Another dubious claim, which has been rather more enduring, was that
“schizophrenia” is a “genetic defect”. This claim was made in
Stalinist Russia to justify the incarceration and “treatment” of
political and social dissidents in the 1950s, following the Nazi
policy of “euthanasia” (good or mercy killing), the euphemism for the
mass-murder favoured by the Nazis as a “cure” for mental illness.
German psychiatrists had been central to defining what constituted
“schizophrenia” (then called dementia praecox) in the first place,
through the work of Professor Emil Kraepelin at Heidelberg University.
The search for a schizophrenia gene, with many false claims of
success, has continued for many decades since then. Likewise there is
a general belief in the West that “alcoholism” is a “genetic disease”,
in total ignorance of the influence of alcohol advertising and of
relative susceptibility to such things as advertising and peer group
pressure.
In 1899 Winston Churchill, who became a fervent supporter of draconian
“eugenic” measures, wrote to his cousin Ivor Guest that, “the
improvement of the British breed is my aim in life”. His biographer,
Sir Martin Gilbert, writes:
“Like most of his contemporaries, family and friends, he regarded
races as different, racial characteristics as signs of maturity of a
society, and racial purity as endangered not only by other races but
by mental weaknesses in a race. As a young politician in Britain
entering parliament in 1901, Churchill saw what were then known as the
‘feeble-minded’ and the ‘insane’ as a threat to the prosperity, vigour
and virility of British society.”
As the years passed, Churchill’s paranoia about the “feeble-minded and
insane classes” increased, to the point that he regarded British
racial health as a serious and urgent issue. He wrote to Prime Minister H
H Asquith in 1910 that:
“The unnatural and increasingly rapid growth of the Feeble-Minded and
Insane classes, coupled as it is with a steady restriction among all
the thrifty, energetic and superior stocks, constitutes a national and
race danger which it is impossible to exaggerate. I am convinced that
the multiplication of the Feeble-Minded, which is proceeding now at an
artificial rate, unchecked by any of the old restraints of nature, and
actually fostered by civilised conditions, is a terrible danger to the
race.”
Churchill had been impressed by
In recent years Dawkins has tried to resurrect the discredited idea of
eugenics, reframing what is meant by ‘positive’ and ‘negative’
eugenics in a positive light. He argues that sixty years after the
Holocaust, and Hitler’s implementation of what in his madness he
called “eugenics”, but was really pseudoscience, it is time to start
discussing the topic again. He has introduced the idea of
“intelligently designing” human children to enhance such things as
musicality or athleticism. He compares “forcing children to have music
lessons” with genetically engineering children to be more musical.
Disregarding the historical meaning of the phrase (which led to the
genocide of the Nazis and sterilization programs in the USA, Canada,
Europe and elsewhere into the 1970s) Dawkins redefines “negative
eugenics” as the very reasonable practice of aborting foetuses with
serious chromosomal abnormalities (such as Down’s syndrome) which is
currently done in many countries around the world, and has been for
many years. This could be extended in what Dawkins classifies as
“negative eugenics” to abort or select (in in vitro fertilization)
embryos with known genes for breast and other cancers, or known
genetic abnormalities like Huntington’s Disease.
Nietzsche may have admired the caste system espoused by the Manu
smriti, but there are plenty of people who don’t, including myself.
The caste system and caste laws are regarded by some as integral to
Hinduism, though this may not be so. In the Manu smriti, believed by
some Hindus to be the divine word of Brahma, there are many things to
be criticised on the basis of modern ideas of morality, especially
regarding the treatment of women and the “lower castes”.
Some such comments about women in the Manu smriti have been posted on
the website Nirmukta, with the aim of promoting science, free thought
and secular humanism in India. The list was compiled by Hirday Patwari:
1. “Swabhav ev narinam …..” – 2/213. It is the nature of women to
seduce men in this world; for that reason the wise are never unguarded
in the company of females.
2. “Avidvam samlam………..” – 2/214. Women, true to their class
character, are capable of leading astray men in this world, not only a
fool but even a learned and wise man. Both become slaves of desire.
3. “Matra swastra ………..” – 2/215. Wise people should avoid sitting
alone with one’s mother, daughter or sister. Since carnal desire is
always strong, it can lead to temptation.
4. “Naudwahay……………..” – 3/8. One should not marry women who have
reddish hair, redundant parts of the body [such as six fingers], one
who is often sick, one without hair or having excessive hair and one
who has red eyes.
5. “Nraksh vraksh ………..” – 3/9. One should not marry women whose names
are similar to constellations, trees, rivers, those from a low caste,
mountains, birds, snakes, slaves or those whose names inspire terror.
6. “Yasto na bhavet ….. …..” – 3/10. Wise men should not marry women
who do not have a brother and whose parents are not socially well
known.
7. “Uchayangh…………….” – 3/11. Wise men should marry only women who are
free from bodily defects, with beautiful names, grace/gait like an
elephant, moderate hair on the head and body, soft limbs and small
teeth.
8. “Shudr-aiv bharya………” – 3/12. Brahman men can marry Brahman,
Kshatriya, Vaish and even Shudra women but Shudra men can marry only
Shudra women.
9. “Na Brahman kshatriya..” – 3/14. Although Brahman, Kshatriya and
Vaish men have been allowed inter-caste marriages, even in distress
they should not marry Shudra women.
10. “Heenjati striyam……..” – 3/15. When twice born [dwij=Brahman,
Kshatriya and Vaish] men in their folly marry low caste Shudra women,
they are responsible for the degradation of their whole family.
Accordingly, their children adopt all the demerits of the Shudra
caste.
11. “Shudram shaynam……” – 3/17. A Brahman who marries a Shudra woman,
degrades himself and his whole family, becomes morally degenerated,
loses Brahman status, and his children too attain the status of Shudra.
12. “Daiv pitrya………………” – 3/18. The offerings made by such a person at
the time of established rituals are neither accepted by God nor by the
departed soul; guests also refuse to have meals with him and he is
bound to go to hell after death.
13. “Chandalash ……………” – 3/240. Food offered and served to Brahman
after the Shradh ritual should not be seen by a chandal, a pig, a
cock, a dog, or a menstruating woman.
14. “Na ashniyat…………….” – 4/43. A Brahman, true defender of his class,
should not have his meals in the company of his wife and even avoid
looking at her. Furthermore, he should not look towards her when she
is having her meals or when she sneezes/yawns.
15. “Na ajyanti……………….” – 4/44. A Brahman, in order to preserve his
energy and intellect, must not look at a woman who applies collyrium
to her eyes, one who is massaging her nude body or one who is
delivering a child.
16. “Mrshyanti…………….” – 4/217. One should not accept meals from a
woman who has extra marital relations; nor from a family exclusively
dominated/managed by women or a family whose 10 days of impurity
because of death have not passed.
17. “Balya va………………….” – 5/150. A female child, young woman or old
woman is not supposed to work independently even at her place of
residence.
18. “Balye pitorvashay…….” – 5/151. Girls are supposed to be in the
custody of their father when they are children, women must be under
the custody of their husband when married and under the custody of her
son as widows. In no circumstances is she allowed to assert herself
independently.
19. “Asheela kamvrto………” – 5/157. Men may be lacking virtue, be
sexual perverts, immoral and devoid of any good qualities, and yet
women must constantly worship and serve their husbands.
20. “Na ast strinam………..” – 5/158. Women have no divine right to
perform any religious ritual, nor make vows or observe a fast. Her
only duty is to obey and please her husband and she will for that
reason alone be exalted in heaven.
21. “Kamam to………………” – 5/160. At her pleasure [after the death of her
husband], let her emaciate her body by living only on pure flowers,
roots of vegetables and fruits. She must not even mention the name of
any other men after her husband has died.
22. “Vyabhacharay…………” – 5/167. Any woman violating her duty and code
of conduct towards her husband is disgraced and becomes a patient of
leprosy. After death, she enters the womb of a jackal.
23. “Kanyam bhajanti……..” – 8/364. In case women enjoy sex with a man
from a higher caste, the act is not punishable. But on the contrary,
if women enjoy sex with lower caste men, she is to be punished and
kept in isolation.
24. “Utmam sevmansto…….” – 8/365. In case a man from a lower caste
enjoys sex with a woman from a higher caste, the person in question is
to be awarded the death sentence. And if a person satisfies his carnal
desire with women of his own caste, he should be asked to pay
compensation to the women’s faith.
25. “Ya to kanya…………….” – 8/369. In case a woman tears the membrane
[hymen] of her vagina, she shall instantly have her head shaved or two
fingers cut off, and be made to ride on a donkey.
26. “Bhartaram…………….” – 8/370. In case a woman, proud of the greatness
of her excellence or her relatives, violates her duty towards her
husband, the King shall arrange to have her thrown before dogs at a
public place.
27. “Pita rakhshati……….” – 9/3. Since a woman is not capable of living
independently, she is to be kept under the custody of her father as a
child, under her husband as a woman and under her son as a widow.
28. “Imam hi sarw………..” – 9/6. It is the duty of all husbands to exert
total control over their wives. Even physically weak husbands must
strive to control their wives.
29. “Pati bharyam ……….” – 9/8. The husband, after the conception of
his wife, becomes the embryo and is born again of her. This explains
why women are called Jaya.
30. “Panam durjan………” – 9/13. Consuming liquor, association with
wicked persons, separation from her husband, rambling around, sleeping
for unreasonable hours and dwelling – are the six demerits of women.
31. “Naita rupam……………” – 9/14. Such women are not loyal and have extra
marital relations with men without consideration for their age.
32. “Poonshchalya…………” – 9/15. Because of their passion for men,
immutable temper and natural heartlessness, they are not loyal to
their husbands.
33. “Na asti strinam………” – 9/18. While performing namkarm and jatkarm,
Vedic mantras are not to be recited by women, because women are
lacking in strength and knowledge of Vedic texts. Women are impure and
represent falsehood.
34. “Devra…sapinda………” – 9/58. On failure to produce offspring with
her husband, she may obtain offspring by cohabitation with her
brother-in-law [devar] or with some other relative [sapinda] on her
in-law’s side.
35. “Vidwayam…………….” – 9/60. He who is appointed to cohabit with a
widow shall approach her at night, be anointed with clarified butter
and silently beget one son, but by no means a second one.
36. “Yatha vidy……………..” – 9/70. In accordance with established law,
the sister-in-law [bhabhi] must be clad in white garments; with pure
intent her brother-in-law [devar] will cohabitate with her until she
conceives.
37. “Ati kramay……………” – 9/77. Any woman who disobeys the orders of her
lethargic, alcoholic and diseased husband shall be deserted for three
months and be deprived of her ornaments.
38. “Vandyashtamay…….” – 9/80. A barren wife may be superseded in the
8th year; she whose children die may be superseded in the 10th year
and she who bears only daughters may be superseded in the 11th year;
but she who is quarrelsome may be superseded without delay.
39. “Trinsha……………….” – 9/93. In case of any problem in performing
religious rites, males between the age of 24 and 30 should marry a
female between the age of 8 and 12.
40. “Yambrahmansto…….” – 9/177. In case a Brahman man marries Shudra
woman, their son will be called ‘Parshav’ or ‘Shudra’ because his
social existence is like a dead body.
The Portuguese and Spanish were followed in their attempted conquest
of the world for Catholicism by the Protestant Dutch and British,
along with the Catholic French, who were much less inclined than their
Iberian predecessors towards cutting off heads to save souls. Their
conquest began with trade – Dutch and British trading companies,
backed by military force, which gradually wrested control of the
Spanish and Portuguese territories and global maritime trade through
the 18th and 19th centuries. This trade centred on the trade in
African slaves. The transatlantic slave trade in Africans was
justified from the scriptures by the Spanish and Portuguese, who used the same Bible to
justify the mass-murder of the “heathen” native inhabitants of the
Americas. The missionaries followed, converting the survivors and
subsequent generations to Roman Catholicism.
Though he objected to corruption in the Catholic Church, such as
“indulgences” and paying sums of money to achieve forgiveness of sins,
Martin Luther, the first Protestant, did not protest against the crime
of slavery. When King Henry VIII founded the Church of England in
1534, placing himself and his descendants at its head, the last thing
on his mind was releasing slaves. By the 1690s, the English were
shipping the most slaves from West Africa. They maintained this
position during the 18th century, becoming the biggest shippers of
slaves across the Atlantic. By 1783, the triangular route that took
British-made goods to Africa to buy slaves, transported the enslaved
to the West Indies, and then brought slave-grown products such as
sugar, tobacco, and cotton to Britain, amounted to 80 percent of Great
Britain's foreign income. The motive of the British and Dutch was
money and ruthless exploitation of land and resources (including human
resources) rather than religion, though they may have justified
slavery from the Old Testament, which also declares that man was
created to “dominate and subdue” nature. Domination and subjugation
was certainly the colonial strategy, along with divide and rule, for
which religion was a useful tool. In the first century AD, the Roman
philosopher Lucius Seneca observed that the common man thinks religion
is true, the wise that it is false, and the rulers that it is useful.
Though the Christian Bible may have been used to justify slavery, and
appears from some passages to condone it, there existed slavery in
many non-Christian cultures. In fact history suggests that for
hundreds, even thousands, of years it has been common practice that
conquered peoples were subjected to slavery by the victors. There was
slavery in Asia and Africa before the Europeans arrived there, and
there was also slavery throughout the Muslim world. It is well known
that the Arab Muslims dominated the trade in African slaves to Asia
long before the Portuguese appeared in the Indian Ocean and began
their competition for the spice trade with the Muslims.
It has been pointed out by Christian apologists that the movement to
abolish slavery was led by religious people, and specifically
Christians, albeit not from the Anglican or Catholic mainstreams.
William Wilberforce, who led the abolitionist movement in Britain,
underwent an evangelical conversion before he was motivated to
campaign for an end to slavery. The Quakers, who also led the
abolitionist movement, were an obscure, if influential, Christian
sect, who preached fear (quaking) in the face of God. They recognised
the evils of slavery, when many other religious people did not. Was it
because of their inherent goodness or because of their religiosity
that Wilberforce and other Christians opposed slavery?
Many scientists believe that we have what might be called “religious
instincts” – they say we have evolved to attribute agency to the
dramatic features and forces of inanimate nature, giving rise first to
animism and then to organized religion. Maybe this is true, but I’m
not so sure that the world will be a better place if religion is
replaced by science and rationality, even if this were possible. I do
think that teaching about evolution in schools, including the time
frame of the various geological ages, the movement of the continents
on their respective tectonic plates, which explain everything from the
raising of the Himalaya mountains to the evolution of birds from
dinosaur ancestors and the global distribution of the human race from
our origin in Africa, is essential for the children of the future.
The palaeontologist and science writer Stephen Jay Gould, one of my
intellectual heroes and an internationally recognised authority on
evolution, maintained till his death in 2002 that there was no
necessary conflict between science and religion. He was strongly
opposed to the teaching of “Intelligent Design” (code name for
creationism) in schools, because it is nonsense, but he did not
generally criticise religion or religious belief. Gould argued that
religion and science are concerned with different domains, a principle
he called “non-overlapping magisteria”.
Those who argue that science and religion are not incompatible might
reasonably argue that science explains the facts of the universe (what
is), developing falsifiable theories to explain various observations,
which are then tested by experiment, while religion is concerned with
matters of morality and conduct (what should be, and what we should
do, individually and collectively).
Science is, in Gould’s view, amoral, just as Nature is. Not immoral,
just amoral. It makes observations, but does not make judgements on
what is good and evil. Judging what is good and evil is very much the
province of religion, but what kinds of rules and judgements do the
religions of the world make, and how good (or evil) are they? Both
science and religion make claims to truth – does this make a collision
between them inevitable?
Being wrong and doing wrong
Dawkins published The God Delusion in 2006, preceded a few months
earlier by the Channel 4 television documentary The Root of all Evil?
This was broadcast in two halves – The God Delusion and The Virus of
Faith:
HYPERLINK "https://www.youtube.com/watch?
v=ld4X9NQdnog"https://www.youtube.com/watch?v=ld4X9NQdnog (The God
Delusion)
HYPERLINK "https://www.youtube.com/watch?
v=HzjYuFhcBKM"https://www.youtube.com/watch?v=HzjYuFhcBKM (The Virus
of Faith)

The titles of ‘The God Delusion’ and ‘The Root of all Evil’ are
deliberately provocative. Dawkins is prepared to go where most polite
people fear to venture and deliberately confronts religious belief,
which he regards as inconsistent with Science. Science with a capital
S. Dawkins has founded a Foundation for Reason and Science and was the
Professor for the Public Understanding of Science at Oxford
University. He clearly does not believe that to get along in society
one should avoid discussing religion or politics.
The first of these documentaries, The God Delusion, went to air in the
UK in January 2006. Channel 4 said that 2/3 of the responses had been
positive. The book The God Delusion was released in September 2006,
and developed the themes of the documentary further. The documentary
was uploaded to YouTube on 5.7.2013 and has had 7,140 views so far
(9.5.2015). An overwhelming majority of 68 likes contrasts with 3
dislikes. This suggests that people generally agree with Dawkins’
views, at least at face value. But 7,140 views is not many in almost
two years. Maybe only Dawkins fans are inclined to view a documentary
titled “The Root of All Evil” that is not about money.
The Wikipedia page on the documentary says that Dawkins admitted that
the title The Root of All Evil? was not his preferred choice, but that
Channel 4 had insisted on it to create controversy. This, he
apparently said on the Jeremy Vine Show on the BBC in 2006, but I have
no way of knowing whether this is true. Apparently the concession was
the addition of the question mark, changing it from a statement to a
question.
The provocative title is obviously based on the popular saying that
“money is the root of all evil”. The more biblically-oriented
capitalists emphasise that the Bible says it is not money itself but
the “love of money” that is the root of all evil. But
Dawkins and Channel 4 provide a polemic in support of the idea that
religion is the root of all evil (and not money or the love of it,
which he does not discuss).
But what do they mean by religion, and what do they mean by evil? What
other causes can be identified for evil other than religion? Is human
nature at the root of all evil, or is there also evil in nature? What
about the Devil – is the Devil or Satan the root of all evil, as some
religious people believe? Is belief in the Devil without reference to
the supernatural consistent with both science and atheism, or when we
reject the existence of God is it assumed that we also reject the
existence of the Devil, the personification of evil? What, perhaps
more importantly, are the roles of such things as nationalism,
territorialism, authoritarianism, sexism, acquisitiveness, politics,
racism and intolerance in causing evil? There is also the problem of
the evils created by scientists – chemical weapons, land-mines, multi-
barrel rocket-launchers, AK-47s and other weapons, including nuclear
bombs. Bombs may be used by religious fanatics, but they are
manufactured using scientific knowledge.
In the introduction of The Root of All Evil? Dawkins begins ominously,
with what he regards as the consummate evil in the world as he saw it
in 2006:
“There are would-be murderers all around the world who want to kill
you, me and themselves because they are motivated by what they think
is the highest ideal. Of course, politics are important – Iraq,
Palestine, even social deprivation in Bradford, but as we wake up to
this huge challenge to our civilized values, don’t let’s forget the
elephant in the room, an elephant called religion.”
Dawkins is warning about the dangers of suicide bombers, who he
implies are mostly male and motivated by religious faith. His target
for criticism is faith in Judaism, Christianity and Islam – the
Abrahamic Religions, that all venerate what the Christians call the
Old Testament. My limited understanding is that in Judaism a variant
of the Old Testament is venerated as the Torah, which is only some of
the corpus of Jewish religious literature. The Jews do not recognise
Jesus as a prophet or messiah, while Islam accepts Jesus of Nazareth
as a prophet. Some schools of Islam also believe in the Second Coming
of Jesus and the Apocalypse, while the Jews are still waiting for the
Messiah. In Islam, from what little I understand of the religion, some
of the Old Testament stories are accepted, including the importance of
the Old Testament prophets like Moses and Abraham and the six-day
creation, but there is considerable diversity and divergence of
thought in Islamic scholarship about such things.
Dawkins admits that politics is important to understand the “would be
murderers” that he imagines are plotting evil suicide-bombings all
over the world. He says they want to kill “you, me and themselves”,
This may be true in his case, given his public attacks on their faith,
but I’m confident that it is not the case for myself. If I went around
saying people wanted to kill me but I didn’t know who they are I would
probably, given my circumstances, be locked back up in a mental
hospital. He mentions the politics of Iraq, Palestine and Bradford,
but doesn’t explore them, nor is there much depth in his analysis of
the problem of suicide bombers, who are not, by any means, responsible
for most of the murders that occur in the world today. Most murderers
kill others and not themselves. Many of the murders are called
“military operations”, including many of the murders in Iraq and
Palestine.
If one looks more carefully at the current threat of militant Islam in
the Middle East, Africa and Asia, one finds the USA and the West
arming and training the Mujahideen in Afghanistan to fight a proxy
war against the atheist, communist Russians in the 1980s, giving rise
to the Taliban. One finds the arming of the secular regime of Saddam
Hussein with chemical and biological weapons and support, by the USA
and the West, of Hussein and a number of other dictators around the world in
the 1970s and 1980s. Palestine and Iraq have long and complex
histories, including the crusades and the centuries-old struggle for
control of Jerusalem and the Holy Land of the Bible. More recently,
European colonization and dominance, both cultural and political, and
the colonial wars, culminating in the First and Second World Wars,
have fed Islamic militancy as well as nationalism in the previously
colonized world. Islamic militancy has many complexities and a long
history. So does Christian militancy and Jewish militancy. There have
been plenty of militant Buddhists and Hindus in history, too, though
Dawkins focuses only on the Abrahamic religions.
It is a common and accurate perception, influenced by the media, but
confirmed by statistics and the actual statements of suicide bombers,
that Islamic fundamentalism and extremism have been responsible for
most of the suicide bombings in the world since the defeat of the
Tamil Tigers (LTTE) in Sri Lanka in May 2009. At the time that Dawkins
and Channel 4 made this documentary it’s hard to see how the LTTE’s
female suicide bombers, who were not motivated by belief in God or
heaven to strap on ‘suicide vests’ and blow up both military and
civilian targets, escaped the attention of Channel 4 and Professor
Dawkins. Christopher Hitchens, who also uses the example of suicide
bombers as an argument against all religion, corrected himself in
later debates, when he made reference to the Tigers’ use of suicide
bombers as an exception to his pronouncements against Islam and, more
broadly, religion.
According to Dawkins:
“The suicide bomber is convinced that in killing for his God, he will
be fast-tracked to a special martyr’s heaven. This isn’t just a
problem of Islam. In this program I want to examine that dangerous
thing that is common to Judaism and Christianity as well, the process
of non-thinking called faith.”
He then states his position clearly:
“I am a scientist, and I believe there is a profound contradiction
between science and religious belief.”
During their militant campaign for a separate state for Sri Lankan
Tamils, which they called Tamil Eelam, the organization known as the
“Tamil Tigers” was a specific military and political entity formally
known as the Liberation Tigers of Tamil Eelam or LTTE. The political
ideology of the LTTE was ostensibly secular and Marxist, and they drew
inspiration from the “liberation struggle” of Cuban Marxists. This was
their ideology as espoused by the self-declared “political strategist
and theoretician” of the LTTE, Anton Balasingham, who spent his later
years orchestrating the international political machinery of the LTTE
from London, together with his wife Adele, who, when the husband and
wife team were still in Sri Lanka, led the Women’s Wing of the Tigers.
There is YouTube footage of Adele Balasingham handing out vials of
cyanide to young women who were Tamil recruits for the LTTE. Not all
supporters of the LTTE were Tamil – Adele Balasingham herself was not
– though all of the actual fighters were. These young women were
expected to commit suicide by swallowing the cyanide if they got
caught. It is a scientific discovery that the ingestion of cyanide is
fatal at a certain dose and it was a scientific calculation that
allowed the LTTE to know what dose to give these women (most of whom
were still girls). The guns that they carried and killed with were
invented and refined into ever-more murderous models through the
genius of science. The suicide vests that were pioneered by the Tigers
were also contributions from Western science and technology.
At the same time, the LTTE cynically used religious as well as
non-religious Tamil cultural traditions to further their political and
military agenda. Self-sacrifice is integral to the Hindu religion,
while martyrdom as a direct route to Heaven is a feature of the
Christian and Muslim religions. The LTTE created a cult around its
military leader, Vellupillai Prabakaran, who was hero-worshipped, but
the concept of martyrdom was promoted more for nationalistic reasons –
which were politically secular. The suicide squad was given the self-
identity of Black Tigers, and before they went on their suicide
mission they would be honoured by a dinner with the leader. Those who
died in the cause were commemorated in rituals that calculatingly drew
on both Hinduism and Christianity. For example, Prabakaran was
insistent that the dead “cadres”, as they were called, were buried in
the Christian tradition rather than cremated in the Hindu tradition,
so that he could impress Western visitors with the determination and
sacrifice of the “freedom fighters” as evidenced by the cemeteries
full of white crosses. The Australian paediatrician, John Whitehall,
who trained the LTTE medicos after the 2004 Asian tsunami, says that
he was very impressed by these cemeteries. He was supposed to be.
The LTTE welcomed recruits of any religion to kill and be killed in
the interests of winning the war for “self-determination” of the
Tamils of Tamil Eelam. The state they aimed to create was secular,
rather than
religious, but there was to be supremacy of two languages – Tamil and
English, rather than Sinhalese, which was seen as the language of the
enemy. Sinhalese was also the language of the majority of the
population of Sri Lanka, and this was one of the roots of the evil
that ensued, but not the root of “all” the evil of the war. There were
many factors, some political, some social, some linguistic. All the
killing and maiming was done with weapons, developed through Western
science, though much damage was also done through words and
propaganda, which were used to raise money for the war effort on both
sides. Though portrayed as an “ethnic war” between the Sinhalese and
Tamils, there were many factors that fed into the equation. Religion
was one of them, in that the state religion of Sri Lanka is Buddhism
and the large majority of Sinhalese are Buddhist, with a small
minority of Catholic and Protestant Christians. The Tamils are mostly
Hindu, also with a minority of Catholic and Protestant Christians.
Most of the members of the Sri Lankan army were Sinhalese and
Buddhist, while most of the Tigers were Hindu or Christian, adding a
religious dimension to the conflict.
Language, though, is not the root of all evil any more than religion
is. There is no “root of all evil”. Evil has many causes, many roots.
The challenge is to identify them, including the evils of certain
religious doctrines – but also the evils of certain scientific
doctrines, political doctrines, economic doctrines and philosophical
doctrines. One of the roots of evil is dogma and dogmatism, but there
are scientific dogmas as well as religious, political and social
dogmas. It seems sensible and mature to look for the evil in one’s own
belief system before attacking the beliefs of others, and to make a
clear distinction between the two different meanings of what it is to
be wrong. Wrong can mean incorrect; it can also mean evil or wicked.
This is the important difference between mad and bad, between delusion
and crime. We all agree that crime is wrong (though we may differ on
what constitutes a crime). Delusions are also wrong, in that they are
not true and correct beliefs, but they are not wrong in a moral sense.
There is an important difference between being wrong and doing wrong.
Being wrong means that you are mistaken, and this can be corrected.
There is no moral condemnation of people being wrong, and though they
might resist admitting it to themselves or others, being wrong about
things is not evil or sinful. Doing wrong is a different matter. To do
wrong is to sin and all sins are evil or wicked, according to the
shared religious traditions of East and West. Some sins, like murder,
rape and stealing are condemned by all religions except those that
truly worship evil (and there are such religions, though with
fortunately few adherents).
The regime of Adolf Hitler is almost universally regarded as evil. The
exception is the neo-Nazis, who continue to hero-worship Hitler and
the Nazi regime, and who are mostly in the West – the UK and Europe,
and North America (the USA and Canada). I personally regard Hitler as
one of the most evil men in history and regard the Nazis with
revulsion, but not
because of Hitler’s supposed “paganism” and interest in Theosophy. I
don’t care if Hitler was a vegetarian and cared about animals, nor if
he truly loved Eva Braun; what matters to me is the genocide that he
presided over.
The undoubted evil of Hitler has been used by both sides of the
argument for and against religion. The atheists claim that Hitler was
a Catholic, the Catholics claim that he was an atheist or pagan. The
Nazis used a swastika – an ancient Indian sacred symbol – rather than
the cross for their flags (though the swastika has a cross as its
integral design). The atheists also argue that the Catholic Church
celebrated
Hitler’s birthday, while their opponents argue back that it was under
duress, and that Catholic priests tried to save many Jews. I don’t
expect to resolve these arguments, but I think we can all agree that
Hitler was evil and so was the entire Nazi apparatus. In the 1940s,
while this apparatus grew in power and danger, and started killing the
inmates of its mental asylums in anticipation of the Holocaust, the
Germans led the world not in religion but in science. After the war it
was Nazi scientists who were recruited by the Americans under
Operation Paperclip, not German theologians.
The Holocaust was not caused by the isolated, deranged mind of an
individual. The Nazi apparatus was a highly organized, professional
killing machine guided by the best scientific minds in the business
and the latest technology. German science was rivalled only by that of
Japan, Britain, Russia and the USA, not necessarily in that order. It
was the German drug company Bayer that gave the world heroin, which was
marketed as a non-addictive way of treating opium addiction. It was
Germany that led the world in the neurosciences – the study of the
brain, which advanced in leaps and bounds when they started killing
mental patients and collecting their brains, to section, stain and
study under the microscope. The University of Heidelberg had the
biggest collection of brains – not in academia, but in formalin. It
was medical doctors, trained by psychiatrists, who decided, during the
notorious Aktion T4 program, which mental patients were ‘curable’ or
‘incurable’. The German psychiatrist Professor Emil Kraepelin, who
first described “dementia praecox” (later named schizophrenia) and
“manic depression” in the late 1890s at Heidelberg University, was
internationally recognised as an expert in classifying some people as
mad, with the assumption that the system (represented by himself) was
sane. A few years after Kraepelin died, these same criteria were used
to decide whether to send people to the gas chamber (another
scientific achievement of the Germans). The Nazi psychiatrist Karl
Schneider orchestrated the killing of mental patients and collected
their brains for the university’s collection. It was all very
scientific.
The other motivation for the evils of Nazism was the creation of an
Aryan super-race. This was a scientific aim, based on a nationalistic
German interpretation of anthropology and history, but rationalised by
the Darwinian idea of survival of the fittest as interpreted by
Darwin’s cousin, Francis Galton at Cambridge University in England.
This was the doctrine of eugenics, which Galton hoped would become the
“religion of the future” when he promoted it from the 1880s till his
death in 1911. Eugenics means ‘good breeding’, from the Greek eu,
meaning good, and genos, meaning race, stock or kin. The enthusiasts
of eugenics hoped that the most superior classes and individuals of
their various nations would be encouraged to have large families,
while the most inferior classes and individuals would be sterilized,
segregated and prevented from contributing their “blood” or
“inheritance” to the next generation. This was based on their
understanding of Darwin’s evolutionary theory, as applied to human
society.
There is an inherent assumption here that science is good – scientific
status for a theory or hypothesis is something to be desired. This
desirability reflects the status of science, which rests on its
credibility, which in turn rests on its history of marvellous
achievements across the scientific disciplines and on the
technological wonders that have transformed human society – the motor
car, airplane, radio and TV, vaccines, knee and hip replacements,
insulin and penicillin (and a range of valuable drugs), not to mention
the miracle of the computer. To a man living in the Stone Age, or even
the Middle Ages, these achievements would indeed be miraculous.
Dawkins has extended his debunking of religion to the debunking of
what he regards as “pseudoscience”, including the modern guru of
Ayurveda, Dr Deepak Chopra. There is a lot to criticise in Deepak
Chopra’s science, including his interpretations of the Ayurvedic
concepts of vata, pitta and dosha “types” with corresponding herbal
supplements, but Dawkins focused on his claims about “Quantum
Healing”:
https://www.youtube.com/watch?v=qsH1U7zSp7k
Deepak Chopra says that “Quantum Healing” is “the theory that a shift
in consciousness causes a shift in biology”. This cannot be denied,
but what does it have to do with quantum theory in physics? At first,
Chopra says that this is just a “metaphor”, but then he goes on to try
and justify his use of the term. Dawkins isn’t interested in trying to
understand him, his agenda is to discredit Chopra as preaching
pseudoscience – what another debater more rudely called “woo-woo” –
scientific-sounding words and phrases that do not make sense on deeper
consideration.
Deepak Chopra got a more sympathetic hearing from Dr Rupert Sheldrake,
a British biologist who has been writing about his theories of
“Morphogenic Fields” and “Morphic Resonance”, which he argues support
the idea that minds can exist independent of brains. Chopra has
control of the interview:
https://www.youtube.com/watch?v=l1CcOQnG0uM
Reductionism and mechanistic frameworks don’t work, according to
Chopra.
Another application of Popper’s paradigm is in the double-blind trials
of various drugs, which have evolved into so-called “evidence-based
medicine”. It has become increasingly apparent that drug trials are
systematically biased in favour of the drug companies, partly due to
conscious but mostly due to unconscious factors. We all know
that the drug companies are corrupt and that they drive
medical research, education and clinical practice. A young psychiatry
registrar told me recently that “we all know that drug company
corruption is rampant”. He also admitted that the drug companies
“drive” medicine – but completely failed to introspectively perceive
the influence of this corruption on his own thinking about medicine.
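The way this bias can operate is easy to demonstrate. The following is
a minimal sketch in Python – the trial sizes, the number of trials and
the publication cutoff are hypothetical numbers of my own, chosen only
for illustration – of how selective publication alone can make a drug
with no true effect appear beneficial across the published literature:

    # Sketch: how selective publication alone can bias the apparent effect
    # of a drug that in truth does nothing (all numbers are illustrative).
    import random
    from statistics import mean

    random.seed(1)

    def run_trial(n=50):
        """Simulate one placebo-controlled trial of a drug with zero true effect."""
        drug = [random.gauss(0, 1) for _ in range(n)]     # outcomes on the drug
        placebo = [random.gauss(0, 1) for _ in range(n)]  # outcomes on placebo
        return mean(drug) - mean(placebo)                 # observed "benefit"

    effects = [run_trial() for _ in range(200)]
    published = [e for e in effects if e > 0.1]           # only favourable trials see print

    print(f"mean effect over all trials:       {mean(effects):+.3f}")   # near zero
    print(f"mean effect over published trials: {mean(published):+.3f}") # spuriously positive

No fraud is needed; a filter on which results see print is enough to
shift the apparent average, which is one reason “evidence-based
medicine” is only as good as the completeness of the evidence it
draws on.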
Popper defined what is and is not a scientific theory on the basis of
whether it can be falsified. A hypothesis or theory that cannot be
falsified is not scientific, in his view. He argued that for this
reason contemporary Marxism and also Freudian psychoanalysis did not
qualify as science. His scepticism towards the pseudoscience of
psychoanalysis arose after initially studying under Alfred Adler, who
developed psychoanalysis with Sigmund Freud. Adler was the first major
figure to break away from psychoanalysis to form an independent school
of psychotherapy and personality theory, which he called individual
psychology because he believed a human to be an indivisible whole, an
individuum. Freud declared Adler's ideas too contrary, leading to an
ultimatum to all members of the Psychoanalytical Society (which Freud
led) to drop Adler or be expelled, disavowing the right to dissent.
There have been many reasonable allegations that Freud’s methods were
not scientific, nor were his conclusions scientifically valid. There
is no scientific evidence of the id, ego and superego, of the Oedipus
and Electra complexes, or of some of his more fanciful ideas. Others,
like ego defence mechanisms, the pleasure principle and the role of
the unconscious, have been widely accepted, even by many who don’t
believe in his declarations about “penis envy” or “anal fixation”.
Homo sapiens was named in 1758 by Linnaeus.
Scientists have made progress in their quest to understand why some
people have light skin and others have dark skin. I have a basic
belief in this science, but also some awareness of the mistakes
accepted scientific experts have made in the past. The inventor of
zoological taxonomy, Baron Carl von Linné, scientifically known as
Carolus Linnaeus (1707-1778), was the Swedish biologist and physician
who devised the modern system of classification of living creatures
and the system of Latin names that we use to this day. He classified
the humans of the 18th century as belonging to several varieties,
based on his understanding of science. Skin colour was the primary
criterion he used in classifying humans, assigning each variety to a
respective continent.
The 1735 classification of Carolus Linnaeus, inventor of zoological
taxonomy, divided the human species Homo sapiens into continental
varieties of Europaeus, Asiaticus, Americanus and Afer, each
associated with a different humour: sanguine, melancholic, choleric
and phlegmatic respectively. Homo sapiens Europaeus was described as
active, acute and adventurous, whereas Homo sapiens Africanus was
crafty, lazy and careless.
Linnaeus subdivided the human species into four varieties based on
continent and skin colour: "Europæus albus" (white European),
"Americanus rubescens" (red American), "Asiaticus fuscus" (brown
Asian) and "Africanus Niger" (black African). In the tenth edition of
Systema Naturae he further detailed stereotypical characteristics for
each variety, based on the concept of the four temperaments from
classical antiquity, and changed the description of Asians' skin tone
to "luridus" (yellow). Additionally, Linnaeus created a wastebasket
taxon "monstrosus" for "wild and monstrous humans, unknown groups, and
more or less abnormal people”.
The Americanus: red, choleric, righteous; black, straight, thick hair;
stubborn, zealous, free; painting himself with red lines, and
regulated by customs.[21]
The Europeanus: white, sanguine, browny; with abundant, long hair;
blue eyes; gentle, acute, inventive; covered with close vestments; and
regulated by customs.[22]
The Asiaticus: yellow, melancholic, stiff; black hair, dark eyes;
severe, haughty, greedy; covered with loose clothing; and regulated by
opinions.[23]
The Afer or Africanus: black, phlegmatic, relaxed; black, frizzled
hair; silky skin, flat nose, tumid lips; females without shame;
mammary glands give milk abundantly; crafty, sly, careless; anoints
himself with grease; and regulated by will.[24]
The German anthropologist Johann Blumenbach (1752-1840), Professor of
Medicine and Director of the Museum of Natural History at Göttingen,
included descriptions of sixty human skulls collected from around the
world to justify his view, which was regarded as scientific at the
time (and is certainly more scientific than the view that we are of
different species), that humans were one species but of different
races, the defining characteristic being difference in skin colour.
Blumenbach classified us as belonging to either the Caucasian or white
race; the Mongolian or yellow race, including all East Asians and some
Central Asians; the Malayan or brown race, including Southeast Asians
and Pacific Islanders; the Ethiopian or black race, including
sub-Saharan Africans; or the American or red race, including American
Indians.
Charles Darwin sided with those who regarded the different races as
having a common ancestor, and that we all belong to the same species,
Homo sapiens. He only came out publicly with his theories about the
evolution of humans from primate ancestors in 1871, with The Descent
of Man, having published the revolutionary On the Origin of Species
twelve years earlier in 1859. It did not take as long for Darwin’s
cousin Francis Galton to come out with Hereditary Genius, which was
published in 1869, two years before The Descent of Man. Galton
went on to found the Society for Eugenics with Charles Darwin’s son,
Major Leonard Darwin. Galton envisaged eugenics as “the religion of
the future”, in which human breeding would be based on the same
rational science that allowed us to breed superior horses and dogs.
The superior
races and classes would be encouraged to have large families and the
inferior – or undesirable – classes and races would be prevented from
breeding. He also popularised the idea of “miscegenation” – pollution
of blood lines. This was behind the policies of racial segregation, as
well as the idea that “primitive races” such as Australia’s Aborigines
and Sri Lanka’s Veddhas, were “naturally” destined to die out, and
that nothing could be done to stop this.
In Hereditary Genius Galton, who pioneered the idea of IQ tests and
statistical analyses of these, argued that his research at Cambridge
established that the African Black was, on average, two “grades” below
the White Caucasian in what he described as “civic worth”, which he
equated with “genius” or intelligence. His methodology purported to be
scientific and mathematical, though it was criticised at the time (and
has been since) as being neither. For example, he calculated “civic
worth” from the number of famous people and high-status professionals
born into upper versus lower classes, concluding that upper-class
families are naturally endowed with genius. Differences in educational
opportunities, hereditary monarchies and aristocracies, nepotism and
the other obvious reasons for people becoming famous in European
history or prominent in British society of the day (both were included
without discrimination between the two categories) were not considered
in Galton’s mathematical analyses of “civic worth”. It so happened
that his own family was such a family, liberally endowed with famous
men, though this doesn’t disprove the possibility that some aspects of
intelligence are indeed hereditary. Musicality, for example, or
ability with numbers or languages. Galton does not discuss such things
– his conclusion, based on his experiences in Africa, was that the
“African Blacks” have a “slavish instinct” and “naturally fall into
the ways of slavery”. He formed this opinion after making the less
than scientific observation that the porters employed to carry his
loads, on his unsuccessful attempt to reach Lake Nyasa from the
western coast, regarded him as their “owner”. Maybe the history of
Europeans in Africa had something to do with it.
Galton categorised British society according to the earnings of the
male breadwinners to conclude that the upper classes were genetically
superior to the lower classes, but that there was a growing danger of
the superior, educated classes being out-bred by the inferior, less
intelligent lower classes. He used bell curves to illustrate his
points; indeed, his arguments were based on the application of
statistical bell curves. In the same way that bell curves describe the
variation in height in a society, Galton argued, they also describe
intelligence, as indicated by the IQ tests he devised and his
evaluation of history.
He argued that there is, according to statistical probability, the
occasional Black who reaches a prominent position by virtue of his
genius, such as Toussaint Louverture, who led the Haitian Revolution
of 1791.
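The arithmetic behind this concession is simply the behaviour of the
normal curve’s tails. Here is a minimal sketch in Python – the mean of
100, standard deviation of 15 and threshold of 145 are modern
IQ-style conventions used purely for illustration, not Galton’s actual
figures – showing that even a population whose average is shifted well
downwards is still predicted to produce occasional individuals in the
far upper tail:

    # Sketch of Galton-style bell-curve reasoning (illustrative numbers only).
    from statistics import NormalDist

    mean, sd = 100, 15          # modern IQ-style scaling, not Galton's "grades"
    threshold = 145             # three standard deviations above the reference mean

    for shift in (0, -10, -20): # 0 = reference group; negative = shifted group means
        dist = NormalDist(mean + shift, sd)
        p = 1 - dist.cdf(threshold)        # probability of exceeding the threshold
        print(f"group mean {mean + shift}: about {p * 1_000_000:.0f} per million above {threshold}")

The sketch shows only that a non-zero upper tail survives any downward
shift of the mean; it says nothing about whether the shift Galton
claimed was real, which is precisely what his critics disputed.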
Race is also a scientific concept, credited to Georges-Louis Leclerc,
Comte de Buffon (7 September 1707 – 16 April 1788), a French
naturalist, mathematician, cosmologist and encyclopedic author.
Buffon and Johann Blumenbach were believers in monogenism, the concept
that all races have a single origin. They also believed in the
"Degeneration theory" of racial origins. They both said that Adam and
Eve were Caucasian and that other races came about by degeneration
from environmental factors, such as the sun and poor diet. They
believed that the degeneration could be reversed if proper
environmental control was taken, and that all contemporary forms of
man could revert to the original Caucasian race.[11]
Buffon and Blumenbach claimed that pigmentation arose because of the
heat of the tropical sun. They suggested cold wind caused the tawny
colour of the Eskimos. They thought the Chinese relatively fair
skinned compared to the other Asian stocks because they kept mostly in
towns and were protected from environmental factors. Buffon said that
food and the mode of living could make races degenerate and
distinguish them from the original Caucasian race.[11]
Buffon believed humanity was only 6000 years old (the time since
Adam). Believing in monogenism, Buffon thought that skin colour could
change in a single lifetime, depending on the conditions of climate
and diet.[12]
Buffon was an advocate of the Asia hypothesis; in his Histoire
Naturelle, he argued that man's birthplace must be in a high temperate
zone. As he believed good climate conditions would breed healthy
humans, he hypothesized that the most logical place to look for the
first humans' existence would be in Asia and around the Caspian Sea
region.
Scientific doctrines, too, have led humanity astray in the past,
including those of Darwin, which directly led to both Social Darwinism
and the doctrines of eugenics, formulated by Darwin’s cousin, Francis
Galton, in direct response to the Origin of Species. Galton applied
Darwin’s theory, as he understood it, to “scientifically” and
“statistically” justify his conclusion that the African Black was, on
average, “two grades” less intelligent than the “White race” or
Caucasians.
In 2014 Pope Francis announced that evolution and the big bang theory
are in fact real, and one should not gather from the Book of Genesis
that God is a “magician with a magic wand.”
I do not believe in the supernatural or any of the Biblical miracles,
nor any of the alleged miracles from other religious traditions. I
don’t believe in supernatural miracles; I do believe in natural
miracles.
The sun climbs higher in the sky as the morning progresses and performs
various natural miracles as it moves through the sky. Flowers open and
leaves imperceptibly grow, birds start to sing and then grow quiet,
flocks of migratory birds and butterflies continue their journeys and,
in the deep layers of the skin of human beings that are exposed to its
rays, melanocytes produce the brown pigment melanin and other cells
produce vitamin D from cholesterol. I believe, as a result of my
medical training, that vitamin D is necessary for the healthy growth
of bones, and that deficiency of this vitamin causes the childhood
disease of rickets, which causes bones to weaken and deform. I also
believe, on the basis of what I regard as a sensible hypothesis I read
some years ago, that it is the need to produce vitamin D that is most
likely the evolutionary cause of light and dark skin, although folic
acid may have something to do with it, as has been recently
hypothesised. I also believe in the whole miracle of evolution by
natural selection.
As I said, I do not regard these facts as miracles in the religious
sense of the word. They are, though, marvellous and awe-inspiring;
they are numinous. They are also scientifically explainable, using
what science already knows – as I think all things can be, though this
might require a broadening
of the definition of science beyond the influential philosopher Karl
Popper’s stringent definition of what science is. Popper defined
scientific theories by falsifiability – if a theory cannot be
falsified, it is not truly scientific. At the same time, we cannot be
absolutely certain about anything – just less doubtful.
Popper’s definition has a certain elegance to it, and was taught to me
as fact when I studied medicine in the 1980s, but it is not how most
people, including scientists and doctors, think in practice. Much of
what we believe, we believe on the basis of faith – faith in Science
and faith in Medicine. This involves faith in the integrity of other
scientists and the integrity of the institutions that conduct science.
How justifiable is this faith, this belief in Science? How justified
is it in Medicine? How much of medicine is science and how much is the
art of healing? What is the relationship between medicine and health,
and where does science come into it? What is the nature of the
relationships between religion and health, religion and medicine and
religion and science?
To understand the relationship between science and religion we might
consider the science of healing, since all religions profess to heal
and have ancient texts about what health is and how health can be
achieved. In the Western tradition healing has been dominated by
medical doctors and Christian priests. The doctors healed through
science and the priests healed through prayer. The doctors did a far
better job of healing, and together with nurses, became accepted as
the “healing professions”. Later clinical psychologists,
physiotherapists and other “allied health professionals” claimed the
mantle of “healing professions”. The credibility of the medical system
is based on its claims to scientific truth and validity as well as its
results. Prayer too can be judged by its results, despite the fact
that it makes no claim to be scientific.
An American study published in 2008 of over 1,600 patients undergoing
heart surgery showed that prayers offered by strangers had no effect
on their recovery. This
study showed that patients who knew they were being prayed for had a
higher rate of post-operative complications like abnormal heart
rhythms. The researchers suggested that this was perhaps because of
the expectations the prayers created, with the patients feeling under
pressure to get well, and becoming more stressed as a result of this.
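Judging prayer “by its results”, as I suggest above, is an ordinary
statistical comparison. Below is a minimal sketch in Python of the
kind of two-proportion test such a study would use – the group sizes
and complication counts here are hypothetical numbers of my own, not
the actual data from the study:

    # Sketch of a two-proportion z-test on complication rates (hypothetical data).
    from math import sqrt
    from statistics import NormalDist

    # Hypothetical counts, NOT the actual figures from the study:
    complications_a, n_a = 315, 530   # patients who knew they were being prayed for
    complications_b, n_b = 270, 530   # comparison group

    p_a, p_b = complications_a / n_a, complications_b / n_b
    pooled = (complications_a + complications_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test

    print(f"complication rates: {p_a:.1%} vs {p_b:.1%}; z = {z:.2f}, p = {p_value:.3f}")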
Over the years my faith in the integrity of Western science has taken
something of a beating, and in some areas my faith has been lost. This
is a good thing, I think. In some areas I believed on the basis of
faith rather than evidence. I had faith in Medicine, based on the
false assumption that Medicine is a branch of science and that what I
had learnt at university and in my hospital training was “scientific
medicine”. Science, from the Latin scientia, means “knowledge”, and I
assumed that what was in the textbooks and came out of the mouths of
professors was “true knowledge”. I was aware that mistakes had been
made by scientists and doctors in the past, but I assumed that the
fundamentals of what I learned in physics, chemistry, anatomy,
pharmacology, physiology and biology were factual. I also had a deep
faith in the institutions that promulgated this knowledge –
universities, especially those in Australia (where I studied), the UK
(where I was born) and the USA (where most of my textbooks came from).
During the years I worked as a family doctor I grew increasingly
disillusioned with the medical system for a number of reasons. One was
the role of the pharmaceutical industry – “big pharma” as it is
sometimes called – in shaping what medical students and doctors learn.
This has a predictable effect on what they prescribe when they are
treating patients – meaning which drug they choose, from a range of
drugs marketed under various labels. Sometimes the same drug is
marketed under two brand names by the same drug company to give the
consumer an illusion of choice. Treating the public is a massive
profit-driven industry, driven by demand for “treatment services”, of
which there are many alternatives.
The dominant treatment services are based on western medical science –
orthodox drug and surgery oriented medicine based on a disease model.
Much of medical students’ training is concerned with how to
diagnose various diseases. This is based on history and examination,
followed by appropriate investigations to arrive (by logical
reasoning) at the most likely diagnosis (disease label) along with
less likely possibilities (differential diagnosis). Once a provisional
diagnosis has been made, this may be confirmed by various
investigations (such as scans or blood tests) or the patient may be
treated entirely based on the “clinical examination”. This is the
style of medicine that I learned at the University of Queensland in
the 1980s, and my father learned at Cambridge University in the 1950s.
The fundamental style of western medicine has not changed, but there
has been an increasing focus, since the 1950s, on “social and
preventive medicine”. In preventive medicine, the ostensible objective
is to prevent illness before it develops, which can be done through
“screening tests” for various disease markers, and promotion of
healthy thinking and activities. Lending itself to corruption and
abuse, preventive medicine also promotes the ingestion of drugs, not
for the treatment of illness but for its prevention. This means giving
drugs to people who are, at the present moment, perfectly well, based
on statistical evidence that the drug, for example, lowers cholesterol
or blood pressure.
There are good scientific reasons for treating high blood pressure and
LDL cholesterol, based on convincing evidence that both promote
atherosclerosis (‘hardening of the arteries’) which in turn leads to
ischaemic heart disease, stroke and kidney failure. The pathogenesis
of ischaemic heart disease, stroke and kidney failure are amongst the
areas of medicine about which much is known, and many scientific
studies have been done. They have also provided a bonanza for drug
companies and advertisers who used fear of cholesterol to market
everything from “low cholesterol potato chips” to statin drugs (one of
the biggest components of the pharmaceutical budget).
I was working as a family doctor (GP) in Melbourne when the first
statin drugs were released. Prior to these drugs there were unpleasant
drinks that were supposed to lower cholesterol, but it was only with
the release of the statins in the 1990s that they became central drugs
in cardiology. Studies have shown that the statins, by raising
the ratio of “good” (HDL) to “bad” (LDL) cholesterol, reduce the
recurrence rate after heart attacks. They have not been shown to
reduce stroke, peripheral vascular disease or ischaemic kidney
disease, which are also thought to be caused by atherosclerosis (which
is macroscopically visible as blockages of arteries).
The difference between LDL and HDL cholesterol was ignored (though
known) in the interests of marketing. While I was testing and
monitoring the HDL/LDL ratio in my patients with angina and heart
disease in the 1990s, there were still drug company reps and ads
referring to cholesterol as if it were some sort of poison, and urging
people to have simple screening tests to measure their “total
cholesterol”. This was bad science and frightened people unnecessarily
(perhaps increasing their rate of atherosclerosis), since the total
cholesterol includes both cardio-toxic and cardio-protective types
(LDL or low density lipoprotein and HDL or high density lipoprotein
respectively).
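The arithmetic of the problem is simple enough to show directly. Here
is a minimal sketch in Python, with hypothetical lipid values in
mmol/L (the “other” fraction standing in, crudely, for the remaining
lipoproteins), of how two patients with identical “total cholesterol”
can have very different LDL/HDL profiles:

    # Why "total cholesterol" alone can mislead (hypothetical values, mmol/L).
    def lipid_summary(name, ldl, hdl, other=0.5):
        """Total cholesterol lumps protective HDL together with harmful LDL."""
        total = ldl + hdl + other       # 'other' crudely stands in for remaining fractions
        ratio = ldl / hdl               # lower is better
        print(f"{name}: total cholesterol {total:.1f}, LDL/HDL ratio {ratio:.1f}")

    # The same total cholesterol, but very different risk profiles:
    lipid_summary("Patient A", ldl=4.5, hdl=1.0)  # high LDL, low HDL
    lipid_summary("Patient B", ldl=3.0, hdl=2.5)  # lower LDL, high HDL

Both patients print the same total, which is exactly why a single
screening number for “total cholesterol” told patients so little.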
The known fact that cholesterol is a vital and essential molecule in
human physiology was pushed from consciousness, outside that of
biochemists and endocrinologists who knew that this ubiquitous
molecule is an essential component of cell membranes, and the precursor to
all the steroid hormones as well as vitamin D. Instead the public came
to believe that cholesterol is bad and statins are good. That is,
until people started becoming sick with muscle pains, diabetes and the
other problems that can be caused by statins. There were also reports
that the statins could cause decline in cognitive function.
The physical sciences and the social sciences.
There were some areas in which I was less confident about the sources
I was learning from, as far as their scientific merits were concerned.
These were psychology, sociology and psychiatry. I only did a couple
of semesters of anthropology and found only the first one – when we
learned about Australopithecus, Homo erectus and Homo habilis fossil
discoveries in Africa – of interest. The rest of the “humanities” was,
to my increasingly materialist, rationalist way of thinking, too vague
and imprecise. There was much jargon, but the terms were poorly
defined, there was little convincing evidence with numerous rival
theories and every study pointed out “how little we know”. This
attitude is sometimes genuine scientific humility, where every
statement and conclusion is appropriately qualified. At other times it
is used to justify further research grants, or to avoid actually
saying anything clearly. This was, after all, the Age of Behaviorism.
Though what has been retrospectively called the Cognitive Revolution
had begun two decades earlier in the same nations that gave us
behaviourism, here in the Antipodes, I learned, in 1978, that
“psychology is defined as the behavior of organisms” and that
psychology is both “scientific” and “curious”. It is certainly
curious, and one of the curious things about behaviourism is how they
ignored the importance of mind and thinking when they “scientifically”
studied the psyche. I did not gain an understanding of the psyche from
my prescribed psychology textbook, because this textbook was
Psychology and Life by Stanford University’s Philip Zimbardo. Zimbardo
is best known for the glaringly unethical Stanford Prison Experiment,
conducted by the psychology professor in 1971.
http://en.wikipedia.org/wiki/Stanford_prison_experiment
This famous (or notorious) experiment was conducted from 14 to 20
August 1971. It was funded by the US Navy and Marine Corps and was
conducted in a “mock prison” which was constructed in the basement of
the Stanford psychology building. Zimbardo advertised for students who
were interested in being paid $15 a day to participate in the research
into “prison life”. They selected 24 young men, who were thought to be
psychologically well balanced, dividing them (how randomly one can’t
tell) into 12 guards and 12 prisoners. Zimbardo himself took the role
of “superintendent”. The prisoners were forced to stay in the basement
prison for the week, while the guards were allowed to go home after an
8-hour shift. The guards, who had been instructed to psychologically
but not physically abuse the prisoners, became increasingly tyrannical
and cruel as the experiment progressed, devising what can only be
described as evil ways of
humiliating and demeaning their fellow students who had been assigned
the role of “prisoner”. Original video footage of the experiments can
be found on YouTube, as well as many interviews where Zimbardo refers
to his seminal research into the nature of evil.
The Stanford Prison Experiment was bad science, in that it was
immoral. This immorality was apparent from the outset, but only one of
Zimbardo’s co-researchers objected to it (resulting in the experiment
being stopped early). The conclusions Zimbardo reached were also
flawed, as subsequent research has shown, and the US military, which
funded the research, appears to have learned nothing from the well-
known experiment other than to refine its psychological torture
techniques. How to torture effectively is what Zimbardo was clearly
studying in the first place, though he claimed to be testing (and
successfully refuting) a scientific hypothesis (that inherent
personality traits of individual soldiers and guards are the main
cause of the abuse of prisoners, rather than social factors, such as
the power of the system, and assigned roles). Zimbardo’s
interpretation of the experiment was that prisoners and guards
“internalized” their identities as prisoners and guards respectively,
the prisoners becoming submissive and passive, and the guards becoming
increasingly sadistic and cruel. One problem, both ethically and
scientifically, was the instruction he gave the 12 guards the day
before the experiment began:
“You can create in the prisoners feelings of boredom, a sense of fear
to some degree, you can create a notion of arbitrariness that their
life is totally controlled by us, by the system, you, me, and they’ll
have no privacy. We’re going to take away their individuality in
various ways. In general what all this leads to is a sense of
powerlessness. That is, in this situation we’ll have all the power and
they’ll have none.”
Another problem is that he armed the guards – with wooden batons. The
prisoners did not know that Zimbardo had instructed the guards to
inflict only psychological and not physical harm. Where, though,
is the line between what is psychological and what is physical, and is
it not obvious that threats of violence, backed by weapons and locked
doors, can create submission? It was said, long before Zimbardo
studied the matter, that power corrupts. He himself referred to the
experiment as being to do with power and powerlessness.
Is knowledge power, or does power come from locked doors, batons and
reflecting sunglasses? This experiment does tell us something about
power, and how it was misunderstood by Zimbardo and the many social
psychologists who drew dubious conclusions about the relationship
between power and evil from the Stanford Prison Experiment and
Zimbardo’s interpretation of it.
It doesn’t help that Professor Zimbardo has long sported a little
goatee beard, which makes him look rather like the stereotypical image
of Satan. There are more sympathetic photos of him, but I rather like
this one.
Naming something is not explaining it, and neither is identifying it.
Western science and Science
Douglas Nakashima and Marie Roué have made a clear distinction between
Western science and “traditional knowledge”, which they equated with
“Indigenous knowledge”. In 2002 they explained, in the Encyclopaedia
of Global Environmental Change, that “Western science favours
analytical and reductionist methods as opposed to the more intuitive
and holistic view often found in traditional knowledge. Western
science is positivist and materialist in contrast to traditional
knowledge, which is spiritual and does not make distinctions between
empirical and sacred. Western science is objective and quantitative as
opposed to traditional knowledge, which is mainly subjective and
qualitative. Western science is based on an academic and literate
transmission, while traditional knowledge is often passed on orally
from one generation to the next by the elders. Western science
isolates its objects of study from their vital context by putting them
in simplified and controllable experimental environments—which also
means that scientists separate themselves from nature, the object of
their studies; by contrast, traditional knowledge always depends on
its context and particular local conditions.”

The Western medical model is not the only scientific model that claims
to be able to heal. There are also rival Traditional Chinese Medicine
(TCM) and Indian Ayurvedic models that compete for customers in
Australia and the West, as in India and China. In addition, there are
various Western healing methods that claim scientific validity but are
regarded, for various reasons, as “alternative” or “complementary”
medicine – naturopathy, chiropractic, homeopathy and herbalism, for
example. A few Western-trained doctors like Deepak Chopra have
attempted a fusion between Vedic and Western science, but there are
serious problems with this enterprise. I discovered this when I tried
to integrate Eastern and Western health models twenty years ago.
When I was working as a GP in Melbourne in the 1990s, I was once
consulted by a young woman who had visited an acupuncturist for a
sprained wrist. She had been told, after an examination of her tongue
and pulse, that she had “warm Chi affecting the spleen” and “cold Chi
affecting her kidneys”. She had also been told that she needed to take
a concoction of herbs to correct the “moistness” in her liver. The
lady was insistent that I do some blood tests to “check” her liver,
kidneys and spleen. I took a careful history and examined her, finding
no signs of kidney or liver disease, nor evidence of enlargement of
her spleen, and tried to explain to her that what Chinese medicine
means by “liver”, “kidney” and “spleen” is not really the same as what
these organs mean to Western doctors. She wanted the blood tests
anyway, so I ordered a biochemical screen – checking her
electrolytes, liver function tests and blood count. They were all, as
I expected them to be, normal.
This event prompted me to look more closely at the theoretical basis
of Traditional Chinese Medicine (TCM) to see if the core concepts of
Yin and Yang, chi and meridians can be understood in terms of the
fundamental Western scientific medical disciplines of anatomy and
physiology. The meridians are channels through which the chi (which I
interpreted as energy of some sort) flowed. The meridians, which
determine acupuncture points, do not correspond to either the nervous
system or circulatory system in anatomy or function, and they have to
be added on rather than integrated with Western science.
The concepts of Yin and Yang are also integral to Chinese Medicine.
Yin (black with the white spot in the symbol) is the feminine and Yang
(white with the black spot) is the masculine, which has obvious
correspondence with Western science. One could regard the female
hormones, like oestrogen, as Yin and the masculine hormones like
testosterone as Yang, as well as other distinctly feminine and
masculine aspects of anatomy, physiology and psychology. This is not
what yin and yang mean in Chinese medicine, science and philosophy,
however.
Wikipedia explains that “In Chinese philosophy, yin and yang (also,
yin-yang or yin yang) describes how apparently opposite or contrary
forces are actually complementary, interconnected, and interdependent
in the natural world, and how they give rise to each other as they
interrelate to one another. Many tangible dualities (such as light and
dark, fire and water, and male and female) are thought of as physical
manifestations of the duality symbolized by yin and yang. This duality
lies at the origins of many branches of classical Chinese science and
philosophy, as well as being a primary guideline of traditional
Chinese medicine.”
Chris Kresser, armed with a Master of Science but not a medical
degree, claims to be practicing “integrative and functional medicine”,
and has his own website that announces:
“Chris Kresser, M.S., L.Ac is a globally recognized leader in the
fields of ancestral health, Paleo nutrition, and functional and
integrative medicine.”
Kresser assures his readers that acupuncture does work, and provides
purely scientific explanations for why it works – scientific, meaning
Western scientific. The fact that the Chinese meridian system does not
correspond to the nervous system does not stop Kresser from claiming
that acupuncture works through its effects on the nervous system and
stimulation of pain receptors in the skin. This, he says, restores
homeostasis, increases blood flow to the area, and causes the release
of pain-killing hormones (endorphins) from the brain. This theory is
scientific and also satisfies Karl Popper’s criterion that a
scientific theory be falsifiable. The theory that acupuncture works
because of the release of endorphins can be tested and measured,
supporting or disproving the theory. Kresser gives no evidence that he
is basing his claims on such studies; it may be mere speculation.
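To make concrete what “can be tested and measured” would look like,
here is a minimal sketch in Python – the endorphin levels are invented
numbers of my own, and the real-versus-sham design is the standard one
for such trials, not anything Kresser reports – of the comparison such
a study would come down to:

    # Sketch: testing the endorphin hypothesis (hypothetical measurements).
    from statistics import mean, stdev
    from math import sqrt

    # Invented plasma beta-endorphin levels (pg/mL) after treatment:
    real_acupuncture = [42.1, 39.5, 44.8, 41.0, 43.3, 40.7, 45.2, 38.9]
    sham_acupuncture = [37.2, 36.8, 39.1, 35.5, 38.4, 36.0, 37.9, 38.8]

    def welch_t(a, b):
        """Welch's t statistic: difference in means scaled by combined uncertainty."""
        var_a, var_b = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
        return (mean(a) - mean(b)) / sqrt(var_a + var_b)

    print(f"t = {welch_t(real_acupuncture, sham_acupuncture):.2f}")
    # A clear rise after real but not sham needling supports the endorphin theory;
    # its repeated absence would count against it - which is what makes the
    # theory falsifiable in Popper's sense.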
In his posting on how he thinks acupuncture works, Kresser makes
reference to “purists” who object to these efforts to explain
acupuncture in Western scientific terms, thinking that it takes the
magic out of acupuncture. Kresser claims that equating chi (qi) with
energy is incorrect, and that the original meaning of qi was air –
which he equates with oxygen. He writes: “When the terms qi (oxygen),
mai (vessel) and jie (neurovascular node) are properly translated, it
becomes clear that there is no disagreement between ancient Chinese
medical theory and contemporary principles of anatomy and physiology.
Chinese medicine is not a metaphysical, energy medicine but instead a
“flesh and bones” medicine concerned with the proper flow of oxygen
and blood through the vascular system.”
Kresser is a big admirer of Chinese Medicine and claims that the
Chinese knew much more about anatomy and physiology than the West,
with a tradition of anatomical dissection dating back over 2000 years.
He says that the ancient Chinese identified and weighed all the
internal organs, and knew that the heart pumps blood around the body
long before scientists in the West. This discovery, Kresser claims,
was made by the Chinese more than 2000 years ago “but only discovered
in western medicine in the early 16th century”.
Presumably, Kresser is referring to the famous discovery of the
heart-lung circulation by the British physician William Harvey
(1578-1657). Harvey described in detail how the left side of the heart
pumps blood to the body, after receiving oxygenated blood from the
lungs, and how the blood is pumped to the lungs from the right side of
the heart via the pulmonary artery. He also described in detail how
the oxygenated blood from the left side is distributed through
arteries, beginning with the aorta, to the head and brain, and to the
rest of the body. The ancient Chinese knew nothing of these things,
though they may well have known that the heart is the organ that pumps
blood. So, we can be confident, did the ancient Greeks, since this was
the reason that Aristotle proposed that the soul resides in the heart,
back in the 4th century BC. This curious idea arose from his quite
reasonable, but incorrect, view that thought is carried in the blood.
He obviously knew that the heart was the organ that pumped the blood.
Kresser makes the rather startling claim that Chinese medicine was
used by emperors and the royal courts to help them live into their 90s
and stay fertile into their 80s, at a time when the average life
expectancy in the west was 30 years. Men do remain fertile into their
80s, and throughout recorded history some individuals have lived into
their 90s. There has never been a time in recorded history when life
expectancy in the West was 30, and besides, average life expectancy
does not indicate the length of life of the longest-living
individuals. Moreover, if one is going to take 2000-year-old texts
seriously, one might also conclude that the medicine of the ancient
Jews enabled the patriarch Abraham to live hundreds of years.
Contradicting himself in his claim that TCM is a continuous tradition
for 2,500 years, Kresser says that the Huangdi Neijing (Yellow
Emperor’s Book of Internal Medicine) is written in a dialect of
Chinese that hadn’t been in common use in China for more than a
thousand years and that you could show it to a modern Chinese person
and they wouldn’t be able to read it.
The first westerner to attempt a translation of the Huangdi Neijing,
according to Kresser, was a Dutch physician named Willem ten Rhijne
who worked for the Dutch East India Company in Japan from 1683-1685.
He apparently reported clinical success by Chinese and Japanese
practitioners in treating a wide range of disorders, including pain,
internal organ problems, emotional disorders and infectious diseases
prevalent at the time. Kresser says that ten Rhijne accurately
translated the Chinese character for qi as “air”, not energy, in his
reports to the Dutch government.
The translation that became the source for all of the textbooks used
in western schools of Chinese medicine, says Kresser, was done by a
man named Georges Soulié de Morant, a French bank clerk who lived in
China from 1901 to 1917. Enamoured with Chinese culture and
philosophy, he became interested in Chinese medicine and attempted to
translate the Huangdi Neijing, in spite of the fact that he had no
medical training nor any training in ancient Chinese.
Kresser writes:
“It was a huge undertaking for a French bank clerk to translate a
2,000 year old medical text written in an extinct Chinese dialect into
a modern romance language (French). Under the circumstances, de Morant
did well in many respects. But he made some huge mistakes that had
serious consequences for how Chinese medicine has been interpreted in
the west.”
Far from being a scientific document, the Huangdi Neijing is mystical
in its explanation of the laws of diagnosis:
“The laws of diagnosis [are as follows]:
As a rule, it is at dawn,
before yin qi has begun its movement,
before yang qi is dispersed,
before beverages and food have been consumed,
before the conduit vessels are filled to abundance,
when the [contents of the] network vessels are balanced,
before the qi and blood move in disorder,
that, hence, one can diagnose an abnormal [movement in the] vessels.
Squeeze the vessels, whether [their movement] is excited or quiet, and
observe the essence-brilliance.
Investigate the five complexions.
Observe
whether the five depots have a surplus or an insufficiency,
whether the six palaces are strong or weak, and
whether the physical appearance is marked by abundance or decays.
All this is brought together to reach a conclusion [enabling one] to
differentiate
between [the patient's] death and survival.”
The ancient Chinese obviously knew about vessels and blood, but what
are the “five depots” and the “six palaces”?
The Hungarian Sinologist Imre Galambos argues, in reference to the
Huangdi Neijing and its influence, that while in the modern,
“scientific” West it is customary to think that the newer a thing is
the better, in traditional Chinese thought this appears to be just the
opposite; a new thing could be justified and accepted if one could
prove that it has been already mentioned and thought of in ancient
times. As a result of this traditionalistic approach, medicine in
China has been regarded as a body of knowledge which has undergone
very little, if any, changes through the span of history.
Other patients wondered if I knew which foods they should eat to
correct their doshas, after reading Deepak Chopra’s books, expounding
the importance of understanding whether you are of the vata, pitta or
kapha “physiological types”. I had a more open mind about Indian and
Chinese medicine than most of my colleagues, and believed that
acupuncture worked, though I didn’t understand how. I even referred a
few patients to acupuncturists, with mixed results. I also had mixed
results from my referrals to medical specialists, psychologists and
physiotherapists. I never referred anyone to a herbalist or naturopath
and, as I do today, believed that homeopathy is pure placebo.
In 1995 a friend gave me a copy of Book of the Hopi, about the myths
of the Hopi Indian tribe in the USA. It was written in 1963 by Frank
Waters, and became popular with the New Age movement. In this book
Waters draws a comparison between the “scientific” model of the Hopi
and that of the Hindus, in their ideas of energy centres along the
body’s axis, known in the Hindu tradition by the Sanskrit word chakra.
I was
particularly interested in Waters’ association of the brow chakra with
the pineal organ, a small organ in the centre of the brain that
Descartes had argued, back in the 17th century, was the seat of the
soul.
This was my starting point in trying to integrate the Western science
in which I was trained with the Eastern models of India and China. I
first considered the concept that we have energy centres or nodes
along our vertebral axis, and the possibility that, for example, our
speech is related to the health of our thyroid. There is a possible
association – when the thyroid is underactive, speech is slower. When
it is overactive, metabolism and activity, perhaps including speech,
are increased. The same cannot be said for the parathyroid glands,
which are also in the throat, embedded in the lobes of the thyroid
gland. The parathyroids are involved in calcium metabolism, secreting
the parathyroid hormone to regulate blood levels of calcium. Calcium
has a range of cellular functions, and is essential for both muscle
contraction and nerve conduction. Underactivity and overactivity of
the parathyroids affect the bones but do not affect speech. Likewise,
most of the activity of the thyroids has little to do with speech, and
less to do with hearing, which is also said to be controlled by the
throat chakra.
According to Wikipedia “the Vishuddha chakra is located in the neck
and the throat. Due to its association with hearing, it is related to
the ears, and due to its association with speaking, it is associated
with the mouth. Vishuddha is often associated with the thyroid gland
in the human endocrine system.” Adding the parathyroids to the
activity of the Vishuddha chakra allows one to link the chakra also
with the muscles and nerves. But what does this mean in practice, and
how does it help us to promote health?
The first chakra I considered deeply was the brow chakra – known as
Ajna (Sanskrit: आज्ञा) chakra. The ajna chakra is associated with the
“third eye of Shiva” in Hindu cosmology and iconography. The third eye
is represented in the spot drawn on the forehead of Hindu women, which
in Tamil culture is called a “pottu”. The British missionaries banned
Tamil girls from wearing pottus because of the customs pagan
implications. The Third Eye, in addition to being possessed by Shiva,
was also the eye for perceiving truth. Depictions of the Buddha
traditionally include the Third Eye, though pottus are not worn by
Buddhists.
The association between the pineal and the third eye is not as ancient
as I assumed for many years. It is, I have since read, the product of
the fertile imagination of Helena Blavatsky, author of The Secret
Doctrine and guiding light of the Theosophical Society. As far as I
can tell the ancient Indians did not know anything about the internal
structure of the brain, and neither did the Chinese. Though the pineal
is placed at the centre of the meridian system in modern Chinese
medicine, that is because the main meridian runs through the midline.
The ancient Chinese and Chinese in the middle ages knew as little
about the structure and function of the brain as their contemporaries
in India and Europe.
The traditional Chinese and Indian models of medicine are linked to
their respective cosmologies in a way that western medicine is not
linked with Western cosmology. In fact western medicine has nothing
whatsoever to do with the Big Bang, planetary movements or movements
of the sun and moon. Western medicine ignores the wind and largely
ignored the climate till it discovered “Seasonal Affective Disorder”
with the marketable acronym of SAD. Not so traditional Indian and
Chinese models. Vata, which refers to the element of wind and to the
wind god in Hinduism, also finds expression as the vata dosha, interpreted
by Chopra as a “physiological type”. In Chinese medicine Yin and Yang
refer to the feminine and masculine respectively, but also to the moon
(yin) and sun (yang). There are 365 acupuncture points because of the
number of days in the year, while the Five Elements (wood, fire,
earth, metal, water) are integral to the Chinese model in the same way
that the Four Elements (fire, earth, air and water) were integral to
Medieval European and Islamic models of health. The four elements of
the ancient Greeks gave rise to the humoral theory which held that
health is regulated by the flow of ‘humours’ – black bile, yellow
bile, blood and phlegm.
The first people to note the existence and propose the importance of
the pineal were the ancient Greeks. Herophilus (c. 335-280 BC),
according to the writings of Galen (130-220 AD), thought that the
pineal was a valve regulating the flow of “spiritus” between the
ventricles of the brain. Galen himself thought the pineal was a
gland. The French philosopher, scientist and mathematician Descartes
(1596-1650) probably drew on the ideas of Herophilus when he proposed
that the pineal was the seat of the soul – the point at which the
soul communicated with the brain.
Descartes also proposed that the pineal was connected to the eyes, as
this famous drawing shows:
[Image: Descartes’ drawing of the eyes connected to the pineal]
During the centuries after Descartes the pineal was gradually
relegated to obscurity. From being the proposed seat of the soul, it
came to be regarded as a primitive vestige of the reptilian brain,
with no function in humans. This view was reinforced by the fact that
the pineal calcifies with age – these calcifications were given the
evocative name of “brain sand”.
The 1957 edition of Arthur Ham’s Histology has a photograph of a
microscopic section of the pineal parenchyma and the explanation that:
“The pineal body of mammals is the vestige of the median eye which was
probably a functioning organ in certain amphibian and reptiles that
are now extinct. Like other vestigial organs in man, it tends to reach
its greatest development relatively early in life and thereafter to
degenerate. One evidence of the latter process is the formation of
calcified bodies of a laminated appearance in the organ. These
constitute what is called brain sand.” (p.746)
When I say that everything can be explained by science, this is a
statement of faith in the future. There are many things that cannot
currently be explained by science and maybe there are things that
can’t be. The famous palaeontologist Stephen Jay Gould, whose books
shaped my understanding of evolution more than any other influence,
argued that science and religion are different domains – what he
called “non-overlapping magisteria”. They ask different
questions, and there is no necessary conflict between religion and
science. Professor Richard Dawkins has a very different view. He
believes that “there is a profound contradiction between science and
religious belief”.
Dawkins argues, “Science is a discipline of investigation and
constructive doubt questing with logic, evidence and reason to draw
conclusions. Faith, by sharp contrast, demands a positive suspension
of critical faculties. Science proceeds by setting up hypotheses,
ideas or models, and then attempts to disprove them. So a scientist is
constantly asking questions, being sceptical. Religion is about
turning untested belief into unshakable truth, through the power of
institutions and the passage of time.”
In the same documentary in which he makes these statements about
science and religion, Dawkins tells an anecdote about how science
ideally progresses:
“I do remember one formative influence in my undergraduate life. There
was an elderly professor in my department who had been passionately
keen on a particular theory for a number of years. And one day an
American visiting researcher came and he completely and utterly
disproved our old man’s hypothesis. The old man strode to the front,
shook his hand and said, “My dear fellow, I wish to thank you. I have
been wrong these fifteen years”. And we clapped our hands raw.”
This anecdote actually suggests that scientists, like any other
person, do not accept their mistakes readily. Otherwise there would
have been no reason to clap their hands raw. In fact the history of
science suggests a battle of egos, reflected in the hierarchies of the
scientific and medical establishments. And then there is the question
of money: he who pays the piper calls the tune. This is the case with
medical research as well as scientific research more widely.
Christopher Hitchens was the author of God Is Not Great[9] and was
named among the "Top 100 Public Intellectuals" by Foreign Policy and
Prospect magazine. In addition Hitchens served on the advisory board
of the Secular Coalition for America. In 2010 Hitchens published his
memoir Hitch-22 (a nickname provided by close personal friend Salman
Rushdie, whom Hitchens always supported during and following The
Satanic Verses controversy).[10] Shortly after its publication,
Hitchens was diagnosed with esophageal cancer, which led to his death
in December 2011.[11] Before his death, Hitchens published a
collection of essays and articles in his book Arguably;[12] a short
edition Mortality[13] was published posthumously in 2012. These
publications and numerous public appearances provided Hitchens with a
platform to remain an astute atheist during his illness, even speaking
specifically on the culture of deathbed conversions and condemning
attempts to convert the terminally ill, which he opposed as "bad
taste".[14][15]
Daniel Dennett, author of Darwin's Dangerous Idea[16] and Breaking the
Spell[17] and many others, has also been a vocal supporter of The
Clergy Project,[18] an organization which provides support for clergy
in the US who no longer believe in God, and cannot fully participate
in their communities any longer.[19]
My cosmology has no need of the supernatural, though much is
unexplained, and some may be unexplainable. Even less is provable by
empirical science. My cosmology has room for God, not as a
supernatural deity, but as an abstract term for Good or goodness. I am
convinced that we made gods in our image, rather than the Biblical
idea that “He made us in his image”. Goodness is obviously not male
or female; the creation of the universe, however, was good (not just
from our perspective as humans, but certainly from that one). By this
definition, one cannot speak of God doing something, only of whether
something done was God. Therein lies the problem with my simple
suggestion that God is goodness. The gods of many religions – in
fact, of all religions – do things. They have agency. They are
supernatural, and I don’t think anything is above Nature.
Over the past few weeks I have been catching up with debates about
religion, science and belief, after reading two books I have had for
some years – The Astonishing Hypothesis by Francis Crick and Kinds of
Minds by Daniel Dennett. Links to the video debates on the subject can
be found later in this essay, along with commentary.
Another YouTube clip of Francis Crick was recorded shortly before he
died and is titled “History of Neuroscience: Francis Crick” (it is the
history of Crick’s work and not a general history of neuroscience). It
is published on YouTube by the Society for Neuroscience, which boasts
only one other interview in its “History of Neuroscience” series – an
interview from 2001 with the psychiatrist Eric Kandel, who shared the
2000 Nobel Prize for work on signal transduction in the nervous
system, and whose claims about schizophrenia and dopamine I regard as
questionable.
https://www.youtube.com/watch?v=mXGZ3euhq4g (Francis Crick)
https://www.youtube.com/watch?v=NH9Cc-YyYt8 (Eric Kandel, interview
from 2001)
The Koch that Crick speaks of is Christof Koch, who collaborated with
him in his studies of the monkey visual system. Koch can be seen in
this YouTube discussion:
Adenosine receptors – what are they?

Distribution of adenosine receptors in the postmortem human brain: an
extended autoradiographic study.
P Svenningsson, H Hall, G Sedvall, Bertil B Fredholm
Department of Physiology and Pharmacology, Karolinska Institutet,
Stockholm, Sweden.
Synapse, 01/1998; 27(4):322-35. DOI:
10.1002/(SICI)1098-2396(199712)27:4<322::AID-SYN6>3.0.CO;2-E
ABSTRACT Whole-hemisphere sections from six subjects were used in a
quantitative autoradiographic study to characterize and to investigate
the distribution of adenosine receptors, using [3H]DPCPX, [3H]CGS
21680, and [3H]SCH 58261 as radioligands. [3H]DPCPX-binding showed the
pharmacology expected for adenosine A1 receptors and is therefore
taken to mirror adenosine A1 receptors. Adenosine A1 receptors were
widely distributed, with the highest densities in the stratum
radiatum/pyramidale of the hippocampal region CA1. Adenosine A1
receptors were nonhomogeneously distributed in nucleus caudatus,
globus pallidus, and cortical areas: In the cingulate and frontal
cortex the deep layers showed the highest labeling, while in the
occipital, parietal, temporal, and insular cortex it was highest in
the superficial layers. In addition, we found very high levels of
adenosine A1 receptors in structures known to be important for
cholinergic transmission, especially the septal nuclei. The Bmax
values and KD values for [3H]DPCPX-binding in stratum
radiatum/pyramidale of CA1 and the superficial layer of insular cortex
were 598 and 430 fmol/mg gray matter and 9.9 and 14.2 nM,
respectively. [3H]CGS 21680-binding was multiphasic, but showed the
pharmacology expected for adenosine A2A receptors and was taken to
represent them. Adenosine A2A receptors were abundant in putamen,
nucleus caudatus, nucleus accumbens, and globus pallidus pars
lateralis. Specific [3H]CGS 21680-binding was also found in certain
thalamic nuclei and throughout the cerebral cortex. The adenosine A2A
receptor antagonist radioligand [3H]SCH 58261 was also found to label
these extrastriatal structures. Thus, adenosine A2A receptors seem to
be more widely distributed in the human brain than previously
recognized.
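The Bmax and KD figures in this abstract come from fitting the
standard one-site saturation binding model, in which specific binding
B at free ligand concentration L is B = Bmax × L / (KD + L). As a
rough illustration – my own sketch in Python, not the authors’
analysis code – the quoted CA1 and insular-cortex values can be
plugged straight into that equation:

# One-site saturation binding: B = Bmax * L / (KD + L).
# The (Bmax in fmol/mg gray matter, KD in nM) pairs below are the
# [3H]DPCPX values quoted in the abstract; the script itself is only
# an illustration, not the paper's analysis code.

def specific_binding(bmax_fmol_mg: float, kd_nm: float, ligand_nm: float) -> float:
    """Specifically bound ligand (fmol/mg) at free concentration ligand_nm (nM)."""
    return bmax_fmol_mg * ligand_nm / (kd_nm + ligand_nm)

sites = {
    "CA1 stratum radiatum/pyramidale": (598.0, 9.9),
    "insular cortex, superficial layer": (430.0, 14.2),
}

for name, (bmax, kd) in sites.items():
    for ligand in (1.0, 10.0, 100.0):  # free [3H]DPCPX, nM
        bound = specific_binding(bmax, kd, ligand)
        print(f"{name}: L = {ligand:5.1f} nM -> {bound:6.1f} fmol/mg "
              f"({100 * bound / bmax:4.1f}% of Bmax)")

At L = KD the sites are half-occupied, which is what KD means
operationally: the 9.9 nM figure says that roughly 10 nM of free
radioligand binds half the A1 receptors in CA1.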
At this point in the writing of this book I was locked up at the
Princess Alexandra Hospital and my office was ransacked and
vandalised. The computer I was working on stopped working and I have
only just been able to retrieve the document (17.6.2018).