The Australian-English word ‘yakka’ (as in hard yakka) means work. The
term comes from the Yuggera (or Jagera) people – the Aboriginal tribe
that lived where I do now, prior to the arrival of the White Man. The
Yuggera tribe was one of several that spoke the Yuggera languages (the
Yuggera Language Group) which is one of many language groups that
belong to the Pama-Nyungan language family. The Yuggera language is
one of more than twenty languages spoken by the Murri people of
Queensland (murri means kangaroo in Yuggera). The Pama-Nyungan language family is one of 27 language families, and the one that covers the large majority of the Australian landmass. It is shown in
yellow in the map below, based on modern linguistic scholarship and
published on Wikipedia:
I was brought to Yuggera land when I was fifteen, against my will, but
without any voiced objection. I liked where I was, in the hill town of
Kandy in Sri Lanka, and did not see the attraction of leaving my home
and moving to a distant land, about which I knew very little, other
than that it had lots of deserts and strange marsupial animals. I knew
that Australia had been settled by the British, who used it as a penal colony, and that this land was forcibly acquired – stolen –
from the Aboriginal people. I didn’t know much more about Aboriginal
people and what they suffered at the hands of the colonists, but I did
have something of the Third World perspective in my view of
colonialism – I saw colonialism as an evil imposed on dark-skinned
people by people who considered themselves “white”. This was certainly
the case in Australia, where the discrimination based on race –
specifying the Aboriginal race – was enshrined in the original 1901
constitution that federated what had previously been separate British
colonies into the nation of Australia. In determining the population
of Australia by census, according to the constitution, people of the
“Aboriginal race” were not to be counted (denying their existence as
well as their vote).
reverberate
verb
(of a loud noise) be repeated several times as an echo.
"her deep booming laugh reverberated around the room"
have continuing and serious effects.
"the statements by the professor reverberated through the Capitol"
Fritjof Capra
James Lovelock (gaia theory)
Lynn Margulis (endosymbiosis)
Rupert Sheldrake (morphic fields)
Jan Smuts (holism – coining of term)
Satish Kumar (holism – New Age)
Vandana Shiva (ecology – dubious claim about zinc)
Deepak Chopra (quantum healing, ayurveda)
Stan Grof (parapsychology)
Dean Radin (parapsychology)
Robert Lanza (biocentrism)
Rudolph Tanzi (dementia – consciousness)
Spirituality without Religion - journey of the soul
1937 Papez circuit, from injecting rabies into the brains of cats (injected into the hippocampus, which was considered central to emotion – now more associated with memory).
Role of the cortex
The role of neurotransmitters – dopamine, glutamate
Precuneus
Insula
Caudate nucleus
What about the pineal? The gland/organ has sympathetic innervation via
the superior cervical ganglion (SCG) and parasympathetic innervation
via the pterygopalatine and otic ganglia. What is the significance of
this? There is also innervation by the SCG of the choroid plexus
(which secretes CSF).
Thalamus
RAS
Midbrain amine-generating networks – dopamine, acetylcholine, serotonin
By the seventies, when I entered my teens, the New Age had already
been commercialised.
Miloš Forman, the Czech film director who made One Flew Over the Cuckoo’s Nest (1975), made a film adaptation of Hair in 1979. Czech film director – Hair – bass player and singer?
When I wrote Alpha State: A State of Mind for the New Age, I had not
even made the connection between astrology, the Age of Aquarius and
the New Age. The New Age, for me, began in the 1990s, because that’s
when I was introduced to the movement that Richard Dawkins sees as the
“enemies of reason”, including the physician Deepak Chopra.
Satish Kumar does not claim to treat or heal, and was featured in the first episode – Slaves to Superstition – rather than the second – The Irrational Health Service – which is devoted to exposing medical charlatans as “enemies of reason”. The Indian-American physician Deepak Chopra, who has made a name for himself by marketing a combination of Ayurvedic treatments, ritual meditation and various relaxation strategies and dietary regimes as “quantum healing”, was targeted with rather more justification than Satish Kumar, who makes it quite clear, in the unedited version of their interview, that when he says spiritual, he does not mean supernatural. Kumar does not
believe in the supernatural, and believes that everything is natural,
but that all life is interconnected and interdependent. His way of
speaking is poetic and Indian, and the concepts he used are the
product of his culture, which is very different to that of Richard
Dawkins, who is very English indeed (though born in Kenya, which was then called, by the British, British East Africa).
Dawkins has been described as “old school”, and this has an element of
truth in it – there is no older school in the English-speaking world
than Oxford University. From my perspective both Dawkins and Kumar are old men, though Kumar is a few years older than Dawkins – which, in the English language as used in Australia, is commonly called being “senior to” him. Old people gain wisdom by virtue of the longer time they have
spent on the planet – the wisdom of age and experience that are
traditionally respected in cultures around the world. Wise old men
have been regarded as sages in the East and the West. The difference
is that Satish Kumar is a sage in the Indian tradition, where sages
and teachers are all called “gurus”, while Dawkins is a sage in the
British tradition, where sages are called “professor emeritus”. By
that token, in the British hierarchy, Dawkins is not yet a sage. In
the Indian tradition, he would be guru Dawkins or Guru Richardawkins.
That would mean “venerable teacher Dawkins” in a tradition that has great respect for teachers and teaching, and holds gurus in high regard.
Wiktionary provides this explanation of ‘guru’:
“From Hindi गुरु (guru) / Urdu گرو (guru), from Sanskrit गुरु (gurú, “venerable, respectable”). A traditional etymology based on the Advaya Taraka Upanishad (line 16) describes the syllables gu as 'darkness' and ru as 'destroyer', thus meaning "one who destroys/dispels darkness".
The Upanishad talks about awakening the kundalini and thus realizing
Brahman, the Absolute Reality. Its verses on the importance of the
guru (teacher) are often quoted.
The Upanishads (Sanskrit: उपनिषद्) are a collection of texts in the
Vedic Sanskrit language which contain the earliest emergence of some
of the central religious concepts of Hinduism, some of which are
shared with Buddhism and Jainism. The Upanishads are considered by
Hindus to contain revealed truths (sruti) concerning the nature of
ultimate reality (brahman) and describing the character and form of
human salvation (moksha).”
Sanskrit
I have also seen old men deliberately playing the part of sages, by
adopting the stereotypical style of a wise old man, and telling
anecdotes of their youthful exploits and the important people they’ve
met. They play on the cult of the guru – the natural tendency to
worship and obey old men with beards, under the mistaken assumption
that they are wise because of their white hair and long white beards.
This is seen as the superstition of Merlin and Gandalf in the West and
the superstition of the omniscient guru in India. Professors sometimes wear a beard for this precise reason, though the habit has rather gone out of fashion as long white beards have become associated with the Maharishi Mahesh Yogi and “mystical gurus” with Mercedes-Benz cars.
Modern Western professors are mostly clean-shaven; the fashionable
ones wear a bit of stubble to show how non-conformist they are, while
others sport long hair, a moustache, or shave their heads, according
to their idea of fashion and style. I have watched the evolution of
the dress and grooming of celebrity professors over the years with
some amusement.
This book is the result of my digesting The God Delusion over a decade,
in the back of my mind while battling the Australian psychiatric
system’s very real, and very cruel treatment of people who are
‘diagnosed’ with delusions. This system was developed in Britain with inspiration from Germany, exported from Britain to Australia as a means of variably controlling natives, settlers, migrants and convicts, and modified by American psychiatry in recent years. A person with a
delusion is ‘mentally ill’ according to this diagnostic system, and
rather than being caused by others (the act of deluding someone and
thus causing delusions) it is taught, in universities around the
world, that delusions are caused by chemical imbalances in the brain,
and signs of serious, incurable mental illness. Delusions may be variably diagnosed as evidence of schizophrenia, schizo-affective
disorder or mania, all of which can be used as diagnostic
justifications for forced treatment in “closed” (meaning locked)
wards. Whether the delusion is diagnosed as evidence of schizophrenia
or mania depends on how ‘odd’ or ‘bizarre’ the delusion is, and
assessments are also made of how strongly held they are (degree of
conviction or ‘intensity’). If the delusion is not deemed to be “bizarre”, people can still be diagnosed with “delusional disorder” and have drug treatment forced on them. In the Western medical system,
delusions are indicative of psychosis – being out of touch with
reality. The standard treatment for delusions is not to reason people out of them by debate, but to force drugs into them by mouth or injection, and to lock them up until they agree that they have an incurable mental illness. This is termed “gaining insight into
the disease”. This is the problem with saying that everyone who
believes in God is deluded, and one of the problems that I will try
and address in this book. Frankly, I enjoyed The God Delusion, but the behaviour of Dawkins since then has made me think he may be a bit of an enemy of reason too. I don’t think it is reasonable for a professor of the public understanding of science to declare people deluded without a knowledge of the current scientific and medical treatment of delusions in his own country.
What does it matter how a professor is dressed, you may ask? It
matters a great deal – since it is how the professor creates an image
that others judge him or her by. There is a difference between how
female and male professors dress and present themselves to the public
and to their peers and students, and to what they choose to wear when
they are in front of a camera. I am more interested in the clothes men
wear and how they shave their head (if in fact they do) than the
dress-style of women, when it comes to a sociological and
psychological analysis of professors, and how it came to be that an
esteemed Oxford professor chose to frame an Indian philosopher who is not a professor, an academic, or a scientist as an “enemy of reason” rather than have a healthy, interesting conversation with him. What
makes Dawkins tick, in other words?
This book is the result of my pondering the words and actions of
Richard Dawkins more than anyone else. I began by liking him, though I
had never met him, and had no idea of what he looked like, how old he
was or how he dressed. I had no idea whether he dressed at all or
wrote stark naked – it was his words and how he expressed his ideas
about the evils of organized religion in The God Delusion that made me
like him. I had previously read The Selfish Gene, but other than the
concept of memes, which I have found useful, I was not much influenced
by his evolutionary ideas, having already read and very much enjoyed
the popular science books of the great Stephen Jay Gould, whose
opinions about evolution I had every reason to trust, and did. Gould
was a Harvard professor who was also famous as an “evolutionary
biologist” and an opponent of “creationism” being taught in American
schools, but argued that religion and science belonged to different
domains and could co-exist peacefully. Gould also wrote passionately
and wittily about the failings of science, where evolutionary theory
had gone wrong, and where academia had made mistakes in his own field
of palaeontology. He also wrote about the difficult questions in
evolution, such as how the theory explained the marvel of the human eye or a bird’s wing (for example, how is a part-wing an evolutionary advantage?). One thing I learned from Gould that I hadn’t considered before was the “flaws in design” which disprove the idea of a perfectly designed creation – the theme of his book The Panda’s Thumb.
I was also mesmerised by Gould’s Wonderful Life, about the fossil discoveries in the Burgess Shale in Canada of a fascinating array of animals, most of which disappeared in mass extinctions that followed the “Cambrian explosion” I had read about when I was a child. I grew up reading about dinosaurs and evolution, and didn’t
find Dawkins’ idea that the primary determinant of selection is the
gene particularly startling. This was the central theory put forward in The Selfish Gene. I knew that some people would misunderstand the
title as meaning that people are naturally selfish – that they have a
“gene” for selfishness. This would be a natural mistake, but not how I
interpreted Dawkins’ theory. I understood it quite easily, since it is
the logical, linear result of reductionism within a materialist
paradigm. This is the paradigm in which I was educated.
I have not read Dawkins’ book The Blind Watchmaker (1986) but I have
watched the documentary he made of the same name. In this documentary,
Dawkins argues for gene-centred evolutionary theory and against
“intelligent design”, as well as the bizarre claims made by biblical
fundamentalists in the USA at the time that fossilized dinosaur tracks
included the footprints of humans alongside those of the dinosaurs.
This confuses two issues – crazy ideas based on fundamentalist Christian creationism that the earth is only 6000 years old, and reasonable criticism of a gene-centred view of evolution. I didn’t doubt
evolution, and I was well aware that evolutionary theory proposes
“randomness” only for the variation of a species and not its
selection, which is far from random. The selection is based on
reproductive success, which requires survival to reproductive age, at
least. Darwin and Wallace’s similar theories were that speciation
occurred through the survival of some individuals rather than others.
These surviving individuals passed on various inherited traits to
their offspring, and the whole process of evolution was determined by
the fact that some survived and others didn’t. The strong survived and
the weak perished; the best adapted to an environment survived while
those less well adapted perished, or rather, did not have as good
reproductive success. Dawkins argues that the selection occurs not on
the level of the individual but on the gene. Others think that natural
selection occurs also on the individual and group levels. In my
opinion there is natural selection on all these levels as well as
between species (which doesn’t cause speciation, of course) and there
is also natural selection of what Dawkins calls “memes”. Biology and
zoology merge into anthropology and sociology, without clear borders.
How guilty is Satish Kumar of being an enemy of reason? What about the
two doctors, Deepak Chopra and Manjir Samanta-Laughton, who were also
featured in the Enemies of Reason documentaries, along with assorted
con artists and obviously deluded New Age charlatans? What is the New Age movement, and to what extent has New Age philosophy become a religion? Was it a religion all along, or has it been made into one?
The first book I actually finished, back in 1996, was titled Alpha
State – A State for the New Age. The first chapter was titled “Western
Medicine and Holism”. I’ll begin with the first two paragraphs of this
book, which I never sought a publisher for:
“In recent years there has been increasing criticism of and
disenchantment with western medicine and psychiatry from a range of
people. These include patients who have in their perception received
superior care from ‘alternative’ and natural health practitioners,
natural practitioners themselves who criticise western medicine for
its excessive use of potentially toxic and extremely expensive drugs
and its lack of a holistic approach to individuals they treat, which
leads to a lack of sustained cures for many of the conditions treated
by western doctors.
“A rare group of critics of western medicine come from within the
community of western trained doctors themselves. Some of these defect
to ‘alternative medicine’, others give up medicine altogether. A few,
like Deepak Chopra have made attempts to integrate eastern and western
medicine but have, as a result of it been denigrated and ostracised by
the establishment doctors and specialists who view him as a crackpot
at worst or a maverick at best.”
A few years later I wrote in red pen the note “I think, really, he is
a bit of a crackpot! Especially his explanation of Vedic science and
‘quantum healing’”.
Back in 1995 and 1996, when I wrote this book, I was excited about the
New Age – the Age of Aquarius. The hippie anthem from the musical Hair, which proclaimed “the dawning of the Age of Aquarius”, was one of my favourite songs when I was a teenager. The Age of Aquarius topped the Billboard singles charts for six weeks in 1969, and even made it to Sri Lankan airwaves, where it was played on the country’s only English music show on the radio at the time. It’s a brilliant song,
carefully crafted to draw you in and hypnotise you with melody,
harmony and groove. A strong Black female voice lures you in before the chorus that reverberates in your mind, reinforced as it is repeated: “The Age of Aquarius, the Age of Aquarius... Aquarius... Aquarius.” The last Aquarius resolves decisively in the melody.
I only noticed the words of the chorus when I fell in love with the
song. I fell in love with the Black woman who sang the song too,
though I had no idea who she was or what she looked like. I fell in
love with the beauty of her voice, like many other young men of the
time. The song remained in the back of my mind till the Age of
Aquarius was reinvented as a marketable commodity and profitable
religion in the 1990s.
I didn’t listen to the lyrics of The Age of Aquarius closely enough to
realise that they were singing about an age based on astrological
conjunctions, and that the Age of Aquarius followed the Age of Pisces.
I didn’t believe in astrology, though I knew I was “on the cusp
between Virgo and Libra”, having been born on the 22nd of September.
Associating the idea of Virgo with a young woman – a virginal one at
that – I preferred, in my youth, to regard myself as a well-balanced
Libran than a feminine Virgo. In the 1990s, when I was in my thirties,
I was told by experts on such matters that being born in 1960 meant I
was ‘definitely’ a Virgo and that I was also a Rat according to
Chinese astrology. By then, I couldn’t have cared less – I had long
outgrown the superstition of astrology; even as a child I never really
believed it. I did know that my mother was a Piscean and my sister a
Gemini, though I never really attributed any other characteristics to
this label.
The New Age I was writing about in the 1990s had less to do with belief in astrology than with a social and intellectual movement that looked forward with optimism to the millennium. This movement was, as
Richard Dawkins rightly points out, extremely superstitious – it was a time when people started seriously discussing the healing powers of crystals and the predictive powers of tarot cards, along with “parallel dimensions” and “alternative realities”. Young people in Melbourne in the 1990s were enticed into shops that sold charms and assorted magical trinkets. Rupert Murdoch’s HarperCollins corporation led the way – a series of books published as their “Aquarian”
series included the titles: Understanding Astrology; Understanding
Tarot; Understanding Runes; Understanding Reincarnation; Understanding
Crystals; Understanding Chakras; Understanding Numerology;
Understanding Dreams; and Understanding Astral Projection. If you
hadn’t been driven crazy enough by this superstitious nonsense you
could obtain more delusions from Dowsing; How to Develop your ESP;
Incense and Candle Burning; Invisibility and, believe it or not,
Levitation. If you believed the rest you probably would not baulk at
levitation.
This Aquarian series is the real culprit when it comes to enemies of reason, and it is Rupert Murdoch who should be interviewed by Professor Dawkins if he wants to get to the root of the evils of the New Age religion. Let me quote from Understanding Auras, first
published in 1987, and authored by a man who is otherwise
unidentified, named Joseph Ostrom. The book cost me $12.95, and I
bought it only because I was collecting evidence of a conspiracy to
drive people mad and then diagnose them as having schizophrenia. This
was an original hypothesis – I was looking for evidence, based on the observation that the same beliefs being promoted by HarperCollins in their Aquarian series were being presented as evidence of schizophrenia and psychosis in other publications by the same corporation, as well
as by mainstream medical and psychiatric opinion since long before I
studied medicine. What’s more, they really were delusional beliefs.
Looked at in religious terms, one might put forward the hypothesis
that the New Age religion is one of many religions created by Rupert
Murdoch and his media empire. He took a social movement that
crystallised around opposition to conscription during the Vietnam War,
the theme of Hair, which generated several hit songs, and made a
religion out of the Age of Aquarius – the New Age religion. This is
quite different to the New Age movement, which is spiritual rather than religious, and characterised by its emphasis on ecology,
environmental values, international friendship and peace, and similar
values, that are sometimes regarded as “green” or “left”. I think the
term “left” is best left alone, with its connotations of the pompous
debates between the Left and Right wings of the British parliament,
but the concept of the left and right in politics has changed a lot
since then. Left means reasonable and right means extremist, greedy,
materialist and capitalist. But left also means corrupt trade unions
and clueless street marchers, marching under various “left wing”
banners and slogans. Left meant Marxist, and I was never a fan of Karl Marx. If Marxist meant a fan of Groucho or Harpo Marx, I would have seen more sense in it. They were funny. I have to say I find economics
boring, which is why I know nothing about it; maybe that’s why I’m
poor.
The New Age was fresh and new
What was, in the 1970s, a new age, is not new anymore. We are well
into the Age of Aquarius; about forty years into it. This age is going
to last for hundreds or thousands of years – it’s our long term
future.
The song spent six weeks at number one on the U.S. Billboard Hot 100 pop singles chart in the spring of 1969, and was eventually certified platinum in the U.S. by the RIAA.
Drugs, racial integration, nudity, anti-conscription, anti-Vietnam War
– music and optimism.
Counter-culture; counter-terrorism and Chomsky
Bertrand Russell
“I call him the Diamond-encrusted guru” says the British physicist
Brian Cox of Deepak Chopra, referring to Chopra’s spectacles. It got
a laugh from the studio audience. Cox knows how to work an audience,
as indeed Chopra does. Born in 1968, Cox is a young professor at the
University of Manchester and a science presenter on the BBC. He knows
much more than Chopra about physics, and this gives him a good position from which to make fun of the New Age “guru” and his wacky ideas about quanta, non-locality and other quantum physics concepts.
This culture had outgrown fanaticism and the hysteria of teenage girls
screaming and sobbing at the mere sight of four white English boys who
were often on TV, where they played to the love-struck emotions of
girls who imagined they were “in love” with them. The Beatles gave the
word “love” a bad name. It became clichéd and silly, and when I grew
up in the Orient and Antipodes, my family and friends were not fans of
the Beatles, Elvis Presley or any of the “pop stars” who sang silly
love songs. We were variably interested in classical, pop, rock, folk
and jazz music, but not fanatical about the men and women who
performed it. This is not to say that we didn’t “fall in love” with
pop stars – but at least we didn’t
Deepak Chopra
Eric Topol, MD
Jack Andraka
Kyra Phillips
Rudolph Tanzi
Bruce Vaughn
Christof Koch, PhD
Deliberate meditation position of hands
“Peace with the Cosmos”
MORE SPEAKERS
y in medicine; not like in geological time.
I wonder and ponder much more than I speak or write. I didn’t always.
I used to speak and write much more than I do nowadays, back when I
was working as a family physician. Most of my writing, for many years, consisted of “notes” about my patients, and most of my
speaking was giving medical advice. I also spoke to my family and
friends, and rarely to my medical colleagues, but most of my talking
was to patients, and most of my writing was also about patients.
Then, twenty years ago, I became a patient myself. This had the effect
of stopping me from talking, from which I have never fully recovered.
I can talk, quite reasonably, about many things, but I don’t
customarily do so. The physical effects of the haloperidol syrup that
first stopped me talking wore off long ago, and I am quite able to
talk, and talk quite normally again. The total experience of having
this drug forced on me has scarred me much more than the chemical
effects alone.
This is a book about me, my wonderings, ponderings, musings and
conclusions. Why should you care about any of these? Well, you may
not. I am not a philosopher or a historian, but I have come to philosophical and historical conclusions based on my reasoning – not as a philosopher or historian, but as a scientifically trained doctor.
Medical education is not big on philosophy and history, nor on
geography, geology and the other non-medical sciences. Scientific
training is strictly divorced from “the arts”, which means that we
learned nothing about emotional responses to music, conversations or
literature in our training. These are some of the things I have been pondering over the years, and occasionally writing about.
Most of my writing over the past twenty years has been about
psychiatry, though I am not a psychiatrist. The books I have written
have not been widely read, though they have been diagnosed as evidence
of mental disorder and paranoia by one of Melbourne’s most senior
professors of psychiatry, Professor Bruce Singh, who offered the
opinion, in a letter to the Victorian Medical Board that “whilst
individual paragraphs and pages contain scientifically valid material
and thoughtful comments and criticisms, interspersed amongst these are
clearly paranoid outbursts and much is made of guilt by association”.
Professor Singh presented his perspective of our one and only meeting
in a letter addressed to Dr Joanne Katsoris, registration manager at the Medical Practitioners Board of Victoria, with this introductory paragraph:
“I saw Dr Senewiratne on 1 May 2001 in my office at the Royal
Melbourne Hospital. Dr Senewiratne presented with a large briefcase
full of multiple manuscripts that he had written and a book that he
had published himself, he showed me many of these over the course of
the interview. I had also been sent a copy of his manuscript on
schizophrenia prior to my seeing him. My comment on all of the
documents shown to me is that whilst individual paragraphs and pages
contain scientifically valid material and thoughtful comments and
criticisms, interspersed amongst these are clearly paranoid outbursts
and much is made of guilt by association.”
Professor Singh did not mention by name the “manuscript on schizophrenia” I had laboured over for many years; it was a 400-page book I had recently finished and printed a dozen copies of, which I had, after some deliberation, titled The Politics of Schizophrenia – dodging the Inquisitors and Curing Delusions. I knew that the title would be seen
as provocative when I sent a copy to the Medical Board, when they
accused me of being an “impaired practitioner”. I was defiant in my comparison between psychiatrists and Christian inquisitors, and in my suggestion that it was a good idea to dodge them. I did not really expect Professor Singh to agree with my argument that:
“The word psychiatry is derived from the Greek psyche (mind/soul) and
iatros (treatment). It refers, literally, to treatment of the mind.
This treatment can be cruel or kind. If it is cruel, it can be
expected to compound existing problems (and can, moreover, be defined
as torture). If it is kind, it can lead to relief of suffering and
distress. Despite the popular idiom that ‘one must sometimes be cruel
to be kind’, cruelty and kindness are opposites; it is not possible to
treat someone cruelly and kindly at the same time. Neither is it
necessary to be cruel to achieve therapeutic success, unless
therapeutic success is defined as ‘compliance with the orders of the
therapists’. If successful therapy is so defined, and, in the area of
psychiatry it often is, then cruelty becomes a powerful tool for
forcing upon the ‘patient’ the opinions and views of the therapist.”
(p 170)
Psychiatry in theory is rather different to psychiatry in practice. I
had rather hoped that Professor Singh would give me a fair hearing,
after reading how important he recognised cultural context to be in
psychiatry. Transcultural psychiatry was a fashionable new trend that
had been featured in the local newspapers, along with a photograph of
Professor Singh, who is noticeably “transcultural” in that he looks
Indian and has a well-known and common Indian name. As you can see from this Flickr posting (which mentions only that he is Chair of the Board), Professor Bruce Singh wears a tie and a suit, and this is how
he was dressed when he spent an hour attempting to diagnose me with
one or other “disorder”. He did have the option of declaring me to be
mentally well, but that meant allowing me to have divergent views in
medicine without pathologising me.
Professor Singh wrote, regarding my medical approach, “He states that
he takes the holistic approach to medicine and says that he practices
scientific medicine and does not prescribe herbs or vitamins. He
listens to his patients and uses investigations when appropriate”. In his final “impression” the important man wrote: “I suspect that it
is probably true that Dr Senewiratne has the capacity to insulate his
more conventional medical treatment (with the notable exception of
psychiatric disorder) from his broader range of ideas and views
regarding the world for much of the time.” His final curse:
“This is not the situation of a man willing to accept that he has an
illness and to have treatment for it – this is an intelligent man who
is creative and intellectually highly active who is convinced that he
is not ill nor in need of treatment. This is a potentially dangerous
combination for any professional and particularly a doctor interacting
with the community.”
This paragraph was bound to stick like mud, since it was what was read last – often the only thing that is actually read of a five-page report. It was a written curse that doomed me, and that I have been
fighting ever since. I have been pondering and wondering about this
curse, and how to escape it. I have been thinking about labels of
mental illness as curses and men like Bruce Singh as the Grand Wizards
of the cult of psychiatry – or the high priests of the religion of
psychiatry, if you prefer. They are also the masters of the mind in
the Free World, who rule their global enterprise through manipulation
and hypnosis, as well as what they call “biological treatments” –
drugs, shock treatment and brain mutilation. These are better known in
Australia by the euphemisms of ‘medication’, ‘ECT’ and
‘psychosurgery’. This is the paradigm I had when I was diagnosed as
having a “paranoid personality” by Professor Singh, though I minced my
words when I spoke to him, and hid my disdain of the empire he and the
other kings of psychiatry had built in Australia. Not that Australia
was my home; in fact, Bruce Singh was born in Australia long before my
parents, my sister and I arrived here in 1976. But I had sympathy with
the Aboriginal view of White Invasion, and still do. I had discovered
many things about this invasion that were brought to light by white
and black academics in the 1970s, after abandonment of the White
Australia policy.
In Australia there is an academic discourse that refers to “black” and
“white” historians, who have rather different perspectives. There are
also white historians who are accused of presenting a “black arm-band
version” of history. Not being a historian I have never ventured into
this debate, but couldn’t help but be aware of it and influenced by
it. By 2001, when I was assessed by Professor Singh, I had already
researched and written Eugenics and Genocide in the Modern World,
which touched on, but didn’t explore the connection between the White
Australia Policy, the atrocious treatment of Aboriginal people and the
Anglo-American eugenics movement. This same racist eugenics movement
shaped the development of academic psychiatry in Australia, being
reinvented as “psychiatric genetics” after the term “eugenics” fell
from favour when the monstrous application of the science of “breeding
better humans” by the Nazis was exposed. Of course, psychiatric
genetics differs in many respects from eugenics: the mechanism of
inheritance, in the form of the DNA molecule, had not yet been
discovered when Francis Galton wrote Hereditary Genius, in which he
claimed to have evidence that the Australian Black was a “further
grade less intelligent” than the African Black, who was, on average,
two grades below the White Man.
In 1998, when I was researching and writing Eugenics and Genocide in
the Modern World, my partner Sara, who was studying education at the
University of Melbourne, found several books on eugenics in the
university library, including Hereditary Genius, and other books by
Galton, as well as others by his disciples, all of whom agreed there
was an urgent need to implement policies of segregation between Blacks
and Whites, as well as positive and negative eugenic programs to
“preserve the purity of the White Race”. This racist
science was inevitably embraced by a nation whose first Act of
Parliament was to restrict immigration, in what has since been
described as the “introduction of the White Australia Policy”.
Wikipedia presents the bare outline of the White Australia Policy,
from official sources:
“The term White Australia Policy comprises various historical policies
that intentionally favoured immigration to Australia from certain
European countries, and especially from Britain. This was an attempt
of Australians to help shape their own identity after federation. It
came to fruition in 1901 soon after the Federation of Australia, and
the policies were progressively dismantled between 1949 and 1973.[2]
Australia's official First World War historian Charles Bean defined
the early intentions of the policy as "a vehement effort to maintain a
high Western standard of economy, society and culture (necessitating
at that stage, however it might be camouflaged, the rigid exclusion of
Oriental peoples)."[3]
An Oriental Orientation
The term "Orient" derives from the Latin word oriens meaning "east"
(lit. "rising" < orior " rise"). The use of the word for "rising" to
refer to the east (where the sun rises) has analogs from many
languages: compare the terms "Levant" (< French levant "rising"),
"Vostok" (Russian: Восток < Russian voskhod восход "sunrise"),
"Anatolia" (< Greek anatole), "mizrahi" in Hebrew ("zriha" meaning
sunrise), "sharq" (Arabic: شرق < Arabic yashriq يشرق "rise", shurūq
Arabic: شروق "rising"), "shygys" (Kazakh: шығыс < Kazakh shygu шығу
"come out"), Turkish: doğu (< Turkish doğmak "to be born; to rise"),
Chinese: 東 (pinyin: dōng, a pictograph of the sun rising behind a
tree[1]) and "The Land of the Rising Sun" to refer to Japan.
Also, many ancient temples, including pagan temples and the Jewish
Temple in Jerusalem, were built with their main entrances facing the
East. This tradition was carried on in Christian churches. To situate
them in such a manner was to "orient" them in the proper direction.
When something was facing the correct direction, it was said to be in
the proper orientation.
East In Sinhalese:
Professor Singh is an important academic psychiatrist in Australia. He
is co-author of the main textbook used to train medical students in
Melbourne (and the state of Victoria), Foundations of Clinical
Psychiatry, the first edition of which was published in 1994. This was
the same year that the American Psychiatric Association published the
DSM-IV, updating the DSM-III-R (where the R stands for ‘revised’).
Foundations of Clinical Psychiatry, the basis for diagnostic labelling
in Melbourne and Victoria that everyone who graduated as a medical
doctor was obliged to learn (if not accept), presents students with a
choice of “The DSM-III-R Classification” in Appendix B or the “ICD-10
Classification (abbreviated form)” in Appendix A. These are rival
classifications of mental “disorders” – the DSM American, the ICD
published by the World Health Organization.
After an hour or two of grilling me, during which I became
increasingly annoyed, though never rude, I showed Professor Singh some
diagrams I had drawn after analysing the annual reports of the Mental
Health Research Institute (MHRI), and its links with various
organizations. I noted, for example, that the President of the MHRI
Board of Directors was Dr Ben Lochtenberg, who was also the boss of
Orica, previously the Australian branch of the British Imperial
Chemical Industries (ICI), which was a major exporter of cyanide,
explosives and detonators around the world. I also noted various drug
companies and what their involvement was with the medical profession,
mainly in the area of psychiatry, and how various institutions,
internationally, were connected with the Australian medical research
institutions. I did not explain the connections, I just drew them, to
clarify my thinking on the serious matter of medical corruption.
Professor Singh noted these as evidence of what he called a “paranoid
personality”, and was much more interested in these than the
“scientifically valid material and thoughtful comments and criticism”
that he mentions but does not specify, elaborate on or refute. He
wrote:
“Dr Senewiratne showed me a number of his publications and papers that
he had written. Of particular interest to me were a series of charts
in which he drew lines connecting various organisations that had links
with each other. He was very much of the line of thinking that used
non-Aristotelian logic in the sense that he seemed to be suggesting
that if A had a connection with B and if B had a connection with C,
then A must have a connection with C. He obviously spent a lot of time
looking for these connections and drawing them as the intricate
diagram demonstrated. He claimed however that this line of reasoning
led him to hypotheses for which he then attempted to look for further
information to prove or disprove his theory. I found little evidence
of an open mind.”
I think my mind is, on the contrary, too open. I have been quite
gullible at times, and fallen for scams, been successfully deceived by
news reports and other programs on TV, and conned by drug companies.
Gradually I have been becoming more critical and discriminating about
what I believe, although I have had lapses. I am guilty of many
things, but not a closed mind, and no one but Professor Singh has ever
accused me of having one. Others have accused me of having delusions,
and I have, but the fact that these delusions have disappeared is
proof that my mind is not closed. But what is a closed mind, anyway?
Minds do not open and close like boxes, black or otherwise.
A closed mind is a metaphor, but it is also a judgement. No one likes
to be accused of having a closed mind, and I was annoyed when I read
that this is what Professor Singh thought of me. Impertinent, maybe,
closed-minded, no. As for Aristotelian and non-Aristotelian logic, I
didn’t know the difference; but didn’t Aristotle logically surmise
that we think with our hearts rather than our brains, while his non-
Aristotelian colleagues reasoned, correctly, that we think with our
brains? Didn’t Aristotle also logically surmise that there were only
four, or at most five, elements, corresponding to the four humours
that Galen and his followers believed in for fifteen hundred years in
the West? So what if I use “non-Aristotelian logic”?
Though he is one of the senior editors of Foundations of Clinical
Psychiatry, only two of the chapters are actually credited to
Professor Singh, both of which have co-authors. The other chapters of
the textbook are written by various Melbourne academics from the
University of Melbourne and Monash University. The two chapters by
Singh are “Making Sense of the Psychiatric Patient” and “Failure to
Cope and the Adjustment Disorders”. It is the first of these that has
helped me make sense of Professor Singh, because it tells me how
different what he teaches is from what he does. And what he teaches is
bad enough.
According to the chapter on Making Sense of the Psychiatric Patient,
careful attention should be paid to taking a history and recording
“case notes”. “The length and emphasis of the personal history will
vary widely – the aim is to build up a unique picture of the patient”
advises the professor. The following are “commonly recorded, although
the list is not exhaustive”: early development – complications of
pregnancy, feeding problems, achievement of ‘milestones’; childhood –
hyperactivity, bed wetting, phobias, friendships and play, major
childhood illnesses; school – academic performance, disciplinary
trouble, peer relationships and emotional problems; adolescence –
adjustment difficulties, sexual behaviour, delinquency, relationships,
drug use; occupation – job record, satisfaction, ambition, and
military service; sexual history – attitudes, activities, past
partners, sexual orientation, problems with impotence or loss of
libido; marital history – courtship, relationship with spouse or de
facto, current state of marriage, past marriages or divorces; children
– names, ages, relationship with patient; habits – alcohol, tobacco,
drug use or abuse; forensic history – past offences, convictions and
sentences, anti-social behaviour; leisure – interests, hobbies; social
network – family, friends, supports.
This is just the “personal history”. There’s also the “psychiatric
history”, the “medical history” (described as a ‘past medical
history’, as if history can be otherwise) and the “family history”
which “gives a wealth of information about who a person is and his or
her background”. The family tree, which is described as a geneogram,
is “a highly effective way of encapsulating a large body of
information at a glance and is therefore an essential part of the
write-up”.
What, you might ask, is all this personal information written down
for? Well, it’s written as ‘case notes’ so that it can be presented at
case conferences and discussed by the ‘treating team’. They discuss
the “large amount of information”, come up with a probable diagnosis
and a differential diagnosis, as in other areas of medicine, and use
it to decide which treatment, out of various “biological treatments”
and “psychological treatments”, is best suited to the individual needs
of the unique patient, having taken a thorough history and conducted a
Mental State Examination (MSE). This is the paradigm promoted by
Foundations of Clinical Psychiatry as the “biopsychosocial approach”,
which is credited to the American psychiatrist George Engel
(1913-1999).
Professor Singh skipped most of the history taking, for obvious
reasons. His job was not to treat me or cure me, it was to judge me.
He had to make judgements about my judgements, especially my medical
judgements. That’s why I had been sent to see him. There had been no
adverse reports about my medical conduct or judgements, but I had been
declared mentally ill by his and my colleagues, and the Medical Board
wanted his expert opinion on the matter, as one of Melbourne’s most
senior psychiatrists.
This was his verdict:
“Dr Senewiratne presented as a neat well-dressed man and he was
obviously anxious about seeing me. He presented his story with great
clarity and exactness, he is obviously intelligent and has clear views
of the world and of multiple conspiracies he had identified about all
elements of his particular views. He was convinced that his
perspective was right. In my view he demonstrated the classical
thinking of the paranoid personality in that he is constantly looking
for links between various aspects of his behaviour and of the world
and then drawing those links together to reach conclusions about the
world, usually conspiratorial ones (see attachment of chapter from the
book Neurotic Style by David Shapiro, Basic Books, NY, 1965).”
I had never heard of David Shapiro, but I didn’t like the title of the
book. What the heck is a “neurotic style?” I was sent a copy of
Professor Singh’s 5-page report, but not the attachment he sent to the
Board. The closest to a “paranoid personality” in modern psychiatry is
a “paranoid personality disorder”, the label they have retrospectively
applied to tyrants like Stalin.
My book, which Bruce Singh pathologised after scan-reading bits of it,
was titled The Politics of Schizophrenia. If you read the following
profile from the University of Sydney Medical School you may get an
idea of the politics I was exploring in the years before I was
formally assessed by him:
SINGH, BRUCE S
MB BS 1968 PhD RACP FRANZCP
Bruce Singh became Foundation Chair of Psychological Medicine at
Monash University and the Royal Park and Alfred Hospitals in Melbourne
in 1984.
Born in Sydney in 1946, Bruce studied Medicine at the University of
Sydney. After graduating in 1968, he completed his Residency at the
Royal Prince Alfred Hospital, continuing as a Medical Registrar and
Psychiatric Registrar until 1975. He then became a National Health and
Medical Research Council (NHMRC) Travelling Fellow in the Clinical
Sciences and from 1975 to 1978, was based at the University of
Rochester, New York, the Institute of Psychiatry and Maudsley Hospital
in London, and at the University of Sydney.
In 1978, Bruce was appointed Senior Lecturer in Psychiatry at the
University of Newcastle where he remained until 1984, when he became
Foundation Chair of Psychological Medicine at Monash University and
the Royal Park and Alfred Hospitals in Melbourne. In subsequent years,
Bruce worked as Co-Director of the Master of Psychological Medicine
program at Monash University, also playing a role in the development
of the new medical curriculum at the University of Newcastle.
Origins
The Mental Health Research Institute (MHRI) was founded in 1956 as
part of the Victorian Hygiene Department. The Institute then
specialised in common mental disorders in the community, with a focus
on risk factors, prevention and appropriate treatment.
The building was located close to the original Royal Park Hospital in
Parkville, one of Victoria’s busiest hospitals.
Change
A working party established in 1983 recommended that a major expansion
and re-organisation take place. This resulted in a new emphasis on
neuroscience and the creation of an independent organization housed in
a modern neuroscientific research centre completed in 1994 at a cost
of $5.5 million.
The Institute was incorporated in 1987, and has since grown rapidly
from an initial staff of three to over one hundred.
E: c.masters(at)unimelb.edu.au
T: +61 3 9035 6650 / +61 3 9389 2905
Bibliography (1968-2011)
Relevant Publications:
(Top 10 cited publications, updated June 2010)
Cherny RA, Atwood CS, Xilinas ME, Gray DN, Jones WD, McLean CA,
Barnham KJ, Volitakis I, Fraser FW, Kim YS, Huang X, Goldstein LE,
Moir RD, Lim JT, Beyreuther K, Zheng H, Tanzi RE, Masters CL, Bush AI.
Treatment with a copper-zinc chelator markedly and rapidly inhibits ?-
amyloid accumulation in Alzheimer's disease transgenic mice. Neuron
2001, 30:665-676 [and see Previews: Gouras GK, Beal MF. Metal chelator
decreases Alzheimer ?-amyloid plaques. Neuron 2001; 30:641-642]. (498
citations)
Koo EH, Sisodia SS, Archer DR, Martin LJ, Weidemann A, Beyreuther K,
Fischer P, Masters CL, Price DL. Precursor of amyloid protein in
Alzheimer disease undergoes fast anterograde axonal transport. Proc
Natl Acad Sci USA 1990; 87:1561-1565. (498 citations)
Masters CL, Harris JO, Gajdusek DC, Gibbs CJ Jr, Bernoulli C, Asher
DM. Creutzfeldt-Jakob disease: patterns of worldwide occurrence and
the significance of familial and sporadic clustering. Ann Neurol 1979;
5:177-188. (447 citations)
The size and research strength of the group will position the
Melbourne Brain Centre as one of the top five centres for brain
research internationally, alongside the Institutes for Neurology and
Psychiatry (UK), the Janelia Farm at the Howard Hughes Medical
Institute (USA) and the Riken Brain Science Institute (Japan).
Funding for the building project was largely provided by the Federal
government, the Victorian State Government, through the Department of
Innovation, Industry & Regional Development and the University of
Melbourne.
Professional background
Academic interests
He has published more than 220 papers. In 2011 he was awarded a Medal
of the Order of Australia for contributions to higher education,
medical research and professional organisations.
Orygen program –
Orygen Youth Health (OYH) is a world-leading youth mental health
program based in Melbourne, Australia. OYH has two main components: a
specialised youth mental health clinical service; and an integrated
training and communications program.
In the 1970s I was exposed to the American Black civil rights movement
through Mohammed Ali, the boxing icon who boasted that he could “float
like a butterfly and sting like a bee”. I admired his fancy footwork
and his cocky attitude, though I shared neither, and was not keen on
my father’s suggestion that I take up boxing as a sport. I didn’t want
to be hit repeatedly in the head by bigger boys; being the shortest
boy in the class was bad enough as it was. Mohammed Ali was not short,
like me, but he was “black” like me and all the other boys in my class
in “Trinity”. That’s what we called our school – Trinity. Sri Lankans
in Sri Lanka and around the world, (in the so-called ‘diaspora’) know
that Trinity means Trinity College, a school founded by the British in
the medieval city that the British named Kandy. It was in Kandy that I
heard that the boxer my parents had admired as Cassius Clay had
converted to Islam and taken the Muslim name of Mohammed Ali.
My enthusiasm for Mohammed Ali, despite my relative indifference
towards boxing (which was not a sport at Trinity), was due to his
reputation, and to the hero-worship of the boxing superstar by my classmates,
most of whom were Muslim. I was mystified as to why a black American
man would convert from Christianity to Islam, since I saw Islam as the
religion of my Muslim classmates, who were not any more black or white
than their Christian, Buddhist and Hindu peers. I was taken with his
pointing out, in a snippet that I saw in a cinema, that black is
equated with evil and that this is a sign of white oppression. I now
think the argument is ridiculous; it is more likely
that black is associated with evil, in many cultures, because of the
absence of light and the darkness and dangers of the night, as well as
the contrast between darkness and illumination – physical and mental.
The black civil rights movement, which began in the USA for obvious
reasons, did have relevance to Sri Lanka, though this relevance has
not been explored, since the focus has been on issues of race and
religion, tied to Sinhalese and Tamil nationalism. But skin colour
prejudice and discrimination against “black” people is deeply embedded
in Sri Lankan and Indian culture. It is rarely mentioned or discussed,
and may be more of a problem in India than in Sri Lanka, but I was
quite aware of it when I was a child, not so much from my friends, who
were of all shades of brown, from the very fair to the very dark, but
from my family, and especially my maternal and paternal grandmothers.
Kunde
Twenty years ago, in 1995, I was thirty-four years old. I was working
as a GP in my own suburban general practice in Melbourne, and the
proud father of a two-year-old daughter, who took up most of my time,
when I wasn’t at work. I was also trying to establish a parallel
career in music, having moved to Melbourne from Brisbane in 1988 more
in search of musical opportunities than medical opportunities. The
main reason I moved to Melbourne was that I had fallen in love with
Sue, the mother of my two-year-old daughter, Ruby. Sue, whom I met
when I did an elective term as a medical student in the remote mining
town of Mt Isa back in 1981, had decided to move to Melbourne
to study Japanese at Monash University. She was already a nurse, which
is how I happened to meet her; she was studying nursing at the Mt Isa
Base Hospital when I, as a fifth-year medical student, was sent to
learn about “medicine” from a Sri Lankan doctor whom I knew as “Kanaks”
and who was known to his patients and colleagues at the Mt Isa Hospital
as “Dr K”, because they were quite unable to pronounce his full
surname – Kanagarajah. Kanaks was an old friend of my father, and had
worked with him in the Kandy Hospital in Sri Lanka in the 1970s, when
they were both medical lecturers at the University of Peradeniya.
I had married Sue in 1991 in a civil ceremony held at my parents’
house in Brisbane, which was interrupted by a violent thunderstorm.
This thunderstorm was welcome though, since it broke a drought that
was affecting Queensland at the time. Or so I was told, by my mother.
If, having been told that the thunderstorm that came upon our wedding
broke the drought, I concluded that the drought broke because of our
wedding, this would be a delusion. It would not make me mad, but it is
a crazy belief. It is irrational, as well as illogical. It would be
regarded as a sign of psychosis by modern psychiatry. I don’t have
such a delusion, and never have. I have, however had other delusions,
which I have decided to share with you, in the hope that they will be
helpful and enlightening to others. Some will not seem to you like
delusions, because you hold them to be true yourself. One person’s
delusion is another’s ‘deep and ancient wisdom’ or ‘mystical truth’.
The word "schizophrenia" comes from the Greek roots schizo (split) and
phrene (mind) to describe the fragmented thinking of people with the
disorder. Bleuler's term was not meant to convey the idea of split or
multiple personality, a common misunderstanding by the public at
large. Since Bleuler's time, the definition of schizophrenia has
continued to change, as scientists attempt to more accurately
delineate the different types of mental diseases. Without knowing the
exact causes of these diseases, scientists can only base their
classifications on the observation that some symptoms tend to occur
together.
With the rapid advances in the genetics of human disease now taking
place, the future looks bright that greatly more effective therapies –
and eventually cures – will be identified.”
The Mormon psychiatrist is agreeable. He agreed that I had Psychotic
Disorder –NOS and he also agreed that the only way to find out if I
needed Invega injections was to stop them and “observe” me. Invega is
an expensive new drug that is classed as an “atypical” or “new
generation” antipsychotic, with the chemical name of paliperidone.
Despite being a medical doctor I had not heard of paliperidone until I
was told I was going to be injected with it. At the time I was a
prisoner at the PA Hospital, and knew that agreeing to the “depot”
injection was the only way I’d get out of the locked ward. This was in
mid-2013, and since then I had been visited at my home by a middle-
aged gentleman by the name of Nigel Lewin, who agreeably injected me
monthly with Invega Sustenna in my living room, sometimes in the
presence of an equally agreeable Indian gentleman by the name of Sagir
Parkar. Sagir is a psychiatry registrar and Nigel is a registered
nurse. Both are male, and both trained in England, where I was born.
I am even more agreeable than the agreeable Mormon psychiatrist. I
sometimes agree with people when I don’t agree with them mentally or
smile in apparent agreement. Usually it’s silent disagreement. The
better I know someone the more likely I am to voice my disagreement,
though I rarely argue. I do enjoy a good debate, though, and I have had some
animated but never heated debates with Sagir and Nigel. I have never
debated with the Mormon psychiatrist, because debating with him would
have been both futile and fatal. I did reason with him, and he was,
I’m glad to say, amenable to reason. My argument was not that I didn’t
have Psychotic Disorder – NOS, but that I didn’t need to be injected
for it. Not that
he was convinced that I didn’t need the drug, he just agreed to stop
it and “watch me” over the next year or so.
Watching me and observing me are turns of phrase used in psychiatry
and medicine with which I am very familiar. They mean observing me,
not for signs of health, but of illness. The observations are to be made
monthly by Sagir and Nigel, both of whom I like and enjoy a good
debate with. Moreover they don’t think I am mad just because I
disagree with them, even if I laugh at what they say. This is very
unusual for mental health workers, and I have never experienced it in
qualified psychiatrists, though there are psychiatrists one can
disagree with who don’t regard all disagreement as insanity.
All psychiatrists make judgements about whether patients are deluded
or not, and have no reference point for such judgements other than
their own beliefs. I have learnt this from experience. I was careful,
at first, when debating with Sagir and Nigel, and have become bolder
over the past year. They are not psychiatrists, but they are the ones
who are tasked with making periodic assessments of my “mental state”.
It’s called an MSE – which stands for Mental State Examination. MSEs
are at the heart of modern psychiatry. Sagir is training to do MSEs
and his competence in doing MSEs will be examined if he is to become a
psychiatrist. Becoming a member of the college of psychiatrists
requires him to learn and comply with the doctrines of what an MSE
constitutes, and how it is used according to Australian psychiatry. He
never tells me exactly what he writes when he gets back to the
hospital, but I’ve seen enough MSEs and done enough myself to have a
good idea.
I quite like Sagir and Nigel, and I like the Mormon psychiatrist too.
I like the Mormon psychiatrist because he stopped the injections, and
I realise that his believing in the Book of Mormon (if indeed he does)
did not stop him thinking and acting rationally, when it came to
stopping the injections that his colleagues, and not himself, had
initiated. This made him a good doctor, in my eyes. It also makes me
wonder why I mentioned that he was a Mormon psychiatrist, rather than
good psychiatrist, or a Christian psychiatrist. The reason may be
that, after many years of being labelled with this and that, I have
started labelling people myself.
I have had my liberty constrained by many psychiatrists and psychiatry
registrars over the years, and didn’t like it. I didn’t like them when
they ordered that I be given drugs against my will, and I didn’t like
them when they ordered that I be “returned to the ward” if I went
home. I developed a strong dislike, indeed hatred at times, of
psychiatrists and psychiatry registrars who kept me prisoner in
various hospitals in Melbourne and Brisbane. Since they released me
last, a couple of years ago, my anger and hatred have dissipated
considerably, and I can reflect more rationally and reasonably on
psychiatry and the psychiatry profession. The same way that I question
Richard Dawkins’s blanket condemnation of religion, I hesitate to make
a blanket condemnation of psychiatry. My experiences as an involuntary
patient over the past twenty years have given me an unusual
perspective on medicine, since I myself trained and worked as a doctor
of medicine until I was prevented from earning a living as a doctor
because of the psychiatry profession, which declared me to be mentally
ill. The diseases they declared me to have were the most serious
psychotic illnesses in their textbooks – schizophrenia,
schizoaffective disorder and mania.
Iatrogenic Disease:
I had always thought that the Greek iatros meant treatment, but I
find, from Wikipedia, that it actually means healer, and that
iatrogenesis means “brought forth from the healer”. That changes
everything. It raises the difference between treating and healing.
Nevertheless, iatrogenesis is not the same as iatrogenic disease,
since what is generated by the healer may be positive, negative or
neutral. For healers to legitimately claim the mantle of healer, they
need to have a positive therapeutic effect, rather than a negative or
neutral one.
However, Wikipedia does not differentiate between iatrogenesis and
iatrogenic disease, providing as examples of iatrogenesis:
Risk associated with medical interventions
Adverse effects of prescription drugs
Over-use of drugs (causing, for example, antibiotic resistance in
bacteria)
Prescription drug interaction
Medical error
Wrong prescription, perhaps due to illegible handwriting or typos on
computer.
Negligence
Nosocomial (hospital acquired) infections
Faulty procedures, techniques, information, methods, or equipment.
The Encyclopedia of Death and Dying (which I have never consulted
before) informs us that
“A 2000 presidential report described iatrogenic error and illness as
"a national problem of epidemic proportions," causing tens of
thousands of annual deaths. The report estimated the cost of lost
income, disability, and health care costs to be $29 billion a year.
The report concluded that half of adverse medical events were
preventable.
“Issued by the most respected agency of American medicine, To Err is
Human generated considerable attention and surprise by concluding that
up to 98,000 Americans are killed annually by medical errors. This
number slightly exceeds the combined total of those killed in one year
by motor vehicle accidents (43,458), breast cancer (42,297), and AIDS
(acquired immunodeficiency syndrome, 16,516).
Most doctors and health workers do not have prisoners, but the most
dangerous ones do. They rarely give the orders to these prisoners
themselves; the ordering is left to other members of staff, who tell
the prisoners what is going to be done rather than ask what the
prisoner would like.
Doctor’s Orders
“Can anyone read your mind?”
“Do you have a special relationship with God?”
“Is anything like electricity, X-rays or radio waves affecting you?”
“Are thoughts put into your head that are not your own?”
“Have you felt that you were under the control of another person or
force?”
BPRS (WHO)
I became consciously aware of the power of suggestions when I was
employed to teach anatomy and physiology to hypnotherapy students in
Melbourne. This was in 1998 and 1999 and the Millennium was imminent.
New Age ideas were common, including the idea that the mind could heal
the body through various means. The Clinical Hypnotherapy Centre was
run by a lady by the name of Kathleen Frith, who had studied hypnosis
under a man called Jim Golding. Golding died in 2001, but I did once
see him in action at Monash University, when he tried to hypnotise a
young audience into believing that the pineal organ in the brain was a
crystal that contained the soul, which miraculously disappears when you die
(resulting in corresponding weight loss) and the equally implausible
story that a US Navy ship had time-travelled during the Second World
War (the legend of the Philadelphia Experiment). Some of Golding’s
audience seemed to swallow the story – they were hypnotised, judging
by the lack of laughter and the expressions on their faces. He didn’t
like it when I challenged his claims about the pineal after the
lecture (which I had expected to be a lecture about hypnosis rather
than an exercise in casting a hypnotic spell).
Kathleen once tried convincing me that one could change a person’s
gender through hypnosis.
Negative preconceptions
Dementia – from the Latin for loss of mind – has been predicted to
increase around the world as the population ages. Of these cases of
dementia, most will be diagnosed as having “Alzheimer’s” or
Alzheimer’s Disease, for which there is no known cure, and few proven
treatments. For the past fifteen years I have been sporadically
investigating the intuitive hypothesis that an active mind prevents or
decreases dementia, and considering social and psychological factors
that might interfere with continued learning. For example, the
pervasive clichés that one is “too old to learn” or “too old to
change”, that one “can’t do science” or “has a poor memory for dates
and can’t do history”…and so on. It is common to hear people declare
that they “can’t sing”, “can’t draw” (though rarely that they can’t
write, even when they can’t) based usually on discouraging childhood
experiences. Could such negative preconceptions have a bearing on the
development of dementia?
Negative preconceptions can be expected to lead to negative appraisals
and avoidance of situations anticipated to be unpleasant, as well as
negative evaluations of other people. This may be frank prejudice
against people based on their skin colour or the shape of their nose,
or broad prejudice based on gender or age, accent and fluency in a
particular language, or class and caste prejudices. All can be viewed
as negative preconceptions, which can lead to negative experiences
that reinforce the prejudice through confirmation bias. Negative
preconceptions can also discourage us from learning about new things and
continuing the “scaffolding” process that Jerome Bruner theorised about
– where new learning is built on an existing framework of knowledge by
parents and teachers. But new learning also comes from within – we
have natural curiosity. Though it got me into a lot of trouble when I
proposed the hypothesis that the curiosity instinct is a protection
against dementia and depression, I’ll propose it again a couple of
decades later, in the hope that no one will lock me up for it.
Negative preconceptions build up in our minds over the years about
ourselves, others and the world. These negative preconceptions include
obviously harmful prejudices against others, but many more are simply
limitations we place on ourselves and our enjoyment of the world. We
decide on the basis of one or two negative experiences that we don’t
like this or that type of music or art, or type of food. I’d often be
told by my older English patients, “I don’t like spicy food”, or by
younger Australians, “I don’t like vegetables”. It struck me that some
people must like very few things, and the more things you like, the
more pleasure you gain out of life. Gaining pleasure from learning new
things is the same as satisfaction of curiosity. It is pleasant to
have one’s curiosity satisfied. Conversely, having one’s curiosity denied
is stressful, causing irritation, frustration and, importantly,
boredom. Boredom is greatly underestimated as a cause of stress and
suffering. Chimpanzees in cages visibly suffer from boredom, children
in classrooms learn to hide their boredom, but don’t always succeed.
When they are detected as being bored and fidgeting like bored
children do, there is a waiting diagnosis for them, depending on whether
they “act out” or “tune out”. If they act out the favoured label is
ODD, if they tune out it is fashionably termed ADD.
ODD and ADD are American psychiatric labels, promoted by the American
Psychiatric Association (APA) for disobedient, bored, argumentative or
distractible children (who have many and varied reasons for their
behaviour, including responding to authoritarianism and oppression by
adults as well as their siblings and peers – being a child is a
complex matter). ODD stands for Oppositional Defiant Disorder and ADD
stands for Attention Deficit Disorder, which is closely related to
ADHD (Attention Deficit Hyperactivity Disorder). ADD and ADHD are
sometimes used interchangeably, but ADHD was only synthesised in 1994
with the DSM IV, by merging “hyperactivity” and “inattention” into a
“disorder”. The American neuroscience establishment then started
claiming that ADD/ADHD was a “genetic disorder” and had the statistics
to “prove it” – statistics that were excitedly published in the Murdoch
newspapers. No one explained how this supposedly genetic disorder
suddenly appeared in epidemic proportions coinciding so precisely with
the publication of the DSM IV and the associated marketing campaigns
to capitalise on any new labels or broadened diagnostic criteria for
the application of these labels. The campaigns to sell the drugs came
in waves of carefully researched and orchestrated marketing, run by
rival, though often colluding, drug companies. Drug companies
collude so much they often merge, with giant corporations becoming
even more powerful. This is what is known as Big Pharma – companies
like GSK, Pfizer, Merck, Roche and Eli Lilly. All of these companies
develop and market psychoactive drugs and have common strategies, when
it comes to convincing doctors to prescribe.
When the DSM IV came out I was in suburban general practice in
Melbourne, where most of my patients were elderly. Nobody tried to
convince us that children had “bipolar disorder”, “schizophrenia”, ADD
or ODD – not drug reps, not psychiatrists, not the medical literature.
When I was in medical school in the 1980s we were told that children
who were labelled “hyperactive” were just “very active” and that true
hyperactivity only affected about one in 200 children. These few
children, we were told, had a “paradoxical” response to stimulant
drugs, such as amphetamines, which instead of speeding them up, had
the effect of slowing them down. Other children were, in my training,
labelled only as “FLKs” – which stood for “funny looking kids”, who,
it was said, might have mental disabilities as well as “funny” faces.
As the years have gone by, such unscientific approaches have been
replaced with more scientific-sounding words, but ODD and ADD are as
unscientific as FLK is. Diagnosing childhood BD – bipolar disorder –
and schizophrenia is worse, since the drugs routinely used to treat
these “disorders” are extremely toxic, more so than the drugs used to
treat ADD/ADHD. The big problem with ADD is not the side-effects; it
is the fact that stimulant drugs are generally addictive. It has been
known for more than a century that amphetamines stimulate
concentration, but are also physically and psychologically addictive.
There are other side effects of stimulant drugs, depending on the
type, but I had a general concern about starting so early the habit of
taking a pill for one’s ills. I was also concerned about the
possibility, indeed the probability, that children would be
scapegoated and stigmatised. Whenever they did or said something out
of order the natural tendency of the parent or teacher (or sibling) is
to wonder, “has he been given his medication?”
I think my concerns were justified, given the problem of crystal
methamphetamine (“ice”) abuse. Like heroin and cocaine, amphetamine
manufacture is a product of Western science and technology that is now
wreaking havoc around the world, spreading like a cancer from the
cities to the towns and villages. It’s ODD that they can’t ADD this
all up.
I am a Wikipedia enthusiast, but this is not the place to find out
what ODD really means; on the contrary it significantly confuses the
issue, assuming that ODD is a real brain disease, characterised by
genetic factors and poorly functioning areas of the brain. Mention is
made of “amygdala, prefrontal cortex, anterior cingulate, and
insula, as well as interconnected regions” that have been implicated
by “neuroimaging studies” in delinquent American youths. Wikipedia
provides a reference for the statement that “ODD has an estimated
lifetime prevalence of 10.2% (11.2% for males, 9.2% for females)”,
which led me to Daniel Dickstein, who made the claim. Dickstein,
according to the Bradley Hospital, Rhode Island website, “leads
Bradley's Pedi-MIND research program, which uses brain imaging
techniques, including magnetic resonance imaging (MRI) and behavioral
measures to identify biological markers of psychiatric illness,
including bipolar disorder in children and adolescents. Such markers
could help physicians make more accurate diagnoses.”
Bipolar disorder in children??
In retrospect, what may have seemed like insights were actually
scientific theories of variable merit. Some were falsifiable and
others were not (which discounts them from being scientific). The
psychiatrists termed it hypomania – a mental state on the way to
mania – meaning madness.
I have also forgotten many things that I learned much more recently.
Other things, learned years ago, remain fresh in my mind. Some of
these things were memories from long before I learned and then forgot
calculus. Why do people remember some things and not others? Is it
just a matter of reinforcing old memories whenever you recall them?
I’m sure if I regularly used calculus I would still remember it, and
would have progressed in my knowledge and understanding of mathematics
rather than regressing from the time I was a teenager at school. Does
that mean I’ve lost synapses in my brain that I could have kept active
by practice? I do believe so.
When I studied medicine in the 1980s, neural plasticity was not
recognised. It was assumed that once the brain had developed to adult
size, there were changes only in the function but not the structure of
the brain. Furthermore, it was assumed that personality, established
in childhood, remains constant the rest of one’s life. In psychiatry
this was termed the “pre-morbid personality” – the personality before
the patient became morbid – meaning ill. The idea that neurons and
other brain cells can grow, forming new connections between cells
throughout life is now widely accepted, and termed neural or neuronal
‘plasticity’. It is also established that neurons, though they reduce
in number, increase in connectivity throughout adolescence, and that
use of neuronal circuits promotes their development and preservation.
It is also known that the growth of the brain is stimulated by
activities that exercise it. The area of motor cortex corresponding to
the little finger of professional violin players has been found to be
larger than normal, reflecting their increased use of this finger.
Calculus, and the fact that I have forgotten it, are of less concern to
me than memory and synapses. Doubtless if I was a mathematician I
would feel differently, but I am more interested in the brain
and how it works than in the mathematics of change. At the same time,
what interests me most about the brain is how it changes, and how
these changes can be put to good use – meaning to promote health.
Having knowledge is not necessarily good for the health. Not having
knowledge is worse. But a little bit of knowledge can be a dangerous
thing if you think it to be a lot of knowledge.
How can I recover my lost knowledge and integrate what I know? At last
the tool is available to me and billions of others, and the
formidable, but vital, task is at hand. The tool is the miracle of the
Internet, which has transformed my life and that of millions of
others. For millions more it has been a given for as long as they can
remember, in the same way the TV was a given when I was a child
growing up in England. The key difference is that TV is inherently
splintered and splintering, while the Internet is inherently
integrative and holistic. The TV is linear, while the Internet is
lateral, as well as being interactive. You can search for what you
want and ask questions in such a way as to fill in the gaps in your
knowledge, as well as revise and learn more, with the greatest of
ease. Having grown up with the TV and remembering what it was like
before Google, Wikipedia and YouTube I have been convinced of the
merits of the Internet, though at first I resisted the new technology
and saw it as a sinister extension of the “manufacture of consent”
that Noam Chomsky had warned us about. Gil Scott-Heron rapped back in
the seventies that the “revolution will not be televised”, while the
Disposable Heroes of Hiphoprisy rapped of “television, drug of the
nation, breeding ignorance and feeding radiation”. But the Internet
has liberated humanity in a way TV was never able to, or intended to
do. TV was always about selling advertising, as far as the commercial
channels went. The Internet is also about selling advertising, but it
has turned out to be much more than that. It has become an
unparalleled source of integrated information – though this
integration remains incomplete, and probably always will be. As more
things are discovered and thought of there are more things to
integrate, so the process of integration is necessarily ongoing, but
the Internet provides the means to a vast body of knowledge, far more
than can be stored in the greatest library let alone the greatest
mind.
I have always been aware that the Internet can spread untruths and
falsehoods the same way that the TV can. In some ways it is worse in
this respect, since any rubbish can be posted on the Internet, while
only rubbish that passes “journalistic standards” (whatever they are)
makes it onto the TV. A discerning reader or viewer can find out the
truth, and connect the dots using the Internet in a way that the TV
and old news media never made possible. The Internet not only makes
connecting the dots easy, it brings up the connections made by others,
all over the world. In its best manifestation this has resulted in the
collaborative multi-disciplinary effort of Wikipedia, which has become
my primary source of information on any subject on which I am
uncertain. I don’t trust Wikipedia any more than I trust the sources
various entries provide, but it is a good start for developing an
integrated, factual worldview. YouTube is a valuable secondary source,
but Wikipedia is unrivalled as a primary source, despite its many
flaws, errors and mistakes.
While it is a good place to start, Wikipedia is rarely a wise place to
end one’s investigation if one wants to find out the truth about
something. What it provides is the mainstream view, but this is an
important view to have, to start with. The mainstream is mainstream
for a reason, but the mainstream may be wrong about things, and you
don’t find that out from Wikipedia. You might find out about
“controversies” though, which can lead you in the direction of the
whole picture. In other areas there are no controversies, as far as
Wikipedia is concerned, but the information provided is incorrect.
This is not because, as its critics argue, “Wikipedia can be written
by anyone” but because the references used are themselves incorrect,
or present only one or two of many competing hypotheses, theories or
views. Wikipedia is also much more reliable in some areas than others,
partly because human knowledge is more certain in some areas than
others.
What is posted about a particular topic on Wikipedia is no doubt
influenced by vested interests and propaganda, though I suspect it is
less so than the other mass-media and the Internet generally. At the
same time, when I have read the entries on some topics on which I have
detailed knowledge, Wikipedia has
rarely been as reliable or readable as truly trustworthy sources. But
truly trustworthy sources are hard to come by. In many areas Wikipedia
is the best of several less reliable alternatives. I’m not certain
about this, though. I believe it right now, but I’m not certain.
I believe much more than I know for certain. Many things that I
thought I knew for certain have turned out to be untrue. They were
delusional in the Buddhist sense, but not delusional in a psychiatric
sense, where a delusion is a “false, fixed belief”. If beliefs change,
they were obviously not fixed in the first place. And if delusions are
defined merely as false beliefs, then we all have delusions, since few
would claim that none of their beliefs are false (that their beliefs
are true and correct in every respect). Finding out that one is wrong
is liberating, though it can
also be humbling and embarrassing. There is also a natural reluctance
to believe that you are wrong, which extends to reluctance to believe
that the institutions and individuals that taught you were wrong.
Still, judging by my track record I still have delusions that have not
revealed themselves to me, or been revealed to me by others, as such.
I suspect that I am still suffering from delusions, but I’m not sure
what they are – if I did, I would reject the false belief and assume
the true one.
False certainty
People can feel certain, but be wrong. This is false certainty. People
who think the earth was created 6000 years ago, or that the earth is
flat may be certain about it – meaning they feel certain about it –
but they are wrong. I have watched with some amusement the
self-proclaimed “Darwinist” Richard Dawkins debating the facts of
evolution with “Young-earthers”, and been struck by how tenaciously
the latter stick
to their delusion. Dawkins is certain about Darwin’s theory of natural
selection and the creationists are certain about the Biblical account
of creation, and their interpretation of the ancient book. It is
obvious to any rational viewer that Dawkins makes more sense – vastly
more sense, and that the creationists are, to be blunt, stark, raving
mad. This is where the amusement lies, for those of us who agree with
Dawkins about the ancient age of the earth and evolutionary theory.
Why are we amused by crazy people showing that they are crazy? I don’t
know, but I know that watching people make fools of themselves in
public is popular on YouTube, and religious debates featuring Dawkins
and his merry band of atheists have many views and “likes” on YouTube.
I’ve enjoyed them myself, and was spurred on to watch more of Dawkins
media crusade against religion, discovering other icons of atheism
such as the physicist Lawrence Krauss and the outspoken journalist
Christopher Hitchens, author of a book I have not read, called
“God is Not Great”. Hitchens wrote many books, and was very
famous during his lifetime, but I had not heard of him till recently.
Likewise Lawrence Krauss and the other leaders of the atheist
movement, other than Dawkins and Dan Dennett, an American philosopher
I was familiar with (though I did not know about his ardent atheism).
Watching these debates has been interesting but not, for me,
confronting. I don’t have a horse in the race, so to speak. Others,
including but not exclusively, religious people are affronted by
Dawkins, needless to say. People who deeply believe in God can’t be
expected to like being told that their cherished beliefs are
delusional. Their natural reaction is to argue that Dawkins is wrong,
but I am yet to find a situation where they won the debate. Dawkins
likes to debate and to win debates, and they are fun to watch, because
he, and the other atheist intellectuals are, owing to their privileged
educational backgrounds, extremely articulate and often witty. I did
not need to be convinced about the truth of evolution, or the evils of
some religion, though I hesitate to damn all religion, ritual and even
superstition. Maybe religion is instinctual and serves an important
purpose? Maybe religion has some redeeming features which I have
overlooked? The debates, and the longer documentaries about Dawkins’
crusade against religion since writing The God Delusion, have prompted
me to reflect on the role of religion in health and medicine. If
belief in God is a delusion, what does it mean for other religious
beliefs, and the treatment of aberrant or unusual religious and
spiritual beliefs by psychiatrists in Australia and elsewhere? Are all
superstitions to be regarded as delusions, and to what degree are what
are currently diagnosed as delusions commonly held superstitions –
including superstitions disseminated through the globalisation of
religion?
I wondered before about why we are amused when people show themselves
to be crazy. Maybe we have a sadistic streak, or maybe just atheists
do, and they are the ones who watch these videos and attend the
various atheist gatherings. Do atheists have a more cruel sense of
humour than religious people? I really don’t think so, after watching
a recent video of the so-called Islamic State (or ISIS) in Iraq. There
are obviously some forms of craziness that no rational person finds
amusing.
The problem of Islamic extremism and, specifically, the September 11,
2001 attacks on the World Trade Center in New York are what
Christopher Hitchens says prompted him to embark on his campaign
against religion, which he continued until his death in 2011. Hitchens’
book “God is not Great” was a direct challenge, and a word play on the
Muslim exhortation Allahu Akbar – God is Great – that the Muslim
extremists have the grotesque habit of shouting when they chop off
heads, filming it with the Western technology that they profess to
hate, and posting it on the Internet, in the hope of attracting more
brainwashed young men to fight and kill people they variously label as
infidels. I am confident that this is not the Islam that most Muslims
follow, any more than Christians follow the example of the Crusades,
but Dawkins and Hitchens have a point when they say that we should
develop our systems of morality (not to mention science) on modern,
enlightened views rather than those devised thousands of years ago,
before the discoveries of science. I would add: before the invention of
international laws and covenants on human and animal rights. These
were not covered by any of the ancient religious traditions, or by the
laws of individual nations, and have transformed what we, in the
modern world, think is acceptable. The fair and equitable enforcement
of these laws is a different matter.
I enjoy watching videos of Hitchens even when I don’t agree with what
he says. He has a likeable directness, and apparent nonchalance. He
can also be passionate in his arguments. He’s a powerful orator and
powerful oratory is pleasurable to watch. Hitchens was a well-known
writer for Vanity Fair and other publications, but I am not familiar
with the Anglo-American media and did not know about his positions on
such things as the Vietnam War, Kissinger, Iraq, Iran and North Korea
until after I had watched and enjoyed his arguments against religious
fundamentalism and the God of the Old Testament. When I watched his
contributions in the political sphere I was rather disappointed in his
apparent blindness to American militarism, which I was well aware of
from such trusted sources as Noam Chomsky. I was also disappointed by
how little he seemed to think of Iran and the achievements of its
people, and his rather dubious claim that having been to Iran, he
could say that most Iranians supported an invasion of their country to
“rid it of the mullahs”. I think this rather unlikely, and gather that
Iran is a relatively progressive, rapidly-developing nation with a
long and proud historical tradition. The literate and multilingually
articulate Iranian people that I have seen on the Internet seem quite
capable of shaking off any domination by the mullahs, if indeed they
are being dominated by the mullahs at all. I don’t know enough about
the subject, but suspect that Hitchens, despite visiting the country
and writing about it and boasting that he had been to all three “Axis
of Evil” countries, did not either. If the people Hitchens met wanted
their country invaded by Americans it must only have been the crazy,
not to say unpatriotic, Iranians who would have expressed such an
opinion.
In his opinions about Saddam Hussein and the probable outcome of an
invasion of Iraq, Hitchens, always happy to venture his opinion and
predictions, was glaringly wrong, as events have turned out. Before
the 2003 invasion by the so-called “coalition of the willing”,
Hitchens predicted that it would all be over within five years of
overthrowing Saddam Hussein, and that the “relatively equal numbers”
of Kurds, Shi’ites and Sunnis would lead to a power balance rather
than sectarian division. He boldly predicted that there would not be
much in the way of revenge attacks and vengeance. He was obviously
wrong here, and I was not too confident on his opinions regarding
Syria either, since I have a suspicion that President Assad is being
demonised, along with President Putin and the other enemies of the US
military and body politic.
One of the words Hitchens has taught me is “numinous”. I had never
heard it before and I liked the sound of it. Sounded like luminous and
we all like to be illuminated. Hitchens used this word in the context
of wonderful and awe-inspiring experiences – “I do not deny the
numinous” he said. I have a liking for my old Oxford Dictionary,
though I could have checked the meaning of the word faster on the
Internet. The dictionary says numinous means “Of a numen; spiritual;
indicating presence of divinity; awe-inspiring”. A numen is,
according to the same dictionary a “local or presiding deity”. What
does Hitchens mean when he says he rejects religion but does not deny
the numinous?
Hitchens was a journalist rather than a scientist, but the surviving
leaders of the atheist movement have more to say about science than
politics, which is perhaps just as well. When people such as Sam
Harris have ventured into political debate with the likes of Noam
Chomsky in defence of American foreign policy, they are liable to
expose their ignorance of history and politics. Harris is a
neuroscientist (though I’m not sure what that means, in his case) and
has many fans owing to his leadership of the atheist brigade rather
than to any particular insights into the neurosciences. He specialises
in various
ways to wittily put down religion – with a particular focus on
Christianity, Judaism and Islam, and is a popular debater and writer.
Dawkins and Krauss have more to say about science than they do about
politics, with a shared perspective on religion. This is that religion
– all religion – is harmful, because it is based on blind faith rather
than evidence. Dawkins insists that his main concern is with truth and
what is true; what people do is of less concern to him. He says that
he is motivated by love of science – when you are in love you want to
talk about it and share it – but admits to also being angry: angry
about children being labelled as belonging to one or other religion
before they are old enough to know
better. The same way we don’t label children according to the
political persuasion of their parents, we shouldn’t label them as
“Christian” or “Muslim” children. This, Dawkins argues, is a form of
child abuse. I can see his point, and am aware that there is a lot of
pressure on children to conform to the religion of their parents and
family, even when logic and rationality dictate otherwise. They are
often afraid, or ashamed, to admit that they don’t believe in their
parental religion. They may feel guilty or embarrassed at their “loss
of faith”. I have a lot of sympathy for such people, and admire people
like Dawkins and Hitchens for drawing attention to the problem of
coercive religion, and Hitchens in particular for his stand against
both female and male genital mutilation in the name of religious
tradition.
When I trained in paediatrics in the 1980s, circumcision in Australia
was in rapid decline. In the 1960s and 70s most Australian men were
circumcised, though I’m not sure why this came to be. There was a
campaign in the 1970s, led by paediatricians, against the practice,
against arguments that included the curious idea that boys would want
to look like their fathers (which patients of mine offered as a reason
to circumcise their sons even in the 1990s) and the idea that it was
good for hygiene. In the 1980s we were told that cutting off the
foreskin to keep the penis clean is like cutting off the ears to keep
the head clean. It stuck in my mind, and I used it on the increasingly
rare occasions that I was asked my opinion about whether it was a good
idea to circumcise boys. During our training we were also shown
photographs of circumcisions gone wrong, resulting in horrible
infections and penile deformities. It was enough for me to be
convinced, for good, that circumcision is a brutal tradition, however
widely it is practiced, and even if it was practiced in Australia long
before the White man came here, as I have been told was the case.
Ritual mutilation was common in many tribal societies, and entered
Judaism and later Islam, through the tradition ascribed to Abraham (or
Ibrahim in the Muslim tradition). This is not a good reason to cut off
the foreskin of infants, with or without anaesthetic, 3000 years
later.
Science can tell us what is, what was and what will be, but not what
should be. It can tell us only a bit about what was in the bigger
scheme of things, but it can tell us a lot, in terms of the total
amount of knowledge one person can digest in a lifetime. The rest of
my life could easily be spent discovering what science already knows –
not thinks or believes – about the past. It includes all that science
has discovered about history, anthropology and biology, and what
science has discovered about physics, chemistry and cosmology. The sum
of all this knowledge is still only a small part of scientific
knowledge – there is also linguistic knowledge and the vast amount of
knowledge that has been acquired, through scientific means, of the
human body, including the brain and nervous system.
What is known is naturally a much smaller amount of information than
what is theorised or hypothesised. People have many theories, about
all manner of things, and these theories are dynamic. Scientific
theories come and go. Some are disproved, some are just forgotten
about and go out of fashion, only to be rediscovered some years or
decades later. Some are disproved and yet resurface time and time
again in various guises. This is especially so when it comes to
theories about how various medical treatments work. Note that it is
rarely the theory that leads to the treatment, rather the treatment
that leads to the explanatory theory. This has been the tradition in
Western as well as Eastern therapeutic models. In the case of the
drugs I prescribed daily as a GP, I understood the mechanism of action
of only a few, satisfied that, even if no one knew how they worked,
they did work. I assumed that someone in the past had proven their
efficacy, or
they wouldn’t be listed as pharmaceutical drugs for the treatment of
various conditions. In terms of the specifics I was guided by the
opinions of specialists in the field, though I found that each
specialty had a peculiar obsession with itself and tended to see
everyone through the prism of its own particular doctrines.
I also trusted the drug reps but not much further than I could throw
them. Drug reps are enthusiastic promoters of their drugs and tell
doctors about the upside and not much about the downside of various
drugs. They themselves often do not know about the downsides, since they are
not told about them in their pre-sale indoctrination. It’s all done by
hypnosis, though it’s never acknowledged as such. The process of
getting people to suspend their critical judgement and implant
suggestions “directly into the subconscious” is how the advertising
experts are taught to think of the minds of their targets. The art and
science of hypnosis has been neglected by the medical profession, but
not by the advertising industry and media machine.
I like the media machine much more than I used to. After the initial
hypnosis wore off I became wary of the TV, which I had trusted as a
source of information when I was a teenager and first came to
Australia, and which had a profound effect on my taste. Back in the 1970s I
was well and truly hypnotised every time I turned the TV on, which was
frequently. There had been no TV in Sri Lanka, so when we arrived in
Brisbane I watched the box till my eyes were square. I fell in love
with pop stars and actresses, as well as, for some reason, Princess
Caroline of Monaco, whom I regarded as very beautiful. I had a crush on
Suzi Quatro and Deborah Harry from Blondie, and kept a picture, from TV
Week, of the country singer Linda Ronstadt. The TV directed my likes
and dislikes, and I fancied girls who looked like the pretty girls I
saw on TV. During this time of TV hypnosis I often found myself
humming jingles from the TV in my mind and out loud. Then there were
slogans from the TV: “Don’t sign anything till you see Zupps!” I
gathered from the ad that Zupps was a used car outlet located in what
later became my home suburb – Moorooka. Until I moved here and gained
a much richer perspective, Moorooka, an Aboriginal word meaning
“iron bark”, was to me “Moorooka, the magic mile of motors”, where “you’re
sure to find a car to suit your style”. This may be how most
Brisbanites of my age think of Moorooka to this day – from a meme
generated by a used car dealer’s catchy jingle, but things have
changed dramatically in Moorooka since the 1970s when it was famous
for its mile of car retailers. Now it is most remarkable for its
multiculturalism.
Moorooka was last in the news in Brisbane when it was affected by
flood a few years ago, and there is not much about my hometown on the
Internet. Moorooka is one town of thousands that are covered by the
Queensland news media, and is mentioned only when there is a natural
disaster that affects it. In fact, though it has a distinct identity,
Moorooka is only one of many suburbs in Brisbane, and grew, like the
rest of suburbia, out of a British convict colony on the Brisbane
river, named after the British governor of the day, Sir Thomas
Brisbane.
Most of the suburb names in Brisbane are not Aboriginal. When I busk
in the local shopping centre it is rare indeed to see any of the
original Jaggera people coming to the Woolworths supermarket. What I
do see is a multitude of different races of all colours, shapes, sizes
and presumed ages. I don’t spend much time guessing people’s ages, but
I do notice many things about new and old Australians and how they
interact with a stranger playing unfamiliar music on keyboard in the
arcade, and respond to the grooves and melodies he is producing. The
music is unfamiliar in that they have never heard the particular music
I am playing, since it is original and improvised, though certain
aspects of the music would be variably familiar to them. Previously I
have busked playing guitar and singing in Melbourne, which was also an
interesting experience, but my recent ventures into the commercial
heart of Moorooka have been different, because the same people return
to do their shopping, and get used to seeing me there. I generally
smile at everyone who walks by. Some people smile back, and some say
something too. Usually it’s nice. If they don’t like my music they’re
unlikely to say anything at all. Sometimes they smile because I’m
smiling at them, and not because they like my music, but sometimes
they show, from their body movements, that they do – whether they smile
or not, and whether they say anything or not. This is because rhythmic
music makes them dance – what I call the dance instinct.
I am intrigued by the dance instinct, which makes people feel like
moving their head and limbs in time to music. Why doesn’t it happen
with our eyes or tongue? What parts of the brain are involved, and how
can the dance instinct be used to promote health? As soon as infants
are old enough to coordinate the movements of their limbs many start
dancing with them, and I have seen toddlers dancing well before they
can talk. Why did this instinct evolve, if indeed it is instinctual?
I’m sure those who, like Steven Pinker, deny a musical instinct
would also deny a human dance instinct.
The gems of truth are scattered between disciplines and nations, and
between different historical and philosophical traditions. Science is
a body of knowledge (from the Latin scientia, meaning knowledge)
derived from various means of reasoning and logic. There have been
confusing debates over the years about what the “scientific method”
is, but this debate was exclusively in the West, largely between
philosophers in Europe, Britain and the USA. I will discuss my take on
these debates later, but first I might consider the difference between
knowledge and belief, theory, hypothesis, suspicion and disbelief.
Knowledge, belief, theory, hypothesis, suspicion and disbelief are all
degrees of belief. These may be regarded as degrees of certainty. This
is, of course, a rather arbitrary classification, but it will do for
now. People talk about things being “scientifically proven”. This
claim is often made for various drugs and other treatments.
Scientifically proven implies that scientific experiments have been
done, using the scientific method – whereby falsifiable hypotheses
are supported (made more likely) or disproved (in which case the
hypothesis is discarded). This is the Western philosophical definition
of what the “scientific method” is, not the definition of science as
construed by the ancient Greek scientists and philosophers, or those
of the East, in China, India, Persia, Egypt and Iraq – which led the
world in technology after the fall of the Greek and then the Roman
Empire. So when did the West become the sole credible authority on
science, scientific knowledge and what the scientific method entails?
The dominance of Western Science came about through the Scientific
Revolution, which has been written about at length. Scientific
“revolutionaries” like Isaac Newton, Voltaire and Descartes were
indeed brilliant and original thinkers, but the greatest discovery of
that era, from my perspective, was the invention of the microscope and
the consequent discovery that all living organisms are
composed of tiny, amazingly varied cells. This was a true revolution
in and of itself, and it was made in the West.
The other great revolution, made around the same time as Darwin
published his theory of natural selection as an explanation for
evolution, was in the East, according to the perspective of the West –
in Russia, with the wonderful creation of the Periodic Table of
Elements by the Russian scientist Dmitri Mendeleev. The ancient and
medieval Greek, Roman, Indian, Persian and Chinese physicians had
all assumed that the observable world was composed of only four or
five primary elements – Mendeleev showed that there were over a
hundred, and predicted from his model elements that had not yet, but
have since, been discovered. Until the Scientific Revolution – which
occurred over a couple of centuries – physicians in the West also
believed in the four elements and four related humours, the balance of
which determined health. Descartes, for example, still believed in
the four humours, though he made some surprisingly accurate
observations about the nervous system, along with his philosophical
meditations. The four humours that determined health were named as
blood, phlegm, black and yellow bile, though the meaning of these
terms is not what we mean today (especially phlegm). Each of the four
humours was associated with a personality type, the names of which are
now part of the common (and less common) English vocabulary – sanguine
(dominated by blood), phlegmatic (dominated by phlegm), choleric
(dominated by yellow bile) and melancholic (dominated by black bile).
Melancholic and melancholia were used as medical terms for what was
renamed “depression”. The humoral theory was that each of the humours
had variable amounts of each of the four elements, and that blood had
all of them. This was the rationale behind the widespread intervention
of blood-letting, which was a mainstay of medical treatment by
physicians in the West as well as the Near East (from the Western
perspective – also called rather misleadingly the Muslim World).
Despite growing up in the South and South East of the European maps, I
know much more about the history of Western medicine than I do of the
medical traditions of the East, but unlike most doctors I have
approached them sympathetically, with the assumption that they had
something to offer to what I recognised as a limited perspective
inherent in my training in what I saw as Western Medicine. This was
about twenty years ago, when I first started seriously thinking about
Chinese and Indian health models, and the unfamiliar concepts of chi
(qi), yin and yang, and meridians from Traditional Chinese Medicine
and the concept of chakras from the Ayurvedic system of India and Sri
Lanka. I also looked towards Buddhist philosophy for a new
understanding of psychology, with a growing awareness of the many
flaws in the philosophies of Western psychiatry and psychology.
In late 1994 I met a young woman who challenged my assumptions. Her
name was Helen, and she was a masseuse and shiatsu therapist who was
studying Traditional Chinese Medicine. I met Helen and her friend Sara
at a nightclub, and became deeply infatuated with Sara (who later
became my partner and the mother of our daughter Zoe). My infatuation
with Sara led me to become friends with Helen, who directly challenged
me with what she had learned about shiatsu, and was learning about
Traditional Chinese Medicine. I had an open mind about meridians, yin
and yang, but found her explanation of chi unsatisfying. It was clear
that when she was talking about liver and kidneys being affected by
yin and yang, she wasn’t talking with any of the basic knowledge of
these organs that I had gathered at university. In retrospect, using
the modern sceptic terminology, I was hypnotised by woo woo.
Helen introduced me to another occult tradition, for which Sara was
enthusiastic, that of Tarot Cards. I had never heard of Tarot Cards
till Helen brought out a pack, carefully wrapped in her silk
underwear, for some mystical reason. She then brought out the Holy
Book for its interpretation – the Mythic Tarot Book. I was hooked when
Sara gave me a copy of the Mythic Tarot Book and a pack of tarot cards
for my 34th birthday. In retrospect, the Tarot cards drove me mad, for
a while, mainly by convincing me that Sara was in love with me and we
were destined to be together. As it turned out, we were, though maybe
the tarot cards shaped this event. They certainly didn’t predict it,
and I now don’t think such prediction of the future is possible. But
at the time I was consciously maintaining an open mind, under the
impression that it was unscientific to have a closed mind – you might
say my mind was so open my brain fell out.
I now realise that when I fell in love with Sara I suspended my
critical judgement regarding what she said, and to some degree what
her friend Helen did as well. I believed what she said and I trusted
her judgement, even when it conflicted with what I thought I knew.
This was very unusual for me, but had happened when I had fallen in
love in the past. Over the years, though, my critical faculties have
returned to me when it comes to what Sara thinks, though I still
respect what she says, and take her opinion seriously.
Sara now says she warned me, at the time, that tarot cards only show
you what you want to see, and can be interpreted however you want. If
she did so, I was so mesmerised by the occult ritual that I didn’t
hear it – I certainly didn’t heed it. The Mythic Tarot Book presents
various layouts, based on different traditions, though I didn’t bother
reading all the details – I was interested in what it said about the
future. I was excited about the New Age. The millennium was only five
years away.
I was caught up with New Age excitement which was fed by two books
that I read at the time – The Celestine Prophecy by James Redfield and
the Book of the Hopi by Frank Waters. The Book of the Hopi has been
around since the 1960s, but had a resurgence in popularity with the
New Age movement in the 1990s. The thing that captured my attention
the most in the Book of the Hopi was the author’s claim that the Hopi
Indians of Arizona had a health system that had the equivalent of the
Indian Chakra model and that the brow chakra corresponds with the
pineal organ in the brain. This led to my enduring interest in the
pineal organ, which I had thought, till then, was a primitive vestige
of the “reptilian brain” that no longer had a function in humans. All
we had learned in medical school was that the pineal, which has no
known function in humans, usually calcifies with age. This
calcification, we were told, can be used to determine whether a
patient had a brain tumour or haemorrhage which was compressing and
distorting the brain (called midline shift, when the calcified pineal
is no longer in the midline).
In 1995 I had neither access to a computer nor to a medical library,
so I had to depend on the few old medical books I had acquired from my
father and a few that I had bought myself over the years, along with
the many medical magazines that were routinely sent to me by sole
virtue of the fact that I was a GP, as well as the CME (Continuing
Medical Education) literature that allowed me to call myself a
“Vocationally Registered” GP. These sources didn’t tell me much about
the pineal, other than the 1980 edition of Harrison’s Principles of
Internal Medicine, my father’s old physician’s bible, which had some intriguing
information:
“Since the discovery of melatonin in 1958, compelling evidence has
accumulated that the mammalian pineal is not a vestige but an
important component of a neuroendocrine control system. The organ has
been shown to function as a neuroendocrine transducer.”
This respected textbook contained a lot of information I didn’t know
about, including the metabolic pathway by which the amino acid
tryptophan is converted to serotonin which is then metabolised to
melatonin in the pineal. Acknowledging, in passing, the old mistake of
regarding the pineal as a vestige, Professor Richard Wurtman, the author of the
chapter on Diseases of the Pineal Gland, informs us that the pineal is
now thought to influence secretion of hormones by the pituitary, and
to be influenced by light. I was particularly interested in the pineal
organ’s connection with the visual system, and the fact that it has a
sympathetic innervation:
“The ‘information’ about the light travels to the pineal by a
route which involves (1) the inferior accessory optic tract (2)
centers in the brain and spinal cord that regulate the sympathetic
nervous system and (3) the sympathetic nerves to the pineal which
originate in the superior cervical ganglia. The diurnal variation in
melatonin secretion from the human pineal, which causes melatonin
levels to peak during the hours of darkness, provides the body with a
circulating ‘clock’ which is under the direct control of the lighting
environment”
In the early 1990s there was a new sensation on the pharmaceutical
market, about which there was a flurry of hype and pop psychiatry
books – Prozac. Eli Lilly’s Prozac was the first of the SSRI
antidepressants, which rapidly usurped the position of No.1
antidepressant from the tricyclic antidepressants, which had been the
mainstay of psychiatric treatment of depression from the 1950s to the
1990s. I was trained to prescribe tricyclics liberally, in the belief
that these drugs were a good way of getting people off the more
addictive benzodiazepines (or benzos) which were regarded as a big
problem at the time. We were taught that the mechanism of action of
tricyclics is uncertain, but that it was likely to work by affecting
the neurotransmitter noradrenaline (called norepinephrine in the USA).
This led to the promotion of the “noradrenaline theory of depression”
which the drug reps promoted before switching to the “serotonin theory
of depression” with the new class of drugs (which were said to work
through blocking the re-uptake of serotonin at synapses in the brain).
This is where I was at in 1995, when I started trying to make sense of
these conflicting views, all claiming to be “scientific”, and wound up
getting myself diagnosed as insane and locked up for talking too much
about my developing theories.
From my vantage point on the far south-east of the old colonial maps,
it’s difficult to ignore the fact that Australia is not, by any
stretch of the imagination, part of the West. These maps showed
Australia down in the far, lower corner, with only New Zealand further
away from the centre of the Empire. People in England (and the USA)
still describe us as living “down under”. One doesn’t find many
Australians regarding England as being “down under”, although of
course it is. The idea of The West is pervasive though, and important
for very practical reasons – including the diagnosis and treatment of
disease. What I studied at the University of Queensland and Royal
Brisbane Hospital was Western Medicine.
My training in medicine was conventional for the 1980s in Australia,
and was splintered from the outset. We studied anatomy and physiology
as separate subjects, which were disconnected from our studies of
embryology, histology, biochemistry, sociology, psychology and various
specialties of medicine and surgery. These were supposed to somehow
come together when we were confronted with whole human beings with
real problems who sought our medical assistance. We were called the
healing profession, but we did not learn much about health. We did
learn an enormous amount, if splintered, about disease and how to
diagnose it. We also learned the rudiments of how to treat these
diseases with drugs, and when surgery was indicated rather than drugs,
but we learned very little about the natural healing mechanisms of the
body or what we could do to maximise these without resorting to drugs
or surgery.
Over the past twenty years I have been theorising on the healing
mechanisms that were ignored in my training, but that can be explained
in terms of established science. These have been centred on what might
be called the mind-body relationship and a holistic approach to
health, but here I hesitate, because both “mind-body medicine” and
“holistic medicine” are often taken to mean something very different
to my own interpretation of these terms.
When I studied medicine there was already widespread criticism of the
tendency of doctors to view patients in terms of diseased body parts
rather than whole people. This was reinforced by our training. When we
were first allowed on the medical wards we’d be given a list of
patients with “interesting signs” – examine Mr Smith’s liver, listen
to Mrs Brown’s heart (to detect a murmur) or palpate Mr Green’s
enlarged kidneys. It is important that students gain these clinical
skills, but it is easy to lose sight of the human beings who are
enduring the many indignities of being in hospital. This was the
opposite of what I, and many other doctors, regard as a holistic
approach – approaching patients (and I will use this dubious term for
the moment) as a whole rather than dismembered parts.
There was also criticism of the trend towards specialization and
superspecialization, where each speciality had no idea of what was
going on in other specialities. It was, theoretically, the General
Practitioner, or GP who looked after the “primary care” of patients in
the community (the public), coordinating the general healthcare of
each individual in society. This was the theory then, and it is still
the basic paradigm of healthcare in Australia today. The problem is
that GPs are trained first in a splintered view of the body and mind
in university followed by splintered post-graduate studies in the
public and private hospital systems, which are associated with a
splintered network of universities, rival drug companies and assorted
vested interests. Health is big money, and big profits are to be had.
Diagnosing and treating diseases around the world is a multi-billion
dollar industry, in which GPs play a key role. This is why, when I was
working as a GP in Melbourne, the drug companies inundated me with so
much drug-promotional literature, samples and prescribing incentives,
that I started refusing to see reps, except when I was particularly
interested in the drug they were selling. It was only after I had been
working as a GP for a decade that I started realising how splintered and
limited my training at university and the hospitals had been, and how
poorly it had equipped me for the essentially holistic task of healing
rather than indefinitely treating. What was needed was for me to
integrate the splinters of knowledge I had about disease, health and
healing with what I knew about anatomy and
physiology. My knowledge in all these areas was full of big holes, but
I had studied rudiments of all of them at university and gathered
other knowledge through my three years of hospital training (mostly in
paediatrics) and many years in general practice, mostly as family
doctor in Melbourne.
This tendency to splintering is inherent in the medical system,
reflecting a historical division in Western medicine between surgery
and medicine. My degree was termed a bachelor of medicine and surgery,
and from the time of my hospital internship I had the choice of
steering towards either medicine or surgery. If I chose paediatrics, I
could become either a paediatrician or a paediatric surgeon; the two
disciplines were necessarily separate. The knowledge and skills
required for good surgery and orthopaedics are very different to the
knowledge and skills required to be a good physician, though both
medicine and surgery benefit from what are called “bedside manners”.
When I worked in the hospitals in the 1980s surgeons and orthopaedic
surgeons in particular were notorious for having poor bedside manners,
while physicians prided themselves on having good bedside manners as
well as superior “clinical skills”. This was not necessarily the case.
Some of the medical professors had neither knowledge nor clinical
skills, but had climbed the ladder in order of seniority over the
years. Some of the surgeons were not only rude to their patients but
they were also dangerously incompetent by today’s standards. I have
been assured that the situation has changed; maybe it has. I think
doctors are generally more polite than they used to be in the past,
when doctors gave “orders” which the patients were expected to
“comply with” if they were to be judged “good patients”. Old habits
die hard, though, and in some areas of medicine such as psychiatry,
authoritarianism is still the order of the day. This is the opposite
of what I mean by holistic medicine, which is focused on the whole
human being – mind, body and (dare I say it?) spirit.
Wikipedia has this to say about Holistic Health:
“The holistic concept in medical practice, which is distinct from the
concept in alternative medicine, upholds that all aspects of people's
needs including psychological, physical and social should be taken
into account and seen as a whole. A 2007 study said the concept was
alive and well in general medicine in Sweden.
Some practitioners of holistic medicine use alternative medicine
exclusively, though sometimes holistic treatment can mean simply that
a physician takes account of all a person's circumstances in giving
treatment. Sometimes when alternative medicine is mixed with
mainstream medicine the result is called "holistic" medicine, though
this is more commonly termed integrative medicine.
According to the American Holistic Medical Association it is believed
that the spiritual element should also be taken into account when
assessing a person's overall well-being.”
Wikipedia then makes the startling claim, from the American Cancer
Society website, that “Holistic health is a diverse field in which
many techniques and therapies are used. Practitioners of alternative
approaches may include many methods including colon therapy, metabolic
therapy and orthomolecular medicine.”
There is nothing holistic or scientific about colon therapy, metabolic
therapy or orthomolecular medicine, and most of what is passed off as
“holistic” by the plethora of “alternative medicine” practitioners is
not what I mean by holistic. I mean merely that the whole is greater
than the parts that constitute it, when it comes to living organisms
and ecosystems, and that reductionism is of value only when it is
integrated to gain a whole picture. Reductionism is vital for science,
but so is holism. This is the case for ecology, and also the case for
human biology and psychology. We gain much information by looking at
the detail down to the molecular and atomic level, but unless this
information is integrated into a whole we cannot hope to understand
biology, which is intrinsically holistic on many levels. Biology is
also based on individuals; individual bodies with individual minds
which are subjected to the forces of natural selection and artificial
selection, as well as the interventions of intended healers. Some of
these healers are medically trained, others are not. Some are deluded
about their ability to heal. Some are realistic about their
limitations. Some do actually heal. Some heal but only call it
treatment or cure rather than healing. When I trained as a doctor it
was not regarded as acceptable to call oneself a healer. This term was
reserved for quacks and religious cranks.
The definition of Holistic Medicine by the Canadian Holistic Medical
Association is closer to what I mean as holistic medicine and reads as
follows:
“Holistic medicine is a system of health care which fosters a
cooperative relationship among all those involved, leading towards
optimal attainment of the physical, mental, emotional, social and
spiritual aspects of health.
It emphasizes the need to look at the whole person, including analysis
of physical, nutritional, environmental, emotional, social, spiritual
and lifestyle values. It encompasses all stated modalities of
diagnosis and treatment including drugs and surgery if no safe
alternative exists. Holistic medicine focuses on education and
responsibility for personal efforts to achieve balance and well being.”
This Canadian website touches on the confusion I have encountered over
the years.
“Other Terms Associated with Holistic Medicine
Alternative Medicine is often used by the general public and some
healthcare practitioners to refer to medical techniques which are not
known or accepted by the majority "conventional" or "allopathic"
medical practitioners (usually M.D.'s). Such techniques could include
non-invasive, non-pharmaceutical techniques such as Medical Herbalism,
Acupuncture, Homeopathy, Reiki, and many others. However, the term
Alternative Medicine can also refer to any experimental drug or non-
drug technique that is not currently accepted by "conventional"
medical practitioners. As non-invasive, non-pharmaceutical techniques
become popular and accepted by a large number of "conventional"
practitioners, these techniques will no longer be considered
Alternative Medicine.
Alternative Medicine refers to techniques that are not currently
accepted by "conventional" practitioners, but what is currently
accepted is quickly changing. Even the definition of "conventional
practitioners" is quickly changing. Therefore, techniques that are now
considered part of Alternative Medicine will soon be considered part
of "conventional" medicine. The terms Holistic Healing and Holistic
Medicine are slightly more stable than Alternative Medicine and are
therefore preferable.
Complementary Medicine is often used by "conventional" medical
practitioners to refer to non-invasive, non-pharmaceutical techniques
used as a complement to "conventional" medical treatments such as
drugs and surgery. The term implies that "conventional" medicine is
used as a primary tool and the non-invasive, non-pharmaceutical
techniques are used as a supplement when needed.
In many cases, properly chosen non-invasive and non-pharmaceutical
healing techniques plus properly chosen lifestyle changes can
completely and safely heal both acute and chronic illnesses. In other
cases, "conventional" medicine is only needed in emergencies or when
the safer non-invasive, non-pharmaceutical methods fail. In some cases
"conventional" medicine will be a major part of a Holistic Healing
Plan, but in some cases it is not needed at all.”
According to Wikipedia:
“Natural Healing usually refers to the use of non-invasive and non-
pharmaceutical techniques to help heal the patient. When most people
use the term Natural Healing, they are usually referring to physical
healing techniques only.”
What is physical healing?
When I checked on Google the only thing that came up for ‘physical
healing’ was the power of prayer, where the contrast is between
physical healing and spiritual healing (whatever that means). There
are also assorted websites talking about Tibetan Buddhism and mind-
body techniques for self-healing. Again they are talking about healing
the physical body (as opposed to the mind) and not any particularly
physical treatment, such as physiotherapy or massage.
I am interested in understanding natural healing mechanisms in order
to promote natural healing. By this I mean non-interventional
treatments, avoiding the use of drugs or surgery (both of which carry
risks). This seems like sensible clinical medicine – to reassure
patients that they will get better by themselves (or that there is
nothing wrong with them – if indeed there is nothing wrong with them).
There are many things people can do to speed up this process of
recovery, and other things they may do which slow it down, but
even untreated, many maladies are temporary and are healed by the body
in time. It is these healing processes that I am most interested in,
with the related question of what suggestions can be made, in the
course of a consultation, to promote, rather than hinder these natural
healing mechanisms.
In my experience there are several difficulties with promoting a non-
interventional approach in medicine. Patients come to their doctors
for many reasons. Sometimes it is to check if they are ill in the
first place. What is sometimes called a “check-up”. Sometimes they
already know or think they know what is wrong with them, and have a
clear idea what drug or other treatment they are after (such as a skin
cancer removed, or a wound sutured). At other times they present with
symptoms of which they are uncertain, regarding the cause or
seriousness of their illness (whether it is, to use a common phrase,
“something to worry about”). This is where sound scientific knowledge
and clinical experience are essential. Experience is necessary to
distinguish what is normal from what is abnormal, and the difference
between normality and health. What is normal, meaning common or usual,
is not the same as what is healthy. Reassurance without carefully
checking for serious disease is irresponsible and dangerous. On the
other hand, instilling pessimism and hopelessness, or causing
unnecessary worry, is known to be detrimental to both short-term and
long-term health. A fine balance is necessary, and it should be based
on a sound understanding of human biology and psychology in sickness
and health. The medical tradition has been, for
obvious reasons, focused on sickness, hoping that identifying and
treating sickness with drugs and surgery will lead to health (taken to
mean the absence of illness).
The big change in medicine over the three decades since I graduated
is the increased role of so-called preventive medicine, in which risk
factors for various diseases (especially heart disease) are identified
and treated. This is a very costly business for governments, since
cholesterol-lowering and blood-pressure lowering drugs are expensive.
There are also various “early detection” and “screening” programs,
especially for cancer. These programs are all supported by statistical
studies, though questions have been raised about conscious and
unconscious bias in these studies, and the ubiquitous role of the
pharmaceutical industry in driving medical research and publications.
I say so-called because preventive medicine often causes needless
fear, anxiety and expense, which are experienced as stress. Stress in
turn is known to cause or worsen a range of medical problems, as well
as often being a mental problem in itself.
Words are powerful tools. Words can heal, but words can also kill.
They don’t kill straight away except when orders are given to behead,
bomb or shoot someone, but words can be intentionally or
unintentionally harmful to the health. In the mouths of trusted and
respected people the power of words and phrases increases. When many
people use particular words or phrases, particular memes, they grow
further in power and influence, for good or ill. Increasing
“awareness” of “ADD”, “schizophrenia” or “Alzheimer’s Disease” does
nothing to promote the health of society, though it may get many
people into experimental drug treatment programs. The same is true of
campaigns to increase awareness of depression, which have been a
bonanza for the drug companies and the sales of Prozac and other SSRI
drugs.
On the other hand early diagnosis of some diseases – what I regard as
real diseases – is important and saves many lives. Breast
self-examination for women, and knowledge of the signs of early
breast cancer; knowing that cardiac pain is usually felt as a dull,
heavy or crushing pain in the chest rather than a sharp pain, and
that it need not be severe to be serious – this is essential
knowledge that should be passed on to patients by their doctors if
they haven’t already been taught it at school or by their parents
(which is, in fact, very unlikely, despite the millions of lives this
simple knowledge could save). It is essential that patients know that
abnormal bleeding from any orifice needs medical attention, though
most don’t need reminding of this, and worry about it automatically
(though they may be reluctant to seek medical advice). I have mixed
feelings about immunization, cholesterol screening and Pap smears,
though I am generally in support of Pap smears done by experienced
doctors who will not over-treat. It is the same with immunizations. I
do believe in immunization as a preventive measure but am concerned at
the increased number of antigen assaults we are subjecting our
children to, and worry about possible long-term chronic risks such as
autoimmune disease and chronic arthritis. I am not concerned about
mercury and other additives in vaccines, or the supposed association
between autism and vaccination; I think the documented increase in
autism diagnosis can be more reasonably attributed to broader criteria
for application of the label, and increased popularity of the
diagnosis.
My reservations about cholesterol screening are also centred on over-
treatment, and the vested interests of drug companies. The statin
drugs are effective in lowering LDL cholesterol and raising HDL
cholesterol and I accept that high LDL is associated with the
development of atherosclerosis and high HDL has a protective effect
when it comes to heart disease. I also remember how, as soon as the
meme that “high cholesterol causes heart disease” caught on in the
1980s, the big chemist stores were offering on-the-spot blood tests to
measure the cholesterol of the customers, with the hope of selling
them products that had been “scientifically proven” to lower
cholesterol (this was before the development and release of the
statins, which do actually do this, as I have found from my own
clinical experience). It was only later that the mainstream started
talking about “good” (HDL) and “bad” (LDL) cholesterol. What was not
promoted by the drug reps who visited me to encourage me to prescribe
their particular brand of statin drug, was the non-drug methods of
lowering LDL and raising HDL cholesterol. These were, of course,
behavioural changes – changes in diet and activities.
Much has been written about the dietary changes that can be made to
raise HDL (such as more whole grains and fish) and lower LDL
(reduction of saturated fats and fats generally) with various fads
coming and going. I haven’t kept up with these fads, and have little
interest in diet, though I do recognise that it is very important. It
is one of the gaps in my knowledge that I hope to address in the
fullness of time. In the meantime I am drinking a cup of coffee and
pondering the broader physiology and metabolism of cholesterol and how
it relates to the healing mechanisms I have mentioned, but not
expanded on.
Cholesterol is an essential molecule for
/
One event I remember clearly was the death of our dog Smoky, when I
was 13.
/
Charles M. Schulz’s Peanuts cartoons made beagles famous around the
world, but I got to know a delightful beagle personally before I was
introduced to the world of Snoopy, Linus and Charlie Brown. When we
arrived in Kandy from England in 1968, when I was eight, our first pet
was a beagle, a special breed that was said to be “blue”. My uncle
Terence was breeding these hounds as attractive, friendly pets rather
than hunting dogs (they are smaller and have a grey and brown rather
than black and brown coat). We called him Smoky because he was not
what could reasonably be defined as blue. The blue beagle puppy
pictured above seems to have blue eyes, but I remember Smoky’s as
being brown.
I loved Smoky as much as I loved any human, when I was a child living
in Sri Lanka and attending high school in the hill town of Kandy. When
Smoky was run over by a bus, when I was 13, I cried for hours. I could
not stop sobbing; it was uncontrollable. I just lay on my bed and
sobbed for hours. I had never wept like this before, and I have never
wept like it in the four decades since then (despite many times I have
been moved to tears over the years). This was a different level of
grief – I was devastated. Then I got angry. I blamed my new school-
friend who had visited me at home to play for the first time, and whom
Smoky had followed down the hill to the main road, where he had been
run over by the bus. I never spoke to him again. Then I blamed the guy
in charge of it all – God.
Of course I could have blamed myself; maybe deep inside that was
indeed what I was doing, when I projected my anger against my school-
friend. I never thought to blame the bus driver or my dog Smoky;
instead I blamed the poor boy that Smoky followed down the hill. I
thought he should have brought him or led him back, rather than
letting him walk onto the main road (which was about 200 metres down a
winding road that led up to our house, along which Smoky did not
usually venture). I didn’t think about the tragedy rationally; I was
driven by overwhelming emotion.
As far as I can recall, after sobbing for a few hours I was able to
get control of the actual sobbing and recuperate from the trauma of
the sudden loss. In medical school, years later, I learned about
normal phases of grief, including anger and projection. The
experience, the most emotionally devastating event in my life till
then, taught me something about what it feels like to lose someone you
love; for to me Smoky was someone not something.
I gathered some consolation from the idea that Smoky was in heaven,
since he was clearly an innocent and good-natured dog. It did not
cross my mind that dogs do not have souls; if humans went to heaven
when they died, life in heaven would be empty without dogs, cats,
birds and butterflies. My concept of heaven was of a place much like
earth, but without mosquitoes.
I knew about mosquitoes because I had been bitten by them. When we
first arrived in Sri Lanka from England where I was born and spent my
early years, every bite would erupt into an itchy blister. After a few
months this stopped, and I stopped reacting to the bites so
dramatically, but I developed an early dislike of these little
creatures and absolutely failed to see why God would create such
pests.
When I was eleven my mother showed me how to recognise malaria
parasites in red blood cells under the microscope, and I learned
about the life-cycle of the malaria parasite and how mosquitoes spread
the disease. I was told that malaria causes more illness and death
than any other infectious disease, and that it was only with the use
of DDT after the Second World War that Sri Lanka was able to largely
eradicate malaria. This increased my belief that mosquitoes were evil
and they must therefore be the work of the Devil. I had a clear
polarity in my mind between what was good – God and what was evil –
the Devil.
Its method of alighting is interesting – as soon as it lands, it turns
around, waggles its tail filaments, and sidesteps for a while – all
this is apparently to confuse a predator as to which end is the head.
This is likely why the first naturalists named the species the Monkey
Puzzle.
“Giddai, Howyagoin?” the boy asked me, a few days after I arrived in
Australia, and had started year 11 at ‘Churchie’, as I found my new
school was fondly called by its students.
I had no idea what he was saying. “What?” I asked.
“Giddai. You know, giddai, giddai…like Hawaii!”
I could see that he thought I was daft, or at best unable to
comprehend English. What I could not understand was Australian, and
the uniquely Australian pronunciation of the old-fashioned English
greeting, “Good Day”. If you say “good day” to an Australian they will
understand it, but they will also recognise you as a foreigner. The
Australian greeting is “giddai”, not “gidday” or “gooday”.
The appropriate response, if one wishes to be recognised as Australian
and not English, is “I’m good”. The English response is, “I’m well”.
There is a difference between being well and being good, which I have
wondered about for many years. Did the Australian language betray its
criminal heritage, where the early colonists were eager to assure
their neighbours that they were good, rather than assure them about
the state of their health?
Pissed and Fell Over (PFO) – ‘pissed’: English, drunk; American,
angry.
http://espace.library.uq.edu.au/view/UQ:179979/Amy_Humphreys_Honours_Thesis_2008.pdf
Amy Humphreys wrote in her 2008 social science honours thesis
Representation of Aboriginal Women and their Sexuality that:
“colonists who engaged in prolonged personal and sexual relationships
with Aboriginal women were called a ‘gin-jockey’ or ‘combo’ and were
rejected by white society as morally degraded individuals”.
1982 ‘Don’t you remember Black Alice Sam Holt? Aboriginal Women in
Queensland History’. In Hecate, 8(2.1):6-21.
Wiktionary: boong
English
Etymology
First used by soldiers in New Guinea. Suggested sources are
Noun
boong (plural boongs)
1988, The Bulletin, Issues 5617-5625, page 121 — They would doubtless
have been amused to learn that in New Guinea, where the term "boong"
originated, it means "brother" and has a kinship with the Indonesian
"bung" and Thursday Island's "binghi".
I would have thought that the Vikings are more famous for
slaughtering, raping and pillaging others than self-sacrifice; and
it’s hard to see how crossed axes signify self-sacrifice either. I’m
not sure that self-sacrifice is such a good thing, anyway. Isn’t it
what all these jihadis think they’re doing when they strap on suicide
bombs and what the Tamil Tigers thought they were doing when they blew
themselves up in the hope of becoming martyrs?
Whilst many point to Aristotle and the Greek philosophers as the prime
movers behind the development of the scientific method, this is too
much of a leap. It was Muslim scholars, between the 10th and 14th
centuries, who were the prime movers behind its development.
They were the first to use experiment and observation as the basis of
science, and many historians regard science as starting during this
period. Their method comprised:
Observation
Hypothesis
Experiment
Verification
These steps are very similar to the modern scientific method and they
became the basis of Western science during the Renaissance.
This process continued with the Enlightenment, with Francis Bacon
(1561 - 1626) and Descartes (1596 - 1650). Francis Bacon continued the
work of his Renaissance namesake, strengthening the inductive process.
His method became:
Empirical Observations
Systematic Experiments
Analyzing Experimental Evidence
Inductive Reasoning
Bacon's inductive method was a way of relating observations to the
universe and natural phenomena through establishing cause and effect.
Descartes broke away from the model of induction and again proposed
that deduction was the only way to learn and understand, harking back
to Plato. His method was almost the reverse of induction: Descartes
believed that the entire universe was a perfect machine and that, if
you knew the first principles, everything else could be derived from
them, as in mathematical proofs.
After Newton
There were many other great thinkers who refined the scientific
method, including Einstein, Russell, Popper and Feyerabend. All of
these, and many others besides, had a great influence on the course
of modern science as we know it.
So, when you ask “Who invented the Scientific Method?”, the answer is
no-one, as the scientific method is in a state of constant evolution
and modification.
Sadly, if you were looking for a simple answer that will fit into the
short answer section of a test, you will not find it here!
Wikipedia:
The Islamic Golden Age refers to the period in Islam's history during
the Middle Ages when much of the Muslim world was ruled by various
caliphates, experiencing a scientific, economic, and cultural
flourishing.[1][2][3] This period is traditionally understood to have
begun during the reign of the Abbasid caliph Harun al-Rashid (786 to
809) with the inauguration of the House of Wisdom in Baghdad, where
scholars from various parts of the world sought to translate and
gather all the known world's knowledge into Arabic.[4][5] It is said
to have ended with the collapse of the Abbasid Caliphate with the
Mongol invasions and the Sack of Baghdad in 1258.[6] Several
contemporary scholars, however, place the end of the Islamic Golden
Age to be around the 15th to 16th centuries.[1][2][3]
Starting in the 16th century, the opening of new sea trade routes by
Western European powers to South Asia and the Americas bypassed the
Islamic economies, and led to colonial empires, greatly reducing the
Muslim world's prosperity.
http://en.wikipedia.org/wiki/Scientific_method
The scientific method is a body of techniques for investigating
phenomena, acquiring new knowledge, or correcting and integrating
previous knowledge.[2] To be termed scientific, a method of inquiry is
commonly based on empirical or measurable evidence subject to specific
principles of reasoning.[3] The Oxford English Dictionary defines the
scientific method as "a method or procedure that has characterized
natural science since the 17th century, consisting in systematic
observation, measurement, and experiment, and the formulation,
testing, and modification of hypotheses."[4]
I believe that the sun will rise tomorrow morning. I have faith in
this. I am certain about it. I also realise that the rising of the sun
is something of an illusion. The sun appears to ‘rise’ because of the
rotation of the earth. It rises only relative to my position. While it
is rising to me, here in the Antipodes, it’s setting on the opposite
side of the world.
The famous philosopher of science Karl Popper argued that just
because the sun has risen every day in the past does not mean you can
be certain that the sun will rise again tomorrow. His argument was
that one cannot be 100% certain about anything, just closer to the
truth, as various falsifiable alternative hypotheses are disproved by
observable, measurable evidence. This is the paradigm of science I was
trained to believe in – that science is what scientists do when they
use the “scientific method”, which constitutes developing and then
testing falsifiable hypotheses with “empirical evidence”. But what do
all these terms mean? I must admit I have never really considered them
deeply. I just accepted Popper’s definition of science without even
knowing who Karl Popper was, or why his definition of science was
taught to me at medical school.
Karl Popper (1902-1994) was an Austrian-born British philosopher of
science, who had a long and distinguished career at the London School
of Economics, University of London, where he played a key role in
shaping what scientists think science is in the West.
According to Popper’s definition, scientific theories must be
falsifiable to be regarded as ‘scientific’. Thus things, including the
rising of the sun tomorrow, become very likely but never certain.
Whether or not we should trust our belief that the sun will rise
tomorrow, and regard it as certain, on the basis of inductive
reasoning, was questioned by Popper.
In Conjectures and Refutations (1963) the philosopher argued that:
“The sun may have risen again after every past day of which we have
knowledge, but this does not entail that it will rise tomorrow. If
someone says: 'Ah yes, but we can in fact predict the precise time at
which the sun will rise tomorrow from the established laws of physics,
as applied to conditions as we have them at this moment', we can
answer him twice over. First the fact that the laws of physics have
been found to hold good in the past does not logically entail that
they will continue to hold good in the future. Second, the laws of
physics are themselves general statements which are not logically
entailed by the observed instances, however numerous, which are
adduced in their support. So this attempt to justify induction begs
the question by taking the validity of induction for granted. The
whole of our science assumes the regularity of nature – assumes that
the future will be like the past in all those respects in which the
natural laws are taken to operate – yet there is no way in which this
assumption can be secured. It cannot be established by observation,
since we cannot observe future events. And it cannot be established by
logical argument, since from the fact that all past futures have
resembled past pasts it does not follow that all future futures will
resemble future pasts.”
Good philosophers raise questions that one hasn’t thought of. I had
assumed that everyone accepted, as fact, that the sun will rise
tomorrow. It is a true, correct belief, and it is
reasonable and wise to have faith that the sun will rise. If one did
not believe the sun would rise it would likely lead to irrational,
unreasonable thoughts and actions. It came as a surprise that Karl
Popper, whose sage advice I was trained to respect, argued otherwise.
Hitchens, who died in 2011, used to point out that there is nothing
new about atheism, but the arguments of these intellectuals are, to
some extent, new (including their arguments against new theological
arguments that have arisen from quantum and cosmological discoveries).
Some critics have accused the New Atheism of being a religion itself,
and its proponents of attacking other religions with a religious
zealotry, in proselytising their particular brand of atheism and
“scientism”. Others have pointed out that the leaders of the New
Atheism are all male and all white, and provide a white, male
perspective.
The British philosopher John Gray, himself an atheist, wrote in the
Guardian in 2008 that:
“Zealous atheism renews some of the worst features of Christianity and
Islam. Just as much as these religions, it is a project of universal
conversion. Evangelical atheists never doubt that human life can be
transformed if everyone accepts their view of things, and they are
certain that one way of living – their own, suitably embellished – is
right for everybody.”
In Gray’s view, “Dawkins, Hitchens and the rest may still believe
that, in the long run, the advance of science will drive religion to
the margins of human life, but this is now an article of faith rather
than a theory based on evidence”.
The famous linguist and political analyst Noam Chomsky has criticised
the New Atheists as being “religious fanatics” who believe in the
State Religion, rather than the other religions, and there is some
truth in this, when it comes to Christopher Hitchens, who declared
himself to be an “anti-Theist” in addition to an atheist and “adeist”.
This terminology, used by Hitchens, differentiated between deism and
theism (deism comes from the Latin deus and theism from the Greek
theos, both meaning god; the Latin deus shares a root with the name
of the Greek god Zeus). The New Atheists are not
all as vehemently anti-religion as Hichens was. Richard Dawkins has
described himself as a “spiritual atheist”, making the distinction
between “spirituality” and “religiosity”. One can be spiritual, in
this view without being religious, and presumably religious without
being spiritual, but what do the words religious and spiritual
actually mean? What constitutes a religion? What is a spirit, and
what is the difference, if any, between spirit and soul? Does atheism
necessarily imply disbelief in spirit and soul? What can science
contribute to an understanding of spirit, soul and mind (if indeed
these are different things)? Does thinking scientifically and
rationally lead inevitably to atheism?
The Four Horsemen of the non-Apocalypse, all true believers in the
steady progress of science and the falseness of religion, argue that
religion is not only false, but that it is harmful to society. Dawkins
argues that the falsity of religion is itself harmful, and that his
aim is to promote rational thinking, and the scientific method. He
expresses the hope, which he admits may not be realistic, that
humankind will abandon religion in favour of Science.
Dawkins says that, "We are all atheists about most of the gods that
societies have ever believed in. Some of us just go one god further."
Hitchens uses the same argument – since the dawn of civilization
people have worshipped thousands of gods, which are mutually
exclusive. By the laws of probability, Hitchens argues, the likelihood
of one’s chosen god being the true god is less than one in a thousand.
Though they win applause from their many fans, these arguments are
somewhat flawed, though the many religious debaters who have
challenged the Four Horsemen have had difficulty identifying some of
these flaws because of their own beliefs in the Biblical Word. It is
easy to argue against the morality (and factuality) of the holy books
of Judaism, Christianity and Islam, pointing to their support of
slavery and the subjugation of women and children alone. It is not so
easy to argue that all religion is harmful, as the New Atheists
maintain.
Examples of behaviour and teachings that, with a modern understanding
of morality, we can clearly recognise as evil abound in the Old
Testament. King Solomon, famed for his supposed wisdom, advised that
to “spare the rod” was to “spoil the child”. There is the story of
Abraham being prepared to make a human sacrifice of his son, Isaac,
because the “voice of God” or an “angel of God” commanded him to do
so – and this is held to demonstrate his “faith” or “belief”. That
this is both bad
and mad is easy to establish on logical and humanitarian grounds,
without recourse to science. Philosophy has more to say about morality
than science does. The book of Numbers has Moses getting angry that
his soldiers had spared the women and children in their genocide of
the Midianites, ordering that the women be killed, and all the male
children, but the virgin girls be spared death, to be used as slaves.
This is obviously evil. Science doesn’t enlighten us on these matters,
philosophy may, but the reason more of us know not to beat our
children is not because of science. It is not because of religion
either. Maybe it is because of natural human morality.
Dawkins sums up the God of the Old Testament as “the most evil
character in fiction”. In The God Delusion he writes:
“The God of the Old Testament is a petty, unjust, unforgiving control
freak, a vindictive, bloodthirsty ethnic cleanser, a misogynistic,
homophobic, racist, infanticidal, genocidal, filicidal, pestilential,
megalomaniacal, sado-masochistic, capricious, malevolent bully.”
Hitchens uses similar terms – cruel, vindictive, murderous,
judgemental and genocidal. I have long agreed with Hitchens and
Dawkins on this assessment of the God of the Old Testament, but I am
not so convinced that religion is at the root of as much evil as they
claim. I am also cognisant of what religion, ritual and even theology
have contributed to human civilization. The world would be a poorer
place without the magnificent architecture of cathedrals, temples and
mosques and the various religious festivals that add colour, music and
even mystery to human culture. Religious sculptures can be extremely
beautiful, even sublime, and date back to the dawn of human
civilization and long into prehistory. At the same time, there has
been much evil done in the name of religion. The inquisitions of the
Catholic Church, the Crusades, and the monstrous behaviour of the
Spanish conquistadores in the Americas are obvious examples. When the
Portuguese, enthused with Catholicism and hatred of idolatry, arrived
in India and Sri Lanka in the 1500s, they wasted no time in reducing
to rubble as many Hindu and Buddhist temples as they could. Monstrous
brutality was committed on “heathens” by the Spanish and Portuguese
soldiers, justified in the name of spreading the Word of Jesus.
Having been baptised and confirmed as an Anglican Christian, and even
winning the “Christianity Prize” for many years running at a church
school in Sri Lanka I am familiar enough with the contents of the Old
and New Testaments to wonder whether I should choose, as an object of
worship, Yahweh, the jealous and vengeful god of the Jews or
Sarasvati, the Hindu goddess of knowledge and wisdom. Both are mere
metaphysical concepts to me, but I believe in many metaphysical
concepts, why not gods and goddesses? As long as I know they are
metaphysical concepts and not supernatural beings. Maybe one can
worship Goodness – or God – without any thought of reward. Can one
worship Wisdom, or Truth, or Beauty as metaphysical concepts with or
without agency? Is it reasonable to define what one worships as one’s
god (or gods)?
The “four horsemen of atheism”, as they have been called, are the
British biologist Richard Dawkins, the British-American journalist
Christopher Hitchens, the American philosopher Daniel Dennett and the
American neuroscientist Sam Harris. Their discussion (not debate) on
the evils of religion, hosted by Hitchens at his home, has become a
classic among their many secular fans.
https://www.youtube.com/watch?v=rRLYL1Q9x9g
Hitchens argues that there is nothing new about atheism, and this is
true. Atheism has a long philosophical tradition in both the East and
the West. The New Atheists focus only on the atheism of the West and
never mention, for example, the Cārvāka, a materialistic and
atheistic school of Indian philosophy that had developed a systematic
philosophy by the 6th century CE. Cārvākas rejected metaphysical
concepts like reincarnation, the afterlife, the extracorporeal soul,
the efficacy of religious rites, the other world (heaven and hell),
fate, and the accumulation of merit or demerit through the
performance of certain actions. Cārvākas also refused to ascribe
supernatural causes to natural phenomena.
The ‘four horsemen of atheism’ pit science against religion, which is
argued to be harmful and ignorant of the facts. Dawkins is rightly
concerned about fundamentalist Biblical Creationism being taught as an
‘alternative theory’ to evolution by natural selection, but extends
his criticism to religion more generally and decries faith as a reason
for believing anything. It could be said, though that he has more
faith in science than an objective assessment of this all-too-human
endeavour warrants.
https://www.youtube.com/watch?v=50pq71Bmils
In his 2011 debate with Sam Harris, Craig expanded on these arguments
and Harris struggled to counter them. Few scientists have the combined
oratorical and debating skills of William Lane Craig:
https://www.youtube.com/watch?v=IwhgPjPCpL8
According to Wikipedia’s entry on dialectics, debates are won through
a combination of persuading the opponent; proving one's argument
correct; or proving the opponent's argument incorrect. By such a
definition there was no clear (to me) winner of the debate between Sam
Harris and William Lane Craig. Neither convinced his opponent. Though
I agreed with much of what Harris said, I did not think he addressed
the specific arguments for the existence of God that were expounded by
Craig. Harris is perhaps limited by his training as a cognitive
neuroscientist, rather than a physicist, when it comes to cosmological
arguments.
The cosmological expert among the New Atheists is Lawrence Krauss, who
also frequents the debate circuit in support of the atheist arguments
– which are generally regarded as synonymous with “materialist”. This
results in a sharp division between materialism and spiritualism,
though Krauss makes a distinction between spirituality and religion.
In this debate Krauss argues in favour of the proposition that
“science refutes God”:
https://www.youtube.com/watch?v=x0qmr5AYFTg
Krauss and Shermer won the debate, according to the voting of the
audience. The vote for the motion increased from 37% to 50%; the vote
against also increased, but only from 34% to 38% (converts from the 27%
that were undecided before the debate).
Hitchens argues against all religion, but celebrates what he calls
“the numinous”.
https://www.youtube.com/watch?v=KkxgrrHbKG0
In the following panel discussion on science, faith and religion at a
science festival in the USA, Krauss is accompanied by the British
philosopher Colin McGinn who professes atheism but adds that certain
things are beyond human understanding, the Vatican astronomer Brother
Guy Consolmagno and Professor Kenneth Miller, a cell biologist who
believes both in science and Catholicism.
https://www.youtube.com/watch?v=KkxgrrHbKG0
I also believe that the sun consists mostly of hydrogen and helium,
and that the heat of the sun is generated by a nuclear fusion reaction
that has been going on for billions of years, that the sun is one of
hundreds of billions of stars in the Milky Way galaxy, and that this
is one of billions of galaxies that I can see only as stars in the
night sky. This is
something of a miracle to me, in that I am mystified as to how this
came to be, and how scientists know this, but I have good reasons for
believing it. The reason I believe that some of what appear to me as
stars are actually galaxies is that I have basic faith in the
integrity of Western science and the Western scientists of the past, and not
because I have looked at stars through a telescope, though I have seen
photographs of distant galaxies taken through telescopes.
Though I heard the theory many years ago, until recently I was not so
convinced about the Big Bang Theory. To my sceptical mind it sounded
too much like creationism in disguise, and though I accepted that the
distant galaxies showed red shift, indicating that the universe was
expanding, I was not convinced that extrapolating back to a
‘singularity’ 13.8 billion years ago was logical and reasonable. When
scientists were talking about the first seconds of the formation of
the universe, I wasn’t sure that they weren’t deluding themselves. How
on Earth could we mere humans presume to understand the first seconds
of the universe?
My scepticism about the Big Bang has largely evaporated after hearing
the debates posted on YouTube of the physicist Lawrence Krauss, who
argues that the Big Bang is evidence against the biblical account of
creation, though the Catholic pope did, apparently, claim the theory
as evidence of biblical truth when the Big Bang theory was first
proposed. The current Pope Francis has declared that the Big Bang is
fact, but that is not a good reason to believe it to be so. The
alternative Steady State Theory, proposed in 1948 by the famous
British astronomer Fred Hoyle, was long preferred by many physicists,
some of whom were uneasy that the originator of the Big Bang theory,
Monsignor Georges Lemaître, was a Roman Catholic priest; it was
suspected that he was trying to sneak creationism into science. It
was, in fact, Hoyle who coined the term Big Bang, intending it
pejoratively and contrasting it with the Steady State Theory, which he
favoured. In my ignorance of the evidence in favour of the Big Bang,
and without trying to find out the truth about it, I clung to the
discredited Steady State Theory until I watched Professor Krauss
recently. Since reading the Wikipedia entry on the Big Bang theory my
remaining doubts have been resolved.
Krauss is a great educator, and the author of several popular science
books including The Physics of Star Trek and A Universe from Nothing.
Though I have not read these I have watched several debates, lectures
and discussions on YouTube in which he appears. The debates have
centred on the battle between science and religion, raising the issues
of belief and faith. Krauss proudly wears the label of atheist and
scientist, and argues against holding beliefs on the basis of religion
or faith, rather than evidence. He says there is no evidence of a God
or gods and uses cosmological arguments, including the Big Bang (on
which he is an expert) to argue against the Biblical account of
creation, as told in the Book of Genesis, the first book of the
Christian Old Testament.
When I was seven I was given a copy of Burian and Augusta’s classic
Prehistoric Animals, a book that I treasured. This was the antidote to
the story of Noah’s Ark, and it made more sense to me, despite the
fact that I struggled to understand the text.
The reason I loved this book so much was the pictures, which are
exquisite. In 1963, when it was published, it was revolutionary. Never
before had the landscapes and animals of millions of years ago, been
brought to life so vividly. This is the Czech artist Zdenek Burian’s
rendition of the Middle Devonian landscape, about 400 million years
ago:
In my opinion, Zdenek Burian’s art stands among the greatest human art
of all time. It is not just because of his excellent technique and
aesthetic. It is his creative ability, his ability to bring fossils to
life, and to create realistic landscapes according to the scientific
facts as they were understood at the time. This required intimate
knowledge of the plants as well as the animals, and the technical
ability to draw and paint them. It required close collaboration with
the palaeontologist Josef Augusta and others. He was a pioneer in art
as well as science.
This painting of Diplodocus, from Prehistoric Animals, is one of close
to 500 prehistoric images painted by Burian between the early 1930s
and 1981. He was a prolific artist, credited with over 15,000 works.
Though I accepted, because my books and my mother told me so, that the
dinosaurs became extinct about 60 million years ago, I did not know
how palaeontologists came to this conclusion. I didn’t realise that
there was a gradual evolution of understanding of the age of the
earth, based on scientific evidence as it unfolded over two centuries.
When the famous British geologist Charles Lyell published Principles
of Geology in 1830 he argued that the earth was at least 300 million
years old. Scientists now estimate the age of the earth as about 4.5
billion years – more than 10 times older – based on radioisotope
dating of the oldest rocks and meteorites on earth. Lyell’s date of
300 million years suggested the age of the earth was much older than
most of his contemporaries thought in the West (though not as old as
was taught by the doctrines of Hinduism and Buddhism in the East).
Lyell proposed that the geological features of the earth (including
the various layers of fossils) had developed gradually by natural
(physical) forces that were still at work, over hundreds of millions
of years (gradualism). The dominant scientific view in the West, by
contrast, was that of the French naturalist Georges Cuvier (1769-1832)
who thought the earth was only a few million years old at most. Cuvier,
who was also famous for his studies of comparative anatomy, was the
first naturalist to establish, from his studies of the fossil record,
the now widely accepted fact of mass extinctions, which he attributed
to catastrophes, such as floods, volcanoes and earthquakes. These
extinctions, according to Cuvier, were followed by creations of new
forms of life by God. The flood of the Old Testament was the most
recent of these catastrophic events. There developed a debate between
‘catastrophists’ such as Cuvier and the proponents of
‘uniformitarianism’ also known as ‘gradualism’, first proposed by
James Hutton in the late 18th century and popularised by Lyell in
Principles of Geology. Lyell was a close friend and colleague of
Charles Darwin, providing him with a geological framework for the
theory of natural selection, which was also gradualist – evolution,
according to Darwin, occurred slowly and gradually, over millions of
years.
The fact that deeper layers of fossils indicate earlier animals – the
basis of stratigraphy – was established by the Danish scientist (and
later Catholic bishop) Nicholas Steno (1638-1686) who introduced the
law of superposition, the principle of original horizontality and the
principle of lateral continuity in a 1669 work on the fossilization of
organic remains in layers of sediment. Steno logically surmised that
“at the time when any given stratum was being formed, all the matter
resting upon it was fluid, and, therefore, at the time when the lower
stratum was being formed, none of the upper strata existed”. His
principle of original horizontality stated that “strata either
perpendicular to the horizon or inclined to the horizon were at one
time parallel to the horizon”. Steno’s principle of lateral continuity
stated that sediments initially extend laterally in all directions;
his principle of cross-cutting relationships held that “If a body or
discontinuity cuts across a stratum, it must have formed after that
stratum.” Steno's ideas still form the basis of stratigraphy, and they
contributed to the development of James Hutton's theory of endlessly
repeating cycles of seabed deposition, uplifting, erosion, and
submersion.
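Steno’s law of superposition is, in effect, a sorting rule: in undisturbed strata, depth orders age. A minimal sketch in Python (the fossil names and depths are purely illustrative, not taken from any real site):

```python
# Relative dating by the law of superposition: in undisturbed sediment,
# deeper strata are older, so sorting finds by depth (deepest first)
# yields their relative chronology. Depths in metres are hypothetical.
finds = [
    ("mammoth", 5),
    ("trilobite", 120),
    ("Diplodocus", 60),
    ("armoured fish", 90),
]

oldest_first = [name for name, depth in sorted(finds, key=lambda f: -f[1])]
print(oldest_first)  # trilobite, armoured fish, Diplodocus, mammoth
```

This is exactly the kind of relative (not absolute) chronology the early naturalists could construct: it orders the faunas without saying how many years separate them.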
I did not grasp, until I was much older, that the geological eras –
the Palaeozoic, Mesozoic and Cenozoic are so named because of the
types of animals fossilized in various geological strata, rather than
the absolute ages of the rocks, which were unknown at the time. The
Palaeozoic, Mesozoic and Cenozoic eras comprise the Phanerozoic eon
(from 541 million years ago to the present) which is named as such
because of the fossils in various rock layers (strata). The geology is
named according to the zoology (hence Palaeozoic – ancient life,
Mesozoic – middle life and Cenozoic – new life). This is why different
sources have given different dates as to when each era began and
ended. Scientists (or rather naturalists) knew that the trilobites,
which were plentiful and diverse before the age of the dinosaurs,
became extinct long before the dinosaurs did because the trilobite
fossils were confined to deeper layers of sedimentary rock. Likewise,
they knew that the mastodons, mammoths and sabre-toothed cats had
become extinct more recently, since the dinosaurs were found in deeper
(and thus older) layers of rock. The first fish fossils are found in
deeper layers still, and these were more ancient than the dinosaurs
(and other land vertebrates). On this basis they created a
chronological sequence of the zoology of deep time of which they were
confident, dividing say the Mesozoic era into Triassic, Jurassic and
Cretaceous periods, and the Cenozoic era into older and younger
periods based on the types of fauna and flora preserved in the
fossils, but were not so confident about the precise lengths of these
periods or of the larger eras and eons. This required the development
of radioisotope dating in the 20th century.
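The principle behind radioisotope dating can be sketched with the standard decay equation: a radioactive parent isotope decays into a daughter isotope at a known rate, so the ratio of daughter to parent atoms in a mineral gives its age, t = ln(1 + D/P) / λ, where λ = ln 2 / half-life. A minimal sketch (the atom counts are hypothetical):

```python
import math

def radiometric_age(parent, daughter, half_life_years):
    # t = ln(1 + D/P) / lambda, with decay constant lambda = ln(2) / half-life
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter / parent) / decay_constant

# A mineral with equal parent and daughter atoms is exactly one half-life old.
# For the uranium-238 -> lead-206 system (half-life about 4.47 billion years):
age = radiometric_age(1000, 1000, 4.47e9)
print(f"{age:.3e} years")  # one half-life: ~4.47 billion years
```

It is this kind of calculation, applied to long-lived isotopes in the oldest rocks and meteorites, that turned the relative chronology of the strata into absolute dates.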
There has been refinement and standardization of terminology over the
years since then, and many changes since I was a child and read for
the first time about the Age of Dinosaurs and the Age of Mammals. By
international convention, the older term Tertiary, referring to the
period from 65 million years to 2.5 million years ago, was abandoned
in 2004. Instead, the subdivision of the Cenozoic has been
standardized as divided into 3 periods and 7 epochs. The present
Holocene epoch (the most recent epoch of the Quaternary period of the
Cenozoic era) began 12,000 years ago and is also known as the Age of
Man. The fact that it is not called the Age of Woman reflects the
history of both science and religion, as well as the history of
philosophy. But I’ll get to that later.
Nowadays the Proterozoic – Phanerozoic boundary is precisely dated at
541 (±1) million years ago. The Phanerozoic eon encompasses the
enormous period of time from 541 million years ago to the present, but
the preceding Proterozoic lasted even longer, from 2,500 to 541
million years ago. It used to be thought that life began only during
the Cambrian period, the first period of the Phanerozoic eon, but in
the last century there have been amazing discoveries of early animal
fossils, such as the pre-Cambrian Ediacara fossil beds in South
Australia and the slightly younger, Cambrian-age Burgess Shale in
Canada. These indicate that complex life flourished at the end of the
Proterozoic eon, that most of the Ediacaran organisms disappeared
around 540 million years ago, and that the explosion of new animal
forms in the early Cambrian established the phyla that have given rise
to all the species of animal – vertebrate and invertebrate – on the
planet today.
There was another mass extinction (the P-Tr extinction) about 250
million years ago, at the end of the Permian period (the last period
of Palaeozoic era) in which it is estimated that 95% of all marine
species (including the trilobites) and 70% of terrestrial vertebrate
species became extinct. It is the only known mass extinction of
insects, and is thought to be the worst extinction event that has
occurred in the history of life. The fossil record suggests that more
than 50% of all families and 80% of all genera became extinct. The
cause of this catastrophic extinction is unknown; theories include
meteor impact, environmental change, massive volcanic eruptions in
Siberia and sudden release of methane, by methane-producing microbes,
into the ocean. The fact that the extinction occurred is undisputed.
The P-Tr extinction marks the beginning of the Triassic period of the
Mesozoic era, and the beginning of the age of the dinosaurs, which
fascinated me as a child, as it does children around the world today.
When I was a child, the cause of the extinction of the dinosaurs,
which had, by then, been fairly precisely dated to 65 million years
ago, was one of the great unanswered scientific mysteries.
The K-T boundary refers to the geological layer above which no
dinosaurs are found. K-T stands for Cretaceous-Tertiary, the
Cretaceous period being the third period of the Mesozoic era (the Age
of the Dinosaurs) and the Tertiary period being the old (and now
discarded) name for the first period of the Cenozoic era, heralding
the rise of mammals as the dominant vertebrate species on earth. The
K-T boundary has been radioisotope dated to 66 million years ago. It
is estimated that 75% of the plant and animal species, including the
dinosaurs and plesiosaurs, became extinct over a period of a few
million years (though the time frame of extinction is subject to
debate).
The discovery of iridium deposits at the K-T boundary in the late
1970s by the Nobel-prize-winning physicist Luis Alvarez and his
geologist son Walter may have solved this mystery. In 1980 the
Alvarezes published evidence of a fine layer of iridium, which is
abundant in asteroids but rare on earth, at a number of sites,
hypothesising that a large meteorite or comet had struck the earth,
causing the mass extinction that killed off the dinosaurs. The finding
was challenged, as it should be,
but subsequent findings have supported Alvarez’s hypothesis. In the
1990s evidence of a large impact crater called Chicxulub was found off
the coast of Mexico, providing support for the theory. Other
researchers later found that the extinction of the dinosaurs at the
end of the Cretaceous may have occurred rapidly in geological terms,
over thousands of years, rather than millions of years as had
previously been supposed. Others have suggested alternative extinction
causes such as increased volcanism, particularly the massive Deccan
Traps eruptions in India that occurred around the same time, and
climate change. However, on March 4, 2010, a panel of 41 scientists
agreed that the Chicxulub asteroid impact triggered the mass
extinction.
More recently still, evidence has emerged of another catastrophic
extinction only 75,000 years ago, when most of humanity may have been
killed as a consequence of the massive explosion of the Toba
supervolcano in the Indonesian island of Sumatra, following which
there may have been a period of rapid cooling that lasted thousands of
years. It was proposed in the 1990s that the Toba explosion produced
a “genetic bottleneck” in human evolution. This hypothesis is
supported by mitochondrial DNA studies that suggest that today's
humans are all descended from a very small population of between 1,000
and 10,000 breeding pairs that existed about 70,000 years ago. There
is recent evidence that a small human population may have survived the
Toba volcano in Jwalapuram, Southern India. A 2007 paper by Michael
Petraglia and others revealed stone tools found above and below the
layer of 74,000-year-old Toba volcanic dust. The relative precision
with which dates of events such as the Toba explosion can be
determined is due to the modern technology of radioisotope dating.
"MY DEAR GALTON,--I have only read about 50 pages of your book (to the
Judges), but I must exhale myself, else something will go wrong in my
inside. I do not think I ever in all my life read anything more
interesting and original--and how well and clearly you put every
point! George, who has finished the book, and who expressed himself in
just the same terms, tells me that the earlier chapters are nothing in
interest to the later ones! It will take me some time to get to these
latter chapters, as it is read aloud to me by my wife, who is also
much interested. You have made a convert of an opponent in one sense,
for I have always maintained that, excepting fools, men did not differ
much in intellect, only in zeal and hard work; and I still think this
is an eminently important difference. I congratulate you on producing
what I am convinced will prove a memorable work. I look forward with
intense interest to each reading, but it sets me thinking so much that
I find it very hard work; but that is wholly the fault of my brain and
not of your beautifully clear style.--Yours most sincerely,
(Signed) "CH. DARWIN"
This appears in The Letters of Charles Darwin as LETTER 410.
The idea that some races and classes are genetically superior to
others was the keystone of eugenics, which was developed by Charles
Darwin’s cousin Francis Galton and promoted by Darwin’s son, Leonard
Darwin from the 1880s onwards. Basing their ideas on what they took to
be the latest scientific and statistical evidence (including IQ
testing), the eugenicists recommended social programs of segregation,
sterilization and prevention of marriage of those deemed to be
“feeble-minded” or “insane”. This was in order to “improve the stock”
in Britain, which they feared was racially degenerating due to the
proliferation of the inferior lower classes. The same thinking was
applied globally in terms of the relative merits of different races,
resulting in different evils in the various nations that embraced
eugenics doctrines.
The titles of ‘The God Delusion’ and ‘The Root of all Evil’ are
deliberately provocative. Dawkins is prepared to go where most polite
people fear to venture and deliberately confronts religious belief,
which he regards as inconsistent with Science. Science with a capital
S. Dawkins founded the Richard Dawkins Foundation for Reason and
Science and was the Professor for the Public Understanding of Science
at Oxford University. He clearly does not believe that to get along in
society one should avoid discussing religion or politics.
The first of these documentaries, The God Delusion, went to air in the
UK in January 2006. Channel 4 said that 2/3 of the responses had been
positive. The book The God Delusion was released in September 2006,
and developed the themes of the documentary further. The documentary
was uploaded to YouTube on 5.7.2013 and has had 7,140 views so far
(9.5.2015). An overwhelming majority of 68 likes contrasts with 3
dislikes. This suggests that viewers generally agree with Dawkins’
views, at least at face value. But 7,140 views is not many in almost
two years. Maybe only Dawkins fans are inclined to view a documentary
titled “The Root of All Evil” that is not about money.
The Wikipedia page on the documentary says that Dawkins admitted that
the title The Root of All Evil? was not his preferred choice, but that
Channel 4 had insisted on it to create controversy. He apparently said
this on the Jeremy Vine Show on the BBC in 2006, but I have no way of
knowing whether it is true. Apparently the concession was
the addition of the question mark, changing it from a statement to a
question.
The provocative title is obviously based on the popular saying that
“money is the root of all evil”. The more biblically-oriented
capitalists emphasise that the Bible says it is not money itself but
the “love of money” that is the root of all evil. But
Dawkins and Channel 4 provide a polemic in support of the idea that
religion is the root of all evil (and not money or the love of it,
which he does not discuss).
But what do they mean by religion, and what do they mean by evil? What
other causes can be identified for evil other than religion? Is human
nature at the root of all evil, or is there also evil in nature? What
about the Devil – is the Devil or Satan the root of all evil, as some
religious people believe? Is belief in the Devil without reference to
the supernatural consistent with both science and atheism, or when we
reject the existence of God is it assumed that we also reject the
existence of the Devil, the personification of evil? What, perhaps
more importantly, are the roles of such things as nationalism,
territorialism, authoritarianism, sexism, acquisitiveness, politics,
racism and intolerance in causing evil? There is also the problem of
the evils created by scientists – chemical weapons, land-mines, multi-
barrel rocket-launchers, AK-47s and other weapons, including nuclear
bombs. Bombs may be used by religious fanatics, but they are
manufactured using scientific knowledge.
In the introduction of The Root of All Evil? Dawkins begins ominously,
with what he regards as the consummate evil in the world as he saw it
in 2006:
“There are would-be murderers all around the world who want to kill
you, me and themselves because they are motivated by what they think
is the highest ideal. Of course, politics are important – Iraq,
Palestine, even social deprivation in Bradford, but as we wake up to
this huge challenge to our civilized values, don’t let’s forget the
elephant in the room, an elephant called religion.”
Dawkins is warning about the dangers of suicide bombers, who he
implies are mostly male and motivated by religious faith. His target
for criticism is faith in Judaism, Christianity and Islam – the
Abrahamic religions, which all venerate what the Christians call the
Old Testament. My limited understanding is that in Judaism a variant
of the Old Testament is venerated as the Torah, which is only part of
the corpus of Jewish religious literature. The Jews do not recognise
Jesus as a prophet or messiah, while Islam accepts Jesus of Nazareth
as a prophet. Some schools of Islam also believe in the Second Coming
of Jesus and the Apocalypse, while the Jews are still waiting for the
Messiah. In Islam, from what little I understand of the religion, some
of the Old Testament stories are accepted, including the importance of
the Old Testament prophets like Moses and Abraham and the six-day
creation, but there is considerable diversity and divergence of
thought in Islamic scholarship about such things.
Dawkins admits that politics is important to understand the “would be
murderers” that he imagines are plotting evil suicide-bombings all
over the world. He says they want to kill “you, me and themselves”.
This may be true in his case, given his public attacks on their faith,
but I’m confident that it is not the case for myself. If I went around
saying people wanted to kill me but I didn’t know who they are I would
probably, given my circumstances, be locked back up in a mental
hospital. He mentions the politics of Iraq, Palestine and Bradford,
but doesn’t explore them, nor is there much depth in his analysis of
the problem of suicide bombers, who are not, by any means, responsible
for most of the murders that occur in the world today. Most murderers
kill others and not themselves. Many of the murders are called
“military operations”, including many of the murders in Iraq and
Palestine.
If one looks more carefully at the current threat of militant Islam in
the Middle East, Africa and Asia, one finds the USA and the West
arming and training the Mujahideen in Afghanistan to fight a proxy
war against the atheist, communist Russians in the 1980s, giving rise
to the Taliban. One finds the arming of the secular regime of Saddam
Hussein with chemical and biological weapons, and support, by the USA
and the West, of Hussein and a number of other dictators around the world in
the 1970s and 1980s. Palestine and Iraq have long and complex
histories, including the crusades and the centuries-old struggle for
control of Jerusalem and the Holy Land of the Bible. More recently,
European colonization and dominance, both cultural and political, and
the colonial wars, culminating in the First and Second World Wars,
have fed Islamic militancy as well as nationalism in the previously
colonized world. Islamic militancy has many complexities and a long
history. So does Christian militancy and Jewish militancy. There have
been plenty of militant Buddhists and Hindus in history, too, though
Dawkins focuses only on the Abrahamic religions.
It is a common and accurate perception, influenced by the media but
confirmed by statistics and the actual statements of suicide bombers,
that Islamic fundamentalism and extremism have been responsible for
most of the suicide bombings in the world since the defeat of the
Tamil Tigers (LTTE) in Sri Lanka in May 2009. But at the time Dawkins
and Channel 4 made this documentary, it is hard to see how the LTTE’s
female suicide bombers – who were not motivated by belief in God or
heaven to strap on ‘suicide vests’ and blow up both military and
civilian targets – escaped the attention of Channel 4 and Professor
Dawkins. Christopher Hitchens, who also uses the example of suicide
bombers as an argument against all religion, corrected himself in
later debates, referring to the Tigers’ use of suicide bombers as an
exception to his pronouncements against Islam and, more broadly,
religion.
According to Dawkins:
“The suicide bomber is convinced that in killing for his God, he will
be fast-tracked to a special martyr’s heaven. This isn’t just a
problem of Islam. In this program I want to examine that dangerous
thing that is common to Judaism and Christianity as well, the process
of non-thinking called faith.”
He then states his position clearly:
“I am a scientist, and I believe there is a profound contradiction
between science and religious belief.”
During their militant campaign for a separate state for Sri Lankan
Tamils, which they called Tamil Eelam, the organization known as the
“Tamil Tigers” were a specific military and political entity formally
known as the Liberation Tigers of Tamil Eelam or LTTE. The political
ideology of the LTTE was ostensibly secular and Marxist, and they drew
inspiration from the “liberation struggle” of Cuban Marxists. This was
their ideology as espoused by the self-declared “political strategist
and theoretician” of the LTTE, Anton Balasingham, who spent his later
years orchestrating the international political machinery of the LTTE
from London, together with his wife Adele, who, when the husband and
wife team were still in Sri Lanka, led the Women’s Wing of the Tigers.
There is YouTube footage of Adele Balasingham handing out vials of
cyanide to young women who were Tamil recruits for the LTTE – though
not all supporters of the LTTE were Tamil (Adele Balasingham herself
was not), all of the actual fighters were. These young women were
expected to commit suicide by swallowing the cyanide if they got
caught. It is a scientific discovery that the ingestion of cyanide is
fatal at a certain dose and it was a scientific calculation that
allowed the LTTE to know what dose to give these women (most of whom
were still girls). The guns that they carried and killed with were
invented and refined into ever-more murderous models through the
genius of science. The suicide vests that were pioneered by the Tigers
were also contributions from Western science and technology.
At the same time, the LTTE cynically used religious, as well as
non-religious, Tamil cultural traditions to further their political
and military agenda. Self-sacrifice is integral to the Hindu religion;
martyrdom as a direct route to Heaven is a feature of the Christian
and Muslim religions. The LTTE created a cult around its
military leader, Vellupillai Prabakaran, who was hero-worshipped, but
the concept of martyrdom was promoted more for nationalistic reasons –
which were politically secular. The suicide squad was given the self-
identity of Black Tigers, and before they went on their suicide
mission they would be honoured by a dinner with the leader. Those who
died in the cause were commemorated in rituals that calculatingly drew
on both Hinduism and Christianity. For example, Prabakaran was
insistent that the dead “cadres”, as they were called, were buried in
the Christian tradition rather than cremated in the Hindu tradition,
so that he could impress Western visitors with the determination and
sacrifice of the “freedom fighters” as evidenced by the cemeteries
full of white crosses. The Australian paediatrician, John Whitehall,
who trained the LTTE medicos after the 2004 Asian tsunami, says that
he was very impressed by these cemeteries. He was supposed to be.
The LTTE welcomed recruits of any religion to kill and be killed in
the interests of winning the war for “self-determination” of the
Tamils of Tamil Eelam. The state they aimed to create was secular, rather than
religious, but there was to be supremacy of two languages – Tamil and
English, rather than Sinhalese, which was seen as the language of the
enemy. Sinhalese was also the language of the majority of the
population of Sri Lanka, and this was one of the roots of the evil
that ensued, but not the root of “all” the evil of the war. There were
many factors, some political, some social, some linguistic. All the
killing and maiming was done with weapons, developed through Western
science, though much damage was also done through words and
propaganda, which were used to raise money for the war effort on both
sides. Though portrayed as an “ethnic war” between the Sinhalese and
Tamils, there were many factors that fed into the equation. Religion
was one of them, in that the state religion of Sri Lanka is Buddhism
and the large majority of Sinhalese are Buddhist, with a small
minority of Catholic and Protestant Christians. The Tamils are mostly
Hindu, also with a minority of Catholic and Protestant Christians.
Most of the members of the Sri Lankan army were Sinhalese and
Buddhist, while most of the Tigers were Hindu or Christian, adding a
religious dimension to the conflict.
Language, though, is not the root of all evil any more than religion
is. There is no “root of all evil”. Evil has many causes, many roots.
The challenge is to identify them, including the evils of certain
religious doctrines – but also the evils of certain scientific
doctrines, political doctrines, economic doctrines and philosophical
doctrines. One of the roots of evil is dogma and dogmatism, but there
are scientific dogmas as well as religious, political and social
dogmas. It seems sensible and mature to look for the evil in one’s own
belief system before attacking the beliefs of others, and to make a
clear distinction between the two different meanings of what it is to
be wrong. Wrong can mean incorrect, it can also mean evil or wicked.
This is the important difference between mad and bad, between delusion
and crime. We all agree that crime is wrong (though we may differ on
what constitutes a crime). Delusions are also wrong, in that they are
not true and correct beliefs, but they are not wrong in a moral sense.
There is an important difference between being wrong and doing wrong.
Being wrong means that you are mistaken, and this can be corrected.
There is no moral condemnation of people being wrong, and though they
might resist admitting it to themselves or others, being wrong about
things is not evil or sinful. Doing wrong is a different matter. To do
wrong is to sin and all sins are evil or wicked, according to the
shared religious traditions of East and West. Some sins, like murder,
rape and stealing, are condemned by all religions except those that
truly worship evil (and there are such religions, though with
fortunately few adherents).
The regime of Adolf Hitler is almost universally regarded as evil. The
exception is the neo-Nazis who continue to hero-worship Hitler and the
Nazi regime; they are found mostly in the West – the UK and Europe,
and North America (the USA and Canada). I personally regard Hitler as
one of the most evil men in history and regard the Nazis with
revulsion, but not
because of Hitler’s supposed “paganism” and interest in Theosophy. I
don’t care if Hitler was a vegetarian and cared about animals, nor if
he truly loved Eva Braun; what matters to me is the genocide that he
presided over.
The undoubted evil of Hitler has been used by both sides of the
argument for and against religion. The atheists claim that Hitler was
a Catholic, the Catholics claim that he was an atheist or pagan. The
Nazis used a swastika – an ancient Indian sacred symbol – rather than
the cross for their flags (though the swastika incorporates a cross in
its design). The atheists also argue that the Catholic Church celebrated
Hitler’s birthday, while their opponents argue back that it was under
duress, and that Catholic priests tried to save many Jews. I don’t
expect to resolve these arguments, but I think we can all agree that
Hitler was evil and so was the entire Nazi apparatus. In the 1940s,
while this apparatus grew in power and danger, and started killing the
inmates of its mental asylums in anticipation of the Holocaust, the
Germans led the world not in religion but in science. After the war it
was Nazi scientists who were recruited by the Americans under
Operation Paperclip, not German theologians.
The Holocaust was not caused by the isolated, deranged mind of an
individual. The Nazi apparatus was a highly organized, professional
killing machine guided by the best scientific minds in the business
and the latest technology. German science was rivalled only by that of
Japan, Britain, Russia and the USA, not necessarily in that order. It
was the German drug company Bayer that gave the world heroin, which was
marketed as a non-addictive way of treating opium addiction. It was
Germany that led the world in the neurosciences – the study of the
brain, which advanced in leaps and bounds when they started killing
mental patients and collecting their brains, to section, stain and
study under the microscope. The University of Heidelberg had the
biggest collection of brains in academia, preserved in formalin. It
was medical doctors, trained by psychiatrists, who decided, during the
notorious Aktion T4 program, which mental patients were ‘curable’ or
‘incurable’. The German psychiatrist Professor Emil Kraepelin of
Heidelberg University, who first described “dementia praecox” (later
renamed schizophrenia) and “manic depression” in the late 1890s, was
internationally recognised as an expert in classifying some people as
mad, with the assumption that the system (represented by himself) was
sane. A few years after Kraepelin died, these same criteria were used
to decide whether to send people to the gas chamber (another
scientific achievement of the Germans). The Nazi psychiatrist Karl
Schneider orchestrated the killing of mental patients and collected
their brains for the university’s collection. It was all very
scientific.
The other motivation for the evils of Nazism was the creation of an
Aryan super-race. This was a scientific aim, based on a nationalistic
German interpretation of anthropology and history, but rationalised by
the Darwinian idea of survival of the fittest as interpreted by
Darwin’s cousin, Francis Galton at Cambridge University in England.
This was the doctrine of eugenics, which Galton hoped would become the
“religion of the future” when he promoted it from the 1880s till his
death in 1911. Eugenics means ‘good breeding’, from the Greek eu,
meaning good, and genos, meaning race, stock or kin. The enthusiasts
of eugenics hoped that the most superior classes and individuals of
their various nations would be encouraged to have large families,
while the most inferior classes and individuals would be sterilized,
segregated and prevented from contributing their “blood” or
“inheritance” to the next generation. This was based on their
understanding of Darwin’s evolutionary theory, as applied to human
society.
There is an inherent assumption here that science is good – scientific
status for a theory or hypothesis is something to be desired. This is
the result of the status of science – which is based on its
credibility, which in turn is based on its marvellous achievements in
various scientific disciplines. Its credibility is based on its
history and the technological wonders that have transformed human
society - the motor car, airplane, radio and TV, vaccines, knee and
hip replacements, insulin and penicillin (and a range of valuable
drugs) not to mention the miracle of the computer. To a man living in
the Stone Age, or even the Middle Ages, these achievements would
indeed be miraculous.
Buffon believed humanity was only 6000 years old (the time since
Adam). Believing in monogenism, Buffon thought that skin colour could
change in a single lifetime, depending on the conditions of climate
and diet.[12]
In 2014 Pope Francis announced that evolution and the big bang theory
are in fact real, and one should not gather from the Book of Genesis
that God is a “magician with a magic wand.”
I do not believe in the supernatural or any of the Biblical miracles,
nor any of the alleged miracles from other religious traditions. I
don’t believe in supernatural miracles; I do believe in natural
miracles.
The sun climbs higher in the sky as the morning progresses and performs
various natural miracles as it moves through the sky. Flowers open and
leaves imperceptibly grow, birds start to sing and then grow quiet,
flocks of migratory birds and butterflies continue their journeys and,
in the deep layers of the skin of human beings that are exposed to its
rays, melanocytes produce the brown pigment melanin and other cells
produce vitamin D from cholesterol. I believe, as a result of my
medical training, that vitamin D is necessary for the healthy growth
of bones, and that deficiency of this vitamin causes the childhood
disease of rickets, which causes bones to weaken and deform. I also
believe, on the basis of what I regard as a sensible hypothesis I read
some years ago, that it is the need to produce vitamin D that is most
likely the evolutionary cause of light and dark skin, although folic
acid may have something to do with it, as has been recently
hypothesised. I also believe in the whole miracle of evolution by
natural selection.
Douglas Nakashima and Marie Roué have made a clear distinction between
Western science and “traditional knowledge”, which they equated with
“Indigenous knowledge”. In 2002 they explained, in the Encyclopaedia
of Global Environmental Change that “Western science favours
analytical and reductionist methods as opposed to the more intuitive
and holistic view often found in traditional knowledge. Western
science is positivist and materialist in contrast to traditional
knowledge, which is spiritual and does not make distinctions between
empirical and sacred. Western science is objective and quantitative as
opposed to traditional knowledge, which is mainly subjective and
qualitative. Western science is based on an academic and literate
transmission, while traditional knowledge is often passed on orally
from one generation to the next by the elders. Western science
isolates its objects of study from their vital context by putting them
in simplified and controllable experimental environments—which also
means that scientists separate themselves from nature, the object of
their studies; by contrast, traditional knowledge always depends on
its context and particular local conditions.”
The Western medical model is not the only scientific model that claims
to be able to heal. There are also rival Traditional Chinese Medicine
(TCM) and Indian Ayurvedic models that compete for customers in
Australia and the West, as in India and China. In addition, there are
various Western healing methods that claim scientific validity but are
regarded, for various reasons, as “alternative” or “complementary”
medicine – naturopathy, chiropractic, homeopathy and herbalism, for
example. A few Western-trained doctors like Deepak Chopra have
attempted a fusion between Vedic and Western science, but there are
serious problems with this enterprise. I discovered this when I tried
to integrate Eastern and Western health models twenty years ago.
When I was working as a GP in Melbourne in the 1990s, I was once
consulted by a young woman who had visited an acupuncturist for a
sprained wrist. She had been told, after an examination of her tongue
and pulse, that she had “warm Chi affecting the spleen” and “cold Chi
affecting her kidneys”. She had also been told that she needed to take
a concoction of herbs to correct the “moistness” in her liver. The
lady was insistent that I do some blood tests to “check” her liver,
kidneys and spleen. I took a careful history and examined her, finding
no signs of kidney or liver disease, nor evidence of enlargement of
her spleen, and tried to explain to her that what Chinese medicine
means by “liver”, “kidney” and “spleen” is not really the same as what
these organs mean to Western doctors. She wanted the blood tests
anyway, so I ordered a biochemical screen – checking her
electrolytes, liver function tests and blood count. They were all, as
I expected them to be, normal.
This event prompted me to look more closely at the theoretical basis
of Traditional Chinese Medicine (TCM) to see if the core concepts of
Yin and Yang, chi and meridians can be understood in terms of the
fundamental Western scientific medical disciplines of anatomy and
physiology. The meridians are channels through which the chi (which I
interpreted as energy of some sort) flowed. The meridians, which
determine acupuncture points, do not correspond to either the nervous
system or circulatory system in anatomy or function, and they have to
be added on rather than integrated with Western science.
[Yin-Yang (Taijitu) symbol]
The concepts of Yin and Yang are also integral to Chinese Medicine.
Yin (black with the white spot in the symbol) is the feminine and Yang
(white with the black spot) is the masculine, which has obvious
correspondence with Western science. One could regard the female
hormones, like oestrogen, as Yin and the masculine hormones like
testosterone as Yang, as well as other distinctly feminine and
masculine aspects of anatomy, physiology and psychology. This is not
what yin and yang mean in Chinese medicine, science and philosophy,
however.
Wikipedia explains that “In Chinese philosophy, yin and yang (also,
yin-yang or yin yang) describes how apparently opposite or contrary
forces are actually complementary, interconnected, and interdependent
in the natural world, and how they give rise to each other as they
interrelate to one another. Many tangible dualities (such as light and
dark, fire and water, and male and female) are thought of as physical
manifestations of the duality symbolized by yin and yang. This duality
lies at the origins of many branches of classical Chinese science and
philosophy, as well as being a primary guideline of traditional
Chinese medicine.”
Chris Kresser, armed with a Master of Science but not a medical
degree, claims to be practicing “integrative and functional medicine”,
and has his own website that announces:
“Chris Kresser, M.S., L.Ac is a globally recognized leader in the
fields of ancestral health, Paleo nutrition, and functional and
integrative medicine.”
Kresser assures his readers that acupuncture does work, and provides
purely scientific explanations for why it works. Scientific, meaning
Western scientific. The fact that the Chinese meridian system does not
correspond to the nervous system does not stop Kresser from
claiming that acupuncture works through its effects on the nervous
system and stimulation of pain receptors in the skin. This he says
restores homeostasis, increases blood flow to the area, and causes the
release of pain-killing hormones (endorphins) from the brain. This
theory is scientific and also satisfies Karl Popper’s criterion that a
scientific theory be falsifiable. The theory that acupuncture works
because of the release of endorphins can be tested and measured,
supporting or disproving it. Kresser does not say whether he is
basing his claims on such studies or whether this is mere
speculation.
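The testability Popper demands can be made concrete. A minimal sketch, with invented numbers that are NOT from any real acupuncture trial: compare endorphin levels in a group receiving acupuncture against a sham-needling control group and compute a two-sample t statistic, the standard first step in deciding whether an observed difference could be chance.

```python
import statistics

# Invented illustrative data (arbitrary units): NOT real trial results.
acupuncture = [62, 58, 71, 65, 69, 60, 74, 66]
sham = [55, 57, 52, 60, 54, 58, 56, 53]

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

t = welch_t(acupuncture, sham)
print(f"t = {t:.2f}")  # a large |t| counts against the null hypothesis
```

A real study would convert t into a p-value with the appropriate degrees of freedom; the point here is only that the endorphin hypothesis is, as Popper requires, the kind of claim that measurement can support or undermine.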
In his posting on how he thinks acupuncture works, Kresser makes
reference to “purists” who object to these efforts to explain
acupuncture in Western scientific terms, thinking that it takes the
magic out of acupuncture. Kresser claims that equating chi (qi) with
energy is incorrect, and that the original meaning of qi was air –
which he equates with oxygen. He writes: “When the terms qi (oxygen),
mai (vessel) and jie (neurovascular node) are properly translated, it
becomes clear that there is no disagreement between ancient Chinese
medical theory and contemporary principles of anatomy and physiology.
Chinese medicine is not a metaphysical, energy medicine but instead a
“flesh and bones” medicine concerned with the proper flow of oxygen
and blood through the vascular system.”
Kresser is a big admirer of Chinese Medicine and claims that the
Chinese knew much more about anatomy and physiology than the West,
with a tradition of anatomical dissection dating back over 2000 years.
He says that the ancient Chinese identified and weighed all the
internal organs, and knew that the heart pumps blood around the body
long before scientists in the West. This discovery, Kresser claims,
was made by the Chinese more than 2000 years ago “but only discovered
in western medicine in the early 16th century”.
Presumably, Kresser is referring to the famous discovery of the
heart-lung circulation by the British physician William Harvey (1578-
1657). Harvey described in detail how the left side of the heart pumps
blood to the body, after receiving oxygenated blood from the lungs and
that the blood is pumped to the lungs from the right side of the heart
via the pulmonary artery. He also described in detail how the
oxygenated blood from the left side is distributed through arteries,
beginning with the aorta, to the head and brain, and to the rest of
the body. The ancient Chinese knew nothing of these things, though
they may well have known that the heart is the organ that pumps
blood. So, we can be confident, did the ancient Greeks, since this was
the reason that Aristotle proposed, in the 4th century BC, that the
soul resides in the heart. This curious idea arose from his quite
reasonable, but incorrect, view that thought is carried in the blood;
he obviously knew that the heart was the organ that pumped it.
Kresser makes the rather startling claim that Chinese medicine was
used by emperors and the royal courts to help them live into their 90s
and stay fertile into their 80s at a time when the average life
expectancy in the west was 30 years. Men do remain fertile into their
80s, and throughout recorded history some individuals have lived into
their 90s. There has never been a time in recorded history when life
expectancy in the West was 30 and, besides, average life expectancy
does not indicate the length of life of the longest-living
individuals. Moreover, if one is going to take 2000-year-old texts
seriously one might also conclude that the medicine of the ancient
Jews enabled the patriarch Abraham to live hundreds of years.
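The point about averages can be made concrete with arithmetic. A minimal sketch, using invented lifespans for a hypothetical pre-modern population with high infant mortality: the mean is dragged far below the ages the longest-lived actually reached.

```python
# Invented lifespans (years) for a hypothetical pre-modern population;
# the three early deaths represent high infant and child mortality.
lifespans = [0, 1, 2, 45, 50, 55, 60, 70, 80, 92]

mean_lifespan = sum(lifespans) / len(lifespans)  # pulled down by infant deaths
oldest = max(lifespans)

print(f"average life expectancy: {mean_lifespan:.1f} years")
print(f"longest-lived individual: {oldest} years")

# Among those who survived childhood, the picture is very different:
adults = [age for age in lifespans if age >= 15]
print(f"mean adult lifespan: {sum(adults) / len(adults):.1f} years")
```

With these made-up figures the "life expectancy" is about 45 years even though one person lives to 92 and most adults pass 50, which is why a low historical average says nothing about how long emperors, or anyone else, could live.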
The ancient Chinese obviously knew about vessels and blood, but what
are the “five depots” and the “six palaces”?
The Hungarian Sinologist Imre Galambos argues, in reference to the
Huangdi Neijing and its influence, that while in the modern,
“scientific” West it is customary to think that the newer a thing is
the better, in traditional Chinese thought this appears to be just the
opposite; a new thing could be justified and accepted if one could
prove that it has been already mentioned and thought of in ancient
times. As a result of this traditionalistic approach, medicine in
China has been regarded as a body of knowledge which has undergone
very little, if any, change through the span of history.
Christopher Hitchens was the author of God Is Not Great[9] and was
named among the "Top 100 Public Intellectuals" by Foreign Policy and
Prospect magazine. In addition Hitchens served on the advisory board
of the Secular Coalition for America. In 2010 Hitchens published his
memoir Hitch-22 (a nickname provided by close personal friend Salman
Rushdie, whom Hitchens always supported during and following The
Satanic Verses controversy).[10] Shortly after its publication,
Hitchens was diagnosed with esophageal cancer, which led to his death
in December 2011.[11] Before his death, Hitchens published a
collection of essays and articles in his book Arguably;[12] a short
edition Mortality[13] was published posthumously in 2012. These
publications and numerous public appearances provided Hitchens with a
platform to remain an astute atheist during his illness, even speaking
specifically on the culture of deathbed conversions and condemning
attempts to convert the terminally ill, which he opposed as "bad
taste".[14][15]
The Koch that Crick speaks of is Christof Koch, who collaborated with
him in his studies of the monkey visual system. Koch can be seen in
this YouTube discussion:
P Svenningsson, H Hall, G Sedvall, Bertil B Fredholm
Department of Physiology and Pharmacology, Karolinska Institutet,
Stockholm, Sweden.
Synapse, 1998; 27(4):322-35. DOI:
10.1002/(SICI)1098-2396(199712)27:4<322::AID-SYN6>3.0.CO;2-E
Abstract: Whole-hemisphere sections from six subjects were used in a
quantitative autoradiographic study to characterize and to investigate
the distribution of adenosine receptors, using [3H]DPCPX, [3H]CGS
21680, and [3H]SCH 58261 as radioligands. [3H]DPCPX-binding showed the
pharmacology expected for adenosine A1 receptors and is therefore
taken to mirror adenosine A1 receptors. Adenosine A1 receptors were
widely distributed, with the highest densities in the stratum
radiatum/pyramidale of the hippocampal region CA1. Adenosine A1
receptors were nonhomogeneously distributed in nucleus caudatus,
globus pallidus, and cortical areas: In the cingulate and frontal
cortex the deep layers showed the highest labeling, while in the
occipital, parietal, temporal, and insular cortex it was highest in
the superficial layers. In addition, we found very high levels of
adenosine A1 receptors in structures known to be important for
cholinergic transmission, especially the septal nuclei. The Bmax
values and KD values for [3H]DPCPX-binding in stratum
radiatum/pyramidale of CA1 and the superficial layer of insular cortex
were 598 and 430 fmol/mg gray matter and 9.9 and 14.2 nM,
respectively. [3H]CGS 21680-binding was multiphasic, but showed the
pharmacology expected for adenosine A2A receptors and was taken to
represent them. Adenosine A2A receptors were abundant in putamen,
nucleus caudatus, nucleus accumbens, and globus pallidus pars
lateralis. Specific [3H]CGS 21680-binding was also found in certain
thalamic nuclei and throughout the cerebral cortex. The adenosine A2A
receptor antagonist radioligand [3H]SCH 58261 was also found to label
these extrastriatal structures. Thus, adenosine A2A receptors seem to
be more widely distributed in the human brain than previously
recognized.
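The Bmax and KD values reported in the abstract describe the standard one-site saturation model, B = Bmax·[L]/(KD + [L]). A minimal sketch of that relationship, plugging in the CA1 stratum radiatum/pyramidale figures quoted above (Bmax 598 fmol/mg, KD 9.9 nM); the ligand concentrations chosen are arbitrary illustrations:

```python
def specific_binding(ligand_nM: float, bmax: float, kd_nM: float) -> float:
    """One-site saturation binding: B = Bmax * [L] / (KD + [L])."""
    return bmax * ligand_nM / (kd_nM + ligand_nM)

# CA1 stratum radiatum/pyramidale values from the abstract
BMAX = 598.0  # fmol/mg gray matter
KD = 9.9      # nM

# At [L] = KD, exactly half the receptor sites are occupied
print(specific_binding(9.9, BMAX, KD))  # 299.0

# Binding approaches (but never exceeds) Bmax at high concentrations
print(specific_binding(1000.0, BMAX, KD))
```

This is why KD is reported alongside Bmax: KD is the ligand concentration giving half-maximal binding, so a lower KD means higher receptor affinity, while Bmax estimates the total density of sites.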